Novel web application for self-assessment of distance visual acuity to support remote consultation: a real-world validation study in children
=============================================================================================================================================

* Louise Allen
* Arun James Thirunavukarasu
* Simon Podgorski
* Deborah Mullinger

## Abstract

**Objective** The difficulty in accurately assessing distance visual acuity (VA) at home limits the usefulness of remote consultation in ophthalmology. A novel web application, DigiVis, enables automated VA self-assessment using standard digital devices. This study aims to compare its accuracy and reliability in children with clinical assessment by a healthcare professional.

**Methods and Analysis** Children aged 4–10 years were recruited from a paediatric ophthalmology service. Those with VA worse than +0.8 logMAR (Logarithm of the Minimum Angle of Resolution) or with cognitive impairment were excluded. Bland-Altman statistics were used to analyse both the accuracy and repeatability of VA self-testing. User feedback was collected by questionnaire.

**Results** The left eyes of 89 children (median age 7 years) were tested. VA self-testing showed a mean bias of 0.023 logMAR, with limits of agreement (LOA) of ±0.195 logMAR and an intraclass correlation coefficient (ICC) of 0.816. A second test was possible in 80 (90%) children. Test–retest comparison showed a mean bias of 0.010 logMAR, with LOA of ±0.179 logMAR, an ICC of 0.815 and a repeatability coefficient of 0.012. 96% of children rated the test as good or excellent, as did 99% of their parents.

**Conclusion** Digital self-testing gave distance VA assessments comparable with clinical testing in children and was well accepted. Since DigiVis self-testing can be performed under direct supervision using medical video consultation software, it may be a useful tool to enable a proportion of paediatric eye clinic attendances to be moved online, reducing time off school and releasing face-to-face clinical capacity for those who need it.

* visual acuity
* remote consultation
* telemedicine
* software validation

### Key messages

#### What is already known about this subject?

* Distance visual acuity (VA) is fundamental to decision making in ophthalmic consultation.
* The NHS Long Term Plan target is for a third of consultations to be undertaken remotely.
* Most vision testing applications are not certified medical devices or clinically validated.

#### What are the new findings?

* The novel web application, DigiVis, is accurate and repeatable in children from the age of 4 years in a real-world setting.
* The test is well accepted by parents and children.
* No training (other than the application’s instruction video) or professional support is needed for successful self-testing.

#### How might these results change the focus of research or clinical practice?

* VA self-testing may be observed in real time (synchronously) during video consultation to ensure viewing distance and eye occlusion are effective.
* Asynchronous testing may enable home monitoring of conditions such as amblyopia.
* Either method could reduce the frequency of, and requirement for, clinic attendance.

## Introduction

Clinic backlogs were growing even before the COVID-19 pandemic, but disruption during lockdown and ongoing social distancing requirements have added to clinic delays and the risk of preventable visual impairment.
The potential benefits of undertaking consultations remotely include reducing the burden of unnecessary hospital attendance on patients while optimising face-to-face capacity for those who need it. The UK NHS Long Term Plan aims for a third of appointments to become virtual to meet the demands of an ageing population within the constraints of limited clinical capacity.1 2

A distance visual acuity (VA) assessment is fundamental to any ophthalmic assessment, and a validated method for assessing VA at home is needed to support remote consultation.3 Although more than 20 vision testing applications are available, very few are clinically validated, designed to be used without a trained observer or certified for medical use. Those available are difficult for the clinician to supervise remotely.3–7

Paediatric ophthalmology clinics have high footfall, with many children requiring frequent reviews of VA for amblyopia therapy, resulting in time off school and expense for the family. An accurate system for self-testing VA which can be used at home may enable a proportion of paediatric appointments to be undertaken remotely.

DigiVis is a recently developed web application, certified as a medical device, which enables self-testing of distance VA. It can be integrated within medical video consultation software, giving the clinician the ability to supervise the test and ensure correct viewing distance set-up, use of glasses correction and effective eye occlusion. In this paper, we report the accuracy and repeatability of DigiVis self-testing during eye clinic attendance in children between 4 and 10 years of age.

## Materials and methods

This was a prospective validation study comparing DigiVis VA self-testing with standard clinical testing. Patients and the public were involved in the design, conduct, reporting and dissemination plans of our research.

All children between 4 and 10 years of age attending routine paediatric ophthalmology clinic appointments within a 6-week period were invited to participate. Those with documented sight impairment of VA worse than +0.8 logMAR (Logarithm of the Minimum Angle of Resolution, 6/38 Snellen) were excluded, as were children with cognitive impairment. The children’s parents gave informed written consent and children gave informed verbal or written assent.

Parents used DigiVis to self-test their children’s vision at the time of clinic attendance, using provided internet-connected devices under the supervision of a medical student. Occluding glasses or occlusive patches were provided but no other help was given. DigiVis VA results were documented by the student after testing, and parents and children were asked to complete a usability and acceptance questionnaire. A standard, age-appropriate clinical assessment of VA was undertaken by a trained nurse, optometrist or orthoptist masked to DigiVis results. Where the standard vision assessment was undertaken using a Snellen chart, the value was converted to logMAR in Microsoft Excel (logMAR is the base-10 logarithm of the reciprocal of the Snellen fraction; for example, 6/38 Snellen corresponds to approximately +0.8 logMAR).

### The technology

The DigiVis test requires two digital devices connected to the internet, with no download necessary. A tablet, laptop or desktop computer is used to display the distant test chart. A paired smartphone or tablet is held by the child sitting 2 m away from the test chart display (figure 1A) and functions as an interactive ‘matching card’ (figure 1B). An animated instruction video in the application demonstrates the steps for screen calibration, measuring the viewing distance and pairing the devices.

Sloan letter optotypes are presented on the larger, distant screen, with adjacent letters and indicator arrows providing crowding consistent with the letter size, in a similar manner to standard linear logMAR charts. Where fewer than five letters can be displayed on the distant screen (from 0.8 logMAR), a crowding box is used instead. The child is asked to select, from a group of five letters displayed on the handheld device (four of which are randomised), the optotype that matches the letter indicated on the distant screen. The child is encouraged during the test by collecting cartoon animals after each correctly matched letter. Optotype sizing follows a modified García-Pérez psychophysical staircase starting at 0.6 logMAR with three reversal points, facilitating calculation of the VA threshold.8 For this study, a lower limit of 0.00 logMAR was set to reduce test duration for children. The test usually takes between 30 s and 2 min per eye, depending on the consistency of the child’s responses. Results are displayed in logMAR, Snellen and ETDRS (Early Treatment Diabetic Retinopathy Study) chart letters.
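The general behaviour of such a staircase can be illustrated with a minimal R sketch run against a simulated observer. The one-up/one-down rule, fixed 0.1 logMAR step size, simulated response probabilities and threshold estimate (the mean of the reversal levels) used below are illustrative assumptions only; they are not the DigiVis implementation or the exact García-Pérez parameters.

```r
# Minimal sketch of a fixed-step-size adaptive staircase (illustration only).
# The one-up/one-down rule, 0.1 logMAR step size and simulated observer below
# are assumptions; DigiVis uses a modified Garcia-Perez staircase whose exact
# parameters are not reproduced here.

run_staircase <- function(true_va = 0.30,     # simulated observer's threshold (logMAR)
                          start = 0.60,       # starting optotype size (logMAR)
                          step = 0.10,        # fixed step size (logMAR)
                          floor = 0.00,       # lower limit used in this study
                          n_reversals = 3) {  # reversal points before stopping
  level <- start
  last_direction <- NA
  reversal_levels <- c()

  while (length(reversal_levels) < n_reversals) {
    # Simulated observer: mostly correct above threshold, near guess rate below
    # (guess rate is about 0.2 for a five-alternative matching task).
    p_correct <- if (level >= true_va) 0.95 else 0.2
    correct <- runif(1) < p_correct

    # Smaller letters follow a correct match, larger letters follow an error.
    direction <- if (correct) "down" else "up"
    if (!is.na(last_direction) && direction != last_direction) {
      reversal_levels <- c(reversal_levels, level)  # record the level at each reversal
    }
    last_direction <- direction

    level <- if (correct) level - step else level + step
    level <- max(level, floor)                      # respect the 0.00 logMAR floor
  }

  mean(reversal_levels)  # threshold estimate: mean of the reversal levels
}

set.seed(1)
run_staircase()
```

The sketch is intended only to show how a staircase converges on threshold and how reversal points yield the final estimate; DigiVis applies the modified García-Pérez variant cited above.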
Figure 1 (A) Randomised optotype presentation on the distant device; the arrow indicates the letter to match. (B) The appearance of randomised letters on the handheld device, one of which matches the indicated letter on the distant test chart. The ‘not sure’ button registers as an incorrect attempt.

### Analysis

Data from the left eyes only were analysed to avoid codependence. Where standard clinical test results were <0.00 logMAR, the value was rounded up to 0.00 to enable comparability with DigiVis scores. Agreement between DigiVis and clinical VA measurements, as well as test–retest (TRT) agreement, was evaluated with Bland-Altman plots, looking specifically at 95% limits of agreement (LOA) and mean bias, and with intraclass correlation coefficients (ICC) and repeatability coefficients. A priori standards were used to facilitate appraisal of agreement as quantified by ICC.9 Analysis and data visualisation were conducted in R (V.3.6.1; R Foundation for Statistical Computing, Vienna, Austria) and Affinity Designer (V.1.8.6; Pantone, Carlstadt, New Jersey, USA).

## Results

The left eyes of 89 children aged 4–10 years (mean 7.4 years, median 7 years) were tested using the children’s version of the DigiVis app and by standard, age-appropriate clinical assessment. Of these children, 80 (90%) completed two DigiVis tests, enabling TRT agreement to be appraised. Subject VA based on standard clinical testing ranged from 0 to 0.8 logMAR (mean 0.09 logMAR; IQR 0–0.13 logMAR).

In both comparisons, good agreement is indicated by the ICC values (p<0.001) and repeatability coefficients (table 1). Bland-Altman plots show an average LOA of ±0.195 logMAR for accuracy, comparing DigiVis with clinical assessment (figure 2), and ±0.179 logMAR for TRT agreement (figure 3). Bias was minimal in both cases, indicating a lack of systematic error between measurement techniques in both comparisons. No significant correlation (p>0.1) was observed between mean VA and difference in VA in either Bland-Altman plot, suggesting that agreement was consistent over the tested range. Repeatability coefficients suggest that the smallest detectable difference in vision with DigiVis is around 0.012 logMAR (table 1).
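To make the agreement statistics reported above concrete, the following minimal R sketch computes the mean bias, 95% LOA, a repeatability coefficient and an ICC for a pair of hypothetical measurement vectors; the study data are not reproduced here. The icc() call assumes the irr package is available, and the repeatability coefficient shown is the conventional 1.96 × SD of the paired differences, which may differ from the formulation used for table 1.

```r
# Minimal sketch of the agreement statistics described in the Analysis section,
# applied to hypothetical paired measurements (the study data are not reproduced).
library(irr)  # assumed to be installed; provides icc()

# Hypothetical paired VA measurements (logMAR): DigiVis self-test vs clinical test
digivis  <- c(0.00, 0.10, 0.20, 0.05, 0.30, 0.00, 0.15, 0.40)
clinical <- c(0.02, 0.12, 0.14, 0.00, 0.26, 0.06, 0.10, 0.34)

differences <- digivis - clinical
bias <- mean(differences)                         # mean bias
loa  <- bias + c(-1.96, 1.96) * sd(differences)   # 95% limits of agreement
rc   <- 1.96 * sd(differences)                    # one common repeatability coefficient

# Two-way, absolute-agreement, single-measurement ICC, as recommended by Koo and Li
icc(cbind(digivis, clinical), model = "twoway", type = "agreement", unit = "single")

round(c(bias = bias, lower_LOA = loa[1], upper_LOA = loa[2], repeatability = rc), 3)
```

A Bland-Altman plot such as figure 2 or figure 3 is then drawn by plotting these differences against the pairwise means, with horizontal lines at the bias and at the LOA.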
Table 1 Mean bias and LOA for DigiVis compared with standard clinical assessment of VA, and for TRT agreement, with 95% CIs

Figure 2 Bland-Altman plot comparing DigiVis visual acuity measurements with standard clinical testing to evaluate accuracy. The bias and 95% limits of agreement (dashed lines) are labelled and have 95% CIs (dotted lines) shaded. LOA, limits of agreement; logMAR, Logarithm of the Minimum Angle of Resolution.

Figure 3 Bland-Altman plot comparing repeated DigiVis measurements to evaluate test–retest agreement. The bias and 95% LOA (dashed lines) are labelled and have 95% CIs (dotted lines) shaded. LOA, limits of agreement; logMAR, Logarithm of the Minimum Angle of Resolution.

Of 89 children, 85 (95.6%) rated the test as good or excellent, as did 88 of the 89 (98.9%) parents. Of the 89 parents, 86 (96.7%) said that they would consider using the test to monitor their child’s vision at home.

## Discussion

Conventional chart-based assessment of VA in children aged 6–11 years with corrected VA of 0.20 logMAR or better has a reported TRT LOA of ±0.15 logMAR.10 Validated digital distance VA testing systems include Peek Acuity, with an LOA between the app and clinical measurements of ±0.444 logMAR and a TRT LOA of ±0.414 logMAR.6 COMPlog, a distance VA test requiring a specifically sized computer monitor, recorded a TRT LOA of ±0.10–0.12 logMAR and an ICC of 0.964 in adults when comparing face-to-face with remote testing.11 12 Digital Kay picture symbol testing in children has an LOA between the app and ETDRS chart of ±0.21 logMAR and a TRT LOA of ±0.14 logMAR.13 Together, these data provide a priori standards against which DigiVis can be evaluated, although it should be noted that these validation studies used trained examiners to assess visual threshold rather than self-testing.

In this study, self-assessment with DigiVis, without trained input from an eyecare professional, had minimal bias, LOA of ±0.195 logMAR when compared with standard clinical testing, and TRT LOA of ±0.179 logMAR, with high ICC values of 0.816 and 0.815 and low repeatability coefficients, reinforcing evidence of its accuracy and reliability. The narrowness of the CIs for the calculated statistics suggests that the sampled population was sufficiently large to provide robust results.

There were several limitations to this study. Standard clinical testing was carried out using a variety of standard charts: Snellen, ETDRS and children’s logMAR flip charts. This reflects real-world variation in paediatric ophthalmology clinics but may have reduced the reliability of clinical measurements. A potential advantage of DigiVis is that it provides uniformity of testing from 4 years upwards and removes observer bias. A further limitation of this analysis was the exclusion of children with VA worse than +0.8 logMAR, a decision made due to the presumed difficulties these individuals may have in accessing the test. Additionally, the IQR of 0.00–0.13 logMAR illustrates bias towards good VA levels in the studied population.
Further investigation is required to verify the app’s potential in children with poorer VA levels, since they were under-represented in this study. Children with special educational needs and developmental delay were also excluded, since prior attempts to use DigiVis in children with Down syndrome had failed due to difficulty in understanding the concept of letter matching. Finally, the apparent consistency of DigiVis may have been inflated by participants repeating the test in quick succession, in the same testing environment and on the same devices. However, rapid retesting of children might also have been expected to result in poorer concentration and less repeatability.

Despite the limitations of the study, the results indicate that self-testing with DigiVis is comparable with age-appropriate VA assessment by a trained examiner in this childhood population, agreeing with our findings in a wider population containing older children and adults.14 The accuracy of VA assessment is dependent on viewing distance, correct use of glasses and effective occlusion. Confidence in home testing results may be improved by synchronising testing with remote consultation, using the screen-share function of medical video conferencing software. This enables the clinician to observe test set-up and directly monitor both the child’s performance and the test chart in the conferencing window in real time. If the clinician is satisfied that the parent and child can undertake the synchronised test effectively, unsupervised asynchronous home monitoring of VA may be considered. This could reduce the need for children to miss school in order to attend a face-to-face consultation. Further studies to determine the take-up and accuracy of both synchronous and asynchronous home vision self-testing using DigiVis are in progress.

Vision self-testing was well accepted by both children and parents in this study, with almost all willing to use it for future home monitoring. Home testing and monitoring may encourage parents to take a more active role in their child’s eyecare, and clinical capacity could be freed for children needing face-to-face clinic time. A disadvantage of DigiVis is that it requires the family to have two internet-connected devices. Although most young families will have a smartphone and a tablet, a proportion of families will not be able to access the test. There is a recognised relationship between digital exclusion and the risk of poor health; inability to access digital testing could flag this risk and help prioritise access to face-to-face appointments.15

## Data availability statement

All data relevant to the study are included in the article or uploaded as supplementary information.

## Ethics statements

### Patient consent for publication

Not required.

### Ethics approval

This study was approved by the Health Research Authority and Health and Care Research Wales Ethics Committee (IRAS 196573). All procedures adhered to the tenets of the Declaration of Helsinki for research involving human subjects.

## Acknowledgments

The authors extend their thanks to Sarah Laidlaw, Sarah Hays, Ruth Proffitt, Ciara O’Sullivan and Emily March for their assistance in the clinic and thank the parents and children who participated.

## Footnotes

* Contributors All named authors contributed to trial design, recruitment and manuscript development and are accountable for the integrity of the study. LA led the study design and ethics submission. AJT undertook the statistical analysis. SP and DM were integral in recruitment and testing.
* Funding The development of the DigiVis web application was funded by an MRC Confidence in Concept grant from the University of Cambridge. This study was funded by Addenbrooke’s Charitable Trust.
* Competing interests LA is the inventor and developer of DigiVis and founding director of Cambridge Medical Innovations. An international patent application has been made by Cambridge Enterprise.
* Patient and public involvement Patients and/or the public were involved in the design, or conduct, or reporting, or dissemination plans of this research. Refer to the Methods section for further details.
* Provenance and peer review Not commissioned; externally peer reviewed.

This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: [http://creativecommons.org/licenses/by-nc/4.0/](http://creativecommons.org/licenses/by-nc/4.0/).

## References

1. BMA. Pressure points in the NHS. Available: [https://www.bma.org.uk/advice-and-support/nhs-delivery-and-workforce/pressures/pressure-points-in-the-nhs](https://www.bma.org.uk/advice-and-support/nhs-delivery-and-workforce/pressures/pressure-points-in-the-nhs)
2. NHS. Online version of the NHS Long Term Plan. Available: [https://www.longtermplan.nhs.uk/online-version/](https://www.longtermplan.nhs.uk/online-version/)
3. Steren BJ, Young B, Chow J. Visual acuity testing for telehealth using mobile applications. JAMA Ophthalmol 2021;139:344–7. doi:10.1001/jamaophthalmol.2020.6177
4. Kawamoto K, Stanojcic N, Li J-PO, et al. Visual acuity apps for rapid integration in teleconsultation services in all resource settings: a review. Asia Pac J Ophthalmol 2021;10:350–4. doi:10.1097/APO.0000000000000384
5. Painter S, Ramm L, Wadlow L, et al. Parental home vision testing of children during COVID-19 pandemic. Br Ir Orthopt J 2021;17:13. doi:10.22599/bioj.157
6. Bastawrous A, Rono HK, Livingstone IAT, et al. Development and validation of a smartphone-based visual acuity test (Peek Acuity) for clinical practice and community-based fieldwork. JAMA Ophthalmol 2015;133:930–7. doi:10.1001/jamaophthalmol.2015.1468
7. Dawkins A, Bjerre A. Do the near computerised and non-computerised crowded Kay picture tests produce the same measure of visual acuity? Br Ir Orthopt J 2018;13:22.
8. García-Pérez MA. Properties of some variants of adaptive staircases with fixed step sizes. Spat Vis 2002;15:303–21. doi:10.1163/15685680260174056
9. Koo TK, Li MY. A guideline of selecting and reporting intraclass correlation coefficients for reliability research. J Chiropr Med 2016;15:155–63. doi:10.1016/j.jcm.2016.02.012
10. Manny RE, Hussein M, Gwiazda J, et al. Repeatability of ETDRS visual acuity in children. Invest Ophthalmol Vis Sci 2003;44:3294–300. doi:10.1167/iovs.02-1199
11. Laidlaw DAH, Tailor V, Shah N, et al. Validation of a computerised logMAR visual acuity measurement system (COMPlog): comparison with ETDRS and the electronic ETDRS testing algorithm in adults and amblyopic children. Br J Ophthalmol 2008;92:241–4. doi:10.1136/bjo.2007.121715
12. Srinivasan K, Ramesh SV, Babu N, et al. Efficacy of a remote based computerised visual acuity measurement. Br J Ophthalmol 2012;96:987–90. doi:10.1136/bjophthalmol-2012-301751
13. Shah N, Laidlaw DAH, Rashid S, et al. Validation of printed and computerised crowded Kay picture logMAR tests against gold standard ETDRS acuity test chart measurements in adult and amblyopic paediatric subjects. Eye 2012;26:593–600. doi:10.1038/eye.2011.333
14. Thirunavukarasu AJ, Mullinger D, Rufus-Toye RM, et al. Clinical validation of a novel web-application for remote assessment of distance visual acuity. Eye 2021. doi:10.1038/s41433-021-01760-2. Epub ahead of print: 30 Aug 2021.
15. NHS England. Digital inclusion in healthcare. Available: [https://www.england.nhs.uk/ltphimenu/digital-inclusion/digital-inclusion-in-health-and-care/](https://www.england.nhs.uk/ltphimenu/digital-inclusion/digital-inclusion-in-health-and-care/)