Remote monitoring of health has the potential to reduce the burden of face-to-face appointments on patients and make healthcare more efficient. Apps are available for patients to self-monitor their vision at home, for example, to detect reactivation of age-related macular degeneration (AMD). One objective of the MONARCH study, which evaluated two vision-monitoring apps on an iPod Touch (Multibit and MyVisionTrack), was to describe the challenges of implementing apps for self-monitoring of vision at home.
Design: Diagnostic test accuracy study. Setting: Six UK hospitals.
The study provides an example of the real-world implementation of such apps across health sectors in an older population. Challenges described include the following: (1) frequency and reason for incoming calls made to a helpline and outgoing calls made to participants; (2) frequency and duration of events responsible for the tests being unavailable; and (3) other technical and logistical challenges.
Patients (n=297) in the study were familiar with technology: 252/296 (85%) had internet access at home and 197/296 (67%) had used a smartphone. Nevertheless, 141 (46%) called the study helpline, more often than anticipated. Of 435 reasons for calling, all but 42 (10%) related to testing with the apps or hardware, which contributed to reduced adherence. The team made at least one call to 133 patients (44%) to investigate why data had not been transmitted. The Multibit and MyVisionTrack apps were unavailable for 15 and 30 of 1318 testing days, respectively, for reasons that were the responsibility of the app providers. Researchers also experienced technical challenges with a multiple-device management system. Logistical challenges included regulations for transporting lithium-ion batteries and malfunctioning chargers.
Implementation of similar technologies should incorporate a well-resourced helpline and build in additional training time for participants and troubleshooting time for staff. There should also be robust evidence that chosen technologies are fit for the intended purpose.
To test an online training course for non-ophthalmic diabetic retinopathy (DR) graders for recognition of glaucomatous optic nerves in Vietnam.
This was an uncontrolled, experimental, before-and-after study in which 43 non-ophthalmic DR graders underwent baseline testing on a standard image set, completed a self-paced, online training course and were retested using the same photographs presented in random order. Twenty-nine local ophthalmologists completed the same test without the training course. DR graders then underwent additional one-to-one training by a glaucoma specialist and were retested. Test performance (% correct, compared with consensus grades from four fellowship-trained glaucoma experts), sensitivity, specificity, positive and negative predictive values, and area under the receiver operating characteristic curve (AUC) were computed.
Mean age of DR graders (32.6±5.5 years) did not differ from that of ophthalmologists (32.3±7.3 years, p=0.13). Online training required a mean of 297.9 (SD 144.6) minutes. Graders' mean baseline score (33.3%±14.3%) improved significantly after training (55.8%±12.6%, p
Non-ophthalmic DR graders can be trained to recognise glaucoma using a short online course in this setting, with no additional benefit from more expensive one-to-one training. After the 5-hour online training in recognising glaucomatous optic nerve heads, the scores of non-ophthalmic DR graders doubled and did not differ from those of local ophthalmologists. Intensive one-to-one training did not further improve performance.