At its second “The Check Up” event, Google presented new areas of AI-related health research and development, including detecting diabetes-related conditions from external eye photos and using smartphone microphones to record and interpret heart sounds.
Google Health’s recent research demonstrates how a deep learning model can extract potentially useful biomarkers from external eye photos taken with the tabletop cameras already used in clinics, in order to detect diabetes-related conditions. Google is now planning clinical research with partners such as EyePACS and Chang Gung Memorial Hospital (CGMH) to determine whether external eye photos taken with smartphone cameras can also help detect diabetic and non-diabetic conditions.
Google’s researchers are also exploring how smartphone microphones, when placed over the chest, could be used to record and interpret heart sounds. In addition, Google Health is pursuing open-access foundational research to evaluate how artificial intelligence can assist doctors in performing and assessing ultrasounds.
Google will continue to develop and test these models in collaboration with Northwestern Medicine to make them more generalizable across different levels of operator experience and equipment.
“Access to appropriate healthcare can be difficult depending on where people live and whether local providers have the specialized equipment or expertise for tasks such as disease screening. To help, Google Health has expanded its research and applications to focus on improving clinicians’ care and enabling treatment to take place outside of hospitals and doctor’s offices,” said Greg Corrado, Google’s Head of Health AI.