Over the last few years, developments in artificial intelligence (AI) have been rapid. Consequently, the technology has been adopted by the world's biggest tech companies, including legacy organizations such as Adobe, Meta and Microsoft. Applications of AI have multiplied, from advanced driver-assistance systems (ADAS) that make driving easier to AI chatbots that can generate content on demand. Now, Google DeepMind researchers are applying AI to one more field - medical science. With their latest innovation, they have unveiled an AI assistant that can help doctors diagnose patients.
In a research paper published on arXiv, Google DeepMind researchers have introduced a new AI model called Articulate Medical Intelligence Explorer, or AMIE.
Google researchers say that AMIE is a research AI system based on a Large Language Model (LLM) and optimized for diagnostic reasoning and conversations. To scale AMIE's knowledge and capabilities across various medical conditions and contexts, the researchers designed a self-play-based simulated learning environment with automated feedback mechanisms for diagnostic medical dialogue in a virtual care setting.
AMIE is trained on real-world datasets covering medical reasoning, medical summarization and clinical conversations. It aims to improve medical diagnostics by enhancing the quality of medical conversations: it can ask clarifying questions to refine a diagnosis while still maintaining empathy.
In an evaluation study carried out by Google, AMIE demonstrated not only greater diagnostic accuracy but also better conversation quality than primary care physicians (PCPs). Google says, “AMIE had greater diagnostic accuracy and superior performance for 28 of 32 axes from the perspective of specialist physicians, and 24 of 26 axes from the perspective of patient actors”.
For now, AMIE remains a research venture rather than a product. Google describes it as “our exploration of the ‘art of the possible’”.