Today was an interesting day for artificial intelligence developments. New research reveals that AI helped a stroke patient who lost the ability to speak. The technology converted the patient's brain signals into words. In other news, MediaTek announced today that it is leveraging Meta's Llama 2 large language model to build an edge computing system that does not require cloud computing for heavy processing. This and more in today's AI roundup. Let us take a closer look.
According to a New York Times report, a 30-year-old woman who suffered a cataclysmic stroke that paralyzed her and took away her ability to speak has recently spoken once again with the power of AI. In research published today in the journal Nature, scientists demonstrated that the first words to come out of her since the stroke were produced by synthesizing her brain waves and converting them into words through a complex algorithm powered by artificial intelligence.
“What's quite exciting is that just from the surface of the brain, the investigators were able to get out pretty good information about these different features of communication,” Dr. Parag Patil, a neurosurgeon and biomedical engineer at the University of Michigan who was asked by Nature to review the study before publication, told NYT.
MediaTek today announced that it is leveraging Meta's Llama 2 large language model to build a complete edge computing ecosystem designed to accelerate AI application development on smartphones, IoT devices, vehicles, smart home systems, and other edge devices. MediaTek's use of Llama 2 models will enable generative AI applications to run directly on-device instead of entirely through cloud computing, which the company said provides several advantages to developers and users.
Read more on tech.hindustantimes.com