AI for diagnosing mental health
In five years, cognitive computers will analyze a patient's speech or written words for tell-tale indicators in language, including meaning, syntax and intonation. Combining these measurements with data from wearable devices and imaging systems (MRIs and EEGs) can paint a more complete picture of the individual, helping health professionals better identify, understand and treat the underlying disease, whether Parkinson's, Alzheimer's, Huntington's disease or PTSD, or even neurodevelopmental conditions such as autism and ADHD.
What were once invisible signs will become clear signals of a patient's likelihood of entering a certain mental state, or of how well a treatment plan is working, complementing regular clinical visits with daily assessments from the comfort of home.
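To make the idea concrete, here is a minimal sketch of the kind of linguistic markers such a system might start from. The specific features (lexical diversity, sentence length, first-person pronoun rate) and the function name are illustrative assumptions, not any actual diagnostic method; real systems would use far richer models of meaning, syntax and intonation.

```python
import re

def linguistic_features(transcript: str) -> dict:
    """Compute a few coarse, illustrative language markers from raw text."""
    sentences = [s for s in re.split(r"[.!?]+", transcript) if s.strip()]
    words = re.findall(r"[a-z']+", transcript.lower())
    first_person = {"i", "me", "my", "mine", "myself"}
    if not words or not sentences:
        return {"type_token_ratio": 0.0, "mean_sentence_len": 0.0,
                "first_person_rate": 0.0}
    return {
        # Lexical diversity: a shrinking active vocabulary can accompany decline.
        "type_token_ratio": len(set(words)) / len(words),
        # Mean sentence length: a rough proxy for syntactic complexity.
        "mean_sentence_len": len(words) / len(sentences),
        # Self-focus: first-person pronoun rate, studied as a language marker.
        "first_person_rate": sum(w in first_person for w in words) / len(words),
    }

feats = linguistic_features("I went out today. I felt a bit better than yesterday.")
print(feats)
```

Tracked over many daily samples rather than a single utterance, even crude features like these could reveal the gradual trends the passage describes.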
Hyperimaging and AI will give us superhero vision
In five years, new imaging devices using hyperimaging technology and AI will help us see well beyond the domain of visible light, combining multiple bands of the electromagnetic spectrum to reveal valuable insights or potential dangers that would otherwise be hidden from view. Most importantly, these devices will be portable, affordable and accessible, so superhero vision can be part of our everyday experiences.
Today, more than 99.9 percent of the electromagnetic spectrum cannot be observed by the naked eye. Over the last 100 years, scientists have built instruments that can emit and sense energy at different wavelengths, and we now rely on some of them to take medical images of the body, find a cavity inside a tooth, check bags at the airport, or land a plane in fog. However, these instruments are highly specialized and expensive, and each sees only a specific portion of the electromagnetic spectrum.
In five years, a view of the invisible or vaguely visible physical phenomena all around us could help make road and traffic conditions clearer for drivers and self-driving cars. For example, using millimeter wave imaging, a camera and other sensors, hyperimaging technology could help a car see through fog or rain, detect hazardous and hard-to-see road conditions such as black ice, or tell us whether there is an object up ahead, along with its distance and size. Cognitive computing technologies will reason about this data and recognize what might be a tipped-over garbage can versus a deer crossing the road, or a pothole that could result in a flat tire.
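The fusion-and-reasoning step above can be sketched with a toy rule-based classifier. The sensor fields, thresholds and labels below are all invented for illustration; a real system would fuse raw imaging data with learned models rather than hand-written rules.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SensorFrame:
    """One fused snapshot from millimeter-wave radar and a road-surface sensor."""
    radar_range_m: Optional[float]   # distance to nearest detected object, None if clear
    radar_size_m: Optional[float]    # apparent object size, if any
    surface_temp_c: float            # road surface temperature
    surface_reflectivity: float      # 0..1; high values on a cold road suggest ice

def classify_hazard(frame: SensorFrame) -> str:
    """A very small rule-based stand-in for the cognitive reasoning step."""
    # Cold, highly reflective pavement is a classic black-ice signature.
    if frame.surface_temp_c <= 0 and frame.surface_reflectivity > 0.7:
        return "possible black ice"
    if frame.radar_range_m is not None:
        if frame.radar_size_m and frame.radar_size_m > 1.0:
            return f"large object ahead at {frame.radar_range_m:.0f} m"
        return f"small object ahead at {frame.radar_range_m:.0f} m"
    return "clear"

print(classify_hazard(SensorFrame(None, None, -2.0, 0.9)))
print(classify_hazard(SensorFrame(40.0, 1.5, 10.0, 0.2)))
```

The point of the sketch is the shape of the pipeline: sensors that see what cameras cannot, plus a reasoning layer that turns raw returns into an actionable label.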
Macroscopes will help us understand Earth's complexity in infinite detail
In five years, we will use machine-learning algorithms and software to organize information about the physical world, bringing the vast, complex data gathered by billions of devices within the range of our vision and understanding. We call this a "macroscope": unlike the microscope, which sees the very small, or the telescope, which sees the very distant, it is a system of software and algorithms that brings all of Earth's complex data together and analyzes it by space and time for meaning.
Medical labs “on a chip” will serve as health detectives for tracing disease at the nanoscale
In five years, new medical labs on a chip will serve as nanotechnology health detectives, tracing invisible clues in our bodily fluids and letting us know immediately if we have reason to see a doctor. The goal is to shrink onto a single silicon chip all of the processes necessary to analyze a disease that would normally be carried out in a full-scale biochemistry lab.
Smart sensors will detect environmental pollution at the speed of light
In five years, new, affordable sensing technologies deployed near natural gas extraction wells, around storage facilities, and along distribution pipelines will enable the industry to pinpoint invisible leaks in real time.
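One simple way such a sensor network could flag a leak is baseline anomaly detection: alert when a reading deviates sharply from that sensor's recent history. The z-score threshold and the ppm values below are illustrative assumptions, not parameters of any deployed system.

```python
from statistics import mean, stdev

def leak_alert(history_ppm, latest_ppm, z_threshold=3.0):
    """Return True if the latest methane reading is an outlier vs. the baseline."""
    mu, sigma = mean(history_ppm), stdev(history_ppm)
    if sigma == 0:
        return latest_ppm > mu  # flat baseline: any rise is suspicious
    # Standard z-score test: how many baseline standard deviations above normal?
    return (latest_ppm - mu) / sigma > z_threshold

baseline = [1.9, 2.0, 2.1, 2.0, 1.9, 2.1, 2.0, 2.0]
print(leak_alert(baseline, 2.1))  # within normal fluctuation
print(leak_alert(baseline, 9.5))  # far above baseline
```

In practice, correlating alerts across many cheap sensors along a pipeline is what would let operators pinpoint a leak's location rather than just its existence.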