Next stages of AI

DARPA has said that there have been three waves of AI.

1. The first wave was handcrafted knowledge. It is still hot, still relevant, and still important.

2. The second wave, which is now very much in the mainstream for things like face recognition, is statistical learning, where we build systems that are trained on data. But those two waves by themselves are not going to be sufficient; we see the need to bring them together.

3. The third wave of AI technology is built around the concept of contextual adaptation, enabling the automated creation of contextual models. Current AI is brittle and performs categorization and pattern recognition without understanding or context.

There are high-value opportunities for working around this brittleness: architecting an environment so that only limited context is needed, or using Internet of Things sensors and cameras to provide data that supplies the context. Crowd resources can also be used for critical error checking. Google Translate and the recommendation systems of Facebook, Google, Yelp, and Amazon do this by letting customers tell the AI when a result is poor.
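
As a rough illustration of this kind of crowd-sourced error checking, the sketch below (plain Python with hypothetical class and method names, not any vendor's actual API) collects results that users flag as poor and queues them for later review or retraining.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Prediction:
    """One AI output shown to a user, e.g. a translation or a recommendation."""
    input_data: str
    output_data: str

@dataclass
class FeedbackQueue:
    """Collects results that customers flag as poor so they can be reviewed
    and folded into the next round of training data."""
    flagged: List[Prediction] = field(default_factory=list)

    def report_poor_result(self, prediction: Prediction) -> None:
        # Called when a customer clicks "this result is wrong".
        self.flagged.append(prediction)

    def export_for_retraining(self) -> List[Prediction]:
        # Hand the accumulated flagged examples to the training pipeline.
        batch, self.flagged = self.flagged, []
        return batch

# Usage: a user flags a bad machine-translation result.
queue = FeedbackQueue()
queue.report_poor_result(Prediction(input_data="gato", output_data="dog"))
print(queue.export_for_retraining())
```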

Making environments where AI and robots can succeed despite their limitations has already been done in warehouses, and it is being done in factories.

Over the past few years, the second wave of AI has provided cars that drive themselves, machines that accurately recognize images and speech, and computers that beat the most brilliant human players of complex games like Go. These advances stem from breakthroughs in a particular branch of AI: adaptive machine learning.

* Systems learn behaviors and patterns by being trained on massive amounts of data:

1. Start with a lot of digital data
2. Detect patterns in the data
3. Connect and match many patterns, such as correlating genomes with dozens of other biological variables (a minimal sketch of this workflow follows the list).
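
As a minimal sketch of those three steps, assuming scikit-learn and NumPy, and using made-up synthetic data in place of real genomic measurements:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# 1. Start with a lot of digital data (synthetic stand-in for biological variables).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))                  # 1000 samples, 20 measured variables
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)    # outcome driven by a hidden pattern

# 2. Detect patterns in the data by training a statistical model on it.
model = LogisticRegression().fit(X, y)

# 3. Connect and match many patterns, e.g. find which variables correlate
#    most strongly with the outcome.
corr = np.corrcoef(np.column_stack([X, y]), rowvar=False)[-1, :-1]
print("variables most correlated with the outcome:", np.argsort(-np.abs(corr))[:3])
print("training accuracy of the learned model:", model.score(X, y))
```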

Tens of billions of dollars of effort per year can go toward gathering far more data.

In genomics and medicine, this means sequencing more genomes and mapping the data to medical records and individual microbiomes.
In manufacturing, more data can be gathered on every product, machine, material, process, journal article, and so on.

Speeding up development – modularization, reusing components, and reapplying training

Modularizing AI and creating marketplaces for AI services will be part of the effort to speed up AI development.
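
One concrete form of reusing components and reapplying training is transfer learning: take a network that was trained elsewhere, keep its learned features, and retrain only a small new head for the task at hand. A minimal sketch, assuming PyTorch and a recent torchvision are available (the 5-class task is made up for illustration):

```python
import torch
import torch.nn as nn
from torchvision import models

# Reuse a component: a ResNet-18 backbone pretrained on ImageNet.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in backbone.parameters():
    param.requires_grad = False          # keep the pretrained features fixed

# Reapply the training: swap in a new final layer for a new 5-class task.
backbone.fc = nn.Linear(backbone.fc.in_features, 5)

# Only the new head needs to be trained, which is far faster than training
# the whole network from scratch.
optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)
```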

SingularityNET and others will create protocols and standardization to get parts to work together.
There will be efforts to enable AI systems to generalize learning and to work in more situations. This also feeds into the DARPA work to create context models.
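
A rough way to picture the modularization and standardization idea is a shared interface that lets independently built AI services be chained together or swapped like parts from a marketplace. The sketch below is purely illustrative, with hypothetical service names, and is not SingularityNET's actual protocol:

```python
from typing import List, Protocol

class AIService(Protocol):
    """A standardized interface that every marketplace component agrees to."""
    def run(self, payload: dict) -> dict: ...

class SpeechToText:
    def run(self, payload: dict) -> dict:
        # Placeholder: a real service would transcribe payload["audio"].
        return {"text": "turn left at the next intersection"}

class Translator:
    def run(self, payload: dict) -> dict:
        # Placeholder: a real service would translate payload["text"].
        return {"text": "gire a la izquierda en la próxima intersección"}

def pipeline(services: List[AIService], payload: dict) -> dict:
    # Because every service speaks the same protocol, they compose freely.
    for service in services:
        payload = service.run(payload)
    return payload

print(pipeline([SpeechToText(), Translator()], {"audio": b"..."}))
```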

AI development will get faster, and through these various means AI will be applied to many more situations.