Talking with AI Characters in Realtime is Game-Changing

Matt Wolfe interviewed the CEO of Convai, the company behind the viral NVIDIA Computex game demo. They talk about how the AI character conversation tool works, how to create characters, how to use it yourself, and they demo it. AI will enable unique conversations and make each playthrough different for every player. …

Read more

The New AI As Continuation of Longer Predictive Modeling Trend

Friedberg on the All-In Podcast discusses how large language model AI extends statistical inference to improve prediction. LLMs have applied what was being done in other domains and industries to language. Humans interact through language, so this is a very important area for computers and AI to make a big impact. …

Read more

Tree of Thoughts Improves AI Reasoning and Logic By Nine Times

Language models are increasingly being deployed for general problem solving across a wide range of tasks, but are still confined to token-level, left-to-right decision-making processes during inference. This means they can fall short in tasks that require exploration, strategic lookahead, or where initial decisions play a pivotal role. DeepMind researchers introduce a new framework for …
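To make the contrast with ordinary left-to-right decoding concrete, here is a minimal, illustrative Python sketch of a breadth-first search over intermediate "thoughts". The propose_thoughts and score_state functions are hypothetical stand-ins for model calls, not the paper's actual code; the point is that several partial solutions stay alive at once and weak branches get pruned.

```python
# Illustrative breadth-first Tree-of-Thoughts-style search (not the paper's code).
# propose_thoughts() and score_state() are hypothetical stand-ins for LLM calls.

def propose_thoughts(state, k=3):
    """Ask the model for k candidate next reasoning steps from this state."""
    return [f"{state} -> step{i}" for i in range(k)]  # placeholder

def score_state(state):
    """Ask the model (or a heuristic) how promising a partial solution looks."""
    return len(state)  # placeholder value function

def tree_of_thoughts(problem, depth=3, beam_width=2, branch=3):
    frontier = [problem]
    for _ in range(depth):
        # Expand every kept state with several candidate thoughts...
        candidates = [t for s in frontier for t in propose_thoughts(s, branch)]
        # ...then keep only the most promising ones (the lookahead/pruning step).
        candidates.sort(key=score_state, reverse=True)
        frontier = candidates[:beam_width]
    return frontier[0]

print(tree_of_thoughts("Game of 24: 4 9 10 13"))
```

Plain autoregressive decoding commits to one token at a time; a search like this can back out of a dead end because alternative partial solutions are still on the frontier.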

Read more

AI Scaling Laws Guide Building to Superhuman Level AI

Scaling laws are as important to artificial intelligence (AI) as the law of gravity is in the world around us. Cerebras makes wafer-scale chips that are optimized for AI. Cerebras wafer chips can host large language models (LLMs). They are using open-source data that can be reproduced by developers across the world. James Wang was …
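For a feel of what a scaling law actually predicts, the sketch below evaluates a Chinchilla-style power law, loss = E + A/N^alpha + B/D^beta, where N is the parameter count and D is the number of training tokens. The constants are ballpark illustrations in the spirit of published fits, not Cerebras figures; treat the numbers as a sketch.

```python
# Illustrative Chinchilla-style scaling law: loss falls predictably as
# parameters (N) and training tokens (D) grow. Constants are ballpark
# illustrations, not anyone's exact fitted values.

def predicted_loss(n_params, n_tokens,
                   E=1.7, A=406.0, B=411.0, alpha=0.34, beta=0.28):
    return E + A / n_params**alpha + B / n_tokens**beta

for n, d in [(1e9, 20e9), (10e9, 200e9), (100e9, 2e12)]:
    print(f"N={n:.0e} params, D={d:.0e} tokens -> loss ~ {predicted_loss(n, d):.2f}")
```

The practical point is that a curve like this lets teams predict the loss of a much larger training run from a handful of small, cheap runs.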

Read more

More AI Breakthroughs and Reaching AGI

Google released PaLM 2, a large language model that is competitive with OpenAI’s GPT-4. Google also announced that it is already training Gemini, a GPT-5 competitor. Gemini likely uses TPU v5 chips. Benchmarks for PaLM 2 beat GPT-4. They use SmartGPT-like techniques to boost performance. PaLM 2 beats even Google Translate, due …
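"SmartGPT-like techniques" are prompting tricks that squeeze more out of an existing model: sample several step-by-step drafts, have the model critique them, then resolve to one final answer. The sketch below is a generic illustration of that pattern; ask_llm is a hypothetical stand-in for whatever model API is used, not anything from Google's reports.

```python
# Generic "sample, critique, resolve" prompting loop in the spirit of
# SmartGPT-style techniques. ask_llm() is a hypothetical model call.

def ask_llm(prompt: str) -> str:
    raise NotImplementedError("wire this up to your model API of choice")

def smart_answer(question: str, n_drafts: int = 3) -> str:
    # 1. Sample several step-by-step drafts instead of one quick answer.
    drafts = [ask_llm(f"{question}\nLet's work this out step by step.")
              for _ in range(n_drafts)]
    # 2. Have the model point out flaws in each draft.
    critiques = [ask_llm(f"List any errors in this answer:\n{d}") for d in drafts]
    # 3. Ask a final "resolver" pass to combine drafts and critiques.
    joined = "\n\n".join(f"Draft {i+1}:\n{d}\nCritique:\n{c}"
                         for i, (d, c) in enumerate(zip(drafts, critiques)))
    return ask_llm(f"Question: {question}\n\n{joined}\n\n"
                   "Produce the single best final answer.")
```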

Read more

Google Announces PaLM 2 and Bard as New GPT-4 Competitors at Google I/O 2023

Google’s CEO has announced that Google will be an AI-first company. They are adding generative AI to all of their products, including Search, Maps, Gmail and Workspace. Google announced PaLM 2, which powers an improved Bard, as a new GPT-4 competitor. PaLM 2 is integrated into 25 Google products. Bard can handle over 20 programming languages. …

Read more

Scaling Transformers to Over 2 Million Tokens With RMT

Recurrent Memory Transformer retains information across up to 2 million tokens. Applying Transformers to long texts does not necessarily require large amounts of memory. By employing a recurrent approach and memory, the quadratic complexity can be reduced to linear. Models trained on sufficiently large inputs can extrapolate their abilities to texts orders of …
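The mechanism behind the linear cost is simple: cut the long input into fixed-size segments and carry a small block of memory tokens from one segment to the next, so each attention call only ever sees one segment plus the memory. Here is a schematic PyTorch-flavored sketch; the layer choices and shapes are illustrative, not the RMT authors' code.

```python
# Schematic recurrent-memory loop: process a long sequence segment by
# segment, prepending memory tokens whose outputs carry state forward.
# Shapes and layer choices are illustrative, not the RMT authors' code.
import torch
import torch.nn as nn

d_model, n_mem, seg_len = 64, 8, 128
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
    num_layers=2,
)
memory = torch.zeros(1, n_mem, d_model)              # learned in the real model
long_input = torch.randn(1, 10 * seg_len, d_model)   # stand-in for embeddings

outputs = []
for start in range(0, long_input.size(1), seg_len):
    segment = long_input[:, start:start + seg_len]
    # Attention only ever sees (n_mem + seg_len) tokens, so the cost per
    # segment is constant and total cost scales linearly with input length.
    hidden = encoder(torch.cat([memory, segment], dim=1))
    memory, seg_out = hidden[:, :n_mem], hidden[:, n_mem:]
    outputs.append(seg_out)

full_output = torch.cat(outputs, dim=1)
print(full_output.shape)  # torch.Size([1, 1280, 64])
```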

Read more

Emergence and Reasoning in Large Language Models

Emergent capabilities are abilities that are not present in smaller models but appear in larger models. This is discussed in the video below by Jason Wei, a Google AI researcher. A prior article discussed Alan Thompson's summary of which capabilities emerged at what scale for large language models. …

Read more

Timeline of Open and Proprietary Large Language Models

Here is a timeline of open and proprietary large language models. …

Read more