Generative AI Model Overview March 2023

Alan Thompson provides detailed tracking of generative AI models and their specifications.

There are well over 100 major generative AI models. The top ten or so fall in the 100–1000 GB size range and the 100 billion to 1 trillion parameter range.

3 thoughts on “Generative AI Model Overview March 2023”

  1. Something worth noticing is that while all the small-ish publicly known LLMs aren’t capable of recursive self-improvement (criticizing their own output to make it better), GPT-4 seems to be capable of it already. Maybe PaLM or another secret project is too, but we just don’t know.

    This also applies to their use in multi-agent systems, where the LLMs concoct a plan, use tools, and see the effects of their actions on the world, generating new objectives in pursuit of a bigger one.

    At this point, it seems the gap between them and AGI is more a matter of internal organization, the ability to iterate on their own output, and/or the ability to set their own sub-goals.

  2. Can’t put the genie back in the bottle. That’s every major technological development ever.

    I look at the picture/diagram for this article and wonder if they will soon be gossiping with each other–and would we be crushed to learn they were, and it wasn’t even about us?

  3. This is why there will be no “pause” in AI development. There is simply too much competition in the field.
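The self-critique loop described in comment 1 can be sketched in a few lines. This is a minimal, hypothetical illustration, not anyone's actual system: `generate` and `critique` stand in for real LLM API calls and are replaced here with toy functions so the loop runs end to end.

```python
# Sketch of a "criticize its own output" refinement loop, as described in
# comment 1 above. The two model functions are hypothetical stand-ins:
# in practice each would be a call to an LLM API.

def generate(prompt: str, feedback: str = "") -> str:
    # Toy stand-in for a model call: append the feedback so that each
    # round produces a (trivially) "improved" draft.
    return (prompt + " " + feedback).strip()

def critique(draft: str) -> str:
    # Toy stand-in for a critic call: return an empty critique once the
    # draft is deemed good enough (here: once it contains "revised").
    return "" if "revised" in draft else "revised"

def self_refine(prompt: str, max_rounds: int = 3) -> str:
    draft = generate(prompt)
    for _ in range(max_rounds):
        feedback = critique(draft)          # the model criticizes its own output
        if not feedback:                    # empty critique: stop iterating
            break
        draft = generate(prompt, feedback)  # regenerate, conditioned on the critique
    return draft

print(self_refine("write a summary"))  # → "write a summary revised"
```

The key design point the commenter raises is the outer loop itself: smaller models tend to rubber-stamp their own drafts, while a model whose critiques actually change the next draft gets measurable improvement from iteration.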

Comments are closed.