The CEO of Figure AI calls his humanoid robot's all-neural-network learning of a coffee-making task a ChatGPT moment.
They are working with Fortune 50 companies.
we just had an AI breakthrough in our lab
robotics is about to have its ChatGPT moment
and that moment is happening tomorrow
— Brett Adcock (@adcock_brett) January 7, 2024
Figure-01 has learned to make coffee ☕️
Our AI learned this after watching humans make coffee
This is end-to-end AI: our neural networks are taking video in, trajectories out
Join us to train our robot fleet: https://t.co/egQy3iz3Ky pic.twitter.com/Y0ksEoHZsW
— Brett Adcock (@adcock_brett) January 7, 2024
Incredibly excited to share some recent progress 🙂
What you see in the video:
⁃A learned, end-to-end visuomotor policy mapping onboard images to low-level actions at 200 Hz.
⁃All behaviors (including corrective) are fully autonomous (not teleoperated).
⁃1x speed.
— Corey Lynch (@coreylynch) January 7, 2024
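The "images in, trajectories out" pipeline the tweets describe can be sketched in a few lines. Figure has not published its architecture, so everything here is illustrative: the feature size, joint count, and toy weights are assumptions standing in for a trained network, and the loop only shows what a fixed-rate 200 Hz policy cycle looks like.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions -- Figure has not disclosed theirs.
IMG_FEATURES = 512   # encoded camera frame
NUM_JOINTS = 24      # actuated degrees of freedom (assumed)
HIDDEN = 256

# Toy weights standing in for a trained visuomotor network.
W1 = rng.standard_normal((IMG_FEATURES, HIDDEN)) * 0.01
W2 = rng.standard_normal((HIDDEN, NUM_JOINTS)) * 0.01

def policy(image_features: np.ndarray) -> np.ndarray:
    """Map an encoded onboard image directly to low-level joint actions."""
    h = np.tanh(image_features @ W1)
    return np.tanh(h @ W2)  # tanh bounds each action to [-1, 1]

def control_loop(n_steps: int, hz: int = 200) -> list[np.ndarray]:
    """Run the policy at a fixed rate; 200 Hz means one action every 5 ms."""
    dt = 1.0 / hz  # 0.005 s budget per cycle
    actions = []
    for _ in range(n_steps):
        frame = rng.standard_normal(IMG_FEATURES)  # stand-in for a camera encoding
        actions.append(policy(frame))
        # a real controller would sleep/spin here to hold the 5 ms period
    return actions

actions = control_loop(5)
```

The point of "end-to-end" is that there is no hand-written grasp planner or trajectory library in the middle: one network, trained on human demonstration video, produces the motor commands directly.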
Why is this so important?
the reason why this is so groundbreaking is if you can get human data for an application (making coffee, folding laundry, warehouse work, etc)
you can then train an AI system end-to-end on Figure 01
there is a path to scale to every use case
and…
— Brett Adcock (@adcock_brett) January 7, 2024
costs will collapse with volume manufacturing
we’ll have a robot for every human in our lifetime
— Brett Adcock (@adcock_brett) January 7, 2024
Brian Wang is a Futurist Thought Leader and a popular science blogger with 1 million readers per month. His blog Nextbigfuture.com is ranked the #1 Science News Blog. It covers many disruptive technologies and trends including Space, Robotics, Artificial Intelligence, Medicine, Anti-aging Biotechnology, and Nanotechnology.
Known for identifying cutting edge technologies, he is currently a Co-Founder of a startup and fundraiser for high potential early-stage companies. He is the Head of Research for Allocations for deep technology investments and an Angel Investor at Space Angels.
A frequent speaker at corporations, he has been a TEDx speaker, a Singularity University speaker and guest at numerous interviews for radio and podcasts. He is open to public speaking and advising engagements.
The ChatGPT "moment" was millions of people using it and verifying their experience for themselves.
THIS?
You have no way of knowing whether it is real, how many takes it required, or how much human input was involved. Boston Dynamics has shown dexterous robots doing backflips without any "moment"; we know much of Boston Dynamics' work is scripted and takes many attempts, so how do we know for sure this is not the same?
People do not trust such demos because so many past claims have fallen short. I don't think robotics will have its "moment" until a robot can prove, in full in-the-wild testing, that it can do the things its maker claims it can.
The coffee machine made the coffee; the robot just pressed a button and added some extra motions.
I agree completely! This year people will be amazed by robots in the same way they were wowed by Midjourney and ChatGPT last year. Here's a link to a stunning video by Agility Robotics asking their robot Digit to sort items, using an LLM as the locomotion control driver for the robot. This was impossible just a few months ago.
https://www.youtube.com/watch?v=CnkM0AecxYA
This video is of Digit walking around different environments in Berkeley. The group is called Hybrid Robotics at Berkeley. They have already posted the LLM code for it on GitHub!
https://www.youtube.com/watch?v=eFoBfFhwo18
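The pattern those demos describe is simple to sketch: an LLM turns a natural-language request into a structured command that a low-level locomotion controller executes. The stub below is a minimal illustration, not the Berkeley code; the `fake_llm` lookup, command names, and fields are all invented for the example.

```python
import json

def fake_llm(prompt: str) -> str:
    """Stand-in for a real LLM call; returns a JSON locomotion command."""
    if "sort" in prompt:
        return json.dumps({"action": "walk_to", "target": "bin_a", "speed": 0.5})
    return json.dumps({"action": "stand", "target": None, "speed": 0.0})

def drive_robot(prompt: str) -> dict:
    """Parse the LLM's structured output and validate it before dispatch."""
    cmd = json.loads(fake_llm(prompt))
    # Reject anything outside the controller's known command set --
    # the safety-critical step when an LLM sits in the loop.
    if cmd["action"] not in {"walk_to", "stand"}:
        raise ValueError(f"unknown command: {cmd['action']}")
    return cmd

cmd = drive_robot("sort the items into bin A")
```

The design point is that the LLM never drives motors directly; it emits a constrained command vocabulary that the existing controller already knows how to execute safely.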
LLMs are going to absolutely turbocharge the speed at which robots are deployed in the real world doing real-world jobs. LLMs are the breakthrough tech in robotics everyone has been waiting for! By the end of this year I believe we will be watching absolutely mind-blowing videos of multiple humanoid robots doing incredibly complex tasks from a simple voice prompt. LLMs will cause autonomous humanoid robots to arrive much faster than anyone predicted.
I'd be more impressed if it weren't a Keurig… Ten hours to learn how to use one of those is only impressive compared to prior robot performance.