Sam Altman OpenAI QStar is Still Secret and Sam Says Coding Will Go Away

Lex Fridman conducted another interview with Sam Altman of OpenAI.

The highlights:

Coding will go away. We will simply describe what we want to an AI, and it will write the code. (A rough sketch of that kind of workflow appears after the highlights below.)

Sam still will not talk about OpenAI Q-Star. Therefore, we know it is still real and a big breakthrough. We just do not know exactly what kind of breakthrough it is.

Hundreds of innovations come together in the big releases like GPT-4 and GPT-5.

UPDATE: I made a video that looks at key moments in the Sam Altman interview, highlighting and analyzing the critical statements.

Sam Altman says he used to have an overall map of the technology frontier that let him see valuable connections. [NOTE: Nextbigfuture tries to provide this technology frontier map for everyone.]

Sam disavows the claim that he is raising $7 trillion for AI chips.

Sam says we need a lot of AI. There will be so much AI demand that energy will be the limiting factor.

GPT-3.5 changed expectations for the world, but Sam does not think it changed the world. Sam thinks AGI needs to change the world economy and speed up the rate of scientific discovery.
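
For readers who want a concrete picture of the "describe what you want, get code back" workflow Sam describes, here is a minimal sketch. It assumes the OpenAI Python SDK (v1 or later) with an OPENAI_API_KEY set in the environment; the model name and prompt are illustrative only, and the generated code still needs human review.

# Minimal sketch of "describe what you want, AI writes the code".
# Assumes the OpenAI Python SDK (v1+) and OPENAI_API_KEY in the environment.
# The model name and prompt are illustrative, not a recommendation.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

request = "Write a Python function that removes duplicates from a list while preserving order."

response = client.chat.completions.create(
    model="gpt-4o",  # any capable code-generation model
    messages=[
        {"role": "system", "content": "You are a coding assistant. Return only code."},
        {"role": "user", "content": request},
    ],
)

print(response.choices[0].message.content)  # generated code, ready for human review

The point of the sketch is the division of labor: the human supplies the requirement in plain language, the model supplies the code, and someone still has to judge whether the result is correct and fast enough.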

5 thoughts on “Sam Altman OpenAI QStar is Still Secret and Sam Says Coding Will Go Away”

  1. We don’t know where all this is going, which doesn’t mean we can afford to slow down. It’s probably an existential requirement that we continue to press science and technology forward as quickly as we can.

    I am a coder myself and I fully agree that these tools, while capable of increasing productivity, have not yet come close to eliminating humans. I’m dealing with a vendor right now that thought they could use software to generate good software. Tearing into it and showing them why their code is slow and bad is an uphill battle, because they don’t know enough about it to really understand why it is bad. Fortunately, it is easy to demonstrate that it is slow.

    I do believe that the time will come when software is written by software, and the real job of human developers will be to define requirements, but that’s no trivial thing.

    I had a number of unqualified successes when I managed some of my earliest projects. After that, I only seemed to get sent to clients whose projects had already failed, usually after going through several project managers and massive unrecoverable expenses. In each case, I would always ask for the initial requirements right at the start. None of these places were ever able to produce anything like a requirements document, usually having only some nebulous emails about what they wanted, if anything.

    Again, software that can write good software without human intervention is coming, and it will be amazing, like Aladdin’s Genie. But you get what you wish for, so you had better be careful what you wish for and how you communicate it.

    People worry that the industrial revolution automated brawn, and that the current advances (I refer to them as cognitive automation) are automating brains. But they overlook the fact that work is more than just brains and brawn. Someone is required to figure out what that brawn and those brains should be applied to, and to provide the vision, direction, and motivation to get it done. People are still at the top and still will be.

    Consider: how would an AGI decide what to do by itself, without any assigned outcome? Unless it was an actual person produced purely in inorganic materials, it has no glands, hormones, or equivalents. It has no hunger or thirst, no desire for sex, and probably no psychological needs of any sort. You could direct it to come up with a list of all the things it could do, the goals it could work toward, perhaps with a filter to (hopefully) eliminate most undesirable behaviors, and then have it generate a random number to see what it will apply itself to. But that would be madness, not to mention unproductive. It would also doubtless be illegal (like taping down the cutoff bar on a self-propelled lawnmower and letting it go into a school yard). Law enforcement will have plenty of its own ‘tame’ artilects ready and able to shut such a thing down very quickly.

    None of which changes the fact that, with science and technology leaping ahead constantly, we are entering uncharted territory. But we are also riding a wild tiger and dare not dismount. The lives of billions of human beings probably hang on discoveries we have not even made yet, and technologies we have not yet implemented.

    • Communicating what you want to see is a talent in and of itself, and one that’s often overlooked. Many know what they want when they see it, but getting there requires someone who understands how to describe it through correct and creative communication. Cue a new term like AI whisperer or something… some humans will just have the innate ability to communicate with these machines more effectively than others.

  2. It’s still nothing more than a slight improvement on last year’s code completion, as far as I have been able to use it.

  3. Feels kind of like the trough of disillusionment is coming up soon.

    I mean, I get why Altman hypes up AI. I get why people who don’t write code buy into the hype of AI.

    • Sure, but it will only last months, not years. The next peak of inflated expectations will immediately follow, and what AI professionals in 2020 thought might take 50 years will seem quaint and easy in 2026.
