$3 Million Breakthrough Prize for the Influential Supergravity Theory

The $3 Million Breakthrough Prize in Fundamental Physics was shared between theorists Sergio Ferrara (CERN), Daniel Z. Freedman (Massachusetts Institute of Technology and Stanford University), and Peter van Nieuwenhuizen (Stony Brook University). The three are being honored for “the invention of supergravity, in which quantum variables are part of the description of the geometry of spacetime.”

Supergravity

Ferrara, Freedman and van Nieuwenhuizen are the architects of supergravity, a highly influential 1976 theory that successfully integrated the force of gravity into a particular kind of quantum field theory (a theory that describes the fundamental particles and forces of nature in terms of fields embodying the laws of quantum mechanics).

The 1960s and early ’70s saw the construction of the Standard Model, a quantum field theory that still remains the most precisely verified theory in physics, and whose achievements include the prediction of the Higgs boson. However, it was clear that the Standard Model was not complete. In particular, it described only three of the forces of nature: it left out gravity, which was the domain of Einstein’s theory of general relativity. It also retained some major puzzles, including masses of particles that were many orders of magnitude below their expected values, and the lack of any particle that could explain dark matter, the invisible substance that pervades the Universe.

Then in 1973, physicists developed a principle, “supersymmetry,” which extended the Standard Model to include a new family of particles. Supersymmetry postulated that each of the known particles had an unseen “partner:” the fermions (such as electrons and quarks, which make up matter) had bosons (force-carrying particles) as partners; while the bosons (such as photons of light) had corresponding fermions. Though the existence of these “super-bosons” and “super-fermions” is yet to be confirmed experimentally, supersymmetry is an attractive idea because of its explanatory power. It relates the characteristics of fermions and bosons as manifestations of an underlying symmetry – much as different shapes might represent a single object reflected in a mirror. And it offers solutions to some of those perplexing puzzles in the Standard Model, including a mechanism explaining the tiny particle masses, and a natural candidate for dark matter, which – like the hypothesized super-bosons – is massive but invisible.

But for supersymmetry to describe the phenomena we do see around us – like apples falling to Earth – it would have to be extended to include gravity. This was the task that Ferrara, Freedman and van Nieuwenhuizen set their minds to. Beginning with discussions between Ferrara and Freedman at the École Normale Supérieure in Paris in 1975, continuing via collaboration with van Nieuwenhuizen at Stony Brook University, and culminating in a laborious series of calculations on a state-of-the-art computer, they succeeded in constructing a supersymmetric theory that included “gravitinos” – a super-fermion partner to the graviton, the gravity-carrying boson. This theory, supergravity, was not an alternative theory of gravity to general relativity, but a supersymmetric version of it: the algebra they used in the theory included variables representing part of the geometry of spacetime – geometry which in Einstein’s theory constitutes gravity.
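
In schematic terms (using one common convention; the exact factors and index notation vary between textbooks and are shown here only as an illustration, not quoted from their papers), the supersymmetry algebra ties the supercharges Q directly to spacetime translations:

    \{ Q_\alpha , \bar{Q}_{\dot\beta} \} = 2\, \sigma^\mu_{\alpha\dot\beta}\, P_\mu

Because the right-hand side is the momentum operator P that generates translations in space and time, promoting supersymmetry from a global symmetry to a local one inevitably pulls the geometry of spacetime – and with it gravity, carried by the spin-2 graviton and its spin-3/2 gravitino partner – into the theory.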

A Deeply Influential Theory

In the four decades since its development, supergravity has had a powerful influence on theoretical physics. It showed that supersymmetry was capable of accounting for all the phenomena we see in the real world, including gravity. It represented a completion of the current understanding of particle physics – a rigorous mathematical answer to the question, “What theories of nature are compatible with the principles of both quantum mechanics and special relativity?” And it provided a foundation for the attempt – still ongoing – to build a full theory of quantum gravity that describes space and time at a fundamental level.

Special Breakthrough Prize in Fundamental Physics

A Special Breakthrough Prize in Fundamental Physics can be awarded by the Selection Committee at any time, and in addition to the regular Breakthrough Prize awarded through the ordinary annual nomination process. Unlike the annual Breakthrough Prize in Fundamental Physics, the Special Prize is not limited to recent discoveries.

This is the fifth Special Prize awarded: previous winners are Stephen Hawking, seven CERN scientists whose leadership led to the discovery of the Higgs boson, the entire LIGO collaboration that detected gravitational waves, and, last year, Jocelyn Bell Burnell for her discovery of pulsars.

34 thoughts on “$3 Million Breakthrough Prize for the Influential Supergravity Theory”

  1. Any such theory would likely be regarded as sensitive and classified information. So important and sensitive that it would likely be hidden in plain sight so that it could more easily be ridiculed.

  2. It’s all epicycles. Dark matter is used to ‘explain’ a larger mass of the universe than normal matter. Occam rolls in his grave trying to shave.

  3. Concur, it’s the order in which the concepts arose that confuses people. Time and space are likely just attributes of space-time, which itself is not composed of space and time.

    But we developed the concept of space, then time, and finally space-time, which likely created a kind of cognitive trap.

  4. This is kind of personally exciting – one of my father’s published quantum gravity physics papers (from 1975) is usually/often referenced in the footnotes of supergravity papers.

  5. Current theory suggests that while there is a lot of dark matter, it is very widely dispersed through the galaxy, in a cloud of sorts that extends well beyond the visible Milky Way. The amounts within the solar system end up being a rounding error, compared to the mass of the sun & planets. But remember that the galaxy as a whole is mostly empty space, thus the dark matter dispersed there ends up outweighing all the stars. Here is an article that answers that in more detail.

    http://cdms.berkeley.edu/Education/DMpages/FAQ/question36.html

  6. Interesting article, though I wonder how well it accounts for those galaxies I mentioned that have rotation curves that do seem to follow our standard gravity laws, with faster-rotating stars in the center and slower-orbiting ones towards the edges, and thus don’t follow the symmetry patterns proposed in this article.

  7. I’ll hedge this by saying it may be way off the mark as this is not my field but…. If dark matter is 10 times the rest of the equation for gravity governing the behavior of matter in a galaxy, how can it not behave as normal matter as far as gravity is concerned on a smaller scale? If it can influence the rotation of a galaxy, does it influence the rotation of our solar system? Do we need a 10x fudge factor in these equations? If it has 10x the gravitational influence on normal matter, shouldn’t it coalesce with itself and normal matter under gravitational forces? Do we ever see unexplained mass discrepancies localized with normal matter or with only itself on a smaller scale?

  8. If the electron mass and velocity are m and v, and the daughter nucleus mass and velocity are M and V, then momentum conservation dictates that:

    MV = -mv
    V = -mv/M

    Which means their total kinetic energy will be:

    Ek = 0.5mv^2 + 0.5MV^2 = 0.5mv^2 + 0.5M(-mv/M)^2 = 0.5mv^2 + 0.5(mv)^2/M = 0.5mv^2 * (1 + m/M)

    And conservation of energy dictates that this sum of kinetic energies must equal the internal energy difference between the parent and daughter nuclei, dE (which is quantized):

    dE = Ek = 0.5mv^2 * (1 + m/M)
    v^2 = dE / [0.5m(1 + m/M)]
    v = sqrt[ dE / (0.5m(1 + m/M)) ]

    m and M are constants. This suggests that you can’t get a continuous velocity spectrum from a quantized dE with only nuclear recoil, while conserving both energy and momentum. That’s even without considering spin.
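
    A minimal numerical sketch of that two-body argument in Python, using the non-relativistic formulas above. The tritium (3H → 3He) numbers are illustrative assumptions, not values taken from the article:

    import math

    # Assumed illustrative values for tritium beta decay (not from the article):
    dE_keV = 18.6        # decay energy dE (Q value), keV
    m_keV = 511.0        # electron rest energy m*c^2, keV
    M_keV = 2.809e6      # approximate 3He rest energy M*c^2, keV

    # dE = 0.5*m*v^2*(1 + m/M)  =>  (v/c)^2 = 2*dE / (m*c^2 * (1 + m/M))
    beta_sq = 2 * dE_keV / (m_keV * (1 + m_keV / M_keV))
    v_over_c = math.sqrt(beta_sq)

    Ek_electron = 0.5 * m_keV * beta_sq           # electron kinetic energy, keV
    Ek_recoil = Ek_electron * (m_keV / M_keV)     # nuclear recoil energy, keV (tiny)

    print(f"electron speed: {v_over_c:.3f} c  (one fixed value, not a spectrum)")
    print(f"electron energy: {Ek_electron:.2f} keV, nuclear recoil: {Ek_recoil:.4f} keV")

    With only the nucleus available to share momentum, the electron comes out at a single speed for a given dE; the observed continuous spectrum is exactly what the neutrino was proposed to absorb.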

  9. On your “so what?” regarding spin conservation, spin is the quantum equivalent of angular momentum. We know from elsewhere that angular momentum is a conserved property, just like linear momentum. There is a lot of science behind spin and affected by spin (likely including observations of spin conservation) – it’s not something you can just wave away.

    > Find another way to reconcile the spin issue without inventing phantom particles

    You’re welcome to try. An extra particle is the simplest explanation that allows for spin conservation.

  10. Quantization doesn’t mean a single fixed energy, but multiple fixed energy levels. Whereas energy conservation means the sum of output energies must equal the sum of input energies. Those are separate issues, though related in the case of beta decay.

    As I understand it, beta decay is fundamentally quantized because nuclei have quantized energy states. Both the parent nucleus and daughter nucleus have quantized energy states, which leaves a quantized energy difference for the electron that’s emitted. But as you correctly point out, we observe a continuous spectrum of electron energies. That’s precisely why we need something extra to explain the energy difference between the quantized output that we expect, and the continuous output we actually measure.

    If you rely only on nuclear recoil to explain that difference, my understanding is you’d have problems with momentum conservation. That’s why Pauli proposed the neutrino in the first place – to explain the unexpected continuous spectrum while preserving both energy and momentum at the same time (as well as spin). Do you really think he hadn’t considered nuclear recoil when he was looking for an explanation?

  11. A better understanding of quantum gravity could lead to breakthroughs in propulsion technology, among other things. If we ever want to go beyond our solar system, we need those physics.

    Fundamental physics such as this usually has important applications, but it often takes a long time to realize them. The GPS you’re likely using all the time relies on both quantum physics (for its clocks) and general relativity (for corrections to time and position calculations). The transistors in your computer and smartphone also wouldn’t work without quantum physics. But neither of these applications would have been possible to predict when those theories were formulated.
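
    A back-of-the-envelope version of the GPS example in Python; the orbital radius and speed are rounded textbook values and the formulas are the usual weak-field, low-speed approximations, so treat the numbers as illustrative:

    import math

    GM = 3.986e14        # Earth's gravitational parameter, m^3/s^2
    c = 2.998e8          # speed of light, m/s
    r_earth = 6.371e6    # mean Earth radius, m (clock on the ground)
    r_gps = 2.656e7      # approximate GPS orbital radius, m (~20,200 km altitude)

    # General relativity: the satellite clock runs fast in the weaker gravitational potential.
    gr_rate = (GM / c**2) * (1.0 / r_earth - 1.0 / r_gps)

    # Special relativity: the satellite clock runs slow because of its orbital speed.
    v_orbit = math.sqrt(GM / r_gps)
    sr_rate = -(v_orbit**2) / (2 * c**2)

    seconds_per_day = 86400
    net_us_per_day = (gr_rate + sr_rate) * seconds_per_day * 1e6
    print(f"net clock offset: about {net_us_per_day:.0f} microseconds per day")

    That net offset of roughly 38 microseconds per day has to be corrected for, or position errors would accumulate at the level of kilometres per day.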

  12. When the fudge factor is 10 times the size of the rest of the equation then it isn’t really a fudge factor any more.

    Fortunately, physics actually has a long history of being quite ready to backtrack all the way back to the beginning and start again. Unlike my car’s GPS.

  13. If beta decay “must” be quantized and have a fixed energy, then the simple explanation is that the fraction of energy imparted to the recoiling nucleus varies from some minimum to a maximum (perhaps 0% to 100%), as is true for many nuclear collisions which have continuous probability functions for scattering angles (e.g. neutron vs. nucleus). You can’t reliably observe the recoil nucleus’ energy because it tends to be minuscule and damped by stopping power; heavy ions don’t go far in materials. It would be very difficult if not impossible to observe several keV imparted to a 3He following the beta decay of 3H. It would be even harder to detect the recoil energy in heavier elements, where kinetic energy on the keV order is unlikely to cause significant ionization and the material is optically opaque to low-energy de-excitation photons. IOW, you can’t easily see the recoil energy from beta decay.

    http://xdb.lbl.gov/Section1/Table_1-1.pdf

    The neutrino was originally invented to account for conservation of spin; spin appears not to be conserved following beta decay without the invented neutrino – I say, so what? It just means the model is flawed. Find another way to reconcile the spin issue without inventing phantom particles that don’t interact with matter while having the ‘capacity’ to carry multiple MeV.

  14. The name “dark matter” comes from the mass of galaxies as determined by their rotation curves (i.e. Newtonian orbits about their center), and the mass as determined by visible stars (light matter, because it emits light). The rotation curve gives a larger mass, so some of that matter is “dark”, i.e. it doesn’t emit light like stars, or reflect or absorb light like gas clouds.

    Since the discrepancy was discovered, the question has been: what is dark matter made of? Since then, we have also gotten good at mapping the dark matter via gravitational lensing. Massive objects, like galaxies and stars, also bend light, as we have known for 100 years. So we have some idea *where* the dark matter is, but still don’t know *what* it is.
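
    To make the rotation-curve bookkeeping concrete, here is a minimal Python sketch; the flat rotation speed is an assumed Milky-Way-like value, so the numbers are only illustrative:

    G = 6.674e-11         # gravitational constant, m^3 kg^-1 s^-2
    KPC = 3.086e19        # metres per kiloparsec
    M_SUN = 1.989e30      # solar mass, kg

    v_flat = 220e3        # assumed flat rotation speed, m/s

    # For a circular orbit, v^2 = G * M_enclosed / r, so a flat (constant-v)
    # rotation curve implies the enclosed mass keeps growing linearly with radius.
    for r_kpc in (10, 20, 40):
        r = r_kpc * KPC
        m_enclosed = v_flat**2 * r / G
        print(f"r = {r_kpc:2d} kpc -> enclosed mass ~ {m_enclosed / M_SUN:.1e} solar masses")

    The visible stars and gas stop adding much mass well inside the outermost measured radii, so the steadily growing enclosed mass is the “dark” part.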

  15. Yes, because Fermi was such a hack. This is what happens when you give nuclear engineers the impression they are physicists. They stop studying mathematics after what, I’m going to guess, PDE? Complex analysis? And then you accuse everyone else of having dumb ideas because you can’t do quantum. I see a lot of this hashtag wokebro physics attitude among academic burnouts in industry. It’s always from hybrid engineer/physicists that are disillusioned from being stuck in some corner of low-speed industry. Physicists forced to work as engineers are a salty tribe.

  16. Dark matter is just shorthand for hypothetical matter that interacts with gravity well enough, but has little, if any, interaction with the EM, weak, and strong forces, and that exists in amounts great enough to explain why some large-scale observations (like the rotation curves of galaxies) don’t match our existing gravity theories.

    The alternative explanation, which you lead into, would be modified gravity theories: the idea that gravity behaves differently at large scales. Several versions of these modified gravity theories have been considered over the years, but none of them so far seem to fit the observations as well as the dark matter theory does.

    For example, there are several objects, like the Bullet Cluster, where we can observe gravitational lensing occurring in a way that is mismatched with where the visible matter is. Another would be the discovery of two galaxies that DO rotate in the way our standard gravity theories predict – which is really hard to fit in with any of the modified gravity theories, but with the dark matter theory we can propose that those galaxies just don’t have very much dark matter.
    http://www.astronomy.com/news/2019/03/ghostly-galaxy-without-dark-matter-confirmed

  17. Is dark matter just a fancy way of saying we need a fudge factor for our math to work? Does it exist only to fix the math of gravity on the galactic scale? Could we have made a fundamental error far enough back in our theories that it’s unthinkable to have to backtrack that far? Like taking a wrong turn at the start of a journey that gets you one block over from your intended destination but across a river. You’d have to go half the way back to fix it but instead you try to convince yourself you’re at the right place.

  18. We have a famous example of a physicist discovering an entire field of science with his mind through thought experiments (Einstein). That doesn’t elevate the process of dreaming up solutions to problems that only exist in the context of a flawed ‘standard model’ to the status of being consistent with the scientific method. If observed phenomena don’t make sense to the astrophysicists at this time, they might just be SOL for now and until something in the future brightens up the darkness. A lot of modern physics is just bunk, or worse, spunk.

  19. Don’t invoke Pauli or any other real scientist from the past to attempt to lend legitimacy to this gobbledygook.

    Nearly everyone who is educated in modern physics is ‘on-board’ with the invention of the neutrino as required to explain away the continuous energy spectrum of beta decay and several other phenomena that do not fit the ‘standard model’. Note that I have personally observed this continuous energy spectrum in a school lab. To me, the invention of the neutrino (for example) only highlights that our ‘standard model’ needs work, AND seeing this fault does not require me to possess an explanation or alternative theory. It is a joke to say that beta decay is a fundamentally fixed-energy event for a given species (e.g. 90Sr) when that does not fit the observation. This sheer bullsnot would not be acceptable in any other field with a similar unknown frontier (such as medicine).

    The scintillator and photomultiplier tube (PMT) pairing, which every ‘neutrino detector’ relies upon, will respond to ANY form of radiation, precisely because ANY FORM of radiation will lead to the emission of the low-energy photons to which the PMT responds. The PMT responds to any photon exceeding the work function of the coating on the cathode (a few eV). Counting neutrinos is in fact the counting of trace decays of anything contaminating any part of the massive multi-ton system (tramp U, tramp K, etc.) claimed to be coincident with a stellar event – in other words: noise and hocus pocus.
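
    For scale on the “few eV” figure, the photon energy–wavelength relation E = hc/λ (about 1240 eV·nm) can be checked with a couple of lines of Python; the 3 eV cathode work function is just an assumed example value:

    HC_EV_NM = 1239.84      # h*c in eV*nm

    work_function_eV = 3.0  # assumed example photocathode work function
    cutoff_nm = HC_EV_NM / work_function_eV
    print(f"a {work_function_eV} eV threshold corresponds to photons shorter than ~{cutoff_nm:.0f} nm")

    That threshold lands around 410 nm, i.e. the visible-to-near-UV scintillation light that such tubes are built to count.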

  20. So, how close are we to a grand unified field theory now? I want my anti-gravity propulsion dang it! 🙂

  21. There is nothing “gobbledygook” about discussing dark matter; it is just a theory to explain why the movement of certain large-scale objects, like the rotation of galaxies, doesn’t match up with our existing theory of gravity (which produces extremely accurate results on smaller scales like our solar system; see all the gravity assist maneuvers space probes use). There have been alternative theories that suggest that gravity behaves differently at large scales, but so far various observations lean heavily on the side of dark matter.

    Note that this isn’t the first time scientists have proposed difficult-to-detect matter as a solution to observational anomalies; for example, in the early 1800s astronomers noticed that Uranus’s orbit was acting weird, and some of them proposed that another planet (a “dark” planet, if you will) was disturbing it. Decades later, Neptune was discovered. Neutrinos are another example; Pauli proposed their existence in 1930 in order to make the energy conservation equations balance for beta decay. It took a few decades before they were detected directly. The idea that there is some particle like the neutrino, but even harder to detect, behind the gravitational anomalies is no weirder than those examples.

  22. So, any practical uses for all this, or are we supposed to squeal in joy at the glory and wonder of it all? :-{

  23. The last time I remember watching it, they had basic electron orbital configuration of a noble gas on the whiteboard.

  24. No, that was Super Asymmetry. That series did try remarkably hard to get all the science in the background plausible rather than the usual meaningless babble.

  25. Space and time may not be fundamental. As long as we continue to think so we will never solve the problem.

  26. Nit: I don’t think “invent” is the proper description for their work. It’s like inventing the parrot that’s been sitting in your parlor, begging for crackers. It’s there and it’s up to you to find it, describe it, figure out how it works, etc. But you didn’t “invent” the parrot. Of course, if “supergravity” is wrong, then it’s “invented”….like if the parrot doesn’t really exist, it’s an invention of your mind…LOL.
