Arxiv – Applications of Bayesian model averaging to the curvature and size of the Universe

Bayesian model averaging is a procedure for obtaining parameter constraints that account for the uncertainty about the correct cosmological model. We use recent cosmological observations and Bayesian model averaging to derive tight limits on the curvature parameter, as well as robust lower bounds on the curvature radius of the Universe and its minimum size, while allowing for the possibility of an evolving dark energy component. Because flat models are favoured by Bayesian model selection, we find that model-averaged constraints on the curvature and size of the Universe can be considerably stronger than non-model-averaged ones. For the most conservative prior choice (based on inflationary considerations), our procedure improves on non-model-averaged curvature constraints by a factor of ~ 2. The curvature scale of the Universe is conservatively constrained to a radius of curvature greater than 42 gigaparsecs (99%), corresponding to a lower limit of N_U > 251 (99%) on the number of Hubble spheres contained in the Universe.

In cosmology, the Hubble volume, or Hubble sphere, is the region of the Universe surrounding an observer beyond which objects recede from the observer at a rate greater than the speed of light, due to the expansion of the Universe.

The comoving radius of the Hubble sphere is c / H0, where c is the speed of light and H0 is the Hubble constant. More generally, the term “Hubble volume” can be applied to any region of space with a volume of order (c / H0)^3.
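As a quick sanity check, the Hubble radius comes out to roughly 4.3 gigaparsecs. The sketch below assumes a typical value of H0 ≈ 70 km/s/Mpc, which is our choice for illustration, not a number taken from the paper:

```python
# Back-of-envelope check of the Hubble radius c / H0.
# H0 = 70 km/s/Mpc is an assumed, typical value.
c = 299_792.458          # speed of light in km/s
H0 = 70.0                # Hubble constant in km/s/Mpc (assumed)

hubble_radius_mpc = c / H0                 # comoving Hubble radius, in Mpc
hubble_radius_gpc = hubble_radius_mpc / 1000.0

# A "Hubble volume" is any region of order (c / H0)^3.
hubble_volume_gpc3 = hubble_radius_gpc ** 3

print(f"Hubble radius ~ {hubble_radius_gpc:.2f} Gpc")
print(f"Hubble volume ~ {hubble_volume_gpc3:.1f} Gpc^3")
```

That gives a Hubble radius of about 4.3 Gpc, so the paper's lower limit of 42 Gpc on the curvature radius is roughly ten Hubble radii.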

The term “Hubble volume” is also frequently (but mistakenly) used as a synonym for the observable universe; the latter is larger than the Hubble volume.

N_U refers to the number of Hubble volumes that make up the total Universe.

The amount of curvature is usually characterized by the curvature parameter. If the curvature parameter is less than 0, the geometry of spatial sections is spherical (i.e., the Universe is closed) and the Universe has a finite size. If instead the curvature parameter is greater than 0, the geometry is hyperbolic (i.e., the Universe is open), while for a curvature parameter equal to 0, spatial sections are flat. In the latter two cases, the spatial extent of the Universe is infinite.
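The sign convention above can be summarized in a few lines of Python (the function is purely illustrative, not from the paper):

```python
# Minimal sketch of the sign convention for the curvature parameter,
# using the common definition Omega_k = 1 - Omega_total.
def geometry(omega_k: float) -> str:
    """Classify the spatial geometry from the curvature parameter Omega_k."""
    if omega_k < 0:
        return "spherical (closed, finite size)"
    elif omega_k > 0:
        return "hyperbolic (open, infinite extent)"
    return "flat (infinite extent)"

print(geometry(-0.01))  # -> spherical (closed, finite size)
```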

We have applied the formalism of Bayesian model averaging to the problem of constraining the curvature and size of the Universe. By employing the Savage-Dickey density ratio, we have obtained model-averaged constraints at almost no additional computational cost beyond what is needed for parameter estimation. We have demonstrated how model-averaged constraints on the curvature and minimum size of the Universe can be considerably tighter than non-model-averaged ones. This is a consequence of the fact that flat models are preferred by Bayesian model selection, although the strength of this preference depends fairly strongly on the choice of prior for the curvature parameter.
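A rough intuition for why averaging tightens the constraint: the flat model puts all of its probability at zero curvature, so mixing it into the posterior suppresses the tails of the curvature distribution. The toy numbers below are invented for illustration; the paper derives the actual Bayes factor from data via the Savage-Dickey density ratio.

```python
import math

# Toy illustration of Bayesian model averaging over curvature.
# All numbers here are assumed, not taken from the paper.
B = 10.0                       # assumed Bayes factor favouring the flat model
p_flat = B / (1.0 + B)         # posterior model probabilities (equal prior odds)
p_curved = 1.0 - p_flat

# Within the curved model, take a zero-mean Gaussian posterior for Omega_k
# (a stand-in for the real posterior from the data).
sigma = 0.01

def tail(x):
    """P(|Omega_k| > x) under the curved model's Gaussian posterior."""
    return math.erfc(x / (sigma * math.sqrt(2.0)))

# The flat model contributes a delta function at Omega_k = 0, so after
# averaging, the probability of |Omega_k| > x shrinks by the factor p_curved.
x = 0.02
print(f"curved model alone: P(|Omega_k| > {x}) = {tail(x):.4f}")
print(f"model-averaged:     P(|Omega_k| > {x}) = {p_curved * tail(x):.4f}")
```

With these assumed numbers, the model-averaged tail probability is eleven times smaller than the curved-model-only one, which is the mechanism behind the tighter averaged limits quoted above.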

We have considered two classes of priors that are based on physical and theoretical considerations. We found that even the most conservative prior choice gives model-averaged constraints on curvature that are a factor of ~ 2 better than non-model-averaged intervals. A more aggressive prior choice (the Astronomer’s prior) leads to an improvement in the curvature constraints by a factor of ~ 100, giving a curvature parameter below 6.2 × 10^−4 at 99% (i.e., very close to a flat Universe). The minimum size of the Universe is robustly constrained to encompass N_U > 251 Hubble spheres, an improvement of a factor of ~ 40 on previous constraints. Finally, the radius of curvature of spatial sections is found to be greater than 42 gigaparsecs.

Bayesian model averaging gives the most general parameter constraints, which fully account for the uncertainty in the selection of the correct underlying cosmological model. It remains imperative (as in any good Bayesian analysis) to study the dependence of the results on the chosen priors, which matter more in Bayesian model selection (and model averaging) than they do in the usual parameter inference framework. We believe that the formalism presented here can be employed successfully in a large variety of cosmological problems.

That’s not quite right, however. Because the Universe is expanding, the most distant visible things are much further away than that. In fact, the photons in the cosmic microwave background have travelled a cool 45 billion light years to get here. That makes the visible universe some 90 billion light years across.

Obviously, we can’t directly measure the size of the universe, but cosmologists have various models that suggest how big it ought to be. For example, one line of thinking is that if the universe expanded at the speed of light during inflation, then it ought to be 10^23 times bigger than the visible universe.

Bayesian model averaging automatically guards against this. Instead of asking how well the model fits the data, it asks a different question: given the data, how likely is the model to be correct? This approach is automatically biased against complex models; it’s a kind of statistical Occam’s razor.
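Here is a toy version of that statistical Occam’s razor: two models fit the data equally well at their best-fit points, but the more complex one spreads its prior belief over a much wider parameter range, so its evidence (the likelihood averaged over the prior) is lower. All numbers are assumed for the sketch; nothing here comes from the paper.

```python
import math

# Toy 1-D Occam's razor: evidence = average of the likelihood over the prior.
# All widths and likelihood values below are invented for illustration.
def evidence(prior_width, sigma_like=0.1, best_fit_like=1.0):
    """Evidence for a flat prior of the given width centred on the best fit,
    with a Gaussian likelihood of width sigma_like."""
    integral = best_fit_like * sigma_like * math.sqrt(2.0 * math.pi)
    # Averaging over a wider prior dilutes the same likelihood integral.
    return min(integral / prior_width, best_fit_like)

E_simple = evidence(prior_width=0.5)    # tightly predictive model
E_complex = evidence(prior_width=50.0)  # flexible model with a wide prior
print(f"Bayes factor (simple/complex) ~ {E_simple / E_complex:.0f}")
```

Even though both toy models achieve the same best fit, the simple model is favoured by a factor equal to the ratio of the prior widths, here a factor of 100. No tunable penalty term is needed; the razor falls out of the averaging.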
