Wednesday, April 12, 2017

Astrophysicists Envision a Universe Without Dark Energy

Dark energy, the hypothesized but unconfirmed entity thought to propel the expansion of the universe, has puzzled astrophysicists since the 1990s. Its subtle effects are even harder to detect directly than those of dark matter. Now some scientists are developing alternative ways to understand the universe's expansion and proposing to dispose of the concept of dark energy altogether.

A portion of the Deep Field image from the Hubble Space Telescope, 1996. Data from Hubble, in which distant galaxies appear to be moving rapidly away from us, prompted scientists to propose dark energy in the first place.

In a new study, they ask the question: is it possible that the formation and arrangement of galaxies throughout time has somehow affected how the universe has expanded since the Big Bang 13.8 billion years ago? In a simulated universe in which this happens, it turns out to be possible to reproduce the latest measurements of the expansion rate gathered by the European Space Agency's Planck space telescope.

“It’s exciting to try to solve the biggest observational puzzle in science right now,” said István Szapudi, an astronomer at the University of Hawaii in Honolulu and one of the lead authors of the study, which is being published in the Monthly Notices of the Royal Astronomical Society. “Unlike dark matter, for which we have many specific theories that make sense, for dark energy, it’s the ‘biggest blunder’ in physics,” he adds, referring to Albert Einstein’s assessment.

According to the standard model of the origin of the cosmos—which this study, if confirmed, could upend—dark energy makes up about 68 percent (and rising) of the total energy density of the universe. In comparison, normal matter, the stuff we’re more familiar with in planets and stars and gas and dust, adds up to only 5 percent, while the remaining 27 percent or so is dark matter.

Physicists have come up with a variety of dark energy candidates—and have even considered modifying the laws of gravity as part of the effort. Einstein proposed something he called a cosmological constant as part of his general relativity theory to balance the effects of gravity. He later regretted the idea, but a century afterward, cosmologists have embraced it. In this theory, as the universe expands, it releases "vacuum energy," which produces yet more expansion. One tiny problem: the theory's predicted vacuum energy exceeds the observed value by a gargantuan factor. Observations put the average at less than one percent of the kinetic energy of a flying mosquito for every cubic meter in the universe.

Szapudi and his colleagues make the case that one could avoid the conundrum with a different explanation for the expansion of the universe, without invoking dark energy. They start by splitting their simulated universe into regions and measuring the density of matter, which is mostly dark matter, in each one. Rather than inflating the average expansion with dark energy, they let each region's own density drive its expansion, so that regions nearly devoid of matter expand fastest. As gravity pulls together and assembles numerous galaxies in some regions, the nearly empty regions everywhere else expand faster and accelerate the overall expansion, an effect called backreaction. Averaging the expansion over all the regions then yields something that looks very close to the models that use dark energy to explain the latest astronomical observations.
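The region-by-region averaging idea can be illustrated with a toy calculation. The following is a minimal sketch in Python, not the team's actual simulation; the patch densities, time span, and units are all illustrative. Each patch evolves under a simple matter-plus-curvature Friedmann equation, underdense patches expand faster, and the volume-weighted average outpaces a uniform universe with the same mean density.

```python
import math

def evolve_patch(omega_m, t_end=2.0, dt=1e-4):
    """Evolve the scale factor of a matter-only patch (units where H0 = 1).

    Friedmann equation with a spatial-curvature term:
        (da/dt)^2 = omega_m / a + (1 - omega_m)
    Underdense patches (omega_m < 1) behave like open universes and
    expand faster; overdense ones (omega_m > 1) slow down.
    """
    a, t = 1.0, 0.0
    while t < t_end:
        a += math.sqrt(omega_m / a + (1.0 - omega_m)) * dt
        t += dt
    return a

# Two patches, one overdense and one underdense, with the same initial
# mean density as a uniform omega_m = 1 universe.
densities = [1.8, 0.2]
patch_scales = [evolve_patch(om) for om in densities]
uniform_scale = evolve_patch(1.0)

# Volume-weighted average expansion: volumes grow as a**3.
avg_scale = (sum(a**3 for a in patch_scales) / len(patch_scales)) ** (1 / 3)
```

In this toy model the fast-growing void dominates the volume average, so the averaged expansion runs ahead of the uniform case without any dark energy term, which is the qualitative essence of backreaction.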

“That outcome would be wonderful, if it’s right,” said Nick Kaiser, also at the University of Hawaii. He believes that this backreaction approach needs a more solid mathematical foundation, since Szapudi’s study provides only an approximation of the evolution of the universe.

“It’s certainly not the final answer, but it suggests that people should take this seriously,” said Rocky Kolb, a cosmologist at the University of Chicago. He also points out the troubling discrepancy between measurements of the expansion rate from the cosmic microwave background radiation, a relic of the Big Bang probed by the Planck telescope, and observations of supernova explosions. Szapudi's team's calculations go in the right direction toward resolving that tension.

To settle these questions, Szapudi and other theorists will try to develop more sophisticated simulations, while others look forward to getting their hands on more precise observations. Telescopes due within the next decade, such as the European Space Agency’s Euclid space telescope and the Large Synoptic Survey Telescope being built in Chile, will measure the precise distances and motions of millions of galaxies. Astronomers expect that they’ll yield a better and deeper measurement of the expansion rate that could help them distinguish between dark energy and models like Szapudi’s.

“It’s clear that structures [like clusters of galaxies] have an effect on the expansion rate, but what’s not clear yet is how strong the effect is,” said Syksy Räsänen, theoretical physicist at the University of Helsinki in Finland. “Until we’ve established that backreaction doesn’t explain the accelerated expansion, we cannot draw the conclusion that dark energy or modified gravity is needed.”

For now, the jury’s still out on dark energy.

Ramin Skibba, Inside Science News

3 comments:

  1. Dark energy is a fudge factor - now other fudge factors will replace it. Unlike special relativity, general relativity is empirical, not deductive. It was not deduced from postulates (it has no postulates). General relativity was the result of endlessly changing and adjusting equations until some final version managed to match experimental results known in advance, along with pet assumptions. And fudge factors belong to the essence of general relativity.

    Can one introduce a fudge factor analogous to the cosmological constant in Lorentz transformation equations? One cannot, and the reason is simple: Special relativity is DEDUCTIVE (even though a false assumption and an invalid argument have spoiled it from the very beginning) and fudging is impossible by definition - one has no right to introduce anything that does not follow from the postulates.

    The only alternative to deductive theory is empirical concoction (a "theory" that is not even wrong) - Einstein clearly explains this here:

    Albert Einstein: "From a systematic theoretical point of view, we may imagine the process of evolution of an empirical science to be a continuous process of induction. Theories are evolved and are expressed in short compass as statements of a large number of individual observations in the form of empirical laws, from which the general laws can be ascertained by comparison. Regarded in this way, the development of a science bears some resemblance to the compilation of a classified catalogue. It is, as it were, a purely empirical enterprise. But this point of view by no means embraces the whole of the actual process ; for it slurs over the important part played by intuition and deductive thought in the development of an exact science. As soon as a science has emerged from its initial stages, theoretical advances are no longer achieved merely by a process of arrangement. Guided by empirical data, the investigator rather develops a system of thought which, in general, is built up logically from a small number of fundamental assumptions, the so-called axioms."

    Special relativity was indeed "built up logically from a small number of fundamental assumptions" but general relativity was, to use Einstein's words, "a purely empirical enterprise". Einstein and his mathematical friends changed and fudged equations countless times until "a classified catalogue" was compiled where results known in advance and pet assumptions (such as Mercury's perihelion precession, the equivalence principle, gravitational time dilation) coexisted in an apparently consistent manner. Being an empirical concoction, general relativity allows Einsteinians to introduce, change and withdraw fudge factors until the "theory" manages to predict anything Einsteinians want. Then the prediction turns out to be confirmed by observations (surprise surprise).

    Pentcho Valev

  2. The scientists continue their debate about dark matter. Does it exist or not? They cannot agree. One could say they are both right, even though it would appear that they have not grasped the essence, or better the lack of essence, of dark matter. In fact, if by "existence" one means all that is made of energy, then dark matter does not exist; if instead the same term is meant to include even the absence of energy, then dark matter exists. That is what existed before the birth of the universe about 14 billion years ago. Prior to the so-called Big Bang there was only Void. With its expansion, energy travels like a web inside one Void, which is still at the border of an expanding universe. Within the universe, in between the meshes of this web of energy, there is the dark matter, or the non-essence from one eternal Void. That is why all galaxies and the entire universe are connected by dark matter.
    http://wwtrade.org/m/index.php/Wavevolution

  3. The concept of dark energy is an example of the typical problem of poor analysis of the SNe Ia data. The groups of Riess and Perlmutter simply followed the usual practice of astronomers of using the emission magnitude rather than the emission distance. This choice of metric is not trivial, for magnitude is not the distance but a log function of the distance. A change of metric usually means a change in the relevance of the model fit. Such mistakes are common in astronomy but not allowed in other sciences. For more on the topic see the short YouTube videos
    Log-Transformed Data Problems: Pitfalls using log-transformed data with examples from astrophysics

    https://youtu.be/Y1nEQmg2yJA and https://youtu.be/IN0jqhqjW3Y
