• Welcome to Religious Forums, a friendly forum to discuss all religions in a friendly surrounding.


Hubble tension? Not so much.

Polymath257

Think & Care
Staff member
Premium Member
One of the crucial parameters for our understanding of how fast the universe is expanding is called the Hubble constant. It is one of the parameters involved in figuring out the age of the universe, and it is a number that a LOT of research has been focused on for decades.

For the last decade or so, there has been a 'tension' between two ways of getting the Hubble constant: looking at distant galaxies (whose light started out in the distant past) and using our theoretical model to extrapolate the current value, versus using nearby galaxies to measure the value directly. The distant-galaxy method gave a value of 67 kilometers per second per megaparsec, while the measurements of nearby galaxies gave 72. This was enough to make some people wonder if there was missing physics involved in the universal expansion; always an exciting possibility.

Now, though, recent measurements of the nearby galaxies, taking a larger sample to get a more accurate result, bring the two numbers closer together, with the nearby galaxy result now being 70. This makes it *much* more likely that we are just talking about uncertainties in our measurements and NOT new physics.

'There may not be a conflict after all' in expanding universe debate

On a personal note, when I was young, the value of the Hubble constant was estimated to be between 50 and 100. So the fact that we are now arguing about a difference between 67 (or 70) and 72 seems remarkable to me. These are not easy measurements and they are subject to a lot of potentially confounding effects. Getting them right is tricky and takes time. In the process, there will be debate about the relevance of different methods for finding the same number that give different results.

This is how science does things.
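As a rough illustration of why the difference matters less than it might seem (a sketch only; the real age calculation depends on the full cosmological model), the inverse of the Hubble constant sets the basic timescale, and moving between 67 and 72 shifts it by about a gigayear:

```python
# Naive "Hubble time" 1/H0 for the values discussed above.
# This is only a rough timescale, NOT the model-dependent age of the universe.

KM_PER_MPC = 3.0857e19   # kilometers in one megaparsec
SEC_PER_GYR = 3.1557e16  # seconds in one gigayear (billion years)

def hubble_time_gyr(h0_km_s_mpc):
    """Convert H0 (km/s/Mpc) to the timescale 1/H0 in gigayears."""
    h0_per_sec = h0_km_s_mpc / KM_PER_MPC  # H0 in units of 1/s
    return 1.0 / h0_per_sec / SEC_PER_GYR

for h0 in (67, 70, 72):
    print(f"H0 = {h0} km/s/Mpc  ->  1/H0 = {hubble_time_gyr(h0):.2f} Gyr")
```

The spread between 67 and 72 works out to roughly one gigayear in this crude timescale, which is why the remaining disagreement, while real, is small by historical standards.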
 

Meow Mix

Chatte Féministe

During my dark energy research (the methodology of which is equally applicable to the Hubble parameter), one paper is an overview and the list of systematic errors is pages long. It was daunting when first compiled. “How the hell do the professionals do this? I would forever worry that I left something out.”
 

We Never Know

No Slack

Here's what they said 2 years ago.

"A study, written by researchers at the Max Planck Institute of Astrophysics in Germany and other universities, has described a new method of gauging the universe's accelerating growth.

It puts the rate of expansion at 82.4 kilometers per second per megaparsec, higher than previous calculations—though it does admit to a 10 percent margin of error, meaning it could be as low as 74 or as high as 90."

How fast is the universe expanding? The mystery endures
 

Meow Mix

Chatte Féministe

I remember this being mentioned in coursework. For back-of-the-envelope calculations we just used 70, since it still sat between 68 and 72; this 82.4 measurement was an oddball.
 

Polymath257

Think & Care
Staff member
Premium Member


I don't think the average person realizes just how much time and energy scientists put into making sure some confounding variable isn't present. You often get 'what if' questions that would negate the results *if* the effect wasn't well known and guarded against.
 

Suave

Simulated character

If the expansion rate of our universe were accelerating, then would not the rate of its expansion increase over time?
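For what it's worth, in the standard flat matter + lambda model the answer has a twist: the recession speed of any given galaxy increases over time (that is the "acceleration"), while the Hubble rate H itself still *decreases* toward a constant floor. A toy sketch with assumed, roughly Planck-like parameters:

```python
# Toy sketch: in a flat matter + lambda model, the Hubble rate H(a) keeps
# DECREASING toward a floor of H0*sqrt(Omega_L), even while the growth rate
# of the scale factor, a_dot = a * H(a), keeps INCREASING (the acceleration).
# All parameter values below are assumed, illustrative numbers.
from math import sqrt

H0 = 70.0        # km/s/Mpc (illustrative)
OMEGA_M = 0.31   # matter density parameter
OMEGA_L = 0.69   # dark-energy (lambda) density parameter

def hubble_rate(a):
    """H(a) for a flat matter + lambda universe, in km/s/Mpc."""
    return H0 * sqrt(OMEGA_M / a**3 + OMEGA_L)

for a in (1.0, 2.0, 4.0):
    H = hubble_rate(a)
    print(f"a = {a}:  H = {H:.1f} km/s/Mpc,  a_dot ~ {a * H:.1f}")
```

So "accelerating expansion" and "the Hubble constant gets smaller over time" are both true at once in this model; the two quantities are related but not the same thing.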
 

Heyo

Veteran Member
The fact is that the two measurements, together with their error bars, don't overlap. This is the current "crisis in cosmology".

This is a video more directed towards interested lay people:

This one is a bit more demanding:
 

Meow Mix

Chatte Féministe

It’s a problem; but the thing about calculating error bars is that you don’t know what you don’t know. When I wrote an overview paper for an assignment that required synthesizing hundreds of papers, my errors section ran for pages out of 48 total. There were systematics I hadn’t even dreamed of considering until I encountered them. It could well be that nobody has had the “aha” moment on a source of error that, once accounted for, would make the measurements overlap.
 

Meow Mix

Chatte Féministe
Off the top of my head, a serious concern for very high-z observations is Malmquist bias; I had no idea that was even a thing to consider until starting research. It was never touched on in any undergrad astronomy or astrophysics classes.

Typing on a phone so no link, but it’s easy to find on Wikipedia.
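For anyone curious what Malmquist bias looks like in practice, here is a toy Monte Carlo sketch (all numbers hypothetical): in a flux-limited survey, distant objects are only detected if they are intrinsically bright, so the detected sample averages brighter than the true population.

```python
# Toy Monte Carlo illustration of Malmquist bias. All numbers are
# hypothetical: a Gaussian luminosity function, uniform distances, and
# an arbitrary survey flux limit.
import random

random.seed(42)

N = 100_000
all_lum, detected = [], []
for _ in range(N):
    lum = random.gauss(1.0, 0.2)     # intrinsic luminosity (arbitrary units)
    dist = random.uniform(0.5, 3.0)  # distance (arbitrary units)
    flux = lum / dist**2             # inverse-square dimming
    all_lum.append(lum)
    if flux > 0.3:                   # survey flux limit: faint+far objects missed
        detected.append(lum)

mean_all = sum(all_lum) / len(all_lum)
mean_det = sum(detected) / len(detected)
print(f"population mean luminosity: {mean_all:.3f}")
print(f"detected-sample mean:       {mean_det:.3f}  (biased bright)")
```

Because the faint end of the population drops out of the sample at large distances, any average taken over the detected objects is skewed; correcting for this is one of those systematics that has to be modeled explicitly.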
 

Heyo

Veteran Member
Hmmm ... So, should we stop saying that the universe is 13.8 billion years old, since that claims a precision we haven't reached? If the potential error is really big enough to include both measurements, even saying 14 billion is too precise. We should say about 10 billion; that would be honest.
(And then, of course, the chance of the universe being flat also shrinks into uncertainty. (I don't like flat universes, as I previously said.))
 

Meow Mix

Chatte Féministe

You would still get about 13.8 billion if you fudged the Hubble parameter all the way up to 75 km/s/Mpc.

The calculation for t also depends on the model (for instance, which density parameters are dominant or neglected), and that is where you will get a bigger difference for t. However, we measure some of those parameters with the same observations we use for the Hubble parameter (like high-z supernovae). So... it's complicated.
 

Meow Mix

Chatte Féministe
[Image: "Ageofuniverse", posted by Meow Mix, Jul 13, 2021]

This is how we'd ultimately get the age of the universe: integrate 1/(a H(a)) over the scale factor. It isn't too bad if you neglect terms (bye, radiation; and bye, curvature). It's easy to solve in the early universe, where radiation is all you have to worry about. But with multiple terms it pretty much can't be solved analytically.
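When several terms are kept, the integral t0 = ∫ da / (a H(a)) can always be done numerically instead. A minimal sketch, assuming a flat matter + lambda universe with illustrative parameters (matter 0.31, lambda 0.69, H_0 = 68 km/s/Mpc):

```python
# Age of the universe by numerically integrating t0 = integral of da / (a H(a))
# from a = 0 to a = 1, keeping only matter and lambda (flat universe).
# Parameters are assumed, illustrative values.
from math import sqrt

H0 = 68.0                      # km/s/Mpc
OMEGA_M, OMEGA_L = 0.31, 0.69  # flat: density parameters sum to 1
HUBBLE_TIME_GYR = 977.8 / H0   # 1/H0 in Gyr (977.8 ~ Mpc-in-km / s-per-Gyr)

def E(a):
    """Dimensionless Hubble rate H(a)/H0."""
    return sqrt(OMEGA_M / a**3 + OMEGA_L)

# Midpoint rule; the integrand 1/(a E(a)) goes to 0 as a -> 0, so no trouble.
N = 200_000
h = 1.0 / N
integral = 0.0
for i in range(N):
    a = (i + 0.5) * h
    integral += h / (a * E(a))

age_gyr = integral * HUBBLE_TIME_GYR
print(f"t0 ~ {age_gyr:.2f} Gyr")
```

With these inputs the numerical result lands right on the ~13.7 Gyr figure discussed below, which is a nice sanity check on the analytic shortcuts.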

Sometimes you can write it in parametric form.

For a matter + lambda universe like we have today, we can do some tricks to reduce it to an analytical form. With a matter density parameter around 0.31 and a lambda density parameter around 0.69, this yields 13.74 +/- 0.40 Gyr (with H_0 = 68 +/- 0.2 km/s/Mpc). Fudging the Hubble parameter around doesn't move it away from 13.8 very quickly.
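The trick referred to above has a known closed form for the flat matter + lambda case, t0 = (2 / (3 H_0 sqrt(Omega_L))) * asinh(sqrt(Omega_L / Omega_m)). A quick check with the quoted numbers (a sketch, holding the density parameters fixed):

```python
# Closed-form age for a flat matter + lambda universe.
from math import asinh, sqrt

def age_gyr(h0, omega_m, omega_l):
    """t0 = (2 / (3 H0 sqrt(Omega_L))) * asinh(sqrt(Omega_L/Omega_m)), in Gyr."""
    hubble_time = 977.8 / h0  # 1/H0 in Gyr
    return (2.0 / (3.0 * sqrt(omega_l))) * asinh(sqrt(omega_l / omega_m)) * hubble_time

print(f"H0=68:  t0 = {age_gyr(68, 0.31, 0.69):.2f} Gyr")  # ~13.74
print(f"H0=72:  t0 = {age_gyr(72, 0.31, 0.69):.2f} Gyr")
```

Note that scanning H_0 with the density parameters held fixed, as here, overstates the shift; in real fits the Omegas move along with H_0 (they come partly from the same data), which is why the quoted age budges less than this naive scan suggests.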
 