Can't we argue for the low amount of antimatter as a type of anthropic principle? The early universe was super dense, meaning that regions with an imbalance would quickly annihilate and leave only one type of matter. Then, due to rapid expansion, our observable universe ended up dominated by only one type of matter. If we imagine a universe with a more even mix, it would be less welcoming to life, so we are less likely to observe it. Has someone modeled something like this?
I think the same subject was addressed in both of these...
https://news.ycombinator.com/item?id=45979220
https://news.ycombinator.com/item?id=46011889
The first one "220" has a nice discussion, in particular a comment by pfdietz:
> It increases the rate of production of neutral antihydrogen from antiprotons and positrons by a factor of 8. It doesn't increase the efficiency of production of antiprotons, which is the extremely inefficient, energy intensive part.
This piece argues that antimatter could be feasible for space propulsion and we could start developing it now: https://news.ycombinator.com/item?id=46073414
For those who are time-rich and knowledge-poor:
https://youtube.com/watch?v=i6jMnz6nlkw
(Angela is genuinely a great science communicator and that video is time well spent if you are interested in this topic.)
You can skip the first 42 minutes, which are about how bad an article titled "How Antimatter Spacecraft Will Work" is. That part is absolutely boring as hell!
Angela is great, although her rants can get quite long-winded.
If you can electromagnetically trap enough antimatter to use it as fuel, you could just as well trap a miniature charged black hole that can be fed regular matter to produce power, which skips the whole inefficient part of making antimatter.
Miniature black holes would just evaporate. Antimatter wouldn't.
Minor nitpick, but Hawking radiation hasn't been observed and remains a theoretical prediction.
It's a pretty fundamental prediction though, and it's been derived in many different ways, all of which give the same prediction.
It's closely related to the Unruh effect, which is a direct consequence of pure QFT. The Unruh effect describes how an accelerated observer sees a different vacuum from an inertial observer - they see radiation that the inertial observer doesn't.
Hawking radiation is essentially this same effect, except that "acceleration" is replaced by "gravity" (Einstein's equivalence principle). There's a bit more to it, but that's the basic intuition.
For Hawking radiation to be wrong would require some fundamental changes to GR, QFT, or both.
It's pretty widely accepted, though. Hawking himself hated the idea, so you can expect he did the calculations thoroughly.
I love that major scientists have had an intense hatred for concepts forced upon them by the universe. Einstein and quantum mechanics come to mind.
Not before efficiently converting a large amount of mass into usable energy.
But you want that to happen in space and to control the output of energy.
Otherwise you just have a bomb.
The difference between a bomb and a reactor is just clever engineering.
Dual use technology, you say?
In the same way that atomic weapons and radioisotope generators both convert mass into energy. It's just a matter of slightly different timescales.
How could we harness this energy and make it usable?
You use it to boil water.
The real question is whether we'll get black-hole- or antimatter-powered steam engines before GTA 6.
It's almost a meme at this point
If I knew that, I'd probably have more important things to do than comment here.
You could pen a carefully worded letter of demands and send it to some billionaire? A bit on the risky side, but hey, you only live once, etc.
We know how to make antimatter and have actually done it. We have no realistic way to obtain a black hole of any size.
Depends. Do we know how to obtain a miniature black hole?
There have been several proposals. This paper proposes a feasible mechanism [1]:
-"a SBH could be artificially created by firing a huge number of gamma rays from a spherically converging laser. The idea is to pack so much energy into such a small space that a BH will form."
[1] https://arxiv.org/abs/0908.1803
The biggest problem is that if you're creating it with lasers, you're only going to get out the energy you put in. You really want to be able to feed it matter, which would effectively make it an anything-to-gamma-radiation converter; but that means feeding it quite a lot of matter against the radiation pressure of all the energy coming out. The paper assumes a worst case of not being able to feed the black hole at all, but (in my skim) it doesn't address the fact that this means putting in all the energy you'll use over the black hole's lifetime at the moment of its creation, which seems far more outrageously infeasible than merely creating a black hole at all.
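For a rough feel of the scales involved, here is a minimal sketch (not from the paper) using the textbook Hawking power and lifetime formulas for an uncharged, non-rotating hole that is never fed; the two masses are arbitrary illustrative values:

    import math

    # Standard Hawking-evaporation estimates for a Schwarzschild black hole:
    #   power    P = hbar * c**6 / (15360 * pi * G**2 * M**2)
    #   lifetime t = 5120 * pi * G**2 * M**3 / (hbar * c**4)
    hbar = 1.055e-34   # J*s
    c = 3.0e8          # m/s
    G = 6.674e-11      # m^3 / (kg s^2)

    def hawking_power(M):      # watts
        return hbar * c**6 / (15360 * math.pi * G**2 * M**2)

    def hawking_lifetime(M):   # seconds
        return 5120 * math.pi * G**2 * M**3 / (hbar * c**4)

    for M in (1e6, 1e9):       # hypothetical masses in kg
        print(f"M = {M:.0e} kg: E = Mc^2 = {M * c**2:.1e} J, "
              f"P = {hawking_power(M):.1e} W, "
              f"lifetime = {hawking_lifetime(M):.1e} s")

A 10^6 kg hole radiates around 10^20 W and evaporates in under two minutes, while a 10^9 kg hole lasts millennia but represents roughly 10^26 J of mass-energy, all of which, per the point above, would have to be supplied at creation if the hole can't be fed.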
There’s a recent paper on the formation of such a “kugelblitz”; it argues that this is infeasible.
https://arxiv.org/abs/2405.02389
The Romulan Empire does this.
They made a movie about this. It didn’t end so well for the crew.
I admit to invoking the phrase “Where we’re going, we won’t need eyes to see” at least once a year when something feels like it’s going horribly wrong.
> a miniature charged black hole that can be fed regular matter to produce power,
What form of power and through what principle?
Hawking radiation, I think. Yes, this is at best speculatively feasible.
Probably more like a water wheel - matter spinning around the hole can be accelerated.
A spacecraft carrying a black hole as a means of propulsion would probably have a poor power-to-weight ratio.
Not at all. It would have one of the best power-to-weight ratios possible.
Now as to whether you could use all that power....
Before we get too excited, this current "breakthrough" is making less than 1 antihydrogen atom per second. This corresponds to a delivered annihilation power of less than 1 nanowatt.
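As a rough check of that figure (assuming each anti-atom annihilates with one ordinary hydrogen atom; the constants are standard values):

    # Annihilation power from ~1 antihydrogen atom per second.
    m_H = 1.67e-27      # mass of a hydrogen (or antihydrogen) atom, kg
    c = 3.0e8           # speed of light, m/s
    rate = 1.0          # anti-atoms produced per second

    energy_per_annihilation = 2 * m_H * c**2    # mass of both atoms -> radiation, ~3e-10 J
    power = rate * energy_per_annihilation
    print(f"{power * 1e9:.2f} nW")              # ~0.3 nW, well under a nanowatt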
Neutrons were first definitively observed in 1932.
First nuclear reactor was 1942, and bomb was 1945.
Once the science is established, we have smart engineers who make short work of it.
Fusion energy is really the only counterexample in history, which makes me think we are still missing some crucial physics about how it works, for example in stars: specifically, the particle-physics view of how it's reliably triggered with minimal energy.
The antiproton decelerator at CERN has been operational for 25 years, and they have plenty of smart engineers there. Unlike in the 1940s, the underlying physics has been well understood for many decades. I would argue that nuclear fission is the counterexample that happens to be surprisingly easy to do.
CERN is trying to do fundamental physics, not trying to weaponize antimatter. If/when it comes to that, the pace will pick up.
Also, 25 years to the breakthrough discussed in the article seems like a reasonably good pace.
All experiments at the AD are strongly limited by the low rates. If there was a straightforward way to improve this by many orders of magnitude, they would have done it a long time ago.
> Fusion energy is really the only counterexample in history, which makes me think we are still missing some crucial physics about how it works
This is magical thinking. We know how fusion works in great detail. And “reliably triggered with minimal energy” is essentially not a thing, unless by minimal energy you mean something like 10 million times the energy of an air particle at room temperature, for every particle in a reactor.
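A back-of-the-envelope check on that "10 million times" figure, comparing the Coulomb barrier for two protons at a few femtometres of separation (an assumed, illustrative distance) with thermal energy at room temperature:

    # Coulomb barrier vs. room-temperature thermal energy (order of magnitude only).
    e = 1.602e-19        # elementary charge, C
    k_e = 8.988e9        # Coulomb constant, N*m^2/C^2
    k_B = 1.381e-23      # Boltzmann constant, J/K

    r = 3e-15                                   # assumed nuclear separation, m
    coulomb_barrier = k_e * e**2 / r            # ~7.7e-14 J  (~0.5 MeV)
    room_temp_energy = k_B * 300                # ~4.1e-21 J  (~0.026 eV)
    print(f"{coulomb_barrier / room_temp_energy:.1e}")   # ~2e7, tens of millions

Real thermal reactors run well below the full barrier (tens of keV) thanks to quantum tunnelling and the high-energy tail of the velocity distribution, but the gap from everyday energies really is that enormous.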
What we’re trying to do is recreate the conditions at the core of a star - which is powered by gravity due to hundreds of thousands of Earth masses. And since we don’t have the benefit of anything like that much gravity, we actually have to make our plasmas significantly hotter than the core of a star. And then contain that somehow, in a way that can be maintained over time despite how neutron radiation will compromise any material used to house it.
The reality is, we still don’t know if usable fusion power is even possible - there’s no guarantee that it is - let alone how to achieve it. The state of the art is orders of magnitude away from even being able to break even and achieve the same power out as was put into the whole system.
> at the core of a star - which is powered by gravity
That is what I meant: I doubt we really understand what 'powered by gravity' means. You could win a Nobel prize or two by discovering all the details involved here. You would also win a Nobel prize by definitively proving that nothing special happens, you just have high temperatures and high pressures.
The way we are trying to study fusion is like rubbing larger and larger rocks to produce more fire.
The processes involved in solar fusion have been well understood since the 1930s [1,2]. Hans Bethe won a Nobel Prize for this in 1967. The problem is that one cannot produce stellar densities and pressures in any kind of apparatus.
[1] https://en.wikipedia.org/wiki/CNO_cycle
[2] https://en.wikipedia.org/wiki/Proton%E2%80%93proton_chain
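For reference, the basic energy bookkeeping behind those articles, sketched with standard atomic masses (the numbers are textbook values, not taken from the links):

    # Net result of the proton-proton chain: 4 hydrogen -> 1 helium-4.
    # The mass defect (about 0.7% of the input mass) is released as energy.
    u_to_kg = 1.6605e-27
    c = 3.0e8

    m_hydrogen = 1.00783   # atomic mass units (includes the electron)
    m_helium4  = 4.00260   # atomic mass units

    defect_u = 4 * m_hydrogen - m_helium4              # ~0.0287 u
    energy_J = defect_u * u_to_kg * c**2               # ~4.3e-12 J (~26.7 MeV)
    print(f"{defect_u / (4 * m_hydrogen):.4f}")        # ~0.007 of the mass converted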
There was also a great episode on antimatter engines recently by PBS Space Time.
https://www.youtube.com/watch?v=eA4X9P98ess
What's the key point regarding how we would get a bajillion times more anti-matter than we can now generate, and without expending all the energy we now expend on getting it?
His point seems to be that we haven't yet seriously tried optimizing for energy efficiency of producing antimatter. It's a call to action. If we actually tried, it's plausible that we could get to a level that, while still fantastically inefficient in an absolute sense, would still be worthwhile for spaceflight propulsion, where energy density is vitally important. As far as I know, antimatter is the most energy dense fuel possible in known physics by many orders of magnitude.
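For context on that last claim, a rough comparison of energy released per kilogram of fuel; the non-antimatter figures are approximate textbook values, not from the comment:

    # Approximate energy released per kg of fuel (order-of-magnitude values).
    c = 3.0e8
    energy_per_kg = {
        "matter-antimatter annihilation": c**2,     # ~9e16 J per kg of reacted mass
        "D-T fusion":                     3.4e14,   # J per kg of fuel
        "U-235 fission":                  8.2e13,   # J per kg of fuel
        "kerosene + oxygen (chemical)":   1.0e7,    # J per kg of propellant, roughly
    }
    for name, e in energy_per_kg.items():
        print(f"{name:32s} ~{e:.1e} J/kg")

Annihilation beats fusion by a factor of a few hundred and chemical propellants by roughly ten orders of magnitude, which is why it keeps coming up for propulsion despite the production problem.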
Also he proposes a few ways that antimatter could be practically used for propulsion, including as a catalyst for fission which seems interesting.
As a side note, it's mind-boggling that the overwhelming majority (more than 98%) of the visible universe's mass comes from only the two lightest chemical elements, namely hydrogen and helium.
> it's mind-boggling that the overwhelming majority
Is it, though? I mean, literally everything has to start there, and the only way to get to heavier elements is via stars and many, many iterations.
it's not like heavier things popped into existence.... or did they...
There is a theory that primordial black holes formed in the very early universe. I'm not sure when this process would happen relative to the formation of atoms. But, if it actually happened, it would have been long before stars started forming.
Yes, it's a little mind-boggling because the typical human context is this rocky ball with what is ultimately a very uncommon distribution of heavy elements. It's a strange feeling to know that almost everything is utterly unlike everyday human experience. If you turn down the "uhm, acksshuwlly" a few notches, I think the parent post's point is quite obvious.
https://xkcd.com/2640/
The alt text is on point.
And Earth contains so much of the heavier elements.
As I learned it long ago in school, elements up to the mass of iron are formed by stellar fusion. That's the point where fusion is no longer exothermic. Any element on Earth that is heavier than iron is the product of a supernova. So we live on a ball of supernova debris.
Most of what we live on, the vast majority, is iron or lighter. So it's more that we're sprinkled with supernova debris. But we are made out of stardust, so that's something.
> elements up to the mass of iron are formed by stellar fusion
And elements down to the mass of iron can also be formed. But iron is at the bottom of the well.
How many times does the rate need to be increased 10x before it's a problem?
If I remember correctly, 6.022x10^23 protons (with electrons) is one gram of hydrogen.
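Putting the two together, a quick estimate of how far the ~1 atom per second rate quoted above is from producing even a gram per year:

    import math

    avogadro = 6.022e23          # antihydrogen atoms in one gram (molar mass ~1 g/mol)
    rate = 1.0                   # atoms per second, per the comment further up
    seconds_per_year = 3.156e7

    atoms_per_year = rate * seconds_per_year
    shortfall = avogadro / atoms_per_year     # factor still missing for 1 gram/year
    print(f"~{shortfall:.1e}x, about {math.log10(shortfall):.0f} orders of magnitude")

So even repeated 10x improvements leave the rate roughly sixteen factors of ten away from a gram per year, before worrying about trapping or the energy cost.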