Temperatures there reach an astonishing 30,000-50,000 kelvin.
In 1977, NASA launched the Voyager probes to study the Solar System's edge, and the interstellar medium between the stars. One by one, they both hit the "wall of fire" at the boundaries of our home system, measuring temperatures of 30,000-50,000 kelvin (54,000-90,000 degrees Fahrenheit) on their passage through it.
There are a few ways you could define the edge of the Solar System – for instance, where the planets end, or at the Oort cloud, the boundary of the Sun's gravitational influence where objects may still return closer to the Sun. One way is to define it as the edge of the Sun's magnetic field, where it pushes up against the interstellar medium, known as the heliopause.
"The Sun sends out a constant flow of charged particles called the solar wind, which ultimately travels past all the planets to some three times the distance to Pluto before being impeded by the interstellar medium," NASA explains. "This forms a giant bubble around the Sun and its planets, known as the heliosphere."
The heliopause lies at the edge of that bubble.
"The boundary between solar wind and interstellar wind is the heliopause, where the pressure of the two winds are in balance. This balance in pressure causes the solar wind to turn back and flow down the tail of the heliosphere," NASA continues.
"As the heliosphere plows through interstellar space, a bow shock forms, similar to what forms as a ship plowing through the ocean."
On August 25, 2012, Voyager 1 became the first spacecraft to go beyond the heliosphere and cross the heliopause, followed by Voyager 2 in 2018. Prior to the Voyager spacecraft crossing the heliopause, scientists didn't really know where the boundary would be, but the fact that the probes hit it at different distances helped support a few predictions about it.
"Scientists expected that the edge of the heliosphere, called the heliopause, can move as the Sun's activity changes, sort of like a lung expanding and contracting with breath," a NASA statement explains. "This was consistent with the fact that the two probes encountered the heliopause at different distances from the Sun."
While not a hard edge, or a "wall" as it has sometimes been called, here both spacecraft measured temperatures of 30,000-50,000 kelvin (54,000-90,000 degrees Fahrenheit), which is why it is sometimes also referred to as a "wall of fire". The craft survived the wall because, although the particles they measured were extremely energetic, collisions in this particle-sparse region of space are so rare that not enough heat could be transferred to the duo.
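A back-of-the-envelope estimate shows why a tenuous 40,000 K plasma delivers almost no heat. This Python sketch uses assumed round numbers for density and temperature (typical of the very local interstellar medium, not actual Voyager telemetry):

```python
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K
m_p = 1.67262192e-27    # proton mass, kg

# Assumed round numbers, not Voyager telemetry:
T = 40_000              # plasma temperature near the heliopause, K
n = 1e5                 # density: ~0.1 particles per cm^3 -> 1e5 per m^3

# Mean thermal speed of a Maxwellian gas of protons
v_mean = math.sqrt(8 * k_B * T / (math.pi * m_p))

# One-sided kinetic energy flux onto a surface: (n * v_mean / 4) * 2 k_B T
q = (n * v_mean / 4) * 2 * k_B * T

print(f"mean proton speed: {v_mean / 1000:.0f} km/s")
print(f"heating flux: {q:.1e} W/m^2")  # around a nanowatt per square meter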
The Voyager spacecraft continue to send us data from beyond this "wall", the only two probes that have crossed it so far, nearly 50 years after they were launched. Together they have found several surprises on our first glimpse outside the Solar System.
"An observation by Voyager 2's magnetic field instrument confirms a surprising result from Voyager 1: The magnetic field in the region just beyond the heliopause is parallel to the magnetic field inside the heliosphere," NASA explained, shortly after one such surprise.
"With Voyager 1, scientists had only one sample of these magnetic fields and couldn't say for sure whether the apparent alignment was characteristic of the entire exterior region or just a coincidence. Voyager 2's magnetometer observations confirm the Voyager 1 finding and indicate that the two fields align."
It’s very odd to think of something extremely hot but with almost no density, and therefore very little heat transfer.
Closer to home you can get similar things when you grind metals for instance. The sparks are at extremely high temperatures, but won't typically start fires or cause burns (it depends) because they're just too small to impart much actual energy to anything they touch.
You only get fire risks when the things they touch are themselves tiny (like dust), so they're unable to absorb and spread the heat.
A similar thing happens when you bake with tinfoil. The foil will be at like 350 F, but you can still touch it basically immediately if you're willing to gamble that nothing with thermal mass is stuck to it where you can't see. It just doesn't have enough thermal mass on its own to burn you, but if there's a good-sized glob of cheese or water or something on the other side you can really be in for a nasty surprise.
I wonder if actual tin foil would behave differently from the aluminum foil that we are all now using.
Tin foil and aluminum foil do have generally different properties. For instance, tin foil can disrupt mind control and aluminum foil can't, and corrosion effects are likely at least different. But any thin metal foil isn't going to be able to hold much heat, because there's just not that much material.
Ooooh! I get to share my favorite Stack Exchange answer!
I do not think that you are correct.
"The thermal conductivity of aluminum is 237 W/mK, and that of tin is only 66.6 W/mK, so the thermal conductivity of aluminum foil is much better than that of tin foil. Due to its high thermal conductivity, aluminum foil is often used in cooking, for example, to wrap food to promote even heating and grilling, and to make heat sinks to facilitate rapid heat conduction and cooling."
Well, I guess Germans are in for a nasty surprise since they call that garment an "Aluhut" in the respective circles where such things are required for street cred ...
The anti mind control tinfoil hat was invented in 1926 by SciFi author Julian Huxley, brother of Aldous.
> tin foil can disrupt mind control
You're not weaponizing Gell-Mann amnesia against us are you?
Not at all. Just doing my part to point out, whenever it's topical, that tin foil hats work and aluminum foil hats don't. There's a reason they want you to call aluminum foil by the wrong name.
Committed to the bit.
Kudos
Mind control waves are pure magnetic fields as opposed to traditional EM waves. So although aluminum can act as a Faraday cage, it's not a magnetic shield and hence not capable of stopping mind control.
Tin foil keeps the illuminati out of your brain. That's why they cancelled it. And I have proof!
Alec Steele (youtube blacksmith) installed a particulate filter into his grinding room before he branched off into exotic metals. He also started keeping his shop floor a lot cleaner.
Both because you probably shouldn't breathe that shit in, and also magnesium and titanium dust are very enthusiastic about combusting. Everyone knows about magnesium but nobody knows titanium is almost as surly.
True, dust of combustible stuff can be very dangerous if it accumulates, and the things that will combust as dust are not terribly predictable. E.g., flour is a _serious_ explosion risk if it's mixed with the right amount of air.
> magnesium and titanium dust are very enthusiastic about combusting
Iron dust too. Make sure to keep it away from your pre-lit candles.
Don't let your rust and aluminum filings mix too well either. It's bad.
OK, this has gotten silly.
Almost ANY small particle in a light-density air suspension (dust cloud) will ignite. Certainly anything that oxidizes is prone to going WHOOF! around flames.
This includes non-dairy creamers, paint spray, insecticide sprays (canned or pumped), and sawdust tossed over a fire.
Corn silos know about this intimately.
I was referring to thermite though.
So am I to understand we could theoretically make tempered magnesium swords that explode when struck?
It’s a matter of surface area. You’d have to ask a chemist, this is far above my pay grade.
I think similar of radiant heaters. The heating elements are clearly very hot, glowing even, but you never reach equilibrium with it: your leg will not get that hot. This is because your leg is cooled by conduction and convection (which is basically conduction again) and possibly a little evaporation.
Yeah, radiative cooling/heating is actually super slow compared to any other type. This is why it's so hard to cool anything in space, it's your only option and it kind of sucks at its job.
Wouldn't the other option be ejecting heat "ballast"?
I'm sure that would lead to other issues (ejecting it would push you, but you could always eject it opposite the direction you want to go; that's how rockets work in the first place), but what if you had super-cooled ice in a thermos-like enclosure, and as you needed to cool you pulled some out, let it melt, vaporized it, superheated the steam, then vented that out the back?
I think you could do that, but mass in space is kind of hard to come by. If it wasn't (like if you're on the moon) you could just use the mass for conduction anyway. If you have to ship it up and consume it like that, that's expensive and limiting.
I'm not sure you can practically superheat the ballast without just causing more heat that you have to deal with. Maybe a heat pump works? Something about that feels vaguely wrong.
If you're about to generate super high temperatures (via a heat pump), might as well use a radiator again. Radiative heat transfer rate scales with temperature to the fourth power. Any such system requires energy, however.
The other thing that helps you is that you're made mostly of water, which is one of the substances with the highest heat capacity. So it's hard to heat up or cool down.
> if there's a good-sized glob of cheese or water or something on the other side you can really be in for a nasty surprise.
My next band will be named Velveeta Disfigurement. The stuff never unmelts.
Great examples!
That's actually most of space. Space is a very hot environment, especially where we are so close to the sun. Think about it. When you stand outside in the sun you heat up. All that heat is coming from the sun. But a lot of it was filtered by the atmosphere, so if you're in space near earth it will be hotter than standing at the equator on a sunny day, in terms of radiation.
Then there's the fact that heat is very difficult to get rid of in space. The ISS's radiators are much bigger than its solar panels. If you wanted a very long EVA spacesuit you'd have to have radiators much bigger than your body hanging off of it. Short EVAs are handled by starting the EVA with cold liquids in the suit and letting them heat up.
All of the mockups of starships going to Mars mostly fail to represent where they're going to put the radiators to get rid of all the excess heat.
> If you wanted to have a very-long eva spacesuit you'd have to have radiators much bigger than your body hanging off of it.
I was curious about this! The Extravehicular Mobility Units on the ISS have 8 hours of life support running on 1.42 kg of LiOH. That releases ~2 kJ per gram used, so roughly 99 watts averaged over the EVA.
The 390 Wh battery puts out an average of 50 watts.
And the human is putting out at minimum 100 watts with bursts of 200+.
Long term it's probably reasonable to need at least 200 watts of heat rejection. That's about a square meter for most radiators, but it needs to be facing away from the station. You could put zones on the front/back and swap them depending on direction, as long as you aren't inside an enclosed but evacuated area, like between the Hubble and the Shuttle. The human body has a surface area of roughly 2 m^2, so it's definitely not enough to handle it: half of that area is on your arms or between your legs and will just be radiating onto itself.
It's also not very feasible to have a sail-sized radiator floating around you. You'd definitely need a more effective radiator: something that absorbs all your heat and glows red hot to dump all that energy.
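For a rough sanity check of the "about a square meter" figure, here's the ideal Stefan-Boltzmann area for 200 W at roughly skin temperature. The emissivity and the empty-sky (0 K background) assumption are mine:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def ideal_radiator_area(power_w, temp_k, emissivity=0.9):
    """Area needed to radiate power_w into empty (0 K) surroundings."""
    return power_w / (emissivity * SIGMA * temp_k ** 4)

area = ideal_radiator_area(200, 300)  # 200 W at ~300 K
print(f"{area:.2f} m^2")  # ~0.5 m^2 in the ideal case
```

Real panels do noticeably worse than this ideal once you account for environmental loads, plumbing, and pointing, hence "about a square meter".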
Or, evaporative cooling for spacewalks. Water heat of evaporation at 25°C is 678 Wh/kg, so 200W of heat is about 0.3 kg per hour. Quite manageable!
EDIT: Apparently the Apollo suits did this. An interesting detail is that they used sublimation (evaporating ice directly to vapor), because I suppose that's a lot more practical to exchange the heat.
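The 0.3 kg per hour figure is just the heat load divided by water's latent heat; a quick Python check using the numbers above:

```python
h_vap = 678          # latent heat of water at 25 C, Wh/kg (~2442 kJ/kg)
heat_load = 200      # W, assumed spacewalk heat rejection

kg_per_hour = heat_load / h_vap  # watts are just Wh per hour
print(f"{kg_per_hour:.2f} kg/h")  # ~0.30 kg of water per hour
```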
But boiling water is just a few hundred Kelvin, this is tens of thousands. Would EVA spacesuits be able to radiate that much away if it was really that hot but for the atmosphere absorbing some?
I know it is much hotter, but that's way way hotter and they only find it at a "wall" way farther out.
This is more the temperature of the solar wind, which dwarfs the steady-state temperature you'd reach from photonic solar radiation at any distance. The Sun's blackbody temperature is something like 5,000-7,000 K, so you won't see objects in the Solar System heated much above that, even with full reflectors filling the rear field of view with more Sun and being near the surface of the Sun. Maybe a tiny amount higher from stellar wind, tidal friction, or nuclear radiation from the object's own material, but I don't think much more.
> Would EVA spacesuits be able to radiate that much away if it was really that hot but for the atmosphere absorbing some?
Yes! The tiny number of particles are moving really fast, but there are very few of them. We are talking about vacuum that is less than 10^-17 torr. A thermos is about 10^-4 torr. The LHC only gets down to 10^-10 torr. At those pressures you can lower the temperature of a kilometer cube by 10 thousand kelvin by raising the temperature of a cubic centimeter of water by 1 kelvin. There is very little thermal mass in such a vacuum which is why temperature can swing to such wild levels.
This is also why spacecraft have to reject heat purely using radiation. Typically you heat up a panel with a lot of surface area using a heat pump and dump the energy into space as infrared. Some cooling paints on roofing do this at night which is kind of neat.
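To put numbers on how little thermal mass such a vacuum has, here's a Python sketch; the pressure and temperature are illustrative values in line with the figures above:

```python
k_B = 1.380649e-23   # Boltzmann constant, J/K
TORR = 133.322       # Pa per torr

# Illustrative interstellar-like conditions:
P = 1e-17 * TORR     # pressure, Pa
T = 30_000           # gas temperature, K
n = P / (k_B * T)    # number density: a few thousand particles per m^3

# Heat gained by 1 km^3 of this (monatomic) gas warming by 10,000 K
dE_gas = 1.5 * n * k_B * 10_000 * 1000.0 ** 3

# Heat to warm 1 cm^3 of water by 1 K (1 g at 4.18 J/(g K))
dE_water = 4.18

print(f"n = {n:.0f} /m^3, gas: {dE_gas:.1e} J, water: {dE_water} J")
```

The cubic kilometer of gas holds under a microjoule of extra heat, while the single cubic centimeter of water needs millions of times more, so the comparison holds with room to spare.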
To add to this: most of the heat the EVA suits deal with is generated by the human inside, not the giant ball of nuclear fusion 8 light minutes away.
Solar radiation is roughly 1 kilowatt per square meter. Human beings generate about 0.1 kilowatts. A good suit will try to reject as much of that kilowatt as possible. Also your dark side will radiate heat but the temperature differential is much lower.
Suits are insulating for a reason. You want to prevent heating on the sun side and prevent too much cooling on the space side. Your body is essentially encapsulated in a giant thermos.
Cooling is achieved using a recirculating cold water system that is good for a few hours of body heat. Water is initially cooled by the primary life support system of the spacecraft before an EVA. Pretty much it starts off pretty cold and slowly over time comes up to your body heat. Recent designs use evaporative cooling to re-cool the water.
Life support systems are so cool.
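Rough math on how long pre-chilled water lasts; the coolant mass and allowed temperature swing here are my guesses for illustration, not actual suit specs:

```python
c_water = 4180   # specific heat of water, J/(kg K)

# Guessed values for illustration, not real suit numbers:
m = 4.0          # kg of coolant water
dT = 30.0        # K of allowed warming (~5 C up to ~35 C)
P = 100.0        # W of body heat absorbed

hours = m * c_water * dT / P / 3600
print(f"{hours:.1f} hours")  # ~1.4 h per 4 kg of cold water
```

That lines up with "good for a few hours" once you let the load dip below 100 W at rest.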
Absorbed light too but that's a bit easier to deal with and is why most things are white or reflective on the outside of anything in space that's not intentionally trying to absorb heat.
At this low density, temperature is very different from what you are used to experiencing. You have to work through a heat flux balance to really get a grasp of it.
Temperature is just the heat of particles moving. In the extreme case of a handful of N2 molecules moving at 1% the speed of light, it has a temperature of something like 9 billion Kelvin. But it's not going to heat you up if it hits you.
Even at low density, if it were a large volume, solid objects would heat up to that ambient temp. But this one is a minor volume and you would still be radiating it away much faster and not reach anywhere near the ambient temperature. In the middle of a large volume, though, you'd get too much incoming thermal radiation from particles within the volume and not be able to shed heat anywhere through radiation.
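The "9 billion Kelvin" figure is easy to reproduce from the kinetic definition of temperature, treating the given speed as a typical thermal speed:

```python
k_B = 1.380649e-23          # Boltzmann constant, J/K
m_N2 = 28 * 1.66054e-27     # mass of an N2 molecule, kg

v = 0.01 * 2.998e8          # 1% of the speed of light, m/s
# Equipartition: (1/2) m <v^2> = (3/2) k_B T  =>  T = m v^2 / (3 k_B)
T = m_N2 * v ** 2 / (3 * k_B)
print(f"{T:.1e} K")         # on the order of 10 billion kelvin
```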
Reminds me of the book Saturn Run, by John Sandford - which has a lot of effort put into the technology and radiation of heat in their space ship. Fun science fiction book.
I recall a good treatment of this issue in the early part of Joe Haldeman's classic The Forever War. Highly recommended.
Started reading a preview of that. It starts really well. Thanks.
Lack of radiators is endemic in sci-fi. All those cool starships and torch rockets would bake their crews and then melt.
I didn't like the Avatar films except for the starships, which are among the more physically realistic in construction including massive radiators. They'd probably need to be even bigger though IRL if you're talking about something loony like an antimatter rocket.
Okay this may sound silly but what about a solar powered ac for cooling? Like solar radiation is 6000K right, so if you used that to pump your waste heat into say a 1000K radiator (aimed away from the sun obviously) I'm thinking it might give you plenty of negentropy but also radiate away heat at a decent pace.
Skip the Sun! There's an "atmospheric window" in the IR. If you make a material that emits/absorbs (they're reversible) only in that region, and don't expose it to the Sun, then it will cool down toward the temperature of space, roughly 3 K or -270 °C. In practice, it won't cool down anywhere near that much. It'll steal energy from its surroundings due to conduction/convection, and the amount of energy that's actually radiated in this band by a slightly-below-room-temperature material is pretty minimal. Still neat: entirely passive cooling by radiating to space!
https://en.wikipedia.org/wiki/Atmospheric_window
https://en.wikipedia.org/wiki/Passive_daytime_radiative_cool...
Note that for most of the United States, you must resurface your roof twice a year for this to be effective.
It's a thing from thousands of years ago https://en.m.wikipedia.org/wiki/Yakhch%C4%81l and today https://en.m.wikipedia.org/wiki/Passive_daytime_radiative_co...
for PDRC there are a couple good videos about it from NightHawkInLight https://youtu.be/N3bJnKmeNJY?t=19s, https://youtu.be/KDRnEm-B3AI and Tech Ingredients https://www.youtube.com/watch?v=5zW9_ztTiw8 https://www.youtube.com/watch?v=dNs_kNilSjk
ACs don't get rid of heat, they just move it around. At some point you need to put the heat somewhere, and then you're just back to giant radiators.
https://en.wikipedia.org/wiki/Absorption_refrigerator
> An absorption refrigerator is a refrigerator that uses a heat source to provide the energy needed to drive the cooling process. Solar energy, burning a fossil fuel, waste heat from factories, and district heating systems are examples of heat sources that can be used. An absorption refrigerator uses two coolants: the first coolant performs evaporative cooling and then is absorbed into the second coolant; heat is needed to reset the two coolants to their initial states.
https://www.scientificamerican.com/article/solar-refrigerati...
> Fishermen in the village of Maruata, which is located on the Mexican Pacific coast 18 degrees north of the equator, have no electricity. But for the past 16 years they have been able to store their fish on ice: Seven ice makers, powered by nothing but the scorching sun, churn out a half ton of ice every day.
It literally doesn't matter what your refrigeration process is. You have to "reject" the heat energy at some point. In space, you can only do that with large radiators.
There is no physical process that turns energy into cold. All "cooling" processes are just a way of extracting heat from a closed space and rejecting it to a different space. You cannot destroy heat, only move it. That's fundamental to the universe. You cannot destroy energy, only transform it.
Neither link is a rebuttal of that. An absorption refrigerator still has to reject the pumped heat somewhere else. Those people making ice with solar energy are still rejecting at minimum the ~334 kJ/kg to the environment.
An absorption refrigerator does not absorb heat, it's called that because you are taking advantage of some energy configurations that occur when one fluid absorbs another. The action of pumping heat is the same.
The question was 'what about a solar powered ac for cooling?', yes?
Giant radiators don't make ice.
The proposed method of pumping heat into someplace hot to make it hotter doesn't work. But there are definitely ways to do solar-powered AC for cooling.
I provided links. It's how propane-powered fridges work. And it was a homework problem in thermodynamics class.
Since this discussion is still active, I think hwillis was the only one that got my idea. Pumping heat into the radiators will make them hotter than they would be by just passive conduction, and then the T^4 radiation scaling means that the radiators will start radiating a lot, i.e. a lot of heat will be sent into deep space.
Yes? That's in the atmosphere where heat rejection is a vastly easier problem than in vacuum, thanks to convection.
Radiative heat transfer is proportional to T^4. If your suit is 300 K (80 °F), bumping the temperature up by 100 K lets you radiate 3.16x as much heat from the same area.
Per wiki: radiators reject 100-350 watts per m^2 and weigh ~12 kg per m^2. Not unlikely you would need 10x as much radiator as server. You need about as much area for radiators as you do for solar panels, but radiators are much heavier.
That also makes nuclear totally infeasible: since turbines are inefficient you'd need 2.5x as much radiator to reject waste heat. Solar would be much lighter.
https://en.wikipedia.org/wiki/Spacecraft_thermal_control#Rad...
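The 3.16x figure is just the fourth-power ratio from the Stefan-Boltzmann law:

```python
def radiated_power_ratio(t_hot_k, t_cold_k):
    """Stefan-Boltzmann: radiated power scales as T^4."""
    return (t_hot_k / t_cold_k) ** 4

ratio = radiated_power_ratio(400, 300)  # suit at 300 K bumped up 100 K
print(f"{ratio:.2f}x")  # ~3.16x the radiated heat from the same area
```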
Nuclear power is very feasible in space. Perhaps you're overlooking that radiated power scales with the quartic of absolute temperature (T⁴); it's not difficult at all to radiate heat from a hot object, the way it is for a room-temperature one.
(How hot? I won't quote a number, but space nuclear reactors are generally engineered around molten metals).
Yeah, fair to say it's feasible. ROSA on the ISS produces 240 W/m^2 and weighs 4 kg/m^2.
The S6W reactor in the Seawolf submarines runs at ~300 °C and produces 177 MW of waste heat for 43 MWe. If the radiators are 12 kg/m^2 and reject 16x as much heat (call it 3600 W/m^2), then you can produce 875 watts of electricity per m^2 of radiator, or 290 watts at the same weight as the solar panels. Water coolant at 300 °C also needs to be pressurized to 2000+ PSI, which would require a much heavier radiator, and the weight of the reactor, shielding, turbines, and coolant makes it very hard to believe it could ever be better than solar panels, but it isn't infeasible.
Plus, liquid metal reactors can run at ~600 C and reject 5x as much heat per unit area. They have their own problems: it would be extremely difficult to re-liquify a lead-bismuth mix if the reactor is ever shut off. I'm also not particularly convinced that radiators running at higher temperatures wouldn't be far heavier, but for a sufficiently large station it would be an obvious choice.
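Checking the 875 W figure from the numbers quoted above; the 3600 W/m² rejection rate is the assumed 16x scaling, not a measured value:

```python
# Numbers quoted upthread for the S6W reactor:
electric_mw = 43.0            # MWe produced
waste_mw = 177.0              # MW of waste heat

reject_w_per_m2 = 3600.0      # assumed hot-radiator rejection rate, W/m^2

# Electricity produced per m^2 of radiator needed to dump the waste heat
w_e_per_m2 = reject_w_per_m2 * electric_mw / waste_mw
print(f"{w_e_per_m2:.0f} We per m^2 of radiator")  # ~875 W
```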
It goes up to 1,344 °C with Li, I think—it's a very different engineering space from the stuff on Earth.
The Soviet ones used K (or maybe NaK eutectic); there's a ring of potassium metal dust around the Earth people track by radar (highly reflective)—a remnant from one of them exploding.
The idea is not completely without merit. In a gravity-free environment, you can have much bigger and much thinner structures than are possible on Earth.
Also the radiated heat from the Sun won't have much effect if the heat-sink panels are edge-on to the Sun, with both faces pointing out to deep space to radiate away the heat.
> Think about it. When you stand outside in the sun you heat up. All that heat is coming from the sun. But a lot of it was filtered by the atmosphere, so if you're in space near earth it will be hotter than standing at the equator on a sunny day, in terms of radiation.
I think you’re missing the key point - heat transfer. The reason we feel hot at the beach is not solely because of heat we absorb directly from solar energy. Some of the heat we feel is the lack of cooling because the surrounding air is warm, and our bodies cannot reject heat into it as easily as we can into air that is cool. And some is from heat reflecting up from the sand.
There's a heat wave across much of the US right now. Even when the sun goes down it will still be hot. People will still be sweating, doing nothing, sitting on their porches. Because the air and the surrounding environment have absorbed the sun's heat all day and are storing it.
That’s what you’re neglecting in your analysis of space.
The plasma inside arc lamps (e.g. xenon headlights) is somewhere around 6,000-10,000 K.
Then there are things like fusion reactors where the temperature is in the millions of degrees and the whole point of the design is to keep the heat in.
Edit: although interestingly in an electric arc, often the electrons have a higher kinetic energy (temperature) than the heavier ions and atoms in the plasma. It's a highly non-equilibrium situation. That plays into your "high temperature, slow transfer" thing quite nicely: even the atoms within the plasma don't reach the full temperature of the electrons.
Came to say this about fluorescents, but even the tungsten filament in an old-style bulb runs around 3,000 K, which is ~5,000 F.
What is the temperature on either side of this “wall”? My mental model here, which is probably incorrect, is that the “temperature” on the outside of the wall could be higher but the density is much lower, thus even less heat transfer going on (but, still, high energy particles that can hit you, registering a high temperature). I get all kinds of mixed up regarding the difference between heat transfer and measured temperature.
Temperature is a totally valid measurement. For physicists. Not really for clickbait articles. High energy particles wouldn't attract as many views.
If it were really that hot we'd never observe the CMB at a balmy 2.7K.
I thought the same thing too. It is very hot, without having very much heat - in a way.
The Parker Solar probe encounters a similar situation where it has to handle high amounts of direct radiation, but the latent/ambient environment is full of incredibly hot particles at very low density (because they are so hot) which means it isn't that hard to make the probe survive it.
It's one of the reasons I love Oxygen Not Included so much. That game's materials have both Thermal Conductivity and Specific Heat as stats, and density plays into it as well.
See also: the exosphere. Helium and Hydrogen in Earth's atmosphere float up above the others and form a layer which hangs out around 1000 °C. Despite the high temperature, you'd probably get hypothermia if you stayed too long in the shade at that altitude (supposing the low pressure didn't get you first).
Temperature, it would seem, is an idea that would only have developed at the bottom of a gravity well.
Just like the rest of space - very very cold, but very empty.
>It’s very odd to think of something extremely hot but with almost no density
Not at all odd, in fact very normal, consider any Hollywood actress who gets by on looks alone, your Pamela Andersons or Megan Foxes of the world.
Perhaps change the link to the original NASA JPL post: https://www.jpl.nasa.gov/news/voyager-2-illuminates-boundary...
Seems I no longer can edit it but that link doesn't directly reference the high temperature environment, unless I misread it?
I often think about how cold our lifeforms on earth are, relative to temperatures of things in the universe. 0 Kelvin is theoretical lowest possible temp, quasars are apparently > 10 trillion Kelvin (10,000,000,000,000K), yet all life we know of is between what, 250K and 400K?
Basically it's because the relevant structures are somewhat fragile. Matt Strassler has a good post about "why does everything we care about move so slowly compared to the cosmic speed limit?" (https://profmattstrassler.com/2024/10/03/why-is-the-speed-of...), and the answer is that we're made of atoms, atoms are held together by the electromagnetic force, and that force is only so strong; if things moved much faster, collisions would tear atoms apart. But of course life depends not only on atoms, but also on electromagnetic bonds much weaker than the ones that hold atoms together. So this limits how hot it can get.
If you’ll excuse a bit of trivia: SI units named after people are not capitalized. So we have newton, joule, weber, kelvin, named after Newton, Joule, Weber, and Kelvin. (But their abbreviations are capitalized: N, J, Wb, K.)
Correct. The parent of your post should have written "10 trillion kelvins", "10 terakelvins", or "10 TK". The article wrote "Temperatures there reach an astonishing 30,000-50,000 kelvin" instead of "kelvins" (or better yet, 30–50 kK).
Very few people use the unit kelvin correctly. ( https://www.reddit.com/r/Metric/comments/126sniq/everyone_mi... )
The only exception regarding capitalization is that the person Celsius is capitalized in the multi-word unit "degree(s) Celsius", and the pluralization is on "degree".
Good point about pluralization. I tend to be confused about that because we don't pluralize units in Norwegian (except the equivalent of degrees). But confusingly, in English, you sometimes see people trying to pluralize the abbreviations, such as kgs for kilograms. Or (even worse) ms for meters. That way madness lies.
"You aren't really famous in math or science until people stop capitalizing your name"
Joke I heard in the math department.
0 Kelvin is theoretical lowest possible temp
Let me introduce you to negative temperature systems!
Negative temperatures are hotter than positive temperatures, though, so this isn't really relevant to the parent comment.
I was aware of this, but you putting it into numerical terms rather than an intuitive understanding is really cool. Even a small fire is dramatically hotter than life, yet nothing in comparison to what happens outside of our relatively frozen little bubble here on Earth
We're also interestingly enough at around the geometric mean between atoms and stars! (as in the scale of humans)
Well, lifeforms on Earth are all pretty dependent on being water-based, specifically on water in the liquid state. Maybe there is a possibility of exotic life based on some other types of chemistry and/or phases of matter. But the fact that Earth happened to form in this particular Goldilocks zone for water-based life is probably why that's the only life we can see for now.
I have to mention Robert L. Forward’s Dragon Egg—it explores life on a white dwarf with nuclear reactions instead of chemical ones. Not the best book, IMHO, but a fun thought to entertain.
Well unless there's some ghost-like life form in a gas state, we sort of need the molecules to stay together to form life.