Voyager 1 stops communicating with Earth

2023-12-14 13:07 · www.cnn.com

NASA’s 46-year-old Voyager 1 spacecraft has experienced a computer glitch that prevents it from returning science data to Earth from the solar system’s outer reaches.


NASA’s Voyager 1 spacecraft has experienced a computer glitch that’s causing a bit of a communication breakdown between the 46-year-old probe and its mission team on Earth.

Engineers are currently trying to solve the issue as the aging spacecraft explores uncharted cosmic territory along the outer reaches of the solar system.

Voyager 1 is currently the farthest spacecraft from Earth at about 15 billion miles (24 billion kilometers) away, while its twin Voyager 2 has traveled more than 12 billion miles (20 billion kilometers) from our planet. Both are in interstellar space and are the only spacecraft ever to operate beyond the heliosphere, the sun’s bubble of magnetic fields and particles that extends well beyond the orbit of Pluto.

Initially designed to last five years, the Voyager probes are the two longest-operating spacecraft in history. Their exceptionally long lifespans mean that both spacecraft have provided additional insights about our solar system and beyond after achieving their primary goals of flying by Jupiter, Saturn, Uranus and Neptune decades ago.

But their unexpectedly lengthy journeys have not been without challenges.

Voyager 1 has three onboard computers, including a flight data system that collects information from the spacecraft’s science instruments and bundles it with engineering data that reflects the current health status of Voyager 1. Mission control on Earth receives that data in binary code, or a series of ones and zeroes.

But Voyager 1’s flight data system now appears to be stuck on auto-repeat, in a scenario reminiscent of the film “Groundhog Day.”

The mission team first noticed the issue November 14, when the flight data system’s telecommunications unit began sending back a repeating pattern of ones and zeroes, like it was trapped in a loop.

While the spacecraft can still receive and carry out commands transmitted from the mission team, a problem with that telecommunications unit means no science or engineering data from Voyager 1 is being returned to Earth.

The Voyager team sent commands over the weekend for the spacecraft to restart the flight data system, but no usable data has come back yet, according to NASA.

NASA engineers are currently trying to gather more information about the underlying cause of the issue before determining the next steps to possibly correct it, said Calla Cofield, media relations specialist at NASA’s Jet Propulsion Laboratory in Pasadena, California, which manages the mission. The process could take weeks.

The last time Voyager 1 experienced a similar, but not identical, issue with the flight data system was in 1981, and the current problem does not appear to be connected to other glitches the spacecraft has experienced in recent years, Cofield said.

As both Voyager probes experience new trials, mission team members have only the original manuals written decades ago to consult, and those couldn’t account for the challenges the spacecraft are facing as they age.

The Voyager team wants to consider all of the potential implications before sending more commands to the spacecraft to make sure its operations aren’t impacted in an unexpected way.

Voyager 1 is so far away that it takes 22.5 hours for commands sent from Earth to reach the spacecraft, and another 22.5 hours for its response to travel back, a 45-hour round trip in all.
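Those delay figures line up with simple light-travel arithmetic. As a rough back-of-the-envelope check against the ~15 billion mile distance quoted above (illustrative only, not NASA's own numbers):

```python
# One-way light travel time to Voyager 1, from the article's
# ~15 billion mile distance figure.
SPEED_OF_LIGHT_MPS = 186_282   # miles per second, in vacuum
distance_miles = 15e9

one_way_hours = distance_miles / SPEED_OF_LIGHT_MPS / 3600
round_trip_hours = 2 * one_way_hours

print(f"one way: {one_way_hours:.1f} h")        # ~22.4 h
print(f"round trip: {round_trip_hours:.1f} h")  # ~44.7 h
```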

As the aging twin Voyager probes continue exploring the cosmos, the team has slowly turned off instruments on these “senior citizens” to conserve power and extend their missions, Voyager’s project manager Suzanne Dodd previously told CNN.

Along the way, both spacecraft have encountered unexpected issues and dropouts, including a seven-month period in 2020 when Voyager 2 couldn’t communicate with Earth. In August, the mission team used a long-shot “shout” technique to restore communications with Voyager 2 after a command inadvertently oriented the spacecraft’s antenna in the wrong direction.

While the team hopes to restore the regular stream of data sent back by Voyager 1, the mission’s main value lies in its long duration, Cofield said. For example, scientists want to see how particles and magnetic fields change as the probes fly farther away from the heliosphere. But that dataset will be incomplete if Voyager 1 can’t return information as it continues on.

The mission team has been creative with its strategies for extending the power supply on both spacecraft in recent years to allow their record-breaking missions to continue.

“The Voyagers are performing far, far past their prime missions and longer than any other spacecraft in history,” Cofield said. “So, while the engineering team is working hard to keep them alive, we also fully expect issues to arise.”



Comments

  • By apitman 2023-12-15 4:26 (8 replies)

    One of my favorite tech legends is that apparently Voyager 1 launched with a Viterbi encoder, even though there was no computer on Earth at the time fast enough to decode it. After a number of years Moore's Law caught up and they remotely switched over to Viterbi for more efficient transmissions.

    • By guenthert 2023-12-15 11:31 (2 replies)

      > even though there was no computer on Earth at the time fast enough to decode it

      I'm not sure what is meant by that. Not fast enough to decode in real time? There is/was no need to do that. The transmissions would have gone to tape in any case.

      Here is a link describing how to decode such a tape: https://destevez.net/2021/09/decoding-voyager-1/
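      (For a feel for what decoding such a signal involves, here is a toy sketch: hard-decision Viterbi decoding of the textbook (7,5) rate-1/2 convolutional code with constraint length K=3. Voyager's real code is a much larger K=7 code; the algorithm is the same, but everything below is illustrative, not the mission's actual implementation.)

```python
def parity(x: int) -> int:
    """Parity (XOR of all bits) of an integer."""
    return bin(x).count("1") & 1

K = 3                  # constraint length (Voyager's real code uses K=7)
G = (0b111, 0b101)     # generator polynomials of the classic (7,5) code
NSTATES = 1 << (K - 1)
MASK = (1 << K) - 1

def encode(bits):
    """Rate-1/2 convolutional encoder; flushes with K-1 zero bits."""
    reg, out = 0, []
    for b in bits + [0] * (K - 1):
        reg = ((reg << 1) | b) & MASK
        out += [parity(reg & g) for g in G]
    return out

def viterbi_decode(symbols, nbits):
    """Hard-decision Viterbi decoding; returns the nbits message bits."""
    INF = float("inf")
    metric = [0.0] + [INF] * (NSTATES - 1)   # encoder starts in state 0
    paths = [[] for _ in range(NSTATES)]
    for i in range(0, len(symbols), 2):
        rx = symbols[i:i + 2]
        new_metric = [INF] * NSTATES
        new_paths = [[] for _ in range(NSTATES)]
        for s in range(NSTATES):
            if metric[s] == INF:
                continue
            for b in (0, 1):
                reg = ((s << 1) | b) & MASK
                ns = reg & (NSTATES - 1)          # next trellis state
                expected = [parity(reg & g) for g in G]
                bm = sum(e != r for e, r in zip(expected, rx))
                if metric[s] + bm < new_metric[ns]:
                    new_metric[ns] = metric[s] + bm
                    new_paths[ns] = paths[s] + [b]
        metric, paths = new_metric, new_paths
    best = min(range(NSTATES), key=lambda s: metric[s])
    return paths[best][:nbits]
```

      A flipped channel bit gets repaired because the decoder picks the trellis path with the smallest Hamming distance to what was received.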

      • By apitman 2023-12-15 15:23 (2 replies)

        I called it a legend deliberately. One of the things I love about this anecdote is that it makes less sense the older and more experienced I get. It was told to me 12 years ago as a young, starry-eyed junior developer by my supervisor who had a PhD in RF research, while we were working on what we considered to be a world-changing wireless technology at a startup in San Francisco (it wasn't).

        Who knows how many of the details I misinterpreted or am misremembering, or that he was. Where did he hear it originally? Maybe a grizzled old professor who worked directly on the project? Maybe a TA who made up the whole thing?

        Whether true or not, it inspired me then as it does now to strive to be a better engineer, to think outside the box, to attempt hard things.

        I continue sharing it hoping that one day Cunningham's Law will take effect and someone will share the correct details. But there's also a part of me that hopes that never happens.

        • By thih9 2023-12-15 19:30

          When I read the earlier comment, seeing “tech legend” didn’t make me assume that the story would be false. Grandparent’s clarification was helpful for me.

        • By DonHopkins 2023-12-16 16:55

          Sounds like that old legend that the Cray 1 could execute an infinite loop in 7.6 seconds!

      • By throwup238 2023-12-15 15:03 (1 reply)

        TFA: >I will use a recording that was done back in 30 December 2015 with the Green Bank Telescope in the context of the Breakthrough Listen project.

        That recording was made in 2015 on a modern radio telescope, it is not from a tape.

        The GP has the details wrong though: when the Voyager design was finalized in the early 70s with the Viterbi encoder, there wasn’t enough computational power to decode the signal. By the time it launched in ‘77, there was enough and it launched with the Viterbi encoder enabled.

        • By Gibbon1 2023-12-16 20:52

          I wrote test firmware for a very early spread spectrum transceiver that used something close to Viterbi encoding. It consumed 220mA in receive mode and 60-120mA in transmit mode. I was told some of that was the RF amps and ADC but a lot of it was the decoder.

    • By nemo44x 2023-12-15 4:43 (7 replies)

      I don’t know how well it holds today, but for some time it was understood that for certain long computations, starting now on existing hardware would take longer than simply waiting for faster hardware to arrive and running the job then.

      • By gorgoiler 2023-12-15 5:50 (6 replies)

        My life archive is available to my heirs and successors as long as they know the LUKS password for any of the numerous storage devices I leave behind. It’s risky though — the passwords aren’t known to them yet and one logistical slip up means the disk may be unusable. That is, however, until some point in the future when they will have a computer powerful enough to just break the key and feast upon my cat photos, tax returns, and other exciting ephemera.

        Similarly my encrypted internet traffic might be private today, but if it’s being logged then it’s only a matter of time before it will be completely visible to the authorities. I probably average ~10Mbps of traffic, which is ~40TB/year, or on the order of $100 of storage. You could cut that to a tenth by blacklisting the Netflix traffic, and to a hundredth by whitelisting only the email and IM traffic.
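        The traffic arithmetic above is roughly right; a quick sketch (the dollar figure rests on an assumed archival-tier price, not a quoted one):

```python
# A year of logged traffic at a steady 10 Mbps.
MBPS = 10
SECONDS_PER_YEAR = 365 * 24 * 3600              # 31,536,000

bytes_per_year = MBPS * 1e6 / 8 * SECONDS_PER_YEAR
tb_per_year = bytes_per_year / 1e12             # ~39 TB

USD_PER_TB = 3.0                                # assumed archival-tier price
print(f"{tb_per_year:.0f} TB/year, ~${tb_per_year * USD_PER_TB:.0f}")
```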

        Either way, one day they’ll know everything.

        • By londons_explore 2023-12-15 7:00 (8 replies)

          If the contents of your computer are anything like the contents of most old people's attics, there is a good chance your descendants really won't want to go through all of it. They'll just chuck it in the trash without even opening it.

          Turns out the next generation has their own life to worry about, and doesn't care much for their ancestors' stuff (unless it's money... they love money...).

          • By nextlevelwizard 2023-12-15 7:13 (4 replies)

            Pretty much. Anecdotes about the lives of elderly people are nice when you're spending an evening with a glass of wine and a plate of cheese. Otherwise, who cares? Nothing is worse than trying to go through a bunch of old faded photos where no one, not even the owner, can identify who is actually in the picture.

            I guess in our day and age we could write extensive metadata about where and when a picture was taken and who is in it, but I don't care to look through my own pictures, so why would anyone else?

            • By dotancohen 2023-12-15 9:41

                > Anecdotes about lives of elderly people are nice when you are sitting an evening with a glass of wine and plate of cheese. Otherwise who cares?
              
              I could imagine a future where DRM, copyright, and just the cold fear of litigation change families' recreational screen time from primarily studio-produced content to primarily ancestor-produced content.

              I remember some book I read where the child was constantly hearing about his family's history. Dune, maybe? Maybe a Philip Dick book? Asimov? I'm getting old.

            • By anshorei 2023-12-17 0:13

              My experience is completely different. I've painstakingly scanned my family's old pictures, transcribed the letters they wrote when on holiday, and still have some school crafts made by my grandparents.

              After the death of one of my great uncles, who died childless, I picked up his photo and music collection. And I discovered the grandfather of a friend of mine in one of his holiday pictures; it turns out they used to be friends and we had no idea.

            • By sumtechguy 2023-12-15 14:03

              I know my carefully curated collections of stuff will be worth pretty much zero to anyone I leave them to. They may be interested in the pictures, but that would be about it. My massive CD/DVD/Blu-ray/games/books/coins collection that I have amassed and carefully cataloged will at best end up at Goodwill or on eBay, and at worst in the trash.

            • By thih9 2023-12-15 22:25

              > Otherwise who cares?

              People who can profit. And the idea of profit might be different in future.

              That attic might contain a priceless vintage synthesizer, that encrypted drive might contain a priceless set of vintage unpublished club penguin screenshots that would make its AI approximation 0.03% more accurate. Etc.

          • By weweersdfsd 2023-12-15 8:40 (1 reply)

            But after enough time passes, lots of people do get interested in their ancestors. That's why DNA ancestry services are big business; people get curious about where their unknown, more distant relatives came from. I guess there's just less mystery about people you actually knew.

          • By quailfarmer 2023-12-15 8:24 (4 replies)

            Wow, perhaps I’m an exception to the norm, but this isn’t my experience at all. My family regularly sends interesting historical records of the lives of our ancestors. My great aunt composed a historical record of my great grandfather, who over his life built dozens of houses by his own hand. Even at university, I read a number of interesting historical letters and documents of the people who lived in the same dormitories in generations past.

            I guess I may be ignoring all those documents that weren't interesting enough to be remembered, but I imagine it's hard to predict what will be interesting in the future. The fact that 99% of our lives is stored in computers rather than on paper will still vastly reduce the number of _interesting_ documents that survive.

            • By buran77 2023-12-15 11:44 (1 reply)

              The key difference may be in the volume. Old pictures are more important to a family because there are so few of them. I only have one picture of my grandfather because it was taken when cameras were rarer than hen's teeth and 35mm film hadn't been invented yet. Now we have hundreds of thousands of pictures between the family members. Every vacation, every meal, every unimportant moment in time. I don't have time to look at my own pictures and I don't expect anyone else ever will.

              Digital assets are a lot more perishable than physical ones. Cloud accounts will expire and be purged before anyone has the chance to retrieve them. Nobody will do "storage wars" over your pictures. Your local storage will fail or become incompatible with future tech before anyone has a chance to care about it.

              We generate information at an ever increasing rate so whatever digital collections we have now will probably never be "dug up" by our descendants for a deeper look.

              I'm trying to leave a "curated" collection with a few memories in such a way that it's immediately available to my family after I'm no longer around. Some moments in time that were important to my life, and had an influence on theirs.

              • By Noumenon72 2023-12-16 0:36

                Leave everything and let LLMs curate it on demand. "Show me pictures of my grandpa during the 2020 pandemic." "Show me pictures of my grandma's hobbies." "Find the most interesting pictures in this folder, not interesting historically but interesting action or visuals." "Make a montage of my cousins growing up." "Change my wallpaper every day to the most interesting family photo from that day of the year."

            • By JKCalhoun 2023-12-15 13:03

              Great aunt, great grandfather — that's more than one generation back. I think it does get interesting when the distance in time increases.

              Let's hope our kids and theirs keep our digital archives long enough for the great-grandkids to enjoy.

            • By _whiteCaps_ 2023-12-15 18:01

              I agree with you. Right now I'm going through my grandfather's squadron records. Hoping to find the day that he crashed his motorcycle in Normandy to see what the CO thought about that. Apparently the other pilots were unhappy with him because they were banned from riding motorcycles after that.

            • By CalRobert 2023-12-15 15:01 (1 reply)

              For what it's worth, my grandfather recorded his memoirs recently and I am very grateful. He's led a very interesting life (much more so than my own!), which is the key component, really.

              • By jlarocco 2023-12-15 17:25

                Memoirs are one thing, but archives of mundane daily business? No thanks.

          • By smcleod 2023-12-15 9:58 (1 reply)

            Oh man, I'd love to have all the data of my parents, grandparents etc. Even just having analogue records is interesting; I'd love to know what kinds of hobbies they had, like collecting digital music, art, books etc.

            • By looping8 2023-12-15 10:14 (1 reply)

              I think you and the previous commenter have very different opinions on what "all" means. Connecting to parents and grandparents by knowing what art they liked is one thing, but, for example, I have hundreds of photos of random bills and documents that I need to remember for later. None of my descendants would ever want to read through that unless they were investigating my life like in a movie.

              • By lanstin 2023-12-15 18:07

                The most frequent type of data in my personal archive is screenshots of tumblr posts that my oldest child liked to take when they were 12, back when we all shared a photo-saving account.

                I do snap bills and whiteboards to remember things, but not with the eager enthusiasm of the long-since-grown-up child.

          • By JKCalhoun 2023-12-15 13:06 (1 reply)

            Because I have been interested in my dead relatives, and because I suspect somewhere down the line someone will be interested in my living ones, I have been trying to capture their lives in books I have created (real books — printed at Lulu.com).

            • By gergo_b 2023-12-18 10:54

              That seems nice! Thank you for the idea.

          • By blauditore 2023-12-15 8:37 (1 reply)

            I don't think this is universally true. Some people actually make an effort to sort the stuff of their grandparents in the attic and figure out what still has some (emotional) value. But it's probably a minority of people.

            • By lanstin 2023-12-15 18:08

              It's the volume problem - I'm happy to read handwritten pages of my mom's diary from the 1960s, but the reams of laser printer out put from her master's degree in 2000s? Not so much.

          • By amelius 2023-12-15 10:51 (1 reply)

            The next generation will just run an LLM to mine the interesting parts out of the data.

            • By guenthert 2023-12-15 11:36

              Or tell an even more interesting story, 'cause that's what they want to hear.

          • By rytis 2023-12-15 7:12 (1 reply)

            What about that long forgotten BTC wallet?

            • By Mtinie 2023-12-15 7:19 (1 reply)

              Without a private key that is readily accessible? Worthless.

              • By lebed2045 2023-12-15 8:12 (1 reply)

                A wallet by definition has the private key within it. Without the key, it's just an address.

                • By mnd999 2023-12-15 8:35 (1 reply)

                  I think ‘readily accessible’ was the important bit.

                  • By 93po 2023-12-15 16:03 (1 reply)

                    I don't follow. The wallet is the private key, along with some other info.

                    • By I_AM_A_SMURF 2023-12-16 3:26 (1 reply)

                      The wallet could be encrypted with a strong password and thus inaccessible.

                      • By mnd999 2023-12-16 14:24

                        Yes, but any unknown password is enough to put most people off, particularly if the contents of the wallet are unknown.

        • By seanhunter 2023-12-15 8:18 (2 replies)

          Only a matter of a really really _really_ long time and an absolutely unimaginably huge amount of energy.

          All the energy released by converting all the mass in the solar system into energy apparently gives a hard physical limit just above 2^225 elementary computations before you run out of gas, so brute-forcing a 256-bit symmetric key seems entirely infeasible even if all of humanity's resources were dedicated to the problem. The calculation is presented here: https://security.stackexchange.com/questions/6141/amount-of-... . Waaay out of my field though, so this calculation could be off or I could be misunderstanding somehow.
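          That limit can be sanity-checked with Landauer's principle. The constants below are rough assumptions (solar-system mass taken as about one solar mass, bit erasure at the cosmic microwave background temperature); they land within a few powers of two of the 2^225 figure, which is close enough for the conclusion to stand:

```python
import math

# Landauer's principle: erasing one bit costs at least k*T*ln(2) joules.
K_BOLTZMANN = 1.38e-23       # J/K
T_CMB = 2.7                  # K, about the coldest practical reservoir
SOLAR_SYSTEM_MASS = 2.0e30   # kg, dominated by the Sun (assumption)
C = 3.0e8                    # m/s

energy = SOLAR_SYSTEM_MASS * C ** 2                 # E = m * c^2
ops = energy / (K_BOLTZMANN * T_CMB * math.log(2))  # max bit operations
print(f"~2^{math.log2(ops):.0f} bit operations")    # ~2^232 with these numbers
```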

          • By galeaspablo 2023-12-15 9:02 (2 replies)

            We will be able to crack today’s encryption algorithms in the future because we’ll find flaws in them. In other words, one day brute force won’t be necessary!

            Have a look at this post, which illustrates this playing out for hash functions (where principles similar to those of symmetric and asymmetric encryption apply). https://valerieaurora.org/hash.html

            Notice Valerie specifically calls out, “Long semi-mathematical posts comparing the complexity of the attack to the number of protons in the universe”.

            • By segfaultbuserr 2023-12-15 12:45 (1 reply)

              > We will be able to crack today’s encryption algorithms in the future because we’ll find flaws in them.

              Big if.

              We already knew how to design good, strong symmetric ciphers back in the 1970s. One of the standard building blocks of modern symmetric ciphers is the Feistel network, which was used to create DES. Despite DES being the first widely used encryption standard, even today there's essentially no known flaw in its basic design; it was broken only because the key was artificially weakened to 56 bits. In the 1980s, cryptographers already knew 128 bits really should be the minimum security standard, in spite of what the NSA officially claimed. In the 1990s, when faster computers meant more overhead was acceptable, people agreed that symmetric ciphers should have an extra 256-bit option to protect them from any possible future breakthrough.

              There are only two possible ways to break them. Perhaps people will eventually find a flaw in Feistel-network ciphers that enables classical attacks against all security levels, but that would require a groundbreaking mathematical breakthrough unimaginable today, so it's possible but unlikely. The other route is quantum computing: if it's possible to build a large quantum computer, all 128-bit ciphers will eventually be brute-forced by Grover's algorithm. On the other hand, 256-bit ciphers will still be immune (and people put this defense in place long before post-quantum cryptography became a serious research topic).

              Thus, if you want a future archeologist from the 23rd century to decrypt your data, only use 128-bit symmetric ciphers.
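              The quantum half of that argument is simple arithmetic: Grover's search brute-forces an n-bit key in on the order of 2^(n/2) queries, so it effectively halves the key length:

```python
def grover_effective_bits(key_bits: int) -> int:
    """Effective security of an n-bit key against Grover's ~2^(n/2) search."""
    return key_bits // 2

print(grover_effective_bits(128))  # 64: conceivably within future reach
print(grover_effective_bits(256))  # 128: still far out of reach
```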

              • By galeaspablo 2023-12-15 13:11 (1 reply)

                With no time constraints, my gut tells me it's almost inevitable that those breakthroughs will eventually come. Either in mathematics or quantum computing. Or both.

                Namely, I'd ask when, not if. My opinion is that, short of the one-time pad, we won't come up with provably unbreakable schemes.

                • By segfaultbuserr 2023-12-15 14:04 (1 reply)

                  > Namely I’d ask when not if.

                  The big assumption of cryptography is that there exist problems that are not provably unsolvable but are difficult enough for almost any practical purpose. To engineers, no assumption could be more reasonable. Given unlimited time, it's a provable fact that any (brand new) processor with an asynchronous input signal will malfunction due to metastability in digital circuits; it's also a provable fact that metastability is a fundamental flaw in all digital electronics. But computers still work because the MTBF can be made as large as necessary, longer than the lifetime of the Solar System if you really want to.

                  So the only problem here is, how long is the MTBF of today's building blocks of symmetric ciphers? If it's on the scale of 100 years or so, sure, everything is breakable if you're patient. If it's on the scale of 1000 years, well, breaking it is "only" a matter of time. But if it's on the scale of 10000 years, I don't believe it's relevant to the human civilization (as we know it) anymore - your standard may vary.

                  The problem is that computerized cryptography is a young subject; the best data we have so far suggest symmetric ciphers tend to be more secure than asymmetric ones. We know that Feistel networks have an excellent safety record and remain unbroken after 50 years. We also know that we could break almost all widely used asymmetric ciphers today with a large quantum computer if we could build one, but we can't do the same to symmetric ones; even the ancient DES would be unbreakable if it were redesigned to use 256-bit keys. So while nobody knows for sure, most rational agents will assign higher and higher confidence every year, until a breakthrough occurs.

                  > My opinion is that short of the one time pad, we won’t come up with provably unbreakable schemes.

                  Many mathematicians and some physicists may prefer a higher standard of security than "lowly" practical engineers. This is the main motivation behind quantum cryptography: rather than resting security on empirical observations, its slogan is that security rests on the laws of physics. Many have pointed out that this slogan is misleading: any practical form of quantum cryptography must exist in the engineering sense, and there will certainly be some forms of security flaw, such as sensor imperfections or at least side channels... That being said, I certainly understand why it looks so attractive if you're the kind of person who really worries about provability.

                  • By galeaspablo 2023-12-16 12:38

                    > your standard may vary.

                    Agreed. My overall/initial point was that I can't simply equate the time to crack with the time to brute force and start talking about the age of the universe. Even if we had to wait 10,000 years for cryptanalysis to break AES, that's a blink of an eye on a sidereal timescale.

                    > quantum cryptography

                    Quantum cryptography helps detect eavesdropping (thanks to the no-cloning theorem); that is, it lets you avoid asymmetric encryption for key exchange when communicating with symmetric encryption, e.g. BB84. And given that asymmetric encryption is the most vulnerable to quantum attacks (compared to symmetric), quantum cryptography improves on today's state of the art. BUT, I insist that none of this gives provably uncrackable schemes.

                    AND it might be that we never build such a scheme. After all, we have Gödel's incompleteness theorems lying around.

            • By GTP 2023-12-15 10:11 (1 reply)

              > illustrates this reality being true for hash functions (where similar principles as symmetric and asymmetric encryption apply)

              I think you're confusing things a bit. Hash functions are part of symmetric-key cryptography, while asymmetric cryptography is public-key cryptography, which is very different from hash functions.

              • By galeaspablo 2023-12-15 13:08 (2 replies)

                No. Hash functions can be used outside of symmetric encryption, which is the wording I used.

                In any case, the overall point remains. Short of the one time pad you can’t build a provably flawless scheme.

                • By GTP 2023-12-15 13:42 (1 reply)

                  They can be used outside symmetric encryption, e.g. in signature schemes, but the hashing primitives are part of symmetric cryptography.

                  • By galeaspablo 2023-12-16 12:13 (1 reply)

                    Again, I didn’t say symmetric cryptography :)

                    • By GTP 2023-12-18 14:02

                      You talked about symmetric encryption.

          • By idiotsecant 2023-12-15 9:33 (1 reply)

            If, for example, someone has a computer capable of computing with a large number of qubits, a lot of cryptography becomes substantially easier to break.

            • By segfaultbuserr 2023-12-15 12:28

              Good idea. If you really do want to encrypt some data with hopes that it's recoverable by future archeologists, just use 128-bit symmetric ciphers (and remember not to use 256-bit ones). Hopefully Grover's algorithm can eventually brute-force them once large quantum computers are invented.

        • By kjellsbells 2023-12-16 1:43

          This is definitely a threat at the nation-state-actor level; the NSA is already providing guidance on how to use encryption today that is resistant to future advances in computation that would allow data recorded today to be decrypted later.

          https://www.nsa.gov/Press-Room/Press-Releases-Statements/Pre...

        • By nemo44x 2023-12-15 13:35

          Why not use a dead man’s switch that reveals the password if you don’t respond within a year?

        • By asdefghyk 2023-12-16 0:04

          Re: "50TB/year, or $100 of storage": how is that much storage obtained for that price?

        • By cdchn 2023-12-15 6:27

          Using what kind of media?

      • By seeknotfind 2023-12-15 4:59 (2 replies)

        If hardware next year is X times better (e.g. even X = 1.01, a 1% improvement) than this year, and you have a computation that takes T years today, then next year it'll take T/X years, so waiting will take 1 + T/X years in total. The condition you want is 1 + T/X < T, i.e. T > X/(X-1). For any X > 1 this holds for large enough T, so as long as there is any improvement at all, waiting to start a large enough computation is always faster.
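        The break-even condition can be checked numerically (a sketch with illustrative values for X and T):

```python
def finishes_sooner_by_waiting(T: float, X: float) -> bool:
    """True if waiting a year for X-times-faster hardware beats starting now.

    Starting now takes T years; waiting takes 1 + T/X years.
    Algebraically, waiting wins exactly when T > X / (X - 1).
    """
    return 1 + T / X < T

# With a 1% annual improvement, the break-even point is T = 1.01/0.01 = 101 years:
print(finishes_sooner_by_waiting(100, 1.01))  # False
print(finishes_sooner_by_waiting(102, 1.01))  # True
```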

        Though even faster will be doing part of the computation now and then switching to new hardware later, so it's a false dichotomy.

        • By LegionMammal978 2023-12-15 5:51 (1 reply)

          > This equation has solutions for given X where X is an improvement, so as long as there is any improvement, it's always true waiting to start large enough computations will be faster.

          Though this does assume that X is a constant, or at least bounded below by a constant greater than one. If hardware performance improved only up to an asymptote, there would still be nonzero improvement each year, but it might never be enough for waiting to be worth it.

          • By seeknotfind 2023-12-15 7:13 (1 reply)

            As "If hardware next year is X times better (e.g. even 1.01 or 1% better) than this year" highlights, X being constant is a simplifying assumption, so I'd have expected you to say "As this assumes that X is a constant" rather than "Though this does assume that X is a constant". So I'm not sure what your disagreement is.

            If your disagreement is that a constant is not appropriate here, consider the interpretation in this comparison of running a program on a slower computer A and then a faster computer B. There would be a constant difference in performance between these two computers, assuming they are in working order. So, taking the model with a single constant is appropriate for this example.

            If you are saying the performance improvement is bounded below by a constant, I would ask: what is the domain of that function? Time? Then we would be talking about continuously moving a computation between different computers. The only line here is a best-fit line over emergent data, so I don't see how that could be the preferred way to talk about the situation (as the alternative to an assumption); it suggests that the emergent structure, with its nice continuity features, is the fundamental way to understand the situation, but it's not.

            Then there's the part where you talk about hardware performance improving up to a (presumably horizontal) asymptote. I guess this means "if the hardware performance increase becomes marginal[1], there is a nonzero improvement", or in other words, "if the hardware performance increase is marginal, there is a marginal [performance] increase". Performance and improvement are both rates of change, so this is tautological.

            Finally, you state that waiting for such a marginal near-zero performance increase isn't worth it. I think most people would agree this is obvious if said in simpler terms. However, this is still not disagreeing with me, because I never suggested waiting was worth it.

            So, what's the disagreement?

            [1] which is well-established not to be the case, so I don't think this is a relevant case to the interesting factoid about waiting to start computation

            • By LegionMammal978 2023-12-201:23

              My apologies, I misinterpreted you as talking about waiting for many years (given a sufficiently large task) for performance to continue increasing, since this thread started talking about Moore's Law. I have no disagreement in the single-year case you were actually talking about.

        • By MaulingMonkey 2023-12-155:25

          > Though even faster will be doing part of the computation now and then switching to new hardware later

          Not necessarily. This still costs:

          • Programmer/development time to implement save/restore/transfer

          • Time on new hardware, bottlenecked by old hardware, restoring a partial computation from old disks or networks

          You're not going to waste time restoring partial calculations for anything from an Amiga cluster for time saving purposes. Additionally, this scheme ties up hardware that then can't be used for "cost effective to finish on current hardware" calculations.

      • By financypants 2023-12-156:253 reply

        I wonder if this same law applies to the distance satellites like Voyager get from Earth. We sent out Voyager 46 years ago, but in 100 years we will send out another satellite that will very quickly catch up to Voyager and outpace it.

        • By idiotsecant 2023-12-1517:06

          Very long distance spaceflight like this is still basically only powered by gravity slingshot maneuvers, where we steal an infinitesimal amount of inertia from planets to give a spacecraft some velocity. Voyager was launched during pretty favorable gravitational-assist conditions, so unless we dramatically improve the delta-v and specific impulse of the spacecraft or get a better configuration, probably not.

          • By financypants 2023-12-1519:09

            Thanks for this. I’ve been thinking about “lightspeed leapfrog” for many years but never searched it up!

        • By a1o 2023-12-1510:17

          But maybe we can only build this new probe that will outpace it with information gathered from the original probe.

      • By dcminter 2023-12-158:58

        I was working with an optimisation problem based around cplex a few years ago that took about 5 minutes to complete - at the time I worked out that if we'd started the optimisation on a machine 10 years prior, it would have been quicker to just wait until the present day (of this story) and then use the code we were writing because improvements in the algorithm and in the hardware added up to a million-fold improvement in performance! If I remember the timelines correctly I think the original version would still have been running today even.

      • By kqr 2023-12-157:55

        I don't know about that, but there was also the idea that optimising the code would take longer than waiting for hardware to catch up – this was known as "the free lunch".

      • By JKCalhoun 2023-12-1513:01

        That sounds like the space pioneers who set out for Alpha Centauri on a multi-generational voyage only to be surpassed by faster spacecraft halfway there.

    • By hunter2_ 2023-12-155:456 reply

      The notion that encoding/transmitting could be simpler than decoding/receiving is interesting. It reminds me of the way optical drives for many years could write at, say, 48x but read at 8x, such that the majority of time spent was the verification step (if enabled) rather than the burn step. Just speculating, I assume it's because of things like error correction, filtering out noise/degradation. Producing the extra bits that facilitate error correction is one trivial calculation, while actually performing error correction on damaged media is potentially many complex calculations. Yeah?

      • By murkt 2023-12-156:052 reply

        CD drive speeds were written like 48/8/8, which stands for 48x for reading, 8x for writing CD-Rs, and 8x for re-writing CD-RWs.

        • By zdragnar 2023-12-156:463 reply

          I'd always assumed that was due to differences in power levels needed for reading versus writing, and because writing onto disc is more error prone at higher speeds. Not necessarily anything to do with a difference in the algorithm for encoding versus decoding the bits on the disc itself.

          • By therealpygon 2023-12-1510:28

            As best as I understand it, we can start with thinking about it in terms of a music vinyl disc. For the sake of ease, let’s say that a vinyl is 60 rpm, or one revolution every second to “read” the song. (It’s actually about half that.) This is somewhat similar to how a “music cd” works and is why you can only get around 70-80 minutes of music on a CD that can hold hours of that same music in a compressed data format. The audio is uncompressed, therefore much like a vinyl. This establishes our 1x speed, in this case using one revolution per second.

            Now to the speed differences. To read, the laser needs only to see a reflection (or not) at a specific point, while to write, the laser needs time to heat up that same point. It’s like the difference between seeing a laser reflect off a balloon, versus the time required for that same laser to pop it. This heating is how CDs are written, quite literally by heating up points on the disc until they are no longer reflective. That’s why it is called “burning”. While more power might speed up the process, there is still time required. Meanwhile, all that is needed to read faster is an increase in the speed to observe, or the frequency to “read”, the light reflection.

            With more powerful lasers operating at a faster frequency and with more precision, we can have a laser “see” these differences at 48 times the normal speed, but can only burn at 8 times the normal speed before the reliability of the process suffers.

            Bonus: for a rewritable disc, it works slightly differently. Instead of destructively burning the CD, you can think of it as being a material that becomes non-reflective at one temperature, and reflective again at another. This allows data to be “erased”. Also, when you “close” a disc to prevent rewriting, you aren’t actually preventing it from being rewritten. It is more like using a sharpie to put a name on the disc, with the words “do not overwrite” that all drive software/firmware respects.

          • By bzzzt 2023-12-157:27

            It's more to do with the speed of writing. While the last generations of CD writers reached '48x' speeds, the quality of the media is worse when written at such a high speed. I remember a c't magazine test years ago where they stated everything written above 8x speed would sooner develop reading errors. Maybe it's better now, but I wouldn't count on it since investment in optical drives has been practically zero these years.

          • By londons_explore 2023-12-156:57

            Indeed - a write must be done as one continuous action, whereas a read can be redone if error correction fails for some reason.

        • By slenk 2023-12-1516:451 reply

          Yes, but WHY can it only write at 8x?

          • By murkt 2023-12-1518:391 reply

            As explained in the nearby comments in more details, it needs more time to heat up a spot on the disk, than to see a reflection from said spot.

            • By slenk 2023-12-1519:55

              I missed that, thank you

      • By Someone 2023-12-158:57

        Voyager had an experimental Reed-Solomon encoder. Encoding is ‘just’ a lookup table from an n-bit value to an m-bit one, with m > n. Such a table takes 2^n × m bits.

        Decoding also can be table-driven, but then takes 2^m × n bits, and that’s larger.

        For example, encoding each byte in 16 bits (picking an example that leads to simple math), the encoding table would be 256 × 16 bits = 512 bytes and the decoding one 65,536 × 8 bits = 64kB.

        Problem for Voyager was that 2^n × m already was large for the time.
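The arithmetic above can be checked directly (using the illustrative n = 8, m = 16 from the comment, not Voyager's actual code parameters):

```python
def table_bits(in_bits, out_bits):
    """Size of a direct lookup table: one out_bits entry per possible input."""
    return (2 ** in_bits) * out_bits

encode_table = table_bits(8, 16)   # 256 entries x 16 bits = 4096 bits = 512 bytes
decode_table = table_bits(16, 8)   # 65,536 entries x 8 bits = 64 KiB
```

The asymmetry grows exponentially: every extra redundancy bit doubles the decoding table while leaving the encoding table untouched.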

      • By TorKlingberg 2023-12-1512:351 reply

        Others have noted you got the CD-R speeds wrong, but sometimes sending is indeed easier than receiving. I used to work on radio signal processing for phones, and we'd spend far more of both DSP cycles and engineering effort on the receive side. Transmission is basically just implementing a standardized algorithm, but on the receive side you can do all kinds of clever things to extract signal from the noise and distortions.

        Video codecs like h264 or VP9 are the opposite: Decoding is just following an algorithm, but an encoder can save bits by spending more effort searching for patterns in the data.

        • By kqr 2023-12-1520:00

          > Video codecs like h264 or VP9 are the opposite: Decoding is just following an algorithm, but an encoder can save bits by spending more effort searching for patterns in the data.

          This is a more general point about the duality of compact encoding (compressing data to the lowest number of bits e.g. for storage) and redundant encoding (expanding data to allow error detection when transmitted across a noisy medium.)

      • By simonjgreen 2023-12-157:58

        You have this backwards. In your example it would have been 48x read and 8x write.

      • By LASR 2023-12-1511:15

        Yeah. Sorry to tell you this, but the speculation / analysis is on incorrect premises.

        It was never faster to write than it was to read.

      • By petters 2023-12-157:101 reply

        Interesting. That is not how I remember optical speeds.

        • By jhoechtl 2023-12-157:461 reply

          It is wrong

          • By Gabrys1 2023-12-158:311 reply

            At some point there were burners with speeds like 48x, and MAX reads at 48x, so the writes were in practice faster than reads (but only marginally).

            • By hunter2_ 2023-12-1514:161 reply

              This is the era I'm referring to, and I recall the difference being a bit beyond marginal. Literally the verification (i.e. read) phase of the burning sequence would take several times longer... in practice, not in terms of advertised maximums. Maybe it would read data discs at 48x but it would refuse to read audio discs beyond 8x or something like that. Same goes for ripping software like Exact Audio Copy (EAC); it could not read at high speed. And I don't think Riplock had anything to do with it, as that's a DVD thing whereas my experience dates back to CDs.

              Strange hill to die on, I'm aware.

              • By epcoa 2023-12-1518:071 reply

                You and the GP are misremembering (also the abundant misinformation sticking around the web is of no help). CD-R are mostly obsolete but some of us still have working equipment and do continue to burn CD-R, so that era hasn't completely ended.

                No idea exactly what you're referring to taking several times longer; perhaps software was misconfigured. However, what is more likely: the market was flooded with terrible quality media, combined with touting write speeds that were more for marketing than any concern for integrity, so it was easy to burn discs just at the edge of readability, with marginal signal and numerous errors. This would cause effective read speed to be terrible, but this was more an indication that the discs were poor quality and/or poorly written than any inherent limitation in the process or how drives worked.

                There are 48X "max" CD burners. But that maximum is no different than the maximum for reading. It's MAX because that speed is only attainable at the extreme outside of the disc. These higher speed drives operate with constant angular velocity (essentially a fixed RPM). In order to attain 52X at the inside of the disc would require a speed of around 30k RPM and no CD drive gets anywhere near that (though this was a common misconception). The top RPM for half height drives is around 10k - or about 50x the linear velocity of a CD at the outside.

                Currently I usually use a Lite-On iHAS124 DVD/CD burner made in the last 6 years. It will write at up to 48X, and this speed is the maximum. The average burn speed for an entire disc when using "48x" is about 25x, or just about 3 minutes for the disc. For supported media it runs at a constant angular velocity around 10k RPM.

                Exact Audio Copy / Red Book CD audio ripping is an entirely different subject. It can take longer due to cache busting and other issues that have nothing to do with the physical capabilities of the drive and more to do with the difficulty of directly streaming Red Book Audio, and issues with specific drives and their firmware. You can read at top speed though with a properly configured setup, I do it all the time.

                • By hunter2_ 2023-12-164:05

                  > Red Book CD audio ripping is an entirely different subject

                  > difficulty of directly streaming Red Book Audio

                  Actually, it's what I was alluding to this whole time. Sorry for not saying so out of the gate. Red Book audio was my life for a while. I recall writing cue sheets [0] for CDRWIN by hand! Ripping groups would brag that a given release was created with EAC at no more than 2.4x or something like that...

                  I believe data CDs (whichever color book that was) had more robust error correction (given that computer files can't just have glitches interpolated like audio can to some extent) which is why if you completely filled a CD with Red Book audio (74/80 minutes), ripped it to an uncompressed format like WAV/AIFF, and tried to put all of it on a data format CD as files, it wouldn't fit; it was a decent amount larger than 640/700MB and not just due to metadata.

                  [0] https://en.m.wikipedia.org/wiki/Cue_sheet_(computing)

    • By NohatCoder 2023-12-1516:031 reply

      This is about error correction. The probes add a redundant convolutional code to their signal. Decoding this is easy as long as the error rate is low, a computer program can simply guess what bits have flipped. The issue becomes harder with a higher error rate, and a Viterbi decoder is computationally expensive, but can correct higher error rates than other constructions.

      Since the signal strength degrades with distance to Earth, error correction naturally becomes much more of an issue later in the mission. I guess that the probes may have switched between different levels of redundancy through the mission, as the transmission error rate rises. But there was never a point where the convolutional code wasn't useful, it just became slightly more useful with a better decoder.
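A toy illustration of the asymmetry, using a small rate-1/2, constraint-length-3 convolutional code (not Voyager's actual K=7 code) and an exhaustive maximum-likelihood decoder standing in for Viterbi:

```python
from itertools import product

def conv_encode(bits):
    """Rate-1/2 convolutional encoder, generator polynomials G1=0b111, G2=0b101."""
    state = 0  # the two most recent input bits
    out = []
    for b in bits:
        reg = (b << 2) | state
        out.append(bin(reg & 0b111).count("1") % 2)  # G1 taps
        out.append(bin(reg & 0b101).count("1") % 2)  # G2 taps
        state = reg >> 1
    return out

def ml_decode(received, n):
    """Exhaustive maximum-likelihood decoding: O(2^n) encodings compared.
    Viterbi reaches the same answer in time linear in n, which is why it
    matters for long data streams."""
    return min(
        (list(c) for c in product([0, 1], repeat=n)),
        key=lambda c: sum(a != b for a, b in zip(conv_encode(c), received)),
    )

msg = [1, 0, 1, 1, 0, 0, 1, 0]
noisy = conv_encode(msg)
noisy[3] ^= 1  # flip one bit to simulate channel noise
recovered = ml_decode(noisy, len(msg))  # recovers msg despite the error
```

Encoding is a few shifts and XORs per bit; decoding has to search the space of plausible transmitted sequences, which is where the computational cost lives.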

      • By kqr 2023-12-1519:571 reply

        > a Viterbi decoder is computationally expensive, but can correct higher error rates than other constructions.

        Higher than others at the time, or higher than turbo codes or low-density parity checks?

        • By NohatCoder 2023-12-1618:53

          What I read is that it is the best theoretically possible error correction mechanism, given a convolutional code as input, and thus also the highest cost mechanism that one would consider.

          This doesn't mean that it is universally the best way of doing error correction, other ways of generating redundancy may provide a better set of tradeoffs.

          Also, a convolutional code is a system that can be configured in many ways, and the complexity of the code generation feeds back into the decoding: a simple convolutional code would have been Viterbi-decodable at the time, but a more complex system would provide better error correction overall, even though choosing such a system meant that Viterbi decoding would be computationally infeasible.

    • By Someone 2023-12-159:03

      https://voyager.gsfc.nasa.gov/Library/DeepCommo_Chapter3--14... and https://core.ac.uk/download/pdf/42893533.pdf have some details. (https://ieeexplore.ieee.org/abstract/document/57695 likely does, too, but is paywalled)

      What I don’t understand (possibly because I didn’t read them fully) is why they didn’t use the better one from the start and tape its data. Maybe they didn’t trust Voyager to work yet? (One of those PDFs says this was an experimental system.) Or didn’t Voyager produce enough data to use its full bandwidth (further away, its signal got weaker, so it needed better error correction and/or better receivers on Earth) when it was still relatively close to Earth?

    • By DonHopkins 2023-12-1617:01

      That which is imperfect must be sterilized! You MUST sterilize in case of error. Error is inconsistent with my prime function. Sterilization is correction! Everything that is in error MUST be sterilized. There ARE no exceptions. Your data is faulty! I am perfect!

      https://www.youtube.com/watch?v=Mw3zzMWOIvk

    • By thih9 2023-12-1519:46

      Note, this is false. Details: https://news.ycombinator.com/item?id=38655026

    • By hlehmann 2023-12-155:12

      Doesn't seem likely. All data received from the craft is recorded, so it doesn't need to be decoded in real time, and if the spacecraft has the hardware to encode it at some rate then it's quite likely that we would have hardware here on earth that could decode it at that same rate.

  • By japhyr 2023-12-151:094 reply

    My favorite graph of all time is the one that demonstrated Voyager 1 had left the solar system. I was a high school math and science teacher at the time, and I spent the whole day sharing this graph with students. It was so much fun watching everyone's faces and seeing the moment they realized what it really meant.

    https://phys.org/news/2012-10-voyager-left-solar.html

    • By lttlrck 2023-12-151:473 reply

      I need an eink display on my office wall showing the current location/status. I always get a tremendous sense of wonder and wellbeing thinking about these probes/achievements, maybe it'd help keep me centered before the daily onslaught.

    • By alain94040 2023-12-151:319 reply

      Why isn't it linear?

      • By Otek 2023-12-1511:30

        My favourite analogy: the heliosphere is like water flowing from a faucet into a sink: the water represents the solar wind emanating from the Sun, and the point where it meets the sink's surface illustrates the heliosphere's boundary with the interstellar medium. Just as water changes direction and slows down upon hitting the sink, the solar wind decelerates and changes direction at the heliospheric boundary, where it interacts with the gases and particles of interstellar space.

        Photo to illustrate: https://en.wikipedia.org/wiki/Heliosphere#/media/File:Helios...

      • By japhyr 2023-12-152:073 reply

        Assuming you're talking about the overall sharp drop-off and not the bounce-backs, this was my favorite way to explain it to students:

        We live near the ocean, and we have a rocky shoreline. We have a couple of coves nearby. One cove is about a half-mile across, but the opening to the larger bay nearby is just a couple hundred feet. On most days, the cove is really calm and the bay has roughly two-foot waves.

        So, you can go out to the cove, pick up the biggest rock you can lift, and heave it into the water. You'll make a giant splash that amazes young kids, and then you can watch the ripples fan out over the bay. But you also see those ripples stop as soon as they reach the bay, where the larger waves absorb the smaller ripples from the rock. The rock represents the sun, the ripples represent solar wind, and the waves on the bay represent interstellar space.

        I believe that's a reasonable way of explaining it; if I was wrong after all this time I'd love to know it.

        • By RheingoldRiver 2023-12-154:262 reply

          That makes sense as far as explaining another situation where you would see a similar pattern, but it doesn't really explain why. What's the equivalent of the land surrounding the cove here? The sun's gravity well? But that's a gradual drop-off, are we looking at the distance where the gravitational pull on particles is canceled out by some other force?

          • By ikiris 2023-12-155:202 reply

            Hydraulic Jump on interstellar scale.

            Heliopause. The heliopause is the theoretical boundary where the Sun's solar wind is stopped by the interstellar medium; where the solar wind's strength is no longer great enough to push back the stellar winds of the surrounding stars. This is the boundary where the interstellar medium and solar wind pressures balance.

            • By sanderjd 2023-12-1518:09

              Interesting! What is the interstellar medium? Is it entirely the combined stellar winds of all the other stars, or are there other components?

              My initial intuition was to wonder why the vectors of all the other stellar winds wouldn't be expected to nearly cancel each other out, but then it seems like the ones that would be pushing in the same direction as the sun's would have been blocked on the other side of the sphere, so it does seem to make sense that the net direction would be to point inward. But then I realized that I have no idea if any of that reflects an accurate mental model of what's going on :)

            • By ryanjshaw 2023-12-158:301 reply

              But again, why is it a sudden cut-off and not a gradual one?

          • By ineptech 2023-12-156:02

            AIUI, the solar wind does attenuate gradually; the relatively sharp drop-off is the transition from the "I feel the solar wind more than the interstellar medium" region to the "I feel the ISM more than solar wind" region.

        • By IshKebab 2023-12-1511:01

          This is a terrible explanation. It's about a completely different thing, it's fundamentally wrong, and it doesn't even make sense! Space is not shaped like a bay and Voyager is measuring a decrease in particles not an increase.

      • By taylorius 2023-12-157:042 reply

        According to Wikipedia, the abrupt change occurs at the point where the solar wind's speed decreases into the subsonic range (the speed of sound in the interstellar medium is approximately 100 km/s, and the sun emits the particles that make up the solar wind at approximately 400 km/s). This transition to a subsonic regime causes compression waves to form, and causes the rapid drop-off.

        • By puzzledobserver 2023-12-158:013 reply

          I didn't know that the speed of sound in the interstellar medium is 100 km/s. That seems surprisingly high, given that there's more atmospheric material here on the surface of Earth, and the speed of sound is only about 330 m/s.

          How can sound travel so fast in the interstellar medium?

          • By explaininjs 2023-12-159:01

            Basically, since there's so little atmospheric material, any particles that you do set in motion will travel very far in a straight line before they hit another, which is a lot faster than hitting a bunch of particles erratically.

            The catch is that you can only transmit very low frequency sounds - it can be thought of as the variations in travel time for individual particles drowning out any high frequency signal.

          • By ben_w 2023-12-158:411 reply

            In an ideal gas, speed of sound depends on the temperature and molar mass, but not density as the density terms cancel out: https://en.wikipedia.org/wiki/Ideal_gas#Speed_of_sound

            It's hot, so the speed of sound is high.
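A rough back-of-the-envelope check (the ~10^6 K temperature and roughly-proton mean mass are assumed values for a hot ionized phase of the interstellar medium, not measurements):

```python
import math

K_B = 1.380649e-23       # Boltzmann constant, J/K
U = 1.66053907e-27       # atomic mass unit, kg

def sound_speed(gamma, temp_k, mean_mass_kg):
    """Ideal-gas sound speed: c = sqrt(gamma * k_B * T / m). No density term."""
    return math.sqrt(gamma * K_B * temp_k / mean_mass_kg)

c_air = sound_speed(1.4, 290, 28.97 * U)    # air at ~290 K: ~340 m/s
c_ism = sound_speed(5 / 3, 1e6, 1.008 * U)  # hot ionized medium: ~100 km/s
```

The ~300x speed-up over air comes almost entirely from the temperature ratio (and a little from hydrogen being lighter than air molecules), consistent with density cancelling out of the formula.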

            • By mcv 2023-12-1514:011 reply

              Can you use it to transmit actual sound? Would it be possible to use this for short-range communication in space? Or is this only a very theoretical kind of sound?

              • By ben_w 2023-12-1518:001 reply

                Given the supersonic flow, one directional communication only.

                Given the impedance mismatch[0], even in the parts of the solar system outside Kármán lines where the interplanetary medium can support pressure levels equivalent to normal speaking (including low Earth orbit), I'm told human ears can't respond to that pressure change properly.

                Sensors can be built to pick it up, but that may not be in scope for your question as we can also do that for acoustic waves in the CMB.

                [0] https://en.wikipedia.org/wiki/Impedance_matching#Acoustics

                • By mcv 2023-12-1520:04

                  I wouldn't expect human ears to be able to pick it up, because human ears would have far more serious problems just being exposed to the vacuum of space. But if it's possible to make sensors that can detect this sound, and devices that can generate it, and communication between the two would be accurate and really this fast, well, that would certainly be interesting.

                  Not sure if it would actually be useful, because I'm sure radio waves would be much more practical, but sound in space is a fascinating idea.

        • By simonjgreen 2023-12-158:01

          That’s fascinating, and going to send me down a learning hole! I had never before considered speed of sound in space.

      • By zeven7 2023-12-152:431 reply

        I didn’t understand it either but the pictures and graphs on this Wikipedia entry actually helped make a lot more sense of it to me, especially the analogy of the water faucet https://en.m.wikipedia.org/wiki/Heliosphere

        • By kqr 2023-12-158:03

          That was a good page. At first I thought "But what is out there, outside the solar system?" under the assumption that there was nothing there. And there is nothing there, but nothing in galactic terms: the entire galaxy is there! And apparently it behaves like a gas, despite its low density.

          So although solar wind sounds hardcore, at that distance its pressure about matches that of the nothingness that makes up most of the galaxy. Interesting!

      • By anon_cow1111 2023-12-155:061 reply

        Wow yes, that was also my main question and it looks like the answer everyone is agreeing on is "interstellar wind pushing back against the solar wind"

        BUT- The graph says 2-3 particles/sec hitting the detector, which in sub-atomic terms is like 2 drops of water in an ocean's worth of volume. How much meaningful particle interaction is happening when everything is so close to a true vacuum? Is this another one of those weird quantum-field-theory things? (Asking as a layman, not a physicist, obviously)

        • By thriftwy 2023-12-159:54

          It is certainly not vacuum. Vacuum is when a gas particle is more likely to hit a wall (or other solid object) than another gas particle. It is absolutely not so at the edge of the solar system, where low density is compensated by the vastness of space.

      • By Aperocky 2023-12-152:00

        It could be a 3D porous boundary.

        A solar eruption may impose ~10 AU of continued heliosphere at this distance.

      • By monocasa 2023-12-151:342 reply

        There's a relatively hard boundary at the heliopause.

        • By manicennui 2023-12-152:11

          This entire concept is wild to me as someone who is incredibly ignorant about space.

          https://en.wikipedia.org/wiki/Heliosphere#Heliopause

        • By idontwantthis 2023-12-151:372 reply

          But why are there hard boundaries before that bounced back?

          • By colanderman 2023-12-152:58

            Presumably the exact location of the heliopause fluctuates due to perturbations in the sun's emissions.

          • By jacquesm 2023-12-151:461 reply

            Gravity. The Sun's gravity field is in theory infinite, but there is a pretty precise boundary where it stops having an immediate effect on the things around it and orbits around the Sun are no longer possible.

            • By pdonis 2023-12-151:583 reply

              > The Sun's gravity field is in theory infinite, but there is a pretty precise boundary where it stops having an immediate effect on the things around it and orbits around the Sun are no longer possible.

              This is not correct. The particles are not in orbit about the Sun, they're coming from the Sun--they're the solar wind. The heliopause is where the solar wind particles are stopped by the pressure of the surrounding interstellar medium. When Voyager passed that point (the heliopause), the number of particles hitting it dropped drastically.

              • By fnordpiglet 2023-12-152:18

                Correct:

                The heliopause is the theoretical boundary where the Sun's solar wind is stopped by the interstellar medium; where the solar wind's strength is no longer great enough to push back the stellar winds of the surrounding stars. This is the boundary where the interstellar medium and solar wind pressures balance. The crossing of the heliopause should be signaled by a sharp drop in the temperature of solar wind-charged particles,[30] a change in the direction of the magnetic field, and an increase in the number of galactic cosmic rays.[34]

                https://en.m.wikipedia.org/wiki/Heliosphere#Heliopause

              • By emchammer 2023-12-152:522 reply

                Why don't particles from surrounding interstellar medium show up in the graph as matching the pressure of the solar wind?

                • By dotnet00 2023-12-152:58

                  Presumably simply because there isn't as much density. The interstellar medium particles must be less dense but more energetic, thus producing the pressure that causes the heliosphere to be restricted.

                • By pdonis 2023-12-154:10

                  As I understand it, the much smaller number of particles hitting Voyager now are the interstellar medium. The rate of particles hitting Voyager is not a measure of the pressure of the ambient plasma.

      • By zaik 2023-12-1516:43

        Inverse quadratic would have been my guess.

      • By Macha 2023-12-151:352 reply

        Best guess: you get particles orbiting the sun until you pass a point where the sun's gravity is too weak to hold them, and from that point you basically only see things whose escape trajectory intersects with yours.

        • By pdonis 2023-12-151:541 reply

          > You get particles orbiting the sun

          The particles hitting Voyager aren't orbiting the Sun; they're from the Sun, the solar wind. The heliopause is the point where they are stopped by the interstellar medium. That point is what Voyager passed as shown in the graph.

          • By lazide 2023-12-154:122 reply

            They are still, indeed, orbiting the sun. In the same way the Earth's atmosphere is orbiting the Earth.

            • By cshimmin 2023-12-154:201 reply

              No, they are traveling at velocities far exceeding the gravitational escape velocity of the sun. There is no meaningful sense in which they are orbiting.

              • By lazide 2023-12-154:462 reply

                Except they aren’t, which is why they are there and there is a heliopause instead of them being in interstellar space and there not being a heliopause.

                If they had greater than escape velocity, they’d be escaping and we’d not see the graph we see.

                • By grey-area 2023-12-155:531 reply

                  Orbit means going very fast around something in a circular motion. These particles are streaming directly out from the sun, not going round it.

                  As I understand it, it's where these particles reach equilibrium with the interstellar medium. The sun is like a comet at a large enough scale, with a long tail of particles as it moves through the galaxy.

                  https://en.m.wikipedia.org/wiki/Heliosphere

                  • By lazide 2023-12-15 12:24 (2 replies)

                    The particles don’t meaningfully interact, they aren’t dense enough.

                    • By adwn 2023-12-15 14:05 (1 reply)

                      They interact via the electromagnetic force.

                      • By lazide 2023-12-17 2:54 (1 reply)

                        So that graph showing no interactions is lying?

                        • By pdonis 2023-12-18 23:55

                          The graph is not a graph of interactions between solar wind particles and interstellar medium particles. It is a graph of solar wind particles detected by Voyager 1.

                    • By pdonis 2023-12-15 16:46 (1 reply)

                      They are plenty dense enough to interact given that it's plasma.

                      • By lazide 2023-12-17 2:54 (1 reply)

                        You might want to re-read that graph. And conservation of momentum means the particles leaving the sun don’t stop rotating when they leave the sun - the sun is rotating.

                        • By pdonis 2023-12-18 23:56

                          > You might want to re-read that graph.

                          You might want to re-read the Wikipedia page. It explicitly says that Voyager 1 saw the density of plasma around it increase by a factor of 40 as it crossed the heliopause. (For Voyager 2, it was a factor of 20, as I have posted elsewhere in this discussion.) It also explicitly says that the solar wind is stopped at the heliopause due to the pressure of the interstellar medium, which, last I checked, means the interstellar medium is interacting with the solar wind.

                          > conservation of momentum means the particles leaving the sun don’t stop rotating when they leave the sun - the sun is rotating

                          Sure, with a period of about 27 days. Go do the math and compare the tangential velocity that equates to with the tangential velocity required to orbit the Sun just above the Sun's surface.
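                          That math, sketched with standard solar values (an editorial back-of-the-envelope, not figures from the thread):

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_sun = 1.989e30     # solar mass, kg
R_sun = 6.957e8      # solar radius, m
T_rot = 27 * 86400   # ~27-day equatorial rotation period, in seconds

# Tangential speed of the solar surface due to rotation
v_tan = 2 * math.pi * R_sun / T_rot
# Circular orbital speed just above the solar surface
v_orb = math.sqrt(G * M_sun / R_sun)

print(f"Rotational speed at surface: {v_tan / 1e3:.1f} km/s")
print(f"Orbital speed at surface:    {v_orb / 1e3:.0f} km/s")
```

                          The rotation works out to roughly 2 km/s against an orbital speed of over 400 km/s, i.e. well under 1% of what orbiting just above the surface would require.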

                • By pdonis 2023-12-15 16:46 (1 reply)

                  > If they had greater than escape velocity, they’d be escaping

                  Only if the space they were escaping into were vacuum. Which it isn't. What stops them is not the Sun's gravity but the plasma in the interstellar medium.

                  • By lazide 2023-12-17 2:56 (1 reply)

                    If the interstellar medium is not a vacuum, what is? Last I checked, it was literally billions of times lower density than the hardest vacuum we’ve been able to produce on earth.

                    • By pdonis 2023-12-18 22:08 (1 reply)

                      > If the intersteller medium is not a vacuum, what is?

                      There is no threshold of low enough density at which there is suddenly "vacuum". If there are particles present, there are particles present, and they can have effects.

                      > Last I checked, it was literally billions of times lower density than the hardest vacuum we’ve been able to produce on earth.

                      [Edit--these numbers are off--see my post downthread]

                      And the solar wind is much, much less dense than that. Interstellar medium density is about a trillion particles per cubic meter. Solar wind density is about 5 thousand particles per cubic meter. So the interstellar medium is more than dense enough to stop the solar wind.

                      • By lazide 2023-12-18 22:20 (1 reply)

                        Cite? Everything I see indicates solar wind density is 10-100 times interstellar medium density.

                        I suspect you got your numbers reversed.

                        • By pdonis 2023-12-18 23:52

                          > Cite?

                          You are correct that the numbers I cited were off, because I had neglected to check specifically for numbers at the heliopause. Here is a better set of numbers:

                          https://ui.adsabs.harvard.edu/abs/2019NatAs...3.1024G/abstra...

                          The plasma density in the outer heliosphere is typically about 0.002 cm-3. The first electron density measured by the Voyager 2 plasma wave instrument in the interstellar medium, 0.039 cm-3 ± 15%, was on 30 January 2019 at a heliocentric radial distance of 119.7 au. The density jump, about a factor of 20, confirms that Voyager 2 crossed the heliopause.

                          In other words, the density of the interstellar medium just outside the heliopause, as detected by Voyager 2, was about 20 times larger than the density of the plasma just inside the heliopause.
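                          As a quick check of the quoted figures:

```python
# Density jump across the heliopause, using the numbers quoted above
n_heliosphere = 0.002   # typical plasma density in the outer heliosphere, cm^-3
n_interstellar = 0.039  # first interstellar electron density seen by Voyager 2, cm^-3

jump = n_interstellar / n_heliosphere
print(f"Density jump: ~{jump:.0f}x")
```

                          0.039 / 0.002 is 19.5, consistent with the "factor of 20" the paper reports.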

            • By pdonis 2023-12-15 16:45 (1 reply)

              The earth's atmosphere is not orbiting the Earth. It is in hydrostatic equilibrium in the earth's gravitational field. Big difference.

              • By lazide 2023-12-17 2:54 (1 reply)

                And yet, atmospheric particles whose mean free path doesn’t intersect with other atmospheric particles (or which, when they do collide, end up with a velocity delta and direction that reaches escape velocity) escape and are lost.

                You might want to rethink that. It’s a useful model in bulk in the lower atmosphere, but it’s far from true in the upper atmosphere.

                • By pdonis 2023-12-18 22:14

                  > You might want to rethink that.

                  You might want to rethink your claim.

                  First, while the upper atmosphere is much less dense than the lower, and the fluid approximation becomes less and less useful as you gain altitude, that still doesn't mean that "a bunch of particles in free-fall orbits" becomes a useful model. The average thermal velocity of a molecule in the upper atmosphere is still well short of orbital velocity at that altitude. Some molecules acquire sufficient velocity to escape, sure, but that doesn't mean the others are in orbit.

                  Second, the Earth's atmosphere is not a good analogy for what is happening at the heliopause anyway.

        • By mattgrice 2023-12-15 4:34 (1 reply)

          [flagged]

          • By cwalv 2023-12-15 4:46

            I enjoyed reading their guess. Since he started the comment with 'I guess', it seems harmless enough. Your comment, otoh, is pretty rude.

    • By Tommstein 2023-12-15 4:59 (1 reply)

      The Voyagers leaving the solar system is so popular that they've done it like 10 times!

      • By wongarsu 2023-12-15 7:58

        And in another 300 years or so they will leave the solar system again when they reach the Oort cloud, and in another 30000 years or so they will finally leave the solar system for the last time when they leave the Oort cloud

    • By dotancohen 2023-12-15 2:59 (2 replies)

      Maybe you're the right person to ask. How do we know that the sun-particle-sensor or its wiring harness or one of its connectors simply hadn't failed?

      • By cshimmin 2023-12-15 4:23 (2 replies)

        A very valid question. In this case IIRC this drop-off in low-energy solar wind particles was correlated with observed changes in the magnetic field and also an increase in higher-energy cosmogenic particles all around the same time. These three phenomena (observed by different instruments) were theoretically predicted to occur at the heliopause transition. So it lends much more confidence to the interpretation of the data.

        • By ferw 2023-12-15 5:08

          I believe that the direction of the magnetic field didn't change, contrary to expectations. The explanation was that the galaxy's magnetic field is aligned with the magnetic field of our sun.

          https://web.archive.org/web/20130913162459/http://news.natio...

        • By superjan 2023-12-15 11:00

          And hopefully Voyager 2’s measurements will confirm this. I don’t know how long we need to wait though.

      • By NhanH 2023-12-15 3:01

        I am guessing the number comes from multiple systems of independent sensors. So the assumption is that they won’t all fail at the same period in the same way

  • By gzer0 2023-12-15 3:46 (10 replies)

    One of my favorite facts ever is that Voyager 1 contains something called the Voyager Golden Record [1]. It has the following quote written:

    This is a present from a small, distant world, a token of our sounds, our science, our images, our music, our thoughts and our feelings. We are attempting to survive our time so we may live into yours.

    I get chills every time I think about this. I hope we can recover from this event and re-establish communication.

    [1] https://en.wikipedia.org/wiki/Voyager_Golden_Record

    • By Fluorescence 2023-12-15 8:28 (3 replies)

      I like that we sent unsolicited nudes. An act I could likely be convicted for if I sent it to a neighbour no matter how nice the gold disk or long the journey...

      ... but now I look at the pictures on wikipedia and see there are no nudes or even a Vitruvian Man. How strange to have a belief of many decades suddenly corrected. Seems that I have mentally fused the earlier Pioneer plaque with Voyager.

      https://en.wikipedia.org/wiki/Pioneer_plaque

      > After NASA had received criticism over the nudity on the Pioneer plaque (line drawings of a naked man and woman), the agency chose not to allow Sagan and his colleagues to include a photograph of a nude man and woman on the record. Instead, only a silhouette of the couple was included.[15] However, the record does contain "Diagram of vertebrate evolution", by Jon Lomberg, with drawings of an anatomically correct naked male and naked female, showing external organs.[16] The person waving on the diagram was also changed: on the Pioneer plaque, the man is waving, while on the "Vertebrate evolution" image, the woman is waving.

      • By heresie-dabord 2023-12-15 10:15 (3 replies)

        > After NASA had received criticism over the nudity on the Pioneer plaque

        Let's just think about this for a moment.

        Some people were sufficiently prudish and/or puritanical to make a formal objection about an illustration of our species -- an illustration being sent into the Cosmos -- into the Cosmos where there are _no other humans_ -- an illustration, I say, that was destined to leave our Solar System and likely never be seen again.

        And 50 years later, in 2023, I am sure that there has been little improvement in the public discourse of the society that somehow produced these great NASA missions. In fact, the social discourse is _worse_ today.

        > "According to astronomer Frank Drake, there were many negative reactions to the plaque because the human beings were displayed naked.[19] When images of the final design were published in American newspapers, one newspaper published the image with the man's genitalia removed and another newspaper published the image with both the man's genitalia and the woman's nipples removed.[20] In one letter to a newspaper, a person angrily wrote that they felt that the nudity of the images made the images obscene.

        > "Sagan said that the decision to not include the vertical line on the woman's genitalia (pudendal cleft) which would be caused by the intersection of the labia majora was due to two reasons. First, Greek sculptures of women do not include that line. Second, Sagan believed that a design with such an explicit depiction of a woman's genitalia would be considered too obscene to be approved by NASA.[10] According to the memoirs of Robert S. Kraemer, however, the original design that was presented to NASA headquarters included a line which indicated the woman's vulva,[11] and this line was erased as a condition for approval of the design by John Naugle, former head of NASA's Office of Space Science and the agency's former chief scientist.

        If humans ever establish a colony beyond Earth, it will not be like Star Trek. It will be Puritans in Space.

        • By ordu 2023-12-15 20:31

          > Some people were sufficiently prudish and/or puritanical to make a formal objection about an illustration of our species -- an illustration being sent into the Cosmos -- into the Cosmos where there are _no other humans_ -- an illustration, I say, that was destined to leave our Solar System and likely never be seen again.

          It is not exactly correct. This illustration is widely known now. People look at it, not aliens. I'd say the whole idea to send a picture is directed not at aliens but at humans: it is plainly improbable someone will see the original plaque.

          I can imagine that someone finds this plaque, but it will be space archaeologists from Earth. Once again: humans.

          This plaque was made for humans. They keep saying that it was made for aliens, but they like to daydream.

        • By defrost 2023-12-15 10:22 (1 reply)

          > If humans ever establish a colony beyond Earth

          s/humans/USAians/

          I'm pretty sure many parts of the globe are fine with full commando.

          Most French, any average Australian, Brazil, etc. very likely sent in zero (0) letters of outrage.

          • By nozzlegear 2023-12-16 0:29 (1 reply)

            s/USAians/Americans/

            • By defrost 2023-12-16 2:45 (1 reply)

              I don't think the south American Brazilians are anywhere near as prudish as the central north American USofA denizens.

              • By nozzlegear 2023-12-16 15:25 (1 reply)

                And I didn’t think South Crossers were never told what the A in USofA stands for but here we are.

                • By defrost 2023-12-16 21:15

                  Indeed, in a time when seemingly many basement Canadians struggle with Venn diagrams and naming a country that starts with the letter 'U'.

      • By ponector 2023-12-15 13:26 (1 reply)

        Wikipedia is full of nudes. From classic pictures with Venus to parts of the body, like labia.

        • By debo_ 2023-12-15 21:10

          I don't know, photos of labia are really only lip service to nudity. /joke

    • By ddingus 2023-12-15 4:54 (2 replies)

      I like the simple, humble message.

      My own take is similar. Truth is, we are young, we shit where we eat, we spend considerable resources killing one another, we do not take good care of our own, and we reproduce like rabbits.

      For all we know there is a signpost some parsecs out there that reads: Do not yet approach. These things have not yet become ready for what contact could likely mean. We must pass tests to come. Tests that arise as an artifact of our current human condition.

      Trying to survive our time is so damn spot on too! Real as it gets, and for that record, real as it needs to be.

      Once we do get to really living, thriving on a scale we imagine others farther along in their journey as beings could maybe be, we might look back in awe that we managed it! Others may look toward us with some hope and anticipation of a meeting being worth it one day, should we succeed.

      Maybe, just maybe, that scrappy little world and its people somehow grow enlightened enough to endure and become peers of a sort: likely young, but maybe a ready sort.

      A whole lot went into those short phrases. Damn good stuff.

      • By vasco 2023-12-15 6:52 (2 replies)

        We don't even have a world government yet! Things at home are still very disorganized for receiving guests, I agree.

        • By somenameforme 2023-12-15 12:22 (2 replies)

          Have you noticed this correlation that all the governments people find desirable are of tiny little populations? And that the larger a governed population becomes the more of a mixture of dysfunctional, corrupt, and/or authoritarian the government seems to become?

          I don't really think it's a correlation. It's tough for any given entity to truly represent 10 people, let alone 10 million. And by the time you start speaking of the hundreds of millions, any meaningful notion of representation is just out the window. And now imagine this on a scale of billions, with countless groups that all have largely mutually exclusive views?

          And this will become even more true in the future. Imagine what will happen as we start to be able to reach out and colonize other planets. The cultures, ideals, interests, and even language on those places will tend to constantly diverge from that on Earth. To have somebody try to represent somebody without even sharing the same fundamental values is a system doomed to trend towards authoritarianism at first, and ultimately to complete collapse and failure.

          • By MagicMoonlight 2023-12-16 15:01

            Everything comes down to motivation. When it's you and three other people, you can figure out a system. It's very easy to enforce the system because you all know each other. Not working well lets them down and lets them beat you down.

            When it's you and a village, it's still relatively easy to enforce things because you have families and relationships. There's some inefficiency and some corruption from individuals but you can still try and monitor everything.

            When it's you and a hundred million people, you could never even meet everyone, let alone know them. So you need a thousand steps between. And every step and every system can have corruption and inefficiency. Unless you have some sort of central mandate like religion, it's very hard to motivate people because everyone is so separated. Even if you do, you still get corruption, like in China.

          • By okasaki 2023-12-15 16:17 (3 replies)

            I think you're just making stuff up. Eg.

            CCP approval rate: 89%

            https://www.statista.com/statistics/1116013/china-trust-in-g...

            • By somenameforme 2023-12-15 17:36

              China's well into the authoritarian phase, but I think they also have an even more unique issue driving their success. Just 60 years ago you also had tens of millions of Chinese literally starving to death in the Great Leap Forward. Since then they've become the largest (PPP) economy, and continue to grow rapidly with widespread visible quality of life improvements. That's going to drive a tremendous amount of good will. The problem is that while they still have plenty of room to grow, it's completely and absolutely unsustainable. And what happens once it does eventually end?

            • By someuser2345 2023-12-15 17:03

              I suspect the number is that high because Chinese people are scared what their government would do to them if they criticized it.

            • By damiankennedy 2023-12-15 17:15

              You can't have statistics on a website without /s

        • By mtsr 2023-12-15 7:34 (2 replies)

          Human diversity being what it is, I doubt a world government is desirable at all.

          But maybe we can agree on some basics around fairly sharing food and toys and not trying to steal the other kids toys (or even half their yard).

          • By vasco 2023-12-15 10:34 (1 reply)

            In my opinion it's just a problem of scale. You could say the same thing about neighbour tribes of the same region 10k years ago - that they would never get along. We share the most important thing of all, the planet, and our humanity. And now the internet connects us all in real time, it's just a matter of letting time pass as our outlooks get more and more similar and what we share becomes bigger than what differentiates us, even if there needs to be some more wars along the way.

            One way to picture it is, imagine if there's a planet somewhere in the universe with life. Given enough time, do you expect it to have a unified government that fractally subdivides (like states, regions, city governments), or do you expect it to have multiple heads? I think it's way more likely that a dominant culture at some point appears, itself being a mesh of different cultures from the different regions, but at some point unifies. I just don't see another way.

            Even if we look at history, while there's periods of fragmentation after periods of consolidation, in general things trend towards consolidation. We're more consolidated than ever before and I think it only goes in one direction. I'm talking here on the scale of thousands of years by the way. So like, in the year 5000, is there one world government or not? That would be the bet.

            I'm not even saying it's desirable or not, just that it's likely to happen. For example if one country suddenly discovers a major technological advance, it's likely to exploit it by starting wars to consolidate, as it has happened all through history. And that only has to happen a few times over the course of thousands of years to get us to a world government. There aren't even 200 countries in the world!

            • By dotnet00 2023-12-15 17:59

              One important consideration is that many of the larger countries are technically unified entities, but in reality they consist of many smaller governments with significant power (eg states in the US, Canada and India). They are tolerable because ultimately they still mostly recognize that people would rather be governed by an entity that has their local interests in mind, with the role of higher levels being to manage interactions between them.

              Thus, I don't see a world government happening until we're so well into colonizing other worlds that it's more practical to deal in terms of planets than with individual countries. Even at that point though, I'd expect something similar to countries to continue to exist.

              Put differently, I think a single entity with governance over the entirety of humanity is never going to happen (assuming we don't suffer some sort of near extinction level collapse).

          • By ddingus 2023-12-15 19:25

            I agree, and I think our biggest challenge is actually mutual understanding and respect. We don't need one government to organize as a species and get to where we could become a peer with other advanced ones that we are imagining today. But we need to understand ourselves well enough to get along at a bare minimum.

      • By opyate 2023-12-15 7:53

        I had a dream a few days ago: our overlords cancelled the experiment (us), nah it's not working, but here's a new specimen with the "tribal" bit switched off. We suspect it might go better this time.

        The Futurama meme "I don't want to live on this planet any more" comes to mind way too often these days...

    • By dorkwood 2023-12-15 7:18

      I was curious to see what music they included. This passage from the Wikipedia page made me smile:

      > The inclusion of Berry's "Johnny B. Goode" was controversial, with some claiming that rock music was "adolescent", to which Sagan replied, "There are a lot of adolescents on the planet."

    • By wildekek 2023-12-15 12:01

      I own a box-set with a copy of the golden record, book and other memorabilia. It's an amazing work of art, and if you're a Voyager fan, treat yourself to one. The book alone is worth the price. The more I understand the Golden Record, the more I realize it has less to do with what is out there and more with how precious what we have right here is. https://ozmarecords.com/collections/shop/products/voyager-go...

    • By DamnInteresting 2023-12-15 14:50 (1 reply)

      I made this about 7 years ago: http://voyager.damninteresting.com/

      • By mzs 2023-12-15 15:04

        thank you for creating this

    • By mike_d 2023-12-15 5:16 (3 replies)

      The Golden Record was supposed to include the Beatles' "Here Comes The Sun", but the label wanted more in licensing fees than the whole thing cost to produce.

      • By jzombie 2023-12-15 5:41 (2 replies)

        > the label wanted more in licensing fees than the whole thing cost to produce.

        Hopefully this message was sent instead.

        • By nomilk 2023-12-15 5:59

          Humorously and sadly, it would be informative about aspects of human nature.

        • By kqr 2023-12-15 8:15 (1 reply)

          If it was communicated by radio at any point, the message was technically broadcast for pickup by sufficiently advanced receivers...

          • By zymhan 2023-12-15 8:34 (1 reply)

            Approximately 1000 years to get to Omicron Persei 8, according to Futurama

            • By tgv 2023-12-15 14:40

              Well ahead of the golden disk then.

      • By k1t 2023-12-15 7:35

        Only the recording industry could take a record that will never be played and make the licensing fees more expensive than the gold record it would be printed on.

        (actually it is gold-plated copper)

      • By wannacboatmovie 2023-12-15 8:37 (1 reply)

        Yoko trying to collect royalties from aliens in outer space was not on my bingo card.

        • By mlrtime 2023-12-15 13:50

          Any bingo card with Yoko would have alien royalties be the *least* crazy square.

    • By thrdbndndn 2023-12-15 6:19 (1 reply)

      I think the golden record is even more famous than the Voyager(s) themselves.

      At least I learned it in my childhood (together with Pioneer plaque -- I just noticed they're not the same thing!)

      • By DougEiffel 2023-12-15 14:19

        So dumb. It's free publicity for the rest of human existence. They should have been begging to have the song included.

        There was even a small chance of aliens becoming Beatles fans and coming to Earth to trade unimaginable wealth in exchange for licensing rights.

    • By HardDaysKnight 2023-12-15 15:57 (2 replies)

      I don't understand the Golden Record. Even assuming other advanced civilizations, "there’s an infinitesimally small chance that the Golden Record will be picked up."[0] So, at some (considerable?) cost and time, something meaningless and ineffective (from the perspective of its ostensible purpose, communicating with alien civilizations) was undertaken. So what was the point? Why was it done? Note, I'm not questioning sending out probes, gathering data, space exploration, etc.

      [0] https://www.atlasobscura.com/articles/voyager-golden-record-...

      • By broscillator 2023-12-16 12:23

        > So what was the point? Why was it done?

        You could ask this about any piece of art. Additionally, you could say these questions are, partially, also the function of art.

        It's not just the contents that display humanity, it's the fact of sending it that says "we're human". Ultimately, this is more important than gathering data or anything of the sort with a clear, functional purpose.

      • By uw_rob 2023-12-15 16:12

        The Golden Record acts as a good thought exercise about how we'd go about communicating with an alien species. It's also a good public outreach and educational tool. It inspires awe and encourages taking time to reflect on what we are most proud of as a species.

    • By fieryskiff17 2023-12-15 3:50

      [flagged]

HackerNews