I should have loved biology too

2025-04-22 16:46 · nehalslearnings.substack.com

How I went from hating it to being obsessed, the allure of great writing, and a post-scuba-dive moment of clarity

About a year ago, I came across James Somers’ blog post, I should have loved biology. I began reading it and every sentence struck a chord: “I should have loved biology but found it a lifeless recitation of names”; “In textbooks, astonishing facts were presented without astonishment”; “In biology class, biology wasn’t presented as a quest for the secrets of life. The textbooks wrung out the questing.” In fact, the chord was so neatly struck that I stopped reading about a quarter of the way through, and found myself falling into a memory. I was sitting in my 7th grade biology class, completely uninterested. Every time our teacher would turn her back to us to write on the blackboard, my friends and I would sling paper pellets at each other across the room, barely paying attention as she narrated wearily about cell walls or chloroplasts or mitochondria being the powerhouse of the cell. I liked math and physics and economics and even chemistry, to some extent (much less pellet slinging), but biology, with its endless memorization of definitions and regurgitation of facts – no, biology could go back under the soil it came from.

Now, I’m obsessed. I can’t get enough. I’ve read about fifteen books in the last year or so, watched countless YouTube videos, and started a bioinformatics course. And my list keeps growing. The first quarter of Somers’ post was so effective in making me consider my own disinterest-to-obsession journey (I didn’t even read the rest until months later) that I decided to look back and examine what caused this complete change of heart.

More than anything – nature documentaries, science shows, museum visits – it was great writing that allowed me to see the world of biology differently. My interest in biology, or rather the reversal of my disinterest in biology, began when I read The Sixth Extinction in 2016, during my second year of university. Elizabeth Kolbert’s gripping writing unveiled a completely different perspective on the subject, right alongside the scientists and researchers: driving through a Panamanian rainforest looking for golden frogs, searching a littered New Jersey creek for ammonites, scuba-diving in Castello Aragonese to inspect carbon dioxide rushing out of sea vents and at the Great Barrier Reef to look at octopuses and coral reefs and blue starfish and leopard sharks and giant clams. Biology, suddenly, didn’t seem like just a list of facts to memorize; it was an adventure.

I still remember how I felt after finishing her book: a strange mix of wonder and tragedy, awe and despair. That narrative structure – vivid reporting and meticulous research built on a foundation of context and history – changed how I saw science and scientists. No more dry paragraphs of definitions and explanations; every discovery had a story.

The Great Barrier Reef, the world’s largest coral reef system. Elizabeth Kolbert occasionally reports about the impact of climate change there.

I wanted more books just like that, and luckily for me, several months later, in an airport bookshop in Bangalore, I came across and picked up The Gene. I didn’t know who Siddhartha Mukherjee was at the time (the mention of “Pulitzer Prize winner” on the cover possibly influenced me), and I had no prior interest in genetics, but that book would end up completely changing how I saw biology and non-fiction writing. If Kolbert made a crack in the dam I had built around biology, Mukherjee would go on to smash the whole thing to pieces.

One of the stories in the book, the discovery of the gene that caused Huntington’s disease, moved me tremendously when I first read it a few years ago. It’s the perfect example of the amount of effort that goes into a scientific discovery that then ends up as a single sentence in a textbook; in this case, that Huntington’s disease is a hereditary, neurodegenerative disorder caused by a mutation in a single gene.

The story of finding that mutation would make a thrilling movie: a young woman named Nancy Wexler, devastated by the news that her mother has been diagnosed with Huntington’s and that she and her sister each have a 50-50 chance of inheriting it, decides to devote her life to solving this medical mystery. Her quest takes her from nursing homes in Los Angeles to interdisciplinary scientific workshops in Boston to stilt villages surrounding Lake Maracaibo in Venezuela. Her decade-long blood and skin sample collection efforts there would create the largest family tree of Huntington’s ever assembled, leading to the first genetic test for the disease and, later, to the precise genetic mutation that caused it.

Dr. Nancy Wexler in 1990, with a family tree that traced the path of Huntington’s. Acey Harper/The LIFE Collection, via Getty Images. Taken from the New York Times.

The gene sequence had a strange repeating structure, CAGCAGCAG…, continuing for 17 repeats on average (ranging from 10 to 35 normally), encoding a huge protein that’s found in neurons and testicular tissue (its exact function is still not well understood). The mutation that causes HD increases the number of repeats to more than forty – a “molecular stutter” – creating a longer huntingtin protein, which is believed to form abnormally sized clumps when enzymes in neural cells cut it. The more repeats there are, the sooner symptoms occur and the more severe they are.
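
The repeat arithmetic is simple enough to sketch in code. Here’s a toy illustration in Python; the sequence is invented (real testing sequences the actual HTT gene), and the thresholds are just the rough ranges described above:

```python
import re

# Toy illustration: count consecutive CAG repeats in a DNA snippet and apply
# the rough ranges described above (normal: ~10-35 repeats; HD-associated: 40+).
# The sequence is made up; this is not a diagnostic tool.

def longest_cag_run(seq: str) -> int:
    """Length, in repeats, of the longest uninterrupted run of CAG triplets."""
    runs = re.findall(r"(?:CAG)+", seq.upper())
    return max((len(run) // 3 for run in runs), default=0)

def classify(repeats: int) -> str:
    if repeats >= 40:
        return "expanded (associated with Huntington's disease)"
    if repeats > 35:
        return "intermediate"
    return "typical range"

snippet = "TT" + "CAG" * 17 + "GCA"  # 17 repeats: around the population average
n = longest_cag_run(snippet)
print(n, "repeats ->", classify(n))  # prints: 17 repeats -> typical range
```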

Nancy herself opted not to take the genetic test she helped create. “If the test showed I have the gene,” she wrote in 1991, “would I continue to feel the happiness, the passion, the occasional ecstasy I feel now? Is the chance of release from Huntington’s worth the risk of losing joy?” In 2020, at the age of 74, she revealed that she had Huntington’s. The public acknowledgment was not a surprise to those close to her – over the previous decade, they had watched her gait slowly deteriorate, her speech slur, and her limbs jerk in random directions: the same signs she had seen in her mother half a century earlier, and in the hundreds of Venezuelan patients she had tended to ever since.

There’s still no cure for Huntington’s disease, but every time I hear about progress toward one, I feel a rush of emotions, as if I have a personal stake in its discovery. I dearly hope to see a cure found within Nancy Wexler’s lifetime; this movie deserves a happy ending.

Pick a field in biology, or a slice of history, and you’ll find countless stories just like this. Mischievous Watson and Crick figuring out the structure of DNA after getting a peek at Rosalind Franklin’s crisp X-ray crystallography photograph; Baruch Blumberg discovering the hepatitis B virus after locating its antigen in the blood of an Australian Aboriginal man, and beating the NIH to its vaccine – the world’s first vaccine against a cancer-causing virus; James Simpson systematically inhaling various vapors and recording their effects in the search for a better anesthetic, resulting in the discovery of chloroform; Andreas Vesalius taking prisoners’ corpses down from the gallows in 16th-century Paris and, together with artists from Titian’s workshop, publishing over 200 incredibly detailed drawings of human anatomy.

An illustration from De Humani Corporis Fabrica (On the Fabric of the Human Body), published in 1543 by Andreas Vesalius, with illustrations attributed to artists from Titian’s workshop. The first edition included over 200 high-detail anatomical illustrations. I particularly like this one.

History and stories may not be immediately applicable, but used as a key ingredient they make the discoveries more majestic, more impactful. That’s what I love about Mukherjee’s writing: it’s a unique stew of history, biography, experimental methods and results, scientific findings and their significance, seasoned well with personal anecdotes, and presented with the candor of a physician and the artistry of a poet. The context creates a kind of multiplier when the mind-shattering discoveries are explained – how a genotype gives rise to a phenotype, how cancer works, how a heart beats or a bone mends itself or a brain stores a memory. Like the climax of a movie, the beauty and immensity of the discovery or the invention feels far more compelling after following the steps that got us there.

Not every discovery has an entertaining backstory, but even when focusing on just the phenomenon, great technical writing has this striking ability to make you see the world differently. The same molecule or cell or organ, theory or experiment or discovery, suddenly seems monumental, like it’s the most important thing in the world. It makes you think: why didn’t I learn about this before?

One of my favorites is the way Mukherjee describes how a neuron communicates in The Song of the Cell:

Imagine the nerve, first, in its “resting” state. At rest, the internal milieu of the neuron contains a high concentration of potassium ions and a minimal concentration of sodium ions. This exclusion of sodium from the neuron’s interior is critical; we might imagine these sodium ions as a throng outside the citadel, locked out of the castle’s walls and banging at the gates to get inside. Natural chemical equilibrium would drive the influx of sodium into the neuron. In its resting state, the cell actively excludes sodium from entry, using energy to drive the ions out…

[...] The dendrites are the site within the neuron where the “input” of the signal originates. When a stimulus—typically a chemical called a “neurotransmitter”—arrives at one of the dendrites, it binds to a cognate receptor on the membrane. And it is at this point that the cascade of nerve conduction begins.

The binding of the chemical to the receptor causes channels in the membrane to open. The citadel’s gates are thrown ajar, and sodium floods into the cell. As more ions swarm in, the neuron’s net charge changes: every influx of ions generates a small positive pulse. And as more and more transmitters bind, and more such channels open, the pulse increases in amplitude. A cumulative charge courses through the cell body.

The mental picture of a throng of sodium ions locked out of the castle walls is so helpful and convincing. I can see, in my mind’s eye, these shadowy ions banging at the gates to get inside, like an invading army. Then, after the neurotransmitter binds to the cognate receptor, the sodium ions don’t just enter, they flood and swarm in; the membrane doesn’t just open, its gates are thrown ajar. The metaphor makes the chemical process relatable without leaving out the details; the vivid language romanticizes it, creating a mental picture that not only stays with you, but makes you want to learn more.
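
It’s also exactly the kind of mental picture that begs to be simulated. As a deliberately crude sketch (the numbers are toy values I picked, nothing physiological), the “cumulative charge” idea fits in a few lines of Python: more transmitters bind, more gates open, and the voltage climbs toward a firing threshold:

```python
# Toy model of the passage above: neurotransmitter bindings open sodium
# channels, sodium influx pushes the membrane voltage up from rest, and the
# neuron "fires" if the voltage crosses a threshold. Values are illustrative.

REST_MV = -70.0       # resting membrane potential (roughly -70 mV in real neurons)
THRESHOLD_MV = -55.0  # approximate firing threshold
DT = 0.1              # time step, in milliseconds

def stimulate(bound_transmitters: int, t_end: float = 20.0) -> str:
    """More bound transmitters -> more open gates -> faster depolarization."""
    v = REST_MV
    for step in range(int(t_end / DT)):
        sodium_influx = 0.4 * bound_transmitters  # each open gate adds a small positive pulse
        leak = 0.2 * (v - REST_MV)                # pumps and leaks pull v back toward rest
        v += (sodium_influx - leak) * DT
        if v >= THRESHOLD_MV:
            return f"fired at t = {step * DT:.1f} ms"
    return f"never fired (settled near {v:.1f} mV)"

for n in (1, 5, 10):
    print(f"{n:2d} transmitters bound: {stimulate(n)}")
```

With one or five bindings the voltage settles below threshold and the cell stays quiet; with ten, the accumulated influx crosses the threshold and the neuron fires, just like the swelling pulse in Mukherjee’s description.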

A little later in the chapter, Mukherjee writes about neural connection in the fetus:

Neural connections between the eyes and the brain are formed long before birth, establishing the wiring and the circuitry that allow a child to begin visualizing the world the minute she emerges from the womb. Long before the eyelids open, during the early development of the visual system, waves of spontaneous activity ripple from the retina to the brain, like dancers practicing their moves before a performance… This fetal warm-up act—the soldering of neural connections before the eyes actually function—is crucial to the performance of the visual system. The world has to be dreamed before it is seen.

There’s something about this evocative language that leaves a sweet, lingering imprint on my mind — a new set of neural connections; my own throng of sodium ions banging at the gates, my own ripples. The details – which ions, the name of the receptor – might get murky with the passage of time, but the sweet feeling remains, like the memory of a heavenly meal; you may have forgotten the exact taste, but the feeling of satisfaction lingers, and occasionally, when it drifts front and center, you might imagine visiting the restaurant (or home) once more.

That’s what I feel after reading books like this – the belief that I’ll revisit them, relive them, relearn them. They fill up a reservoir of curiosity, and every subsequent piece of stimulus – a neurology article or academic paper shared on Twitter, a documentary or YouTube video, another book (even a textbook) – opens the floodgates and makes you want to explore a little more. I might not have the equipment to see these cells myself, but when written about like this, this world too can be dreamed before it is seen.

Santiago Ramón y Cajal’s famous drawing of neurons, circa late 19th century. He would go on to create more than 2,900 drawings detailing the nervous system’s architecture. Image taken from Quanta Magazine

The more you explore, the more astonishing it gets. Suddenly, you’re surrounded by facts that stop you in your tracks. Like the fact that there are 20-30 trillion red blood cells in our bodies, making up roughly 84% of all our cells, with 1.2 million new ones created in our bone marrow every second. Or the fact that our visual system is predictive: the brain calculates where to move the hand to catch a ball before the eyes have fully registered its trajectory.

One of my favorite ‘sentences that stopped me in my tracks’ comes from Nick Lane’s book, The Vital Question. He starts by carefully explaining that all cells derive their energy from a single type of chemical reaction, the redox reaction, in which electrons are transferred from one molecule to another. Rust is a redox reaction: iron donates electrons to oxygen and is oxidized in the process. Same with fire: each oxygen atom is reduced, receiving two electrons and then two protons to balance the charges, forming water and releasing heat in the process. Respiration — the process that turns our food into energy — does exactly this as well, except that it conserves some of the energy in the form of a molecule called adenosine triphosphate (ATP). Think of ATP as an energy currency, able to be stored or converted back into energy by splitting the molecule into ADP (adenosine diphosphate) and Pi (inorganic phosphate). And so, he writes, “in the end respiration and burning are equivalent; the slight delay in the middle is what we know as life.”
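
To make the equivalence concrete, here is the textbook bookkeeping (my summary, not Lane’s notation): a flame and a cell run the same overall reaction; respiration just banks part of the energy as ATP along the way.

```latex
% Oxygen reduction: each O2 molecule takes up four electrons and four protons
O_2 + 4e^- + 4H^+ \longrightarrow 2\,H_2O

% Burning and respiring glucose share the same overall reaction
C_6H_{12}O_6 + 6\,O_2 \longrightarrow 6\,CO_2 + 6\,H_2O + \text{energy}

% Respiration captures part of that energy by recharging ADP into ATP
ADP + P_i + \text{energy} \longrightarrow ATP + H_2O
```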

Wait, what? The slight delay in the middle is what we know as life? When I first read that, my heart might have skipped a beat. I learned about mitochondria and ATP and redox reactions and aerobic respiration in high school, but I never pictured them as millions of molecular fires that keep us alive. Actually, not millions; at least a quadrillion – per second.

ATP is synthesized by the fabled mitochondria, but that’s not all they do. They also regulate metabolism, participate in cell growth and death, manage calcium levels, and are involved in detoxification, hormone production, and cellular signalling. They even have their own genetic code. In fact, your mitochondria come from your mother and your mother only; they’re not genetically recombined like the rest of you. They’re remarkably fascinating; even the universally memed “powerhouse” doesn’t quite cover their capabilities.

All of this is still merely scratching the surface of wonder. I’ve only really described three examples in biology, all of which relate to human cells. But we’re just one of millions of species on this planet. Bacteria, plants, fungi, insects, birds, reptiles, mammals, and everything in between are all made up of cells. And every level – ecological, species, organism, tissue, cellular, organelle, protein, genome – has its own stories, each its own magic.

In his blog post, Somers advises learning in small, deep slices. But I took a different approach: I went shallow and wide. Kolbert, Mukherjee, and Lane inspired me to explore adjacent domains, and so I read about epidemiology, drug discovery, gene editing, molecular biology, systems and synthetic biology, and immunotherapy, along with memoirs from surgeons, cancer patients, and “biology watchers”. Even my fiction choices started to take on a biological tinge: The Shell Collector, The Covenant of Water, The Overstory. Eventually, I started seeing biology everywhere — the roots of a sidewalk tree battling with concrete, a group of sparrows frolicking in a bush, a young woman in an air cast fiddling with her crutches — as if the subject had escaped the pages and begun whispering its presence wherever I went.

Last summer, I went scuba diving for the first time in my life. I had wanted to go since I was a teen, a desire amplified by reading Kolbert’s adventures and watching ocean documentaries. After years and years of postponing, I finally pulled the trigger and flew to Puerto Vallarta to get Open Water certified. I could fill an entire essay with just this certification experience — the anxiety-inducing pre-dive coursework that essentially just lists the many ways you can get seriously injured or die; the silly awkwardness of training in a Mexican hotel pool surrounded by curious onlookers; the ear injury I sustained on my first ocean dive, when improper pressure equalization caused a rupture that let middle-ear fluid flood my right ear canal, leaving me with partial hearing loss for a week (even PADI’s intimidating coursework could only do so much) — but I will focus on just the experience of my second dive here.

It was a picture-perfect day in Puerto Vallarta: deep blue skies, fluffy cotton-candy clouds floating above, a momentary cool breeze tempering the unrelenting summer humidity. As our boat sped along to Playa Majahuitas, about a 40-minute ride from the main pier, I watched the lush green hills roll by just behind the shore, the ocean shimmering as the sun flung silver disks across its surface. During the ride, I asked the couple sharing the boat about their scuba experiences, and, again, I got a common response I still couldn’t relate to: that it was meditative — that it was where your problems on land disappear, and you get to be a visitor in the home of sea life, a polite guest just observing.

Our dive spot looked like a painting: water so clear you could see schools of fish just by peering over the edge of the boat. Just before we began, we got a surprise visit from a manta ray – this enormous, ethereal creature silently gliding under the water, just flicking the tips of its wings above the surface, as if to say hello, and welcome us into its home.

The dive spot in Playa Majahuitas, Mexico

After the dive that day, I understood what the couple meant. I felt a lot more comfortable with my equipment the second time around, and so, no longer apprehensive about buoyancy or breathing rate or how deep I was, I finally felt free to fully take in my surroundings. I fell into a gentle rhythm: inhale, listen to the hiss of the regulator, exhale, watch the bubbles float away. You’re distinctly aware of each and every moment, mind blank and in awe of the world around you: a large school of Cortez wrasses passing by; a camouflaged octopus hiding on the seabed; a moray eel sticking its neck out of a little hole, an angry look on its face, as if you’ve just disturbed its sleep; the vast, splendid diversity of corals – you can see them living, their little wavy hand-like appendages collecting bits of floating food, tiny fish swimming in and out and around, as if playing a game of tag.

It was truly marvelous. Colors, too, are more vibrant underwater, as if the gods enhanced the saturation as a gift to those who dare venture below. The body of the spotted boxfish is a glittery blue, and its yellow-speckled top shines in contrast. The corals too are rich: deep oranges, yellows, greens, and browns. Even ocean documentaries, with their film-grade color editing, don’t capture the true shades.

During the boat ride back, an incredibly calming bliss completely took over my body. (Maybe that’s also what people mean by its meditative quality, although meditative isn’t exactly the right word.) For me, the whole experience would mark the start of a gradual realization that I wanted my role in biology to be more than just reading. My favorite science writers – Kolbert, Mukherjee, Lane, Lewis Thomas, Donald Kirsch – all wrote from experience, and if I wanted to write, or create, like that, I’d have to experience the world too. I began piecing together the things that had been swimming in my mind: namely, how to combine my past passion, interactive learning, with my latest obsession, biology.

I have since restarted work on my website, Newt Interactive, to make interactive articles and accessible simulators for topics in biology. Like Somers mentions at the end of his blog post, I too want to bring the three-dimensional nature of biology to life. The subject is teeming with fascinating phenomena that remain hidden or inaccessible to those outside scientific and research communities. Occasionally, I’ll come across something incredible — like a video of a molecular motor in action — but the sheer marvel of it just fundamentally doesn’t click unless you’re already well versed in the subject.

My interactive simulator of a coherent type-1 feed-forward loop, a common gene circuit. My hope is that these kinds of playgrounds can make complex topics more accessible. Try it out on Newt Interactive.

I hope to bridge this gap and make some of biology’s intricate mechanisms comprehensible and awe-inspiring for everyone. I’ve started with an interactive series on systems biology (I wrote about the idea and motivation behind it in a previous post), as well as some standalone simulators for a few concepts: coherent type-1 feed-forward loops and genetic circuit evolution, to name two. My goal is to work my way up to more sophisticated simulations, tools, and interactive articles that will help illustrate, and importantly, let you play with, more advanced concepts. I’d also like to write and draw more in general (I’ve started on this too, with my first science graphic and biological math model for Asimov Press).
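
For a taste of what these simulators model: in a coherent type-1 feed-forward loop, a transcription factor X activates a second factor Y, and the target gene Z switches on only when both X and Y are present, so Z ignores brief blips of X. Below is a minimal ODE sketch of that motif; the parameter values and equations are illustrative assumptions on my part, not the actual Newt Interactive implementation:

```python
import numpy as np

def hill(u, K=0.5, n=2):
    """Hill activation: fraction of maximal output at activator level u."""
    return u**n / (K**n + u**n)

def simulate(t_end=20.0, dt=0.01, beta=1.0, alpha=1.0):
    steps = int(t_end / dt)
    t = np.linspace(0.0, t_end, steps)
    x = ((t > 2.0) & (t < 12.0)).astype(float)  # input: X switched on from t=2 to t=12
    y = np.zeros(steps)
    z = np.zeros(steps)
    for i in range(steps - 1):
        dy = beta * hill(x[i]) - alpha * y[i]               # X activates Y
        dz = beta * hill(x[i]) * hill(y[i]) - alpha * z[i]  # AND gate: Z needs X and Y
        y[i + 1] = y[i] + dy * dt
        z[i + 1] = z[i] + dz * dt
    return t, x, y, z

t, x, y, z = simulate()
for probe in (2.5, 5.0, 11.0, 13.0):  # sample the response at a few times
    i = int(probe / 20.0 * (len(t) - 1))
    print(f"t={probe:5.1f}  X={x[i]:.0f}  Y={y[i]:.2f}  Z={z[i]:.2f}")
```

Running it shows the motif’s signature: Z rises only after Y has accumulated (a delayed ON) and collapses almost immediately once X disappears (a fast OFF), which is exactly the pulse-filtering behavior the simulator lets you play with.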

Stories of science can elicit all kinds of emotions: joy, sadness, enchantment, heartbreak, optimism, valiance, apprehension, intrigue. I find, however, that one theme is consistent among the characters: curiosity. This shouldn’t come as a surprise, of course, but what I hadn’t anticipated was how infectious it could be. Just reading about these scientists — their history, theories, efforts, mistakes, and unwavering dedication to truth — kindled an active curiosity in me. I don’t think I have the patience to do what the scientists I read about did: experimenting day after day, week after week, year after year, exploring a small sliver in the “infinite vastness of biology”. And, since my curiosity started and ended with books, I didn’t think there was a meaningful role I could play. I couldn’t hear the calling.

But now I’m not so sure. I have this recurring desire to look down a microscope, and see a cell live its life, see its components swimming, squirming, dividing. I want to see a sequencing machine take in an organism’s DNA and spit out all its nucleotide bases; to hold a test-tube with genetic material that I edited with CRISPR-Cas9; to roam around a laboratory and peek at each bench’s weird collection of tools and equipment and liquids, slide my feet across the polished laboratory floor, smell the lingering scent of disinfectant; to go on more dives and hikes and explore the breathtaking diversity of life. It’s not quite a calling, more like hearing a faint ringtone in a distant room. You’re not sure if your phone’s ringing or your mind’s making the sound up. Maybe this time it’s worth taking a look.


Read the original article

Comments

  • By kleiba 2025-04-22 17:14 · 9 replies

    A while ago, I taught CS for a year in a local high school. I can very much relate to the notion of "astonishing facts were presented without astonishment": as a teacher, you don't have the freedom to teach whatever you want (of course); you're very tightly bound to a curriculum developed by the state government. And for CS, this curriculum was so uninteresting and uninspiring (what a surprise: 13 year old kids don't care about the history of computers) that I couldn't blame any of my students for not showing much interest in my classes.

    As a matter of fact, I gave up after just one year. It wasn't any fun for anyone, not for the students, not for me.

    • By ern 2025-04-22 23:34 · 1 reply

      I can really relate to your experience, even though mine was from a parent's perspective rather than a teacher's. I found a similar thing when tutoring one of my children in trigonometry. The way the material was being presented in school didn't click with him, but astonishingly, despite having studied it decades ago both at school and university, it was only in explaining it to him that it finally made sense to me. The unit circle definition of a tangent is a thing of beauty. I was able to get my child to appreciate it as well because of the extra time I could spend with him, whereas the teacher had to hit curriculum benchmarks.

      I also think this is where things like intergenerational math-phobia come from: parents who don't grasp core concepts and are scared off, and can't help their own children, creating an ongoing cycle.

      • By BeFlatXIII 2025-04-23 1:26 · 1 reply

        > I also think this is where things like intergenerational math phobia come from: (elementary) teachers who don't grasp core concepts, are scared off, and can't help their own students, creating an ongoing cycle.

        I hope you appreciate my addition of the other common path of math phobia.

        • By ern 2025-04-23 6:12

          Absolutely, I do appreciate that addition — I definitely had teachers like that.

          It’s probably why, when I got to university and tackled subjects like probability theory, discrete math, and theoretical CS, I did extremely well — they weren’t reliant on the shaky algebra and trig foundation I had from school. Once the focus shifted to logic and conceptual thinking, without the baggage of poorly taught fundamentals, everything clicked.

    • By slicktux 2025-04-22 18:15 · 4 replies

      I think the whole teaching of the history of computers is a big failure of an attempt to Segway into computer organization and architecture. Nonetheless, I get what is happening. If it’s a pure computer programming class, then the goal may be to have them understand the “basics”…like what is the hard drive vs RAM (memory allocation), or what is a transistor (Boolean logic), and what is a punch card (mnemonics, and the abstraction of those mnemonics into what is now just a computer programming language).

      • By nightpool 2025-04-22 20:08 · 2 replies

        (Unless you're riding a motorized vehicle, the word is segue, not Segway)

        • By girvo 2025-04-23 0:21 · 3 replies

          This is very much a tangent, but I think it's nearly certain that "segway" will end up overtaking "segue" as the predominant spelling for the word that is defined as: "to make a transition without interruption from one activity, topic, scene, or part to another"

          The "mistake" happens so often, partially because "segway" is a much more straightforward spelling if one has only heard the word said aloud, that I think it will eventually become the actual way it is spelled!

          • By cherrybajan 2025-04-23 10:22 · 2 replies

            As a non-native speaker, given the prevalence of the spelling "segway" in corporate settings, this is how I thought the word was spelled, until now, that is!

            • By nightpool 2025-04-23 16:04

              FWIW, I did a quick search of our local slack and found 2x the number of instances of "segue" compared to "segway". And most of the instances of "segway" (around 60%) refer to the actual device, with only a handful of mistakes (around 4). So I'm not sure that this spelling is more common in a corporate environment—maybe do a search for yourself and see!

            • By LargoLasskhyfv 2025-04-26 8:52

              Hrm. I haven't even heard that so far, maybe because I rarely did meetings.

              Looking it up in https://en.wikipedia.org/wiki/Segue they even warn about that!

          • By milesrout 2025-04-23 7:34 · 1 reply

            It will likely end up like many other more phonetic spellings: an indicator of ignorance, but more acceptable in America than elsewhere.

            • By hoseja 2025-04-23 9:33

              English is spelled phonetically. Just not Modern English phonetically, but Middle English phonetically. And then it froze into ideography because of the printing press.

          • By gh0stcat 2025-04-23 0:38 · 7 replies

            This is an absurd take. We should not bend language around ignorance. There is a beauty to effort. Please take a second to explore this for yourself.

            • By abletonlive 2025-04-23 1:00 · 3 replies

              Actually it's not an absurd take at all. The absurd take is that we "should not bend language around ignorance."

              That's precisely how language changes over time. Language is not a strict set of rules. It's based on understanding and consensus, so sometimes things that are "wrong" do end up being accepted.

              I suggest this as a great introduction into what languages are and how they evolve over time https://www.amazon.com/Language-Families-of-World-audiobook/...

              • By zelphirkalt 2025-04-23 4:17 · 4 replies

                I am not a native speaker, but the two words do not sound even remotely the same.

                How does this mistake happen so often? Can you explain people's thought process a bit? Is it just: "Something something 'seg...' ... ah I know, I will simply use another random word that starts with the same 3 letters and doesn't make sense in this phrase!"?

                Also this is the first time I see it.

                • By roryokane 2025-04-23 4:46 · 1 reply

                  > the two words do not sound even remotely the same

                  Pronounced correctly, “segue” sounds just like “Segway” – not like “seg-oo”, as you might have assumed.

                  • By zelphirkalt 2025-04-23 11:04

                    TIL, thanks! You are right that I assumed it would be like "seg-oo".

                • By girvo 2025-04-23 5:57

                  The two words are pronounced identically.

                • By nightpool 2025-04-23 16:09

                  Why would you opine on the way the words are pronounced if you've never seen them before and clearly did not take the time to look them up at all?

                • By tomjakubowski 2025-04-24 4:41

                  Segue is borrowed from Italian, the "ue" is a diphthong like English "way" or Spanish "güey"

              • By milesrout 2025-04-23 7:37 · 2 replies

                Most mistakes remain mistakes, and do not become part of the language. The idea that mistakes generally get accepted as correct is simply untrue, which is what you are implying.

                I am sure people will make the mistake, as they sometimes do today. But it is a mistake, and will likely be recognised as one.

                It is likely that the language will get more cemented by automatic spelling and grammatical correction, including using AI. For example, there are a number of grammatical and spelling changes that have been cemented by American spelling/grammar-checking programmes, i.e. by MS Word.

                • By abletonlive 2025-04-23 7:44 · 1 reply

                  > The idea that mistakes generally get accepted as correct is simply untrue, which is what you are implying.

                  I did not imply that at all. I said sometimes, so it's not that absurd that it could happen. It does happen though, and a quick google search will give you pages of examples.

                  • By shadowgovt 2025-04-23 11:57

                    Precisely. In English, while mistakes usually get corrected back to common or traditional usage, they are also the fuel for almost every change to English that becomes common usage (and I only add the almost qualifier because I can't decide if categorizing things like "cromulent" as a mistake should count; it was an intentionally made up word in a context where the joke was made up words but may have fallen into common usage because people using it because they were in on the joke were dwarfed by people who didn't know it was a joke and absorbed it as a real word).

                    With machines looking over our shoulders now and so much of language being typed instead of handwritten, odds are such drift might actually decrease in English... On the other hand, the introduction of AI leaves an interesting avenue for people to begin acting as if something is common usage and have the AI begin confirming it as common if it consumes that action. And then, of course, there's the effect of the machine itself... Most of us have a way to type "résumé", but we don't bother because the machine makes it too much work to do so, so the alternate spelling without the accent, which was called out in my high school days as wrong, has fallen into common usage in a generation of people having to submit their resumes online (example: https://www.linkedin.com/help/linkedin/answer/a510363).

              • By juped 2025-04-23 5:46 · 1 reply

                [flagged]

                • By abletonlive 2025-04-23 7:47

                  Was that the point? Don't forget that you're on hackernews, not reddit. Strawmans are less accepted in this community. You are neither, individually, the consensus that was described, nor did anybody in this thread imply that "all errors of usage are correct" and accepted. Your sarcasm is unwarranted and provides little value to this conversation.

            • By munificent 2025-04-23 3:42 · 1 reply

              Write that in Old English orthography and you'll make a more consistent argument.

              • By chias 2025-04-23 4:20

                Þis is án ungeswutellic andswaru. Wé ne sculon bǣgan spræce ymbe ungewitt. Þǣr is fægernes tō earfeþe. Bidd þé niman án ōþer tid tō smeagan þis sylf.

            • By andsoitis 2025-04-23 4:05 · 1 reply

              Tbh, I’m more critical of commonly confused words in English like affect and effect, or discrete and discreet.

              I'm more forgiving of mixing up homophones, even if one of them is a registered trademark (Segway).

              • By irishsultan 2025-04-23 7:46

                Aren't discrete and discreet homophones?

            • By girvo 2025-04-23 5:58

              I passed no judgement for or against, merely discussed that it was likely to happen.

              I suggest you yourself take a second and explore why you think being smarmy on the internet is a way of getting people to agree with you.

            • By Zambyte 2025-04-23 10:13

              thyself*

            • By gg-plz 2025-04-23 2:11

              [dead]

      • By 0_gravitas 2025-04-22 18:45 · 1 reply

        Personally, I struggled a lot in my earlier CS/Informatics education, partly because I never felt like I understood what was actually happening or how we got here; everything was just factoids in a void. When I took a gap semester between my A.S. and B.S., I finally studied/explored a bit of the history, and it put a lot in perspective.

        • By barrenko 2025-04-23 14:20

          Well, you need to come up to something like analysis to appreciate something that's seemingly simple like the number line, and that's a lot of math if done only in spare time.

      • By SoftTalker 2025-04-22 18:43

        > have them understand the “basics”…like what is the hard drive vs RAM (memory allocation) or what is a transistor (Boolean logic)

        You must understand these things at least conceptually if you want to really understand how to write efficient programs. Maybe not at the level of how memory can electronically "remember" a 1 or a zero, or how a hard drive can magnetically do it, but at least the relative speeds e.g. register vs. cache vs. RAM vs. disk.

    • By tlb 2025-04-23 8:58 · 3 replies

      What a horrendous crime, to turn a fascinating subject into a boring curriculum to be forced on teachers and children.

      I've received great intellectual satisfaction from various well-taught subjects. I would rather chop off a finger than lose them. So curriculum committees that make subjects boring are doing something worse than chopping off millions of children's fingers.

      • By Akronymus 2025-04-23 9:23

        With any kind of history especially, it's just rote memorization of facts and not the connections between those facts. I hated history in school because of that, but now I actually find it interesting to learn that x happened because of y, which also led to z, and so on. The same goes for rote memorization of technical facts, like how many wires a PATA cable has. Or why must kids memorize how an ethernet frame is built up? Sure, go over it in class and show it as a lesson in how binary protocols are defined. But either you forget it anyway because it's not relevant to your job, or you can look it up and memorize it over time as you use it often enough.

        I really wish that teaching of history will get better for current and future kids.

      • By lloeki 2025-04-23 10:33

        > Children ask: 'why?' So we put them in school, which cures them of this instinct and conquers curiosity through boredom.

        - Paul Valéry

      • By brainzap 2025-04-23 16:05

        it's a job, not a mission

    • By liquidpele 2025-04-22 21:43 · 1 reply

      This is why most good teachers don’t use the books but find creative ways to still meet the standards. More work though, so fewer do it now with pay being so shit.

      • By kleiba 2025-04-23 13:11

        For what it's worth, the pay in my case was quite good, and there weren't any books, so that wasn't the issue.

    • By alnwlsn 2025-04-22 20:37

      I've loved the history of computers since I was young, although if I was forced to learn about it in school I know it would suck.

    • By hfgjbcgjbvg 2025-04-22 17:40 · 3 replies

      Imagine if they taught the history of English to kids before they could read

      • By moffkalast 2025-04-22 18:11 · 1 reply

        Since most people throughout history couldn't read, I guess it would be relatable?

        • By cafard 2025-04-23 9:58

          How many of those who couldn't read knew the history of their or other languages?

      • By milesrout 2025-04-23 7:47 · 1 reply

        The history of English is taught in English classes. Historical context is important and interesting. You don't really understand a subject without knowing a bit of its history.

        My favourite classes were those where we didn't just get taught facts and theorems but we also got taught a bit about who proved the theorem for the first time, who discovered this fact, what this algorithm was first used for, etc. So much easier to remember too.

        This is one of the best things about studying law: the very nature of it makes it impossible to teach it without the historical context.

        • By Zambyte 2025-04-23 10:17

          The key part to me is the "before they could read". I think the history of computing is probably far more interesting when you have more context as to where that history got us.

      • By internet_rand0 2025-04-22 18:41 · 1 reply

        they might just remember it all once they're adults!

        imagine that!? an historically informed populace???

        you'd need more expensive lies and higher quality fakes... the government would be costlier to run.

        ideally, in the long term this would make the national currency's value in the international money market rise up. but why wait for that when one can directly manipulate money through trade fraud and covert military ploys?

        • By RogerL 2025-04-22 18:59 · 1 reply

          That's not the point, the point is the ordering is inverted, not that history shouldn't be learned.

    • By cainxinth 2025-04-24 13:07

      > 13 year old kids don't care about the history of computers

      Speaking for myself, and I’m sure many others on hn, I was very interested in the history of computers at 13!

    • By PicassoCTs 2025-04-22 18:08 · 3 replies

      Those curriculums, developed by soul-dead committees in consensus on the minimum knowledge you've got to have, are a blight on western civilization. Instead of giving students the ability to discover a topic, or to build something they are interested in themselves, and then giving them an understanding of and fascination with the discoverers who have gone before them, they kill the subject.

      I must confess, it gives my dry old heart some joy to see the anti-education masses coming from this, voting and storming the fortresses that produced the paywall around education – one that only money for tutors or accidental intrinsic motivation could overcome – and burning and salting those outposts of classist academia.

      • By mlinhares 2025-04-22 18:26 · 2 replies

        Yes, definitely, destroying education as we know it without any plans for what the next thing is will definitely work.

        Developed countries really need a come to Jesus moment, because the disdain for everything that made them great places is unbelievable. People will understand, after great suffering, that destroying stuff is much easier than building it.

        • By AnthonyMouse 2025-04-23 5:04

          > People will understand, after great suffering, that destroying stuff is much easier than building it.

          "It is easier to destroy than to create" doesn't tell you when something should be torn down.

          You can have a house that provided shelter for your family for generations, but if it's water damaged, the floors are rotting and it's full of toxic mold, the person who shows up with a bulldozer isn't necessarily wrong.

        • By immibis 2025-04-22 18:39 · 1 reply

          We're in the destroying phase right now. Unless you live in China - I hear they're mostly doing well. Or middle of nowhere Africa, where there's nothing to destroy because there's nothing there.

          But systems can rot from within too, or just decay naturally, and don't need to be destroyed. What if the core ideas that built our current civilization were ideas of the past, that we don't have any more, and we don't know what to do when The Machine Stops? Doesn't have to be a literal machine - it's a good metaphor for how democracy fell apart.

          • By PicassoCTs 2025-04-23 17:30 · 1 reply

            They do as well as the USSR did. Meaning "excellent" till the last day!

            • By immibis 2025-04-23 20:27

              I can't parse this comment, but yes, in some respects, we're in a similar stage now to the USSR's final stage.

      • By fads_go 2025-04-22 19:19

        Forgetting that it was the anti-education forces that created the curriculums. The war on public education goes back a long time; teachers lost the freedom to teach decades ago. and it has been the same forces behind it all along.

      • By tqi 2025-04-22 21:19 · 2 replies

        Ok... what would you do differently? Keep in mind you have to educate millions of students across an enormous spectrum of abilities, socioeconomic backgrounds, and interests.

        • By PicassoCTs 2025-04-23 17:33

          I would build an "intrinsic motivation"-first curriculum, where knowledge is handed out as power tools for an already existing passion, and the self-taught "expansion" of knowledge is the most important gift to be made.

          If the child is fascinated by video games, I would help them make video games, the curriculum be damned. All knowledge holes can be filled later, but the passion for wanting to know can never be restored unless the want for knowledge remains intact.

        • By milesrout 2025-04-23 7:50

          No you don't. There is a narrow range of abilities at each level if students are properly held back when they haven't mastered the material.

          Their interests are built by what they are taught. "Socioeconomic background" is a tautology. Their backgrounds are irrelevant.

    • By 999900000999 2025-04-23 3:24 · 1 reply

      I basically found this in college too; I quickly gave up on computer science as a major. I'd rather just go out and learn how to build what I want to build versus hearing a 3-hour lecture about how the JVM works.

      The answer is it's magic and no one cares, now let's go build some games

      • By bruce511 2025-04-23 5:55 · 3 replies

        Firstly, and this is worth pointing out, "computer science" is not about programming. It's about science, in this case specifically the science that makes computers work.

        At school I thought "computer science" meant "programming" - which it doesn't. So well done for recognizing this before wasting too much of your time. (Seriously, not sarcastic.) Programming can easily be learned outside college.

        To other general readers here though I'll say that understanding the science can be really helpful over a career. It's not terribly applicable in getting that first job, but as you progress more and more of those theoretical fundamentals come into play.

        Ultimately there are a small fraction of people who need to understand how it all works, all the way down, because those people build the things that programmers use to build everything else.

        • By 999900000999 2025-04-23 15:17 · 1 reply

          It depends on where you took computer science. I took a few foundational classes at community college.

          It very much felt like a Wikipedia article on the history of computers somehow stretched out over an entire summer.

          I have my own issues with the way college is generally set up. Do students really need a massive amusement park when self-study along with 3 or 4 exams would provide the same value? Will spending 70k per year in total cost of attendance at said amusement park serve them?

          I don't really like boot camps either, personally I'd like companies to be more open to actually training people again. I doubt it'll happen though.

          • By bruce511 2025-04-23 18:49 · 1 reply

            >> It depends on where you took computer science.

            Well, yeah. That's true for any field of study. Every college has strengths and weaknesses - it's the opposite of a franchise.

            >> I took a few foundational classes at community college.

            A few foundational classes is somewhat different to classes you take in prep for a major. I did a foundational class in astronomy, designed for students who were just looking for an introduction. It was very different to my comp Sci classes in tone and style.

            Yes there was some math involved, but not much in the comp science classes. Math was a pre-requisite though so we got our math in, well, math.

            • By 999900000999 2025-04-23 21:41 · 1 reply

              This is one of the only skills you can learn for practically nothing. A cheap laptop is all you need. I taught myself enough to get a middle-class job with nothing but free time and $3 iced coffees.

              I just don’t like the idea of gate keeping it behind an expensive degree. The source code for most popular frameworks and tools is free for anyone to read.

              It’s not like medicine or something where you need to drop 300k on education.

              • By bruce511 2025-04-24 5:51

                No, it's certainly not like medicine or law. And you can certainly acquire skills on your own.

                Of course, in this field, learning is continuous. You're not going to use just one language (much less one framework) over a decades-long career. It's also likely that your domain will change, your focus area and so on.

                A good college course doesn't prepare you for programming in one language, but all of them. (In the sense that once you understand the theory of programming, language is just syntax.)

                You get exposure to different types of languages (imperative, functional etc).

                I think for me the critical takeaways though were research, critical thinking and communication. The "skills" are easy to learn yourself, but the formality in which you place that learning is harder to do yourself.

                Which is not to say a degree is a requirement- it's clearly not. But it's helpful because it builds a strong foundation on which the self-learning can rest.

        • By dekhn 2025-04-23 15:57

          I think CS is math, not science. Computer engineering is science (using lots of math).

        • By milesrout 2025-04-23 7:44 · 1 reply

          This is a myth. Computer science absolutely is about programming. The science that makes computers work is called physics.

          There are theoretical parts of computer science, but it is fundamentally a practical subject. All of it is in service to programming. Type systems are about typing programs. Algorithms are implemented using programs. Data structures are for use in programs.

          The very worst computer science lecturers are those that forget it is a practical subject and try to teach it like abstract mathematics, because they believe (whether they realise they believe it or not) that it is more prestigious to teach abstract concepts than practical concrete things.

          It is the same in mathematics, where unfortunately there has developed a tradition since Bourbaki of trying to teach abstract notions as fundamental while concrete problem solving is left to the engineers. The result is that many engineers are much stronger mathematicians than many mathematically-trained students, and those students have to relearn the practical foundations of the subject before they can make progress at the graduate level. If they don't, they get stuck doing what looks like maths, but is actually just abstract roleplaying.

          • By shadowgovt 2025-04-23 11:47 · 1 reply

            This might be just a semantic argument, but if you mean "programming" as in "configuring a machine to implement one or more algorithms" (which I would assert most people do when they use the term), computer science is emphatically not about programming, although programming is taught for much the same reason that artists learn how to use a pencil. Computing, as a discipline, predates the machine (although the machine justified the existence of a whole discipline for studying it because the force multiplier it represented made it worthwhile to dive deeply on the subject of algorithm development and execution, the nature of algorithms, the nature of computability, formal logics, etc... Before the machine, it was just a subset of mathematics).

            This was a point repeatedly driven home in my undergraduate curriculum, and in fact, they made a point of having multiple classes where a computer was completely uninvolved.

            • By bruce511 2025-04-23 18:56

              Yeah, I'm more in this camp too. We did a lot of practical modules, things like OS development, databases and so on. So yeah, learning programming was the first couple months, then programming becomes the tool to express progress in knowledge depth.

              It's probably fair to say that although we learned some history, we had the privilege of learning at a time the field was exploding. That history you learned, I lived and worked through that. It's somewhat surreal to realize that my career is your history class.

              As mentioned above though, it'll vary a lot from one school to another.

  • By intrasight 2025-04-22 17:26 · 6 replies

    My fork in the road between hard tech/hard science and biology came in high school. It seemed that students who wanted to become doctors took AP biology and students who wanted to be engineers took physics and chemistry. I had wanted to be an engineer since I was 12 years old, so I felt the decision was already made. But while studying neural networks in college in the 80s, I realized that there was this tremendously rich domain of real neurons which I knew nothing about. I worked as a software engineer for a couple of years after graduating but then went back to school to study neurophysiology. I did not pursue it as my area of work or research, but I am grateful for having had the opportunity to look at the world from the perspective of a biologist.

    If you're an engineer and early in your career and feel there's something missing from your intellectual space, I encourage you to go back and get a graduate degree in something totally different. Humans live a very long time so don't feel like you're wasting time.

    • By keithwhor 2025-04-22 19:36 · 1 reply

      I've been programming since I was eight, but truly fell in love with biology in 12th grade chemistry: the first introduction to organic chemistry and biochemistry. It was the first time I truly started grokking the application of systems-level thinking to the biological world; how do trees "know" to turn red in the autumn? How do fetuses assemble themselves from two cells?

      I decided to pursue a double major in biochemistry and evolutionary biology and it was one of the best decisions I've made in my life. The perspective you gain from understanding all life in terms of both networks and population dynamics of atoms, molecules, cells, tissue, organisms and populations -- and how every layer reflects the layer both underneath and above it in a fractal pattern -- is mind-expanding in a way I think you just don't and can't get designing software systems alone.

      I work as a software engineer / founder now, but always reflect wistfully on my time as a biologist. I hope to get back to it some day in some way, and think what the Arc Institute team is doing is inspirational [0].

      [0] https://arcinstitute.org/

      • By mncharity 2025-04-23 1:50

        Has anyone seen content that used this multiscale networking and population dynamics as an instructional approach?

        For small example, there was a Princeton(?) coffee-table book which used "everyday" examples to illustrate cell/embryonic organizational techniques - like birds equally spacing themselves along a wire. Or compartmentalization, as a cross-cutting theme from molecules to ecosystems.

        I've an odd hobby interest in exploring what science education content might look like, if incentives were vastly different, and massive collaborative domain expertise was allocated to crafting insightful powerful rough-quantitative richly-interwoven tapestry.

    • By TinyRick 2025-04-22 18:23 · 6 replies

      I would love to do something like this but simply cannot afford it. I think it is good advice, but going back to school for a degree one does not plan on utilizing is not as feasible today as it was in the 80's, largely due to the sizeable increase in tuition without a corresponding increase in wages.

      • By nosianu 2025-04-22 21:52 · 1 reply

        In this day and age, you can do this for FREE and on the side, whenever you have time!

        There are tons of very well-done professional level video courses on Youtube.

        There are more organized courses that only ask you for money for the "extras", like some tests and a certificate, but the main parts, texts and videos, are free.

        You could start with a really good teaching professor (Eric Lander, MIT) and his course: https://www.edx.org/learn/biology/massachusetts-institute-of... (the "Audit" track is free, ignore the prices; also ignore the "expires" - this course restarts every few months and has been available in new versions for many years now)

        It's very engaging!

        There are similar courses for everything in the life sciences, there on edX, on Youtube, and many other places.

        I feel the true Internet is soooo underutilized by most people! Forget news sites, opinion blogs, or social media. Knowledge is there for the taking, free. Only the organized stuff, where you end up with a certificate costs money, but they usually still provide the actual content for free.

        • By tsimionescu 2025-04-23 5:38

          Time and energy are also at a premium in the current economy. Good luck learning biochemistry by watching YouTube videos after 8+h of coding and meetings plus commute plus making dinner plus cleaning up.

      • By toast0 2025-04-22 20:17

        Depending on where you live, and what you want to study, you might be able to take a couple courses at the community college in areas of interest without spending a lot of money.

      • By biomcgary 2025-04-22 20:35 · 1 reply

        I was paid to get a PhD in Biology, albeit just enough to live on. Most people in PhD programs are, either through being a TA (teaching assistant) or an RA (research assistant). The real financial cost is the opportunity cost of 5-6 years of your life.

        Whether or not broad support for training scientists holds up during and after the current administration remains to be seen.

        • By sitkack 2025-04-23 1:06

          Please - the cost isn't your life; that is life, and it is great.

      • By dpc050505 2025-04-23 1:05

        My current tuition is under 500 CAD per class. The opportunity cost of not working full time is the real bulk of the cost of studying in places that have a functional government.

      • By intrasight 2025-04-23 11:23

        I'm pretty sure it's still the case that you get paid to be a graduate student in science.

      • By Suppafly 2025-04-22 18:43, 1 reply

        >I would love to do something like this but simply cannot afford it.

        Work for a company that will pay for it.

        • By shortrounddev2 2025-04-22 20:01, 2 replies

          I can't imagine why a company would pay an engineer to get a master's degree in biology

          • By MattGrommes 2025-04-22 20:35, 2 replies

            A lot of companies will pay for at least part of whatever college classes you take, without auditing whether or not they would be good for your specific job.

            I encourage people to look into it; it's a benefit a lot of people have but don't use, and that's leaving money on the table.

            • By shortrounddev2 2025-04-22 20:52, 2 replies

              Every company I've ever worked for constrained it in many ways:

              1. Master's degree only; they won't pay for anyone to get a bachelor's or associate's.

              2. Must maintain a B average or better.

              3. Cannot take any time off; it has to be entirely on nights and weekends.

              4. Reimbursement after the fact, so you're taking on the financial risk up front.

              • By ponector 2025-04-23 9:14

                I had a job with an education budget listed as a benefit.

                However, there are constraints on using it:

                1. The topic has to be related to technologies used by the company - you can't get a Google Cloud certification when they use AWS.

                2. You need approval from your line manager, HR, and the director of the office.

                3. If it is more than €250, you have to sign a one-year loyalty agreement, meaning you pay some of it back if you quit.

                With all those strings attached, it is just marketing bullshit to attract new hires.

              • By zelphirkalt 2025-04-23 4:07

                Plus, the employer usually wants it to be related to one's job, from their very limited perspective of the world and management decisions. For example, I couldn't even take a language course as educational leave, since the employer did not make any use of my language skills.

            • By dominicq 2025-04-22 21:04, 3 replies

              Can you say more? What kind of company would do such a thing? Maybe I live in a bubble, but that's so far outside of what I've seen that it just sounds fantastical.

              • By MattGrommes 2025-04-22 21:25

                OK, both of these comments made me doubt my memory, so I just checked: at my current employer, a very large consumer company, the limits of the program are that you get a C or above and that the class is "related" to your job or any job you can get at the company. But I've gotten classes paid for that were only tangentially related to my job with no problem. So I concede that you might not get a biology degree as an engineer, but my particular company does a lot of different things, so my guess is that in practice you'd have no problems. I also worked at a now-defunct mid-size startup and a hospital system with similarly loose requirements, but I don't have access to their docs anymore.

              • By Suppafly 2025-04-23 4:01

                My company uses guildeducation.com and we can use basically $5k a year (I think; it might be per semester). A lot of it is just individual classes, but there are also some degree programs. I don't know if they preselect which courses are available to us or if we have access to the whole catalog. I suspect it's somewhat curated, because we are a medical company and most of it is medical stuff. There is a CS bachelor's program, but last I checked there wasn't an MS CS program.

              • By fouc 2025-04-23 5:07

                I would assume most companies with 100+ office workers (essentially, big enough for an HR department) offer something like this in Western countries.

          • By Suppafly 2025-04-23 3:56

            Try something in the medical field; my company will pay for a bunch of medical-related stuff, even though I just want to further my CS background.

    • By bsder 2025-04-22 21:45

      The breakpoint was molecular biology around 1986, with the introduction of PCR (polymerase chain reaction). Once that happened, biology went from being alchemy to being science.

      I loathed biology as taught prior to that. Once I got a molecular biology course, I thought biology was amazing and wondered "Why the hell did we teach all that other crap?"

      Well, that was because the tools we had for biology sucked prior to PCR. My problem was that I recognized that even as a child.

    • By SoftTalker 2025-04-22 18:51

      Same. Biology was an elective in high school and I never took it. I took Earth Science (basically introductory geology) and then went into the Chemistry/Physics track (two years of each). Never felt I missed it; the last time I had any real biology education was a unit in 8th grade science, and I didn't care for it then.

    • By gh0stcat 2025-04-23 0:42, 2 replies

      I would love to do this; I just cannot afford it, as others have already stated. It's depressing to feel like I spend so much of my life at my day job, yet need it to afford the tiny portion of life I have left over. I wish things were different.

      • By intrasight 2025-04-23 11:22

        Much, much easier to do when you're young. I was just married, so no kids yet. We moved to Toronto so I could attend UT, and we treated our stay as an extended honeymoon.

      • By sitkack 2025-04-23 1:05

        Jobs are a prison. If we had a slice of those efficiency gains, you would have ample time for all the things.

    • By AnnikaL 2025-04-22 19:23, 1 reply

      I'm not sure biology isn't a "hard science"?

      • By intrasight 2025-04-23 11:20

        I know. I questioned that word choice, but it's sort of a play on words - as most of the biological things that I ended up doing are soft and squishy :)

  • By dekhn 2025-04-22 18:07, 2 replies

    I invested a great deal of effort over 30+ years to learn biology, which I started to love in high school when a teacher introduced us to molecular biology. Over time I've come to appreciate that biology is a huge field and people who master one area often know little to nothing about many others.

    To be proficient in biology you need to have "extra" skills: extra ability to work with ambiguity, to memorize enormous amounts of descriptive information, and to work with highly abstract representations. Digital biology often loses many aspects of biological reality, and then fails to make useful predictions.

    Over the years, I've come to realize I know less and less about biology - that I greatly underestimated the complexity and subtlety of biological processes, and have come to admit that my own intelligence is too limited to work on some problems that I originally thought would be "easy engineering problems".

    A great example of the rabbit hole that is modern biology is summed up here: what is the nature of junk DNA? To what extent are digital readouts like ENCODE representative of true biology, rather than just measuring noise? What is the nature of gene and protein evolution?

    https://www.cell.com/current-biology/fulltext/S0960-9822(12)... (note that while I disagree strongly with Eddy in many ways, I've come to recognize that I simply don't understand the modern view of evolution outside the perspective of molecular biology, i.e., what geneticists like Eddy think).

    Also, recently, Demis Hassabis postulated that if he is successful, we will come up with silver-bullet cures in 10 years' time simply using machine learning. It's amazing how many computer scientists (I call him that rather than a biologist, although he has worked in neuro) reach this conclusion.

    • By Wojtkie 2025-04-22 22:28

      I've got a background in neuroscience and transitioned to data science a few years ago. Your comment about the rabbit hole of modern biology is spot on. I've been hearing for 10+ years about how ML like computer vision will revolutionize medical diagnosis and treatment. It hasn't happened yet and I think that enthusiasm comes from the fact that we built computer systems from the ground up and therefore know them deeply, whereas biological systems aren't fully understood.

    • By baq 2025-04-22 18:20, 1 reply

      Why would biology be so hard? It’s only a billion years of evolution, after all. We’re dealing with billions of things all the time. /s

      • By dekhn 2025-04-22 18:30

        Appreciate the sarcasm, but... it's really 3 billion years of evolution, with astronomical numbers of actual entities living and dying in a dynamic world environment. Chemical reactions happening in nanoseconds. Polymers have extraordinarily complex behavior!
