Bill Atkinson has died

2025-06-07 16:19 | daringfireball.net

Link to: https://www.facebook.com/story.php?story_fbid=10238073579963378&id=1378467145&_rdr

From his family, on Atkinson’s Facebook page:

We regret to write that our beloved husband, father, and stepfather Bill Atkinson passed away on the night of Thursday, June 5th, 2025, due to pancreatic cancer. He was at home in Portola Valley in his bed, surrounded by family. We will miss him greatly, and he will be missed by many of you, too. He was a remarkable person, and the world will be forever different because he lived in it. He was fascinated by consciousness, and as he has passed on to a different level of consciousness, we wish him a journey as meaningful as the one it has been to have him in our lives. He is survived by his wife, two daughters, stepson, stepdaughter, two brothers, four sisters, and dog, Poppy.

One of the great heroes in not just Apple history, but computer history. If you want to cheer yourself up, go to Andy Hertzfeld’s Folklore.org site and (re-)read all the entries about Atkinson. Here’s just one, with Steve Jobs inspiring Atkinson to invent the roundrect. Here’s another (surely near and dear to my friend Brent Simmons’s heart) with this kicker of a closing line: “I’m not sure how the managers reacted to that, but I do know that after a couple more weeks, they stopped asking Bill to fill out the form, and he gladly complied.”

Some of his code and algorithms are among the most efficient and elegant ever devised. The original Macintosh team was chock full of geniuses, but Atkinson might have been the most essential to making the impossible possible under the extraordinary technical limitations of that hardware. Atkinson’s genius dithering algorithm was my inspiration for the name of Dithering, my podcast with Ben Thompson. I find that effect beautiful and love that it continues to prove useful, like on the Playdate and apps like BitCam.

In addition to his low-level contributions like QuickDraw, Atkinson was also the creator of MacPaint (which to this day stands as the model for bitmap image editors — Photoshop, I would argue, was conceptually derived directly from MacPaint) and HyperCard (“inspired by a mind-expanding LSD journey in 1985”), the influence of which cannot be overstated.

I say this with no hyperbole: Bill Atkinson may well have been the best computer programmer who ever lived. Without question, he’s on the short list. What a man, what a mind, what gifts to the world he left us.

Saturday, 7 June 2025



Comments

  • By matthewn 2025-06-0717:2615 reply

    In an alternate timeline, HyperCard was not allowed to wither and die, but instead continued to mature, embraced the web, and inspired an entire genre of software-creating software. In this timeline, people shape their computing experiences as easily as one might sculpt a piece of clay, creating personal apps that make perfect sense to them and fit like a glove; computing devices actually become (for everyone, not just programmers) the "bicycle for the mind" that Steve Jobs spoke of. I think this is the timeline that Atkinson envisioned, and I wish I lived in it. We've lost a true visionary. Memory eternal!

    • By asveikau 2025-06-0717:483 reply

      Maybe there's some sense of longing for a similar tool today, but there's no way of knowing how much of the impact you're talking about HyperCard really did have. For example, many of us reading here experienced HyperCard. It planted seeds in our future endeavors.

      I remember in elementary school, I had some computer lab classes where the whole class worked in HyperCard on some task. Multiply that by however many classrooms did something like that in the '80s and '90s. That's a lot of brains that could be influenced, and were.

      We can judge it as a success in its own right, even if it never entered the next paradigm or never had quite an equivalent later on.

      • By lambdaone 2025-06-0718:571 reply

        HyperCard was undoubtedly the inspiration for Visual Basic, which for quite some time dominated the bespoke UI industry in the same way web frameworks do today.

        • By Stratoscope 2025-06-0721:481 reply

          HyperCard was great, but it wasn't the inspiration for Visual Basic.

          I was on the team that built Ruby (no relation to the programming language), which became the "Visual" side of Visual Basic.

          Alan Cooper did the initial design of the product, via a prototype he called Tripod.

          Alan had an unusual design philosophy at the time. He preferred to not look at any existing products that may have similar goals, so he could "design in a vacuum" from first principles.

          I will ask him about it, but I'm almost certain that he never looked at HyperCard.

          • By canucker2016 2025-06-0810:04

            A blog post about Tripod/Ruby/VB history - https://retool.com/visual-basic

              Cooper's solution to this problem didn't click until late 1987, when a friend at Microsoft brought him along on a sales call with an IT manager at Bank of America. The manager explained that he needed Windows to be usable by all of the bank's employees: highly technical systems administrators, semi-technical analysts, and even users entirely unfamiliar with computers, like tellers. Cooper recalls the moment of inspiration:
            
              In an instant, I perceived the solution to the shell design problem: it would be a shell construction set—a tool where each user would be able to construct exactly the shell that they needed for their unique mix of applications and training. Instead of me telling the users what the ideal shell was, they could design their own, personalized ideal shell.
            
            Thus was born Tripod, Cooper's shell construction kit.

      • By cortesoft 2025-06-0718:131 reply

        HyperCard was the foundation of my programming career. I treated the HyperCard Bible like an actual Bible.

        • By leakycap 2025-06-0817:291 reply

          I miss the days of For Dummies, Bibles, and all the rest. If you'd read that thing carefully a few times, you usually knew your stuff. There was a finish line.

          Modern continual versioning and constant updates mean there is no finish line. No Bible could ever be printed. Ah, nostalgia.

          • By mark-r 2025-06-0823:32

            They don't make nostalgia like they used to.

      • By jkestner 2025-06-0718:23

        Word. This is the Papert philosophy of constructionism, learning to think by making, that so many of us still carry. I’m still trying to build software-building software. We do live in that timeline; it’s just unevenly distributed.

    • By nostrademons 2025-06-0720:052 reply

      The Web was significantly influenced by HyperCard. Tim Berners-Lee's original prototypes envisioned it as bidirectional, with a hypertext editor shipping alongside the browser. In that sense it does live on, and serves as the basis for much of the modern Internet.

      • By ebcode 2025-06-087:282 reply

        IIRC, the mouse pointer turning into a hand when you mouse over something clickable was original to HyperCard. And I think Brendan Eich was under a heavy influence of HyperTalk when he created JavaScript.

        • By jjcob 2025-06-088:42

          JavaScript felt like it took the best parts of C (concise expressiveness) and the ease of use of HyperTalk (event handlers, easy hierarchical access to objects, etc). It was pretty sweet.

        • By Tabular-Iceberg 2025-06-088:401 reply

          Wasn't the pointer always a hand in HyperCard?

          • By WillAdams 2025-06-0813:191 reply

            Depended on context, and what the stack programmer set it to. Possibilities (per Fig. 51-1 in _The Complete HyperCard Handbook, 2nd edition_) were:

            - watch

            - busy

            - hand

            - arrow

            - iBeam

            - cross

            - plus

      • By snickerbockers 2025-06-087:591 reply

        I honestly don't think the modern web is a legitimate hypertext system at this point. It was already bad enough 20 years ago with Flash and server-side CGI, but now most of the major websites are just serving JavaScript programs that then fetch data using a dedicated API. And then there are all the paywalls and constant CAPTCHA checks to make sure you aren't training an LLM off their content without a license.

        Look up Hyperland, an early-'90s documentary by Douglas Adams and the guy from Doctor Who about the then-future hypermedia revolution. I can remember the web resembling that a long time ago, but the modern web is very far removed from anything remotely resembling hypertext.

        • By WillAdams 2025-06-0813:211 reply

          This is discussed a bit in the book:

          https://www.goodreads.com/book/show/192405005-hypermedia-sys...

          which maybe argues for a return to early ideas of the web as a successor to Hypercard...

          • By snickerbockers 2025-06-109:06

            That looks like my kind of book; I'll definitely be checking it out. Overall I'm still pretty pessimistic about hypertext making a true return: there's too much money in the web as an app-delivery mechanism, we have an entire generation of adults who are younger than Facebook, and now companies are trying to gatekeep the content itself because they want to be able to charge for LLM training on their text (I've noticed a significant increase in how many CAPTCHA challenges I get, and I'm pretty sure it has nothing to do with DDoS).

            But I'm glad to see I'm not the only one who misses the old web, when it really was all about exchanging ideas and information over open protocols. I will never get over this massive sense of nostalgia whenever I remember browsing through weird GeoCities fan sites and seeing people just documenting their love for whatever hobbies they had, back when the platforms weren't always pushing political BS to "drive engagement" by making me angry.

    • By asnyder 2025-06-0813:511 reply

      His legacy still exists and continues today. Even updated to modern sensibilities, cross-platform, and compatible with all your legacy Hypercard stacks!

      As far as I remember, progression was Hypercard -> Metacard -> Runtime Revolution -> Livecode.

      https://livecode.com

      I was a kid when this progression first happened; my older brother Tuviah Snyder (now at Apple) was responsible for many of these updates and changes, first at MetaCard and then at its acquirer Runtime Revolution.

      I even wrote some of my first programs as Hypercard compatible stacks. Was quite fun to see my apps on download.com, back in the day when that meant something :).

      I always joked it required please and thank you due to its verbosity, but was super simple, accessible, and worked!

      How nice, that even today one can take their legacy Hypercard Stacks and run them in the web, mobile, etc. Or create something new in what was more structured vibecoding before vibecoding :).

      • By mort96 2025-06-0813:561 reply

        This seems like something completely different? Livecode looks like just another toolkit or SDK for developing standalone apps, which might be great for the handful of developers using it but certainly doesn't do anything to re-shape how users interact with their computers

        • By asnyder 2025-06-0814:061 reply

          Nope, it's completely the same base. Scroll the homepage, and you'll see an example of Livecode (updated HyperTalk).

          You can open your HyperCard stacks, or MetaCard stacks, or Runtime/Livecode Stacks in their IDE, code, edit, etc, similar to what you would have back in Hypercard days, but with modern features, updates, and additions.

          It's backwards compatible with HyperTalk; its current language is an updated HyperTalk (i.e. an updated MetaTalk) that incorporates all that was there, but adds new features for today.

          Your Livecode apps can be deployed and run as cross-platform desktop applications (Mac, Win, *nix), mobile applications, and, as far as I remember, web applications with HTML5 deployment (so they say).

          Not affiliated with them in any way, just sharing my understanding and memories.

          • By mort96 2025-06-0823:051 reply

            And how exactly does this re-shape the user's (not developer's) relationship with their computer?

            • By pazimzadeh 2025-06-092:411 reply

              it says "GUI coding built in"

              • By mort96 2025-06-0911:47

                Can you elaborate on how that answers my question?

    • By zahlman 2025-06-0719:36

      Mr. Atkinson's passing was sad enough without thinking about this.

      (More seriously: I can still recall using ResEdit to hack a custom FONT resource into a HyperCard stack, then using string manipulation in a text field to create tiled graphics. This performed much better than button icons or any other approach I could find. And then it stopped working in System 7.)

    • By Arathorn 2025-06-0718:075 reply

      It’s ironic that the next graphical programming environment similar to Hypercard was probably Flash - and it obviously died too.

      What actually are the best successors now, at least for authoring generic apps for the open web? (Other than vibe coding things)

      • By jx47 2025-06-0718:131 reply

        I think that would be Decker (https://internet-janitor.itch.io/decker). Not my project but I found it some time ago when I searched for Hypercard successors. The neat thing is that it works in the browser.

        • By WillAdams 2025-06-0719:282 reply

          This gets mentioned pretty much every time HyperCard is --- but I can't see that anyone has done anything with it.

          Why use it rather than Livecode (aside from the licensing of the latter) or Hypernext Studio?

          • By RodgerTheGreat 2025-06-0719:381 reply

            Some programs, games, and zines made with Decker: https://itch.io/games/tag-decker

            Unlike LiveCode (or so far as I am aware HyperNext), Decker is free and open-source: https://github.com/JohnEarnest/Decker

            HyperNext doesn't appear to be actively developed; the most recent updates I see are from last year, and it can't be used on modern computers. Decker's most recent release was yesterday morning.

            I'd be happy to go into more detail if you like.

            • By WillAdams 2025-06-0721:551 reply

              Livecode used to be open source, which made me want to use it, but that window closed.

              I guess I want a Flash replacement....

              • By leakycap 2025-06-0817:33

                https://ruffle.rs/ recently came to my attention when I needed to resuscitate a tool that had been completely built in Macromedia products.

          • By jhbadger 2025-06-0720:09

            There's a fair amount of usage of it on Itch.io, if you are into that indie crowd. I was skeptical of it at first -- the whole 1-bit dithering aesthetic seems a bit too retro-twee, but I find it is the best Hypercard-alike in terms of functionality -- it "just works" as compared to most Hyperclones that seem more like a proof of concept than a functional program.

      • By crucialfelix 2025-06-0812:501 reply

        - Minecraft
        - Roblox
        - LittleBigPlanet
        - Mario Maker

        This is what kids do to be creative.

        Slightly more serious (and therefore less successful):

        - Logo/Turtle Graphics
        - Scratch
        - HyperStudio

        HyperCard was both graphic design and hypertext (links). These two modalities got separated, and I think there are practical reasons for that: HTML/CSS design actually sucks and never became an amateur art form.

        For writing and publishing we got Wiki, Obsidian et al, Blogs (RIP), forums, social media. Not meant to be interactive or programmable, but these fulfill people's needs for publishing.

        • By WillAdams 2025-06-0813:28

          Yeah, that sums things up well --- the problem of course is what happens when one works on a project which blurs boundaries.

          I had to drop into BlockSCAD to rough out an arc algorithm for my current project:

          https://github.com/WillAdams/gcodepreview

          (see the subsubsection "Arcs for toolpaths and DXFs")

          Jupyter Notebooks come close to allowing a seamless blending of text and algorithm, but they are sorely lacking on the graphic design and vector graphics front --- which, now that I write it, makes me realize that that is the big thing I miss when trying to use them. Makes me wish for JuMP, a Jupyter Notebook which incorporates METAPOST --- if it also had an interactive drawing mode, it would be perfect.... (for my needs).

      • By RossBencina 2025-06-0723:23

        Pretty sure the next after Hypercard was Macromind (later Macromedia) Director. I recall running an early version of a Director animation on a black and white Mac not long after I started playing with Hypercard. Later I was a Director developer. I recall when Future Splash released -- the fast scaling vector graphics were a new and impressive thing. The web browser plugin helped a lot and it really brought multimedia to the browser. It was only later that Macromedia acquired Future Splash and renamed it Flash.

      • By jonnytran 2025-06-0815:49

        Have you seen Scrappy? It’s still early, but it’s the most interesting thing I’ve seen in a while.

        https://pontus.granstrom.me/scrappy/

      • By DonHopkins 2025-06-0720:071 reply

        Flash completely missed the most important point of HyperCard, which was that end users could put it into edit mode, explore the source code, learn from it, extend it, copy parts of it out, and build their own user interfaces with it.

        It's not just "View Source", but "Edit Source" with a built-in, easy to use, scriptable, graphical, interactive WYSIWYG editor that anyone can use.

        HyperCard did all that and more long before the web existed, was fully scriptable years before JavaScript existed, was extensible with plug-in XCMDs long before COM/OLE/ActiveX or even OpenDoc/CyberDog or Java/HotJava/Applets, and was widely available and embraced by millions of end-users, was used for games, storytelling, art, business, personal productivity, app development, education, publishing, porn, and so much more, way before merely static web page WYSIWYG editors (let alone live interactive scriptable extensible web application editors) ever existed.

        LiveCard (HyperCard as a live HTTP web app server back-end via WebStar/MacHTTP) was probably the first tool that made it possible to create live web pages with graphics and forms with an interactive WYSIWYG editor that even kids could use to publish live HyperCard apps, databases, and clickable graphics on the web.

        HyperCard deeply inspired HyperLook for NeWS, which was scripted, drawn, and modeled with PostScript, that I used to port SimCity to Unix:

        Alan Kay on “Should web browsers have stuck to being document viewers?” and a discussion of Smalltalk, HyperCard, NeWS, and HyperLook

        https://donhopkins.medium.com/alan-kay-on-should-web-browser...

        >"Apple’s Hypercard was a terrific and highly successful end-user authoring system whose media was scripted, WYSIWYG, and “symmetric” (in the sense that the “reader” could turn around and “author” in the same high-level terms and forms). It should be the start of — and the guide for — the “User Experience” of encountering and dealing with web content.

        >"The underlying system for a browser should not be that of an “app” but of an Operating System whose job would be to protectively and safely run encapsulated systems (i.e. “real objects”) gotten from the web. It should be the way that web content could be open-ended, and not tied to functional subsets in the browser." -Alan Kay

        >[...] This work is so good — for any time — and especially for its time — that I don’t want to sully it with any criticisms in the same reply that contains this praise.

        >I will confess to not knowing about most of this work until your comments here — and this lack of knowledge was a minus in a number of ways wrt some of the work that we did at Viewpoints since ca 2000.

        >(Separate reply) My only real regret about this terrific work is that your group missed the significance for personal computing of the design of Hypertalk in Hypercard.

        >It’s not even that Hypertalk is the very best possible way to solve the problems and goals it took on — hard to say one way or another — but I think it is the best example ever actually done and given to millions of end users. And by quite a distance.

        >Dan Winkler and Bill Atkinson violated a lot of important principles of “good programming language design”, but they achieved the first overall system in which end-users “could see their own faces”, and could do many projects, and learn as they went.

        >For many reasons, a second pass at the end-user programming problem — that takes advantage of what was learned from Hypercard and Hypertalk — has never been done (AFAIK). The Etoys system in Squeak Smalltalk in the early 2000s was very successful, but the design was purposely limited to 8–11 year olds (in part because of constraints from working at Disney).

        >It’s interesting to contemplate that the follow on system might not have a close resemblance to Hypertalk — perhaps only a vague one ….

        SimCity, Cellular Automata, and Happy Tool for HyperLook (nee HyperNeWS (nee GoodNeWS))

        https://donhopkins.medium.com/hyperlook-nee-hypernews-nee-go...

        >HyperLook was like HyperCard for NeWS, with PostScript graphics and scripting plus networking. Here are three unique and wacky examples that plug together to show what HyperNeWS was all about, and where we could go in the future!

        >The Axis of Eval: Code, Graphics, and Data

        >Hi Alan! Outside of Sun, at the Turing Institute in Glasgow, Arthur van Hoff developed a NeWS based reimagination of HyperCard in PostScript, first called GoodNeWS, then HyperNeWS, and finally HyperLook. It used PostScript for code, graphics, and data (the axis of eval). [...]

        >What’s the Big Deal About HyperCard?

        >"I thought HyperCard was quite brilliant in the end-user problems it solved. (It would have been wonderfully better with a deep dynamic language underneath, but I think part of the success of the design is that they didn’t have all the degrees of freedom to worry about, and were just able to concentrate on their end-user’s direct needs.

        >"HyperCard is an especially good example of a system that was “finished and smoothed and documented” beautifully. It deserved to be successful. And Apple blew it by not making the design framework the basis of a web browser (as old PARC hands advised in the early 90s …)" -Alan Kay

        HyperLook SimCity Demo Transcript

        https://donhopkins.medium.com/hyperlook-simcity-demo-transcr...

        >[...] All this is written in PostScript, all the graphics. The SimCity engine is in C, but all the user interface and the graphics are in PostScript.

        >The neat thing about doing something like this in HyperLook is that HyperLook is kind of like HyperCard, in that all of the user interface is editable. So these windows we’re looking at here are like stacks, that we can edit.

        >Now I’ll flip this into edit mode, while the program’s running. That’s a unique thing.

        >Now I’m in edit mode, and this reset button here is just a user interface component that I can move around, and I can hit the “Props” key, and get a property sheet on it.

        >I’ll show you what it really is. See, every one of these HyperLook objects has a property sheet, and you can define its graphics. I’ll zoom in here. We have this nice PostScript graphics editor, and we could turn it upside down, or sideways, or, you know, like that. Or scale it. I’ll just undo, that’s pretty useful.

        https://news.ycombinator.com/item?id=34134403

        DonHopkins on Dec 26, 2022 | parent | context | favorite | on: The Psychedelic Inspiration for Hypercard (2018)

        Speaking about HyperCard, creating web pages, and publishing live interactive HyperCard stacks on the web, I wrote this about LiveCard:

        https://news.ycombinator.com/item?id=22283045

        DonHopkins on Feb 9, 2020 | parent | context | favorite | on: HyperCard: What Could Have Been (2002)

        Check out this mind-blowing thing called "LiveCard" that somebody made by combining HyperCard with MacHTTP/WebStar (a Mac web server by Chuck Shotton that supported integration with other apps via Apple Events)! It was like implementing interactive graphical CGI scripts with HyperCard, without even programming (but also allowing you to script them in HyperTalk, and publish live HyperCard databases and graphics)! Normal HyperCard stacks would even work without modification. It was far ahead of its time, and inspired me to integrate WebStar with ScriptX to generate static and dynamic HTML web sites and services!

        https://news.ycombinator.com/item?id=16226209

        MacHTTP / WebStar from StarNine by Chuck Shotton, and LiveCard HyperCard stack publisher:

        CGI and AppleScript:

        http://www.drdobbs.com/web-development/cgi-and-applescript/1...

        >Cal discusses the Macintosh as an Internet platform, then describes how you can use the AppleScript language for writing CGI applications that run on Macintosh servers.

        https://news.ycombinator.com/item?id=7865263

        MacHTTP / WebStar from StarNine by Chuck Shotton! He was also VP of Engineering at Quarterdeck, another pioneering company.

        https://web.archive.org/web/20110705053055/http://www.astron...

        http://infomotions.com/musings/tricks/manuscript/0800-machtt...

        http://tidbits.com/article/6292

        >It had an AppleScript / OSA API that let you write handlers for responding to web hits in other languages that supported AppleScript.

        I used it to integrate ScriptX with the web:

        http://www.art.net/~hopkins/Don/lang/scriptx/scriptx-www.htm...

        https://medium.com/@donhopkins/1995-apple-world-wide-develop...

        The coolest thing somebody did with WebStar was to integrate it with HyperCard so you could actually publish live INTERACTIVE HyperCard stacks on the web, which you could see as images you could click on to follow links, followed by HTML form elements corresponding to the text fields, radio buttons, checkboxes, drop down menus, scrolling lists, etc. in the HyperCard stack, which you could use in the browser to interact with live HyperCard pages!

        That was the earliest and easiest way that non-programmers and even kids could not just create graphical web pages, but publish live interactive apps on the web!

        Using HyperCard as a CGI application

        https://web.archive.org/web/20060205023024/http://aaa-protei...

        https://web.archive.org/web/20021013161709/http://pfhyper.co...

        http://www.drdobbs.com/web-development/cgi-and-applescript/1...

        https://web.archive.org/web/19990208235151/http://www.royals...

        What was it actually ever used for? Saving kid's lives, for one thing:

        >Livecard has exceeded all expectations and allows me to serve a stack 8 years in the making and previously confined to individual hospitals running Apples. A whole Childrens Hospital and University Department of Child Health should now swing in behind me and this product will become core curriculum for our medical course. Your product will save lives starting early 1997. Well done.

        - Director, Emergency Medicine, Mater Childrens Hospital

        • By rezmason 2025-06-0817:31

          You're right. Flash and its legacy would have been better if it had built in "Edit Source".

          The earliest Flash projects were these artful assemblages of scripts dangling from nested timelines, like an Alexander Calder mobile. They were at times labyrinthine, like they are in many similar tools, but there were ways to mitigate that. Later on, AS3 code was sometimes written like Java, because we wanted to be taken seriously.

          Many Flash community members wanted to share their source, wanted a space where interested people could make changes. We did the best we could, uploading FLA files and zipped project directories. None of it turned out to be especially resilient.

          It's one of the things I admire about Scratch. If you want, you can peek inside, and it's all there, for you to learn from and build off of, with virtually no arbitrary barriers in place.

    • By jjcob 2025-06-0811:273 reply

      We kind of had that for a time with FileMaker and MS Access. People could build pretty amazing stuff with those apps, even without being a programmer.

      I think the reason those apps never became mainstream is that they didn't have a good solution for sharing data. There were some ways you could use them to access database servers, but setting them up was so difficult that they were for all intents and purposes limited to local, single user programs.

      HTML, CSS, PHP and MySQL had a learning curve, but you could easily make multi-user programs with them. That's why the web won.

      • By crucialfelix 2025-06-0812:16

        Yes! I used FileMaker a lot, and built my first journaling system with it. Like a cross between HyperCard and a wiki. It really changed my life and this led to programming.

      • By specialist 2025-06-0812:28

        Yes. They didn't survive the transition from workgroup (shared files on a LAN) to client/server.

      • By skeeter2020 2025-06-0814:05

        the stuff I made in Access (and later Excel) looks a lot like the stuff I generate with AI these days!

    • By garyrob 2025-06-0723:461 reply

      "In an alternate timeline, HyperCard was not allowed to wither and die, but instead continued to mature, embraced the web..."

      In yet another alternate timeline, someone thought to add something like URLs with something like GET, PUT, etc. to HyperCard, and Tim Berners-Lee's invention of the Web browser never happened because Hypercard already did it all.

      • By jandrese 2025-06-080:071 reply

        On one hand this would be simply amazing, on the other hand it would have been a total security nightmare that makes early Javascript look like a TPM Secure Enclave.

        • By duskwuff 2025-06-080:59

          Those who were there will remember:

            on openbackground --merryxmas
              merryxmas "on openbackground --merryxmas"
            end openbackground
          
          (And now I'm curious if this post will trip anyone's antivirus software...)

    • By jostylr 2025-06-0819:31

      There is hyperscript: https://hyperscript.org which claims descent from HyperCard and certainly embraces the web.

      Also, this might happen in a few years if AI improves enough to be trusted by novices to make things. Hard to imagine, but just maybe.

    • By dan-robertson 2025-06-0719:441 reply

      Not sure that sculpting clay is the best analogy. Lots of sculpting is hard, as is turning clay, especially if you want to successfully fire the result. Maybe it is an accurate analogy, but people may understand the difficulty differently.

      • By bombcar 2025-06-0722:36

        Hypercard is more like Lego - you can simply buy completed sets (use other's hypercard programs) - or you can put together things according to instructions - but you can always take them apart and change them, and eventually build your own.

    • By moffkalast 2025-06-088:46

      Looking at the HyperTalk syntax [0], it's interesting how we take left-hand variable assignment as a given, while math typically teaches the exact opposite, since you can't really write the answer before you have the question.

      Makes you wonder whether lambda expressions would be more consistent with the rest if they were reversed.

      [0] https://en.wikipedia.org/wiki/HyperTalk#Fundamental_operatio...

    • By jchrisa 2025-06-0718:02

      I haven't posted it here yet b/c it's not show ready, but we have been building this vision -- I like to think of it as an e-bike for the mind.

      https://vibes.diy/

      We had a lot of fun last night with Vibecode Karaoke, where you code an app at the same time as you sing a song.

    • By mannyv 2025-06-0719:03

      Hypercard must have been a support nightmare.

    • By DonHopkins 2025-06-0720:11

      https://news.ycombinator.com/item?id=22285675

      DonHopkins on Feb 10, 2020 | parent | context | favorite | on: HyperCard: What Could Have Been (2002)

      Do you have the first commercial HyperCard stack ever released: the HyperCard SmutStack? Or SmutStack II, the Carnal Knowledge Navigator, both by Chuck Farnham? SmutStack was the first commercial HyperCard product available at rollout, released two weeks before HyperCard went public at a MacWorld Expo, cost $15, and made a lot of money (according to Chuck). SmutStack 2, the Carnal Knowledge Navigator, had every type of sexual adventure you could imagine in it, including information about gays, lesbians, transgendered, HIV, safer sex, etc. Chuck was also the marketing guy for Mac Playmate, which got him on Geraldo, and sued by Playboy.

      https://www.zdnet.com/article/could-the-ios-app-be-the-21st-...

      >Smut Stack. One of the first commercial stacks available at the launch of HyperCard was Smut Stack, a hilarious collection (if you were in sixth grade) of somewhat naughty images that would make a joke, present a popup image, or a fart sound when the viewer clicked on them. The author was Chuck Farnham of Chuck's Weird World fame.

      >How did he do it? After all, HyperCard was a major secret down at Cupertino, even at that time before the wall of silence went up around Apple.

      >It seems that Farnham was walking around the San Jose flea market in the spring of 1987 and spotted a couple of used Macs for sale. He was told that they were broken. Carting them home, he got them running and discovered several early builds of HyperCard as well as its programming environment. Fooling around with the program, he was able to build the Smut Stack, which sold out at the Boston Macworld Expo, being one of the only commercial stacks available at the show.

      https://archive.org/stream/MacWorld_9008_August_1990/MacWorl...

      Page 69 of https://archive.org/stream/MacWorld_9008_August_1990

      >Farnham's Choice

      >This staunch defender was none other than Chuck Farnham, whom readers of this column will remember as the self-appointed gadfly known for rooting around in Apple’s trash cans. One of Farnham’s myriad enterprises is Digital Deviations, whose products include the infamous SmutStack, the Carnal Knowledge Navigator, and the multiple-disk set Sounds of Susan. The last comes in two versions: a $15 disk of generic sex noises and, for $10 more, a personalized version in which the talented Susan moans and groans using your name. I am not making this up.

      >Farnham is frank about his participation in the Macintosh smut trade. “The problem with porno is generic,” he says, sounding for the briefest moment like Oliver Wendell Holmes. “When you do it, you have to make a commitment ... say you did it and say it’s yours. Most people would not stand up in front of God and country and say, ‘It’s mine.’ I don’t mind being called Mr. Scum Bag.”

      >On the other hand, he admits cheerily, “There’s a huge market for sex stuff.” This despite the lack of true eroticism. “It’s a novelty,” says Farnham. Sort of the software equivalent of those ballpoint pens with the picture of a woman with a disappearing bikini.

      https://archive.org/stream/NewComputerExpress110/NewComputer...

      Page 18 of https://archive.org/stream/NewComputerExpress110

      >“Chuck developed the first commercial stack, the Smutstack, which was released two weeks before HyperCard went public at a MacWorld Expo. He’s embarrassed how much money a silly collection of sounds, cartoons, and scans of naked women brought in. His later version, the Carnal Knowledge Navigator, was also a hit.

      I've begged Chuck to dig around to see if he has an old copy of the floppy lying around and upload it, but so far I don't know of a copy online you can run. Its bold pioneering balance of art and sleaze deserves preservation, and the story behind it is hilarious.

      Edit: OMG I've just found the Geraldo episode with Chuck online, auspiciously titled "Geraldo: Sex in the 90's. From Computer Porn to Fax Foxes", which shows an example of Smut Stack:

      https://visual-icon.com/lionsgate/detail/?id=67563&t=ts

      I love the way Chuck holds his smirk throughout the entire interview. And Geraldo's reply to his comment: "I was a fulfillment house for orders."

      "That sounds sexual in itself! What was a fulfilment house?"

    • By al_borland 2025-06-0721:35

      I actually had an experience like this yesterday. After reading Gruber talk about how Markdown was never meant for notes, I started to rethink things. I wanted plain text, to be future proof, then stumbled across CotEditor as a means to edit. Inside I was able to use the code highlighting and outline config to define my own regex and effectively create my own markup language with just a dash of regex and nothing more. I then jumped over to Shortcuts and dragged and dropped some stuff together to open/create yearly and daily notes (on either my computer or phone), or append to a log with a quick action.

      It is a custom system that didn’t require any code (if you don’t count the very minor bits of regex, just a lot of stuff like… ^\s- .).

      Is it a good system, probably not, but we’ll see where it goes.

    • By kadushka 2025-06-0718:371 reply

      > inspired an entire genre of software-creating software. In this timeline, people shape their computing experiences as easily as one might sculpt a piece of clay, creating personal apps that make perfect sense to them and fit like a glove

      LLMs inspired vibe coding - that’s our timeline.

  • By dkislyuk 2025-06-0717:349 reply

    From Walter Isaacson's _Steve Jobs_:

    > One of Bill Atkinson’s amazing feats (which we are so accustomed to nowadays that we rarely marvel at it) was to allow the windows on a screen to overlap so that the “top” one clipped into the ones “below” it. Atkinson made it possible to move these windows around, just like shuffling papers on a desk, with those below becoming visible or hidden as you moved the top ones. Of course, on a computer screen there are no layers of pixels underneath the pixels that you see, so there are no windows actually lurking underneath the ones that appear to be on top. To create the illusion of overlapping windows requires complex coding that involves what are called “regions.” Atkinson pushed himself to make this trick work because he thought he had seen this capability during his visit to Xerox PARC. In fact the folks at PARC had never accomplished it, and they later told him they were amazed that he had done so. “I got a feeling for the empowering aspect of naïveté”, Atkinson said. “Because I didn’t know it couldn’t be done, I was enabled to do it.” He was working so hard that one morning, in a daze, he drove his Corvette into a parked truck and nearly killed himself. Jobs immediately drove to the hospital to see him. “We were pretty worried about you”, he said when Atkinson regained consciousness. Atkinson gave him a pained smile and replied, “Don’t worry, I still remember regions.”

    • By JKCalhoun 2025-06-0719:112 reply

      With overlapping rectangular windows (slightly simpler case than ones with rounded corners) you can expect visible regions of windows that are not foremost to be, for example, perhaps "L" shaped, perhaps "T" shaped (if there are many windows and they overlap left and right edges). Bill's region structure was, as I understand it, more or less a RLE (run-length encoded) representation of the visible rows of a window's bounds. The region for the topmost window (not occluded in any way) would indicate the top row as running from 0 to width-of-window (or right edge of the display if clipped by the display). I believe too there was a shortcut to indicate "oh, and the following rows are identical" so that an un-occluded rectangular window would have a pretty compact region representation.

      Windows partly obscured would have rows that may not begin at 0, may not continue to width-of-window. Window regions could even have holes if a skinnier window was on top and within the width of the larger background window.

      The cleverness, I think, was then to write fast routines to add, subtract, intersect, and union regions, and rectangles of this structure. Never mind quickly traversing them, clipping to them, etc.
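
      To make this concrete, here is a minimal sketch in Python (my own illustration; the names and coordinates are invented, and real QuickDraw also compacted runs of identical rows, which this skips): the visible part of a window is kept as per-row spans, built by subtracting the rectangles of the windows stacked above it.

          def subtract_span(span, hole):
              """Remove the overlap of `hole` from `span`; both are (left, right) pairs."""
              (l, r), (hl, hr) = span, hole
              if hr <= l or hl >= r:          # no overlap at all
                  return [(l, r)]
              pieces = []
              if hl > l:
                  pieces.append((l, hl))      # part left of the hole
              if hr < r:
                  pieces.append((hr, r))      # part right of the hole
              return pieces

          def visible_region(window, windows_above):
              """window and windows_above are rects (left, top, right, bottom).
              Returns {row: [spans]}, an RLE-like list of what is still visible."""
              l, t, r, b = window
              region = {}
              for y in range(t, b):
                  spans = [(l, r)]
                  for al, at, ar, ab in windows_above:
                      if at <= y < ab:        # this covering window touches row y
                          spans = [p for s in spans for p in subtract_span(s, (al, ar))]
                  if spans:
                      region[y] = spans
              return region

          # A 100x30 back window partly hidden by a window whose left edge is at x=40:
          reg = visible_region((0, 0, 100, 30), [(40, 10, 120, 50)])
          print(reg[5], reg[15])              # [(0, 100)] [(0, 40)]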

      • By duskwuff 2025-06-0721:001 reply

        The QuickDraw source code refers to the contents of the Region structure as an "unpacked array of sorted inversion points". It's a little short on details, but you can sort of get a sense of how it works by looking at the implementation of PtInRgn(Point, RegionHandle):

        https://github.com/historicalsource/supermario/blob/9dd3c4be...

        As far as I can tell, it's a bounding box (in typical L/T/R/B format), followed by a sequence of the X/Y coordinates of every "corner" inside the region. It's fairly compact for most region shapes which arise from overlapping rectangular windows, and very fast to perform hit tests on.
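
        A rough Python model of that reading (my reconstruction, not the original code, and the exact boundary semantics are a guess): each inversion point flips the inside/outside state for everything at or below its row and at or to the right of its column, so a pixel is inside the region exactly when an odd number of points lie above and to its left.

            def pt_in_rgn(px, py, bbox, points):
                """Hit test against a region stored as a bounding box plus
                sorted "inversion points" (x, y)."""
                left, top, right, bottom = bbox
                if not (left <= px < right and top <= py < bottom):
                    return False                     # quick reject on the bounding box
                flips = sum(1 for x, y in points if y <= py and x <= px)
                return flips % 2 == 1                # odd number of flips = inside

            # An L-shaped region: columns 0-9 on rows 0-9, but only columns 0-3 on rows 10-19.
            l_pts = [(0, 0), (10, 0), (4, 10), (10, 10), (0, 20), (4, 20)]
            l_box = (0, 0, 10, 20)
            print(pt_in_rgn(6, 5, l_box, l_pts))     # True  (in the wide part)
            print(pt_in_rgn(6, 12, l_box, l_pts))    # False (inside the bbox, past the notch)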

        • By JKCalhoun 2025-06-0723:06

          Thanks for digging deeper.

      • By gblargg 2025-06-087:18

        The key seems to have been recognizing the utility of the region concept and making it fundamental to the QuickDraw API (and the clever representation that made finding the main rectangular portions easy). This insulated QuickDraw from the complexity of windowing system operations. Once you go implementing region operations you probably find that it's fairly efficient to work out the major rectangular regions so you can use normal graphics operations on them, leaving small areas that can just be done inefficiently as a bunch of tiny rectangles. All this work for clipped graphics was applicable to far more than just redrawing obscured window content, so it could justify more engineering time to polishing it. Given how easy they were to use, more things could leverage the optimization (e.g. using them to redraw only the dirty region when a window was uncovered).

    • By rjsw 2025-06-0718:164 reply

      I think the difference between the Apple and Xerox approach may be more complicated than the people at PARC not knowing how to do this. The Alto doesn't have a framebuffer; each window has its own buffer, and the microcode walks the windows to work out what to put on each scanline.

      • By JKCalhoun 2025-06-0718:214 reply

        Not doubting that, but what is the substantive difference here? Does the fact that there is a screen buffer on the Mac facilitate clipping that is otherwise not possible on the Alto?

        • By lambdaone 2025-06-0718:462 reply

          It allows the Mac to use far less RAM to display overlapping windows, and doesn't require any extra hardware. Individual regions are refreshed independently of the rest of the screen, with occlusion, updates, and clipping managed automatically.

          • By saghm 2025-06-0719:271 reply

            Yeah, it seems like the hard part of this problem isn't merely coming up with a solution that technically is correct, but one that also is efficient enough to be actually useful. Throwing specialized or more expensive hardware at something is a valid approach for problems like this, but all else being equal, having a lower hardware requirement is better.

            • By al_borland 2025-06-0721:27

              I was just watching an interview with Andy Hertzfeld earlier today and he said this was the main challenge of the Macintosh project. How to take a $10k system (Lisa) and run it on a $3k system (Macintosh).

              He said they drew a lot of inspiration from Woz on the hardware side. Woz was well known for employing lots of little hacks to make things more efficient, and the Macintosh team had to apply the same approach to software.

          • By atombender 2025-06-0813:03

            So when the OS needs to refresh a portion of the screen (e.g. everything behind a top window that was closed), what happens?

            My guess is it asks each application that overlapped those areas to redraw only those areas (in case the app is able to be smart about redrawing incrementally), and also clips the following redraw so that any draw operations issued by the app can be "culled". If an app isn't smart and just redraws everything, the clipping can still eliminate a lot of the draw calls.

        • By rsync 2025-06-0719:54

          Displaying graphics (of any kind) without a framebuffer is called "racing the beam" and is technically quite difficult and involves managing the real world speed of the electron beam with the cpu clock speed ... as in, if you tax the cpu too much the beam goes by and you missed it ...

          The very characteristic horizontally stretched graphics of the Atari 2600 are due to this - the CPU was actually too slow, in a sense, for the electron beam which means your horizontal graphic elements had a fairly large minimum width - you couldn't change the output fast enough.

          I strongly recommend:

          https://en.wikipedia.org/wiki/Racing_the_Beam

          ... which goes into great detail on this topic and is one of my favorite books.

        • By ehaliewicz2 2025-06-0721:031 reply

          It definitely makes it simpler. You can do a per-screen window sort, rather than per-pixel :).

          Per-pixel sorting while racing the beam is tricky; game consoles usually did it by limiting the number of objects (sprites) per line, and fetching+caching them before the line is reached.

          • By scripturial 2025-06-085:02

            I remember coding games for the C64 with an 8 sprite limit, and having to swap sprites in and out for the top and bottom half of the screen to get more than 8.

      • By peter303 2025-06-0723:101 reply

        Frame buffer memory was still incredibly expensive in 1980. Our lab's 512 x 512 x 8bit table lookup color buffer cost $30,000 in 1980. The Mac's 512 x 342 x 1-bit buffer in 1984 had to fit the Mac's $2500 price. The Xerox Alto was earlier than these two devices and would have cost even more if it had a full frame buffer.

      • By jecel 2025-06-0918:50

        The Alto created the image from a display list, like the Atari 800 or the Amiga. So you could have a wider rectangle on most of the screen for pictures and a narrower rectangle at the bottom for displaying status. It was not up to showing overlapping windows. Nearly all applications just set things to one rectangle, having a frame buffer in practice. This was the case for Smalltalk, which is where Bill saw the overlapping windows. One problem is that filling up the whole screen (606x808) used up half of the memory and slowed down user code, so Smalltalk-72 reduced this to 512x684 to get back some memory and performance.

        The Smalltalk-76 MVC user interface that the Apple people saw only ever updated the topmost window which, by definition, was not clipped by any other window. If you brought some other window to the front it would only then be updated. But since nothing ran in the background it was easy to get the wrong impression that the partially visible windows were being handled.

        Bill's solution had two parts: one was regions, as several other people have explained. It allowed drawing to a background window even while clipping to any overlapping windows that are closer. But the second was PICTs, where applications did not directly draw to their windows but instead created a structure (could be a file) with a list of drawing commands which was then passed to the operating system for the actual drawing. You could do something like "open PICT, fill background with grey pattern, draw white oval, draw black rectangle, close PICT". Now if the window was moved the OS could recalculate all the regions of the new configuration and re-execute all the PICTs to update any newly exposed areas. If the application chose to instead draw its own pixels (a game, for example) then the OS would insert a warning into the app's event queue that it should fix its window contents.

        In parallel with Bill's work (perhaps a little before it) we had Rob Pike's Blit terminal (commercially released in 1982) which added windows to Unix machines. It had the equivalent of regions (less compact, however) but used a per-window buffer so the terminal would have somewhere to copy newly exposed pixels from.
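
        A toy record-and-replay sketch of that PICT idea in Python (invented names, nothing to do with the real PICT opcodes): the application records its drawing commands once, and the system can re-execute them later against whatever part of the window has just been exposed.

            class Picture:
                """Records drawing commands so they can be replayed on demand."""
                def __init__(self):
                    self.ops = []

                def fill_background(self, pattern):
                    self.ops.append(("fill", pattern))

                def draw_oval(self, rect, color):
                    self.ops.append(("oval", rect, color))

                def draw_rect(self, rect, color):
                    self.ops.append(("rect", rect, color))

                def play(self, render, clip):
                    """Re-execute every recorded op; `render` is the real drawing
                    routine and `clip` is the newly exposed part of the window."""
                    for op in self.ops:
                        render(op, clip)

            # "open PICT, fill background with grey, draw white oval, draw black rectangle"
            pict = Picture()
            pict.fill_background("grey")
            pict.draw_oval((10, 10, 90, 60), "white")
            pict.draw_rect((30, 20, 70, 50), "black")

            # After the window moves, the OS replays the picture into the exposed area
            # instead of asking the application to redraw anything itself.
            pict.play(lambda op, clip: print(op, "clipped to", clip), clip=(0, 0, 50, 80))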

      • By mjevans 2025-06-0718:22

        Reminds me of a GPU's general workflow. (like the sibling comment, 'isn't that the obvious way this is done'? Different drawing areas being hit by 'firmware' / 'software' renderers?)

    • By heresie-dabord 2025-06-0720:182 reply

      Bill Atkinson, all smiles as he receives applause from the audience for his work on Mac Paint: https://www.youtube.com/watch?v=nhISGtLhPx4

      • By JKCalhoun 2025-06-0720:29

        That's a great video. Everything he does gets applause and he is all (embarrassed?) grins.

      • By rezmason 2025-06-080:18

        I like how he pronounces "pix-els", learning how we arrived at our current pronunciation is the kind of computer history I can't get enough of

    • By pducks32 2025-06-0720:511 reply

      Would someone mind explaining the technical aspect here? I feel that with modern compute and OS paradigms I can’t fully appreciate this. But even now I know that feeling when you crack it, and the thrill of getting the impossible to work.

      It’s on all of us to keep the history of this field alive and honor the people who made it all possible. So if anyone would nerd out on this, I’d love to be able to remember him that way.

      (I did read this https://www.folklore.org/I_Still_Remember_Regions.html but might be not understanding it fully)

      • By giovannibajo1 2025-06-0721:442 reply

        There were far fewer abstraction layers than today. Today when your desktop application draws something, it gets drawn into a context (a "buffer") which holds the picture of the whole window. Then the window manager / compositor simply paints all the windows on the screen, one on top of the other, in the correct priority (I'm simplifying a lot, but just to get the idea). So when you are programming your application, you don't care about other applications on the screen; you just draw the contents of your window and that's done.

        Back at the time, there wasn't enough memory to hold a copy of the full contents of all the windows. In fact, there were actually zero abstraction layers: each application was responsible for drawing itself directly into the framebuffer (array of pixels), into its correct position. So how to handle overlapping windows? How could each application draw itself on the screen, but only on the pixels not covered by other windows?

        QuickDraw (the graphics API written by Atkinson) contained this data structure called "region" which basically represents a "set of pixels", like a mask. And QuickDraw drawing primitives (eg: text) supported clipping to a region. So each application had a region instance representing all visible pixels of the window at any given time; the application would then clip all its drawing to the region, so that only the visible pixels would get updated.

        But how was the region implemented? Obviously it could not have been a mask of pixels (as in, a bitmask), as it would use too much RAM and would be slow to update. In fact, consider that the region data structure also had to be quick at operations like intersections, unions, etc., as the operating system had to update the regions for each window as windows got dragged around by the mouse.

        So the region was implemented as a bounding box plus a list of visible horizontal spans (I think, I don't know exactly the details). When you represent a list of spans, a common hack is to simply store the coordinates at which the "state" switches between "inside the span" and "outside the span". This approach allows for some nice tricks when doing operations like intersections.

        Hope this answers the question. I'm fuzzy on many details so there might be several mistakes in this comment (and I apologize in advance) but the overall answer should be good enough to highlight the differences compared to what computers do today.
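
        To illustrate that span trick (a sketch of my own, not QuickDraw's actual routine): if a row's visible spans are stored as the sorted x positions where visibility flips, two rows can be intersected in a single merge pass.

            def intersect_row(a, b):
                """a and b are sorted lists of x coordinates where the inside/outside
                state flips; returns the flip list of their intersection."""
                out, inside_a, inside_b, inside = [], False, False, False
                i = j = 0
                while i < len(a) or j < len(b):
                    if j >= len(b) or (i < len(a) and a[i] <= b[j]):
                        x = a[i]; inside_a = not inside_a; i += 1
                    else:
                        x = b[j]; inside_b = not inside_b; j += 1
                    now = inside_a and inside_b
                    if now != inside:                # the combined state flips here
                        inside = now
                        if out and out[-1] == x:     # two flips at the same x cancel
                            out.pop()
                        else:
                            out.append(x)
                return out

            # Spans [0,40) and [60,100) intersected with [30,80) -> [30,40) and [60,80)
            print(intersect_row([0, 40, 60, 100], [30, 80]))   # [30, 40, 60, 80]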

        • By II2II 2025-06-080:491 reply

          It's a good description, but I'm going to add a couple of details since details that are obvious to someone who lived through that era may not be obvious to those who came after.

          > Obviously it could have not been a mask of pixels

          To be more specific about your explanation of too much memory: many early GUIs were 1 bit-per-pixel, so the bitmask would use the same amount of memory as the window contents.

          There was another advantage to the complexity of only drawing regions: the OS could tell the application when a region was exposed, so you only had to redraw a region when it needed an update or had just been exposed. Unless you were doing something complex and could justify buffering the results, you were probably re-rendering it. (At least that is my recollection from making a Mandelbrot fractal program for a compact Mac, several decades back.)

          • By gblargg 2025-06-087:24

            And even ignoring memory requirements, an uncompressed bitmap mask would have taken a lot of time to process (especially when combining regions where one was shifted by a non-multiple of 8 pixels with respect to the other). With just the horizontal coordinates of inversions, it takes the same amount of time for a region 8 pixels wide as for one 800 pixels wide, given the same shape complexity.

        • By duskwuff 2025-06-083:041 reply

          > But how was the region implemented?

          The source code describes it as "an unpacked array of sorted inversion points". If you can read 68k assembly, here's the implementation of PtInRgn:

          https://github.com/historicalsource/supermario/blob/9dd3c4be...

          • By giovannibajo1 2025-06-0810:461 reply

            Yeah those are the horizontal spans I was referring to.

            It’s a sorted list of X coordinates (left to right). If you group them in pairs, they are begin/end intervals of visible pixels within the region, but it’s actually more useful to manipulate them as a flat array, as I described.

            I studied the code a bit; each scanline is prefixed by its Y coordinate, and uses an out-of-bounds terminator (32767).

            • By duskwuff 2025-06-0820:562 reply

              It's a bit more than that. The list of X coordinates is cumulative - once an X coordinate has been marked as an inversion, it continues to be treated as an inversion on all Y coordinates below that, not just until the next Y coordinate shows up. (This manifests in the code as D3 never being reset within the NOTRECT loop.) This makes it easier to perform operations like taking the union of two disjoint regions - the sets of points are simply sorted and combined.

              • By giovannibajo1 2025-06-099:491 reply

                Uhm can you better explain that? I don’t get it. D3 doesn’t get reset because it’s guaranteed to be 0 at the beginning of each scanline, and the code needs to go through all “scanline blocks” until it finds the one whose Y contains the one specified as argument. It seems to me that each scanline is still self contained and begins logically at X=0 in the “outside” state?

                • By duskwuff 2025-06-0919:11

                  > D3 doesn’t get reset because it’s guaranteed to be 0 at the beginning of each scanline

                  There's no such guarantee. The NEXTHOR loop only inverts for points which are to the absolute left of the point being tested ("IS HORIZ <= PT.H ? \\ NO, IGNORE THIS POINT").

                  Imagine that, for every point, there's a line of inversion that goes all the way down to the bottom of the bounding box. For a typical rectangular region, there's going to be four inversion points - one for each corner of the rectangle. The ones on the bottom cancel out the ones on the top. To add a second disjoint rectangle to the region, you'd simply include its four points as well; so long as the regions don't actually overlap, there's no need to keep track of whether they share any scan lines.
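
                  A small sketch of that combining step (my own model of the cumulative representation, not the shipped code): merging the two point lists adds the parities, so points that appear an even number of times cancel. The result is the XOR of the two regions, which for disjoint regions is exactly their union.

                      from collections import Counter

                      def combine(points_a, points_b):
                          """Merge two regions' inversion-point lists; paired-up points cancel."""
                          counts = Counter(points_a) + Counter(points_b)
                          merged = [p for p, n in counts.items() if n % 2 == 1]
                          return sorted(merged, key=lambda p: (p[1], p[0]))   # row-major order

                      # Two 10x10 squares that touch along x=10 (their pixels are disjoint):
                      # the shared corner points cancel, leaving just the four corners of the
                      # 20x10 rectangle they form together.
                      square_a = [(0, 0), (10, 0), (0, 10), (10, 10)]
                      square_b = [(10, 0), (20, 0), (10, 10), (20, 10)]
                      print(combine(square_a, square_b))   # [(0, 0), (20, 0), (0, 10), (20, 10)]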

    • By jajko 2025-06-0718:254 reply

      Pretty awesome story, but also with a bit of a dark lining. Of course any owner, and triple that for Jobs, loves over-competent guys who work themselves to death, here almost literally.

      But that's not a recipe for personal happiness for most people, and most of us would not end up contributing revolutionary improvements even if we worked that way. The world needs awesome workers, but it also needs, e.g., awesome parents, or just happy, balanced, content people (or at least some mix of those).

      • By 1123581321 2025-06-0718:561 reply

        Pretty much. Most of us have creative itches to scratch that make us a bit miserable if we never get to pursue them, even if given a comfortable life. It’s circumstantial whether we get to pursue them as entrepreneurs or employees. The users or enjoyers of our work benefit either way.

        • By kevinventullo 2025-06-0720:191 reply

          Just to add on, some of us have creative itches that are not directly monetizable, and for which there may be no users or enjoyers of our work at all (if there are, all the better!).

          Naturally I don’t expect to do such things for a living.

          • By 1123581321 2025-06-083:47

            Yes, and thankful for these. Good addition.

      • By richardw 2025-06-0720:14

        Survivorship bias. The guys going home at 5 went home at 5 and their companies are not written about. It’s dark but we’ve been competing for a while as life forms and this is “dark-lite” compared to what our previous generations had to do.

        Some people are competing, and need to make things happen that can’t be done when you check out at 5. Or more generally: the behaviour that achieves the best outcome for a given time and place is what succeeds and forms the legends of those companies.

        If you choose one path, know your competitors are testing the other paths. You succeed or fail partly based on what your most extreme competitors are willing to do, sometimes with some filters for legality and morality. (I.e. not universally true for all countries or times.)

        Edit: I currently go home at 5, but have also been the person who actually won the has-no-life award. It’s a continuum, and is context specific. Both are right and sometimes one is necessary.

      • By duskwuff 2025-06-0721:021 reply

        That's not quite how I read the story. Jobs didn't ask Atkinson if he remembered regions - Atkinson brought it up.

        • By asveikau 2025-06-0722:39

          It's also a joke, and a pretty good one at that. Shows a sense of humor.

      • By bowsamic 2025-06-0719:37

        What is the dark lining? Do you think Atkinson did not feel totally satisfied with his labour?

        And I don't think anyone said that that's the only way to be

    • By bluedino 2025-06-0719:572 reply

      > In fact the folks at PARC had never accomplished it, and they later told him they were amazed that he had done so.

      Reminds me of the story where some company was making a new VGA card, and it was rumored that a rival had implemented a buffer of some sort in their card. When both cards came out, the rival had either not actually implemented it or had implemented a far simpler solution.

      • By alanfalcon 2025-06-0721:152 reply

        An infamous StarCraft example has echoes of a similar story: they were so humbled by a competitor's demo (and by criticism that their own game was simply "Warcraft in space") that they went back and significantly overhauled their game.

        Former Ion Storm employees later revealed that Dominion’s E3 1996 demo was pre-rendered, with actors pretending to play, not live gameplay.

        • By stevenwoo 2025-06-080:162 reply

          I got a look at an early version of the StarCraft source code as a reference for the sound library for Diablo 2, and curiosity made me do a quick analysis of the rest of it - they used a very naive approach to C++ and object inheritance that first-time C++ programmers often fall victim to. It might have been their first C++ project, so they probably needed to start over again anyway. We had an edict on Diablo 2 to make the C++ look like recognizable C for Dave Brevik's benefit, which turned out pretty well I think (it was a year late, but we shipped).

          • By genewitch 2025-06-080:541 reply

            Diablo II is in my top 3 games of all time; I still play it all the time. Thanks for contributing so much fun to my life!

            (For reference, Diablo III is also in my top 3 :)

            • By stevenwoo 2025-06-084:40

              I was only one of many programmers at Blizzard North + others at Blizzard and our parent company at the time, but you are welcome from me.

          • By selimthegrim 2025-06-0814:26

            What exactly did they do that was naive?

        • By mhh__ 2025-06-0721:38

          Similar tale with the MiG-25: propaganda, and stats with the asterisks missing, led to the requirements for the F-15 being set very high.

      • By Grosvenor 2025-06-080:001 reply

        That's from Michael Abrash's Black Book of Graphics Programming. They heard about a "buffer", so implemented the only non-stupid thing - a write FIFO. Turns out the competition had done the most stupid thing and built a read buffer.

        I teach this lesson to my mentees. Knowing that something is possible gives you significant information. Also, don't brag - it gives away significant information.

        Just knowing something is possible makes it much, much easier to achieve.

        https://valvedev.info/archives/abrash/abrash.pdf

        • By bogantech 2025-06-0820:06

          > Turns out the competition had done the most stupid thing and built a read buffer

          This isn't really stupid, though, as explained in the PDF:

          > Paradise had stuck a read FIFO between display memory and the video output stage of the VGA, allowing the video output to read ahead, so that when the CPU wanted to access display memory, pixels could come from the FIFO while the CPU was serviced immediately. That did indeed help performance--but not as much as Tom’s write FIFO.

          VRAM accesses are contended, so during the visual display period the VGA circuitry has priority and CPU accesses result in wait states. A FIFO between the VRAM and the video output stage means less contention and more cycles for CPU accesses.

          Why improve read performance, though? Games accessing VRAM would, I presume, be 99% writes. Perhaps it was to improve performance in GUIs like Windows?

    • By 90s_dev 2025-06-0718:45

      [flagged]

  • By JKCalhoun 2025-06-0718:358 reply

    When I was on the ColorSync team at Apple we, the engineers, got an invite to his place-in-the-woods one day.

    I knew who he was at the time, but for some reason I felt I was more or less beholden to conversing only about color-related issues and how they applied to a computer workflow. Having retired, I have been kicking myself for some time for not just chatting with him about ... whatever.

    He was, at the time I met him, very into a kind of digital photography. My recollection was that he had a high-end drum scanner and was in fact scanning film negatives (medium-format camera?) and then going with a digital workflow from that point on. I remember he was excited about the way that "darks" could be captured (with the scanner?). A straight analog workflow would, according to him, cause the darks to roll off (guessing the film was not the culprit then, perhaps the analog printing process).

    He excitedly showed us, on his computer, photos he had taken along the Pacific coast of large rock outcroppings against the ocean — pointing out the detail you could see in the shadows of the rocks. He was putting together a coffee table book of his photos at the time.

    I have to say that I mused at the time about a wealthy, retired, engineer who throws money at high end photo gear and suddenly thinks they're a photographer. I think I was weighing his "technical" approach to photography vs. a strictly artistic one. Although, having learned more about Ansel Adams's technical chops, perhaps for the best photographers there is overlap.

    • By rezmason 2025-06-080:282 reply

      > I have been kicking myself for some time for not just chatting with him about ... whatever.

      Maybe I should show some initiative! See, for a little while now I've wanted to just chat with you about whatever.

      At this moment I'm working on a little research project about the advent of color on the Macintosh, specifically the color picker. Would you be interested in a casual convo that touches on that? If so, I can create a BlueSky account and reach out to you over there. :)

      https://merveilles.town/deck/@rezmason/114586460712518867

      • By diskzero 2025-06-082:092 reply

        John is cool, but I don't think he was around when the Macintosh II software and hardware were being designed for color support. I did work with Eric Ringewald at Be, and he was one of the Color Quickdraw engineers. He would be fun to talk to. Michael Dhuey worked on the hardware of the Mac II platform. I guess we can give some credit to Jean-Louis Gassée as well. Try to talk to those people! I got to work with a lot of these Apple legends at General Magic, Be, Eazel and then back at Apple again. I never got to work on a project with JKCalhoun directly, but I did walk by his office quite frequently.

        • By JKCalhoun 2025-06-083:13

          True. I showed up at Apple in '95 after Color Quickdraw was already a thing.

          Hilariously though, I did get handed the color pickers to "port" to PowerPC. In fact one of the first times I thought I was in over my head being at Apple was when I was staring at 68030 assembly and thinking, "Fuck, I have to rewrite this in C perhaps."

          From your username, I feel like we've chatted before (but I don't know your real name).

        • By rezmason 2025-06-083:10

          > I never got to work on a project with JKCalhoun directly, but I did walk by his office quite frequently.

          Did you ever get hit with a paper airplane as you did? ;)

          Thanks for this reply, and if you're who I think you are, thank you for all the good work you did alongside these other folks :D

      • By JKCalhoun 2025-06-083:10

        We can certainly chat.

    • By sneak 2025-06-082:15

      > I have to say that I mused at the time about a wealthy, retired, engineer who throws money at high end photo gear and suddenly thinks they're a photographer.

      Duchamp would like a word.

      Seriously though, as someone this describes to a T (though “suddenly” in this case is about 19 years), I was afraid to call myself any sort of artist for well over a decade, thinking I was just acquiring signal with high end gear. I didn’t want to try to present myself as something I’m not. After all, I just push the button, the camera does all the work.

      I now have come to realize that this attitude is toxic and unnecessary. Art (even bad art!) doesn’t need more gatekeeping or gatekeepers.

      I am a visual artist. A visual artist with perhaps better equipment than my skill level or talent justifies, but a visual artist nonetheless.

    • By throwanem 2025-06-0720:201 reply

      There probably still isn't a good way to get that kind of dynamic range entirely in the digital domain. Oh, I'm sure the shortfall today is smaller, say maybe four or five stops versus probably eight or twelve back then. Nonetheless, I've done enough work in monochrome to recognize an occasional need to work around the same limitations he was facing, even though very few of my subjects are as demanding.

      • By JKCalhoun 2025-06-0720:324 reply

        I wish a good monochrome digital camera didn't cost a small fortune. And I'm too scared to try to remove the Bayer grid from a "color" CCD.

        Seems that, without the color/Bayer thing, you could get an extra stop or two for low-light.

        I had a crazy notion to make a camera around an astronomical CCD (often monochrome) but they're not cheap either — at least one with a good pixel count.

    • By lanyard-textile 2025-06-0721:291 reply

      :) Color in the computer is a good “whatever” topic.

      Sometimes it’s just nice to talk about the progress of humanity. Nothing better than being a part of it, the gears that make the world turn.

      • By JKCalhoun 2025-06-0723:08

        Ha ha, but it's also "talking shop". I'm sure Bill preferred it to talking about his Quickdraw days.

    • By Aloha 2025-06-0722:56

      You always lose something when doing optical printing - you can often gain things too, but it's not 1:1.

      I adore this hybrid workflow because I can pick how the photo will look - color palette, grain, whatever - by picking my film, and then use digital to fix (most if not all of) the inherent limitations of analog film.

      Sadly, film is too much of a pain today. Photography has long been about composition for me, not cameras or process - I liked film because I got a consistent result, but I can use digital too, and I do today.

    • By hugs 2025-06-0723:341 reply

      "When art critics get together they talk about form and structure and meaning. When artists get together they talk about where you can buy cheap turpentine."

    • By herodotus 2025-06-0813:14

      Bill showed up at one of the WWDCs (2011?). I sat next to him during a lunch, not knowing who he was! He told me his name, and then showed me some photos he had taken. He seemed to me to be a gentle and kind soul. So sad to read this news.

    • By gxs 2025-06-0719:174 reply

      > I have to say that I mused at the time about a wealthy, retired, engineer who throws money at high end photo gear and suddenly thinks they're a photographer

      I think this says more about you than it does about him

      • By dang 2025-06-0719:571 reply

        Please don't cross into personal attack. The cost outweighs any benefit.

        https://news.ycombinator.com/newsguidelines.html

        • By gxs 2025-06-0722:09

          Ugh I hate that you’re almost always right

          I was about to argue but then I saw this part

          > The cost outweighs any benefit.

          And this is absolutely true - there is a benefit but it doesn’t mean it’s worth it

          Either way my bad, I should have elaborated and been more gentle instead of just that quip

      • By viccis 2025-06-0719:431 reply

        It's true though. This effect is what keeps companies like PRS in business.

        • By bombcar 2025-06-0719:56

          There’s a whole industry of prosumer stuff in … well, many industries.

          Power tools definitely have it!

      • By JKCalhoun 2025-06-0720:11

        I don't deny that. That's probably true about a lot of observations.

      • By spiralcoaster 2025-06-0719:592 reply

        This is absolutely true and I don't understand why you're being downvoted. Especially in the context of this man just recently dying, there's someone throwing in their elitist opinion about photographers and how photography SHOULD be done, and apparently Bill was doing it wrong.

        • By JKCalhoun 2025-06-0720:12

          Well, I certainly didn't mean for it to come across that way. I wasn't saying this was the case with Bill. To be clear, I saw nothing bad about Bill's photos. (Also I'm not really versed enough in professional photography to have a valid opinion even if I didn't like them and so would not have publicly weighed in on them anyway.)

          I was though being honest about how I felt at that time — debated whether to keep it to myself or not today (but I always foolishly err on the side of being forthcoming).

          Perhaps it's a strange thing to imagine that someone would pursue in their spare time, especially after retiring, what they did professionally.

        • By brulard 2025-06-0721:531 reply

          He said "at the time". If I say "I thought X at the time", it implies I have reconsidered since. Your parent comment was unnecessarily condescending.

          • By gxs 2025-06-0722:111 reply

            It’s just the timing and how he said it, especially considering the tone of the message overall

            But the irony isn’t lost on me that I myself shouldn’t have been so mean about it

            • By JKCalhoun 2025-06-0723:09

              You're right about the timing.
