Free software scares normal people

2025-10-30 15:07 · danieldelaney.net

I’m the person my friends and family come to for computer-related help. (Maybe you, gentle reader, can relate.) This experience has taught me which computing tasks are frustrating for normal people.

Normal people often struggle with converting video. They will need to watch, upload, or otherwise do stuff with a video, but the format will be weird. (Weird, broadly defined, is anything that won’t play in QuickTime or upload to Facebook.)

I would love to recommend Handbrake to them, but the user interface is by and for power users. Opening it makes normal people feel unpleasant feelings.

This problem is rampant in free software. The FOSS world is full of powerful tools that only have a “power user” UI. As a result, people give up. Or worse: they ask people like you and me to do it for them.

I want to make the case to you that you can (and should) solve this kind of problem in a single evening.

Take the example of Magicbrake, a simple front end I built. It hides the power and flexibility of Handbrake. It does only the one thing most people need Handbrake for: taking a weird video file and making it normal. (Normal, for our purposes, means a small MP4 that works just about anywhere.)

There is exactly one button.
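For the curious: the whole trick can fit in a few lines. Here is a hypothetical sketch (not Magicbrake’s actual source) of wrapping HandBrakeCLI with one fixed, widely compatible preset:

```python
#!/usr/bin/env python3
"""A hypothetical one-button video normalizer (a sketch, not Magicbrake's
actual source): wrap HandBrakeCLI behind a single fixed preset."""
import subprocess
import sys
from pathlib import Path

def normalize_command(src: str) -> list[str]:
    """Build the one HandBrakeCLI invocation we expose: any input in,
    a small MP4 that plays just about anywhere out."""
    out = str(Path(src).with_suffix("")) + ".normal.mp4"
    return [
        "HandBrakeCLI",
        "--preset", "Fast 1080p30",  # a stock HandBrake preset
        "-i", src,
        "-o", out,
    ]

if __name__ == "__main__":
    # The entire UI: pass one file, get back file.normal.mp4.
    subprocess.run(normalize_command(sys.argv[1]), check=True)
```

Wrap that in a drag-and-drop window and you have your one button; anyone who wants other formats still has Handbrake itself.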

This is a fast and uncomplicated thing to do. Unfortunately, the people who have the ability to solve problems like this are often disinclined to do it.

“Why would you make Handbrake less powerful on purpose?”

“What if someone wants a different format?”

“What about [feature/edge case]?”

The answer to all these questions is the same: a person who needs or wants that stuff can use Handbrake. If they don’t need everything Handbrake can do and find it bewildering, they can use this. Everyone wins.

It’s a bit like obscuring the less-used functions on a TV remote with tape. The functions still exist if you need them, but you’re not required to contend with them just to turn the TV on.

People benefit from stuff like this, and I challenge you to make more of it. Opportunities are everywhere. The world is full of media servers normal people can’t set up. Free audio editing software that requires hours of learning to be useful for simple tasks. Network monitoring tools that seem designed to ward off the uninitiated. Great stuff normal people don’t use. All because there’s only one UI, and it’s designed to do everything.

80% of the people only need 20% of the features. Hide the rest from them and you’ll make them more productive and happy. That’s really all it takes.



Comments

  • By squeedles 2025-10-30 16:12

    Good article, but the reasoning is wrong. It isn't easy to make a simple interface, much as Pascal apologized for writing a long letter because he didn't have time to write a shorter one.

    Implementing the UI for one exact use case is not much trouble, but figuring out what that use case is, is the hard part. And defending that use case from the line of people who want "that + this little extra thing", or "I just need ...", is difficult. It takes a single strong-willed defender, or some sort of onerous management structure, to prevent the interface from quickly devolving back into a million options or schisming into other projects.

    Simply put, it is a desirable state, but an unstable one.

    • By DrewADesign 2025-10-30 16:23

      Overall, the development world does not intuitively understand the difficulty of creating good interfaces (for people that aren’t developers.) In dev work, the complexity is obvious, and that makes it easy for outsiders to appreciate: they look at the code we’re writing and say “wow, you can read that?!” I think that can give developers a mistaken impression that other people’s work is far less complex than it is. With interface design, everybody knows what a button does and what a text field is for, and developers know more than most about the tools used to create interfaces, so the language seems simple. But the problems you need to solve with that language are complex, and while failure is obvious, success is much more nebulous and user-specific. So much of what good interfaces convey to users is implied rather than expressed, and that’s a tricky task.

      • By makeitdouble 2025-10-31 0:34

        > creating good interfaces (for people that aren’t developers.)

        This is the part where people get excited about AI. I personally think they're dead wrong on the process, but strongly empathize with that end goal.

        Giving people the power to make the interfaces they need is the most enduring solution to this issue. We had attempts like HyperCard, Delphi, or Access forms. We still get Excel forms, Google Forms, etc.

        Having tools to incrementally try stuff without having to ask the IT department is IMHO the best way forward, and we could look at those as prototypes for more robust applications to build from there.

        Now, if we could find a way to aggregate these ad hoc apps in an OSS way...

        • By marcus_holmes 2025-10-31 2:29

          I have nightmare stories to tell of Access Forms from my time dealing with them in the '90s.

          The usual situation is that the business department hires someone with a modicum of talent or interest in tech, who then uses Access to build an application that automates or helps with some aspect of the department's work. They then leave (in a couple of cases these people were just interns) and the IT department is then called in to fix everything when it inevitably goes wrong. We're faced with a bunch of beginner spaghetti code [0], utterly terrible schema, no documentation, no spec, no structure, and tasked with fixing it urgently. This monster is now business-critical because in the three months it's been running the rest of the department has forgotten how to do the process the old way, and that process is time-critical.

          Spinning up a proper project to replace this application isn't feasible in the short term, because there are processes around creating software in the organisation, for very good reasons learned painfully from old mistakes, and there just isn't time to go through that. We have to fix what we can and get it working immediately. And, of course, these fixes cause havoc with the project planning of all our other projects because they're unpredictable, urgent, and high priority. This delays all the other projects and helps to give IT a reputation as taking too long and not delivering on our promised schedules.

          So yeah, what appears to be the best solution from a non-IT perspective is a long, long way from the best solution from an IT perspective.

          [0] and other messes; in one case the code refused to work unless a field in the application had the author's name in it, for no other reason than vanity, and they'd obfuscated the code that checked for that. Took me a couple of hours to work out wtf they'd done and pull it all out.

          • By nradov 2025-10-31 4:02

            Of course this is ultimately the IT department's own fault for not responding quickly enough to legitimate business requirements. They need to actively look for ways to help rather than processing tickets.

            • By marcus_holmes 2025-10-31 4:16

              Yeah, this is always the response. But it's wildly impractical - there are only so many developer hours available. The budget is limited, so not everyone gets what they want immediately. This should be obvious.

              Part of the problem is that the novices who create these applications don't consider all the edge cases and gnarly non-golden-path situations, but experienced devs do. So the novice slaps together something that does 95% of the job with 5% of the effort, but when it goes wrong the department calls in IT to fix it, and that means doing the other 95% of the effort. The result is that IT is seen as slow and bureaucratic, when in fact they're just doing the fecking job properly.

              • By nradov 2025-10-31 4:34

                In most organizations the problem is lack of urgency rather than lack of developer hours. The developers sit in isolated siloes rather than going out and directly engaging with business units. This is mostly a management problem but there are plenty of individual developers who wait to be told what to do rather than actively seeking out better solutions for business problems.

                • By marcus_holmes 2025-10-31 4:40

                  This usually comes back to maker time vs manager time.

                  If you want a developer to write good code quickly, put them in an isolated silo and don't disturb them.

                  If you want a developer to engage with the business units more, be prepared for their productivity to drop sharply.

                  As with all things in tech, it's a trade-off.

                  • By TeMPOraL 2025-10-31 8:11

                    I think that's the lesser problem. The bigger problem is that IT's attitude is wrong from the start. When they start doing something, they want to Do It Right. They want to automate the business process. But that's the wrong goal! You can spend years doing that and go all the way to building a homegrown SAP, and it will still suck and people will still use their half-assed Excel sheets and Access hacks.

                    IT should not be focusing on the theoretical, platonic Business Process. It never exists in practice anyway. They should focus on streamlining the actual workflow of actual people. I.e., the opposite of the usual advice: instead of understanding what users want and doing it, just do what they tell you they want. The problem with the standard advice is that the thing you seek to understand is emergent, no one has a good definition of it, and it will change three times before you finish your design doc.

                    To help the company get rid of YOLOed hacks in Excel and such made by interns, IT should YOLO better hacks. Rapid delivery and responsiveness, but much more robust and reliable because of the actual developer expertise behind it.

                    • By thyristan 2025-10-31 8:40

                      > They should focus on streamlining actual workflow of actual people.

                      If you streamline a shitty process, you will have diarrhea...

                      Unfortunately, most processes suck and need improvement. It isn't actually IT's job to improve processes. But almost always, IT is the only department that is able to change those processes nowadays since they are usually tied to some combination of lore, traditions, spreadsheets and misused third-party software.

                      If you just streamline what is there, you are cementing those broken processes.

                      • By TeMPOraL 2025-10-31 8:58

                        That's precisely the mistake I'm talking about. You think you're smarter than people on the ground, and know better how they should do their job.

                        It's because of that condescending, know-it-all attitude that people actively avoid getting IT involved in anything, and prefer half-assed Excel hacks. And they're absolutely right.

                        Work with them and not over them, and you may get an opportunity to improve the process in ways that are actually useful. Those improvements aren't apparent until you're knee-deep in mud yourself, working hand by hand with the people you're trying to help.

                        • By skydhash 2025-10-31 10:06

                          The problem with hackish solutions is that they get put in places they don’t belong. In other professions, there’s regulation in place to prevent these kinds of shortcuts.

                          Also, if you have ever worked with anyone trying to get specifications worked out, you’ll see that most people (including devs) rely on intuition rather than checklists and will always forget to tell you something that is critical.

                          The thing is that cost of changes in the business can be a simple memo. But for software that usually means redesign.

                          • By TeMPOraL 2025-10-31 10:57

                            > The problem with hackish solution is that they get put in places they don’t belong. In other professions, there’s regulation in place to prevent these kind of shortcuts.

                            That's an illusion. The reality is, it's all hacky solutions on top of hacky solutions. Even in manufacturing: the spec may be fixed, and the factory line produces identical products by the million - but the spec was developed through an ad-hoc process, and the factory line itself is a pile of hacks that needs continued tuning to operate. And there is no perfectly specced out procedure for retooling a factory line to support the newest spec that came out of design department - retooling is, in itself, a small R&D project.

                            > Also, if you have ever worked with anyone trying to get specifications worked out, you’ll see that most people (including devs) rely on intuition rather than checklists and will always forget to tell you something that is critical.

                            This is the dirty truth about the universe - human organizations are piles of hacks, always in flux; and so is life itself. The sameness and harmony we see in nature is an illusion brought on by scale (in two ways - at large scale, because we live too short to see changes happening; at small scale, because the chaos at biomolecular level averages out to something simpler at the scale we can perceive).

                            Order and structure are not the default. It takes hard work to create and maintain them, so it had better be worth the cost. The prevalence of Excel-based hacks in corporate settings is proof positive that, for internal software, it usually isn't worth it, despite what the IT department thinks.

                            > The thing is that cost of changes in the business can be a simple memo. But for software that usually means redesign.

                            Which is why you shouldn't be building cathedrals that need expensive rework every other week because of some random memo. Instead, go down to where people work; see them tweaking their shovels, take those shovels and make the tweak they want the proper way.

                            • By skydhash 2025-10-31 14:28

                              We could do this. And if you take a look at some solutions like the old VisualBasic/Delphi/Unix scripts, the philosophy is the same: create small software quickly that solves some user/business need. Systems like Java/.Net and their IDEs, as well as current mobile SDKs, run against that need.

                              A bit of a tangent: I think the idea of coddling users is what’s leading to the complexity of all those systems. We’re building cathedrals when we need tents. Instead of having small, sharp software tools that can be adjusted easily, we’re developing behemoths that are supposed to handle everything under the sun (systemd, a lot of modern package managers, languages tied to a single IDE, …)

                              • By ghaff 2025-10-31 15:52

                                Well, except you end up with 20 different incompatible tools with different workflows.

                                I'm not really arguing for mega-tools with locked-down workflows. But there's usually some happy-ish medium between chaos and rigid monoliths.

                                • By listenallyall 2025-11-01 1:48

                                  > 20 different incompatible tools with different workflows

                                  Kinda the whole goal behind (and "benefit" of) microservices, right? Totally independent dev teams, all uncoupled from each other, no need to look inside at the code, language-independent: just pass data according to an API and don't look behind the curtain.

                                  • By TeMPOraL 2025-11-01 8:47

                                    Internally it could work if the teams understand that their services are never done: they're part of a living organism, and the responsibility of a team assigned to a service is to keep it working. Shit will break constantly, but that's not a problem as long as it gets fixed. It's labor-intensive, but done right, we're talking a few devs busy maintaining a process that benefits thousands, or hundreds of thousands, of their colleagues. That's what internal development is meant to be.

                                    At some point in our industry, "service providers" started thinking of themselves as kings, instead of what they were supposed to be - servants.

                                  • By ghaff 2025-11-01 15:37

                                    That's the theory. You can also end up with a lot of effort devoted to maintaining totally independent tool chains which may have a single person bus factor.

                        • By littlecosmic 2025-10-31 14:10

                          In my experience, it’s often the business side - rather than IT - that tries to use a technical change to force change to the business process that they have failed to change politically… and it usually turns out that a technical change isn’t enough either.

                          • By TeMPOraL 2025-11-01 8:50

                            Right. But it would help if internal IT wouldn't reinforce the business side in their delusions, and it starts with a mindset problem: IT thinking of itself as a department that delivers products and solutions, instead of a support force of servants meant to run around in the background and respond to immediate needs of people in the field.

                            • By ozim 2025-11-01 11:45

                              You went too far and are mixing up IT with software development.

                              Software development delivers products, internal products and solutions that should be leveraged by the business to improve its rate of growth.

                              If you have a software development department chucked into IT and make them be supporters who run in the background, you are wasting potential or wasting money on their salaries.

                              If you want supporters, make it IT only and pay for SaaS solutions that everyone is using.

                  • By thesumofall 2025-10-31 7:25

                    That depends if one measures productivity in LOCs or business impact. As always, it’s not black or white, but my experience is that higher proximity is a net benefit

                  • By nradov 2025-10-31 14:57

                    Actually there is no trade-off. That's a common misunderstanding. Reducing latency creates more business value than increasing productivity.

                    http://lpd2.com/

                • By scott_w 2025-10-31 11:38

                  > In most organizations the problem is lack of urgency rather than lack of developer hours.

                  I disagree: it's a business prioritisation issue (not necessarily a problem). Ultimately, a lot of the processes are there because the wider business (rightly) wants IT to work on the highest impact issues. A random process that 3 people suffer from probably isn't the highest impact for the business as a whole.

                  Also, because it's not high impact, it makes sense that an intern is co-opted to make life easier (also as a learning experience), however it also causes the issues OP highlighted.

                  The problem is solvable, I think, but it's not easily solvable!

                  • By HeyLaughingBoy 2025-10-31 22:17

                    Yes, but often the "business priorities" get so screwed up that people's needs go unmet, and the business ends up wasting money as a result.

                    My best example was a conversation I had with one of the scientists at my job when she mentioned that she had people spending hours every day generating reports from data our instruments produced. I pointed out that with the code we had it would be simple to generate the reports automatically.

                    Her response was that she had asked repeatedly for a developer to be assigned to the task, but she kept being pushed away because it was low priority.

                    I couldn't just change the codebase on my own (it was for a medical device), but it was easy enough to spend a lazy afternoon writing a tool to consume the output logs from the device and generate the reports she needed. That's it: about 4 hours of work produced something this person had asked for a year prior, and that people were already spending hours each day doing by hand!

                    The people in charge of vetting requests never even bothered to ask a developer to estimate the task. They just heard that there was a workaround, so it immediately became "low priority."

              • By listenallyall 2025-10-31 12:40

                > wildly impractical - there are only so many developer hours available

                Which is a huge reason that learning a RAD (rapid application development - emphasis on rapid) tool is a pretty useful skill.

                • By bccdee 2025-10-31 20:07

                  What makes it rapid is taking shortcuts. There's no silver bullet for increasing speed while preserving maintainability and extensibility. Prioritizing speed is exactly the problem described in the post you're responding to.

                  • By listenallyall 2025-11-01 1:34

                    Not implementing any solution and blaming it on "only so many developer hours available" is the problem I'm responding to.

                    I'm not "prioritizing" anything. The scenario we're discussing is when an intern or low-level employee is able to successfully automate, enhance, or simplify a manual, inefficient business process that management has not seen fit to improve - so the worker does it themselves.

                    Access and similar platforms aren't "rapid" because of shortcuts; they are rapid because they are visual, drag-and-drop, object-oriented, and often make a component's properties and methods customizable via a visual interface. It's a different way of programming, yes, accessible to the masses (which is likely the reason you have so much disdain), but not "shortcuts".

                    • By bccdee 2025-11-01 20:21

                      I don't have a problem with accessible development tools like that, but they do have tradeoffs. An Excel spreadsheet is an excellent tool for some business calculations, but once you start asking questions like, "can we add some authorization to this? Can we add some permissions? Can we pipe this into a UI?" we've passed the limits of Excel and we need to implement a real application. The more you implement in your spreadsheet, the more you may end up re-implementing in a real database.

            • By chasd00 2025-10-31 13:21

              This has been my experience. Usually a department gets nowhere for months dealing with IT and then goes shopping for consultants. I make my living bringing things online for directors and other biz leadership who just couldn’t get IT off their ass.

            • By RobotToaster 2025-10-31 13:22

              Which is ultimately the c-suite's fault for not requiring that and allocating a budget for it.

              • By nradov 2025-10-31 15:01

                Nah. It's usually a culture and organizational structure problem, not a budget problem.

            • By ozim 2025-10-31 7:46

              Downside is that it quickly turns to idea people coming over directly to dev team pushing BS ideas and requiring work to be done on that ASAP.

              You need a structure if you have org of 100+ employees. If it is smaller than that I don’t believe you get dev department.

          • By loki-ai 2025-10-31 4:27

            More often than not, it’s the development team that skips engaging with users, putting in minimal effort to understand their real needs.

            Most of these teams only want a straightforward spec, shut themselves off from distractions, and emerge weeks or months later with something that completely misses the business case. And yet they will find ways to point fingers at the product owner, project manager, or client for the disaster.

            • By marcus_holmes 2025-10-31 4:38

              I have met the occasional person like this, sure. But only ever in really large organisations where they can hide, and only a minority.

              The huge majority of devs want to understand the business and develop high quality software for it.

              In one business I worked for, the devs knew more about the actual working of the business than most of the non-IT staff. One of the devs I worked with was routinely pulled into high-level strategy meetings because of his encyclopaedic knowledge of the details of the business.

              • By TeMPOraL 2025-10-31 8:36

                The mistake is in trying to understand the business case. There is nothing to understand! The business case is the aggregate of what people actually do. There is no proper procedure that's actually followed at the ground level. Workflows are emergent and in constant flux. In this environment, the role of a dev should not be to build internal products, but to deliver internal hacks and ad-hoc solutions, maintain them, modify them on the fly, and keep it all documented.

                I.e. done right, it should be not just possible but completely natural for a random team lead in the mail room to call IT and ask, "hey, we need a yellow highlighter in the sheet for packages that Steve from ACME Shipping needs to pick on extra evening run, can you add it?", and the answer should be "sure!" and they should have the new feature within an hour.

                Yes, YOLO development straight on prod is acceptable. It's what everyone else is doing all the time, in every aspect of the business. It's time for developers to stop insisting they're special and normal rules don't apply to them.

                • By bccdee 2025-11-01 0:15

                  The thing about that is that you can only ever YOLO the superficial layers of your architecture, and only ever in certain ways. Having a YOLOable system requires deliberate and considered architectural choices deeper down.

                  • By TeMPOraL 2025-11-01 8:26

                    YOLOable systems are built out of flexible pieces. That's why Excel "abuse" is a constant theme in enterprise :).

                    We should not be thinking about architecture at the business-process level. That is just repeating the mistake that needs to be avoided here. This is, and will always be, a pile of ad-hoc hacks. They're not meant to add up to a well-designed system in a bottom-up fashion, because there is no system to design. The structure we naturally seek is constantly in flux.

                    The right architectural/design decision to make here is to make it possible to assemble quick hacks out of robust parts that fulfill additional needs the people on the ground may not consider: logs/audit trail/telemetry, consistency for ergonomic reasons, error handling, efficient compute usage, tracking provenance, information access restrictions dictated by legal obligations, etc.

                    The most important change needed is in the mindset. Internal dev needs to stop thinking of itself as the most important part of the company, or as a well-defined team that should own products. To be useful, the opposite is needed: such devs need to basically become a ChatGPT that works, always there to rapidly respond when people on the ground ask to have some software tweaked, and then to retweak it as needed. They need to do this work rapidly, without judgement, and never assume they know better.

                    Only then people will stop weaving ad-hoc Excel sheets into business-critical processes.

                • By skydhash 2025-10-31 10:33

                  It would be nice if computers were that easy to work with. A computer is a completely dumb machine that needs everything spelled out. Humans are very flexible; getting flexibility out of a computer requires great effort.

                  The main reason you want a computer is cheap emulation (CAD, DAWs, …) or fast (and reliable) automation. Both require a great deal of specification to get right.

                  • By TeMPOraL 2025-10-31 10:59

                    That's not what most people use computers for at work. And it's not what magic Excel sheets and Access forms are made for.

                    • By skydhash 2025-10-31 14:17

                      Are you sure? Almost all Excel sheets are emulations of some process. They’re not great at it, but they work better than the alternatives. But organizations need automation more than emulation, i.e. they want to improve their processes, not merely replace them.

                      • By TeMPOraL 2025-11-01 8:42

                        Those sheets are not emulations of some process, they are the process. There is no perfect, platonic process, that the Excel sheets are merely shadows of. There is no fixed goal to approach iteratively. The process is defined by what people do, and it's improved by them through adjusting to situations as they occur and eliminating waste - and creating and evolving these Excel sheets is part of this continuous improvement.

                        Top ends of organizations need and want a lot of dumb, self-defeating things - this is very much the same thing that's described in "Seeing Like a State". Doing this blindly is, of course, a prerogative of the executives, but internal IT is actually low enough in the food chain that it could focus more on helping the org by improving the bottom layers, instead of embracing and encouraging the top to create more rigid structures.

                        EDIT: to refer back to the example I gave upthread:

                        "Steve from ACME Shipping" is not meant to be a special case of an "external vendor assigned to auxiliary shipping itinerary"; the system need not be designed to express the concepts of "external vendor" and "auxiliary shipping itinerary" and "shipping itinerary assignment for shippable resources". Steve is just Steve, the whole "extra evening run" thing is probably just a one-off emergency measure for Q3 that will disappear come Q4 (or persist, and then some executives will start talking about deeper changes to the mailing process). Right now, all the mailing room needs is for someone to add a boolean flag:

  // Steve from ACME Shipping
  bool boundToACMEEveningRun;
                        
                        and hook it up to a button that sets it and logic that displays it with a yellow highlight. And yes, this means all that code will likely need to be thrown away next month (or rather, gated by a config flag so versioning/audit trail still works). Making it work out is what the dev is paid for.
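                        A minimal sketch of that kind of one-off hack (Python here; the `Package` record and the `row_style` hook are invented for illustration, the same idea as the flag above):

```python
# Hypothetical sketch of the mail-room hack described above; the Package
# record and row_style hook are invented names, not a real codebase.
from dataclasses import dataclass

YELLOW = "#ffeb3b"  # highlight colour for the packing-sheet UI

@dataclass
class Package:
    tracking_id: str
    # One-off flag for Steve's extra evening run; expected to be
    # deleted (or config-gated) when the Q3 arrangement ends.
    acme_evening_run: bool = False

def row_style(pkg: Package) -> dict:
    """Per-row style hook the sheet view already calls."""
    return {"background": YELLOW} if pkg.acme_evening_run else {}
```

                        Throwaway by design: when the evening run ends, the flag and the hook go with it.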

                • By HeyLaughingBoy 2025-10-31 22:29

                  An hour is a stretch, but otherwise, yeah.

              • By savolai 2025-10-31 7:28

                And yet, even ”knowing about the working of the business” is different from actually understanding user needs at the UI level, which involves a lot more variables.

                The single most valuable tool is user testing. However, it really takes quite a few rounds of actually creating a design and seeing how wrongly you judged the other person's capabilities to grok how powerful user testing is at revealing your own biases.

                And at its core it's not hard at all. The most important lesson really is a bit of humility: actually shutting up and observing what real users do when you don't intervene.

                Shameless plug, my intro to user testing: https://savolai.net/ux/the-why-and-the-how-usability-testing...

          • By GTP 2025-10-31 9:13 · 2 replies

            > Spinning up a proper project to replace this application isn't feasible in the short term, because there are processes around creating software in the organisation, for very good reasons learned painfully from old mistakes, and there just isn't time to go through that.

            I assume those processes weren't applied when the decision to use this application was made. Why? Was there a loophole because it was done by an intern?

            • By dspillett 2025-10-31 10:53

              Simple things, or even complex prototypes, get created in office apps because the office apps are there, have the flexibility, and you don't need to get IT to install something new (or allow you to), or convince finance to let you pay for something new, or convince compliance/security/others that the something new is safe, etc. Also, in a larger company, once the discussion of developing or buying something comes up, lots of other potential stakeholders might raise their heads above the parapet and want to get involved (“Could our dept use this too?”, “We would need it to do Y as well as X…”, “That sounds useful, but it should be us doing it instead”, etc.), and suddenly the quick PoC that has barely been started has become a series of interminable meetings.

              The loophole is that if you have Office or similar, you have a variety of development environments; IT/compliance/finance don't care what files you produce with the applications you already have, and no one else is paying attention initially either, though they would have a say (and a procedure for you to follow) if you wanted to bring in or create a new application. The usual process is bypassed.

              This is more commonly associated with Excel, but it applies to Access too (less so than it used to, but there are still plenty of people out there who rely on it daily).

              Once the demo/prototype/PoC is there it is a lot easier to “fix up” that than spin up a project in anything else, or get something else in that is already available, for the same reasons as why it was done in Excel/Access in the first place plus the added momentum: the job is already at least part way done, using something else would be a more complete restart so you need to justify that time as well as any other costs and risks.

              [Note: other office suites exist and have spreadsheets & simple DBs with similar capabilities, or at least a useful subset of them, of course, but MS Office's Excel & Access are, for better or worse, fairly ubiquitous]

            • By rob74 2025-10-31 10:46 · 2 replies

              Well, if someone (especially an intern) who is not in the IT department decides to write an application, it's pretty obvious that they are not familiar with (and therefore won't follow) the processes of the IT department. That's the problem with these more "democratic" development environments: if something is beginner-friendly, beginners will use it...

              • By jyounker 2025-10-31 14:52

                Another point of view is that the IT department isn't meeting the users' needs, and that they're bypassing IT because they just want to get their work done.

              • By GTP 2025-10-31 11:00 · 1 reply

                Yes, the intern will not follow the procedure, and likely isn't even technically required to do so. But, before the application becomes a tool actually used inside the company, there should be some quality control done.

                • By listenallyall 2025-10-31 12:38 · 1 reply

                  Think of the management structure which arranges, and is satisfied with, tedious, repetitive, manual paper-pushing processes - such that an INTERN can immediately see the efficiency benefits that would come with automation, and doesn't just suggest doing so, but actually builds a program (in limited intern timeframe), that is so helpful it's quickly picked up by multiple employees.

                  Then think again of those managers getting paid manager salaries who couldn't figure this out themselves - or worse, the ones who want to shut it all down because he didn't "follow the procedure" (the procedure of not doing anything useful???)

                  • By GTP 2025-10-31 16:43 · 1 reply

                    Shutting it down due to a technicality about not following a procedure != shutting it down because it is an unmaintainable mess, while also tasking IT with implementing the same core idea, but in a proper way.

                    • By listenallyall 2025-10-31 18:18 · 1 reply

                      Lol, define "proper". If they were so knowledgeable, why hadn't they implemented something before the intern arrived?

                      • By GTP 2025-10-31 22:44

                        There can be many reasons, e.g. management giving them other tasks.

          • By swader999 2025-10-31 11:27

            This legit triggered me. I'm thirty years in now, with only two failed projects. My first paid work was an Access project for a small business; it failed and I didn't get paid. Luckily I kept trying.

          • By makeitdouble 2025-10-31 4:29

            > The usual situation is that the business department hires someone with a modicum of talent or interest in tech

            This reminds me of the "just walk confidently into their office and ask for a job to get one!" advice. It sounded like bullshit to me until I spent time at parts of a previous company where the hiring process really wasn't far off that.

            That's also the kind of company where contracts and vendor choices get negotiated on golf courses, and the CEO's buddies might as well be running the place; it would make no difference.

            I feel for you.

          • By listenallyall 2025-10-31 12:29

            Imagine taking a shit on a technology platform which made it easy for interns with zero experience to successfully automate key aspects of business processes!

            Love the assumption "when it inevitably goes wrong." In real life, many of these applications work perfectly for years and assist employees tremendously. The program doesn't fail; the business changes - new products, locations, marketing, payment types, inventory systems, tons of potential things.

            And yes, after the original author is gone, nobody is left to update the program. Of course, a lot of programmers or IT folks probably could update it, but ew, why learn and write Access when we can create a new React app with microservices-based backend including Postgres in the cloud and spin up a Kubernetes cluster to run it.

        • By pjmlp 2025-10-31 8:25

          Delphi and Access are pretty much still around, even if they are seldom the subject of an HN front-page post.

      • By finghin 2025-10-30 16:31 · 1 reply

        It’s also about keeping things simple, hierarchical, and very predictable. These do not go hand in hand with the feature creep of collaborative FOSS projects, as others point out here.

        • By hombre_fatal 2025-10-31 1:45

          Good point. A good interface usually demands a unified end-to-end vision, and that usually comes from one person who has sat down to mull it over and make a bunch of good executive decisions.

          And then you need to implement that, which is never an easy task, and maintain the eternal vigilance to both adhere to the vision but also fit future changes into that vision (or vice versa).

          All of that is already hard to do when you're trying to build something. Only harder in a highly collaborative voluntary project where it's difficult or maybe even impossible to take that sort of ownership.

      • By LtWorf 2025-10-30 23:46 · 3 replies

        > Overall, the development world does not intuitively understand the difficulty of creating good interfaces

        Nor does the design world, for that matter. They think that slightly darker gray text on a gray background, in a tiny font with loads of empty space, is peak design. Meanwhile my father cannot use most websites because of this.

        • By DrewADesign 2025-10-31 2:35 · 2 replies

          The dozens of people I know that design interfaces professionally can probably recite more of the WCAG by heart than some of the people that created them. You’re assuming that things you think “look designed” were made by designers, rather than by people playing with the CSS of a template they found, trying to make things “look designed.” You’re almost certainly mistaken.

          • By eviks 2025-10-31 6:22 · 1 reply

            > can probably recite more of the WCAG by heart than some of the people that created them

            That's part of the problem, they'll defend their poorly visible choice by lawyering "but this meets the minimal recommended guideline of 2.7.9"

            • By DrewADesign 2025-10-31 20:23

              Find a validator and try to make a text color selection that meets WCAG guidelines yet doesn’t have contrast high enough to read perfectly easily. The criteria are not ambiguous, and they’re not scraping the visibility barrier.
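              For the curious, the check such a validator performs is precisely defined in WCAG itself (relative luminance plus a contrast ratio). A minimal Python sketch of that computation, with helper names of my own:

```python
def _linearize(channel: float) -> float:
    """Convert an sRGB channel (0..1) to linear light, per the WCAG formula."""
    return channel / 12.92 if channel <= 0.03928 else ((channel + 0.055) / 1.055) ** 2.4

def relative_luminance(r: int, g: int, b: int) -> float:
    """Relative luminance of an 8-bit sRGB color."""
    rl, gl, bl = (_linearize(c / 255) for c in (r, g, b))
    return 0.2126 * rl + 0.7152 * gl + 0.0722 * bl

def contrast_ratio(fg: tuple, bg: tuple) -> float:
    """WCAG contrast ratio, always >= 1. AA asks for 4.5:1 for body text."""
    l1, l2 = relative_luminance(*fg), relative_luminance(*bg)
    lighter, darker = max(l1, l2), min(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)

# Black on white is the maximum possible contrast, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # → 21.0
```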

          • By pseudalopex 2025-10-31 15:44 · 1 reply

            No. I worked with designers who designed low contrast and low density interfaces. I read articles written by designers. I used products of companies like Apple.

            • By DrewADesign 2025-10-31 20:25 · 1 reply

              Examples? Are they interface designers? Are they qualified? The existence of shitty designers is no more an indictment of the design field or designers than the existence of shitty developers is of development or developers.

        • By hn_acc1 2025-10-31 0:30 · 3 replies

          As I age, this x1000. Even the simple Slack app on my Windows laptop - clicking in the overview scroll bar is NOT "move down a page". It seems to be "the longer you click, the further it moves" or something equally disgusting. Usually I dock my laptop and use an external mouse with a wheel, and it's easy to do what I want. With a touchpad? Forget it. I'm clicking 20x to get it to move to the top - IF I can hit the 5-pixel-wide scrollbar. There's no easy way to increase scrollbar size anymore either.

          It's like dark patterns are the ONLY pattern these days.. WTF did we go wrong?

          • By BobbyTables2 2025-10-31 2:03

            Indeed.

            Win95 was peak UI design.

            I don’t understand modern trends.

          • By fragmede 2025-10-31 2:06 · 1 reply

            With a touchpad? Use two fingers to scroll (also works horizontally). Who's managing to hit a tiny scrollbar that disappears with a touchpad‽

            • By mjevans 2025-10-31 5:13 · 1 reply

              They just aren't as good at detecting real physical contact as a nice physical mouse is at responding to movement and pressure.

              • By fragmede 2025-10-31 8:49

                I mean, maybe, but the question wasn't which general pointing device is superior (trackball ftw if you ask me), but how to scroll using a trackpad without tearing your hair out.

          • By LtWorf 2025-10-31 7:56

            I created localslackirc to keep using IRC and not have to deal with slack :D

        • By BobbyTables2 2025-10-31 2:01 · 2 replies

          What pisses me off is that the “brutalist” style of the 1990s was arguably perfect. Having standardized persistent menus and meaningful, compact toolbars was nice.

          Then the world threw away the menus and adopted an idiotic “ribbon” that uses more screen real estate. Unsatisfied, we dumbed down desktop apps to look like mobile apps, even though input technology remains different.

          Websites also decided to avoid blue underlined text for links and be as nonstandard as possible.

          Frankly, developers did UI better before UI designers went off the deep end.

          • By zeroc8 2025-10-31 9:20

            The brutalist style also meant that I didn't need a UI designer for my applications. With Delphi I was able to create great apps in a matter of days. And users loved them, because they were so snappy and well thought out. Nowadays it seems I need a UI designer to accomplish just about anything. And the resulting apps might look better but are worse when you are actually trying to accomplish work using them.

          • By sjamaan 2025-10-31 5:48 · 1 reply

            I was ranting exactly the same just yesterday. Nowadays UI designers seem to have forgotten all about affordances. Back in the day you had drop shadows below buttons to indicate that they could be pressed, big chunky scrollbars with two lines on the handle to indicate "grippiness" etc.

            A few days ago I had trouble charging an electric rental car. When plugging it in, it kept saying "charging scheduled" on the dash, but I couldn't find out how to disable that and make it charge right away. The manual seemed to indicate it could only be done with an app (ugh, disgusting). Went back to the rental company, they made it charge and showed me a video of the screen where to do that. I asked "but how on earth do you get to that screen?". Turned out you could fucking swipe the tablet display to get to a different screen! There was absolutely no indication that this was possible, and the screen even implied that it was modal because there were icons at the bottom which changed the display of the screen.

            So you had: zero affordances, modal design on a specific tab, and the different modes showed different tabs at the top, further leading me to believe that this was all there was.

            • By LtWorf 2025-10-31 7:59

              I've had long discussions at work with our designer, who thinks that people on desktop computers should perform swipe actions with the mouse rather than the UI reacting to mouse scroll events.

              99% of the users are not using the mobile version.

      • By ozgrakkurt 2025-10-30 16:50 · 1 reply

        IMO they just don’t care enough. They want people to use it, but it’s not the end of the world if it stays niche.

        • By csin 2025-10-31 5:14 · 3 replies

          [flagged]

          • By chasd00 2025-10-31 13:27

            I think, statistically, it’s likely no one is a “hot chick” otherwise the phrase wouldn’t exist. I get what you’re saying though, personalities and natural talent/gifts pull people to a specialty that makes sense to them and therefore they get good at it.

          • By Imustaskforhelp 2025-10-31 5:59 · 1 reply

            > It's statistically very likely a hot chick does not know calculus.

            It would be honestly interesting if someone actually did a study regarding it.

            I do agree with this statement, but it isn't as if everybody else doesn't have opportunity costs too; people might have video games as hobbies, or just normal hobbies in general, which could also be considered opportunity costs.

            The more interesting question to me (and maybe I'm reading between the lines here) is whether society showers so much attention on beauty that it makes something like calculus, which can feel a lot more boring, less appealing by comparison.

            Generally speaking, I was reading the other day that female participation in STEM overall has declined in percentage terms, IIRC.

            Another factor could be expectations. Just as you assume this, many people assume it about "hot chicks", so if one is actually into calculus and says so, people respond with "oh wow, I didn't know that" or "really??", which signals that she's expected not to be that way: to be conventional and not have such interests.

            I have seen people get shocked in online communities if a girl is even learning programming or doing things like Hyprland (maybe myself included, as it was indeed rare).

            Naturally I would love it if more girls were into this. It hurts when I talk to girls about my hobbies and they aren't that interested, or we just don't have common hobbies; they can appreciate it, but I don't feel like I can tell them everything. I am not that deep a coder right now so much as a Linux tinkerer: building Linux ISOs from scratch, shell scripting, building some web services, etc. I like to tinker with software. Naturally the word used in unix/foss communities for this is "hacking", which should be the perfect way to describe what I mean, except people think I am talking about cybersecurity and want me to "hack something". Sorry about this rant, but I have stopped saying "hacking" just because of how strongly the general public associates it with cybersecurity; nowadays I just say that I love tinkering with software. Side note: is there a better word for what I am describing than hacking?

            • By csin 2025-10-31 7:20

              It sounds like you are a Linux UI designer.

              Which is a rare thing in this space. Linux is rough around the edges, to say the least. You don't need me telling you; we are in a thread about how open source software sucks at UI design. We could use more people like you in this space.

              The men aren't fussed with the "hacker" label. It sounds cool. It's like when people mistakenly think all Asians know Kung Fu or something. The Asian guy isn't complaining lol.

              There's definitely stigma/sexism that deter women away from this field. But I think opportunity cost is a factor, gravely overlooked.

              Society demands a lot from women, when it comes to appearance. The bar is set very high.

              So high, you don't have the time to be a good programmer AND pretty. Unless you won the genetic lottery.

              I follow women's basketball avidly. Some of the women are not pretty. They are just very good at basketball. It's refreshing to see women be valued, not just because of their beauty.

          • By imtringued 2025-10-31 9:52 · 1 reply

            I think you got everything backwards. I've seen a lot of people who are specialized in a non software domain learn programming and write their own projects. Very often these people know more about what they want to accomplish and work on than someone who has learned how to do software development properly, but has no clue about what software to develop.

            I was that type of person when I started working. I had a burning passion to work on an open source project, but no clue what exactly to work on. Meanwhile, at work, the project manager would give me a ticket and I'd execute it rapidly to everyone's satisfaction.

            • By csin 2025-10-31 10:08

              No I think we are on the same page.

              The people who specialized in non-software domain, and wrote their own projects, are amazingly talented.

              They are not specializing in 1 field. They are so smart, they've managed to specialize in 2 fields.

              I bet you they also suck at UI design. And wrote projects like Handbrake.

              It's totally understandable.

              I don't expect them to have the time to specialize in THREE fields. There is an opportunity cost to everything.

      • By dhosek 2025-10-30 20:23 · 1 reply

        In the 90s I did a tech writing gig documenting some custom software a company had built for them by one of the big consultancy agencies. It was a bit of a nightmare as the functionality was arranged in a way that reflected the underlying architecture of the program rather than the users’ workflows. Although I suppose if they’d written the software well, I wouldn’t have had as many billable hours writing documentation.

        • By sublinear 2025-10-30 22:40 · 2 replies

          > reflected the underlying architecture of the program rather than the users’ workflows

          Is this an inherently bad thing if the software architecture is closely aligned with the problem it solves?

          Maybe it's the architecture that was bad. Of course there are implementation details the user shouldn't care about and it's only sane to hide those. I'm curious how/why a user workflow would not be obviously composed of architectural features to even a casual user. Is it that the user interface was too granular or something else?

          I find that just naming things according to the behavior a layperson would expect can make all the difference. I say all this because it's equally confusing when the developer hides way too much. Those developers seem to lack experience outside their own domain and overcomplicate what could have just been named better.

          • By DrewADesign 2025-10-31 2:22

            Developers often don’t think it’s a bad thing because that’s how we think about software. Regular users think about applications as tools to solve a problem. Being confronted by implementation details is no problem for people with the base knowledge to understand why things are like that, but without that knowledge, it’s just a confusing mess.

          • By StrauXX 2025-10-30 23:01

            If you ever spend time with the low-level SAP GUIs, then yes, you will find out why that's definitely a bad thing. Software should reflect users' processes. The code below is just an implementation detail and should never impact the design of the interfaces.

      • By max51 2025-10-31 18:06 · 1 reply

        >Overall, the development world does not intuitively understand the difficulty of creating good interfaces

        I think it's because they are not using the product they are designing. A lot of problems you typically see in modern UIs would have been fixed before release if the people writing them had been forced to use them daily for their job.

        For example, dropdown menus with 95 elements and no search/filter function that are too small and only allow you to see 3 lines at a time.
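        The missing piece this comment describes is small; as a minimal sketch (names and sample data invented), a case-insensitive substring filter is all such a dropdown needs to shrink 95 entries down to the few that match what the user typed:

```python
def filter_options(options: list[str], query: str) -> list[str]:
    """Return the options whose text contains the query, ignoring case.

    An empty query leaves the full list visible, matching typical
    searchable-dropdown behavior.
    """
    q = query.strip().lower()
    return [o for o in options if q in o.lower()] if q else options

# Invented sample data standing in for a 95-entry dropdown.
countries = ["Austria", "Australia", "Austin (US office)", "Germany", "Ghana"]
print(filter_options(countries, "aus"))  # → ['Austria', 'Australia', 'Austin (US office)']
```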

        • By DrewADesign 2025-10-31 20:59

          There’s a real difference in usage style between developers and most other users. Having the background knowledge to understand what’s going on behind the curtain makes it easy to deal with things like interactive visual complexity, tons of data and moving parts at the same time, implementation warts, etc.

      • By hilong2 2025-10-31 1:08 · 3 replies

        One thing I still struggle with is writing interfaces for complex use cases in an intuitive and simple manner that minimizes required movements and context switching.

        Are there any good resources for developing good UX for necessarily complex use cases?

        • By tastyfreeze 2025-10-31 1:19

          I am writing scheduling software for an uncommon use case.

          The best method I have found is to use the interface and fix the parts that annoy me. After decades of games and internet I think we all know what good interfaces feel like. Smooth and seamless to get a particular job done. If it doesn't feel good to use it is going to cause problems with users.

          That said, I see the software they use on the sales side. People will learn complexity if they have to.

        • By angiolillo 2025-10-31 18:07 · 1 reply

          > any good resources for developing good UX for necessarily complex use cases?

          For teasing apart complex workflows I'd suggest Holtzblatt and Beyer's Contextual Design book, I taught a user-centered research and design class many years ago and used that as our textbook, hopefully it still holds up.

          For organizing complex applications I like to start with affinity diagrams, card sorts, and collaborative whiteboard sessions. And of course once you have a working prototype, spend as much time as possible quietly watching people interact with your software.

          • By DrewADesign 2025-10-31 21:05

            I haven’t read that book. Thanks for the rec

        • By DrewADesign 2025-10-31 2:08

          Honestly, it’s a really deep topic — for a while I majored in interface/interaction design in school— and getting good at it is like getting good at writing. It’s not like amateurs can’t write solid stories, but they probably don’t really understand the decisions they’re making and the factors involved, and success usually involves accidentally being influenced by the right combination of things at the right time.

          The toughest hurdle to overcome as a developer is not thinking about the gui as a thin client for the application, because to the user, the gui is the application. Developers intuitively keep state in their head and know what to look for in a complex field of information, and often get frustrated when not everything is visible all at once. Regular users are quite different— think about what problems people use your software to solve, think about the process they’d use to solve them, and break it down into a few primary phases or steps, and then consider everything they’d want to know or be able to do in each of those steps. Then, figure out how you’re going to give focus to those things… this could be as drastic as each step having its own screen, or as subtle as putting the cursor in a different field.

          Visually grouping things, by itself, is a whole thing. Important things to consider that are conceptually simple but difficult to really master are informational hierarchy and how to convey that through visual hierarchy, gestalt, implied lines, type hierarchy, thematic grouping (all buttons that initiate a certain type of action, for example, might have rounded corners.)

          You want to communicate the state of whatever process, what’s required to move forward and how the user can make that happen, and avoid unintentionally communicating things that are unhelpful. For example, putting a bunch of buttons on the same vertical axis might look nice, but it could imply a relationship that doesn’t exist. That sort of thing.

          A book that helps get you into the designing mindset even if it isn’t directly related to interface design is Don Norman’s The Design of Everyday Things. People criticize it like it’s an academic tome — don’t take it so seriously. It shows a way of critically thinking about things from the user’s perspective, and that’s the most important part of design.

      • By zahlman 2025-10-30 19:12 · 2 replies

        > I think that can give developers a mistaken impression that other peoples work is far less complex than it is.

        Not at all. Talented human artists still impress me as doing the same level of deep "wizardry" that programmers are stereotyped with.

        • By DrewADesign 2025-10-31 2:27 · 2 replies

          That might be the case for you, but something doesn’t need to be universally true for it to be true enough to matter. Find any thread about AI art around here and check out how many people have open contempt for artists’ skills. I remember the t-shirts I saw a few sys admins wearing in the nineties that said “stop bothering me or I’ll replace you with a short shell script.” In the decades I worked in tech, I never saw that attitude wane. I saw a thread here within the past year or two where one guy said he couldn’t take medical doctors and auto mechanics seriously because they lacked the superior troubleshooting skills of a software developer. Seriously. That’s obviously not everybody, but it’s deeefinitely a thing.

          • By lukan 2025-10-31 8:35

            I believe it starts from low self-esteem. Then they find their way into computers, where they really do have above-average skills, and perhaps really did observe that some people's jobs could be automated with a shell script. So: lots of ungrounded ego all of a sudden, and in that new guru state they extrapolate from such isolated cases to everywhere.

            I also remember the hostility of my university's informal IT chat groups. Newbs were insulted for not knowing basic stuff instead of being helped. A truly confident person does not feel the need to do that. (And it was amazing having a couple of such people writing very helpful responses in the middle of all the insulting garbage.)

          • By RobotToaster 2025-10-31 13:34

            > Find any thread about AI art around here and check out how many people have open contempt for artists’ skills.

            I don't think that's entirely true; what I usually see is people who think AI art is just as good as the work of many artists.

            You can be impressed by something and still think a machine can do it just as well. People that can do complex mental arithmetic are impressive, even if that skill is mostly obsolete by calculators.

        • By cenamus 2025-10-30 19:21 · 1 reply

          Trust me, there are enough people here that believe that.

          Other engineering disciplines are simpler because you can only have complexity in three dimensions, while in software, complexity is everywhere.

          Crazy to believe that

          • By analog31 2025-10-30 22:56 · 2 replies

            There are many more than three "dimensions" if I may use the term loosely, in software or hardware engineering.

            Cost, safety, interaction between subsystems (developed by different engineering disciplines), tolerances, supply chain, manufacturing, reliability, the laws of physics, possibly chemistry and environmental interactions, regulatory, investor forgiveness, etc.

            Traditional engineering also doesn't have the option of throwing arbitrary levels of complexity at a problem, which means working within tight constraints.

            I'm not an engineer myself, but a scientist working for a company that makes measurement equipment. It wouldn't be fair for me to say that any engineering discipline is more challenging, since I'm in none of them. I've observed engineering projects for roughly 3 decades.

            • By cenamus 2025-10-31 13:29

              I think the poster literally meant x, y and z in terms of dimension

            • By swader999 2025-10-31 11:31 · 1 reply

              The top three are typically quality, budget and time. Usually a compromise is needed on one of these. You could replace quality with features in SW dev.

              • By nradov 2025-10-31 20:27

                Scope is more of a dimension than quality. In theory it might be possible to accept lower quality in order to cut the budget or accelerate the time but in practice that doesn't seem to work. Any significant reduction in quality usually causes the project to grind to a halt after the prototype stage. You just can't build any new features when everything is broken.

    • By dayvid 2025-10-30 16:15 · 4 replies

      The contributors to free software tend to be power users who want to ensure their own use case works. I don't think they're investing a lot of thought into the 80/20 use case for normal/majority users, or would risk hurting their own workflow to make it easier for others.

      • By BinaryIgor 2025-10-30 18:45

        True; that's why we have companies with paid products who devote a lot of their time - arguably the majority - to making the exact interfaces people want and understand :) It's a ton, a ton of difficult work, for which there is little to no incentive in the free software ecosystem.

      • By psunavy03 2025-10-30 19:16 · 3 replies

        And this is precisely why desktop Linux has not knocked off Windows or MacOS.

        • By ripdog 2025-10-30 19:42

          I'd argue that's more because the average person has no interest in installing a new OS, or even any idea what an OS is.

          Most people just keep the default. When the default is Linux (say, the Steam Deck), most people just keep Linux.

        • By bigfishrunning 2025-10-3019:231 reply

          And that's fine. Those users who want something that's not like desktop Linux have plenty of options.

          • By ghaff 2025-10-3019:351 reply

            And increasingly it doesn't matter because they just live in a browser anyway.

            • By thinkmassive 2025-10-3020:411 reply

              Which also makes it easier than ever for more users to run Linux as a desktop OS :)

              • By ghaff 2025-10-3021:21

                Absolutely. I still prefer MacOS/Mac hardware in some ways but running a browser on Linux on a Thinkpad or whatever works pretty well for a lot of purposes.

        • By valyala 2025-10-3021:322 reply

          Omarchy tries to resolve this https://github.com/basecamp/omarchy

          • By a96 2025-10-319:251 reply

            Dear reader, please make sure you look up whose project this is and why it's spammed everywhere.

          • By int_19h 2025-10-3111:03

            I clicked on the link, and the first thing I see is a screenshot where half of the screen is taken by terminal with TUI apps in it. There's no "Install" button, and the "Download" one is labelled "ISO".

            Yeah, no, that isn't it.

      • By port11 2025-10-3117:58

        After reading so many apologist comments dismissing the article's points with whataboutism, yours is the first comment that I think addresses the situation properly. As a developer, it's very hard not to miss the forest for the trees, which is why I'm usually very happy to work with a good UX researcher.

      • By zeroq 2025-10-3016:256 reply

        > contributors of free software tend to be power users

        or, simply put, nerds

        it takes a different background, approach, and skillset to design UX and interfaces

        if anything, FOSS should figure out how to attract skilled artists so the majority of designs and logos don't look so blatantly amateurish.

        • By WD-42 2025-10-3016:374 reply

          My guess is that, as has always been the case, the pool of people willing to code for free on their own time because it's fun is just much larger than the pool of people willing to make icons for software projects on their own time because they think it's fun.

          • By ChrisMarshallNY 2025-10-3017:483 reply

            Graphic designers and artists get ripped off, all the time; frequently, by nerds, who tend to do so, in a manner that insults the value of the artist's work.

            It's difficult to get those kinds of creatives to donate their time (trust me on this, I'm always trying).

            I'm an ex-artist, and I'm a nerd. I can definitively say that creating good designs is at least as difficult as creating good software, but seldom makes the kind of margin that you can, from software, so misappropriation hurts artists a lot more than programmers.

            • By renewiltord 2025-10-3018:165 reply

              Most fields just don’t have the same culture of collaborative everyone-wins that software does. Artists don’t produce CC art at anywhere near the scale or influence with which engineers produce software. This is probably due to some kind of compounding effect available in software that isn’t available in graphics.

              Software people love writing software to a degree where they’ll just give it away. You just won’t find artists doing the same at the same scale. Or architects, or structural engineers. Maybe the closest are some boat designs but even those are accidental.

              It might just be that we were lucky to have some Stallmans in this field early.

              • By pfannkuchen 2025-10-3021:442 reply

                Isn’t there a lot more compensation available in software? Like as a developer, you can make a lot of money without having to even value money highly. I think in other fields you don’t generally get compensated well unless you are gunning/grinding for it specifically. “For the love of the art” people in visual arts are painters or something like that, probably. Whereas with software you can end up with people who don’t value money that much and have enough already, at least to take a break from paid work or to not devote as much effort to their paid work. I imagine a lot of open source people are in that position?

                • By prepend 2025-10-3022:221 reply

                  I think most OSS projects are started by unemployed people as hobbies. Or ego projects to get jobs.

                  • By ChrisMarshallNY 2025-10-3115:28

                    I'm unemployed (retired), and it's a hobby (that I take quite seriously).

                    Ego is likely involved. I love my babies, but what others think of my work isn't that important (which is good, because others aren't very impressed).

                    I make tools that I use, mostly.

                • By renewiltord 2025-10-3021:56

                  Well, early '90s Torvalds wasn't the wealthy fellow he is now and he was busy churning things out and then relicensed Linux under GPL.

              • By bitwize 2025-10-3018:34

                Fonts are an interesting case. The field of typography is kind of migrating from the "fuck you, pay me" ethic of the pure design space into a more software-like "everyone wins" state, with plenty of high-quality open-source fonts available, whereas previously we had to make do with bitmap-font droppings from proprietary operating systems, Bitstream Vera, and illegal-to-redistribute copies of Microsoft's web font pack.

                I think this is because there are plenty of software nerds with an interest in typography who want to see more free fonts available.

              • By WD-42 2025-10-3018:361 reply

                I think the collaborative nature of open source software dev is unlike anything else. I can upload some software in hopes that others find it useful and can build on top of it, or send back improvements.

                Not sure how that happens with a painting, even a digital one.

                • By ChrisMarshallNY 2025-10-3115:48

                  Art tends to be a solitary vocation.

                  But professional graphic designers train to work in product-focused teams. They are also able to create collaborative suites of deliverables.

                  Most developers will find utility in the work of graphic designers, as opposed to fine artists.

              • By kmeisthax 2025-10-3022:241 reply

                There's actually a fair bit of highly influential CC-licensed artwork out there. Wikipedia made a whole free encyclopedia. The SCP Foundation wiki is its own subculture. There's loads of Free Culture photography on Wikimedia Commons (itself mirrored from Flickr). A good chunk of your YouTube feed is probably using Kevin MacLeod music - and probably fucking up the attribution strings, too. A lot of artists don't really understand copyright.

                But more importantly, most of them don't really care beyond "oh copyright's the thing that lets me sue big company man[0]".

                The real impediment to CC-licensed creative works is that creativity resists standardization. The reason why we have https://xkcd.com/2347/ is because software wants to be standardized; it's not really a creative work no matter what CONTU says. You can have an OS kernel project's development funded entirely off the back of people who need "this thing but a little different". You can't do the same for creativity, because the vast majority of creative works are one-and-done. You make it, you sell it, and it's done. Maybe you make sequels, or prequels, or spinoffs, but all of those are going to be entirely new stories maybe using some of the same characters or settings.

                [0] Which itself is legally ignorant because the cost of maintaining a lawsuit against a legal behemoth is huge even if you're entirely in the right

                • By renewiltord 2025-10-315:55

                  I like this explanation, though there is one form of creative standardization: brand identity. And I suppose that's where graphics folk engage with software (Plasma, the GNOME design, etc.). Amusingly, I like contributing to Wikipedia and the Commons so I should have thought of that. You're absolutely right that I had a blind spot there in terms of what's the equivalent there of free software.

                  Another thing is that the vast amount of fan fiction out there has a hub-and-spoke model forming an S_n graph around the solitary 'original work' and there are community norms around not 'appropriating' characters and so on, but you're right that community works like the SCP Foundation definitely show that software-like property of remixing of open work.

                  Anyway, all to say I liked your comment very much but couldn't reply because you seem to have been accidentally hellbanned some short while ago. All of your comments are pretty good, so I reached out to the HN guys and they fixed it up (and confirmed it was a false positive). If you haven't seen people engage with what you're saying, it was a technical issue not a quality issue, so I hope you'll keep posting because this is stuff I like reading on HN. And if you have a blog with an RSS feed or something, it would be cool to see it on your profile.

              • By oddmiral 2025-11-016:51

                Users are trying to solve their own problems.

                Graphic artists are creating graphics editors (Gimp, Krita, Blender, ComfyUI, etc.) with tons of options.

            • By some_furry 2025-10-3018:112 reply

              This is a weird thread for me to read, as someone who a) works primarily with developer tooling (and not even GUI tooling, I write cryptography stuff usually!), b) is very active in a vibrant community of artists that care about nerd software projects.

              I don't, as a rule, ever ask artists to contribute for free, but I still occasionally get gifted art from kind folks. (I'm more than happy to commission them for one-off work.)

              Artists tragically undercharge for their labor, so I don't think the goal should be "coax them into contributing for $0" so much as "coax them into becoming an available and reliable talent pool for your community at an agreeable rate". If they're enthusiastic enough, some might do free work from time to time, but that shouldn't be the expectation.

              • By galagawinkle489 2025-10-3019:114 reply

                Why should they work for pay on free software? Nobody expects to be paid to work on the software itself. Yet artists expect to be treated differently.

                If it is your job, then go do it as a job. But we all have jobs. Free software is what we do in our free time. Artists don't seem to have this distinction. They expect to be paid to do a hobby.

                • By ChrisMarshallNY 2025-10-3020:313 reply

                  Doing a pro graphic design treatment is a lot more than just "drawing a few pictures," and picking a color palette.

                  It usually involves developing a design language for the app, or sometimes, for the whole organization (if, like the one I do a lot of work for, it's really all about one app). That's a big deal.

                  Logo design is also a much more difficult task than people think. A good logo can be insanely valuable. The one we use for the app I've done a lot of work on, was a quick "one-off," by a guy who ended up running design for a major software house. It was a princely gift.

                  • By Dylan16807 2025-10-3021:021 reply

                    > Doing a pro graphic design treatment is a lot more than just "drawing a few pictures," and picking a color palette.

                    Are you quoting someone? Yeah it's a real job, and so is programming. I don't think anyone in this conversation is being dismissive about either job.

                    • By ChrisMarshallNY 2025-10-3022:231 reply

                      You'd be surprised, then, to know that a lot of programmers think graphic design is easy (see the other comment, in this thread), and can often be quite dismissive of the vocation.

                      As a programmer, working with a good graphic designer can be very frustrating, as they can demand that I make changes that seem ridiculous, to me, but, after the product ships, makes all the difference. I've never actually gotten used to it.

                      That's also why it's so difficult to get a "full monty" treatment, from a designer, donating their time.

                      • By Dylan16807 2025-10-3022:271 reply

                        > see the other comment

                        Which other comment?

                        If you mean the one saying it's not harder than programming, that's not calling it easy.

                        • By ChrisMarshallNY 2025-10-3022:302 reply

                          It can be a lot harder. Programming, these days, isn't always that hard.

                          Very different skillset. There was a comment about how ghastly a lot of software-developed graphical assets can be.

                          Tasteful creativity does not grow on trees.

                          • By Dylan16807 2025-10-3022:33

                            "can be" makes it a very different statement. Either one "can be" a lot harder than the other, depending on the task. The statement above is about typical difficulty.

                            And even if they're wrong about which one is typically harder, they weren't saying it was easy, and weren't saying it was significantly easier than programming.

                          • By galagawinkle489 2025-10-3111:391 reply

                            Programming well requires taste and creativity. A different type, but no less rare than taste and creativity in "arty" fields.

                            • By ChrisMarshallNY 2025-10-3111:48

                              Exactly. It's amazing how we, as programmers, can demand that others recognize that, for us, but we, ourselves, refuse to give the same respect, in regards to other fields.

                              The same can be said for any vocation that generates a product. An expertly-crafted duck decoy can have the same level of experience and skill, as a database abstraction.

                              I have had the privilege to work with some of the top creatives, as well as scientists and engineers, in the world, and have seen the difference.

                  • By aleph_minus_one 2025-10-3115:271 reply

                    > It usually involves developing a design language for the app, or sometimes, for the whole organization (if, like the one I do a lot of work for, it's really all about one app). That's a big deal.

                    > Logo design is also a much more difficult task than people think. A good logo can be insanely valuable. The one we use for the app I've done a lot of work on, was a quick "one-off," by a guy who ended up running design for a major software house. It was a princely gift.

                    A lot of developers also tend to invest quite an insane amount of work into their preferred open-source project and they do know how complicated their work is, and also how insane the value is that they provide for free.

                    So, where is the difference?

                  • By prepend 2025-10-3021:52

                    Programming is a big deal too.

                    It’s not like graphic design is harder than programming.

                    I’d rather have crappy graphics than pay designers instead of programmers for free oss.

                • By nemomarx 2025-10-3019:20

                  It's just more common for artists to do small commission work on the side of a real job. 30 dollars for something is basically a donation or tip in my view, and the community can crowd fund for it the same way bug bounties work I think?

                • By some_furry 2025-10-3019:461 reply

                  > Yet artists expect to be treated differently.

                  Because it's a different job!

                  Your post is like asking, "Why is breathing free but food costs money?"

                  • By Dylan16807 2025-10-3020:251 reply

                    Either you're implying that people should code for free, or your analogy is so vague as to be useless.

                    Yeah it's a different job but they're both jobs. Why should one be free and one not be free?

                    • By some_furry 2025-10-3020:512 reply

                      Because programmers consent to programming for free. That fact does not, in any way, obligate anyone else to.

                      • By Dylan16807 2025-10-3020:583 reply

                        The question/skepticism is why the programmers are consenting to this but not the artists.

                        • By allenu 2025-10-3021:401 reply

                          I suspect some of this is due to the fact that the programmers consenting to do free work already have well-paying jobs, so they have the freedom and time to pursue coding as a hobby for fun as well. Graphic designers and UX designers are already having a hard time getting hired for their specific skills and getting paid well for it, so I imagine it's insulting to be asked to do it for free on top of that.

                          That said, I don't think it's as simple as that. Coding is a kind of puzzle-solving that's very self-reinforcing and addictive for a certain type of person. Coders can't help plugging away at a problem even if they're not at the computer. Drawing, on the other hand, requires a lot more drudgery to get good, for most people anyway, and likely isn't as addictive.

                          • By crq-yml 2025-10-313:55

                            I believe it's more nuanced than that. Artists, like programmers, aren't uniformly trained or skilled. An enterprise CRUD developer asks different questions and proposes different answers compared to an embedded systems dev or a compiler engineer.

                            Visual art is millennia older and has found many more niches, so, besides there being a very clear history and sensibility for what is actually fundamental vs industry smoke and mirrors, for every artist you encounter, the likelihood that their goals and interests happen to coincide with "improve the experience of this software" is proportionately lower than in development roles. Calling it drudgery isn't accurate because artists do get the bug for solving repetitive drawing problems and sinking hours into rendering out little details, but the basic motive for it is also likely to be "draw my OCs kissing", with no context of collaboration with anyone else or building a particular career path. The intersection between personal motives and commerce filters a lot of people out of the art pool, and the particular motives of software filters them a second time. The artists with leftover free time may use it for personal indulgences.

                            Conversely, it's implicit that if you're employed as a developer, that there is someone else that you are talking to who depends on your code and its precise operation, and the job itself is collaborative, with many hands potentially touching the same code and every aspect of it discussed to death. You want to solve a certain issue that hasn't yet been tackled, so you write the first attempt. Then someone else comes along and tries to improve on it. And because of that, the shape of the work and how you approach it remains similar across many kinds of roles, even as the technical details shift. As a result, you end up with a healthy amount of free-time software that is made to a professional standard simply because someone wanted a thing solved so they picked up a hammer.

                        • By melagonster 2025-10-3117:25

                          Open source/free software communities are made up of programmers. People love to help their communities. Sometimes a community contains some artists, but that's rare. E.g., Inkscape shows a nice picture when the user opens it.

                        • By some_furry 2025-10-3021:331 reply

                          Why aren't programmers drawing furry porn?

                          It's really not deep.

                          • By Dylan16807 2025-10-3021:40

                            I dispute that claim but it doesn't answer the question. When you have multiple people involved in the community of an open source project, what makes them decide where to contribute, and what makes them decide if they'll use marketable skills for free or not? I think it's an interesting thing to look into.

                      • By prepend 2025-10-3022:07

                        Wouldn’t designers consent to designing for free?

                        This seems like a self selection problem. It’s not about forcing people to work for free. It’s about finding designers willing to work for free (just like everyone else on the project).

                • By cwillu 2025-10-3020:05

                  You know that (some) people get paid to work on free software, right?

              • By ChrisMarshallNY 2025-10-3018:141 reply

                It’s a long story, in my case.

                There’s a very good reason for me to be asking for gratis work. I regularly do tens of thousands of dollars’ worth of work for free.

                • By imtringued 2025-10-3110:091 reply

                  That only works if you form a team with the artist. It doesn't work when the person you're commissioning free stuff from is an external artist who is getting flooded by both paid and unpaid requests. Even a token amount will let them know to prioritize you over the freeloaders.

                  • By ChrisMarshallNY 2025-10-3111:03

                    This is true. I have paid friends free market rates for work; even though they would have done it for free.

                    It’s a matter of Respect. It’s really amazing, how treating folks with simple Respect can change everything.

                    I like working in teams, but I also participate in an organization, where we’re all expected to roll up our sleeves, and pitch in; often in an ad hoc fashion.

            • By palata 2025-10-319:461 reply

              Just for the record: as a developer, I have done a ton of free software contributions. I pretty much didn't get anything from it, except people complaining or asking me to do even more for them, for free.

              I don't know if that qualifies as "getting ripped off", but it's not exactly paying me either.

              • By ChrisMarshallNY 2025-10-3110:361 reply

                I can relate, but artists get treated even worse. It seems to be a thing with creatives. Musicians also get ripped off a lot, as do writers.

                Developers seem to have a product that people can actually attach a value to, but art and music; not so much. They seem to be in different Venn circles.

                In all of it, we do stuff because of the love of the craft. One of the deeper satisfactions, for me, is when folks appreciate my work (payment is almost irrelevant; except for "keeping score"). It's pretty infuriating, to have someone treat my work as if it is a cheap commodity. There's a famous Star Trek scene, where Scotty and his crew are being disciplined for a bar fight with some Klingons[0], and Scotty throws the first punch. I can relate.

                [0] https://www.youtube.com/watch?v=5rsZfcz3h1s

                • By pseudalopex 2025-10-3116:081 reply

                  > Developers seem to have a product that people can actually attach a value to, but art and music; not so much.

                  This says more of your perception I think. Many people attach value to art and music. Many people do not attach value to software.

          • By zer00eyz 2025-10-3017:001 reply

            UI != icons.

            UI and UX are, for all intents and purposes, lost arts. No one is sitting on the other side of a two-way mirror any more, watching people use their app...

            This is how we get UIs that work but suck to use. This is how we allow dark patterns to flourish. You can and will happily do things your users/customers hate if it makes a dent in the bottom line and you don't have to face their criticisms directly.

            • By lamontcg 2025-10-3017:363 reply

              > UI and UX are, for all intents and purposes, lost arts. No one is sitting on the other side of a two-way mirror any more, watching people use their app...

              Which is also why UI/UX on open source projects are generally going to suck.

              There's certainly no money to pay for that kind of experiment.

              And if you include telemetry, people lose their goddamn minds, assuming the open source author isn't morally against it to begin with.

              The result is you're just getting the author's intuitive guesswork about UI/UX design, by someone who is likely more of a coder than a design person.

              • By cwillu 2025-10-3021:53

                The dependency on telemetry instead of actually sitting down with a user and watching them use your software is part of the problem. No amount of screen captures, heatmaps or abandoned workflow metrics will show you the expression on a person's face.

              • By Dylan16807 2025-10-3020:30

                Unless you get super invasive, telemetry will tell you how often a feature is used but I don't think it'll help much with bad and confusing layouts.

              • By imtringued 2025-10-3110:18

                You actually skipped over the most important part:

                > You can and will happily do things your users/customers hate if ... you don't have to face their criticisms directly.

                A lot of software developers can't take criticism well when it comes to their pet projects. The entire FreeCAD community, for instance, is based entirely around the idea that FreeCAD is fine and the people criticising it are wrong and have an axe to grind, when that is exactly backwards.

          • By ambicapter 2025-10-3016:501 reply

            Much larger, but not non-existent: people post their work (including laborious stuff like icon suites and themes) on art forums and websites for no gain all the time.

            • By keyringlight 2025-10-3017:25

              Going back to the winxp days there was a fairly vibrant group of people making unofficial themes for it, although I think that was helped by the existence of tools (from Stardock?) specialized on that task and making it approachable if your skill set didn't align perfectly.

          • By oddmiral 2025-11-016:48

            Earlier free software had tons of free artwork: icons, themes, skins, mods. I had a few dozen window-border skins and GUI themes for GNOME 1.x, for example. gnome-look.org has 1.5k themes for GTK3/4, but I'm OK with the stock theme now, because 90% of the time I'm looking at a browser or a console.

        • By array_key_first 2025-10-3023:121 reply

          They're not just nerds, they're power users. These are different things.

          Pretty much everyone is a power user of SOME software. That might be Excel, that might be their payroll processor, that might be their employee data platform. Because you have to be if you work a normal desk job.

          If Excel was simpler and had an intuitive UI, it would be worthless. Because simple UI works for the first 100 hours, maybe. Then it's actively an obstacle because you need to do eccentric shit as fast as possible and you can't.

          Then, that's where the keyboard shortcuts and 100 buttons shoved on a page somewhere come in. That's where the lack of whitespace comes in. Those aren't downsides anymore.

          • By csin 2025-10-317:57

            "If Excel was simpler and had an intuitive UI, it would be worthless."

            Excel is a simple intuitive UI.

            I use 10% of Excel. I don't even know 90% of what it's capable of.

            It hides away its complexity.

            For people that need the complex stuff, they can access it via menus/formulas.

            For the rest of us, we don't even know it's there.

            Whereas, Handbrake shoves all the complexity in your face. It's overwhelming for first time users.
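For contrast, the "one button" the article advocates really is small: under the hood, "make this video normal" is just a fixed set of encoder defaults. A minimal sketch in Python (assuming ffmpeg is installed; the flag values are illustrative defaults of mine, not Magicbrake's actual settings):

```python
import subprocess

def normalize(src: str, dst: str = "output.mp4") -> list[str]:
    """Build a one-size-fits-most ffmpeg command: H.264 video and
    AAC audio in an MP4 container that plays almost anywhere."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "libx264", "-preset", "fast", "-crf", "23",  # small, widely supported video
        "-c:a", "aac", "-b:a", "128k",                       # widely supported audio
        "-movflags", "+faststart",                           # playable before fully downloaded
        dst,
    ]

# The entire "UI" is one call on one file:
# subprocess.run(normalize("weird_video.mkv"), check=True)
```

The whole design question then reduces to how (or whether) to expose anything beyond that one call.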

        • By Panzer04 2025-10-310:10

          The person who is going to bother adding stuff to a piece of software is almost certainly by definition a power-user.

          This means they want to add features they couldn't get anywhere else, and already know how to use the existing UI. Onboarding new users is just not their problem or something they care about - They are interested in their own utility, because they aren't getting paid to care about someone else's.

          It's not a "nerd" thing.

        • By 8note 2025-10-3018:06

          UX and interface designers are also nerds.

          i think the bigger issue is that power users' use cases are different from non-power users'. not a skillset problem, but an incentive one

        • By phendrenad2 2025-10-3016:361 reply

          I'm optimistic that the rise of vibe coding will allow the people who understand the user's wants and needs to fix the world's FOSS UIs.

          • By moring 2025-10-318:47

            I'm sceptical about fixing (in the sense of a lasting solution), but it might be a very powerful tool to communicate to devs what the UI should look like.

        • By DrewADesign 2025-10-3016:312 reply

          I have been beating this drum for many years. There are some big cultural rifts and workflow difficulties. Unless FOSS products are run by project managers rather than either developers or designers, it’s a tough nut. Last I looked, gimp has been really tackling this effort more aggressively than most.

          • By graemep 2025-10-3017:041 reply

            I am not convinced bad UI is either a FOSS issue, or solved by having project managers. I know very non-tech people who struggle with Windows 11, for example. I do not like MS Office on the rare occasions I have used it on other people's machines. Not that impressed by the way most browser UIs are going either.

            • By DrewADesign 2025-10-312:19

              Microsoft has been lagging on interface design for a long time. If the project managers are focused on forcing users into monetizable paths against their will, then of course you’re going to get crap interfaces and crap software quality. If you have a project manager that’s focused on directing people to solve problems for users rather than people just bolting on whatever makes sense, then that’s a lot different. And no, bad UIs aren’t inherent to FOSS— look at Firefox, Blender, Signal… all FOSS projects that are managed by people focused on integrating the most important features in a way that makes sense for the ecosystem.

          • By Cotterzz 2025-10-3017:531 reply

            gimp has been my go-to when I want to explain bad UI, developer-designed UI, or just typical FOSS UI. I'm glad they're fixing it. It's also my image editor of choice.

            • By DrewADesign 2025-10-312:12

              Yeah I’ve been using it as a go-to example for the wrongest approach to UI design for years. I’m glad to see they’re working harder than most to fix some of the underlying problems.

    • By duxup 2025-10-3018:097 reply

      It always amazes me how even just regular every day users will come to me with something like this:

      Overly simplified example:

      "Can you make this button do X?" where the existing button in so many ways is only distantly connected to X. And then they get stuck on the idea that THAT button has to be where the thing happens, and they stick with it even if you explain that the usual function of that button is Y.

      I simplified it by saying "button", but this applies to processes and other things. I think users sometimes think picking a common thing (a button or process that sort of does what they want) is the right entry point to discuss changes, and maybe they think that somehow saves time / developer effort. In reality, a new button is in fact an easier and less risky place to start.

      I didn't say that very well, but I wonder if that plays a part in the endless adding of complexity to UI where users grasp onto a given button, function, or process and "just" want to alter it a little ... and it never ends until it all breaks down.

      • By graybeardhacker 2025-10-30 20:12 (2 replies)

        I always tell clients (or users): "If you bring your car to the mechanic because it's making a noise and tell them to replace the belt, they will replace the belt and your car will still make the noise. Ask them to fix the noise."

        In other words, if you need expert help, trust the expert. Ask for what you need, not how to do it.

        • By nerdponx 2025-10-30 20:49 (2 replies)

          If you tell the mechanic "my car is making a noise, fix the belt please" and then they just fix the belt, that's on the mechanic as well.

          • By duxup 2025-10-30 21:32

            I would hope the mechanic would engage with the customer in more back and forth.

            But sometimes power structures don't allow for it. I worked tech support in a number of companies. At some companies we were empowered to investigate and solve problems... sometimes that took work, and work from the customer. It had much better outcomes for the customer, but fixes were not quick. Customers / companies with good technical staff in management understood that dynamic.

            Other companies were "just fix it" shops, and management and customers alike treated tech support as annoying drones. They got a lot more "you got exactly what you asked for" treatment ... because management and even customers will take the self-defeating quick and easy path sometimes.

          • By estimator7292 2025-10-30 22:32

            It's a hypothetical to communicate an entirely different point. The mechanic isn't real or important.

        • By rkunal 2025-10-30 23:14 (2 replies)

          It is a common misconception that the "expert" knows best. The expert can be a trainee, may be motivated to make more money for their organisation, or may have yet to encounter your problem.

          On the other hand, if you have been using your car for a decade and feel it needs a new belt, then get a new belt. Worst case scenario: you lose some money but learn a bit more about an item you use every day.

          Experts don't have your instincts as a user.

          • By bigglywiggler 2025-10-31 0:18

            I am a qualified mechanic. I no longer work in the field but I did for many years. Typically, when people 'trust their instincts as a user' they are fantastically wrong. Off by a mile. They have little to no idea how a car works besides youtube videos and forum posts which are full of inaccuracies or outright nonsense and they don't want to pay for diagnosis.

            So when they would come in asking for a specific part to be replaced with no context I used to tell them that we wouldn't do that until we did a diagnosis. This is because if we did do as they asked and, like in most cases, it turned out that they were wrong they would then become indignant and ask why we didn't do diagnosis for free to tell them that they were wrong.

            Diagnosis takes time and, therefore, costs money. If the user was capable of it then they would also be capable enough to carry out the repair. If they're capable of carrying out the diagnosis and the repair then they wouldn't be coming to me. This has proved to be true over many years for everyone from kids with their first car to accountants and even electrical engineers working on complex systems for large corporations as their occupation. That last one is particularly surprising considering that an engineer should know the bounds of their knowledge and understand how maintenance, diagnosis and repair work on a conceptual level.

            Don't trust your instincts in areas where you have no understanding. Either learn and gain the understanding or accept that paying an expert is part of owning something that requires maintenance and repair.

          • By hobs 2025-10-31 1:08 (2 replies)

            If you don't trust the expert then why are you asking them to fix your stuff? It's a weird idea that you'd want an idiot to do what you say because you know better.

            • By aidenn0 2025-10-31 4:47

              In this case, it's at least partly because the expert has access to a lift...

            • By duxup 2025-10-31 12:32 (1 reply)

              If they're asking the mechanic to do X, and they understand the mechanic is just doing X and NOT venturing to fix their problem, I guess that is fine.

              I agree though it sets up a weird dynamic where folks might come back to the expert and complain a problem isn't fixed, but that's not what they asked for / they broke the typical expert and customer dynamic.

              • By hobs 2025-10-31 13:02 (1 reply)

                In my experience the best thing is to then convince the expert you are right, using your own expertise.

                If your mechanic is too stupid to recognize the problem after you explain it then you don't have a mechanic, the set of hands you are directing is basically unskilled labor.

                • By duxup 2025-10-31 13:18 (1 reply)

                  How do you know you're right if you haven't fixed it?

                  That seems like a way to just be a stuck in the mud and wrong the whole time.

                  • By hobs 2025-10-31 14:37 (1 reply)

                    Exclude the impossible, and what is left, however improbable, must be the truth.

                    • By duxup 2025-10-31 16:43

                      If we're still talking about an amateur doing it, isn't their understanding of the possibilities pretty limited?

      • By jaredhallen 2025-10-31 0:18

        I think what you're driving at can be more generalized as users bringing solutions when it would be more productive for them to bring problems. This is something I focus on pretty seriously in IT. The tricky part is to get the message across without coming across as unhelpful, arrogant, or obstructive. It often helps to ask them to describe what they're trying to achieve, or what they need. But however you approach the discussion, it must come across as a sincere desire to help.

      • By dmd 2025-10-30 18:24 (1 reply)

        You are describing a form of the XY problem. https://en.wikipedia.org/wiki/XY_problem

        • By duxup 2025-10-30 19:42

          I think you are likely correct, thank you.

      • By nerdponx 2025-10-30 19:30 (3 replies)

        Don't fall into the trap of responding to the user's request to do Y a certain way. They are asking you to implement Y, and they think they know how it should be implemented, but really they would be happy with Y no matter how you did it. https://xyproblem.info/

        • By LegionMammal978 2025-10-30 20:15 (3 replies)

          On the other hand, I've not uncommonly seen this idea misused: Alice asks for Y, Bob says that it's an XY problem and that Alice really wants to solve a more general problem X with solution Z, Alice says that Z doesn't work for her due to some detail of her problem, Bob browbeats Alice over "If you think Z won't work, then you're wrong, end of story", and everyone argues back and forth over Z instead of coming up with a working solution.

          Sometimes the best solution is not the most widely-encouraged one.

          • By rcxdude 2025-10-31 12:19

            Yes, often an issue on stackoverflow. It's one of the reasons why it can be frustrating to use as you get more experienced: if an expert is at the point of asking on stackoverflow they're probably doing something at least a little bit unusual! But people who answer on stackoverflow mostly see questions from less experienced people and so default to operating in that mode.

            I generally try to answer the Y but also indicate that it suggests there may be an X that could be better achieved some other way, and mention Z if I'm reasonably confident in what X is. It might increase the chance that the person asking just does Y anyway even if Z would be better, but frankly that's not really my business.

          • By nerdponx 2025-10-30 20:51 (1 reply)

            Bob saying "you should use Z, end of story" is just as hardheaded and unhelpful as Bob saying "X doesn't do that, end of story".

            • By dwaltrip 2025-10-31 14:56

              Unfortunately, still quite common. The ego is quite the tricky one.

          • By imtringued 2025-10-31 10:24

            I've seen this too. Explicitly talking about something being an XY problem is a red flag, because the goal is usually to dismiss you with a canned answer that doesn't help you.

            The point of XY problems isn't to call people out on supposedly bad behaviour, it's to push them in the right direction and provide more context.

        • By exasperaited 2025-10-30 20:51

          I think the XY problem thing is likely very common. But developers are tending to use the term in a very dismissive, superior way now.

        • By duxup 2025-10-30 19:44

          Yeah I often will ask for a quick phone call and try to work from the top down, or the bottom up depending on the client. Getting to the thing we're solving often leads to a different problem description and later different button or concept altogether.

          Sometimes it's just me firing up some SQL queries and discovering "Well this happened 3 times ... ever ..." and we do nothing ;)
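          A check like the one above can be sketched with Python's sqlite3 module. The `events` table, feature names, and counts here are hypothetical, invented purely for illustration:

```python
import sqlite3

# Hypothetical usage log: one row per recorded use of a feature.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (feature TEXT, used_at TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [
        ("export_pdf", "2023-04-02"),
        ("export_pdf", "2024-01-17"),
        ("export_pdf", "2025-06-01"),
        ("crop", "2025-06-02"),
    ],
)

# How often has the feature behind the requested change actually been used?
(count,) = conn.execute(
    "SELECT COUNT(*) FROM events WHERE feature = ?", ("export_pdf",)
).fetchone()
print(count)  # 3 -- "this happened 3 times ... ever"
```

          If the count comes back that low, doing nothing may well be the right call.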

      • By uticus 2025-10-30 18:22

        In my experience, this is a communication issue, not a logical or technical or philosophical issue. Nor the result of a fixation caused by an idea out of the blue.

        In my experience it may be solved by both parties spending the effort and time to first understand what is being asked... assuming they are both willing to stomach the costs. Sometimes it isn't worth it, and it's easier to pacify than respectfully and carefully dig.

      • By amatecha 2025-10-30 21:34

        Yeah, I've had now a couple decades of experience dealing with this, and my typical strat is to "step back" from the requested change, find out what the bigger goal is, and usually I will immediately come up with a completely different solution to fulfill their goal(s). Usually involving things they hadn't even thought about, because they were so focused on that one little thing. When looking at the bigger picture, suddenly you realize the project contains many relevant pieces that must be adjusted to reach the intended goals.

      • By rcxdude 2025-10-31 12:17

        In general, the advice I've heard is that users are absolutely going to be right about when there's a problem (which you ignore at your peril); they can usually identify where the problem is, but they are terrible at coming up with ways to fix it.

    • By cosmic_cheese 2025-10-30 17:23

      It's my belief that much of this flavor of UI/UX degradation can be avoided by employing a simple but criminally underutilized idea in the software world (FOSS portion included), which is feature freezing.

      That is, either determine the optimal feature set from the outset, design around it, and freeze, or organically reach the optimum and then freeze. After implementing the target feature set, nearly all engineering resources are dedicated to bug fixes and efficiency improvements. New features can be added only after passing through a rigorous gauntlet of reviews that determine whether the value of the feature is worth the inherent disruption and the impact on stability and resource consumption, and if so, it is integrated into the existing UI holistically (as opposed to the usual careless bolt-on approach).

      Naturally, there are some types of software where requirements are too fast-moving for this to be practical, but I would hazard a guess that it would work for the overwhelming majority of use cases, which have been solved problems for a decade or more and where the required level of flux is in reality extremely low.

    • By vayup 2025-10-30 19:27 (1 reply)

      Spot on. Defending simplicity takes a lot of energy and commitment. It is not sexy. It is a thankless job. But doing it well takes a lot of skill, skill that is often disparaged by many communities as "political nonsense"[1]. It is not a surprise that the free software world has this problem.

      But it is not a uniquely free software world problem. It is there in the industry as well. But the marketplace serves as a reality check, and kills egregious cases.

      [1] Granted, "political nonsense" is a dual-purpose skill. In our context, it can be used both for "defending simplicity" and for "resisting meaningful progress". It's not easy to tell the difference.

      • By jacobr1 2025-10-30 19:45 (1 reply)

        The cycle repeats frequently in industry. New waves of startups address a problem with better UX, and maybe some other details like increased automation and speed from more modern architectures. But feature creep eventually makes the UX cumbersome, and the complexity makes it hard to migrate to new paradigms, or at least to do so without a ton of baggage, so they in turn are displaced by new startups.

        • By gmueckl 2025-10-30 21:38

          If the last part was true, Autodesk and Adobe would have had to go under a decade ago.

    • By rcxdude 2025-10-31 12:31

      Yes. I think the crux of good interface design is coming up with a model that is simple and flexible enough to be understood by the user but also allow them to achieve their goals by composing operations and options, as opposed to special-casing each possible use-case. This allows you to address the needs of a large number of users without drowning in complexity, but it's really hard to come up with the right model and in general if there's something that will make you unpopular with your users it's changing around the UI after they've gotten used to it, so you don't really get to evolve this as the product develops.

      (Commercial software is far from immune to this as well: professional tools like CAD are notoriously arcane and often have a huge number of special-purpose features, and they're not incentivised to improve their UI model because it would alienate their existing users, as well as not show up on the feature lists which are often used to drive purchasing decisions)

    • By patrakov 2025-10-31 13:55 (1 reply)

      While working for one of the previous companies, I hit a regrettable counterexample for the point in the article.

      Developers built a web UI for creating containers for the labs, taking the advice from this (then future) article too literally. Their app could only build containers, in the approved way. Yet not all labs could run in containers, and the app did not account for that (it was a TODO). Worse, the people responsible for organizing the labs did not know that not all labs are compatible with containers.

      Lab coordinators thus continued to create containers even in cases where it didn't make sense, despite the explicit warning "in cases X, Y, Z, do not proceed, call Alexander instead".

      So if you make one button, you had better make sure it is always the right button. People follow the happy-but-wrong path way too easily if there is no other obvious one.

      • By nemomarx 2025-10-31 13:58 (1 reply)

        Having to read a label and go out of the tool to do something else is basically impossible UX, yeah. You'll never get users to do that, and little inline warnings also won't work unless you block the buttons at the same time, I think.

        In this example I wonder if the tool was too "MVP" and they didn't evaluate what minimum viable would mean for the users?

        • By patrakov 2025-10-31 14:05

          In this case, the product owner had a wrong idea of what's minimum viable, and his idea was faithfully implemented, plus a warning in the app to call me in specific incompatible cases.

          Later, the missing pieces were added; we had "two buttons" and the resulting user confusion, because users did not know and could not be taught whether a container made sense for a particular lab.

    • By PaulDavisThe1st 2025-10-30 16:35 (2 replies)

      Good points, but to add to the sources of instability ... a first time user of a piece of software may be very appreciative of its simplicity and "intuitiveness". However, if it is a tool that they spend a lot of time with and is connected to a potentially complex workflow, it won't be long before even they are asking for "this little extra thing".

      It is hard to overestimate the difference between creating tools for people who use the tools for hours every day and creating tools for people who use tools once a week or less.

      • By SoftTalker 2025-10-30 16:52 (2 replies)

        Right. For most people, gimp is not only overkill but also overwhelming. It's hard to intuit how to perform even fairly simple tasks. But for someone who needs it it's worth learning.

        The casual user just wants a tool to crop screenshots and maybe draw simple shapes/lines/arrows. But once they do that they start to think of more advanced things and the simple tool starts to be seen as limiting.

        • By LiquidSky 2025-10-30 17:51 (1 reply)

          But the linked article addresses that. They're not advocating for removing the full-feature UI, they just advise having a simple version that does the one thing (or couple of things) most users want in a simple way. Users who want to do more can just use the full version.

          • By PaulDavisThe1st 2025-10-30 18:21 (3 replies)

            Users don't want "to do more". They want to do "that one extra thing". Going from the "novice" version to the "full version" just to get that one extra thing is a real problem for a lot of people. But how do you address this as a software designer?

            • By devilbunny 2025-10-30 19:55 (1 reply)

              I'm not a coder, so I'm not going to pretend that this solution is easy to implement (it might be, but I wouldn't assume so), but how about allowing you to expose the "expert" options just temporarily (to find the tool you need) and then allow adding that to your new "novice plus" custom menus? I.e., if you use a menu option from the expert menu X number of times, it just shows up even though your default is the novice view.
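              The scheme devilbunny describes fits in a few lines. This is a minimal sketch under assumed names; the menu items and the promotion threshold (devilbunny's "X", arbitrarily set to 3 here) are invented for illustration:

```python
from collections import Counter

PROMOTION_THRESHOLD = 3  # assumed value of "X"

class AdaptiveMenu:
    """Novice menu that promotes expert items once they're used often enough."""

    def __init__(self, novice_items, expert_items):
        self.novice_items = list(novice_items)
        self.expert_items = list(expert_items)
        self.usage = Counter()  # per-item use counts

    def use(self, item):
        self.usage[item] += 1

    def visible_items(self):
        # Novice items always show; expert items appear after repeated use.
        promoted = [i for i in self.expert_items
                    if self.usage[i] >= PROMOTION_THRESHOLD]
        return self.novice_items + promoted

menu = AdaptiveMenu(["Crop", "Resize"], ["Curves", "Channel Mixer"])
for _ in range(3):
    menu.use("Curves")  # user keeps reaching into the expert menu
print(menu.visible_items())  # ['Crop', 'Resize', 'Curves']
```

              The persistence, UI wiring, and a way to demote items that fall out of use are the parts a real implementation would still need.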

              • By cestith 2025-10-31 18:00 (1 reply)

                That’s harder than static menus, but it’s not really anything harder once you have customizable hot list menus.

                • By devilbunny 2025-11-01 0:56

                  It seemed that way to me but I have done enough work with computers (I am on HN, after all) to know that things people in general think should be easy often are not, and things they think are hard may be simple. Thanks.

            • By sjamaan 2025-10-31 5:59

              I don't know if this works well in general, but for example Kodi has "basic", "advanced" and several progressively more advanced steps in between for most of its menus. It hides lots of details that are irrelevant to the majority of users.
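              A sketch of that pattern, with made-up setting names and levels only loosely modeled on Kodi's (the real settings and their level assignments differ):

```python
# Assumed ordering of disclosure levels, from most to least restrictive view.
LEVELS = ["basic", "standard", "advanced", "expert"]

# Hypothetical settings, each tagged with the minimum level that reveals it.
SETTINGS = [
    ("Subtitle size", "basic"),
    ("Audio passthrough", "advanced"),
    ("Cache buffer mode", "expert"),
]

def visible_settings(user_level):
    """Show only the settings at or below the user's chosen level."""
    cutoff = LEVELS.index(user_level)
    return [name for name, lvl in SETTINGS if LEVELS.index(lvl) <= cutoff]

print(visible_settings("basic"))     # ['Subtitle size']
print(visible_settings("advanced"))  # ['Subtitle size', 'Audio passthrough']
```

              The nice property is that one codebase serves both audiences: details irrelevant to most users stay hidden until they opt in.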

            • By LiquidSky 2025-10-30 18:26

              Progressive disclosure? If you know your audience, you probably know what most people want, and then the usual next step up for that "one extra thing". You could start with the ultra-simple basic thing, then have an option to enable the "next step feature". If needed you could have progressive options up to the full version.

        • By thaumasiotes 2025-10-30 17:01

          > The casual user just wants a tool to crop screenshots and maybe draw simple shapes/lines/arrows. But once they do that they start to think of more advanced things and the simple tool starts to be seen as limiting.

          Silksong Daily News went from videos of a voiceover saying "There has been no news for today" over a static image background to (sometimes) being scripted stop-motion videos.

      • By galagawinkle489 2025-10-30 19:14 (1 reply)

        And why exactly should free software prioritise someone's first five minutes (or first 100 hours, even) over the rest of the thousands of hours they might spend with it?

        I see people using DAWs, even "pro" ones made by companies presumably interested in their bottom lines. In all cases I have no idea how to use it.

        Do I complain about intuitiveness etc? Of course not. I don't know how to do something. That's my problem. Not theirs.

        • By Qem 2025-10-30 20:39 (2 replies)

          > And why exactly should free software prioritise someone's first five minutes (or first 100 hours, even) over the rest of the thousands of hours they might spend with it?

          Well, if people fail at that first five minutes, the subsequent thousand hours most often never happens.

          • By array_key_first 2025-10-30 23:15 (1 reply)

            The inverse is also true. If you prioritize the first five minutes, your software is worthless in any industry that matters.

            And that's why designers are using Photoshop and not Microsoft paint.

            • By csin 2025-10-31 8:34 (2 replies)

              See, I feel this is where programmers just don't "get" good UI design.

              Photoshop is good UI design. A normie can use photoshop the same way they use MS paint.

              Albeit it just loads slower.

              A normie doesn't need all the bells and whistles. They can just use photoshop like a glorified MS paint.

              You can't do that with GIMP. It's actually really fucking annoying if you try to use GIMP to do an MS Paint job.

              • By array_key_first 2025-10-31 20:41 (1 reply)

                > Photoshop is good UI design. A normie can use photoshop the same way they use MS paint.

                This is just straight up not true. You're only saying this because you, presumably, have used Photoshop.

                It has a million buttons, layers are a thing, there's a million tools, etc. No, they can't just pick it up because it's complex software for a complex problem domain.

                Maybe you disagree. Okay. Pick a different example. 3D Max? Why aren't studios using Microsoft Paint 3D instead of 3D max?

                • By csin 2025-11-01 6:56

                  "It has a million buttons, layers are a thing, there's a million tools, etc. No, they can't just pick it up because it's complex software for a complex problem domain."

                  See this is the thing that software devs don't "get" about UI design.

                  It's the exact thing the original author is trying to communicate.

                  You CAN have a powerful tool. And still have it be user friendly for normies!

                  You hide away its complexities. So it's not INTIMIDATING for new users.

                  You know what. I'm going to reinstall gimp. Just to prove my point.

                  Let's compare photoshop with gimp.

                  Before I begin, let me preface. Modern photoshop is an enshitified piece of garbage. I would never use it.

                  But this is nothing to do with enshitification. That's a whole different thing.

                  Ok let's start:

                  - I grab a random image from imgur. Copy paste. Ctrl-V. Both apps passed the test. I was a little worried gimp couldn't even do this.

                  - On load this is what photoshop looks like: https://imgur.com/a/3uYsm2h

                  - On load this is what gimp looks like: https://imgur.com/a/DnPcRTc

                  First impressions:

                  - GIMP is ugly as fuck. It looks outdated. There's information overload on the left side. Too much shit happening. Too much text squashed together. INTIMIDATING.

                  - In contrast, photoshop has a more minimalist look. There is a "Layers" window on the right. New users don't need to touch it.

                  - There is a "Size & Position" window. This is key. Notice how there's only 3 things inside that window. Notice how it's not squashed with all the other shit on the left. Think about that. Why did the designer do this? Because those 3 things are what 90% of normies are looking to do.

                  - This is exactly what the original author was talking about, with the TV remote. The most common operations should be sectioned off at the top of the remote. Similarly, the most common operations in photo editing should be sectioned off, in clear view.

                  Ok, Step 2. Let's try and crop this image. A common operation:

                  - Photoshop. Click the crop button. Shows you a bit more complexity in its settings. You don't have to touch it. It gives you a helpful grid UI: https://imgur.com/a/tLjL6en

                  - And then it has a blue "Done" button at the bottom. Finished easy.

                  - GIMP. We start with a brush by default??? Whoops I accidentally drew on the picture. I didn't want to do that. Thank god I know ctrl-Z.

                  - So it's that cross thing right? That's the move button. Nope that's not what I want to do :(

                  - It must be the one next to it. The rectangle. Ok, some random corner thingies appear in the corners. I click on one of the corners. The image gets split into two. But now what? WTF do I do now: https://imgur.com/a/f7TTHJs

                  I can go on and on and on and on, criticizing gimp's terrible UI design. I hope, the little I have demonstrated, is a tease into what UI design is really about.

              • By galagawinkle489 2025-10-31 11:37 (1 reply)

                Clearly this is not true. Photoshop is difficult to use. I have opened it and tried to use it many times. Its UI is super complicated. There are endless buttons and I have no idea how to do anything.

                There are heaps of Photoshop tutorials on YouTube, which wouldn't be necessary if what you said were true.

                I used GIMP to do MS paint stuff years ago when I used it fairly regularly.

                GIMP is always a whipping boy for UI design on forums like this and I think it is pretty unfair. It is a pretty good program comparatively. If you want to see bad UI design a much better example is something like Visual Studio. What a mess.

                • By wolvesechoes 2025-10-31 12:31 (2 replies)

                  > If you want to see bad UI design a much better example is something like Visual Studio. What a mess.

                  Yeah, big button "Create project" and another, albeit smaller, button for "Run" puts a really high bar for the user to jump over.

                  Nothing as good as plain old cc followed by a bunch of cryptic flags.

                  • By array_key_first 2025-10-31 20:42

                    The main problem is that neither of those buttons make it clear what the fuck they're actually doing and also they don't work.

                    Download a random solution. Will the run button work? I highly doubt it.

                  • By galagawinkle489 2025-10-31 21:24

                    I'm talking about the user interface, not how difficult it is to click a button to do basic actions.

                    VS is incredibly cluttered. Too many buttons, cryptic icons and entirely unclear how to do things in the options.

                    I'm not comparing it to cc but to be fair that is actually well documented and hasn't changed its user interface in decades.

          • By goodpoint 2025-10-31 10:03

            Those people will not use software designed for professional use; they will move to something else.

    • By aidenn0 2025-10-31 4:44 (1 reply)

      Not only is it hard to figure out the use-case, but the correct use-case will change over time. If this were made in the iPod touch era, it would probably make 240p files for maximum compatibility. That's ... probably the wrong setting for today.

      • By SwtCyber 2025-10-31 12:18

        Simplicity has an expiration date if it's too rigid

    • By miki123211 2025-10-31 8:42 (1 reply)

      To design a good user interface, you need a feedback loop that tells you how people actually use your software. That feedback loop should be as painless for the user as possible.

      Having people to man a 1-800 number is one way to get that feedback loop. Professional user testing is another. Telemetry / analytics / user tracking, or even being able to pull out statistics from a database on your server, is yet another. Professional software usually has at least two of these, sometimes all four. Free software usually has none.

      There are still FLOSS developers out there who think that an English-only channel on Libera.chat (because Discord is for the uneducated n00bs who don't know what's good for them) is a good way to communicate with their users.

      What developers want from software isn't what end users want from software. Take Linux for example. A lot of things on Linux can only be done in the terminal, but the people who are able to fix this problem don't actually need it to be fixed. This is why OSS works so well for dev tools.

      • By user205738 2025-10-31 21:55

        Those who have been using Linux for a long time have very useful terminals with syntax highlighting, auto-completion, typo correction and many other bells and whistles.

        In addition, muscle memory has been developed and there is experience.

        They don't realize that newbies don't have any of this, and it's very inconvenient to type commands in the terminal without it. Newbies may not be able to copy and paste commands the way they are used to (right mouse button, or Ctrl-V).

        The terminal in linux makes people hysterical and angry.

        The problem could be partially solved by adding auto-completion and auto-correction to the default ISO.

    • By apitman 2025-10-30 22:33

      I suspect in the short term users are going to start solving this more and more by asking ChatGPT how to make their video work on their phone, and it telling them step by step how to do it.

      Longer term I wonder if complex apps with lots of features might integrate AI in such a way that users can ask it to generate a UI matching their needs. Some will only need a single button, some will need more.

    • By uticus 2025-10-30 18:28

      > It takes a single strong-willed defender, or some sort of onerous management structure...

      I'd say it's even more than you've stated. Not only for defending an existing project, but even for getting a project going in the first place a dictator* is needed.

      I'm willing to be proven wrong, and I know this flies in the face of common scrum-team-everybody-owns approaches.

      * benevolent or otherwise

    • By SwtCyber 2025-10-31 12:16

      Simplicity isn't just a design challenge, it's a discipline problem

    • By abustamam 2025-10-31 15:48

      I could see the case for having multiple front-ends to do exactly one thing.

      In the case of handbrake, I'd just see how I personally use it. Am I doing one thing 99% of the time? Maybe others are too. Let's silo that workflow.

    • By mschuster91 2025-10-30 18:33

      > to prevent the interface from quickly devolving back into the million options

      Microsoft for a loooong time had that figured out pretty well:

      - The stuff that people needed every day and liked to customize the most was directly reachable: right-clicking the desktop offered a shortcut to the CPL for display settings and desktop icons.

      - More detailed stuff? A CPL that could be reached from the System Settings.

      - Stuff that was low level but still needed to be exposed somewhat? msconfig.

      - Stuff that you'd need to touch very rarely, but absolutely needed the option to customize it for entire fleets? Group Policy.

      - Really REALLY exotic stuff? Registry only.

      In the end it all was Registry under the hood, but there were so many options to access these registry keys depending what level of user you were. Nowadays? It's a fucking nightmare, the last truly decent Windows was 7, 10 is "barely acceptable" in my eyes and Windows 11 can go and die in a fire.

    • By m463 2025-10-31 20:20

      I think it is like simplified wikipedia. It is possible, but not mainstream enough to get good coverage.

      https://simple.wikipedia.org/wiki/Main_Page

      It would be interesting to have a "simplified linux", or "linux for kids" or similar.

    • By Cotterzz 2025-10-30 17:58

      It does point to a possibly better solution, though: give the user a list of simple, common use case options, or access to the full interface.

      I do feel quite strongly that this should be implemented in the app though.

      There must be examples of this approach already being used?

    • By cellular 2025-10-30 22:45

      This is why I developed GatorCAM for CNC.

      FreeCAD is too complicated. There are too many ways to accomplish the same task (never mind that only certain ways work).

      So everything is simple, and there is only one way to create gcode. No hidden menus. No hidden state.

    • By alistairSH 2025-10-31 13:37

      Eh, not sure I agree.

      Taking the Handbrake example, providing a default "simple" interface (as Magicbrake does) would be trivial to implement, maintain, and defend. The existing default "super user" interface could be just a toggle away (and make the toggle sticky so a power user doesn't have to touch it but once).

      I used to work with an engineer who loved to remind us that a "perfect" interface would have a single button, and that button should be delivered pre-pushed. Always seemed like wise words to me.

    • By rekabis 2025-11-01 6:30

      > figuring out what that use case is difficult.

      Which is why observability is so damn important.

      Observability allows you to grok what your users in aggregate are doing, and adapt your product accordingly. You can take the lower-40% of features and squirrel them away off the main UI. You can take the lowest-10% of features and stick them in a special tools panel that needs to be explicitly hunted down and enabled. You can carve the UI up into three different levels - focused, simple, expert - that hide certain functionality and features, and explicitly expose others in certain ways, and allow the user to switch between them at will.

      There are just so many ways that this particular cat can be skinned; you just need to collect the information on how the users are actually using the product, and to get permission in ways that encourage your users to participate.

      Because without that data, you’re just stabbing in the dark and hoping you aren’t stabbing yourself in the foot. Or worse -- metaphorically ripping open your entire femoral artery by alienating the majority of your users.
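
      As a sketch of what that aggregation can look like once you have the data: a toy Python function that buckets features by observed use (the tier names and the 60/30/10 split mirror the percentages above; everything here is illustrative, not any real product's telemetry):

      ```python
      def tier_features(usage_counts: dict[str, int]) -> dict[str, str]:
          """Bucket features by observed use: the top 60% stay on the main UI,
          the rest of the lower 40% move off it, and the bottom 10% go behind
          an explicit tools panel."""
          # Rank features from most-used to least-used.
          ranked = sorted(usage_counts, key=usage_counts.get, reverse=True)
          n = len(ranked)
          tiers = {}
          for rank, feature in enumerate(ranked, start=1):
              frac = rank / n  # fraction of features at or above this rank
              if frac <= 0.6:
                  tiers[feature] = "main"
              elif frac <= 0.9:
                  tiers[feature] = "secondary"
              else:
                  tiers[feature] = "tools-panel"
          return tiers
      ```

      The same ranking could of course feed the focused/simple/expert scheme instead; the point is only that the bucketing is trivial once the counts exist.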

    • By iwontberude 2025-10-31 16:36

      It’s “the tyranny of the marginal user” as Ivan Vendrov coined it

  • By gspencley 2025-10-30 17:30 (9 replies)

    A lot of this type of stuff boils down to what you're used to.

    My wife is not particularly tech savvy. She is a Linux user, however. When we started a new business, we needed certain applications that only run on Windows and since she would be at the brick and mortar location full time, I figured we could multi-purpose a new laptop for her and have her switch to Windows.

    She hated it and begged for us to get a dedicated Windows laptop for that stuff so she could go back to Linux.

    Some of you might suggest that she has me for tech support, which is true, but I can't actually remember the last time she asked me to troubleshoot something for her with her laptop. The occasions that do come to mind are usually hardware failure related.

    Obviously the thing about generalizations is that they're never going to fit all individuals uniformly. My wife might be an edge case. But she feels at home using Linux, as it's what she's used to ... and strongly loathed using Windows when it was offered to her.

    I feel that kind of way about Mac vs PC as well. I am a lifelong PC user, and also a "power user." I have extremely particular preferences when it comes to my UI and keyboard mappings and fonts and windowing features. When I was forced to use a Mac for work, I honestly considered looking for a different position because it was just that painful for me. Nothing wrong with Mac OS X, a lot of people love it. But I was 10% as productive on it when compared to what I'm used to... and I'm "old dog" enough that it was just too much change to be able to bear and work with.

    • By singhrac 2025-10-30 18:08 (3 replies)

      One summer in middle school our family computer failed. We bought a new motherboard from Microcenter but it didn’t come with a Windows license, so I proposed we just try Ubuntu for a while.

      My mom had no trouble adjusting to it. It was all just computer to her in some ways.

      • By trenchpilgrim 2025-10-30 20:07

        Same, my mom ran Linux for years in the Vista days cuz her PC was too slow for Windows. She was fine. She even preferred LibreOffice over the Office ribbon interface.

      • By theandrewbailey 2025-10-31 0:36 (2 replies)

        Sometime around 2012, Windows XP started having issues on my parents' PC, so I installed Xubuntu on it (my preferred distro at the time). I told them that "it works like Windows", showed them how to check email, browse the web, play solitaire, and shut down. Even the random HP printer + scanner they had worked great! I went back home 2 states away, and expected a call from them to "put it back to what it was", but it never happened. (The closest was Mom wondering why solitaire (the gnome-games version) was different; I then guided her on how to change the game type to Klondike.)

        If "it [Xubuntu] works like Windows" offended you, I'd like to point out that normies don't care about how operating system kernels are designed. You're part of the problem this simplified Handbrake UI tries to solve. Normies care about things like a start menu, and that the X in the corner closes programs. The interface is paramount for non-technical users.

        I currently work in the refurb division of an e-waste recycling company.[0] Most everyone else there installs Ubuntu on laptops (we don't have the license to sell things with Windows), and I started to initially, but an error always appeared on boot. Consider unpacking it and turning it on for the first time, and an error immediately appears: would you wonder if what you just bought is already broken? I eventually settled on Linux Mint with the OEM install option.

        [0] https://www.ebay.com/str/evolutionecycling

        • By patrakov 2025-10-31 14:01

          For one of my relatives, it also never happened. I installed Linux on their laptop that was having issues and explained how to browse the web and use some apps.

          They always answered me "it works well".

          But what I found during my next visit is a paper with a telephone number of computer helpers, and the laptop was running a fresh copy of Windows, presumably installed by these helpers.

        • By BolexNOLA 2025-10-31 2:13

          Mint is definitely what I recommend to people who hate windows now but are nervous about swapping to Linux. Bazzite if they’re gamers.

      • By mvdtnz 2025-10-31 4:55 (1 reply)

        [flagged]

        • By BolexNOLA 2025-10-31 17:28

          Try looking at this another way: people who are tech savvy may be more likely to have parents who are also tech savvy when compared to the average person.

          If we don’t buy that theory: There are also a lot of people who visit and comment on this site, meaning there are tons of people who have parents who have not successfully switched over to Linux. The ones who have had success are the ones speaking up, which is currently in the single digits - nothing outlandish about that.

          This is no different than somebody talking about a 35mm film camera and a bunch of people jumping in with their experience with 35mm film cameras. Are you as critical/skeptical of those conversations as well? You shouldn’t be and I would be surprised if so! So the logic is basically the same.

          For the record my parents do not run Linux. I could maybe vaguely see my mom getting a handle on it, but unlikely and definitely not unless she made some big commitment to do it. However, I do have a friend whose mom is a gamer using a Linux laptop. This stuff does happen!

    • By abustamam 2025-10-31 15:54

      I grew up with Windows. I used it until after college, when I was gifted a MacBook Pro for dev work. I used Parallels to keep a Windows box close by (I think I only ever used up to Vista), but then eventually I just went full Mac and got used to the interface and key bindings. It's a fine interface, not amazing, not terrible, but I can get around. And every job I've had basically required a Mac anyway.

      Years later, I built a gaming machine so obviously I needed Windows. Got Win10 and eventually upgraded to 11 and it's just so jarring how unusable it is.

      In older Windows I could click on My Computer to see all my drives and files. Now I have no idea where that is, so I just click on the little folder icon at the bottom, which opens (I think) my home directory; then I have to click somewhere else to see my C and D drives. I can probably make a desktop shortcut or something, but the point is that it's unintuitive. And PowerShell is not a great terminal, and I haven't found a good one for Windows.

      So, after learning that gaming works well on Linux, I recently switched to Ubuntu, and I haven't looked back. Gaming, AI workflows, everything I need works just perfectly, and if it doesn't I can easily customize things to work the way I want it to. I'm not treated as a criminal for installing software on my computer. It's awesome.

    • By cosmic_cheese 2025-10-30 17:36 (5 replies)

      Familiarity is massively undersold in the Linux desktop adoption discussion. Having desktop environments that are near 1:1 clones of the commercial platforms (preferably paired with a distribution that's designed to be bulletproof and practically never requires its user to fire up a terminal window) would go so far for making Linux viable for users sitting in the middle of the bell curve of technical capability.

      It's one of those situations where "close enough" isn't. The fine details matter.

      • By array_key_first 2025-10-30 23:19 (1 reply)

        The main problem with this is that the commercial offerings are pretty much just bad.

        Windows isn't the way it is because of some purposeful design or anything. No, it's decades of poor decisions after poor decisions. Nothing, and I do mean nothing, is intuitive on Windows. It's familiar! But it is not intuitive.

        If you conform to what these commercial offerings do, you are actively making your software worse. On purpose. You're actively programming in baggage from 25 years ago... in your greenfield project.

        • By Nathanba 2025-10-31 1:34

          I don't even think that it remained very familiar, aside from a taskbar (which also changed in Win11) and the fact that there are desktop icons when you install things via double-clicking (double-click installing also optionally changed with the Microsoft Store, and MSI installers are almost entirely gone these days; totally different UIs pop up now). Even core things that people definitely use, like the uninstallation and settings UIs, have changed completely for the worse. Windows has also changed a lot of its core UI over the years: the taskbar, the clock, the start menu, etc. I guess one thing you could say is that it was a gradual change over many versions, but every time, people hate it. Really, what Linux should have done is what Windows has done with WSL: offer a built-in compatibility layer so that you can install Windows apps on Linux, perhaps prompting you to enter a Windows license, and then launching those apps in a VM, even per window/app.

      • By BolexNOLA 2025-10-31 2:15

        > Familiarity is massively undersold in the Linux desktop adoption discussion

        Totally agree. My first distro was Elementary because it was sold to me as Mac-like. It’s…sort of that, but it was enough for me to stick with it and now I’ve tried 3 other distros! Elementary is still in place in my n150 server. Bazzite for my big gaming machine. Messed with Mint briefly, wasn’t for me but I appreciated what it was.

        Familiarity is so important.

      • By LtWorf 2025-10-31 6:54

        Lol, have you not noticed how every version of windows moves everything and the users are no longer able to do anything?

      • By zahlman 2025-10-30 19:14 (1 reply)

        What do you see as wrong or missing "fine details" in, say, Cinnamon?

        • By cosmic_cheese 2025-10-30 19:55 (1 reply)

          Assuming that the point of comparison is Windows (since it’s a rough XP/7 analogue), any difference in behaviors, patterns, or conventions that might differ from what a long time Windows user would expect, including things that some might write off as insignificant. In particular, anything relating to the user’s muscle memory (such as key shortcuts, menu item positions, etc) needs to match.

          The DE needs to be as close to a drop-in replacement as possible while remaining legally distinct. The less the user needs to relearn the better.

          • By mort96 2025-10-30 20:59

            For example, practically every text box in practically every Linux system handles ctrl+backspace by deleting a word. This clashes with a Windows user's expectation that ctrl+backspace deletes a word in some system applications while inserting a small square character in others.
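
            (For the curious, the Linux-side behavior is simple to specify: skip any whitespace immediately left of the cursor, then delete back to the start of the previous word. A toy sketch, not the actual toolkit code:)

            ```python
            def delete_word_backward(text: str, cursor: int) -> tuple[str, int]:
                """Mimic ctrl+backspace in a typical Linux text box: remove the
                word (plus any spaces) immediately left of the cursor, returning
                the new text and the new cursor position."""
                i = cursor
                while i > 0 and text[i - 1].isspace():      # skip spaces before the cursor
                    i -= 1
                while i > 0 and not text[i - 1].isspace():  # remove the word itself
                    i -= 1
                return text[:i] + text[cursor:], i
            ```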

      • By pessimizer 2025-10-31 16:53 (1 reply)

        No, this is poison. They constantly change things, and Free Software would be racing to clone them, continually leaving familiarity behind in order to be a wonky version of the real thing. That battle is lost when it starts. Firefox was a great version of Firefox, everybody loved it (except when it locked up the entire system), nobody thought it was a knock-off of IE. Firefox then became a shit version of Chrome (I assume on Google's orders), and eventually developed into a good enough version of Chrome, shedding all of its users along the way. The Linux desktop is doing better than Firefox now.

        The advantage to Free Software is that you don't have to change everything when Windows, Apple, Adobe, or Google demand you do (unless they grab control of a FOSS project, as in Firefox's case). There are a number of writers who recommend Linux and Free Software for that reason alone - that once you get a workflow going, you don't want to change it according to corporate whims.

        > practically never requires its user to fire up a terminal window

        This can be a problem. But it will be less of a problem with LLMs. We need to encourage amateur (and proficient) Linux adopters and users to lean on AI to deal with anything giving them problems. I had an LLM walk me through updating a .deb package in MATE to match HEAD upstream, and to do it in a way that would be replaced when Debian updated the package itself. This is something I've been carefully avoiding learning for a decade, and if I had taken the effort to try to learn, it would be weeks of research and I'd have messed up the system multiple times along the way. Instead, after a few false starts, I did it and gained the knowledge to do it again.

        • By cosmic_cheese 2025-11-01 1:17

          It's not necessary to chase; just copy what Windows users have largely agreed to be good and stick to that.

          So for example, a hypothetical Windows-like DE could offer XP, 7, and 10 modes which the user can freely switch between and which would never change. This delivers on two fronts: first, it presents a familiar, comfortable UI for the user, and second, it offers a promise that most of the popular Linux desktops do not, which is that significant changes will not occur, even over long time scales.

          I disagree on LLMs/terminal use. Too many things can go wrong in too many different ways for LLMs to be of much use to users for troubleshooting in many cases, and there's also the issue of the user even knowing what to ask for in the first place (even many moderately technical users aren't going to have the foggiest clue what a Debian package, MATE, HEAD, or upstream are).

          The system really just needs to be engineered to 1) be extremely robust and not break in the first place 2) when it does break, have the ability to silently self-heal 99% of the time. A non-essential but excellent bonus would be 3) to be able to express what's wrong and what needs to be done to the user in that last 1%. This won't be easy to accomplish, but the first distro that does will be richly rewarded with user loyalty.

    • By vladms 2025-10-31 13:16 (1 reply)

      > Nothing wrong with Mac OS X

      In fact, when I had a similar experience I ended up making a short list (which I since lost) of things that seemed terribly wrong UI wise.

      True, overall Mac is just different. The issue that I have with that ecosystem is that too many people consider it "perfect" and don't even consider discussing issues and complaining about things. Every product has pluses and minuses, but if the user "believes blindly" that "there is only one way", that is probably not good for anybody.

      After a couple of weeks I adapted just fine to using the Mac, but I surely don't miss it either.

      • By latexr 2025-10-31 15:34 (1 reply)

        > too many people consider it "perfect" and don't even consider discussing issues and complaining about things.

        That is becoming less and less true. More and more of the most ardent Apple fans have been complaining about the direction of macOS for years. Developer sentiment is low.

        • By presbyterian 2025-10-31 15:51 (2 replies)

          I've been a huge Mac fan for a decade or more, at least, and not only is Tahoe the least popular release I've seen, it's the first one where the majority of people I hear from dislike it. It's bad enough that I haven't updated still, I'm waiting a few point releases at least to see how they fix it up, and I'm trying out Linux distros to see what I'll start using if I have to move away.

          • By justaregulanerd 2025-11-01 2:36

            A more recent Mac convert here (I actually went Linux -> M1 Mac). The initial M1 Air I bought, I naturally upgraded to Tahoe, and while it's pretty (and I really, really want the world to move on from Material interfaces), I also felt the readability concerns were completely valid.

            I had to return that Mac for a screen defect, and the one I now have has been kept back on Sequoia, and I'm totally fine with it and will probably stick with it until security updates stop, at which point I surely hope Tahoe is more readable.

          • By BolexNOLA 2025-10-31 17:56 (1 reply)

            Mac->Linux swapper here (back in April). I left after they screwed me on a hardware situation.

            Honestly I’ve really enjoyed the swap. But man I really miss having iMessages across my devices as well as the shared clipboard. By far the two things I missed the most. Everything else I’ve kind of moved on from and can’t even think of off the top of my head anymore

            • By Max-Limelihood 2025-10-31 21:15 (1 reply)

              Yeah, Apple is well-known for being completely insane on this kind of stuff—whenever someone builds an app for this, Apple immediately sets about hunting down and banning iMessage users they suspect of using it. https://news.ycombinator.com/item?id=38646903 https://techcrunch.com/2024/03/21/doj-calls-out-apple-for-br... https://www.wired.com/story/beeper-apple-imessage-fight/

              • By BolexNOLA 2025-11-01 18:22

                You seem pretty informed on this stuff. Do you have any insight into why I’m hearing, at least anecdotally, a much higher failure rate among their desktop offerings over their laptops? My M1 Pro Mac Studio crapped out after 2.5 years! Anytime it went to sleep it would kernel panic and restart. It got all green with their diagnostic test, they did a full firmware refresh, literally nothing could fix it and they had no idea what it was. They wanted me to pay over $700 to replace the logic board and weren’t even sure if that would fix it. Also, the ethernet port failed after a year.

                My buddy has almost the exact same story about his M1 iMac. Just under 4 years, now it crashes and forces a safe mode boot randomly. The computer can take upwards of 10 minutes to even start up. They gave him the same business, exact same repair offering, and he’s moving on like I am. I’ve got 1 other friend with a similar unfolding right now, none of these situations were with laptops.

    • By le-mark 2025-10-30 23:08 (1 reply)

      > When I was forced to use a Mac for work, I honestly considered looking for a different position because it was just that painful for me.

      I share this aversion. I have a MacBook work sent me, sitting next to me right now, that I never use. Luckily I’m able to access the VPN via Linux and all the apps I need have web interfaces (Office 365).

      • By asimovDev 2025-10-31 8:17

        won't you get in trouble for using a personal device for accessing work resources?

    • By helterskelter 2025-10-31 17:07 (1 reply)

      Same story here. My wife is not tech savvy at all but she likes Linux because it mostly just gets out of her way and the desktop doesn't really change drastically between updates. I haven't had to help her do anything on it in over a decade except get an old PC game up with Wine.

    • By e40 2025-10-31 10:29

      I too was a Windows power user. Never thought I could use macOS. It was painful to start, but ultimately it was far better than what I had on Windows. You just have to put in the work. I did this conversion after I was 50.

    • By tombert 2025-10-31 1:10 (2 replies)

      I grew up using Windows but have been using Linux and Mac almost exclusively for the past fifteen years; the only exposure I get to Windows is when I have to play tech support for my parents [1].

      I hated OS X when I first used it. A lot, actually. I didn't consider leaving my job over it (I couldn't have afforded it at the time even if I had wanted to), but I did think about trying to do an ultimatum with that employer to tell them to buy me a computer with Windows or let me install Linux on the Macbook (this was 2012 so it had the Intel chip). I got let go from that job before I really got a chance (which itself is a whole strange story), but regardless I really hated macOS at the time.

      It wasn't until a few years later and a couple jobs after that I ended up growing to really like macOS, when Mavericks released, and a few years later, I actually ended up getting a job at Apple and I refuse to allow anyone to run Windows in my house.

      My point is, I think people can actually learn and appreciate new platforms if they're given a chance.

      [1] https://news.ycombinator.com/item?id=45708530

      • By bruce511 2025-10-31 4:16 (2 replies)

        I agree, people can learn and appreciate if given the chance. But they've more important things to do so changing OS is just a distraction.

        I know, techies love to love or hate the OS. Here there are endless threads waxing lyrical about Windows, macOS, or a dozen Linux distros. But 99% of users couldn't care less.

        It's kinda like cars. Petrol heads will talk cars for ages. Engine specs. What brand of oil. Gearbox ratios. Whereas I'm like 99% of people - I get in my car to go somewhere. Pretty much the only "feature" a car needs is to make me not worry about getting there.

        So for 97% of people the "best" OS is the one they don't notice. The one that's invisible because they want to run a program, and it just runs.

        The problem with switching my mom to Linux is not the OS. It's all the programs she uses. And while they might (or might not) be "equivalent" they're not the same. And I'm not interested in re-teaching her every bit of software, and she's not interested in relearning every bit of software.

        She's not on "a journey" of software discovery. She has arrived. Changing now is just a waste of time she could be gardening or whatever.

        The reason it'll never be the year for Linux Desktop is the same reason it's always been - it's not there already.

        • By tombert 2025-10-31 4:21

          I mostly agree with you, though one of the few good things about Electron taking over the desktop means that an increasing number of programs are getting direct ports to Linux. A guy can dream at least.

        • By sudobash1 2025-10-31 4:29 (2 replies)

          > And I'm not interested in re-teaching her every bit of software, and she's not interested in relearning every bit of software.

          I don't see Windows as having much of an edge there. Lots of things seem to change on Windows just for change's sake. I get so tired of the churn on Windows versions and finding how to disable the new crummy features. If you want to avoid relearning all the time, something simple like XFCE is going to be way better.

          • By tombert 2025-10-31 4:40 (2 replies)

            And Linux won't arbitrarily irrevocably brick your computer because of an automatic update. In my opinion, having your computer bricked because of an automatic update is a very large change to adapt to.

            I feel the need to constantly reiterate this; if someone who works on Windows Update reads this, please consider a different career, because you are categorically terrible at your job. There are plenty of jobs out there that don't involve software engineering.

            • By simonask 2025-10-31 8:05 (1 reply)

              > And Linux won't arbitrarily irrevocably brick your computer because of an automatic update.

              To the average user, it absolutely will. Unless they happen to run on particularly well-supported hardware, the days of console tinkering aren't gone, even on major distros.

              What's fixable to the average Linux user and what's fixable to the average person (whose job is not to run Linux) are two very, very different things.

              • By tombert 2025-10-31 14:46 (1 reply)

                If you run a modern distro with a modern filesystem, you can at the very least have automatic snapshots that actually work, and you can restore to a previous state if an update breaks things. The same cannot be said for Windows.

                • By fn-mote 2025-10-31 21:40 (1 reply)

                  I'm not sure what you're referring to, but I would not give anybody good odds of booting from a snapshot on Ubuntu/ZFS.

                  I would expect that booting from an older kernel would work, possibly in recovery mode.

                  • By tombert 2025-10-31 23:13

                    I have booted from snapshots on Ubuntu with ZFS plenty of times and it has worked fine. I've also used Snapper with btrfs and restored from backup and it's worked fine. I've also booted from snapshots in NixOS and it has worked fine. I actually cannot think of a time where any of those examples didn't work fine.

                    Windows system restore has never worked for me.

            • By bruce511 2025-10-31 15:12 (1 reply)

              I think (in general) the number of machines being bricked because of an update is about a rounding error from 0.

              The biggest brick event in recent times was CrowdStrike, not Windows. Personally I've never seen a bricked machine, not at home, not at work, not at family.

              Of course my anecdata is meaningless, as is your anecdata. YMMV.

              • By tombert 2025-10-31 17:05

                I say this in particular because the automatic update to Windows 11 bricked my mom’s computer, or at least it required me to nuke the machine and reinstall everything from scratch. You can look at the linked post from a few levels up if you want details.

                This is the second time this has happened to my family from Windows, on different computers.

          • By wolvesechoes 2025-10-31 12:50

            > I don't see Windows as having much of an edge there.

            But they specifically said it is not about OS, but about programs on this OS. There is Windows-based software that looks the same as 2 decades ago.

      • By Hendrikto 2025-10-31 10:38 (1 reply)

        > I did think about trying to do an ultimatum with that employer to tell them to buy me a computer with Windows or let me install Linux on the Macbook

        > I refuse to allow anyone to run Windows in my house

        So you don’t care for people’s preferences unless they match your own? I don’t get that. You were in the same position. Why don’t you just let people use what they like?

        • By tombert 2025-10-31 14:49

          I can have whatever rules I want in my house. I am the one who would be playing tech support for all these computers if something breaks, and Windows is so utterly terrible that I will not touch it anymore.

          There are only three people in my house, two of which appear to be happy with macOS and one (me) is happy with Linux.

          I am not upset with the job requiring people to use macOS. That job was awful for a whole bunch of wonderful reasons, but that wasn’t one of them. If people expect me to play tech support in my house I want an OS that I understand and isn’t terrible to do it.

    • By globular-toast 2025-10-31 9:23

      What you're used to is definitely a huge part of it. But I do think 10-15 years ago Linux was easier to break than Windows, because it didn't make any effort to hide away the bits that let you break it. This was mainly a matter of taste. People who know what they're doing don't want to use some sanitised sandbox.

      Linux was like a racing car. Raw and refined. Every control was directly connected to powerful mechanical components, like a throttle, clutch and steering rack. It did exactly what you told it to do, but being good at it required learning to finesse the controls. If you failed, the lessons were delivered swiftly and harshly: you would lose traction, spin and crash.

      Windows was more like a daily driver. Things were "easier", but at the cost of having less raw control and power, like a clutch with a huge dual mass flywheel. It's not like you can't break a daily driver, any experienced computer guy has surely broken Windows more than once, but you can just do more within the confines of the sandbox. Linux required you to leave.

      It's different now. Distros like Ubuntu do almost everything most people want without having to leave the sandbox. The beautiful part about Linux, though, is it's still all there if you want it, and nice to use if you get there, because it's built and designed by people who actually do that stuff. Nowadays I tend to agree it is mostly just what you're used to and what you've already learnt.

  • By f33d5173 2025-10-30 20:07 (5 replies)

    People want features, and they're willing to learn complicated UIs to get them. Software with hyper-simplified options has a very limited audience. Take his example: we have somebody who has somehow obtained a "weird" video file, yet whose understanding of video amounts to wanting it to be "normal" so they can play it. For such a person, there are two paths: become familiar enough with video formats that you understand exactly what you want, and correspondingly can manipulate a tool like Handbrake to get it, or stick to your walled-garden-padded-room reality where somebody else gives you a video file that works. Software that appeals to the weird purgatory in the middle necessarily has a very limited audience. In practice, this small audience is served by websites. Someone searches "convert x to y" and a website comes up that does the conversion. Knowing some specialized software that does that task (and only that one narrow task) puts you so far into the domain of the specialist that you can manage to figure out a specialist tool.

    • By robenkleene 2025-10-30 21:55 (1 reply)

      For this example:

      > we have somebody who has somehow obtained a "weird" video file

      Why are you arriving at the conclusion that this requires complex software, rather than just a simple UI that says "Drop video file here" and "Fix It" below? E.g., instead of your conclusion "stick to your walled-garden-padded-room reality where somebody else gives you a video file that works", another possibility is the simple UI I described? That seemed to me the point of the post.
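      The "Drop video file here" / "Fix It" idea can be surprisingly small. As a hypothetical sketch (not Magicbrake's actual implementation), here is what the core of such a one-button tool might look like, assuming `ffmpeg` is installed; the preset (H.264 video, AAC audio, MP4 container) is my own guess at a sensible "make it normal" default:

```python
import subprocess

def normalize_command(src, dst="output.mp4"):
    # Hypothetical "make it normal" preset: H.264 + AAC in an MP4,
    # which plays just about anywhere (QuickTime, browsers, phones).
    return [
        "ffmpeg", "-i", src,
        "-c:v", "libx264", "-preset", "fast", "-crf", "23",
        "-c:a", "aac", "-b:a", "128k",
        "-movflags", "+faststart",  # allow playback before full download
        dst,
    ]

def fix_it(src):
    # The entire behavior behind the one button: run the preset, done.
    subprocess.run(normalize_command(src), check=True)
```

      Everything a power-user UI would expose (codec, bitrate, container) is simply decided once, up front, by the developer.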

      • By f33d5173 2025-10-31 1:07 (2 replies)

        The issue is that downloading software, for most people, implies an investment in the task the software does that is unlikely to be paid off if it only does a single simple task. If I'm going out of my way to download something, then I'm probably willing to learn a few knobs that give me more control. Hence why I suggested that such a person would rather use a website.

        This is really just my read for why this sort of software isn't more common. Go ahead and make it, and if it ends up being popular I'll look the fool.

        • By SftwrSvior81 2025-10-31 2:04

          > an investment in the task the software does that is unlikely to be paid off if it only does a single simple task

          I don't think that's true at all. The tool linked here is exactly the kind of utility that does one single task and that people are happy to download. Most people use software to solve a problem, not to play around with it and figure out if they have a use for it.

    • By impure-aqua 2025-10-31 8:02

      The walled gardens got a lot more appealing.

      When we moved to Canada from the UK in 2010 there was no real way to access BBC content in a timely manner. My dad learned how to use a VPN and Handbrake to rip BBC iPlayer content and encode it for use on an Apple TV.

      You had to do this if you wanted to access the content. The market did not provide any alternative.

      Nowadays BBC have a BritBox subscription service. As someone in this middle space, my dad promptly bought a subscription and probably has never fired up Handbrake since.

    • By latexr 2025-10-31 15:45 (2 replies)

      > or stick to your walled-garden-padded-room reality where somebody else gives you a video file that works.

      That’s not always a possibility. See for example:

      https://www.theverge.com/2020/5/20/21262302/ap-test-fail-iph...

      Those people didn’t need or want Photoshop or a complicated program with tons of options to convert image formats from anything to anything. Even a simpler app like Preview implies knowing where to look for the conversion part and which format to pick. They could have done instead with a simple window which said “Drop File Here”, and if it’s an HEIC, convert to JPEG. Could even have an option to delete the original, but that’s about it.

      There’s an Alfred workflow which is basically that. It does have some configuration, but the defaults are sensible and it doesn’t let you screw up the important part of “detect this format which is causing me trouble and produce a format which will work”.

      https://alfred.app/workflows/alfredapp/heic-to-jpeg/
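      As a sketch of how little logic such a "Drop File Here" tool actually needs, here is a hypothetical command builder around macOS's `sips` utility (the function name and extension check are my own illustration, not the Alfred workflow's code):

```python
import pathlib

def conversion_command(path):
    """Build a macOS `sips` invocation converting HEIC to JPEG,
    or return None if the file is already widely supported."""
    p = pathlib.Path(path)
    if p.suffix.lower() != ".heic":
        return None  # nothing to fix, leave the file alone
    return ["sips", "-s", "format", "jpeg", str(p),
            "--out", str(p.with_suffix(".jpg"))]
```

      A dropped `IMG_0001.HEIC` yields a `sips` command producing `IMG_0001.jpg`; anything else passes through untouched, which is exactly the "can't screw it up" property described above.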

      • By f33d5173 2025-10-31 20:16 (1 reply)

        > That’s not always a possibility.

        The solution in such cases can't reasonably be "everybody around the world learns to download one particular tool to fix things". In your example, the two reasonable solutions are either Apple figures out how not to send image files to people that they can't understand, or the College Board figures out how to convert HEIC into JPEG themselves. Otherwise, as in that case, most people will simply be left in the lurch.

        • By latexr 2025-10-31 21:09 (1 reply)

          No, of course not. The point is that in the meantime (the problem was eventually solved) such a tool bridges the gap.

          • By f33d5173 2025-10-31 21:46

            The people with the know-how to use such a tool are more likely to use a swiss-army-knife tool than a specialized one-off tool. The article mentions someone using Windows Photo Viewer to do the conversion.

      • By jcelerier 2025-10-31 18:51 (1 reply)

        > They could have done instead with a simple window which said “Drop File Here”, and if it’s an HEIC, convert to JPEG

        But then you have to remember the names of 200 distinct pieces of software that all do this one thing, so you make a meta-software to manage and organize them, and you're back to square one, only with more indirection.

        • By fn-mote 2025-10-31 21:44 (1 reply)

          LLMs exist now.

          Google could do this years ago.

          It might be a problem that you have to search for the solution to every time you have it, but you'll find the solution quickly because many other people also experience the problem.

          • By jcelerier 2025-11-01 0:34

            > It might be a problem that you have to search for the solution to every time you have it,

            as you say, it is a problem, and for many people an unacceptable tradeoff

    • By hshdhdhehd 2025-10-31 11:57

      I like the OBS software. Complex at first, yes. But it has a magic solution: a great one-pager to get you started. Follow it and you are recording within a few seconds!

    • By mock-possum 2025-10-31 6:11

      Conversely, people want UIs that they’re familiar with, and are willing to forgo features to use them.

HackerNews