A Bed Assembly Drama
If you open this newsletter all the time, if you forward it to your friends and co-workers, if it challenges you to think in new and different ways — consider subscribing.
You get access to the weekly Things I Read and Loved at the end of the Sunday newsletter, the massive links/recs posts, the ability to comment, and the knowledge that you’re paying for the stuff that adds value to your life. Plus, there’s the threads: like yesterday’s 500+ ideas for lunches that require no effort and last Friday’s amazingly generative What Are You Doing On Hard Mode?
Last month, a friend of mine who we’ll call Jane bought a nice mid-century modern bedframe from Wayfair. She was moving into a place of her own and had to buy several items to outfit the space, so she decided to spend more on the couch and less on the bedframe. Wayfair’s looked nice, the reviews were good, and best of all, they didn’t charge for shipping.
The bed parts would arrive, unassembled, in a few big boxes. This friend has some physical limitations that make it difficult to lift or maneuver heavy objects, so she knew she couldn’t put it together herself. But right there on the Wayfair site there’s an offer for professional assembly — for just $84.99?
Sounds GREAT. The FAQ explains that the assembly will be performed by an “assembly pro” from “one of our trusted partners, Wayfair Home Services or Angi.” Assemblers are “background-checked and highly rated by customers.” All you have to do is add assembly when you buy the bed, then schedule it using the app.
So Jane paid $84.99 and scheduled assembly for the day after delivery. The boxes were dropped by FedEx at the wrong apartment door, a common occurrence with FedEx in this area, and Jane’s elderly neighbor had to push them down the hallway to her door. The next day, the Angi assembler arrived and began assembly. After several hours, the assembler told Jane she couldn’t finish: Wayfair had sent the wrong part.
Jane sent a photo of the bed to Wayfair, who promised to send a replacement part. But then the photo made its way to the Wayfair parts department, where someone looked closer: it wasn’t the wrong part. The assembler had just assembled the bed incorrectly.
Jane got in touch with Angi and asked for a new assembler to come reassemble the bedframe. She kindly requested that they send a different person than they’d sent before. A few hours later, Jane received notification that her assembly had been scheduled. With the same assembler. Who then failed to show up for the appointment.
She was at her wit’s end. She cancelled the assembly. She’ll eventually have the $84.99 refunded, but only after she fills out an online form and waits several weeks for a check to arrive in the mail.
Jane was telling a friend about the whole fiasco, and he volunteered to come reassemble the bedframe. He quickly discovered that the entire frame had been assembled incorrectly, beginning with step one. The Angi assembler was probably good at all sorts of other things in her life. But she was not good at assembling.
Jane’s bedframe saga reminded me of a conversation I was having with a different friend last weekend, about the Apple Store’s inability to get her iPhone to ring with an incoming call. After a two-hour phone call and a visit to the store, the phone remained broken. She started trying different Google queries — and fixed it herself.
After that conversation, I checked our island Nextdoor to see that several FedEx packages had either been delivered to bushes or to the wrong house — a frequent occurrence. Soon thereafter, my prescription was refilled incorrectly. At Fred Meyer, our local Kroger-owned grocery store, a bagger in his 70s put all my frozen items in a normal bag, and my chips in the cold storage bag I’d brought from home.
Small stuff, all of it. None of it a major inconvenience on its own. But a series of inconveniences adds up. Maybe you’ve experienced something similar over the last few years, contributing to a vague conviction that people are bad at their jobs.
People are bad at their jobs is a sibling, of course, to no one wants to work anymore: the refrain of the peak pandemic years. But both are deflections from what actually makes people “bad” at their jobs — or disincentivizes them from working. The truth is: the jobs are bad.
What makes a job bad? Take a look at Angi, which, like food delivery apps, Thumbtack, Taskrabbit, Instacart, and hundreds of other “gig economy” employers, promises consumers cheap ease: just a few clicks, and some part of your life will be easier. In reality, the business model that creates both the cheapness and the ease makes the end product significantly worse: the only way the company can make a profit is by taking a significant cut off the top of the service and by exclusively hiring part-time “independent contractors” (and thus circumventing labor laws; economist David Weil calls this phenomenon “the fissured workplace.”)
As a result, the majority of people who sign up to do the work are making very little money. At Angi, assemblers report making around $25 for a single assembly. A bed is scheduled to take two hours, which works out to $12.50 an hour, already under minimum wage ($16.66 in Washington State) before you add in transportation, gas, and/or parking costs. So who’s doing this work? Someone who’s desperate for some control over their schedule. Someone who’s desperate, generally. Not someone who can market their skills themselves — like a local handyperson, working for themselves, who’d charge their hourly rate (mine charges $50 an hour) or a local furniture assembly specialist (the one I called charges $150 minimum for assembly; more for a larger project).
If you wanted to reverse engineer a job to ensure that the people doing it would do it badly, you’d build something like Angi. It doesn’t provide training. It doesn’t provide tools. It doesn’t provide benefits, or job security, or anything close to a living wage. If you get better at assembling, or if you’re so good you amass a client base, you can bet you’re not doing those jobs through Angi anymore.
Or take the example of FedEx — delivering to the wrong door and throwing packages in the bushes. We’ll start with FedEx’s primary competitor: UPS. UPS is unionized and directly employs its hundreds of thousands of U.S. workers. The average pay for a UPS driver is $95,000 a year, plus over $50,000 in healthcare and pension contributions. Part-time workers receive the same benefits. Working for UPS is a good job — and that’s part of why UPS workers are good at their jobs.
Delivering for FedEx, by contrast, is a bad job. It subcontracts its routes to smaller entities that then hire their own drivers; a scrape of job listings for drivers shows an average pay of $22 an hour. Most subcontractors pay their drivers using a “fixed daily rate,” which means they’re paid the same no matter how long the work takes. If they’re done early, they go home early; if they take longer, they’re not paid overtime. For a route like the one here on the island, they’re not receiving any extra pay for the time spent waiting in the ferry line on both sides.
It’s a bad job, which is part of the reason we have new drivers all the time — drivers who don’t know the island, or where to leave packages, but are desperate to finish the route as soon as possible. Again, the parameters of the job itself make it far more likely for whoever’s doing it to be “bad” at it.
Then there’s the botched prescription refill. I use the Walgreens Pharmacy out of convenience — but Walgreens, like every other chain pharmacy in the US, has been significantly understaffed for years. Chain pharmacies are now forced to negotiate with Pharmacy Benefit Managers (PBMs), who serve as middlemen between insurance companies and the pharmacies, ostensibly saving insurees millions by negotiating lower drug prices. In practice, PBMs save the insurance company millions — but not necessarily the insuree, who, even if they do save a few dollars on a prescription, is paying for “low cost” drugs by spending endless hours on hold or in line to get their prescription filled.
At the same time, the non-pharmacy parts of the drugstores themselves are pulling in less money, placing even more pressure on the pharmacy as a profit center. Thousands of chain pharmacies have closed since the pandemic, effectively funneling more customers into fewer outposts. (Or, as happened with me, the PBM that represented my insurance company offered my previous pharmacy at Fred Meyer a “take it or leave it” deal; when Fred Meyer declined, I could no longer have my prescriptions filled there. Hence: Walgreens, along with thousands of other local residents who can no longer fill their prescriptions at Fred Meyer).
More customers + more pressure to create profit + stagnant wages = fewer employees working more hours for less pay, which also leads to burnout — and people being bad at their jobs. And as for working in an environment that effectively forces you to be bad at something you’re good at — that leads people to leave the industry altogether. Cue: even more understaffing, even more pressure on the people who remain, and a much higher likelihood of a prescription renewal getting screwed up.
Okay, but what about the Fred Meyer bagger? Working at Fred Meyer is actually a pretty okay job: you start at minimum wage (again, $16.66 an hour here in Washington) and then get bumped up when you become part of the union. A full-time gig offers full medical benefits, paid vacation and sick leave, even Sunday overtime ($1 more an hour).
The vast majority of baggers I encounter are actually quite good at their job, but I also encounter fewer and fewer baggers, because Fred Meyer has decided that the best way to lower costs is to have only one checkout with an actual checker open and funnel all other customers to self check-out. This particular bagger was bad at the job because he probably shouldn’t be working it at all, but it’s one of the few jobs available to someone who has to work into their 70s. (The bagger couldn’t get hired at Costco, which is right across the street and a very good job).
As for my friend’s iPhone — well, working at the Apple Store is also a good job. Even if you start at the bottom of the pay scale, they promote from within, offer full health benefits for full-time employees, 401k matching, and stock discounts. The Genius Bar appointment system is unrivaled. I recently messed up my computer and since I live two hours from the nearest Genius Bar, they sent me an overnight box (via FedEx, that time they got the delivery right) to ship it to them, did the repair, and shipped it back (also via FedEx, but this package didn’t get here overnight because the weekend delivery person couldn’t find my front door. Yes, she was new.)
Sometimes the product is just weird — or, as we used to say about certain cars, a lemon. But we’ve had enough experiences with people in bad jobs that it sure feels like everyone, no matter the industry, is doing bad work. We blame it on lack of ambition, lack of pride, laziness, rudeness, whatever, because it’s always easier to blame the individual who made our life difficult, instead of the systems that don’t just foster but incentivize bad work.
As a society, we have decided that we want more for less: more convenience, more purchases, more technology, but none of it at prices that render it out of reach. For years, we allowed immediate gratification to blind us to the reality that making something cheaper and more accessible almost always makes it worse. It didn’t matter if the shirt fell apart or the couch collapsed — you could always buy a new one and survive on the glow of its novelty until it had to be replaced as well.
The exploitation (of workers, of natural resources) that made that abundant cheapness possible was largely invisible and thus ignorable. Some people paid the time tax and figured out new homes for their discarded items, but most people piled it into Goodwill or the dumpster, telling themselves a story about how it’d find a second life, or telling themselves nothing at all.
A badly made pair of sandals is annoying but survivable. But people being bad at their jobs in your everyday life is far more difficult to ignore. It adds unanticipated rupture. It blows up your carefully planned day. It exacts a stiff time tax — and while you have money (in part because everything is so cheap!) your time has become far more precious. You’re just standing there in line at Walgreens, stewing about how there has to be a better way, absolutely indignant that they can’t manage to figure out one prescription that refills every three months seriously how hard can it be. Our hunt for the frictionless deal turns a purchase into a quagmire that takes weeks to fully undo, restore, and solve.
Even if you don’t personally hold these values, the vast majority of us are members of societies that do. But resistance is very possible. If a business’s workers are good at their jobs, shop there. If you need help with something, find a local company or self-employed person to pay directly — and tip them. If something feels like a massive deal, someone or some part of the earth is paying steeply for it, and chances are high you will pay more for it (in replacement costs, in labor, in time) later. And if you’re forced to use a company with bad services and bad products, the fault very rarely lies with the worker themselves, but with the organization that makes it so difficult for them to be good at their job.
I’m not saying we should all spend more money on everything. Or that we should collectively lower our standards and accept shoddy work. I keenly understand that part of the reason we rely on these exploitative services is because we, ourselves, are subject to the demands of the same economy: one that tells us our time is always better spent working or recovering from work, instead of helping others with their bedframe assembly or, say, shopping in person.
But I do think it’s worth wondering: what would happen, how might the paradigm shift, if we started normalizing paying far more for far less? ●
Just want to be clear here that companies and consumers are all part of this — but if you’ve found yourself in the position of buying something like the Wayfair bedframe, I don’t think you’ve done anything wrong. I have ALSO bought a Wayfair bedframe!! (Although my assembly saga was far less fraught than Jane’s). I’m not trying to make anyone feel like shit; instead, I’m trying to have us think about our reactions to “bad” products and “bad” service.
So, for today’s discussion: What have you paid less money for — product-wise, service-wise — but ended up paying more in other ways? What are your strategies for normalizing (receiving) less for (paying) more, particularly in the US, which valorizes (receiving) more for (paying) less?
The comments are, as always, a subscriber-only space; don’t be butts and let’s keep it one of the good places on the internet.
I have been in the workforce for almost 30 years now and I believe that everybody is getting more squeezed, so they don’t have the time or energy to do a proper job. The expectation is to get it done as quickly as possible and not do more unless told to.
In SW development in the 90s I had much more time for experimentation to figure things out. In recent years you often have some manager to whom you basically have to justify everything you do, and always a huge pile of work that never gets smaller. So you just hurry through your tasks.
I think Google had it right for a while with their 20% time, where people could do what they wanted to do. As far as I know that’s over.
People need some slack if you want to see good work. They aren’t machines that can run constantly at 100% utilization.
> In recent years you often have some manager to whom you basically have to justify everything you do, and always a huge pile of work that never gets smaller. So you just hurry through your tasks.
This has been my exact experience. Absolutely everything is tracked as a work item with estimates. Anything you think should be done needs to be justified and tracked the same way. If anything ever takes longer than the estimate that was invariably just pulled out of someone's ass (because it's impossible to accurately estimate development unless you're already ~75% of the way through doing it, and even then it's a crapshoot), you need to justify that in a morning standup too.
The end result of all of this is every project getting bogged down by being stuck on the first version of whatever architecture was thought up right at the beginning and there being piles of tech debt that never gets fixed because nobody who actually understands what needs to be done has the political capital to get past the aforementioned justification filter.
Also this push to measure everything means that anything that can’t be measured isn’t valued.
One of your teammates consistently helps unblock everyone on the team when they get stuck? They aren’t closing as many tickets as others so they get overlooked on promotions or canned.
One of your teammates takes a bit longer to complete work, but it’s always rock solid and produces fewer outages? Totally invisible. Plus they don’t get to look like a hero when they save the company from the consequences of their own shoddy work.
The biggest mistake those employees make on their way to getting overlooked is assuming their boss knows.
Everyone needs to advocate for themselves.
A good boss will be getting feedback from everyone and staying on top of things. A mediocre boss will merely see "obvious" things like "who closed the most tickets." A bad boss may just play favorites and game the system on their own.
If you've got a bad boss who doesn't like you, you're likely screwed regardless. But most bosses are mediocre, not actively bad.
And in that case, the person who consistently helps unblock everyone needs to be advertising that to their manager. The person whose work doesn't need revisiting, who doesn't cause incidents, needs to be hammering that home to their manager. You can do that without throwing your teammates under the bus, but you can't assume your manager is omniscient. And you can't wait until the performance review cycle to do it; you have to demonstrate it as an ongoing thing.
Your boss can know about it, but if their boss wants data on performance you’re back in the same boat.
Funny you mention engineers needing to market themselves though. That leads to its own consequences. I’ve been at a place where everyone needed to market their own work in order to get promoted, to get raises, and to stay off the chopping block.
The end result? The engineers at the company who get promoted are… good at self-promotion, not necessarily good at engineering. Many of the best engineers at the company—who were hired to do engineering—languish in obscurity while people who can game the system thrive. People get promoted who are only good at cranking out poorly-made deliverables that burden their team with excessive long-term maintenance issues. They fuck off to higher levels of the company, leaving their team to deal with the consequences of their previous work.
Run that script for five or ten years and it doesn’t seem to be working out well for the company.
You made excellent points. As someone looking to solve problems, finish tasks, and go home, I just don't feel energized marketing myself except when changing jobs.
And measurement has really taken over now. There is little value in getting a task done well as compared to finishing more Jira stories.
And that's fine. It's why the lifecycle of most technology companies is fairly short. They grow for a while and eventually stagnate, to be replaced by the next crop of startups when a disruptive innovation comes along. And then the cycle repeats.
When it comes time for layoffs, it generally isn't what your boss knows, it's what your boss's grandboss thinks to throw onto a spreadsheet at the eleventh hour before Quarterly Reports are due.
A good direct boss might keep you on track for a bonus or other "local advancement", maybe even a promotion, but at many companies you are only as valued as the ant-sized numbers you look like from the C-suite's mile-high view. (Which doesn't protect your good boss, either.)
> The biggest mistake those employees make on their way to getting overlooked is assuming their boss knows.
100%. You ask me to do the near impossible, I'll pull it off. But you will be very well-versed in how hard it is first.
I agree it's a mistake but one thing that's never taken into account in this discussion is that many people find it enough that they are doing their jobs. They don't want to do marketing. A lot of tech people are like that which is a real tragedy.
What you're describing was precisely our culture at the last startup.
One group plans ahead and overall does a solid job, so they're rarely swamped and never pull all-nighters. Its people are never promoted; they're thought of as slacking and un-startup-like. Top performers leave regularly because of that.
The other group is behind on even the "blocker"-level issues, people are stressed and overworked, weekends are barely a thing. But — they get praised for hard work. The heroes. (And then leave after burning out completely.)
(The company was eventually acquired, but employees got pennies. So it worked out well for the founders, while summarily ratfucking everyone else involved. I'm afraid this is very common.)
The classic one too is that as somebody who puts out the fires, you get all the praise; whereas if you just do the damn job right from the beginning, nobody notices. Corollary: create as many fires as you can, just don't completely burn the whole thing to the ground.
It's even got a name: https://en.wikipedia.org/wiki/McNamara_fallacy
It's got a name and we know that it's happening yet the overpaid overeducated c-suite demands it? What gives?
This was previously recommended to me on HN, so I’ll pass it along. The book “Seeing Like A State” gives a pretty reasonable explanation for why this happens: https://en.m.wikipedia.org/wiki/Seeing_Like_a_State
The basic idea is that the only viable way to administer a complex and heterogenous system like a massive corporation is to simplify by enforcing “legibility” or homogeneity. Without this, central control becomes far too complex to manage. Thus, the simplification becomes a mandate, even at the cost of great inefficiencies.
What makes the book particularly interesting is the many different historical examples of this phenomenon, across a wide array of human endeavors.
I like the book quite a bit, and it's been formative in my politics.
That said, I am not sure if the take-away is that managers need to account for these factors by allowing for illegibility. I am not reading you as claiming that, but contextually that's how the discussion feels to me.
I do agree with Scott that enforcing perfect legibility is impossible and even attempting to do so can cause immense problems, and I agree with his analysis of these modernist efforts and have found that it's a useful lens for understanding a lot of human enterprise.
I find a lot of hope in that view: nothing actually gets done without some horizontal, anarchist cooperation.
But I also find hope in the fact that it's structurally an issue with authoritarian organizational strategies, one that they can't account for and surmount.
Thank you for the reply!
I don't want to make any strong claims here, but my gut reaction to your first comment is that what one manager calls “allowance for illegibility”, another might call “trust in my reports”.
Yes, at the end of the day it's necessary to have some amount of "trust" in the people doing the work. Which is good: you can try to avoid that, but if it didn't happen very little would get done.
Everything rots, everything changes.
Investors want to know how long you're going to keep making them money. They don't like surprises.
Really, I think what we need are new ways for investors to participate and understand and structure their investments that don't have negative downward consequences for the structure of businesses.
Maybe I would have found the book more impactful if I had read it earlier in life. I felt like it put together various ideas and presented them well in a comprehensible manner. What I feel it omits is that the mechanisms of a state only have to be actionable, not rational. If you ask me how to mow a lawn and I come up with some byzantine process involving multiple steps that don't even contribute to the end goal I'm going to be labeled nuts or maybe "eccentric" if they want to be polite. The same scrutiny doesn't apply to the various bureaucratic processes of a state for whatever reason.
The problem is that this miserable state of affairs works at scale.
Yes, on problems that exist at the scale of one intelligent, educated, experienced, and dedicated human (or maybe up to 3-5), an individual or small team will run circles around a business. You can have a top-notch CEO and COO and HR manager and six program managers (each with zero domain experience other than running a Jira board) and four dozen junior consultants who memorized just enough to pass the interviews and an art department and sales and finance and IT. For some problems, that whole $50M enterprise will be utterly demolished by a couple of determined engineers.
Likewise, a monarchy with a wise, benevolent, and just king can flourish, whereas a corrupted and bureaucratically entangled democracy is woefully inefficient.
But if you want your kingdom to last more than two generations before succumbing to a greedy monarch, or want your enterprise to solve bigger problems that don't decompose nicely to small ones, to vertically integrate huge manufacturing systems and scale out to billions of units, the only method that works is the inefficient one. And it does work!
Only revisionist history tells tales of flourishing kingdoms under a just king. In reality, the reason feudalism worked for so long was the anarchy and power struggle, the cavalcades (basically raids), and an honour-based justice (basically: don't kill fellow nobility during war, and avoid killing militants during cavalcades, and you'll be good). The anarchical nature of the system made it particularly susceptible to organised raids, but also extremely 'agile' in its political responses. Once power was consolidated, however, the clergy and the royalty pushed their law and hierarchical order onto the mostly aristocratic feudality; it broke, and you get the Albigensian Crusade, the wars between the Plantagenets and the Capetians, and probably a lot of other misery inflicted on the general population. Then once the hierarchical order is set, you need an administration, which will become inefficient by nature.
> The problem is that this miserable state of affairs works at scale.
It "works" in the sense that it can be kept going by patching the damage it causes by throwing more money at it.
What it mostly does at scale is appear to work, to those high enough above it that they can't see any of the details: only the metrics that are being optimized for.
The question is whether the kingdom would then still be worth preserving if life for everyone there ends up being miserable.
What if it doesn't survive and 70% of the people who were in the Kingdom end up in worse, arbitrarily-ruled, small despotic fiefdoms instead? And only 10% end up being better off by being lucky enough to have landed in the high-trust+high-competence small group?
Or, switching to consumer products vs company revenue/profit or kingdoms, and grounding in a specific example: people love to hate Windows, but how many of them would actually be better off if the options were just Mac (still expensive, still niche) or Linux? And "well they could just learn how to [code or configure text files or whatever]" for these purposes counts as worse off, IMO - more time spent on something that used to kinda-sorta-at-least-work-predictably for them.
> people love to hate Windows, but how many of them would actually be better off if the options were just Mac (still expensive, still niche) or Linux?
I don't know, but Windows has become increasingly worse at everyday usage. I swear Linux has better suspend/sleep functionality now, doesn't sneakily restart at random (and yes, people notice when it reopens an Explorer window but none of their other, actually important programs), and doesn't take a minute to react to an unlock attempt several times a day for no reason, even on very performant hardware.
So yeah, I think many would be better off with Linux.
Your comparison isn't very good, as Microsoft Windows undergoes perpetual change and churn for its own sake. This breaks existing workflows along the way. As a product it was effectively complete by the time Windows 2000 was released, having successfully integrated what was then considered state-of-the-art technology to develop a practical operating system based on the principles known at the time. All it ever needed from there forward was maintenance updates and kernel updates to enable new hardware-level technology to be harnessed by software.
> overeducated c-suite
Arguably the modern MBA has gotten so insular, with many graduates having only the barest modicum of humanities courses and the barest foot out of the door of a business college, that despite supposedly representing a higher university degree it seems increasingly fair to call it "undereducated". MBA programs got so deep into the business of selling as many MBAs as they could, as quickly as they could, that they forgot to check their own curriculum for things like "perverse incentives" and "regulatory capture" and "tribalism".
An MBA is a professional graduate degree, like a JD or MD. Criticizing professional degree programs for lack of humanities coursework rather misses the point. Students are supposed to have got that in undergraduate.
Sure, but a lot of Business undergraduate programs, even at prestigious universities, are now "pre-MBA" and very MBA-focused, if not "direct to MBA", allowing a bare minimum of non-Business classes while just about guaranteeing MBA program entry. For MD, this sort of "academic incest" makes sense because there is too much specialized knowledge to learn during graduate programs. (But also, most pre-med doesn't pre-qualify you for med school the way "pre-MBA" can.) JDs still seem to expect candidates from a variety of undergraduate backgrounds; though "pre-law" sometimes exists, it often isn't a specific "program" and, to my understanding, can be several different options from very different undergraduate college options. "Pre-law" seems as much about navigating the analysis paralysis of all the possible paths as anything else, without narrowing the number of paths.
I think the MBA programs have built "pre-MBA" programs not because they have so many skills to specialize in, and not necessarily because they have so many possible paths to navigate, but because it sells more Business school undergraduate credits.
Good MBA programs still exist. Not all MBAs involve "academic incest", and there are still MBA programs that encourage non-Business undergraduate degrees. Not all "academic incest" is bad either. But there's definitely an anecdotal sense that many of the people I see with MBAs spent the least time learning anything that wasn't taught in a Business School classroom, with the least consequences for their non-Business School GPAs, because the Business School wants that graduate degree funnel and the tuition dollars it guarantees, than any other graduate degree program I've seen. (Hence why I mentioned "perverse incentives", especially. The Business School wants you to do well in Business School so you keep paying the Business School. The Business School cares less what you do outside the Business School so that you keep paying the Business School.)
Try to make a thread about unions on HN and read the comments, then it'll make sense.
There's a chance that there exists a revenue stream, somewhere in a part of the system you don't have access to, that increases by further applying that policy?
While important, it actually misses a common problem I see: the assumption that every measurement is accurate.
The phenomenon being discussed here is a type of overfitting:
https://sohl-dickstein.github.io/2022/11/06/strong-Goodhart....
The last 50 years or so of managerial practice have been a recipe for overfitting, with a brutal emphasis on measuring, optimizing, and stack-ranking everything.
I think an argument can be made that this is an age of overfitting everywhere.
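To make the overfitting analogy concrete, here's a minimal Python sketch; the quadratic tech-debt penalty and every number in it are invented for illustration. A proxy metric (tickets closed) keeps rising as effort shifts toward churn, while the true objective peaks early and then goes negative:

    import numpy as np

    # Toy model, assumed for illustration only: a team splits effort
    # between careful work and ticket-churn. The proxy (tickets closed)
    # rises monotonically with churn; the true objective (value actually
    # delivered) is dragged down quadratically by accumulating tech debt.
    effort_on_churn = np.linspace(0.0, 1.0, 101)

    proxy = effort_on_churn                                # tickets closed
    truth = effort_on_churn - 1.8 * effort_on_churn ** 2   # value delivered

    print(f"proxy-optimal churn: {effort_on_churn[np.argmax(proxy)]:.2f}")  # 1.00
    print(f"truth-optimal churn: {effort_on_churn[np.argmax(truth)]:.2f}")  # 0.28
    print(f"true value at the proxy optimum: {truth[-1]:.2f}")              # -0.80

Past the point where proxy and truth decouple, pushing the measurement harder actively destroys the thing it was supposed to measure, which is the "strong" version of Goodhart's law described in the linked post.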
Interesting that something similar came up recently, where an AI being trained might fake alignment with its training goals.
Worse yet, these are upward-censored metrics. Failing to make them hurts your career, but making or exceeding your targets doesn’t really help your career—it’s just seen as validating management’s approach.
As soon as they impose metrics, you need to bring in a union, and (to be frank) chase out or boot out anyone who’s not on board with worker solidarity.
> Also this push to measure everything means that anything that can’t be measured isn’t valued.
Never thought I'd see an intelligent point made on hackernews, but there it is. You are absolutely correct. This really hit home for me.
You could have made your point better without insulting everyone on the forum.
It's fascinating that you end up sort of doing the work twice: you build an Excel (or Jira) model of the work alongside the actual work to be done.
Often this extends to the entire organization, where you have like this parallel dimension of spreadsheets and planning existing on top of everything.
It eats resources like crazy to uphold.
Jira is already almost like "productivity theater" where engineers chart the work for the benefit of managers, and managers of managers only. Many programmers already really resent having to deal with it. Soon it will be a total farce, as engineers using MCP Jira servers have LLMs chart the "work" and manage the tickets for them, as managers do the same in reverse, instructing LLMs to summarize the work being done in Jira.
It'll be nothing but LLMs talking to other LLMs under the guise of organizational productivity in which the only one deriving any value from this effort is the companies charging for the input and output tokens. Except, they are likely operating at a loss...
Managers (as in PMs, EMs, and C-Suite) don't like JIRA either - there just isn't an alternative.
Customers and investors ask for delivery timelines and amount of resources invested on major features or products, and you need to give an accurate-ish answer, and you as a company will be dealing with hundreds if not thousands of features depending on size.
In that kind of a situation, the only way you can get that visibility is through JIRA (or a JIRA type product), because it acts as a forcing function to get a defensible estimate, and monitor progress.
Furthermore, due to tax laws, we need to track investments into features and initiatives, and JIRA becomes the easiest way to collect that kind of amortization data.
Once some AI Agent to automate this whole program management/JIRA hygiene process exists, it will make life for everyone so much easier.
This explanation is not incompatible with calling the whole business a "theater".
It's not _all_ theater. Sometimes something does make it into the box and out the door.
How is it theater?
When customers give you money, they expect a date.
When investors give you money, they want to see whether or not you are investing in the right initiatives.
When you open a company, the IRS, SEC, and other regulators expect some amount of financial compliance.
Do you want me to come to you and give you an ultimatum to give me an exact date, calculate amortization, and defend existing investments, and if any of those slips, you're fired? And do that with all the hundreds and thousands of initiatives on a daily basis?
That's the alternative.
Welcome to the industry - you're paid to make purchasers happy, not yourself. Purchasers don't care whether you use DuckDB or OracleDB - they care whether the product they paid for will be delivered on time and meets the needs stipulated in their contract.
If you want to be happy and only deal with engineering problems, you sadly have to deal with the poopshow that JIRA is.
It's theater because the numbers in JIRA are, for the most part, pulled out of someone's ass, and then multiplied by various coefficients by managers along the chain (based on their pessimism and/or experience). Garbage in, garbage out.
So yes, this is theater, and it only makes someone happy for as long as they aren't aware (or can pretend to not be aware) how the sausage is made.
If you round up great engineering orgs that ship impactful stuff, more of them don't use JIRA than do: they use Linear, Basecamp, Asana, Monday, etc.
My experience is by the time an org gets hundreds of priorities and can't effectively delegate to sub orgs they're already fucked and there's no point working there if you want to do anything meaningful.
None of this sounds necessary for the human race. Maybe David Graeber was right.
Nothing is necessary to exist besides foraging, yet you are still using an industrially manufactured product (laptop or mobile phone) to reply to someone on a VC-subsidized forum.
So I'm not sure your contention has much merit, unless you wish to return to the woods and stop using HN, otherwise you're just enabling the supposed waste you appear to detest.
Or alternatively, you could hop off the high horse and understand the headaches the people you report to at work deal with, and thus maybe learn some additional context that can help you at your current or future job, and maybe think of a way to remove the drudgery in a process that annoys everyone.
"And yet you partake in society. Curious.
I am very smart "
I mean there is an alternative out there for making software that doesn't require profit and can still provide societal value. The alternative isn't to forage in the wilderness, please tell me you are just having a laugh and weren't being serious.
This is the perfect manifestation of the quote: It's easier to imagine the end of the world than the end of capitalism
So far none of the imaginary economic systems seem to work as well as capitalism when it comes to raising human living standards. These vague, low-effort criticisms are getting tiresome.
Capitalism has become as much of a thought-terminating argument as 'the gods'. Most '-ism' words I think.
Yes but metrics! How can the CEO look like they know what's happening without understanding anything if they don't have everyone producing numbers?
This compounds with each _team_ modeling the work in jira/excel too!
> Absolutely everything is tracked as a work item with estimates. Anything you think should be done needs to be justified and tracked the same way.
My grandpa once said something that seemed ridiculous but makes a lot of sense: that every workplace should have a “heavy” who steals a new worker’s lunch on the first day, just to see if he asserts himself. Why? Not to haze or bully but to filter out the non-fighters so that when management wants to impose quotas or tracking, they remember that they’d be enforcing this on a whole team of fighters… and suddenly they realize that squeezing the workers isn’t worth it.
The reason 1950s workplaces were more humane is that any boss who tried to impose this shit on workers would have first been laughed at, and then if he tried to actually enforce it by firing people, it would’ve been a 6:00 in the parking lot kinda thing.
> steals a new worker’s lunch on the first day, just to see if he asserts himself
> to filter out the non-fighters
This is bullying and hazing.
Many of the workers in the 1950s were combat veterans who had lived through some shit and weren't as easy to push around. Contrast that to today when a lot of people tend to panic over minor hazards like a respiratory virus with a >99% survival rate. That cowardice puzzled me until I realized that a lot of younger people have led such sheltered lives that they have never experienced any real hardship or serious physical danger so they lack the mental resilience to cope with it. They just want to be coddled and aren't willing to fight for anything.
That generation had it more together as citizens, and they held on to power for a long time. Postwar all of the institutions in the US grew quickly, and the WW2 generation moved up quickly as a result. The boomer types sat in the shadows and learned how to be toxic turds, and inflicted that on everyone.
Why do you think that is? I’m wondering if the shared sacrifice of WW2 has something to do with it.
That's half of it. The other half is, WWII turned the United States from a relative backwater to a military and industrial superpower. So the war also taught lessons on a societal level about organization and cooperation, and the postwar economic boom provided the means to get great things done.
> The other half is, WWII turned the United States from a relative backwater to a military and industrial superpower.
The US was the leading industrial power from around 1880 or 1890, and it became the leading military power in the 1910s (by dint of entering WWI so late that it didn't exhaust its manpower fighting it). It may have been a cultural backwater as late as WWI, but its economic status would have been fairly undisputed. And by WWII, the only question anyone would have seriously asked is if the US or the UK held the throne as greatest of the great powers.
I think if you look at how most people lived, worked, travelled, communicated, educated, etc before WW2 - there was a huge improvement after the war that resulted in lots of development and economic opportunities for the average person.
Sure, but that doesn't make the original statement correct.
>WWII turned the United States from a relative backwater to a military and industrial superpower.
Labor also has more power when a ton of young newcomers to the working force were just killed before they could ever make it there.
>The boomer types sat in the shadows and learned how to be toxic turds, and inflicted that on everyone.
The boomer types are now in their 70s and even 80s and mostly retired (or dead). It's the generations after them that run many of the anal-retentive, bureaucratically obsessive compulsive managerial postings today, and among those are a good number of gen z turds who are at least as toxic, while being smugly self-righteous about their habits. We'll be blaming boomers for decades after they're dead, for things long since out of their hands.
Boomer is anyone 60 or older right now - not just 70+.
That being said, Boomer has evolved to mean anyone older, established and conservative.
Like the counterculture saying from the past, don't trust anyone over 30.
One of the consequences of WWII was that everyone's plans, ideas, and work cultures were turned into direct results very quickly, in the real world. Sometimes fatally.
The people who lived through that had their feet on the ground.
Aside from its many other flaws, post-70s neoliberalism added a bizarre abstraction layer of economic delusion over everything. This suppressed the core truths of physical reality, common sense, and the basic social requirement of sane reciprocal relationships, and did its best to make consequences as indirect and deniable as possible.
Things that really, really matter - like ecological, political, and social stability - were devalued in everyday experience and replaced with economic abstractions that are more mystical than practical.
It's very culty, and the disconnect between how things should be and how they really are is getting more and more obvious to everyone.
"Aside from its many other flaws, post-70s neoliberalism added a bizarre abstraction layer of economic delusion over everything. This suppressed the core truths of physical reality, common sense, and the basic social requirement of sane reciprocal relationships, and did its best to make consequences as indirect and deniable as possible."
I think I need to print that out and put it on the wall. However, did you live through it yourself? I think it is hard to evaluate stuff like this with 2nd-hand experience only.
What if the workers decide the work is imposing on them? Maybe that's a good thing but it could go too far.
> The reason 1950s workplaces were more humane is that any boss who tried to impose this shit on workers would have first been laughed at, and then if he tried to actually enforce it by firing people, it would’ve been a 6:00 in the parking lot kinda thing.
That era also had militant labor organization and real socialist and communist parties in the US. Anticommunism killed all that and brought us to the current state of affairs where employers that respect their employees even a little bit are unicorns.
Why do you need unions for this as opposed to just a tight labor market?
High demand for labor can lead to better conditions, but demand for labor isn't static and without real organization and solidarity it's nearly impossible for workers to punish companies that move jobs to low-cost locales. Economic policy is also controlled by the employer class, which means policies that encourage unemployment and inflation are common.
This is my experience as well. In the late 90s/early 2000s I had the luxury of a lot of time to dig in deeply and learn Unix, Perl, Java, web development, etc., and it was all self-directed. Now with Agile, literally every hour is accounted for, though we of course have other ways of wasting time: overestimating tasks and creating unnecessary do-nothing stories in order to inflate metrics and justify dead space in the sprint.
>> literally every hour is accounted for
I saw one company where early-career BA/PMs (often offshore) would sit alongside developers and "keep them company" almost all day via zoom.
Everyone's complaining about that as a developer, and rightly so. But that can't be easy for the PMs, either, trying to find a way to "add value" when they have no idea what's going on.
I'd expect there to be some "unexpected network outages" regularly in that kind of situation...
I would just terminate the call. Like... hell no.
Yep, that would be my own personal hell.
This is kind of cool as an alternative process to develop apps with. Literally product in a zoom window telling you what to build as you go along. No standups, no refinement, no retros etc. Just a PM that really knows what the customer needs and the developer just building those as you go along.
No developer wants to be treated as a code monkey, and I bet no PM would want to waste time watching someone type out code that they don't understand.
No. It's just awful.
Twice the billable hours! /s
If you're creating do-nothing stories to justify work-life balance and avoid burnout, your organization has a problem. Look into Extreme Programming and Sustainable Pace.
I think that's the observation being made. Most people respond to the organizational problem with the only tools they have, which manifests as that.
Usually management knows and doesn't care about the problem.
And yet well over half of professional developers have productivity so low that if they get laid off the team gets the same amount done...
> People ... aren’t machines that can run constantly at 100% utilization.
You also can't run machines at 100% utilisation & expect quality results. That's when you see tail latencies blow out, hash maps lose their performance, physical machines wear supra-linearly... The list goes on.
The standard rule for CPU-bound RPC server utilization is 80%. Any less and you could use fewer machines; any more and latency starts to take a hit. This is when you're optimizing for latency. Throughput is different.
Doesn't this depend on the number of servers, crash rates and recovery times? I wouldn't feel confident running 3 servers running at 80% capacity in ultra low latency scenarios. A single crash would overwhelm the other 2 servers in no time.
Right; this is only for large pools of servers.
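For intuition on why the rule of thumb sits near 80%: even the simplest queueing model (M/M/1: Poisson arrivals, exponential service times, one server) gives a mean time in system of W = 1/(mu - lambda), which grows hyperbolically as utilization rho = lambda/mu approaches 1. A small sketch, with an invented service rate:

    # Minimal M/M/1 sketch; the 100 req/s service rate is an assumption
    # for illustration, not a claim about any real server.
    service_rate = 100.0  # requests/sec the server can handle

    for rho in (0.50, 0.80, 0.90, 0.95, 0.99):
        arrival_rate = rho * service_rate
        mean_latency_ms = 1000.0 / (service_rate - arrival_rate)
        print(f"utilization {rho:.0%}: mean latency {mean_latency_ms:6.1f} ms")

    # utilization 50%: mean latency   20.0 ms
    # utilization 80%: mean latency   50.0 ms
    # utilization 90%: mean latency  100.0 ms
    # utilization 95%: mean latency  200.0 ms
    # utilization 99%: mean latency 1000.0 ms

The exact numbers depend on the model, but the shape is the point: the last 20% of utilization costs far more latency than the first 80%, and real tail latencies degrade even faster than the mean.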
Difference is machines break and that costs lots of money.
People just quit, some businesses consider it a better outcome.
You can’t brute-force insight.
I'm often reminded of that Futurama episode “A Pharaoh to Remember” (S04E07), where Bender is whipping the architects/engineers in an attempt to make them solve problems faster.
> In SW development in the 90s I had much more time for experimentation to figure things out. In recent years you often have some manager to whom you basically have to justify everything you do, and always a huge pile of work that never gets smaller.
Software development for a long time had the benefit that managers didn't get tech. They had no chance of verifying if what the nerds told them actually made sense.
Nowadays there's not just Agile, "business dashboards" (Power BI and the likes) and other forms of making tech "accountable" to clueless managers, but an awful lot of developers got bought off to C-level and turned into class traitors, forgetting where they came from.
I commend you for having an opinion so bad I can't tell if you're satirizing marxists or not.
Let me ask you this, would you rather be managed by a hierarchy made up of people who don't understand what you do? Because I assure you it is far worse than being managed by "class traitors".
well, not the original poster, but I have been managed by both kinds, and the best manager I ever had was not a former techie and the worst was a former programmer.
The worst manager did often say things that were sort of valuable and correct in a general way, like "well you don't actually know that because it hasn't been tested" which was of course true, but he also seemed to think he could tell people what the correct way to do something was without knowing the technology and the codebase. This often meant that I had to go to junior developers later, after a meeting, and say "concerning ticket X, T. didn't consider these things (listing the things), so that while it is true that we should in principle do what T. said, it will not be adequate; you will also need to do this - look at the code for this function here, it should be abstracted out in some way probably, this was my crappy way of handling the problem in desperation Y months ago."
Trying to explain to him why he was wrong was impossible in itself, he was a tech genius evidently, and you just had to give it up after a bit, and figure that at some time in the future the decisions would be reversed after "we learned" something.
on edit: in the example I give the manager as I said was correct in what he wanted done, but as I said it was inadequate as the bug would keep recurring if only that was done, so more things had to be done that were not as pretty or as pure as what he wanted.
I want my manager to help get the business out of my way- managing requirements, keeping external dependencies on track, fussy paperwork and such.
I don't need my manager second-guessing my every decision or weighing in on my PRs making superficial complaints about style while also bemoaning our velocity.
Hands down, the best managers I've had have all been clueless about the languages and types of work I do, and the worst managers have (or think they have) some understanding of what I do.
> Let me ask you this, would you rather be managed by a hierarchy made up of people who don't understand what you do? Because I assure you it is far worse than being managed by "class traitors".
One's direct manager should be a developer, yes. The problem is the level above that - most organisations don't have a SWE career track, so if you want a pay rise you need a promotion and that's only available for managerial roles.
The problem there is that a lot of developers make very bad managers and a lot of organisations don't give a fuck about giving their managers the proper skills training. The result is then usually a "tech director" who hasn't touched code in years but just loves to micromanage based on knowledge from half a decade ago or more. That's bad enough in Java, but in NodeJS, Go, Rust or other hipster reinvent-the-wheel stacks it's dangerous.
They come in and blather completely irrelevant, way outdated or completely wrong "advice", plan projects with way less resources than the project would actually need - despite knowing what "crunch time" entails for their staff themselves.
And also, the programmers that got "promoted" to management are people that are there for the money/power and asked to be promoted, not because they care about coding. And absolutely not because their peers wanted them to be promoted because they saw a good manager in them while they were working together.
So they'll definitely make it worse for everyone than a guy that doesn't know anything about tech but wanted a career in management because they care about managing.
Oh, I vastly prefer people who don’t understand and know it.
Reminds me of Frank Zappa comparing "cigar chomping old guys" to the "hip young types" that replaced them
> I have been in the workforce for almost 30 years now and I believe that everybody is getting more squeezed, so they don’t have the time or energy to do a proper job. The expectation is to get it done as quickly as possible and not do more unless told to.
That's my impression as well, but I'd stress that this push is not implicit or driven by metrics or Jira. This push is sold as the main trait of software projects, and what differentiates software engineering from any other engineering field.
Software projects are considered adaptable, and all projects value minimizing time to market. This means that, on paper, there is no requirement to avoid redesigning or reimplementing whole systems or features later. Therefore, if you can live with an MVP that does 70% of your requirements list but can be hacked together in a few weeks, most would not opt to spend more man-months only to get minor increments. You'd be even less inclined to pay all those extra man-months upfront if you can quickly get that 70% in a few weeks and from that point onward gradually build up features.
Definitely squeezed.
They say AI, but AI isn't eliminating programming. I've written a few applications with AI assistance. It probably would've been faster if I wrote them myself. The problem is that it doesn't have context and wildly assumes what your intentions are and cheats outcomes.
It will replace juniors for that one liner, it won't replace a senior developer who knows how to write code.
> The problem is that it doesn't have context
You are supposed to give it context; if you don't provide it context, how will it know what it's supposed to do?
I really wish that was the case. You can give it only so much context before it starts to go down a path where the context doesn't even make sense to it, and yet if you explained it to a colleague they would instantly understand.
Context has layers, and really only the 1st or 2nd layers ever get reached by the AI; it can't dive further because it is too focused on the final output rather than the continuation of the output.
For example, you write code and then tell it what the final expected output is, and it somehow always divorces itself from rudimentary implementations and poops out something that cuts a lot of holes out or shortcuts all of your work. It removes modularity in favor of that immediate outcome. AI is just not good enough to understand the complex relationship between maintainable code and deliverable code. So it poops out what is easily made to meet the deliverable.
I felt this way with GitHub Copilot, but I started using Cursor this week and it genuinely feels like a competent pair programmer.
What work are you doing the last few days? My experience is for a very narrow range of tasks, like getting the basics of a common but new to me API working, they are moderately useful. But the overwhelming majority of the time they are useless.
This has been my experience as well.
Cursor Chat and autocomplete are near useless, and generate all sorts of errors, which on the whole cost more time.
However, using composer, passing in the related files explicitly in the context, and prompting small changes incrementally has been a game changer for me. It also helps if you describe the intended behaviour in excruciating detail, including how you want all the edge cases/errors handled.
I recently tried Cursor for about a week and I was disappointed. It was useful for generating code that someone else has definitely written before (boilerplate etc), but any time I tried to do something nontrivial, it failed no matter how much poking, prodding, and thoughtful prompting I tried.
Even when I tried to ask it for stuff like refactoring a relatively simple rust file to be more idiomatic or organized, it consistently generated code that did not compile and was unable to fix the compile errors on 5 or 6 repromptings.
For what it's worth, a lot of SWE work is technically trivial -- it makes that much quicker, so there's obviously some value there, but if we're comparing it to a pair programmer, I would definitely fire a dev who had this sort of extremely limited complexity ceiling.
It really feels to me (just vibes, obviously not scientific) like it is good at interpolating between things in its training set, but is not really able to do anything more than that. Presumably this will get better over time.
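To make "more idiomatic" concrete, this is the flavor of refactor I'm talking about -- a toy example of my own, not the actual file in question: replacing a C-style index loop and mutable accumulator with an iterator chain.

```rust
// Toy before/after illustrating an "idiomatic Rust" refactor (hypothetical example).

// Before: manual indexing and a mutable accumulator.
fn sum_of_even_squares_loop(nums: &[i64]) -> i64 {
    let mut total = 0;
    for i in 0..nums.len() {
        if nums[i] % 2 == 0 {
            total += nums[i] * nums[i];
        }
    }
    total
}

// After: the same logic as an iterator chain -- no indexing, no mutation.
fn sum_of_even_squares_iter(nums: &[i64]) -> i64 {
    nums.iter()
        .filter(|&&n| n % 2 == 0)
        .map(|&n| n * n)
        .sum()
}

fn main() {
    let nums = [1, 2, 3, 4, 5, 6];
    assert_eq!(sum_of_even_squares_loop(&nums), sum_of_even_squares_iter(&nums));
    println!("both return {}", sum_of_even_squares_iter(&nums)); // 56
}
```

The two versions are behaviorally identical and trivially checkable by the compiler, which is what makes repeated non-compiling output on this kind of task so telling.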
If you asked a junior developer to refactor a rust program to be more idiomatic, how long would you expect that to take? Would you expect the work to compile on the first try?
I love Cline and Copilot. If you carefully specify your task, provide context for uncommon APIs, and keep the scope limited, then the results are often very good. It’s code completion for whole classes and methods or whole utility scripts for common use cases.
Refactoring to taste may be underspecified.
"If you asked a junior developer to refactor a rust program to be more idiomatic, how long would you expect that to take? Would you expect the work to compile on the first try?"
The purpose of giving that task to a junior dev isn't to get the task done, it's to teach them -- I will almost always be at least an order of magnitude faster than a junior for any given task. I don't expect juniors to be similarly productive to me; I expect them to learn.
The parent comment also referred to a 'competent pair programmer', not a junior dev.
My point was that for the tasks I wanted to use the LLM for, there was frequently no amount of specificity that could help the model solve them -- I tried for a long time, and if the task wasn't obvious to me, the model generally couldn't solve it either. I'd end up playing a game of nondeterministic/fuzzy programming in English instead of just writing some code to solve the problem.
Again I agree that there is significant value here, because there is a ton of SWE work that is technically trivial, boring, and just eats up time. It's also super helpful as a natural-language info-lookup interface.
Personally, I think training someone on the client’s dime is pretty unethical.
You have misunderstood something here.
I (like a very large plurality, maybe even a majority, of devs) do not work for a consulting firm. There is no client.
I've done consulting work in the past, though. Any leader who does not take into account (at least to some degree) relative educational value of assignments when staffing projects is invariably a bad leader.
All work is training for a junior. In this context, the idea that you can't ethically train a junior "on a client's dime" is exactly equivalent to saying that you can't ever ethically staff juniors on a consulting project -- that's a ridiculous notion. The work is going to get done, but a junior obviously isn't going to be as fast as I am at any task.
What matters here is the communication overhead, not how long between responses. If I'm indefinitely spending more time handholding a jr. dev than they save me, eventually I just fire 'em; same with code gen.
A big difference is that the jr. dev is learning, while the AI is stuck at whatever competence was baked in at the factory. You might be more patient with the jr. if you saw positive signs that the handholding was paying off.
That was my point, though I may not have been clear.
Most people do get better over time, but for those who don't (or for LLMs), it's just a question of whether their current skills are a net benefit.
I do expect future AI to improve. My expectation is it’s going to be a long slow slog just like with self driving cars etc, but novel approaches regularly turn extremely difficult problems into seemingly trivial exercises.
I would be more patient with an AI that only costs me a fraction of a cent an hour.
The value of my time dwarfs the cost of using an AI.
That said, you are underestimating AI costs if you think it works out to a fraction of a cent per hour.
One time during a 1:1 with who I consider the best manager I ever had, in the context of asking how urgent something needed to get done, I said something along the lines of how I tend to throttle to around 60% of my "maximum power" to avoid burnout, but that I could push a bit harder if the task we were discussing was important enough to warrant it. He said it wasn't necessary, but also stressed that any time in the future I did push myself further, I should return to 60% power as soon as I could (even if the "turbo boost" wasn't enough to finish whatever I was working on). To this day, I'm equally amazed by two things: that his main concern with me working at 60% most of the time was that I not let myself get pressured into doing more than that, and that there are probably very few managers out there who would react well to my stating the obvious truth that this throttling is necessary.
I was about to post largely the same thing. There is a saying in design: "Good, fast, cheap --- pick two." The default choice always seems to be fast and cheap nowadays. I find myself telling other people to take their time, but I too have worked jobs where the workloads were far too great to do a decent job. So this is what we get.
Have we learnt nothing? 100% utilisation of practically any resource will result in problems with either quality or schedules.
What, as an industry, do we need to do to learn this lesson?
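One answer: internalize the queueing math. In the textbook M/M/1 model, the average time a task spends in the system is W = 1 / (mu - lambda), which blows up as utilization rho = lambda / mu approaches 1. A minimal sketch of that standard result (the numbers are mine, not from this thread):

```rust
// Standard M/M/1 queueing result: average time in system W = 1 / (mu - lambda).
// As utilization rho = lambda / mu approaches 1, W grows without bound.
fn mm1_time_in_system(lambda: f64, mu: f64) -> f64 {
    assert!(lambda < mu, "at or above 100% utilization the queue never drains");
    1.0 / (mu - lambda)
}

fn main() {
    let mu = 1.0; // service rate: 1 task per unit of time
    for rho in [0.5, 0.8, 0.9, 0.95, 0.99] {
        let w = mm1_time_in_system(rho * mu, mu);
        // With mu = 1, W is expressed in multiples of the bare service time.
        println!("{:>3.0}% utilization -> avg time in system: {:>5.0}x service time",
                 rho * 100.0, w);
    }
}
```

At 90% utilization the average task already spends 10x its service time in the system; at 99% it's 100x. That's exactly the "play" the bearing analogy below is pointing at.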
It needs to be reflected faster in quarterly results. When the effect takes a year or two, nobody notices and there are too many other variables/externalities to place blame.
Same. What's crazier now is that nobody in management seems to want to take a risk, even though the risks are so much lower. We have better information -- blogs, posts on how others solved the same issues -- yet managers are still like "we can't risk changing our backend from dog shit to Postgres"... when in the 90s you would literally be figuring it all out yourself, making a gut call, and you'd be supported in venturing into the unknown.
Now it's all RSUs, stock prices, FAANG ego-stroking, and mad dashes for the acquihire exit, pushing out as much garbage as possible while managers shine it up like AI goodness.
People have to care about outcomes in order to get good outcomes. It's pretty difficult to get someone to work extra time, or to care about the small stuff, if there's a good chance they'll be gone in 6 months.
Alternatively, if leadership is going to turn over in 6 months, then no one will remember the details.
The article addresses this: the "job" the software company provides as an extension of its services isn't really a "job" à la SW development in the 90s.
It's the aftereffect of companies not being penalized for the exploitation-dragnet approach: using people in desperate situations to generate more profit while providing nothing in return.
> People need some slack
Definitely. If you tighten a bearing up to 100% -- to zero "play" -- it will stop rotating easily and start wearing. Which, in people terms, is called burnout.
Or, as the article below puts it, (too much) Efficiency is the Enemy.
I totally agree. It was a stark contrast between PhD life and pure SW-engineering life, in terms of doing things the way I wanted.
I've seen this as well, and it seems to have accelerated in the last 10 years or so. I'm seeing roles get combined, deadlines get tighter, and quality go down. Documentation has also gotten worse. This all seems pretty odd when you consider that the tools to develop, test, and even document have mostly gotten more powerful/better/faster.
What documentation?
How much more expensive is your time for the company now vs the 90s?
Counting cost-of-living increases? Probably about the same.
It's almost as if people don't understand what the word "productivity" means. That's all it is: when you hear "X increase in productivity" and it sounds great, it really means you, the worker, work harder after we fire other people, and you're "more productive" because you now do the output of two people. Sucker. And we all eat this shit up.
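The arithmetic behind that sleight of hand, with made-up numbers:

```rust
// "Productivity" is just output divided by headcount. Hold output constant,
// halve the headcount, and productivity "doubles" -- nobody produced more.
fn productivity(output_units: f64, workers: f64) -> f64 {
    output_units / workers
}

fn main() {
    let before = productivity(100.0, 2.0); // 50 units per worker
    let after = productivity(100.0, 1.0);  // 100 units per worker
    println!("reported productivity gain: {:.0}%", (after / before - 1.0) * 100.0);
    // Prints 100% -- same total output, one person doing two people's jobs.
}
```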
I've always thought if I gave better estimates about how long things would take, my schedule would support a decent job.
But black swans seem to be more common than anticipated.
(I also wonder - over your career, do you naturally move up to jobs with higher salaries and higher expectations?)
Only 20 years for me, but this is my observation also.
I think devs should get 2 hours a day for personal projects, whether internal or otherwise, and be allowed to flex it -- so if they wanna spend it all on Fridays, that's fine. Just think of all the random tech debt that could be resolved if devs had 2 hours a day to code anything, including new projects that benefit everyone. Most people can only squeeze out about 6 hours' worth of real work anyway; you burn out by the end of the day.
>Just think of all the random tech debt that could be resolved if devs had 2 hours a day to code anything, including new projects that benefit everyone.
Regardless of the potential benefits of this plan, zero tech debt would get erased.
IMHO, net tech debt would increase, per the 80/20 rule: you're not going to get more than 80% of the side projects fully wrapped up in the 20% of the time you've allotted to them.
I guess tech debt could even increase in some cases. Some people shouldn't have too much time available :-)
Sounds like a bit of a death spiral.
As tech gets commoditized, the companies get worse -- more funding, but worse.
Capitalism eventually ends up in those with capital making those without capital work until they drop. We are in that eventuality right now.
There are fields of study that agree with you. It's evidence-based that treating your workers well (reasonable quotas, expectations compatible with work-life balance, good wages, reinforcement for effort, etc.) creates conditions where workers perform more efficiently and last longer.
But many organizations reject this. Why wouldn’t they? There is a surplus of workers and consumers accept substandard products. Skimp on training, put out crap. Throw workers into the fire, demand everything from them, get furious if they don’t prioritize the company above everything in their life, burn them out, cut them loose, pick another from the stack of resumes
I was talking to someone who works for a startup recently. A colleague died and it was announced on a Friday. They were expected to finish the day. On Monday their replacement started and the team was told to bring this person up to speed asap. No space to grieve, no time to process. Soulless and inhuman. Disgusting and sociopathic behavior
The gig economy is way worse than the author describes.
Gig workers can't advance with the companies they work for.
Gig workers can't build a network with their coworkers because they don't have coworkers...and there's a good chance that they are competing for work with other people working for the same company.
There are dead end day jobs, and then there is gig work.
Casual labor frequently involves working alongside other casual laborers and/or regular employees and/or the person hiring the casual labor.
The gig economy is people working alone.
These days "hustling" = independently rich people trying to build an online following and selling ads/courses/get rich quick schemes/crypto scams.
The gig economy is real, back-breaking work. No "hustler" has done a single day of food or package deliveries.
This isn't too different from most low-skill jobs. Most people don't aspire to be assistant manager at McDonald's; they do it for a while, build a resume, then move on.
It’s vastly different.
Gig workers are literally disposable robots. You’re part of a computer program. There is no human relationship. At least a McDonald’s worker can talk to their manager.
But there’s a difference between “don’t want” and “structurally locked out”.
Managers at McDonald's can make $50-70K/yr. There's job security, benefits, and opportunities for career advancement. Plenty of people start at the very bottom of the ladder flipping burgers and make it all the way to corporate. It's a tired meme that "McDonald's jobs are meant for teenagers." These are all incredibly in-demand jobs, and plenty of fast-food chains pay significantly more, sometimes including benefits like college tuition reimbursement.
> build a resume
And establish work relationships with other people who can help with future job hunting.
The Uber app doesn’t have an HR department.
Not to mention casual employees at least get some sort of social aspect from their work life. (A slight variation on the networking you mentioned.) Most of my friends, I have through past work environments like shared offices, etc. That would be near-impossible as a gig worker.
Except when it isn't, like Peter Cancro of Jersey Mike's, who started out making sandwiches, bought the shop in 1975, and in 2024 sold the company to Blackstone for $8B.
Or more here: https://www.businessinsider.com/ceos-started-entry-level-at-...
Now, not everyone at Jack in the Box is destined to be the CEO, but they do have more opportunities than someone working for DoorDash.
I think a lot of commenters here are projecting this article onto their work lives as tech office workers, but it's really more about the world of unskilled and semi-skilled service/gig workers, like handymen, furniture assemblers, delivery drivers, and so on.
All these things can be true and they reinforce each other: The jobs suck <-> The people willing to do them aren't very happy, skilled or competent <-> The pay is minuscule. And we can't seem to get out of this Nash Equilibrium.
None of those listed jobs is actually unskilled labor. Driving a big truck around narrow roads is a skill most people don't have; doing it at speed while running up and down to move the heavy packages is a skill most don't have. Assembling furniture is a skill most don't have, especially with complex engineered-wood products that will break if stressed wrong. A handyman is literally a collection of skilled-labor jobs rolled into one person who can handle small home-improvement projects: carpentry, masonry, plumbing, and electrical. These are specialized jobs that have wrongly been labeled "unskilled" or "semi-skilled," as if knowledge work is the only skill of value…
Very, very little labor is unskilled. In almost any work there is a massive difference in quality and speed between someone who has been doing it for <6 months vs. someone who has been doing it for >3 years.
My theory is that "unskilled labor" was a term of propaganda invented by an earlier generation of business leaders in order to publicly devalue many labor-intensive roles. That generation knew that it was a lie, but the business leaders that followed were taught that "unskilled labor" was axiomatic, and essentially "drank the kool-aid".
The result of this is that the labor pool for many disciplines has been hollowed out because it's no longer financially sustainable for workers to build the skills needed to excel in those roles.
Yes, it's propaganda. If the corporatists can convince people that a previously well-paid job (working in a slaughterhouse, for instance) is actually "unskilled labor," they're one step closer to convincing people it should pay less, and that it's a job no one you know would take so they have to import cheap labor to do it.