AI Doesn't Reduce Work–It Intensifies It

2026-02-09 14:44 · hbr.org

One of the promises of AI is that it can reduce workloads so employees can focus more on higher-value and more engaging tasks. But according to new research, AI tools don’t reduce work, they…

Illustration by Eynon Jones

Right now, many companies are worried about how to get more employees to use AI. After all, the promise of AI reducing the burden of some work—drafting routine documents, summarizing information, and debugging code—and allowing workers more time for high-value tasks is tantalizing.



Comments

  • By hansonkd 2026-02-09 15:49 (17 replies)

    I've been saying this for the past two years. Even think about the stereotypical "996" work schedule that is all the rage in SF and AI founder communities.

    It just takes thinking about it for 5 seconds to see the contradiction: if AI were so good at reducing work, why does every company engaging with AI see its workload increase?

    20 years ago, SV was stereotyped for "lazy" or fun-loving engineers who barely worked but cashed huge paychecks. Now I would say the stereotype is overworked engineers who on the midlevel are making less than 20 back.

    I see it across other disciplines too. Everyone I know, from salespeople to lawyers, who engages with AI seems to get stuck in a loop where the original task is easier but it reveals 10 more smaller tasks that fill up their time even more than before AI.

    That's not to say productivity gains with AI aren't found. It just seems like the gains get people into a flywheel of increasing work.

    • By kibwen 2026-02-09 15:56 (6 replies)

      Talking about "productivity" is a red herring.

      Are the people leveraging LLMs making more money while working the same number of hours?

      Are the people leveraging LLMs working fewer hours while making the same amount of money?

      If neither of these is true, then LLMs have not made your life better as a working programmer.

      • By pixl97 2026-02-09 16:25 (3 replies)

        Regardless of that, LLMs could be a Moloch problem.

        That is, if anyone uses it, your life will be worse; but if you don't use it, your life will be even worse than that of those using it.

        Too bad you programmers didn't unionize when you had the chance so you could fight this. Guess you'll have to pull yourself up by your bootstraps.

        • By danaris 2026-02-09 20:40

          Well, at least thus far, the only reason my life is worse due to AI is because of all the people who won't stop talking about how amazing it is for vibe-coding everything from scratch despite ample empirical evidence to the contrary.

          Until and unless there are some more significant improvements in how it works with regard to creating code, having strong "manual" programming skills is still paramount.

        • By shinryuu 2026-02-09 16:39

          Classical prisoner's dilemma.

        • By MattDamonSpace 2026-02-09 21:09

          Thank god there’s no programmers union ffs

      • By coldtea 2026-02-09 16:09 (2 replies)

        >Are the people leveraging LLMs making more money while working the same number of hours?

        Nobody is getting a raise for using AI. So no.

        >Are the people leveraging LLMs working fewer hours while making the same amount of money?

        Early adopters, maybe, as they offload some work to agents. As AI becomes commodified and the baseline, that will invert, especially as companies shed people to have the remaining staff "multiply" their output with AI.

        So the answer will be no and no.

        • By Spivak 2026-02-09 17:21 (3 replies)

          Well, they don't call it being a wage slave for nothing. You aren't getting a raise because you're still selling the same 40-60 hours of your time. If the business is getting productivity wins, they'll buy less time via layoffs.

          (USSR National Anthem plays) But if you owned the means of production and kept the fruits of your labor, say as a founder or as a sole proprietor side hustle, then it's possible those productivity gains do translate into real time gains on your part.

          • By coldtea 2026-02-09 22:49

            >But if you owned the means of production and kept the fruits of your labor, say as a founder or as a sole proprietor side hustle, then it's possible those productivity gains do translate into real time gains on your part.

            Not even then: since it will commodify your field, and make any rando able to replicate it.

          • By _DeadFred_ 2026-02-09 18:59

            The very reason why we object to state ownership, that it puts a stop to individual initiative and to the healthy development of personal responsibility, is the reason why we object to an unsupervised, unchecked monopolistic control in private hands. We urge control and supervision by the nation as an antidote to the movement for state socialism. Those who advocate total lack of regulation, those who advocate lawlessness in the business world, themselves give the strongest impulse to what I believe would be the deadening movement toward unadulterated state socialism.

            --Theodore Roosevelt

          • By nico_h 2026-02-09 18:06

            What about coops? Or partnerships?

        • By suttontom 2026-02-11 00:03

          At some FAANG companies, using AI is now part of the role profile against which your performance and compensation are assessed. So, yes, some engineers are technically getting a raise for using AI.

      • By nradov 2026-02-09 17:39 (2 replies)

        Did high-level languages and compilers make life better for working programmers? Is it even a meaningful question to ask? Like what would we change depending on the outcome?

        • By Jensson 2026-02-09 18:56

          Lots of people have jobs today thanks to high-level languages who wouldn't have had a job before them; they don't need to know how to manage memory manually.

          Maybe that will happen for LLM programming as well, but I haven't seen many "vibe coder wanted" job ads yet that don't also require regular coding skills. So today, LLM coding is just a supplementary skill, not a primary one; it's not like higher-level languages, which let you skip a ton of steps.

        • By wiseowise 2026-02-09 18:06

          > Did high-level languages and compilers make life better for working programmers

          Yes.

      • By elevatortrim 2026-02-09 16:10 (1 reply)

        Of course not. In the world of capitalism and employment, money earned is not a function of productivity; it is a function of competency. It is all relative.

        • By nico_h 2026-02-09 18:14 (1 reply)

          Oh you sweet summer child. Under capitalism, money is a function of how low you can pay your fungible organic units before they look for other opportunities or, worse, unionize (but that can be dealt with relatively easily nowadays). Except for a few exceptional locations and occupations, the scale is tilted waaay against the individual, especially in the land of the free (see H-1B visas, medical debt, and workers on food stamps). (See also the record profits of big companies since Covid.)

          • By elevatortrim 2026-02-10 15:25

            > Oh you sweet summer child.

            Let's keep it civil.

            > how low you can pay your fungible organic units before they look for other opportunities or worse

            This is what I meant? The more replaceable you are, the lower you can be paid before you look for other opportunities. And, of course, yes, it is absolutely tilted against the individual.

      • By athrowaway3z 2026-02-09 16:14

        Lines of code are not a good metric for productivity.

        Neither are the hours worked.

        Nor is the money.

        Just think of the security guard on site walking around, or someone who has a dozen monitors.

      • By zozbot234 2026-02-09 18:38 (1 reply)

        > Are the people leveraging LLMs making more money while working the same number of hours?

        > Are the people leveraging LLMs working fewer hours while making the same amount of money?

        Yes, absolutely. Mostly because being able to leverage LLMs effectively (which is not "vibe coding" and requires both knowing what you're doing and having at least some hunch of how the LLM is going to model your problem, whether it's been given the right data, directed properly, etc.) is a rare skill.

        • By Jensson 2026-02-09 18:53

          Can you name an example? Who do you know that made more money by using LLM?

    • By asielen 2026-02-09 16:45 (1 reply)

      I feel this. Since my team has jumped into an AI-everything working style, expectations have tripled, stress has tripled, and actual productivity has only gone up by maybe 10%.

      It feels like leadership is putting immense pressure on everyone to prove their investment in AI is worth it, and we all feel the pressure to show them it is while actually having to work longer hours to do so.

    • By pimlottc 2026-02-09 16:36 (3 replies)

      I laughed at all the Super Bowl commercials showing frazzled office workers transformed into happy loafers after AI has done all their work for them...

      • By SkyPuncher 2026-02-09 16:40

        I chuckled at the Genspark one while imagining what the internal discussions must have been.

        Obviously, "take a day off" is not the value prop they're selling to buyers (company leadership), but they can't be so on the nose in a public commercial that they scare individual contributors.

      • By CuriouslyC 2026-02-09 16:42 (2 replies)

        As one of the AI people doing 996 (997?), I will at least say I can watch YouTube videos, play bass, etc. while directing 4-5 agents without much trouble. I have my desktop set up as a terminal grid, and I just hover over the window I want to talk to and give voice instructions. Since I'm working on stuff I'm into, the time passes pleasantly.

        • By aiven 2026-02-15 12:04

          Do you feel that you are at least as productive as a "regular" developer working a normal amount of hours? In terms of the quality and quantity of features you have shipped.

        • By grep_name 2026-02-09 17:33 (1 reply)

          Can you describe what stack you're using for this?

          • By CuriouslyC 2026-02-09 17:44

            Hyprland, Voxtype, Claude Code + Pi.

      • By sevensor 2026-02-09 16:41 (1 reply)

        Yeah, why would billionaires sell us something that lets us chill out all day, instead of using it themselves and capturing the value directly? You claim to have a perpetual motion machine and a Star Trek replicator rolled into one, what do you need me for?

        • By AgentMatt 2026-02-09 16:58

          Those ads are not for workers, they're for the employers.

    • By thinkharderdev 2026-02-09 16:39

      There's an old saying among cyclists attributed to Greg Lemond: "It doesn't get easier, you just go faster"

    • By joenot443 2026-02-09 16:07 (8 replies)

      I don't think it's super complicated. I think that prompting generally takes less mental energy than coding by hand, so on average one can work longer days prompting than coding.

      I can pretty easily do a 12h day of prompting but I haven't been able to code for 12h straight since I was in college.

      • By zeroonetwothree 2026-02-09 16:32 (2 replies)

        For me it’s the opposite. Coding I enter flow and can do 5 hours at a stretch while barely noticing.

        Prompting has so many distractions and context switches I get sick of it after an hour.

        • By az09mugen 2026-02-09 22:14

          Same here. Context switching is a real flow-killer.

        • By joenot443 2026-02-12 13:33

          I wish I had a brain like that :)

          Coding has always been very tiring for me. A 4h onsite is genuinely exhausting.

      • By treetalker 2026-02-09 16:14

        Isn’t the grander question why on earth people would tolerate, let alone desire, more hours of work every day?

        The older I get, the more I see the wisdom in the ancient ideas of reducing desires and being content with what one has.

        ---

        Later Addition:

        The article's essential answer is that workers voluntarily embraced (and therefore tolerated) the longer hours because of the novelty of it all. Reading between the lines, this is likely to cause shifts in expectation (and ultimately culture) — just when the novelty wears off and workers realize they have been duped into increasing their work hours and intensity (which will put an end to the voluntary embracing of those longer hours and intensity). And the dreaded result (for the poor company, won't anyone care about it?!) is cognitive overload, hence worker burnout and turnover, and ultimately reduced work quality and higher HR transaction costs. Therefore, TFA counsels, companies should set norms regarding limited use of generative language models (GLMs, so-called "AI").

        I find it unlikely that companies will limit GLM use or set reasonable norms: instead they'll crack the whip!

        ---

        Even Later Addition:

        As an outsider, I find it at once amusing and dystopian to consider the suggestions offered at the end of the piece: in the brutalist, reverse-centaur style, workers are now to be programmed with modifications to their "alignment … reconsider[ation of] assumptions … absor[ption of] information … sequencing … [and] grounding"!

        The worker is now thought of in terms of the tool, not vice versa.

      • By jplusequalt 2026-02-09 16:24

        >so on average one can work longer days if they're prompting than if they were coding

        It's 2026 for god's sake. I don't want to work __longer__ days, I want to work __shorter__ days.

      • By SkyPuncher 2026-02-09 16:43

        I agree. However, for me, I'm finding that I'm drastically leveling up what I'm doing in my day to day. I'm a former founder and former Head of Engineering, back in an IC role.

        The coding is now assumed "good enough" for me, but the problem definition and context that goes into that code aren't. I'm now able to spend more time on the upstream components of my work (where real, actual, hard thinking happens) while letting the AI figure out the little details.

      • By Ygg2 2026-02-09 16:15 (1 reply)

        If you're in the office for 12h it won't matter if you're proompting, pushing pens or working your ass off. You gave that company 12h of your life. You're not getting those back.

        • By joenot443 2026-02-12 13:31

          This is a good point. Thankfully I WFH; I agree that 12h in one office is too long.

      • By packetlost 2026-02-09 16:13

        While I agree that prompting makes it easier to get started, is it actually less work? More hours doesn't mean they're equally productive. More, lower-quality hours just make work:life balance worse with nothing to show for it.

      • By lm28469 2026-02-09 16:17

        > I can pretty easily do a 12h day of prompting

        Do you want to though?

      • By jvanderbot 2026-02-09 16:10

        That's a bingo.

        Additionally, I can eke out 4 hrs of really deep diving nowadays, and have structured my workday around that, delegating low-mental-cost tasks to after that initial dive. Now diving has a low enough mental cost that I can do 8-12 hrs of it.

        It's a bicycle. Truly.

    • By throwawaysleep 2026-02-09 16:03 (2 replies)

      > If AI were so good at reducing work, why does every company engaging with AI see its workload increase?

      Throughout human history, we have chosen more work over keeping output stable.

      • By coldtea 2026-02-09 16:10 (3 replies)

        Throughout human history we were never given the choice. We were forced into it like cattle.

        • By alex43578 2026-02-09 17:30 (1 reply)

          You could always choose to work less, but would have less as a result.

          These days, that choice is more viable than ever, as the basic level of living supported by even a few hours a week of minimum wage affords you luxuries unimaginable 50 or 100 years ago.

          • By aiven 2026-02-15 12:16

            The system is built in a way that you need to run in order to stay still. You can't just "work less" because you will not have stability that way. New slaves using new shiny tools will work more and more and outcompete you in whatever domain you are working in. So you will be forced to work more, and learn more, just to keep your current lifestyle.

            Without basic morality and government regulation, capitalism would exploit humans like cattle. 996 should be outlawed everywhere.

        • By whaleidk 2026-02-09 16:16

          I see a lot of people on this site doing it willingly. I think a lot of people will always choose perceived convenience over anything.

        • By Ygg2 2026-02-09 16:34

          You are correct; however, it should be noted that even the top 1% overworks themselves to some extent (e.g. American CEOs work on average 63h per week). They do it for a different reason, though.

      • By danaris 2026-02-09 20:43

        "Throughout human history", approximately 90% of all work was to produce food. More work meant more food, which meant more people could survive.

        We don't have to do that anymore. We have enough food for everyone.

        Now, we're just being whipped to work harder to produce more profits for the people who already have more than they will ever be able to spend. We're just increasing their dollar-denominated high scores.

    • By throwaw12 2026-02-09 16:39

      > If AI were so good at reducing work, why does every company engaging with AI see its workload increase?

      Isn't it simple?

      Because of competition, which has increased because the barrier to entry for building new software products is much lower.

      You output a lot, and so does your competition.

    • By pllbnk 2026-02-09 20:05

      I have seen this written many times and can't shake off this feeling myself; I feel more productive using LLMs but I am not sure I really am. I even feel quite overloaded right now with all the ideas that I could do. In the past I also had many ideas but they were quickly set aside understanding that there's not enough time for everything. Now, it usually starts with prompting and I get into a rabbit hole. In the end, it feels like a lot of words have been exchanged but the results are nowhere to be found.

    • By SkyPuncher 2026-02-09 16:38

      > If AI were so good at reducing work, why does every company engaging with AI see its workload increase?

      Heavy machinery replaces shovels. It reduces the workload on the shovel holders. However, someone still needs to produce the heavy machinery.

      Some of these companies are shovel holders realizing they need to move upstream. Some are already upstream, racing to bring a product to market.

      The underlying bet for nearly all of these companies is "If I can replace one workflow with AI, I can repeat that with other workflows and dominate"

    • By bschwindHN 2026-02-09 15:57

      Same story with hardware and software. Hardware gets more efficient and faster, so software devs shove more CPU-intensive stuff into their applications, or just get lazy and write inefficient code.

      The software experience is always going to feel about the same speed perceptually, and employers will expect you to work the same amount (or more!).

    • By tangjurine 2026-02-10 23:26

      What does 20 back mean?

    • By natnatenathan 2026-02-09 16:32

      I think you're missing the point. The folks pushing 996 (and willingly working 996) feel like they are in a land rush, and that AI is going to accelerate their ability to "take the most land." No one is optimizing for the "9 to 5" oriented engineer.

    • By skybrian 2026-02-09 16:16

      Maybe ask the friendly AI about reducing project scope? But we probably won’t if we’re having too much fun.

    • By DaedalusII 2026-02-09 15:53

      Now everyone gets to be a manager!

    • By bingohbangoh 2026-02-09 17:30

      Many people in Silicon Valley truly believe that AI will take over everything. Therefore, this is the last chance to get in, so you'd better be working really, really hard.

      There's a palpable desperation that makes this wave different from mobile or cloud. It's not about making things better so much as it's about not being left behind.

      I'm not sure of the reason for this shift. It has a lot of overlap with the grindset culture you see on Twitter where people caution against taking breaks because your (mostly imaginary) competition may catch up with you.

    • By lkbm 2026-02-09 16:31

      Jevons Paradox applies to labor.
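      A toy numeric sketch of Jevons applied to labor (illustrative numbers only, not from the article or the thread): tooling cuts the cost of each task, but induced demand grows faster, so total hours rise.

```python
# Jevons paradox with made-up numbers: per-task efficiency improves,
# but cheaper tasks invite more demand, so total work goes up.
hours_per_task_before = 4.0
tasks_demanded_before = 10            # 40 hours of work

hours_per_task_after = 2.0            # AI halves the per-task cost...
tasks_demanded_after = 25             # ...but demand more than doubles

total_before = hours_per_task_before * tasks_demanded_before
total_after = hours_per_task_after * tasks_demanded_after

print(total_before)  # 40.0
print(total_after)   # 50.0: more "productive" per task, yet more total hours
```

      Whether demand really grows that fast is exactly what the rest of the thread is arguing about.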

    • By seanmcdirmid 2026-02-09 15:53 (2 replies)

      996 is a Chinese term, not American.

      There is a lot of work to do; just because you are doing more work with your time doesn't mean you can somehow count that as less work.

      • By hansonkd 2026-02-09 15:56 (1 reply)

        I've only seen it in job postings and linkedin posts from SF founders.

        • By seanmcdirmid 2026-02-09 20:19 (1 reply)

          I'm not sure if you are serious, but 996 was invented by the Chinese tech industry in 2019 and I've never heard it to describe anything in the USA (well, until today). Wiki:

          https://en.wikipedia.org/wiki/996_working_hour_system

          Note all the examples are also Chinese. There is a recent edit at the bottom of the page though:

          Occurrence in US tech companies: In 2025, amidst the AI boom, reports have emerged of startup tech companies in San Francisco / Silicon Valley requiring "9-9-6" work schedules, with the goal of building things quickly in a competitive market. [15][16][17] California Labor Code §515.5 exempts employers from providing many software engineers with overtime pay.

      • By Bullfight2Cond 2026-02-09 15:55 (1 reply)

        China outlawed it.

  • By btbuildem 2026-02-09 15:22 (7 replies)

    I think the article nails it, on multiple counts. From personal experience, the cognitive overload is sneaky, but real. You do end up taking on more than you can handle; just because your mob of agents can do the minutiae of the tasks doesn't mean you're freed from comprehending, evaluating, and managing the work. It's intense.

    • By maccard 2026-02-09 15:49 (4 replies)

      For a very small number of people, the hard part is writing the code. For most of us, it's writing the correct code. AI generates lots of code, but for 90% of my career writing more code hasn't helped.

      > you do end up taking on more than you can handle, just because your mob of agents can do the minutia of the tasks, doesn’t free you from comprehending, evaluating and managing the work

      I’m currently in an EM role and this is my life but with programmers instead of AI agents.

      • By snovv_crash 2026-02-09 16:01

        Also an EM, and it feels like I now have a team of juniors on my personal projects, except they need constant micromanaging in a way I never would do for real people.

      • By jakubtomanik 2026-02-09 16:46 (2 replies)

        Does AI write 100% correct code? No, but under my watch it writes code that is more correct than anything anyone else on the team has contributed in the past year or more. Even better, when it is wrong I don't have to spend literal hours arguing with it, nor do I have to be mindful of how what I'm saying affects others' feelings, so I get to spend more time on actual work. All in all, it's a net positive.

        • By bunnyman 2026-02-09 18:42

          I agree.

          I provide specific instructions and gotchas when prompting the agent to write the code. I churn out instructions more quickly by using my voice.

          Yes, it makes mistakes, but it can correct them quickly as well. This correction loop takes more time when it's a human on my team doing the work.

        • By maccard 2026-02-09 17:53

          I never said it’s not a net positive - I said that writing more code won’t solve the problem.

          > under my watch it writes code that is more correct than anything that anyone else on the team contributed in past year or more

          This I don’t believe.

      • By throwaw12 2026-02-09 16:43 (1 reply)

        > For most of us, it’s writing the correct code.

        I am not sure about this statement. Aren't we always cutting corners to make things ~95% correct at scale, to meet deadlines within our staffing constraints?

        Most of us, who don't work on the Linux kernel, space shuttles, or near-realtime OSes, were writing good-enough code to meet business requirements.

        • By maccard 2026-02-09 17:54

          My point is that coming up with the business requirements was always the hard part (unless you’re writing a scheduler)

      • By eloisant 2026-02-09 16:11

        So you're saying AI doesn't help, and having reports is just like using AI (which you said doesn't help).

        What's stopping you from becoming an IC and producing as much as your full team then? What's the point of having reports in this case?

    • By Tade0 2026-02-09 15:44

      Started referring to it as "speed of accountability".

      A responsible developer will only produce code as fast as they can sign it off.

      An irresponsible one will just shit all over the codebase.

    • By wnolens 2026-02-09 15:43

      This has been my experience too. I feel freed up from the "manual labor" slice of software development and can focus on more interesting design problems and product alignment, which feels like a bit of a drug right now; I'm actually working harder and more hours.

    • By Molitor5901 2026-02-09 15:47 (1 reply)

      I'm not sure I would agree in toto. Freeing up the minutiae allows for a higher cognitive load on the bigger picture. I use AI primarily for research gathering and for refining what I have, which has freed up a lot of time to focus on the bigger issues and, specifically in my case, on zeroing in on the diamond in the rough.

      • By btbuildem 2026-02-11 3:37

        > Freeing the minutia allows for a higher cognitive load on the bigger picture

        I think we do agree -- the higher "big picture" cognitive load feels more expensive than the minutiae cognitive load.

    • By skeeter2020 2026-02-09 16:47

      do you think this is inherent in AI-related work, or largely due to the current state of the world, where it's changing rapidly and we're struggling to adapt our entire work systems to the shifting landscape, while under intense (and often false) pressures to "disrupt ourselves"? Put another way, if this was similarly true twenty years ago with the rise of Google, is it still true today?

    • By moomoo11 2026-02-09 16:40 (2 replies)

      That is fun though.

      I hated the old world where some boomer-mentality "senior" dev(s) would take days or weeks to deliver ONE fucking thing, and it would still have bugs and issues.

      I like the new world where individuals can move fast and ship, and if there are bugs and issues they can be resolved quickly.

      The boomer-mentality and other mids get fired which is awesome, and orgs become way leaner.

      Just because there is an excess of CS majors and programmers doesn't mean we need to make benches for them to keep warm.

      • By sumtechguy 2026-02-09 16:45 (1 reply)

        That has more to do with where you work than AI.

        Some places have military-grade paperwork where mistakes are measured in millions of dollars per minute. Other places are "just push it in, fix it later".

        AI is not going to change that. That is a people problem. Not something you can automate away. But you can fire your way out of it.

        • By moomoo11 2026-02-09 17:10 (1 reply)

          For sure. I was replying to people not in that situation; from the commenters here, it seems that's where they (and I) have worked or are working now, whether at their own company or some other place.

          I've only ever worked at places that are at the bleeding edge and even there we had total slackers.

          • By anukin 2026-02-10 0:08

            That's totally orthogonal to what the OP is responding to, though. Military software can be bleeding edge as well as extremely susceptible to error-prone code, which means you need to test more. Similar are the cases with financial software, which is often written in OCaml etc. Observing your ability to comprehend, the places where you work must be "bleeding" profusely.

      • By wiseowise 2026-02-10 9:02 (1 reply)

        > I hated the old world where some boomer-mentality "senior" dev(s) would take days or weeks to deliver ONE fucking thing, and it would still have bugs and issues.

        What does that even mean? Are you a begrudging manager, or an enthusiastic youngster who is upset that "boomers" are not killing themselves by juggling thousands of tasks ADHD-style?

    • By varispeed 2026-02-09 15:42 (2 replies)

      "Explain to me like I am five what you just did"

      Then "Make a detailed list of changes and reasoning behind it."

      Then feed that to another AI and ask: "Does it make sense and why?"

      • By salawat 2026-02-09 16:14

        Garbage In, Garbage Out. If you're working under the illusion any tool relieves you from the burden of understanding wtf it is you're doing, you aren't using it safely, and you will offload the burden of your lack of care on someone else down the line. Don't. Ever. Do. That.

      • By coldtea 2026-02-09 16:12

        Then they get rid of people. They can keep 1/10th of the humans and have them run such agents.

  • By bryanlarsen 2026-02-09 15:51 (1 reply)

    I've started calling it "revenge of the QA/Support engineers", personally.

    Our QA & Support engineers have now started creating MRs to fix customer issues, satisfy customer requests, and fix bugs.

    They're AI-sloppy and a bunch of work to fix up, but they're a way better description of the problem than the tickets they used to send.

    So now instead of me creating a whole bunch of work for QA/Support engineers when I ship sub-optimal code to them, they're creating a bunch of work for me by shipping sub-optimal code to me.

    • By skybrian 2026-02-09 15:55 (1 reply)

      I wonder how well a coding agent would do if you asked one to review the change and then to rewrite the merge request to fix the things it criticized?

      • By bryanlarsen 2026-02-09 16:06 (1 reply)

        It does quite well and definitely catches/fixes things I miss. But I still catch significant things it misses. And I am using AI to fix the things I catch.

        Which is then more slop I have to review.

        Our product is not SaaS, it's software installed on customer computers. Any bug that slips through is really expensive to fix. Careful review and code construction is worth the effort.

HackerNews