ATMs didn't kill bank teller jobs, but the iPhone did

2026-03-12 14:48 · davidoks.blog

There's a lot more to replacing labor than just automating tasks

A few months ago, J. D. Vance, sitting vice president of the United States, gave an interview to Ross Douthat of the New York Times. During that interview, Vance and Douthat had an interesting exchange:

Douthat: How much do you worry about the potential downsides of AI? Not even on the apocalyptic scale, but on the scale of the way human beings respond to a sense of their own obsolescence? These kinds of things.

Vance: So, one, on the obsolescence point, I think the history of tech and innovation is that while it does cause job disruptions, it more often facilitates human productivity as opposed to replacing human workers. And the example I always give is the bank teller in the 1970s. There were very stark predictions of thousands, hundreds of thousands of bank tellers going out of a job. Poverty and immiseration.

What actually happens is we have more bank tellers today than we did when the ATM was created, but they’re doing slightly different work. More productive. They have pretty good wages relative to other folks in the economy.

I tend to think that is how this innovation happens.

There are two interesting things about what Vance said, both relating to the example that he chose about bank tellers and ATMs.

The first thing is what it tells us about who J. D. Vance is. The bank teller story—how ATMs were predicted to increase bank teller unemployment, but in fact did not—isn’t a story you’ll hear from politicians; in fact, for a long time, Barack Obama would claim, incorrectly, that ATMs had decreased the number of bank tellers, in order to suggest that the elevated unemployment rate during his presidency was due to productivity gains from technology. I’ve never heard a politician cite the bank teller story before: but I have seen the bank teller story cited in a lot of blogs. I’ve seen it cited, for example, by Scott Alexander and Matt Yglesias and Freddie deBoer; and I’ve heard it, upstream of the humble bloggers, from such fine economists as Daron Acemoglu and David Autor. The story of how ATMs didn’t automate bank tellers is, indeed, something of a minor parable of the economics profession. You can see it encapsulated in this wonderful graph from the economist James Bessen:

From James Bessen, Learning by Doing (2015)

So Vance’s choice of example tells us the same thing that his appearance on the Joe Rogan Experience did, which is that J. D. Vance—however much he might like to hide it—really, really loves reading blogs.

But the other thing about the bank teller story that Vance cites is that it’s wrong. We do not, contrary to what Vance claims, have “more bank tellers today than we did when the ATM was created”: we in fact have far fewer. The story he tells Douthat might have been true in 2000 or 2005, but it hasn’t been true for years. Bank teller employment has fallen off a cliff. Here is a graph of bank teller employment since 2000:

So what happened to bank tellers? Autor, Bessen, Vance, and the like are right to point out that ATMs did not reduce bank teller employment. But they miss the second half of the story, which is that another technology did. And that technology was the iPhone. The huge decline in bank teller employment that we’ve seen over the last 15-odd years is mainly a story about iPhones and what they made possible.

But why? Why did the ATM, literally called the automated teller machine, not automate the teller, while an entirely orthogonal technology—the iPhone—actually did?

The answer, I think, is complementarity.

In my last piece, on why I don’t think imminent mass job loss from AI is likely, I talked a lot about complementarity. The core point I made was that labor substitution is about comparative advantage, not absolute advantage: the relevant question for labor impacts is not whether AI can do the tasks that humans can do, but rather whether the aggregate output of humans working with AI is inferior to what AI can produce alone. And I suggested that given the vast number of frictions and bottlenecks that exist in any human domain—domains that are, after all, defined around human labor in all its warts and eccentricities, with workflows designed around humans in mind—we should expect to see a serious gap between the incredible power of the technology and its impacts on economic life.
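
To make the comparative-advantage point concrete, here is a toy calculation (my own illustration, with invented numbers, not anything from the economics literature): even when the AI is absolutely better at every task, the human + AI team can out-produce the AI alone, because the human frees up AI hours for the task where the AI's relative edge is largest.

```python
# Toy comparative-advantage model. All numbers are hypothetical.
# A finished "unit of work" needs one draft and one verification.

HOURS = 8.0  # working hours available to each agent per day

AI_DRAFT, AI_VERIFY = 10.0, 8.0       # AI is absolutely better at both tasks
HUMAN_DRAFT, HUMAN_VERIFY = 2.0, 6.0  # human's *relative* strength is verifying
                                      # (HUMAN_DRAFT shown only for contrast)

# AI alone must split its day: d hours drafting, (HOURS - d) verifying.
# Output = min(drafts, verifications), maximized where the two are equal.
d = AI_VERIFY * HOURS / (AI_DRAFT + AI_VERIFY)
ai_alone = AI_DRAFT * d  # ~35.6 finished units

# Team: AI specializes entirely in drafting, the human in verifying.
team = min(AI_DRAFT * HOURS, HUMAN_VERIFY * HOURS)  # min(80, 48) = 48 units

print(f"AI alone: {ai_alone:.1f} units/day; human + AI team: {team:.0f} units/day")
```

The team wins, 48 units to roughly 36, despite the AI dominating both tasks in absolute terms. That is all "comparative advantage" means here.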

That gap will probably close faster than previous gaps did: AI is not “like” electricity or the steam engine; an AI system is literally a machine that can think and do things itself. But the gap exists, and will exist even as the technology continues to amaze us with what it can now accomplish.

But by talking about why ATMs didn’t displace bank tellers but iPhones did, I want to highlight an important corollary, which is that the true force of a technology is felt not with the substitution of tasks, but the invention of new paradigms. This is the famous lesson of electricity and productivity growth, which I’ll return to in a future piece. When a technology automates some of what a human does within an existing paradigm, even the vast majority of what a human does within it, it’s quite rare for it to actually get rid of the human, because the definition of the paradigm around human-shaped roles creates all sorts of bottlenecks and frictions that demand human involvement. It’s only when we see the construction of entirely new paradigms that the full power of a technology can be realized. The ATM substituted tasks; but the iPhone made them irrelevant.

Let’s start with the actual story of how the ATM affected bank tellers.

In the 1940s or ‘50s, if you owned a bank, you needed physical locations—these were your “branches”—and you needed people to staff those branches. You’d have your bank managers, your loan officers, and you’d have your bank tellers. When a customer wanted to deposit a check or check their balance or make a withdrawal, they’d talk to one of the tellers; and because this was the highest-volume type of interaction that people would have with your bank, you’d have to hire tellers in huge numbers.

The bank teller thus became a classic “mid-skill” occupation. It required a high school diploma and about a month of on-the-job training around counting cash and processing checks and settling accounts at the end of each day, but it didn’t require a college degree. And because they handled such a core part of the banking workflow, banks required a huge number of tellers: the average bank branch in an urban area might employ about two dozen people as tellers.

But in the 1950s and ‘60s, as Western economies were booming and enjoying their magnificent postwar economic expansions, labor was getting much more expensive. This was a good thing—it was simply the other side of rising wages—but it was also painful for enterprises that relied on lots of manual labor. And so we find that all the fashionable business concepts of the 1950s and ‘60s revolved around reducing labor costs to the maximum extent possible. It’s no coincidence that it was in the 1950s that the word “automation” entered the English language.

It used to be, for instance, that when you went shopping you’d have your stuff retrieved for you by a small army of clerks running around the shop; indeed that’s still how it’s done in places like India with an abundance of cheap labor. But humans were getting expensive in the 1950s and ‘60s, so everyone wanted to reduce the human component, and so in that period you saw the rise of supermarkets and discount stores, where the whole innovation is getting the stuff yourself. (Sam Walton’s Made in America is a good record of what that revolution was like from the inside; consumers tended to be quite happy with the whole thing, since corporate savings could be passed on in the form of cheaper goods.) And it’s the same reason why in the ‘50s and ‘60s you saw the rise of laundromats, vending machines, self-service gas stations, and “fast food” restaurants like McDonald’s.

So in the 1950s and ‘60s, the goal of every single business that employed humans was to find ways to replace humans with machines: in economic terms, to substitute capital for labor. And even though they were a relatively labor-light business to start with, this was true of banks as well. This was the case in the United States, but it was actually particularly true in Europe, where labor unrest among bank employees was an ongoing headache. (Financial sector employees were actually some of the most militant of all white-collar workers during this period: because of prolonged strikes by bank employees, Irish banks were closed 10 percent of the time between 1966 and 1976.)

Enter the computer. In the 1960s, to the great relief of bank management teams, it became possible to imagine that computers could be used to reduce the role of human labor in the banking process.

There were two key innovations that made this possible. The first was IBM’s invention of the magnetic stripe card in the 1960s: this was a thin strip of magnetized tape, bonded to a plastic card, that could encode and store data like account numbers, and which could be read by a machine when swiped through a card reader. And the second was Digital Equipment Corporation’s pioneering minicomputer, which dramatically reduced the price and size of general-purpose computing.

And so, bringing those two innovations together, you could finally imagine a machine that could do, programmatically, what a human teller might do: that could identify a customer automatically, via the magnetic stripe; that could communicate with the central servers of a bank to verify the customer’s account balance; and that could dispense cash or accept deposits accordingly.
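
In schematic terms, the loop those two innovations enabled looks something like the sketch below (entirely my own schematic with hypothetical account data, not any real ATM protocol):

```python
# Schematic of the ATM transaction flow described above.
# Hypothetical data and function names; not a real banking protocol.

LEDGER = {"4111-0001": 250.00}  # the bank's central account balances

def read_magstripe(swipe: str) -> str:
    """The magnetic stripe's contribution: a machine-readable account number."""
    return swipe.strip()

def authorize(account: str, amount: float) -> bool:
    """The minicomputer's contribution: checking the central ledger."""
    return LEDGER.get(account, 0.0) >= amount

def withdraw(swipe: str, amount: float) -> str:
    account = read_magstripe(swipe)
    if not authorize(account, amount):
        return "declined"
    LEDGER[account] -= amount
    return f"dispensed ${amount:.2f}"

print(withdraw("4111-0001", 40.0))  # dispensed $40.00
```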

And so in the 1960s, teams working concurrently in Sweden and the United Kingdom pioneered the earliest versions of what would eventually become known as the automated teller machine. These were primitive devices—they had the tendency to “eat” payment cards and to dispense incorrect amounts of money, and they didn’t see much uptake—but by the late 1960s it was clear where things were going. IBM, at that point the largest technology company in the world, soon took interest in the technology, and for the next few years groups of IBM engineers refined the technological and infrastructural layer to make the ATM functional.

And by the mid-1970s, after years of technical investment, the ATM was finally ready for prime time. By that point IBM, then enjoying its peak of influence, had decided the market wasn’t worth the investment, and so it ceded the nascent ATM industry to a company called Diebold.1

And in 1977 the ATM finally got its big break. Citibank, then the second-largest deposit bank in the United States, decided to make ATMs the subject of a major push: it spent a large sum installing the machines across its deposit branches. The New York Times reported it as “a $50 million gamble that the consumer can be wooed and won with electronic services.” But the response was tepid. In the same New York Times article, we encounter a scene from a bank branch in Queens where one of Citibank’s ATMs was installed: “most of the customers,” the article reports, “preferred to wait in line a few moments and deal with the teller rather than test the new machines.”

But Citibank’s gamble paid off. Consumer wariness toward ATMs turned out to be temporary: the advantages of the ATM over the human teller were obvious. Running an ATM was cheaper than paying a human—each ATM transaction cost the bank just 27 cents, compared to $1.07 for a human teller—and this could either be passed to the consumer in the form of lower fees or simply kept as profit. And ATMs were also just more convenient. An ATM could do in 30 seconds what would take a human teller at least a few minutes; and while a human teller was only available during business hours, ATMs could be used at any time of day.

And the benefits for the bank were even greater. ATMs were expensive to install, but once they were installed they were wonderfully lucrative and had low maintenance costs. The fee opportunities were wonderful, since banks could charge fees on out-of-network transactions. And since ATMs were not legally considered to be branches, banks could deploy ATMs without running afoul of banking laws that restricted interstate bank branching.

All of this meant that banks had a really strong incentive to put ATMs everywhere. And so they did. In 1975 there were about 31 ATMs per one million Americans; by the year 2000, that number had grown to 1,135, a 37-fold increase in just 25 years.
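
As a quick sanity check on those figures, the implied growth rate (computed from nothing but the numbers just quoted) is about 15 percent per year, compounded for a quarter century:

```python
# Back-of-the-envelope check on the ATM deployment figures above.
start, end = 31, 1135            # ATMs per million Americans, 1975 and 2000
years = 2000 - 1975
fold = end / start               # ~36.6, the "37-fold" in the text
cagr = fold ** (1 / years) - 1   # implied compound annual growth rate
print(f"{fold:.1f}-fold over {years} years = {cagr:.1%} per year")
```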

And what did this do to the bank tellers?

The natural expectation is that ATMs would make human bank tellers obsolete, or at least strongly reduce demand for bank teller jobs. And indeed the number of bank tellers per branch declined significantly: from 21 tellers per branch to about 13 per branch once ATMs had hit saturation. But this decline in teller intensity corresponded with an increase in aggregate teller employment. The number of ATMs per capita grew dramatically after 1975; but the number of bank tellers increased along with it. Bank tellers did become a smaller share of total employment, since the increase in bank teller employment was smaller than the increase in other occupations; but at no point in the period between 1970 and 2010 did the number of bank tellers actually enter a prolonged decline.

Why is that? Why did ATMs, which automated the bulk of the teller’s job, not lead to a decrease in teller employment?

We find the most elegant explanation in a paper from David Autor:

First, by reducing the cost of operating a bank branch, ATMs indirectly increased the demand for tellers: the number of tellers per branch fell by more than a third between 1988 and 2004, but the number of urban bank branches (also encouraged by a wave of bank deregulation allowing more branches) rose by more than 40 percent. Second, as the routine cash-handling tasks of bank tellers receded, information technology also enabled a broader range of bank personnel to become involved in “relationship banking.” Increasingly, banks recognized the value of tellers enabled by information technology, not primarily as checkout clerks, but as salespersons, forging relationships with customers and introducing them to additional bank services like credit cards, loans, and investment products.

We thus have a classic case of the Jevons effect. Teller labor was an input into an output that we can call “financial services.” ATMs allowed us to produce that output more efficiently and economize on the use of the labor input. But demand for the output was sufficiently elastic that more efficient production meant more demand: and demand increased to the point that there was actually greater demand for the labor input as well. And—this part is not quite the classic Jevons effect—the greater demand suggested to banks that there had been certain functions that were previously considered incidental to the teller job, like “relationship banking,” which were actually quite useful. And so ATMs were a truly complementary technology for the bank teller.
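
Here is a minimal numerical sketch of that dynamic, using the per-transaction costs quoted above and otherwise invented parameters (the elasticity and labor figures are hypothetical, chosen only to illustrate the mechanism):

```python
# Jevons-effect sketch: automation cuts the labor needed per transaction,
# but if demand for the output is elastic enough, total labor demand rises.
# Per-transaction costs are from the article; everything else is invented.

price_before, price_after = 1.07, 0.27  # cost per transaction ($)
volume_before = 100_000                 # transactions per period (hypothetical)
labor_before_per_txn = 1.0              # teller-minutes per transaction (hypothetical)
labor_after_per_txn = 0.4               # tellers keep only the non-routine residue

elasticity = -1.2  # hypothetical constant price elasticity of demand

# Constant-elasticity demand: volume scales with (p1/p0) ** elasticity.
volume_after = volume_before * (price_after / price_before) ** elasticity

labor_before = volume_before * labor_before_per_txn
labor_after = volume_after * labor_after_per_txn

print(f"transactions: {volume_before:,.0f} -> {volume_after:,.0f}")
print(f"teller-minutes: {labor_before:,.0f} -> {labor_after:,.0f}")
# With these numbers, demand more than quintuples and total teller labor
# roughly doubles, even though each transaction needs 60% less of it.
```

Whether that happens in reality depends entirely on the elasticity: with inelastic demand the same arithmetic yields job losses, which is part of why the bank teller outcome was contingent rather than inevitable.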

By the 2010s, people had begun to notice that there had been no mass unemployment of bank tellers. In 2015, James Bessen published a book called Learning by Doing, using the non-automation of bank tellers as a central example; soon it became a sort of load-bearing parable about what Matt Yglesias called “the myth of technological unemployment.” From Bessen the story diffused to Autor and Acemoglu; then to the economics bloggers; then to people like Eric Schmidt, who cited the ATM story in 2017 as one reason why he was a “denier” on the question of technological job loss. And they were right: ATMs really didn’t reduce bank teller employment.

But there was an ironic element to all of this: at the exact moment that people started talking about how technology had not displaced bank tellers, it stopped being true.

In the 2010s, bank teller employment entered a period of prolonged decline. This was not a product of the financial crisis that peaked in 2008: bank teller employment was roughly the same in 2010 as it had been in 2007. Nor was the decline sudden: it was gradual, and it continued even as banks returned to full health after the Great Recession. First came a steady slide starting after 2010; then a slight recovery at the end of the decade; and then a collapse during the COVID years from which bank teller employment has never recovered. In 2010, there were 332,000 full-time bank tellers in the United States; by 2016, there were 235,000; by 2022, there were just 164,000.
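
Annualized, those figures describe a steady grind rather than a single shock (a quick calculation from the article's own numbers):

```python
# Annualized decline implied by the teller headcounts quoted above.
counts = [(2010, 332_000), (2016, 235_000), (2022, 164_000)]
for (y0, n0), (y1, n1) in zip(counts, counts[1:]):
    rate = (n1 / n0) ** (1 / (y1 - y0)) - 1
    print(f"{y0}-{y1}: {rate:+.1%} per year")
# Roughly -5.6% to -5.8% per year, sustained for more than a decade.
```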

This was not a long-delayed ATM shock: the ATM had reached full saturation long before. It was, rather, the effect of another technology, one that had nothing to do with banking. It was a product of the iPhone.

Apple first introduced the iPhone in 2007. By 2010, it was clear that the iPhone-style smartphone, with a touchscreen and an app store, was going to be the defining technological paradigm of the years to come: people were going to conduct huge portions of their life through the prism of the smartphone, which soon became simply “the phone.” And just as more forward-thinking institutions like Citibank knew in the 1970s that ATMs were the future, the smarter banks knew by the early 2010s that the future lay in what they called mobile banking.

The mobile banking vision was simple: the banking customers of the future would do all their banking via their banks’ mobile apps. They would buy things via payment cards or, later, via Apple Pay; they would check their balance or make deposits through the banking app; the customer’s relationship with the bank would be mediated entirely via the app. In this new world, there was no reason for the physical bank location to exist. Indeed there were new entrants, like Revolut or Klarna, that existed entirely as mobile apps. The branch was a thing of the past.

Mobile banking succeeded much more rapidly than the ATM did—which is remarkable, considering that mobile banking was a much bigger change than the ATM. I remember, as a kid, opening my first bank account at the Chase branch in my hometown, and the excitement of occasionally visiting there to deposit any checks I might have. I’m still a Chase customer, and I interact frequently with my Chase account for all sorts of reasons. But it’s been many years since I visited a physical Chase location. My relationship with Chase has transcended any need for the branch. I don’t think I’m alone in this: the Chase branch in my hometown, where I would once deposit checks, closed in 2023. The building now houses a doctor’s office.

And so the rise of mobile banking removed any real reason to have bank branches. Visits to bank branches declined dramatically throughout the 2010s, and banks aggressively redesigned the banking experience around the digital interface. The number of commercial bank branches per capita peaked in 2009 and has fallen by nearly 30 percent since, with most of the decline occurring in wealthier areas that were more likely to adopt digital banking first. Between 2008 and 2025, Bank of America, which at some point surpassed Citibank as the second-largest deposit bank in the United States after Chase, closed about 40 percent of its branches. Online banking had been around since the 1990s, Bank of America’s CEO said, but the iPhone was a “game changer” that “effectively allowed customers to carry a bank branch in their pockets.”

And as the branch disappeared, so did the teller. The ATM had been an innovation within the existing world of physical banking, and so its replacement of the bank teller could only ever be partial; as long as people were still visiting the bank branch, it was useful to repurpose tellers as “relationship bankers.” But when branch visits declined, that stopped making sense. The iPhone represented a wholly different way of banking, and within it there was no real need for the bank teller: and so a large institution like Bank of America was able to reduce its headcount from 288,000 in 2010 to 204,000 in 2018.

Of course, the transition to mobile banking also created jobs: banks now needed software developers to build and maintain the digital interface, and they needed customer service representatives to handle any problems that might emerge. And so a “mid-skill” job was replaced by a thin stratum of “high-skill” jobs and a vast army of “low-skill” ones. The term for this in labor economics is “job polarization.”

So that’s the irony of the parable of the bank teller. Technology did kill the bank teller job. It wasn’t the ATM that did it, but the iPhone.

I think the story of the ATM and the iPhone offers us an important lesson about technology and its impacts on labor markets. Because Vance, of course, wasn’t really talking about ATMs when he talked about ATMs; he was talking about AI.

The lesson is worth stating plainly. The ATM tried to do the teller’s job better, faster, cheaper; it tried to fit capital into a labor-shaped hole; but the iPhone made the teller’s job irrelevant. One automated tasks within an existing paradigm, and the other created a new paradigm in which those tasks simply didn’t need to exist at all. And it is paradigm replacement, not task automation, that actually displaces workers—and, conversely, unlocks the latent productivity within any technology. That’s because as long as the old paradigm persists, there will be labor-shaped holes in which capital substitution will encounter constant frictions and bottlenecks.

This has, I think, serious implications for how we’re thinking about AI.

People in AI frequently talk about the vision of AI being a “drop-in remote worker”: AI systems that can be inserted into a workflow, learn it, and eventually do it on the level of a competent human. And they see that as the point where you’ll start to see serious productivity gains and labor displacement.

I am not a “denier” on the question of technological job loss; Vance’s blithe optimism is not mine. But I’m skeptical that simply slotting AI into human-shaped jobs will have the results people seem to expect. The history of technology, even exceptionally powerful general-purpose technology, tells us that as long as you are trying to fit capital into labor-shaped holes you will find yourself confronted by endless frictions: just as with electricity, the productivity inherent in any technology is unleashed only when you figure out how to organize work around it, rather than slotting it into what already exists. We are still very much in the regime of slotting it in. And as long as we are in that regime, I expect disappointing productivity gains and relatively little real displacement.

The real productivity gains from AI—and the real threat of labor displacement—will come not from the “drop-in remote worker,” but from something like Dwarkesh Patel’s vision of the fully-automated firm. At some point in the life of every technology, old workflows are replaced by new ones, and we discover the paradigms in which the full productive force of a technology can best be expressed. In the past this has simply been a fact of managerial turnover or depreciation cycles. But with AI it will likely be the sheer power of the technology itself, which really is wholly unlike anything that has come before, and unlike electricity or the steam engine will eventually be able to build the structures that harness its powers by itself.

I don’t think we’ve really yet learned what those new structures will look like. But, at the limit, I don’t quite know why humans have to be involved in those: though I suspect that by the time we’re dealing with the fully-automated organizations of the future, our current set of concerns will have been largely outmoded by new and quite foreign ones, as has always been the case with human progress.

But, however optimistic I might be about the human future, I don’t think it’s worth leaning on the history of past technologies for comfort. The ATM parable is a comforting story; and in times of uncertainty and fear we search naturally for solace and comfort wherever it may come. But even when it comes to bank tellers, it’s only the first half of the story.



Comments

  • By paxys 2026-03-12 15:18 (14 replies)

    One key line about ATMs is buried deep in the article:

    > the number of tellers per branch fell by more than a third between 1988 and 2004, but the number of urban bank branches (also encouraged by a wave of bank deregulation allowing more branches) rose by more than 40 percent

    So, ATMs did impact bank teller jobs by a significant amount. A third of them were made redundant. It's just that the decrease at individual bank branches was offset by the increase in the total number of branches, because of deregulation and a booming economy and whatever else.

    A lot of AI predictions are based on the same premise. That AI will impact the economy in certain sectors, but the productivity gains will create new jobs and grow the size of the pie and we will all benefit.

    But will it?

    • By whatisthiseven 2026-03-12 18:00 (3 replies)

      > But will it?

      My prediction is no, because productivity gains must benefit the lower classes to see a multiplier in the economy.

      For example, ATMs did displace some teller work, but fast money at any time of day also increases the velocity of money in the economy. It decreases the savings rate and encourages spending among the class of people whose money imparts the highest multiplier.

      AI does not. All the spending on AI goes to a very small minority, who have a high savings rate. Junior employees that would have productively joined the labor force at good wages, must now compete to join the labor force at lower wages, depressing their purchasing power and reducing the flow of money.

      Look at all the most used things for AI: cutting out menial decisions such as customer service. There are no "productivity" gains for the economy here. Each person in the US hired to do that job would spend their entire paycheck. Now instead, that money goes to a mega-corp and the savings is passed on to execs. The price of the service provided is not dropping (yet). Thus, no technology savings is occurring, either.

      In my mind, the outcomes are:

      * Lower quality services

      * Higher savings rate

      * K-shaped economy catering to the high earners

      * Sticky prices

      * Concentration of compute in AI companies

      * Increased price of compute prevents new entrants from utilizing AI without paying rent-seekers, the AI companies

      * Cycle continues all previous steps

      We may reach a point where the only ones able to afford compute are AI companies and those that can pay AI companies. Where is the innovation then? It is a unique failure outcome I have yet to see anyone talk about, even though the supply and demand issues are present right now.

      • By mullingitover 2026-03-12 18:18 (2 replies)

        > My prediction is no, because productivity gains must benefit the lower classes to see a multiplier in the economy.

        Baumol's cost disease hurts the lower classes by restricting their access to services like health care and education, and LLMs/agents make it possible to increase productivity in these areas in ways which were once unimaginable. The problem with services is that they're typically resistant to productivity growth, and that's finally changing.

        If you can get high quality medical advice for effectively nothing, if you can get high quality individualized tutoring for free, that's a pretty big game changer for a lot of people. Prices on these services have been rising to the stratosphere over the past few decades because it's so difficult to increase the productivity of individual medical practitioners and educators. We're entering an era that could finally break this logjam.

        • By bwestergard 2026-03-12 18:30 (2 replies)

          "Baumol's cost disease hurts the lower classes by restricting their access to services like health care and education, and LLMs/agents make it possible to increase productivity in these areas in ways which were once unimaginable."

          You've expressed very clearly what LLMs would have to do in order to be economically transformative.

          "If you can get high quality medical advice for effectively nothing, if you can get high quality individualized tutoring for free, that's a pretty big game changer for a lot of people. Prices on these services have been rising to the stratosphere over the past few decades because it's so difficult to increase the productivity of individual medical practitioners and educators. We're entering an era that could finally break this logjam."

          It's not that process innovations are lacking, it's that product innovations are perceived as an indignity by most people. Why should one child get an LLM teacher or doctor while others get individualized attention by a skilled human being?

          • By mullingitover 2026-03-12 18:50 (3 replies)

            > Why should one child get an LLM teacher or doctor while others get individualized attention by a skilled human being?

            Is the value in the outcome of receiving medical advice and care, and becoming educated, or is the value just in the co-opting of another human being's attention?

            If the value is in the outcome, the means to achieving that aren't of much consequence.

          • By somekyle2 2026-03-12 18:38 (1 reply)

            It also seems like the value of quality tutoring that doesn't primarily function as social/class signaling goes down as tools capable of automating high quality intellectual work are more widely available.

        • By mcmcmc 2026-03-12 19:05

          I’m sick of this idea that “free” services are beneficial to society. There is no such thing as a free lunch; users are essentially bartering their time, attention, IP (contributed content) and personal/behavioral data in exchange for access to the service.

          By selling those services at a cost of “free”, hyperscalers eliminate competition by forcing market entrants to compete against a unit price of 0. They have to have a secondary business to subsidize the losses from servicing the “free” users, which of course is usually targeted advertising to capitalize on the resources paid by users for access. Or simply selling to data brokers.

          With the importance of training data and network effects, “free” services even further concentrate market power. Everyone talks about how AI is going to take away jobs, but no one wants to confront how badly the anticompetitive practices in big tech are hurting the economy. Less competition means less opportunity for everyone else, regardless of consumer benefit.

          The only way it works if the “free” service for tutoring or healthcare is through government subsidies or an actual non-profit. Otherwise it’s just going to concentrate market power with the megacorps.

      • By wagwang 2026-03-12 19:06 (1 reply)

        > because productivity gains must benefit the lower classes to see a multiplier in the economy

        by this logic, the invention of mechanized farm equipment, which displaced farm labor, didn't increase productivity

        • By malfist 2026-03-12 19:13

          It made food cheaper.

      • By babypuncher 2026-03-12 18:47

        I would argue we've even already seen this play out with productivity gains across the economy over the last 40 years. The American middle class has been gradually declining since the '80s. AI seems likely to accelerate that trend for the exact reasons you point out.

        A lot of people recognize this pattern even if they can't articulate it, and that's why they hate AI so much. To them, it doesn't matter if AI lives up to the hype or not. Either it does and we're staring down a future of 20%+ unemployment, or it doesn't and the economy crashes because we put all our eggs in this basket.

        No matter what happens, the middle class is likely fucked, and anyone pushing AI as "the future" will be despised for it whether or not they're right.

        Personally, I think the solution here might be to artificially constrain the supply of productivity. If AI makes the average middle-class worker twice as productive, then maybe we should cut the number of work hours expected from them in a given week.

        The complete unwillingness of people in power to even acknowledge this problem is disheartening, and is highly reminiscent of the rampant corruption and wealth inequality of the Gilded Age.

        Technological progress that hurts more people than it helps isn't progress, it's class warfare.

    • By bobthepanda 2026-03-12 16:12

      IIRC, the way this worked was that by decreasing tellers required per branch, it made a lot more marginal locations pencil out for branches, at a time when the banking industry was expansionary.

      This is not so helpful if AI is boosting productivity while a sector is slowing down, because companies will cut in an overabundant market where deflationary pressure exists.

    • By aurareturn 2026-03-12 15:22 (8 replies)

      We're already seeing large software companies figure out that they don't need 5,000 developers. They probably only need 1,000 or maybe even fewer.

      However, the number of software companies being started is booming which should result in net neutral or net positive in software developer employment.

      Today: 100 software companies employ 1,000 developers each[0]

      Tomorrow: 10,000 software companies employ 10 developers each[1]

      The net is the same.

      [0]https://x.com/jack/status/2027129697092731343

      [1]https://www.linkedin.com/news/story/entrepreneurial-spirit-s...

      • By snarf21 2026-03-12 15:27

        Don't count all those chickens before they hatch. There might be more started but do they all survive? Think back to the dot-com boom/crash for an example of where that initial gold rush didn't just magically ramp forever. There were fits and starts as the usefulness of the technology was figured out.

      • By paxys 2026-03-12 15:25 (3 replies)

        Why will we need 1000 companies tomorrow to do the same thing that 100 companies are doing today? If they are really so efficient because of AI then won't 10 companies be able to solve the same problems?

        • By aurareturn 2026-03-12 15:35 (2 replies)

          Because that car repair company with 3 local stores previously couldn't justify building custom software to make their business more efficient and aligned with what they need. The cost was too high. Now they might be able to.

          Plenty of businesses need very custom software but couldn't realistically build it before.

          • By jimbokun 2026-03-12 19:35

            Car repair companies won’t see a meaningful improvement to their bottom line with more custom software. Will it increase the number of cars per employee per day they can repair?

          • By cityofdelusion 2026-03-12 19:22

            I see no way that company would save more money from hiring an experienced developer compared to paying their yearly invoice on the COTS product doing the same thing today. The only way this works is with a very wage suppressing effect.

        • By RHSeeger 2026-03-12 16:15

          What makes you think they'll be doing the same thing?

        • By gloxkiqcza 2026-03-12 15:28

          There’s always more problems to be solved. Some of them just weren’t financially feasible before.

      • By haliskerbas 2026-03-12 15:27 (2 replies)

        Do the booming companies pay the same as the ones who did layoffs? If you're laid off from Meta or other top tier paying company (the behemoths doing layoffs) you might have a tough time matching your compensation.

        • By RHSeeger 2026-03-12 16:14

          But do they need to? If a <role X> job at a top tier company making $600k is eliminated and two <role X> jobs at a "more average" company making $300k replace it; is that really a bad thing? Clearly, there's some details being glossed over, but "one job paying more than a person really needs" being replaced by "two jobs, each paying more than a person really needs" might just be good for society as a whole.

        • By aurareturn 2026-03-12 15:32 (1 reply)

          There's likely going to be a separation between the top earners and the average.

          I.e., if a top-tier dev makes $1m today, they'll make $5m in the future. If the average makes $100k today, they'll maybe make $60k.

          AI likely enables the best of the best to be much more productive while your average dev will see more productivity but less overall.

          • By Sleaker 2026-03-12 19:26

            I think this is assuming that the labor market knows how to identify the direct value of devs. This already seems to be a problem across the board regardless of job role.

      • By small_model 2026-03-12 15:27 (1 reply)

        I think this is true in the short/medium term, hence the confusing picture of layoffs but a growing number of tech roles overall. The limit may be just millions of companies with one tech person and a team of agents doing their bidding.

        • By aurareturn 2026-03-12 15:28

          Maybe software engineers will be like your personal lawyer, or plumber. Every business will have a software engineer on call, whether it's a small grocery store or a kindergarten.

          Previously, software devs were just way too expensive for small businesses to employ. You can't do much with just 1 dev in the past anyway. No point in hiring one. Better go with an agency or use off the shelf software that probably doesn't fill all your needs.

      • By ap99 2026-03-12 16:53

        And the differentiator will be (even more than it is now) product vision since AI-enhanced engineering abilities will be more level.

      • By raw_anon_1111 2026-03-12 15:26

        Only because VC companies are throwing money at them. How many of them are actually profitable and long-term sustainable?

      • By lovich 2026-03-12 15:29 (1 reply)

        Ah, so that explains why job growth is at a steady pace and the software industry hasn’t been experiencing net negative job growth the past year or so.

        How silly of me to rely on reality when it’s so obvious that AI is benefiting us all.

        • By aurareturn 2026-03-12 15:38 (1 reply)

          I think you're being sarcastic? I'm not sure.

          Anyways, this is the start. Companies are adjusting. You hear a lot about layoffs, but not about unemployment. And we're in a high-interest-rate environment with disruptions left and right. Companies are trying to figure out what their strategy is going forward.

          I don't expect to see a boom in software developer hiring. I think it'll just be flat or small growth.

          • By lovich 2026-03-12 16:48

            I was being sarcastic.

            We are in negative growth, and the current leadership class keeps talking about all the people they can get rid of.

            Look at the Atlassian layoff notice yesterday for example where they lied to our faces by saying they were laying off people to invest more in AI but they totally aren’t replacing people with AI.

      • By hackyhacky 2026-03-12 16:44 (2 replies)

        > We're already seeing large software companies figure out that they don't need 5,000 developers. They probably only need 1,000 or maybe even fewer.

        Long-term, they will need none. I believe that software will be made obsolete by AI.

        Why use AI to build software for automating specific tasks, when you can just have the AI automate those tasks directly?

        Why have AI build a Microsoft Excel clone, when you can just wave your receipts at the AI and say "manage my expenses"?

        Enjoy your "AI-boosted productivity" while it lasts.

        • By pixelatedindex 2026-03-12 16:50 (1 reply)

          > Long-term, they will need none. I believe that software will be made obsolete by AI.

          I think this is a bit hyperbolic. Someone still needs to review and test the code, and if the code is for embedded systems I find full automation especially unlikely.

          For SaaS platforms you’ll see a dramatic reduction, maybe like 80% but it’ll still have a handful of devs.

          Factories didn’t completely eliminate assembly line workers; you just need far fewer people to make sure the cogs turn the way they should.

          • By hackyhacky 2026-03-12 16:57 (2 replies)

            > Someone still needs to review and test the code, and if the code is for embedded systems I find it unlikely.

            I feel like you didn't understand my comment. I am predicting that there is no code to review. You simply ask the AI to do stuff and it does it.

            Today, for example, you can ask ChatGPT to play chess with you, and it will. You don't need a "chess program," all the rules are built in to the LLM.

            Same goes for SaaS. You don't need HR software; you just need an LLM that remembers who is working for the company. Like what a "secretary" used to be.

        • By esseph 2026-03-12 17:14 (1 reply)

          > Why use AI to build software for automating specific tasks, when you can just have the AI automate those tasks directly?

          Speed, cost, security, job/task management

          Next question

          • By hackyhacky 2026-03-12 17:41 (1 reply)

            > Speed, cost, security, job/task management

            All of that will inevitably be solved.

            50 years ago, using a personal computer was an extravagant luxury. Until it wasn't.

            30 years ago, carrying a powerful computer in your pocket was unthinkable. Until it wasn't.

            Right now, it's cheaper to run your accounting math on dedicated adder hardware. But LLMs will only get cheaper. When you can run massive LLMs locally on your phone, it's hard to justify not using them for everything.

    • By manwe150 2026-03-12 19:19

      > So, ATMs did impact bank teller jobs by a significant amount.

      Did it? This sounds like describing a company opening a new campus as laying off a third of their employees, partly offset by most of them still having the same job in the same company but at a new desk.

    • By onetimeusename 2026-03-12 17:36 (1 reply)

      I go back and forth on this. I relate it to software. I don't think AI can meaningfully write software autonomously. There are people who oversee it and prompt it, and even then it might write things badly. So there needs to be a person in the loop. But that person should probably have very deep knowledge of the software, especially for, say, low-level coding. But then that person probably developed the knowledge by coding things by hand for a long time. Coding things by hand is part of getting the knowledge. But people, especially students, rely heavily on AI to write code, so I assume their knowledge growth is stunted. I don't know if mathematical proofs will help here. The specs have to come from somewhere.

      I can see AI making things more productive, but it requires humans to be very expert and do more work. That might mean fewer developers, but they are all more skilled. It will take a while for people to level up, so to speak. It's hard to predict, but I think there could be a rough transition period, because people haven't caught on that they can't rely on AI: so either they will have to get a new career or, ironically, study harder.

      • By jama211 2026-03-12 18:26 (3 replies)

        An AI’s ability to meaningfully write software autonomously has changed hugely even in the last 6 months. They might still require a human in the loop, but for how long?

        • By onetimeusename 2026-03-12 19:23

          I am not saying AI's abilities are the shortcoming here. The problem is that people need to trust that software has certain attributes. For now, that requires someone with knowledge to be part of it. It's quite possible development becomes detached from human trust. As I said that would reduce the number of developers but the ones who are left would have to have deep knowledge to oversee it and even that may be gone. Whatever happens in the future, for now I think people will have to level up their knowledge/skills or get a new career and that's probably true for most professions.

        • By bwestergard 2026-03-12 18:33 (1 reply)

          Quantitative measures of this are very poor, and even those are mixed.

          My subjective assessment is that agents like Copilot got better because of better harnesses and fine tuning of models to use those harnesses. But they are not improving in the direction of labor substitution, but rather in the direction of significant, but not earth-shaking, complementarity. That complementarity is stronger for more experienced developers.

          • By jygg4 2026-03-12 18:59

            Agree. Nice to see a post with proper economic thought on the topic.

        • By mekoka 2026-03-12 19:09

          This LLM ability is directly proportional to the quantity of encoded (i.e. documented) knowledge about software development. But not all of the practice has thus been clearly communicated. Much of mastery resides in tacit knowledge, the silent intuitive part of a craft that influences the decision making process in ways that sometimes go counter to (possibly incomplete or misguided) written rules, and which is by definition very difficult to put into language, and thus difficult for a language model to access or mimic.

          Of course, it could also be argued that some day we may decide that it's no longer necessary at all for code to be written for a human mind to understand. It's the optimistic scenario where you simply explain the misbehavior of the software and trust the AI to automatically fix everything, without breaking new stuff in the process. For some reason, I'm not that optimistic.

    • By cjbgkagh 2026-03-12 15:28 (1 reply)

      No, I think it's likely that this is the first major productivity boom that won't be followed with a consumption boom, quite the opposite. It'll result in a far greater income inequality. Things will be cheaper but the poor will have fewer ways to make money to afford even the cheaper goods.

      • By alex_sf 2026-03-12 15:43 (7 replies)

        If goods aren't being sold, then the price will drop.

        • By cjbgkagh 2026-03-12 15:54 (3 replies)

          It's not that simple. If a poor person makes zero dollars, how much of the reduced-cost item could they now afford?

          We have a massively distorted economy driven by debt financialization and legalised banking cartels. It leads to weird inversions. For example, as long as housing gets more expensive at a predictable rate, housing becomes more affordable instead of less, because banks are more able to lend money. The inverse is also true: if housing were to drop at a predictable rate, fewer people would be able to get a mortgage, so fewer people could afford to buy a house. Housing won't drop below the cost of materials and labor (ignoring people dumping housing to get rid of tax debts, as I would include such obligations in the cost of acquisition). Long term it's not sustainable, but long term is multi-generational.

          • By kjkjadksj 2026-03-12 16:54 (1 reply)

            Fwiw in places like parts of the midwest housing is below cost of labor and materials. An existing house might be $70k and several bedrooms at that. You just can’t get anything built for that even if you build it all yourself.

          • By charcircuit 2026-03-12 17:27

            It depends. There are people and businesses today who even make negative dollars each month, but they still purchase things every month.

          • By carlosjobim 2026-03-12 16:45

            > Housing won't drop below cost of materials and labor

            Only if every person born needs to have a brand new house constructed for them.

            Not if - you know - people die and don't need a house to live in anymore.

            But considering how it's been the past 20 years, I'm starting to expect that a lot of the current elder generation will opt to have their houses burnt down to the ground when they die. Or maybe the banker owned politicians will make that decision for them with a new policy to burn all property at death to "combat injustice". Who knows what great ideas they have?

        • By layer8 2026-03-12 16:26

          Or the goods will just go away if too few people are willing to pay their price, and only the lower-quality cheaper-to-make goods will remain.

        • By zerotolerance 2026-03-12 16:22

          "will" being the operative word here. High school level Econ makes no promises about WHEN prices adjust. Price setting is a whole science highly susceptible to collusion pressure. Prices generally drop only when the main competition point is price (commodities). In this case the main issue is that AI is commoditizing many if not all types of labor AND product. In a world where nothing has value how does anything get done?

        • By _DeadFred_ 2026-03-12 15:56 (1 reply)

          Cool concept, but this isn't 1980. We've been sold these sorts of concepts for 40+ years now and things have only gotten worse.

          We have a K-shaped economy. Top earners take the majority: the top 20% make up 63% of all spending, and the top 10% account for more than 49%, the highest on record. Businesses adapt to reality and target the best market, in this case the top 10 to 20%, and the rest just get ignored, like in many countries around the world.

          All that unlocked money? In a K shaped economy it mostly goes to those at the top, who look to new places to park/invest it, raising housing prices, moving the squeeze of excess capital looking for gains to places like nursing homes and veterinary offices. That doesn't result in prices going down, but in them going up.

          The benefit to the average American will be more capital in the top earners' hands looking for more ways to do VC style squeezes in markets previously not as ruthless but worth moving to now as there are less and less 'untapped' areas to squeeze (because the top 10-20% need more places to park more capital). The US now has more VC funds than McDonalds.

          • By runarberg 2026-03-12 16:14 (1 reply)

            Irrelevant aside: but I hold a grudge against the economists who picked the letter K to represent increased inequality. They missed the perfect opportunity to use the less-than inequality symbol (<) and call it a “less-than economy”.

        • By idiotsecant 2026-03-12 16:19 (4 replies)

          This and other fairytales.

          The only solution here is to stop tying people's value to their productivity. That made a lot of sense in the 1900s, but it makes a lot less sense when the primary faucet of productivity is automation. If you insist on tying a person's fundamental right to a decent and secure life to their productivity, and then take away their ability to be productive, you're left with a permanent and growing underclass of undesirables and an increasingly slim pantheon of demigods at the top.

          We have written like, an ocean of scifi about this very subject and somehow we still fail to properly consider this as a likely outcome.

          • By ap99 2026-03-12 16:45 (1 reply)

            Speaking of fairytales, you're living in your own.

            Disconnecting value from productivity sounds good if you don't examine any of the consequences.

            Can you build a society from scratch using that principle? If you can't then why would it work on an already built society?

            Like, if we're in an airplane flying, what you're saying is the equivalent of getting rid of the wings because they're blocking your view. We're so high in the sky we'd have a lot of altitude to work with, right?

          • By karol 2026-03-12 16:30 (3 replies)

            The key is to do it by setting up the right structure, or to end up with it naturally, not by laws and control, because then you end up in an oppressive nanny state at the very best.

          • By carlosjobim 2026-03-12 16:47

            It's already completely disconnected, don't worry about it. Most people who own any real estate earn more in price appreciation per year than they earn in take-home salary from their real full-time jobs.

          • By s5300 2026-03-12 19:36

            [dead]

        • By marcosdumay 2026-03-12 16:35

          I don't know what economy you are looking at, because the opposite is usually true since humanity industrialized.

          If goods aren't being sold, then the price will increase.

        • By wnc3141 2026-03-12 16:52

          to the point where the cost of bringing the goods to market, or its opportunity cost, exceeds the price the market will bear. It's why people living in areas of material poverty don't just get everything at a discount.

    • By suzzer99 2026-03-12 19:09 (1 reply)

      I don't understand the economics behind bank branches. Some of the best real estate by me is taken up by giant bank branches that are always mostly empty with a few bored employees inside. And they open new ones all the time. So it's not like they're stuck in some lease.

      • By fragmede 2026-03-12 19:12

        But when those employees are meeting with clients, they create money out of thin air by making loans, which then is used to pay for goods and services such as leases.

    • By plorkyeran 2026-03-12 17:45

      I also notice that in the very first graph bank teller jobs were growing rapidly until ATMs started to be deployed, and then switched to growing very slowly. That sure suggests to me that if ATMs didn't exist bank teller growth would have continued at a faster pace than it actually did.

    • By Cpoll 2026-03-12 16:58

      > A third of them were made redundant

      If I'm reading this correctly, the interpretation should be that a third of them were transferred to new branches.

      0.66 (two-thirds retention) * 1.4 (40% more branches) = 0.924, so we only expect ~8% were made redundant.

    • By rayiner 2026-03-12 17:40

      Correct. The story isn’t correct even in the original formulation. US population increased by 50% from 1980 to 2010, and the economy became far more financialized. But the number of bank teller jobs barely grew during that period, even before the iPhone.

    • By keeda 2026-03-12 17:06

      I don't think it will, but I also think it's not all doom and gloom.

      I think it would be a mistake to look at this solely through the lens of history. Yes, the historical record is unbroken, but if you compare the broad characteristics of the new jobs created to the old jobs displaced by technology, they are the same every time: they required higher-level (a) cognitive (b) technical or (c) social skills.

      That's it. There is no other dimension to upskill along.

      And LLMs are good at all three, probably better than most people already by many metrics. (Yes even social; their infinite patience is the ultimate advantage. Prompt injection is an unsolved hurdle though, so some relief there.)

      Plus AI is improving extremely rapidly. Which means it is probably advancing faster than most people can upskill.

      An increasingly accepted premise is that AI can displace junior employees but will need senior employees to steer it. Consider the ratio of junior to senior employees, and how long it takes for the former to grow into the latter. That is the volume of displacement and timeframe we're looking at.

      Never in history have we had a technology that was so versatile and rapidly advancing that it could displace a large portion of existing jobs, as well as many new jobs that would be created.

      However, what few people are talking about is the disintermediating effect of AI on the power of capital. If individuals can now do the work of entire teams, companies don't need many of them. But by the same token(s) (heheh) individuals don't need money, and hence companies, to start something and keep it going either! I think that gives the bottom side of the K-shaped economy a fighting chance to equalize.

    • By irjustin 2026-03-12 15:33 (3 replies)

      > But will it?

      No, because if you think about Star Trek, the endgame is replicators. Or, more generally, the state where 100% of basic needs are met.

      At some point work becomes unnecessary for a society to function.

      • By collingreen 2026-03-12 16:25 (1 reply)

        Why is that the endgame with people though? Maybe I'm just jaded but several different human nature elements came to mind when I read your comment:

        Greed/Change Avoidance:

        If someone invented replicators right now, even if they gave it completely away to the world, what would happen? I can't imagine the finance and military grind just coming to an end to make sure everyone has a working replicator and enough power to run it so nobody has to work anymore. Who gives up their slice of society to make that change, and who risks losing their social status? This is like OpenAI pretending "your investment should be considered a gift because money will have no value soon". That mask came off really quickly.

        Status/Hate:

        There are huge swaths of the US population that would detest the idea that people they see as "below" them don't have to work. I can imagine political movements doing well on the back of "don't let the lazy outgroup ruin society by having replicators".

        Fuck the Poor:

        We don't do the easy things to eliminate or reduce suffering now, even when it has real world positive effects. Malaria, tuberculosis, even boring old hunger are rampant and causing horrible, unnecessary suffering all over the world.

        Don't tread on me:

        I shudder when I think of the damage someone could do with a chip on their shoulder and a replicator.

        The road to hell is paved with good intentions:

        What happens when everyone can try their own version of bio engineering or climate engineering or building a nuclear power plant or anything else. Invasive species are a problem now and I worry already when companies like Google decide to just release bioengineered mosquitos and see what happens. I -really- worry when the average person decides a big complicated problem is actually really simple and they can just replicate their particular idea and see what happens. Whoops, ivermectin in the water supply didn't cure autism!

        Someone give me some hope for a more positive version here because I bummed myself out.

        • By pixl97 2026-03-1216:48

          Solving unlimited power before solving unlimited greed invites unlimited tragedy.

      • By win311fwg 2026-03-1217:19

        Does it? The Communist Manifesto famously hypothesized that those who have the replicators, so to speak, will not allow society to freely use them.

        The future is anyone's guess, but it is certain that 100% of your needs being able to be met theoretically is not equivalent to actually having 100% of your needs met.

      • By carlosjobim 2026-03-1217:58

        We have to grow out of those kinds of dreams. That's like a kid dreaming that when he grows up he'll eat ice cream for dinner every day.

        People when they mature have an innate desire to work. It is good for body and mind. If you're curious about the world, you'll have to do some work one way or another to achieve your goals and satisfy your curiosity.

        If "society" is just a function of basic needs, then there's plenty of places in the world to visit where people live like that and use any excess energy in endless fighting against each other instead of work.

        • By Noumenon72 2026-03-1218:37

          I would say endless fighting against each other is a much more innate desire than work. I know I don't have one.

    • By fnord77 2026-03-1217:44

      we're going to find out

  • By lchengify 2026-03-1218:28

    Two anecdotes I'll share:

    First: Most people believe it was Netflix that killed Blockbuster, but that's not strictly correct. It was the combination of Netflix and Redbox that really sealed the deal for Blockbuster (and video rental generally). It normally takes not one but at least two things to fully replace the functionality of an old paradigm. It's also human nature to focus heavily on one thing (Blockbuster was aware of Netflix) but lose sight of getting flanked by something else.

    Second: Not listed here is how banks themselves have changed to be almost entirely online, which in many cases is more of an outsourcing play than a labor destruction play. My favorite example of this is Capital One, where the vast majority of their credit card operations literally cannot be handled in a branch. You must call them to, say, resolve a fraud dispute. Note that this still requires staffing and is not (yet) fully automated, just not branch staffing. It doesn't make sense to staff branches to do that.

  • By ahartmetz 2026-03-1215:09

    I do not get what's special about banking apps as opposed to online banking. I've been doing online banking in the browser on a PC since before apps and I'm still doing it because dealing with data on a phone is painful compared to a PC.

    Is an app really that much easier to use?

    • By dylan604 2026-03-1215:12

      Sounds like someone forgetting that for a large number of people, their mobile device is their only computer.

      • By dehrmann 2026-03-1215:17

        I know this is true, but for serious tasks, I need the screen real estate. I'm amazed at what some people can do from a phone, but I also wonder if they're missing things, or if it's actually inefficient.

        • By danielbln 2026-03-1215:23

          I'm going to bet that you are a millennial or older? We need our big screens for $IMPORTANT work (buying big things, money stuff, etc.). GenZ tends to be less bothered by it and just does it all on the tiny screen in their pocket. It's time to schedule a colonoscopy.

          • By dehrmann 2026-03-1216:40

            What if millennials are good at both and are choosing the right tool for the job?

          • By havaloc 2026-03-1215:49

            My boomer dad does more things on his phone than I do and I'm Gen X. It's actually astonishing how much he does on his iPhone. I'm dragging out the laptop and he's on his iPhone happy as a clam.

          • By idiotsecant 2026-03-1216:25

            I used to be with ‘it’, but then they changed what ‘it’ was. Now what I’m with isn’t ‘it’ anymore and what’s ‘it’ seems weird and scary. It’ll happen to you!

          • By twelve40 2026-03-1215:40

            That's kind of an ad hominem, but it's also beside the point: most bank apps (and websites) are actually absolute garbage, especially the top ones. Just one example: the Citi app (on different phones) for a very long time refused to let me make a payment or change my password, so I had no choice but to use desktop. Somehow, top banks' ugly websites still allow more functionality and have fewer bugs than their mobile apps, which are very often just dumbed-down webviews or simplifications of their websites.

        • By bjtitus 2026-03-1215:18

          I wouldn't call checking a bank balance and initiating transfers "serious tasks". Maybe important but they aren't complex.

        • By rkuykendall-com 2026-03-1215:25

          I am going to guess you are 30 or older. Google image search "laptop tasks millennial" to see that this is a feeling shared among our cohort but not the younger cohort.

        • By jama211 2026-03-1218:28

          Do you need it, or do you just feel more comfortable with it?

        • By neutronicus 2026-03-1215:19

          Or if they go to the public library when those tasks come up.

      • By freedomben 2026-03-1215:22

        Browsers and websites work pretty well on mobile devices too. Website != desktop only

        • By dylan604 2026-03-1215:41

          If you consider a website fully laden with ads as working. I have yet to find an ad blocker for iOS/iPadOS that works as well as the ones on my computer. I also hate apps, with all of their invasive data hoarding that is much more controllable on my computer. So to me, websites on mobile are broken, as they are full of malware vectors that are not present when looking at the same website on my non-mobile device. For me, website === desktop only.

          • By rsync 2026-03-1219:18

            I encourage you to install the dns4eu ad blocking profile on your ios device.

            It’s free, it’s transparent, you can read the profile… And it takes two minutes.
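
            If you'd rather test the resolver than just trust the profile, here is a rough Python sketch using dnspython (the ad-blocking endpoint URL is my assumption; check joindns4.eu for the current one):

              import dns.message
              import dns.query  # pip install "dnspython[doh]"

              # Ask dns4eu's ad-blocking resolver over DNS-over-HTTPS.
              # (Endpoint URL is an assumption, not verified documentation.)
              # Ad/tracker domains should come back blocked instead of resolved.
              query = dns.message.make_query("example.com", "A")
              response = dns.query.https(query, "https://noads.joindns4.eu/dns-query")
              print(response.answer)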

      • By eloisant 2026-03-1216:52

        That wasn't true before smartphones: everyone had a computer so they could access the Internet. Except maybe in developing countries - but the article is about the US.

        • By dylan604 2026-03-1217:07

          At one point, humans had not stepped on the moon. At one point, we didn't know about antibiotics. At one point....

          It doesn't matter what used to be; we're discussing what is now. We now have mobile devices that are much cheaper for people to obtain than a computer. For most, that device is more powerful than any computer they could afford. Arguing against the fact that a vast number of people's only computing device is their mobile is just arguing with a fence post. It serves no purpose.

      • By bjtitus 2026-03-1215:18

        Exactly. 96% of internet users use mobile phones. 62% use PCs.

    • By mrweasel 2026-03-1216:04

      My bank decided that the online banking website needed to be more like the app, so now they are both terrible. Basically the entire site is white space on the computer, because everything is centred and dumbed down. Input fields for numbers are invisible; they are just a label saying "Kr" and you're supposed to tap it so the numerical keyboard on the phone pops up, except it obviously doesn't on the computer.

      Paying bills is easier on the phone in the sense that bills in Denmark have a three-part number, e.g. +71 1234567890 1234678, where the first is a type number, the second is the receiver, and the last is a customer number with the receiver. The phone lets you just use the camera to scan the number.

      Transferring money is terrible on both platforms, because it's designed to be doable on the phone, meaning three or four screens, but it gives you no overview. There's plenty of space on a computer for a proper overview giving you a feeling of safety, but it's not used. Same for the account overview: designed for the phone, and it doesn't adapt to the bigger screen to provide more details, so you need to click every single expense to see what it is exactly.
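
      For illustration, a minimal sketch of pulling apart that three-part code (a hypothetical parser modeled on the example above; real Danish payment slips use delimiters and check digits this ignores):

        import re

        # Hypothetical "+<type> <receiver> <customer number>" layout,
        # following the description above -- not the real wire format.
        PATTERN = re.compile(r"^\+(\d{2})\s+(\d+)\s+(\d+)$")

        def parse_payment_code(code: str) -> dict:
            m = PATTERN.match(code.strip())
            if m is None:
                raise ValueError(f"unrecognized payment code: {code!r}")
            type_number, receiver, customer = m.groups()
            return {"type": type_number, "receiver": receiver, "customer": customer}

        print(parse_payment_code("+71 1234567890 1234678"))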

      • By ahartmetz 2026-03-1216:26

        I've had the same thing happen. Huge buttons, a lot of whitespace, little functionality in the default web version. To deal with stocks and such, the old version is still available somewhere.

    • By cheema33 2026-03-1218:44

      > I do not get what's special about banking apps as opposed to online banking.

      I use both. In the beginning I preferred the web version: I can use my large monitor to see more data, and use a full keyboard and mouse. But I have started to use the mobile version more. For Wells Fargo at least, the mobile version is faster to log into because of Face ID support. The website requires a lot more clicks and keystrokes. Also, the mobile app makes it easy to deposit checks if and when I get them.

    • By DonsDiscountGas 2026-03-1218:57

      I've had the same thought. The only major difference that I can think of is the built-in camera making check deposits easier. It may also be that people were just generally using computers and the internet more over this same time period, although a lot of that is because of smartphones.

    • By lizknope 2026-03-1218:32

      Yeah, I have been doing online banking since around 1998.

      I have refused to install the bank app on my phone because I see no point in it, and just downsides in case I get mugged (bad experience in my teenage years).

      The 1 check I get a year takes about a minute to deposit at the ATM on my way to work.

    • By conductr 2026-03-1215:23

      My main reason to go to the bank once online banking existed was to deal with physical things, mainly checks, and specifically depositing them. Now I can usually do that with my phone because of the camera. Even if I had a webcam before, I don't recall the functionality being there. Banks had check scanners, but usually for businesses, and my check volume is really low so it never made sense to get one (they usually came with a monthly fee iirc).

      Even now, the mobile deposit limit seems sufficiently low that I still go to the bank more frequently than I'd like. Luckily, the ATM at the bank has a check scanner now that doesn't have a limit, so that's usually easier and faster. It's the daily $5,000 limit I hit the most; a single check can put me over it and require a trip to the bank. I think the monthly limit is $30,000, and that doesn't get in my way often. I think $5,000 is too low for a daily limit. It's common enough that I have to make a $5k+ settlement with friends/family that almost always has to be done by check. (For the curious: this is usually travel that I pay for and we settle up later.)

      Less common, but sometimes I need to get a bank check (guaranteed funds) or a money order. Way less frequent is the need to get or give cash. I can usually use the ATM for this unless it's a larger withdrawal or I need some particular denomination. This whole paragraph accounts for about 1-4 trips in any given year, though.

    • By retired 2026-03-1219:00

      An app on your phone can be more secure as you are using the device itself as a hardware token.
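
      A minimal sketch of that idea (simplified; in practice the private key lives in the phone's secure element and the bank verifies server-side):

        import os
        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.asymmetric import ec

        # Enrollment: the device generates a key pair once and registers
        # the public key with the bank.
        device_key = ec.generate_private_key(ec.SECP256R1())
        enrolled = device_key.public_key()

        # Login: the bank sends a fresh random challenge; only the enrolled
        # device can produce a valid signature over it.
        challenge = os.urandom(32)
        signature = device_key.sign(challenge, ec.ECDSA(hashes.SHA256()))

        # Bank side: verify() raises InvalidSignature if it doesn't match.
        enrolled.verify(signature, challenge, ec.ECDSA(hashes.SHA256()))
        print("device proved possession of the enrolled key")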

    • By 1123581321 2026-03-1215:22

      Yes, the apps perform better/faster and generally have more UI thought put into them. Overall, lower friction. Often when people need to use their banking app, they're in a hurry, maybe stressed (e.g. in line at a grocery store) so everything the bank can do quickly and with visual assurance helps.

      On the premium end of banking, where users generally aren't stressed about money, offering an app is more about catering to however the user prefers to interact.

      • By ahartmetz 2026-03-1215:30

        A small screen and shitty keyboard are friction to me shrug

        • By DonsDiscountGas 2026-03-1218:59

          I'm the same way but we're both posting on hacker news. Many people prefer phones

        • By 1123581321 2026-03-1215:32

          You must know most people only have their phones when they are running errands, at work, etc.

    • By simonw 2026-03-1215:26

      Official banking apps are harder to phish than websites. They also tend to keep you signed in for longer, especially once you enable something like FaceID.

    • By forinti 2026-03-1215:16

      One bank I work with seems to have all but given up on online banking and I just have to use their app because online banking will no longer work on Linux (although they don't openly admit it).

      I think Android and iOS are safer platforms than PCs and that's why banks want you to use your phone.

      • By empyrrhicist 2026-03-1215:24

        > online banking will no longer work on Linux

        How? Across multiple browsers?

        > I think Android and iOS are safer platforms than PCs and that's why banks want you to use your phone.

        This statement fills me with revulsion and rage lol. The only real "safety" involved here is the removal of user agency. I have a lot more trust in a machine I can actually control, secure, and monitor than the black box walled-garden of phoneland.

        • By zetanor 2026-03-1217:25

          Your bank's insurer trusts Google's security more than yours, and they must surely (and rightfully) believe that while Google would spy on you, they wouldn't steal your bank account.

    • By 1980phipsi 2026-03-1215:10

      You can deposit checks via the app pretty easily.

      • By fweimer 2026-03-1215:13

        The last time I used a check was close to thirty years ago. I assume ahartmetz's experience is similar.

        Many countries have functioning giro systems. The U.S. is just an outlier.

        • By connicpu 2026-03-1215:19

          I've never written a check, but I have had to deposit occasional checks. In the last 6 years the only checks I've received were first paychecks at a new job (before direct deposit was set up) and my covid stimulus checks.

      • By ahartmetz 2026-03-1215:15

        I'm in Europe where the situation is different: checks haven't been used in appreciable numbers for 30 years or so. It's all online or paper transfer orders. If you get a pre-filled paper transfer order, you can type (or scan and OCR I suppose) the same data into the online form.

        • By bluedino 2026-03-1215:20

          Your grandma doesn't give you a $10 check for your birthday in Europe?

          What about manufacturer rebates?

      • By contracertainty 2026-03-1215:24

        What's a check? As the saying goes, 'I'm too European for this'.

        On a more serious note, the last time I saw a cheque in the UK was my grandfather balancing his cheque book in the mid 80s. It really has been that long since they were in general use in the UK, at least.

        Just like with the prevalence of Apple/iPhones, the US banking system is a global outlier.

        Things you can't do with my banking app but can do with the website:

        - Extract your transactions to excel/csv

        - Use OpenBanking

        - See all my accounts on screen at once

        - Sharedealing

        - International transfers

        But people are right: banks trust the mobile app more, and rely on it as an MFA device, so even if you use the website you still need the app.

        • By retired 2026-03-1219:04

          Europeans have checks as well, so that doesn't really make sense.

      • By monocularvision 2026-03-1215:19

        Yep, check deposit was the last reason I might regularly visit a bank (although even before the iPhone, I would use the ATM for that)

    • By ericmay 2026-03-1215:13

      How do you scan a check on your PC?

      Generally yes the apps tend to be easier to use for most things, especially with a high-speed internet connection. Customers prefer them, banks build them since customers prefer them.

      • By freedomben 2026-03-1215:19

        My PC has had a scanner connected to it for over 20 years, and in the mid 00s I was scanning and depositing checks through my bank's website (USAA). Even with modern cameras and fancy smartphone software, the results you get from a PC scan are still much better than taking a picture with your phone.

        If you don't have a scanner, nearly all laptops have a webcam built in, and many people have one for their desktop as well.

        On top of all that, there's no reason you can't use your smartphone camera to upload an image into a website through the mobile browser. I've done it many times for things. Just this morning I "scanned" a receipt into Ramp by taking a picture with my smartphone in the mobile browser.

        You can't invade the user's privacy nearly as well in a browser, which matters a lot for analytics/marketing, so there's a lot of incentive for the app creator to force a mobile app. But I think we should be honest that it's not for the user, it's for the company.

        • By ericmay 2026-03-1215:34

          > My PC has had a scanner connected to it for over 20 years

          You're basically the only person in America doing this. Tens of millions of folks are just scanning checks with the app on their phone, and it's objectively a much better experience lol. The resolution of a photo taken on your smartphone is beyond good enough; there's no need to over-engineer something here.

          > You can't invade the user's privacy nearly as well in a browser (which is great for analytics/marketing), so there's a lot of incentive to the app creator to force a mobile app. But I think we should be honest that it's not for the user, it's for the company.

          I agree with your first sentence, but not your second one.

          Banking applications can certainly get more/different data on you from using the app, but the job of the bank is to protect money and to know their customer. Privacy is secondary, outside of things like other people knowing your account balance, unauthorized access, &c. That's for the bank, because they don't want to lose your money, but it's also for you, because you don't want other people getting access to your money.

        • By bob1029 2026-03-1215:25

          > the results you get from a PC scan are still much better than taking a picture with your phone.

          The quality of the check images is not as big of a deal as you might think. No one is actually inspecting these unless the amount of deposit is near a limit or the account is flagged for suspicious activity. You definitely do not want to throw away the physical copy until the bank confirms the deposit.

      • By ninalanyon 2026-03-1215:20

        Haven't written or received a cheque in thirty years. But surely you could do it with any kind of digital camera, even a webcam.

        • By simonw 2026-03-1215:30

          Out of interest, do you live in a country other than the USA?

          (I'm guessing you are because in the USA they spell it check, not cheque.)

          I asked because the USA still seems to be stubbornly check-focused.

      • By bluedino 2026-03-1215:18

        Visioneer PaperPort!

        I wonder if you can use a webcam?

    • By eloisant 2026-03-1216:51

      No, the article is wrong about the iPhone.

      It's the Internet that killed bank tellers.

      • By ghaff 2026-03-1217:25

        And you still need bank branches every now and then for various things. I still don't understand how these expansive bank branches are profitable.

      • By socalgal2 2026-03-1217:27

        It's also not the iPhone given Europe is 60-70% Android

        • By retired 2026-03-1219:07

          Android market share in Europe is dropping, hasn’t been 70% in a while and it’s closing in on 60%.

      • By lotsofpulp 2026-03-1217:53

        Best way to get clicks without publishing something of substance is to publish something wrong. If the article was titled "The internet killed bank teller jobs", then people would think "duh" and no one would click on it.

    • By wolttam 2026-03-1215:18

      I can do all the same things with my bank with a browser that I can via the app.

      It seems like a natural evolution of the technology and adoption rates to me. There was rudimentary online banking in the 2000s, then we saw banks shift to fully online presences in the 2010s. Maybe it wasn’t “the iPhone” but just the fact that by the 2010s, everybody had a device in their pocket.

    • By jldugger 2026-03-1218:46

      Ever deposit a check via PC browser?

      • By jdauriemma 2026-03-1218:51

        +1, this is my use case as well

    • By snarf21 2026-03-1215:24

      Mostly easier in the sense that it is always in your hand already, not at home on the charger on your desk.

    • By jama211 2026-03-1218:28

      Yes? Why would I go over to my computer and boot it up and sit down and type in a website when I could just pull my phone out tap tap done?

    • By kjkjadksj 2026-03-1216:56

      My bank doesn’t allow for Zelle access on PC. Otherwise I would never mobile bank.

    • By Obscurity4340 2026-03-1215:35

      Honestly, it's overkill. When my MacBook went kaput, I had to start doing everything on my iPhone. I had to get a good mobile documents office suite (Collabora is great), do all my banking with either mobile apps or browser apps, etc. It's been fine; I doubt I would use a full-size computer for that anymore.

    • By add-sub-mul-div 2026-03-1215:19

      Right, I'm going out of my way to avoid inviting Google/Apple and their respective app store surveillance ecosystems into my transactions. I don't even have banking apps installed. I don't understand why so many people are prostrating themselves to this future for minor convenience.

    • By jader201 2026-03-1215:16

      I mean, this argument isn’t really specific to banking apps. This could apply to any native vs. web app, in general.

      Native apps can provide a bit more streamlined UX (e.g. Face ID), while also being able to provide more robust features (mobile deposit).

      The downsides are arguably higher development costs / OS compatibility, and having to install a separate app.

    • By nonameiguess 2026-03-1215:40

      I'm always a bit confused in these discussions about what is special about banking software of any kind at all. My bank has an app, but other than checking a balance every now and again, the only reason I use it is that it's also my insurance provider and I make claims through it. For actual banking, I don't really do any, through the website or the app. My pay is direct deposit. My purchases are on credit, with payment details generally stored with the vendor; otherwise, I have cards or use the numbers. Monthly balance payoff is autopay. I had to go into the website once to set all that up, however many years ago that was, but people talk in these threads like they're in their banking apps directly moving money around all the time, actually making payments with the app. Why?

      • By acatnamedjoe 2026-03-1217:14

        I have a personal current account, a shared current account with my wife, and several savings accounts. It is frequently necessary to move money between these accounts.

        Also, here in the UK we don't really use Venmo or anything like that, so normally transferring cash to and from friends and family happens by bank transfer as well.

    • By SpaceManNabs 2026-03-1215:22

      Doing it on the go via the app is much easier than using the web app through the main OS browser just because the UI is optimized. Not a problem with the web app approach, just that there isn't as much investment in it, due to zeitgeist I guess.

      Also since you are already using 2FA, you are already on the phone so might as well do basic operations there.

      I can also look at transactions in my bed before going to bed so that is nice.

      If I need to look at a support ticket or look at transactions more deeply, I still use the desktop approach.

      • By freedomben 2026-03-1215:24

        I don't think many people would argue that there shouldn't be a mobile app, just that there should also be a website/webapp way to do it as well if you don't want to install their native app.

    • By dartharva 2026-03-1215:14

      Mobile payments (at least in places where they are executed correctly) are certainly a huge improvement over physically exchanging cash and change. I haven't needed to take out my wallet for years.

      • By ahartmetz 2026-03-1215:18

        I don't see what difference it makes. If you use cash, you draw it at the ATM.

    • By everdrive 2026-03-1215:24

      You just need to understand how things are now. Here are a few modern smartphone conventions that render banking on an old-fashioned PC totally obsolete:

      - Remembering that you need to do banking, but waiting to do it until you're at home in front of your computer. This is impossible now, and if I don't follow the impulse the moment it occurs, the impulse will forever escape into the ether.

      - Even the mere mention of needing to observe a URL is often far too scary. Typing one in, or using a browser bookmark, is of course impossible.

      - Using a keyboard and mouse. It's just too onerous to use tools that are efficient and accurate. Modern users would much rather try to build a mental map of the curvature of their thumb, so that when they touch their touchscreen and obscure the button they're hitting, they can reference that 3D mental map to guess at what portion of the screen they've actually pressed. Getting this wrong 30% of the time does not detract from the allure of touch screens.

      - Using a normal-sized screen that allows you to actually see a lot of data at once, or even use multiple tabs. Again, this is really unthinkable. Of course it would be completely unacceptable to need to wait to do your banking until you're in front of a computer. It's 2026, and I cannot be bothered to remember to do a task later. But, in needing to always follow every impulse immediately, it doesn't matter that my phone screen only displays a small amount of information at once, or that tabbed browsing is impossible in a banking app. Those inconveniences are acceptable, or even welcome!

      • By ido 2026-03-1215:44

        I literally can't find where the bookmarks even are on Edge (I didn't care enough to search online).

        • By ahartmetz 2026-03-1218:03

          Autocompletion is my bookmarks collection for frequently visited websites.
