Technological innovation enables gains in individual productivity (defined as output per hour of work), and thus prosperity, but that economic engine seems to be slowing. GDP grows when we increase inputs (capital, labor) or achieve productivity gains (which shift the production function upward, given a set of inputs).
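In standard growth-accounting terms (a sketch to make the claim concrete; the notation is mine, not the author's), this is:

```latex
Y = A \cdot F(K, L)
```

where \(Y\) is output, \(K\) is capital, \(L\) is labor, and \(A\) is total factor productivity. Adding capital or labor moves you along \(F\); technology raises \(A\), shifting the whole production function upward for the same \(K\) and \(L\). The slowdown discussed below is a slowdown in the growth of \(A\).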
Historical analysis shows technology advancement as the key driver. However, the engine of technology → productivity → prosperity seems to have slowed dramatically: over the last decade, aggregate gains have come at less than 40% of the pace of the previous two decades.
(Chart: St. Louis Fed, Real Output per Hour of All Persons, Index 2012=100)
where are the productivity gains?
the technology for potential economic gains is there
some argue that we're not building things that matter as much: that we're not inventing the way we did in the '90s or 2000s, that we're getting "140 characters instead of flying cars," or that we're "end of cycle" and need to wait for the next internet- or cloud-scale platform shift to see productivity gains again.
This is overly dismissive. Evolving the world's information architecture and its tools for creation, communication and transaction has the fundamental ability to reshape our society. The "flattening of the world" via the mass adoption of the smartphone circa 2007, and of video-based remote work in 2020, both represent as big a potential productivity boost as you could imagine. The lower cost to develop software (cloud compute, better tools) and distribute it (the web, smartphones, democratized adoption) has led to a Cambrian explosion of SaaS.
The costs of global commerce and entrepreneurship have already plummeted through commerce super-infrastructure such as Alibaba, Stripe and Shopify, but efficient intermediation of business transactions has a long road ahead. The AI-driven automation of industries has barely begun. And for better and worse, as consumers we collectively spend more time in front of more powerful screens. We are no less impacted by new technology in 2020 than we have been over the last few decades. We have the compute power and the algorithms — we seem ripe to benefit from more tech-driven productivity gains. What gives?
we're stuck in a rut of adjustment costs
My hypothesis is that we're stuck in a rut of "adjustment costs." Learning and implementing new technology is disruptive and costly. Traditionally, in a business, technology adoption has required implementation, decommissioning of old systems, refactoring of code bases, integration and operational change management, not to mention cultural change, buy-in from many parties, and the need for actual humans to learn to use the new technology. There's an "investment period" for any significant new technology.
This adjustment cost is experienced at different levels. Faced with an ever-increasing pace of innovation, institutions can't absorb the change quickly enough. They carry technical and talent debt, and struggle to migrate away from their Smalltalk and COBOL systems. IT leaders in the F500 will tell you that an 80/20 maintenance-to-innovation mix is common; in many companies, it's more like 90/10.
To make matters worse, the potential needle-moving technology investments a company could make do not appear linearly, and often require concentrated investment over a short period of time: a 360-degree consolidation of customer data, a re-architecture to microservices, replacement of an on-prem or homegrown ERP system, a shift to a simpler, zero-trust security environment. However, these investments can feel untenable, as decisions at large public companies are often made based on quarterly and annual financial impact.
Even for companies able to overcome these barriers, we're overlooking the adjustment costs at the team and individual level, unique to our current rich, rapidly evolving and fragmented software environment.
knowledge workers are overwhelmed by metawork
increasingly, white collar work is done in software — interacting with customers, making decisions, coordinating teams and producing artifacts (prose, code, designs, visuals, spreadsheets, presentations).
but the real work feels at risk of being overwhelmed by metawork — the friction surrounding our tools and created by the combination of them and our organizations.
There are many familiar types of metawork — here are four:
individual switching costs. we are tabbing and tabbing and tabbing. we achieve less flow state as we use more fragmented software, and lose time to digital distractions. there is also a cost to learning new tools, especially when those tools are fundamentally more powerful (and therefore naturally more complex) than those of the previous decade
data integration burden on the end user. there's a tragedy of the commons around integration in SaaS. every department buys software to serve that department, and every piece of software wants you to live within its UI. but that's not how companies work: they work across silos and across applications, so much valuable time is wasted on swivel-chair tasks like copy-pasting data from one system to another and reformatting or cleaning it, or wrangling per-application analytics and reporting that is a poor shadow of Excel/Google Sheets
software that encourages clumsy, lopsided and costly collaboration. lopsided collaboration tools encourage asking for personal resources (finite) instead of seeking informational resources (shareable). this is the colleague who treats you like their own personal AskJeeves and pings you on Slack with "what is the answer to [X]?" instead of just looking it up in the wiki. it's often because wiki search fails, and Slack search is untenable
meeting, messaging and interrupt hell. we've added communication tools like Slack and Zoom on top of email and IRL meetings, and they carry a higher ongoing burden even when you're not in meetings. as the cost of communication decreases, the number of interactions increases, as does the time required to process them
productivity gains will come from reducing the work about work
metawork is currently winning vs. focused work, and it's hurting our aggregate productivity. COVID brings the issue into sharper relief.
early data, such as this NBER working paper, shows increases in internal emails (+5.2%), the average number of recipients per email (+2.9%), the number of meetings per person (+12.9%) and the number of attendees per meeting (+13.5%).
every knowledge worker is now remote, spending all their productive hours working and collaborating in software. existing frictions, distractions and stress are magnified. increasingly, being better at your computer tools is being better at your job.
we're clearly in the adjustment-cost rut of the technology adoption cycle. the companies that figure out how to reduce metawork fastest will be advantaged, both from a productivity and a talent perspective. some companies will drown in the adjustment costs and metawork, and lose out to competitors.
I'm optimistic that productivity gains will come from reducing the work about work, for knowledge workers and knowledge teams. the potential gains from software adoption are there. the companies who get us out of this adjustment cost rut will unleash them.
- better measurement of knowledge work and use of time, without burdening or policing workers
- integration and automation at the end-user (creator, builder) level
- software that is easier to learn, but guides users toward becoming super-users
- software that crosses silos of users, vs. being built "for Sales" or "for HR" or "for design"
- software that balances the cost of communication between sender and receiver
- software that surfaces low-effort, high-signal information to the user
- software that reduces the need for meetings, and software that restores single-tasking
- software companies that figure out how to cross the adoption chasm, from end-user to team and company standardization
thinking about this, too? sarah at greylock dot com