The year was 1970. The place was Detroit, Michigan. My father worked as a shipping clerk in the factory of a small, family-owned business. Having dropped out of college after a year, with no technical skills beyond a good high school education, he put in five hard days’ work every week but saw no prospect of advancement into a middle-class career.
One day, my father read a newspaper ad (remember those?) for a six-month trade school program. It offered intensive technical training to anyone with the aptitude and willingness to work hard at it, and claimed there was so much demand for these skills that graduates could “write their ticket” into good jobs. Although he had no previous exposure to this kind of work, and despite the field’s historic scarcity of African-Americans, my dad decided to take a leap of faith.
With my mother’s moral and financial support, my father plunged into learning a new technical trade and mastering its unfamiliar skills, terminology, and procedures. He worked just as hard to improve his own prospects as he did for his previous employer, graduated in six months with a newly valuable skill set, and secured an entry-level role — an apprenticeship, of sorts — with a new company. Given a chance to learn on the job, he mastered his craft and embarked on a career that brought the Auguste family firmly into the middle class. It was our American Dream.
Why was it more straightforward for my father to break into a middle-class job in 1970 than it is today for more than 100 million Americans who, like him, lack bachelor’s degrees? Answering that question is vital, given the dramatic decline in economic mobility that is throttling once-vibrant communities and poisoning American politics; yet many accounts of the rising barriers to opportunity through meaningful work are sadly lacking in coherence.
One widely suspected culprit is that “technology is taking our jobs.” Specifically, the jobs of working people without advanced formal education, who are labeled ipso facto as “unskilled” and considered most likely to be eclipsed by computers, automation, and artificial intelligence (AI). Was my high school–educated father just lucky to seek work when a middle-class trade did not require mastery of “rocket science” tech skills?
In a word: no. The trade school ad that changed my father’s career path in 1970 was for a course in COBOL, the programming language used on IBM mainframe computers. Companies were rapidly adopting COBOL and were in desperate need of programmers. What was most different in 1970 was not a lack of rapid change in technology; in fact, economic historian Robert Gordon has demonstrated convincingly that 1970 was near the end of a 100-year surge in U.S. productivity growth and innovation that transformed every aspect of our economy — much more than in the roughly 50 years since.
What really distinguished the environment in 1970 were the more effective policies and practices for navigating economic and technological changes. Put differently, it was more the rules of the game that created better pathways to the middle class — often making technology an empowering tool in the hands of workers like my father. This was not because of luck. It was because of deliberate choices, including long-term investments, by our government, industry, and citizenry.
What are the rules that transformed my father’s story then, and shape our story today?
Public and Corporate R&D Investment.
While benefiting from considerable government funding for basic research, IBM chose to invest a massive “bet-the-company” sum in the 1960s to develop the generation of mainframes that my father later learned to program. Contrast such catalytic investment with today’s short-term dividend and share buyback norms.
Labor Market Access and Dynamism.
Innovation not only created demand for COBOL programmers, it also allowed that demand to be filled. An entrepreneur had realized in 1970 that companies and universities could not train COBOL programmers rapidly enough, and chose to start a business that succeeded by converting underutilized human talent and motivation into valuable skills. Employers chose to hire and train new workers who had the potential to do the job well, even if, like my father, they lacked a college degree or had never worked in an office. Strikingly, top U.S. employers in 2018 mostly overlook similar talent, having hard-wired a college degree requirement into their hiring screens in the face of approximately 600,000 unfilled jobs in the U.S. information technology field alone — a dysfunction I co-founded Opportunity@Work to address. If you can do the job, you should get the job.
Middle-Class Volatility and Risk.
The scariest moment en route to our family’s middle-class ascent came when my father quit his stable job to seek the skills to find a better one. My mother’s full-time job with benefits made this choice possible. She was paid enough in 1970 to secure our family’s health coverage and cover all our essential needs, including our small home in a working-class Detroit neighborhood. Since 1970, the basic costs of a middle-class life (notably housing, education, health care, and child care) have soared out of reach of most single- and dual-earner families; unstable earnings and benefits make it harder for a similarly situated family to take that chance today, especially if they have to relocate. As entrepreneur Reid Hoffman notes, a safety net can also be a trampoline.
And the list goes on. Workers in search of a middle-class career must navigate rules that finance training based on FICO scores; rules of corporate accounting and taxation that favor investment in machines over investment in human capacity; rules of collective bargaining, non-compete agreements, and employer concentration that hold wages down; and rules of occupational licensing boards and NIMBY zoning boards that keep newcomers out of good careers and trapped in long commutes. We treat such civic choices — a death by a thousand cuts for working-class efforts to earn and to rise — as if they were gravity, forces of nature beyond our control.
These choices are not gravity, but they are grave. Nor is technology a force of nature. The impact of technology on our lives — and on the future of meaningful work — is the result of research, investment, regulatory, and business model choices that are made by people. We may call these people scientists, engineers, ethicists, entrepreneurs, business executives, government officials, legislators, activists, and voters. AI and big data may optimize certain decisions at speeds that people can’t match, but (some) people decide what to optimize, and for whom.
I’m often asked to predict the future of work, a question usually tinged with more than a little anxiety, if not outright alarm. We’ve seen some silly — and perhaps even sinister — proposals to throttle technology from leaders who should know better. But to answer those bad ideas with claims of inevitability is a dangerous mistake. We need to stop pretending that no one need be accountable for how we use technology, and in particular for how we allow it to influence the experiences, earnings, opportunities, and outcomes of working Americans and their families.
Vice President Biden recently argued that we must “choose a future that puts work first.” We need not ignore the complexities underneath his problem statement to admit that it is a choice. Where should we start? Stop blaming the tools, and start fixing the rules.