Escape to nowhere – Why new jobs might not save us from machine workers

This is a companion piece to my 2020 essay “Creative” jobs won’t save human workers from machines or themselves, so I recommend rereading it now. In the three years since, machines have gotten sharply better at “creative” and “artistic” tasks like generating images and even short stories from simple prompts. Video synthesis is the next domino poised to fall. These advancements don’t even represent the pinnacle of what machines could theoretically achieve, and as such they’ve called into question the viability of many types of human jobs. Contrary to what the vast majority of futurists and average people predicted, jobs involving artistry and creativity seem more ripe for automation than those centered on manual labor. Myth busted.

Another myth I’d like to address is that machines will never render human workers obsolete because “new jobs that only humans can do will keep being created.” This claim usually comes up during discussions of technological unemployment, and its proponents point out that it has reliably held true for centuries, and that each scare over a new type of machine rendering humans permanently jobless has evaporated. For example, the invention of the automobile didn’t put farriers out of work forever; it just moved them into car tire shops.

The first problem with the claim that we’ll keep escaping machines by moving up the skill ladder to future jobs is that, like any other observed trend, there’s no reason to assume it will continue forever. In any type of system, whether we’re talking about an ecosystem or a stock market, it’s common for trends to hold steady for long periods before suddenly changing, perhaps due to some unforeseen factor. Past performance isn’t always an indicator of future performance.

The second problem with the claim is that, even if the trend continues, people might not want to do the jobs that become available to them in the future. Let me use a game as an analogy.

“GeoGuessr” is an e-sport where players are shown a series of Google Street View images of an unknown place and have to guess where it is by marking a spot on a world map. The player whose guess lands closest to the actual location wins. Some people are shockingly good at it, and some tournaments offer $50,000 to the winner.
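The game’s core mechanic reduces to a great-circle distance calculation. Here’s a minimal Python sketch using the haversine formula; the player names and guesses are invented for illustration:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Actual location: the Eiffel Tower in Paris
actual = (48.8584, 2.2945)

# Hypothetical players' guesses
guesses = {
    "player_a": (48.85, 2.35),   # guessed central Paris
    "player_b": (51.51, -0.13),  # guessed London
}

# The guess with the smallest distance to the actual location wins
winner = min(guesses, key=lambda p: haversine_km(*guesses[p], *actual))
print(winner)  # player_a
```

Real GeoGuessr converts distance into points, but the ranking logic is the same: smaller distance beats larger distance.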

Anyway, some guys built a machine that can beat the best human at it.

This is a good model of how technological unemployment could play out in the future. GeoGuessr, which could be thought of as a new job made possible by advances in technology (e.g. – Google Street View, widespread internet access), was created in 2013. Humans reigned supreme at it for ten years until a machine was invented that could do it better. In other words, this occupation blinked in and out of existence in the space of a decade.

That’s a long enough run for an average person to train, become an expert, and net a steady income from it. However, as computers improve, they’ll learn new tasks faster. The humans who played GeoGuessr full-time will jump to some new kind of invented job made possible by a newer technology like VR. There, humans will reign supreme for, say, eight years before machines can do it better.

The third type of invented job will exist thanks to another future technology like wearable brain scanners. The human cohort will then switch to doing that for a living, but machines will learn to do it better after only six years.

Eventually, the intervals between the creation and obsolescence of jobs will get so short that it won’t be worth it for humans to even try anymore. By the time they finish training for a new job, they might have only a handful of years of employment ahead of them before being replaced by yet another machine. The velocity of this process will make people drop out of the labor market in steadily growing numbers through a combination of hopelessness and rational economic calculation (especially if they can just get on welfare permanently). I call this phenomenon “automation escape velocity”: the point at which machines learn new work tasks faster than humans can, or so fast that humans’ window of advantage is too small to capitalize on.
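The squeeze can be sketched numerically with the illustrative figures above (a human-dominance window starting at ten years and shrinking by two per job cycle), plus an assumed fixed retraining time of three years, which is my own invented number:

```python
# Illustrative sketch of "automation escape velocity." All numbers are
# assumptions: the human-dominance window starts at 10 years and shrinks
# by 2 years per job cycle, while human retraining always takes 3 years.
RETRAINING_YEARS = 3

window = 10
cycle = 1
while window > RETRAINING_YEARS:
    earning_years = window - RETRAINING_YEARS
    print(f"Job {cycle}: {window}-year window, {earning_years} earning years")
    window -= 2
    cycle += 1

print(f"By job {cycle}, the window ({window} years) no longer even covers "
      f"retraining ({RETRAINING_YEARS} years): escape velocity reached.")
```

Under these made-up numbers, the payoff shrinks from seven earning years to one within four job cycles, and by the fifth there is no payoff at all. The exact figures don’t matter; any shrinking window with a fixed retraining cost produces the same endgame.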

This scenario shows how the belief that “machines will never take away all human jobs because new jobs that only humans can do will keep being created” could hold true yet still fail to prevent mass unemployment. Yes, humans will technically remain able to climb the skill ladder to newly created jobs that machines can’t do yet, but the speed at which they’ll need to keep climbing to stay above the machines below them will get so great that most will fall off. A minority of gifted people who excel at learning new things and enjoy challenges will have careers, but the vast majority of humans aren’t like that.

We should let machines choose jobs for us

In the last few months, I’ve posted links to a few articles with related implications.

In summary, when it comes to picking fields of study and work, humans are bad at doing it for themselves, bad at doing it for each other, and would be better off entrusting their fates to computers. While this sounds shocking, it shouldn’t be surprising: nothing in our species’ history has equipped us to perform these tasks well.

Consider that, for the first 95% of the human species’ existence, there was no such thing as career choice or academic study. We lived as nomads always on the brink of starvation, and everyone spent their time hunting, gathering, or caring for children. Doing anything else for a living was inconceivable. People found their labor niches and social roles in their communities through trial and error, or sometimes through favoritism, and each person’s strengths and weaknesses were laid bare every day. Training and education took the form of watching more experienced people do tasks in front of you and gradually learning to do them yourself through hands-on effort. The notion of dedicating yourself to study or training whose payoff in a job lay years in the future simply didn’t exist.

For the next 4.9% of our species’ existence, more career options existed, but movement between them was rare and very hard. Men typically did what their fathers did (e.g. – farmer, merchant, blacksmith), and breaking into many career fields was impossible thanks to restrictions based on social class, race, or ethnicity. For example, a low-caste Indian was forbidden to become a priest, and a black American was forbidden admission to medical school. Women were usually prohibited from working outside the home, and so had even less choice in life than men. The overwhelming majority of people had little or no access to information or ability to direct the course of their own lives.

Only in the last 200 years, or 0.1% of our species’ existence, have non-trivial numbers of humans gained the ability to choose their own paths in life. The results have been disappointing in many ways. Young people, who are naturally ill-equipped to make major life choices for themselves, invest increasingly large amounts of time and money pursuing higher education credentials that turn out to not align with their actual talents, and/or that lead to underwhelming jobs. In the U.S., this has led to widespread indebtedness among young adults and to a variety of toxic social beliefs meant to vent their feelings of aggrievement and to (incorrectly) identify the causes of such early life struggles and failures.

The fact that we’re poor at picking careers, as evidenced by two of the articles I linked to earlier and by a vast trove of others you can easily find online, isn’t surprising. As I showed, nothing in our species’ history has equipped us with the skills to satisfactorily choose jobs for ourselves or other people. This is because nowhere near enough time has passed for natural selection to gift us with the unbiased self-insight and other cognitive tools we would need to do it well. If choosing the right field of study and career led to a person having more children than average, then the situation might be different after, say, ten more generations have passed.

Ultimately, most people end up “falling into” jobs that they are reasonably competent to perform and for which they have modest levels of passion, a lucky few end up achieving their childhood dreams, and an unlucky few end up chronically unemployed or saddled with jobs they hate. (I strongly suspect these outcomes have a bell curve distribution.)

As I said, the primary reason for this is that humans are innately mediocre judges of their own talents and interests, and not much better at grasping the needs of the broader economy well enough to pursue careers likely to prosper. In the U.S., I think the problem is particularly bad due to the Cult of Self-Esteem and related things like rampant grade inflation and the pervasive belief that anyone can achieve anything through hard work. There aren’t enough reality checks in the education system anymore; too many powerful people (i.e. – elected politicians, education agency bureaucrats, and college administrators) have vested interests in perpetuating the current dysfunctional higher education system; and our culture has not come around to accepting that not everyone is cut out for success and that it’s OK to be average (or even below average).

And I don’t know if this is a particularly American thing, but the belief that each person has one true professional calling in life, and that bliss and riches await if only they can figure out what it is, is also probably wrong and leads people astray. A person might be equally happy in any one of several career types. And at the opposite end of the spectrum are people who have no innate passions, or who are only passionate about things that can’t be parlayed into gainful employment, like a person who absolutely loves writing poetry, but who also writes poor-quality poetry and lacks the aptitude and creativity to improve it.

Considering all these problems, letting computers pick our careers for us should be the default option! After all, if you’re probably going to end up with an “OK” career anyway, one that represents a compromise between your skills and interests and what the economy needs, why not cut out the expensive and stressful years of misadventures in higher education by having a machine connect you directly with the job? No high school kid has ever felt passionate about managing a warehouse, yet some of them end up filling those positions and feeling fully satisfied.

Such a computer-based system would involve assigning each human an AI monitor during childhood. Each person would also take a battery of tests measuring traits like IQ, personality, and manual dexterity during their teen years, administered multiple times to compensate for “one-off” bad results. Machines would also interview each teen’s teachers and non-parent relatives to get a better picture of what they were suited for. (I’m resistant to relying on the judgments of parents because, while they generally understand their children’s personalities very well, their opinions about their children’s talents and potential are biased by emotion and pride. Most parents don’t want to hurt their children’s feelings, want to live vicariously through them, and like being able to brag about their children’s accomplishments. For those reasons, few parents will advise their children to pursue lower-status careers, even if they know [and fear] that that is what they are best suited for.)

After compiling an individual profile, the computer would recommend a variety of career fields and areas of study that best utilize the person’s existing and latent talents, with attention also paid to their interests and to the needs of the economy. At age 18, the person would be enrolled in work-study programs and given several years to explore the options. This would be a more efficient and natural way to place people into jobs than our current higher education system. By interning at workplaces early on, young adults would get an unadulterated view of important factors like working conditions and pay.
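A toy version of that matching step might look like the following sketch, where every trait name, career, and number is hypothetical and stands in for the validated test batteries described above:

```python
# Toy career-matching sketch. All trait names, careers, and scores are
# hypothetical; a real system would use validated, repeated test batteries.
profile = {"spatial": 0.9, "verbal": 0.4, "dexterity": 0.7, "sociability": 0.3}

careers = {
    "machinist":      {"spatial": 0.8, "verbal": 0.2, "dexterity": 0.9, "sociability": 0.2},
    "sales_rep":      {"spatial": 0.2, "verbal": 0.8, "dexterity": 0.3, "sociability": 0.9},
    "lab_technician": {"spatial": 0.6, "verbal": 0.5, "dexterity": 0.8, "sociability": 0.3},
}

def fit(person, career):
    """Similarity score: 1 minus the mean absolute gap across traits."""
    gaps = [abs(person[t] - career[t]) for t in person]
    return 1 - sum(gaps) / len(gaps)

# Rank careers by fit, best first
ranked = sorted(careers, key=lambda c: fit(profile, careers[c]), reverse=True)
print(ranked)  # ['lab_technician', 'machinist', 'sales_rep']
```

A production system would add weights for economic demand and stated interests, exactly as described above, but the core operation is the same: score every career against the profile and surface the best matches rather than asking an 18-year-old to guess.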

And note that, even among highly successful people today, it’s common for their daily work duties to make little or no use of what they learned in their higher education courses. Some argue that a four-year college degree is merely a glorified way of signaling to employers that you have a higher-than-average IQ and can stick to work tasks and get along with peers in pseudo-work settings reasonably well. Instead of charging young people tens or hundreds of thousands of dollars for that certification, why not provide it earlier, less obtrusively, and much more cheaply through the monitoring and testing I described?

While I think a computer-based system would be better for people on average and in the long run, it would also be psychologically shattering to many teenagers who got the bad news that their dream career was not in the cards for them. However, it is also psychologically shattering to pursue such dreams and to fail after many years of struggle and financial expenditure. Better to get over it as early as possible, and to enter the workforce faster and as more of an asset to the economy, with no time and money wasted on useless degrees, dropped majors, and career mistakes.

Finally, the same level of technology and workforce integration could raise the value of each person’s human capital throughout their career arc. AI monitors would detect changes to each person’s skill sets and knowledge bases over time, as old things were forgotten and new things were learned. Having an up-to-date profile of a worker’s strengths and weaknesses would further optimize the process of matching them with the positions for which they were best qualified. And through other forms of monitoring and analysis, AIs would come to understand the unique demands of each line of work and how those demands were changing, and would custom-tailor continuing-education “micro-credentials” to keep workers optimized for their roles.

Teaching more people to code isn’t a good jobs strategy

Amid the sea of ink spilled over America’s purported “STEM shortage,” and the policy proposals to address this disaster by training preschoolers to code, this article stands out as one of the best counterpoints I’ve seen:

‘[Although] I certainly believe that any member of our highly digital society should be familiar with how these [software] platforms work, universal code literacy won’t solve our employment crisis any more than the universal ability to read and write would result in a full-employment economy of book publishing.’ (SOURCE)

OUCH! The article goes on to describe how lower-skilled computer programming jobs are being outsourced to India, leaving a pool of higher-skilled jobs here in the U.S., competition for which will only get more cutthroat as time passes.

I’ll add the following:

  1. Computer coding is a dry, difficult job that few people are suited for. It requires tremendous patience, good math skills, and a willingness to work brutal hours to get promoted, and it provides few (if any) opportunities for self-expression or for emotional interaction with clients. You sit in a cubicle looking at numbers and letters on a screen, tediously typing away and testing your program over and over to work out the kinks. The notion that America can expand its white-collar workforce by incentivizing more people to become computer programmers rests on the flawed assumption that human beings are interchangeable widgets lacking the innate strengths, weaknesses, and preferences that limit their job options. The vast majority of people just aren’t cut out to spend eight hours a day poring over computer code.
  2. Wages will decrease if labor supply increases. As with any other profession, computer programmers’ salaries are determined by supply and demand. If the STEM Shortage Chicken Littles get their way and the number of American computer programmers sharply increases, then median wages will fall unless there’s an equivalent rise in demand for their services. Lower pay will make an already dull and difficult job not worth it for many coders, who will start fleeing to other jobs, counterbalancing the inflow of new coders.
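The second point is just movement along a downward-sloping demand curve. Here’s a toy linear model of it; the intercept, slope, and headcounts are all invented for illustration:

```python
# Toy labor-market sketch: a fixed, downward-sloping linear demand curve
# for programmer labor. All coefficients and headcounts are invented.

def market_wage(supply: int) -> int:
    """Market-clearing wage: each extra 1,000 workers knocks $50 off."""
    intercept = 200_000       # hypothetical wage at near-zero supply ($/yr)
    slope_per_thousand = 50   # hypothetical wage drop per 1,000 workers ($)
    return intercept - slope_per_thousand * (supply // 1000)

before = market_wage(1_000_000)  # 1M programmers
after = market_wage(1_400_000)   # 40% supply influx, demand curve unchanged
print(before, after)             # 150000 130000
```

In this made-up market, a 40% jump in supply with no change in demand cuts the market wage from $150,000 to $130,000, which is the mechanism behind the exodus-and-counterbalance dynamic described above.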

If we do think there’s a shortage of computer programmers in America, then there’s a fair case that the best way to fix it is to retain existing talent rather than attract new entrants to the field. Complaints about age discrimination against older workers, low pay, and overly demanding work schedules seem pervasive if the news out of Silicon Valley is to be believed, and they’re corroborated by high turnover rates at software companies.

Links

  1. https://qz.com/987170/coding-is-not-fun-its-technically-and-ethically-complex/
  2. http://blogs.harvard.edu/philg/2017/07/09/teaching-young-americans-to-be-code-monkeys/
  3. https://www.fastcompany.com/3058251/why-learning-to-code-wont-save-your-job
  4. https://techcrunch.com/2013/05/05/there-is-in-fact-a-tech-talent-shortage-and-there-always-will-be/
  5. http://www.npr.org/sections/ed/2015/09/18/441122285/learning-to-code-in-preschool
  6. https://www.bls.gov/ooh/computer-and-information-technology/computer-programmers.htm#tab-8