
stepping aside


Feb 16, 2024
In Only Humans Need Apply, the authors identify five ways that people can adapt to automation and intelligent machines. They call it ‘stepping’. I have added in parentheses the main attributes I think are needed for each option.

  • Step-up: directing the machine-augmented world (creativity)
  • Step-in: using machines to augment work (deep thinking)
  • Step-aside: doing human work that machines are not suited for (empathy)
  • Step narrowly: specializing narrowly in a field too small for augmentation (passion)
  • Step forward: developing new augmentation systems (curiosity)

There is a lot of talk and media coverage about stepping-up, stepping-in, and stepping-forward. I have previously discussed stepping-in and concluded that anyone affected by these technologies [AI, GPT, LLM] needs to understand their basic functions and their underlying models. These tools will be thrust into our workplaces very soon. So let’s step-in to working with machine learning but with a clear understanding of who needs to be in charge — humans. I stand by this position today.

Few people are discussing ‘stepping-aside’ — using our human empathy — as an option for work. I believe that connecting our work as much as possible to human-to-human interactions is becoming essential. The potential for Gollem-class AIs (Generative Large Language Multi-modal Models) to bring about “the total decoding and synthesizing of reality” is scary.

Automation can be a good thing. There are many rote, dangerous, and mundane tasks that are better done by machines. But we cannot and should not automate our humanity. We are already seeing attempts to do so with automated responses in email and texting.

“So what happens when we automate our most impactful and superior cognitive capacity—thinking—and we don’t think for ourselves? I think we end up not acting in very smart ways, and then the algorithms are trained by behaviors that have very little to do with intelligence. Most of the stuff we spend doing on a habitual basis is quite predictable and monotonous and has very little to do with our imagination, creativity, or learnability—which is how we refer to curiosity.” —Tomas Chamorro-Premuzic, author — I, Human: AI, Automation, and the Quest to Reclaim What Makes Us Unique. In the ‘age of AI,’ what does it mean to be smart? —2023-03-16

In step-lively I suggested that doing human work that machines are not suited for requires that we really understand what machines and code can do now, and what they may be able to do soon. For example, voice actors are already being replaced by technology. Perhaps barbers and hair stylists will survive longer. The key for this work is — choose well.

This is where I will be focusing my professional efforts — stepping aside — by encouraging manual sensemaking, frameworks like personal knowledge mastery, and participation in communities of practice. I am not ignoring new technologies in this ‘AI’ field, but I believe there is a real need for people to get better at communicating and making sense with other people.

find community

The perpetual beta coffee club is a community of practice focused on learning at work.