To Adapt to Tech, We’re Heading Into the Shadows

Given this, your robotics rotation comes as a rude awakening: The entire da Vinci system—and therefore the entire procedure—can be controlled by one person at a time. This means your involvement in the surgical work is entirely optional. Not only are you not really fluent with this glorified videogame controller, you are working with a surgeon who knows that if they give you all-or-nothing control of this beast, you will be much slower and make more mistakes than they would. Moreover, every action you take is broadcast to the attending physician's console and onto large high-definition TVs. The attending, nurses, scrub, and anesthesiologist can see and judge it all. Put together, this means the attending will barely let you operate, will "helicopter teach" you when they do, and nurses and scrubs will spread the word to other attendings that you suck. You're stuck in "see one" mode for most of your residency. After four or five years of trying to learn the approved way, you've barely gotten to work through an entire robotic surgery, yet you are legally empowered to use this tool wherever you land.

Building embodied skill in a high-status profession is just one tiny slice of how we adapt and innovate, given new technology—but the reasons for the rise of productive deviance are clear here and evident in many other industries, ranging from policing to chip design to journalism.

What happened here? In search of major leaps in productivity, we've created and deployed intelligent technologies. These allow for two things: much higher-quality, more widely shared scrutiny of the work by more people, and much more precise, complete control by a single expert of the actions required to get the job done. On the surface, this is fantastic—it allows each expert to make better use of their talents and a team of diverse professionals to coordinate much more fluidly. Under the surface, it blocks you from learning through the "see, do, teach" pathway that's been the approved default for a long, long time. And gray-area options aren't really available—you don't have legitimate access to the system before your training starts. Nor are you going to even try to push your way into a procedure, because you know you don't have the basic skill you need to be granted control of the thing, and so does that expert. They are just going to swat you down.

If you’re going to adapt—and about one in eight residents in my study did—you have to do so in really inappropriate ways.

If you're one of the few who manage to get really good with the robot during your residency, you started getting practical exposure to it years in advance—when everyone (even you) would say it's totally inappropriate. In undergrad or medical school, you hung around in labs when you should have been getting a generalist education. You spent hundreds of extra hours on the simulator or reviewing videos of robotic surgery on YouTube when you should have been spending time with patients. Then, once all this prep showed the attending, nurses, and scrubs that you could handle the da Vinci, you used that credibility to get preferential access to procedures and, most importantly, to operate without an attending in the room. The more you did this, the better you got, and the more rope attending physicians gave you. But every one of these steps was at best a serious, very concerning breach of standards for your profession and hospital operations—and in some cases maybe even against the law.

Let's step back to think about adaptation to new tech in general. If you're involved in work that exhibits the characteristics above, you're going to turn to deviance to innovate and adapt. It would be one thing to argue this off one data set, but since I published my work on surgery, I've checked this claim against more than two dozen top-quality studies, and the pattern shows up in all of them.