Think Like Nobody's Tracking Your Thinking
Irina Raicu is the director of the Internet Ethics program at the Markkula Center for Applied Ethics. Views are her own.
This essay first appeared in Recode in December 2017.
“Sometimes a kind of glory lights up the mind of a man,” writes John Steinbeck in his novel East of Eden, which is set in a California valley — Salinas, though, not Silicon. “It happens to nearly everyone. You can feel it growing or preparing like a fuse burning toward dynamite. ... It is the mother of all creativeness, and it sets each man separate from all other men.”
Okay, but what does that have to do with artificial intelligence?
In the novel, published in 1952, Steinbeck continues:
I don’t know how it will be in the years to come. There are monstrous changes taking place in the world, forces shaping a future whose face we do not know. Some of these forces seem evil to us, perhaps not in themselves but because their tendency is to eliminate other things we hold good.
That line finds an echo in our times. Various ethicists are writing, these days, about concerns that AI might eliminate some things “we hold good,” and they don’t just mean “jobs.” They write, for example, about the threat of “moral de-skilling” in the age of algorithmic decision-making. About what might be lost or diminished by the advent of robot caretakers. About what role humans will play, in general, in an age when machine learning and neural networks make so many of the decisions that shape human lives.
“It is true,” Steinbeck writes,
that two men can lift a bigger stone than one man. A group can build automobiles quicker and better than one man, and bread from a huge factory is cheaper and more uniform. When our food and clothing and housing all are born in the complication of mass production, mass method is bound to get into our thinking and to eliminate all other thinking.
We are in the process of shifting from the kind of mass production that Steinbeck talked about to one that requires much less human involvement. If “mass method” was bound to get into our thinking back then, how is it shaping our thinking now? Is that what the current focus on data collection and pattern analysis reflects?
“In our time,” adds Steinbeck,
mass or collective production has entered our economics, our politics, and even our religion, so that some nations have substituted the idea collective for the idea of God. This in my time is the danger. There is great tension in the world, tension toward a breaking point, and men are unhappy and confused.
In our own time, AI is spreading into all the various spheres of our lives, and there is tension and great concern about its impact. We are confused by dueling claims that AI will eliminate jobs or create new ones; that it will eliminate bias or perpetuate it and make it harder to identify; that it will lead us to longer, happier lives — or to extinction.
“At such a time,” writes Steinbeck’s narrator, “it seems natural and good to me to ask myself these questions. What do I believe in? What must I fight for and what must I fight against?”
Good questions for us, too.
“Our species is the only creative species,” writes Steinbeck, “and it has only one creative instrument, the individual mind and spirit of a man.” He goes on to knock down the notion of collaborative creativity, and you can certainly disagree with that; keep in mind, though, that in our time the trajectory seems to be toward handing creativity, too, over to algorithms, leaving the human mind (whether individual or collective) aside.
Of course, it is still the human mind — individual or collective — that decides what data to collect for algorithms to analyze, which factors to incorporate into those algorithms, what weight to give each factor, and what data to use in training them; but those decisions are camouflaged by the often-accepted myth that “data-driven” or “data-based” algorithmic processes are objective, or neutral (unlike other human decision-making processes).
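To make that concrete, here is a minimal, hypothetical sketch in Python. It is not any real product’s code; the feature names, weights, and numbers are invented. It simply shows where the human decisions hide inside a supposedly objective score:

```python
# A minimal, hypothetical sketch (not any vendor's actual system) of how
# human choices hide inside a "data-driven" score. The feature names,
# weights, and numbers below are all invented for illustration.

# Human decision 1: which signals about a student to collect at all.
FEATURES = ["attention_level", "quiz_score", "minutes_on_task"]

# Human decision 2: how much weight each signal gets.
WEIGHTS = {"attention_level": 0.6, "quiz_score": 0.3, "minutes_on_task": 0.1}

def engagement_score(student):
    """A weighted sum that looks 'objective' but encodes the choices above."""
    return sum(WEIGHTS[f] * student.get(f, 0.0) for f in FEATURES)

# Human decision 3: whose records count as "normal" benchmark data.
benchmark_students = [
    {"attention_level": 0.9, "quiz_score": 0.8, "minutes_on_task": 0.7},
]

for s in benchmark_students:
    print(engagement_score(s))  # one tidy number, three human judgment calls
```

One tidy number comes out, but three human judgment calls went in; change any of them and the “objective” score changes with them.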
“And this I believe,” continues Steinbeck:
that the free, exploring mind of the individual human is the most valuable thing in the world. And this I would fight for: the freedom of the mind to take any direction it wishes, undirected. And this I must fight against: any idea, religion, or government which limits or destroys the individual. ... I can understand why a system built on a pattern must try to destroy the free mind, for this is the one thing which can by inspection destroy such a system.
I think about that as I read about the latest developments in data-driven pedagogy and education technologies that try to read — in order to shape — developing minds. “Are your brainwaves private, sensitive information?” asks a recent article in CSO magazine:
Most people probably never really gave it much thought because it is not something companies usually collect and store. But if kids in school had to start wearing brainwave-detecting headbands that measure their attention levels in real time, couldn’t that impact student privacy? The brainwave attention-level results are shared with teachers and school administrators, and are collected and stored by a private company.
This is not fiction. A company named BrainCo claims to offer the “world’s first wearable device specifically designed to detect and analyze users’ attention levels,” in conjunction with “the world’s first integrated classroom system that improves education outcomes through real-time attention-level reports.” CSO reports that BrainCo has sold 20,000 devices to China, and that BrainCo’s CEO has said the company’s goal is “to capture data from 1.2 million people ... [which] will enable us to use artificial intelligence on what will be the world’s largest database to improve our algorithms for things like attention and emotion detection.”
So the claimed goal is to harvest human brain waves in order to improve artificial intelligence, purportedly with the ultimate goal of improving human education and therefore human intelligence (though, as the CSO article notes, BrainCo representatives “did not rule out that students’ brainwave data might be used ‘for a number of different things’”).
Of course, the implications of such practices go beyond student privacy; or, rather, student privacy gets at something deeper than concerns about identity theft or potential misuse of disciplinary or grade records. Creativity requires privacy. Telling students that their attention levels (and emotions, and what else about their brains?) will be detected, measured and collected is very likely to impact that “free, exploring mind of the individual human” that Steinbeck wrote about. There’s a reason why “Dance like nobody’s watching” rings so true to so many.
One might add, “Think like nobody’s strapping a band around your head to collect information about your thinking.”
Creativity involves a leap — a departure from the known, the norm. Departures from the norm are subversions of the status quo. We are social animals, and most of us would be uncomfortable with being seen as subversive. So the brain-analyzing devices themselves might well chill creative thought, even if they were purely placebos. And they are not intended as placebos — so the algorithms involved might well “learn” about the chilled thinking that they themselves caused, and magnify and perpetuate its stunting effects.
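To see how such a loop could compound, here is a toy Python simulation. It is purely hypothetical, and every number in it is invented; it only illustrates the mechanism of a system retraining on behavior it has itself suppressed:

```python
# A toy, entirely hypothetical simulation of the feedback loop described
# above: monitoring chills behavior, the model retrains on the chilled
# behavior, and its learned notion of "normal attention" drifts downward.
# All numbers are invented for illustration.

baseline_attention = 0.80  # assumed average before any monitoring
chilling_effect = 0.05     # assumed drop caused by knowing you're measured

learned_norm = baseline_attention
for round_num in range(1, 6):
    # Students hew to the enforced norm, minus the chill of being watched.
    observed = learned_norm - chilling_effect
    # The model updates on behavior it helped cause, lowering the norm.
    learned_norm = 0.5 * learned_norm + 0.5 * observed
    print(f"round {round_num}: learned norm = {learned_norm:.3f}")
```

Each round, the system adopts the behavior it chilled as the new baseline, so the drift compounds instead of correcting itself.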
The “glory” that Steinbeck wrote about is different from the insights of “a system built on a pattern.” The age of artificial intelligence forces us to work harder to define human intelligence — and to fight to defend it.
Photo by Bob May, used without modification under a Creative Commons license.