Technology has changed how and where we do our work.
It's also changing the way we think.
Thursday, June 1, 2000
In one of last season's episodes of "Star Trek: Voyager," crewmember and former Borg Seven of Nine overdoses on ambition. Questing for perfection and hungry for success, the 24th-century cyborg overloads her computerized brain, eventually paying the price with the sanity of her human half.
Seven of Nine's troubles start when she upgrades her electronically enhanced brain and starts downloading larger-than-usual quantities of the ship's log while she "regenerates," or sleeps. At first, the upgrade is a boon. Seven of Nine's skyrocketing productivity helps avert disaster when she discovers a dangerous virus in the ship's plasma coils, thanks to her turbo-charged circuitry.
But shortly, the torrent of information flooding Seven of Nine's processors overwhelms her. Paranoid and unable to distinguish between truth and fiction, she starts spreading rumors of conspiracy, hurling the ship into chaos. In the end, the diagnosis is simple and humbling: Seven of Nine's ambition has exceeded her capabilities. The ship's doctor dismantles the upgrade, and life aboard the Voyager is restored to order.
Leave it to science fiction to point out the dark side of progress. "Information overload" doesn't have a heading in the Merck Manual yet, but it should. The phrase aptly describes the cranky, insomniac state that follows when a person's synapses are fried to a smoking mess. It has other names, too: David Shenk, who wrote a book about the affliction, calls it "data smog." British psychologist David Lewis calls it "information fatigue syndrome." And of course there's Sting, who beat the curve in 1981 with the Police song "Too Much Information."
Scientists are just beginning to unravel the effect of the information revolution on the human species. They know this much: the kind of work people do in the modern industrial world is unlike any work humans have done before. It involves huge amounts of information and demands that people wrap their minds around different subjects on cue, bouncing between instant e-mail messages and urgent phone calls, pagers and Web sites, assimilating facts and ideas all day long. Multitasking used to mean making sure the rolls didn't burn while simultaneously talking on the phone and chopping vegetables. Now it means ping-ponging not just between ideas but their very presentation: in words, images and sounds, angled a dozen different ways.
It's not just the amount of information that's changed; it's also the speed of transmission. People strive to work as quickly as their computers do, an unfair competition that always leaves the humans coming up short.
We have technology--now a feature of the environment--to thank for this state of affairs. And there''s reason to believe humans are changing their behavior to adapt to it.
As the old barriers separating work and home life dissolve, work gets diffused throughout the whole of life. New cultural trends reflecting this are already emerging, and something else may be beginning, too: a different way of thinking, one that doesn't require long uninterrupted periods of time but instead mirrors the quicksilver shift of the images on a computer screen. History might look back at something called the Silicon Valley Human and recognize it as the prototype for a subtly but distinctly different human being, one that uses its brain in an entirely new way and whose cultural habits reflect a fundamental environmental shift.
Any time a species evolves, something is gained and something is lost. The question is, What are those things, and how will they change the way people live?
In the Beginning
No hallucinogen-huffing Amazon tribe could offer San Jose State University anthropologists Jan English-Lueck, Chuck Darrah and Jim Freeman charms above those of the Silicon Valley clan. They believe this place is a culture incubator, and that studying Silicon Valleyites may provide a clue to how the rest of the world will behave once it too is wired to capacity (if the rest of the world gets wired to capacity, as they like to say). The word for this culture-incubator phenomenon stems from Greek roots meaning "people" and "creation."
"To me, Silicon Valley is a premier laboratory for ethnogenesis," says Jan English-Lueck. "We are creating culture right and left. I think that's tremendously exciting."
One year ago, the trio completed 170 lengthy interviews with employees in the high-tech sector, from janitors to CEOs, shadowing their subjects throughout the work day and into their homes. Within a short time, the project's indicators began to swing toward a kind of attitudinal magnetic north: Regardless of how many hours people spent at the office, they felt like they were working all the time.
Darrah, Freeman and English-Lueck found people making work calls during the commute, reading memos in front of the TV, checking e-mail after the kids went to bed and networking during every conceivable social function, from picking up the kids at day care to chatting at barbecues. They found people taking self-improvement courses to further their careers and using job skills at home. People "worked" on their relationships, their bodies, their personalities, their spirituality.
"The code word I use for it is 'workification,'" English-Lueck says. "When we're touring people's houses they'll take us to various work stations: this is where they pay their bills, this is where they do research on their garden. The language people use, the processes they use, come straight out of work."
One person, she recalls, talked about having "family meetings." Being a consumer is another form of work: shopping for airline rates, picking an energy provider, taking care of the recycling.
There's just one problem. Most people don't want to work all the time. In fact, to lots of people this sounds like a living hell.
English-Lueck, however, is sanguine about the way work shades every part of people's lives. It's not nonstop drudgery in her view, but rather constant activity that leaves scant time to recharge.
"In many ways this is not an utterly bad thing, except that work is stressful, and that means that a lot of things that once were considered refuges are now considered work. So where do you go to get away from work if your family is work and your garden is work or driving is work?"
So who tricked us into accepting 16 hours of work every day? Computers were supposed to ease the burden of work, not just spread its weight over a larger portion of life.
Chuck Darrah believes that even though there's a crisis mentality among many working people in Silicon Valley, much of it is self-induced. If so, then it seems even he's not exempt. Darrah arrives at a 3pm interview not yet having eaten--he's had meetings all day--and polishes off a burger between questions. A boots-wearing native of Santa Clara Valley, father of two and self-professed "old frump," Darrah insists that he's really not a prophet of doom. The pace of life in Silicon Valley is fast, he concedes, but people exaggerate it--and besides, a lot of them kind of like it.
"People are very scheduled, but they're not pathological," he says. "There's kind of a pride or machismo about working so much. Despite all the talk about how things are spinning apart, when you talk to people, quite often they'll also say their lives are quite satisfying. They'll say, 'Part of this is that my life is so exciting. I'm doing things I never thought I'd be able to do, meeting people I never thought I'd meet.'"
Employers are more than happy to feed that intoxicating momentum with more thrilling work. But they also have the help of little inanimate elves--the gadgets, the go-anywhere, do-anything toys of work: laptop computers, pagers, cell phones. These are work's handmaidens, the tools that allow people to haggle with clients while sunning at the beach.
The work of manipulating information requires nothing so gross as an assembly line or machine press, so it can be done anywhere, even in pajamas. The slow seepage of work through the supposedly impermeable walls of the home is a disaster or a victory, depending on how you look at it. Maybe it spells mental absenteeism by keyboard-tapping parents who are present in body only. Maybe it's a return to a more integrated (if romanticized) way of life that prevailed for most of human history, when people worked in fields or shops close to home, their children nearby.
"Many people argue that what's going on today isn't anything new, that it's reuniting work and family in a way that was rent apart by the Industrial Revolution," Darrah says.
He says he often hears "farm stories" from people who imagine that their lives working from a computer at home are not so different from those of the dairy farmers of yesteryear. But Darrah sees a huge difference, namely that most of the activity is being driven by major corporations that are considerably more bossy than cows.
Corporations, he says, hand employees the responsibility of balancing their private lives with their professional duties as if it's easy--"As if it's somehow a level playing field between you and Hewlett-Packard." Darrah points out that in all corporate jobs, downsizing is a constant threat and competition is as ferocious within companies as between them. Try slapping the big black-and-white haunches of Intel to make it move over next time it's about to back you into a corner. You'll soon find out you don't own the old girl, even if you do get to work sitting by the pool. She owns you--and she knows that with the help of those fancy new tools, you can do a lot more work than you used to.
Darrah and English-Lueck can't predict what life will be like in 50 years. But they do know that all the interruptions by e-mail and phone calls and faxes make it difficult for their study's subjects to concentrate the way they used to.
"It means life is lived in very short chunklets, little chunks of time in which something gets done," Darrah says. "Whereas in the past you might just sit down and just do something for a long period of time. Very seldom do people have time to do that."
After having a family, Darrah had to learn to write academic papers in snippets of time, abandoning the linear beginning-to-end method he grew up perfecting. He says he can now take 10- or 15-minute increments and scribble paragraphs on his yellow tablets that he'll later rearrange into a coherent order, as if he were using a word processing program.
"I said, 'I can either never write again or I can learn to write [while] being interrupted,'" he says.
If every moment, even outside of work, is spent striving toward some officious end--reading a quick article in a trade journal, exercising to keep heart disease at bay, maintaining a network of potentially useful acquaintances with quick personal e-mails--then something has to fall away. And some people think it's the fragile things that go first: contemplative time, time to just be and not do, time to let the mind drift and spin.
Neil Quinn, director of ethics and technology for the Markkula Center for Applied Ethics at Santa Clara University, thinks creative thought is the biggest casualty of the gadget glut. It's the reflective time that's slipping away, he says, time people need to assimilate what they've learned and, as he puts it, "arrive at a new and higher understanding of things.
"Take two unrelated things that have occurred," he says. "You've read books by two authors who are completely unconnected. But you happen to have gained some third insight that draws a relationship between those two books. There has to be that quiet time that allows you to assimilate the information you're dealing with. You're never going to come up with that third conclusion, that synergy, without it."
Quinn thinks "real in-depth thinking ability" has taken a hit, too. "This may be a little bit far-fetched, but what I think we're seeing today is more people becoming jacks of all trades because we're gaining lots of information that's new to us," he says. "But we're not advancing the state of intelligence the way we previously did. So granted, more people know more things, but we're not increasing depth of knowledge."
In the argot of the metaphysicians, as above, so below. Scattered, surfacey thinking can do a number on a person's physiology as well as on intelligence. If someone's juggling a number of tasks and the tasks don't get finished, says Larry Rosen, co-author of the 1997 book TechnoStress, the brain and the body get stressed out. The brain wants to do things thoroughly, and it rebels when it can't.
"There's something called the Zeigarnik effect that says that you remember unfinished tasks better than you remember finished ones," Rosen says. He explains that when you start a task and don't finish it, it stays "on" in the brain like a string of lights. If enough strings are left on, they will disrupt sleep--usually between 2 and 4am. Multitasking increases that likelihood.
"You could light up the same number of tasks one at a time," Rosen says, "and finish each one and it would go off. But if you have 10 areas all lit up, some dimmer than others, and you're switching between, then your brain's controller is always checking in on those unfinished tasks to make sure you don't forget them. You can think of it as your brain always being in a state of fight or flight."
For the sleep-starved masses who keep waking up at 3am thinking about work, Rosen has some recommendations. Don't check e-mail or voice-mail just before going to bed. Make a to-do list at night as a cue to the brain that you'll take care of unfinished business tomorrow. Then, before going to sleep, do something totally unrelated to work: read, watch TV, write in a journal.
People who are very busy have different strategies for getting through their days. Compartmentalizing is a key strategy for some of them.
April Sakara, vice president of marketing for a startup called FastForward Networks and mother of two, has a schedule as tight as a waterproof weave, with every half-hour accounted for. She prevents information overload by picking three things to accomplish every day and "having those channels running" in her mind. When something comes up related to one of those goals, she "finds the bucket to drop it in."
Sakara is so efficient she's even recruited her dreams to help her out from time to time. Like Seven of Nine, she sometimes processes things in her sleep.
"It's very bizarre. Usually I don't dream, yet when I have things I'm struggling with, I don't have the same kind of sleep. I'm like walking the Earth or something. It's hard to explain. Sometimes it's a solution I'm looking for. Sometimes it'll be an at-ease feeling about it, like I found the bucket it needs to go in and it will be solved in my everyday routine."
Chopping up the day's activities into neat cubes that fit into certain compartments is just one part of the solution. The other part is moving between the compartments with the agility that urgent e-mails and ringing cell phones demand.
The Kids Are All Right
There is one group seemingly unfazed by the fuss, one that handles the ringing phones and constant e-mails with perfect aplomb. The kids.
"Now we have the luxury of looking at younger people--teenagers and kids--and they just accept this as the norm," says English-Lueck. "[For them] having this kind of density of information is just the way you live, and they don't have much angst about it, or regret. If anything, they kind of like it."
There's nothing like a half hour of quality time with MTV to make a grownup feel old and irrelevant. Music videos, like the commercials and movie trailers they inspired, explode with flashing images that stay onscreen for fractions of a second. It might make parents feel nauseated, but the tykes can gaze at it for hours. They can even do homework in front of it, or so countless kids have insisted in countless arguments with their parents.
The ante's been upped at the computer and console games table, too, says Christopher Erhardt, who teaches game design at DigiPen Institute of Technology in Washington.
"It used to be possible to put out something with minimum graphics and minimum sound and they'd be content to play games that weren't what you'd call graphic showpieces," he says. "But now kids want bang for their buck. They tend to want high action with sophisticated controls that are innovative but not really overwhelming."
Erhardt notes that kids are able to keep up with just about anything game designers put on the market. He attributes this not to some mysterious cognitive development but to something quite recognizable to the parents arguing with their kids over doing homework in front of the TV while talking on the phone.
"It's more an ability we all had as teenagers that we tend to forget, and that's the ability to multitask," Erhardt says. "We always as adolescents have a much better ability to compartmentalize than we do as adults."
No one has tracked the minute chemical footprints in the brain that show how the cognitive process has adapted to late-20th-century technology. But scientists have definite ideas about what they would find if they could.
"We've been tossing it around for a long time, and there's no study on it," says Larry Rosen. "Our guess is because of the multimedia nature of technology, and because of changes in the presentation of information over the years--from longer and more detailed to shorter and more to the point--that people today are getting less tolerant of longer bursts of information.
"But I'm pretty sure we're going to find that we have shorter attention spans," he says confidently. "Even the schools have cut down the amount of time for a lesson. When I was in junior high and high school, classes were exactly an hour. Classes now are 42 minutes long, and my kids can tell you exactly what time each class begins and ends."
Mental long-distance runners have fallen out of fashion. It''s all about sprinting now.
It Started With Book Larnin'
This is not the first time human brains have mutated to expand some functions while allowing others to wither. Anthropologists Darrah and English-Lueck both compare the present to the moment in the early 16th century when literacy touched down among the masses.
"One of the effects of literacy is that you can no longer remember anything," English-Lueck says. "There's no way on earth I could begin to memorize a five-hour saga, and yet that was the norm for someone who was educated, even 500 years ago. Because we learned to externalize information to paper, we started thinking differently and working differently and acting differently. And it could be that we are in a similar situation where we're going to treat information differently."
We already do. Why commit an 11-digit phone number to memory when there's a speed dialer on the phone? E-mail addresses can be easily stored in e-mail programs. And this is just the beginning of the ways people rely on technology to remember things for them.
In spite of ethicist Neil Quinn's fears about the prospects for creative thought, he thinks humans will ultimately benefit from not having to sweat the small stuff. In terms of the literacy model, he says, the modern-day analog to 16th-century memory loss is even more memory loss, courtesy of the "memory augmentation" offered by other mediums--like RAM.
The new crucial skill, Quinn says, will be knowing how to navigate the information quickly in order to retrieve what''s useful.
"I think it's better that we have our reasoning skills and deductive skills fine-tuned more than they used to be," Quinn says, "instead of cluttering our minds with pure information."
Quinn's take is a refreshing change from the usual gripe that we're headed for a national ADHD epidemic. Maybe, like a nation of absent-minded professors who've finally found the perfect assistants to mind the details, we'll dump all our petty concerns into a Palm Pilot and, free at last to pursue poetry, philosophy and higher math, get down to the serious business of thinking. The computer will alert us when it's time to address one of the petty concerns, and humankind will evolve into a nobler version of itself.
OK, maybe not. But whatever arrives in the coming generations will be cloaked in normalcy soon enough. And whatever it is that will be lost in the coming generations is not really worth lamenting, because shortly, no one will miss it.
The anthropologists know how to put it into perspective.
"We're in a time of transition," says Chuck Darrah. "We don't know how it's going to shake out yet. How long does it take for things to change? The simple answer to that is: 'However long it takes the last generation to die out.'"