Texas alum had vision for the Internet that changed the world
It was the mystery of the human brain that first sparked Bob Taylor’s interest in computers nearly a half century ago.
Before Taylor initiated the ARPAnet project (the precursor to the Internet), before he funded the creation of the mouse, before he led the team that helped invent personal computing, he was a graduate student in psychology at The University of Texas at Austin who, as he tells it, “was interested in the brain and how it works.”
“I realized after a while it was going to be years and years and years until we knew a great deal about the brain. I wanted more rapid reinforcement than that so I became interested in computers,” Taylor says. “Over time, as I learned more about the nature of computer programming, I realized the attraction of writing a program, posing some sort of question, putting it into a computer, and soon getting a result. Now that is rapid reinforcement.”
He also realized the computer could be more than just a piece of technology that would quickly do arithmetic. It was a tool that could create new communities and revolutionize the way people talk to each other and solve problems. That vision helped lead to the creation of the Internet 40 years ago.
New York Times technology reporter John Markoff, who is leading a conversation with Taylor at the LBJ Auditorium this week to mark the 100th anniversary of the Graduate School, has said few people deserve the title “father of the Internet” more than Taylor does. Taylor, however, shies away from that moniker.
“For the Internet to have existed and to be in the state it’s in today, it had to have thousands of fathers,” says Taylor, now retired in California. “There were a lot of people who made that happen.”
Still, many agree with Markoff and say Taylor jump-started the Internet age through his work in government in the 1960s and in the private sector in the 1970s and 1980s.
“Bob is not a technologist per se because his background is in philosophy and psychology,” says LBJ School Professor Gary Chapman, who first met Taylor in the late 1980s when Taylor was working at the Xerox Palo Alto Research Center (Xerox PARC). “But he’s got an extraordinary ability to see the future and to envision the future that technologists can make happen.”
He points to a landmark 1968 white paper, “The Computer as a Communication Device,” that envisioned online “communities of interest” with hundreds of members spread across the world working together to solve problems.
“And after that, the technologies that flowered under his leadership ever after changed just about everything in the world,” says Chapman.
“Why can’t a computer work with text?”
Born in Dallas in 1932, and adopted in San Antonio 28 days later, Taylor grew up around Texas, moving frequently as his father, a Methodist minister, went to lead new churches.
After a tour in the U.S. Navy, Taylor enrolled in The University of Texas at Austin in 1954 first as an undergraduate and then a graduate student. In addition to psychology, he studied math, philosophy, English and religion.
For his master’s thesis (he earned the degree in 1964), Taylor spent hours in the dark, soundproof laboratories in the basement of Mezes Hall studying how well people can identify the location of a sound when they can’t see its source. He learned the human mind can pinpoint a sound to within three degrees of its actual location.
In researching his thesis, he also learned how frustrating computers of that era could be.
At that time, computers were used for batch processing. Researchers would punch holes in stacks of cards and hand them to an operator, who would feed them into a computer and return a set of results a day or two later.
“Each card represented an instruction or a piece of data. There were hundreds for one program,” Taylor recalls. “If there was one error along the way, you had to start over, find the error, put the cards together, and go back and try again.”
The process wasn’t just frustrating. For Taylor, it was fundamentally wrong.
“I refused to use it and went back to my hand calculator and did things that way,” he says. “The whole process bothered me philosophically. Why can’t I work directly with the computer? Why can’t a computer work with text? Why did it have to be only a number cruncher?”
Forging a vision
Those types of questions drove Taylor as he left the Forty Acres, embarked on his career and joined thinkers from a host of other disciplines–math, linguistics, physics, electrical engineering–to forge the new world of computer science.
He joined NASA Headquarters at the beginning of the space race. There, he helped direct federal funds to Douglas Engelbart, who invented the mouse and amazed the world with a multi-city, online computing demonstration in 1968 that has since been dubbed “the mother of all demos.”
From NASA, Taylor moved to the Pentagon in the Department of Defense’s Advanced Research Projects Agency (ARPA), a civilian agency that supported speculative, long-term research. He was deputy director and then director of its computer research program. The program had been started by J.C.R. Licklider, a mentor who first took note of Taylor after reading his master’s thesis and later co-authored the influential white paper with him.
While at ARPA, Taylor had access to computers that were connected to universities that had ongoing projects funded by his office. He was frustrated that he had to literally move to a different chair to connect with different groups of researchers and that those researchers couldn’t use their computers to communicate with one another if they were in different cities.
When he thought of having one terminal that could connect all users through an interactive network, regardless of geography, the idea for ARPAnet was born. The resulting network was the precursor to the Internet, but it was not the Internet.
After leaving ARPA, Taylor spent 13 years building and managing the Computer Science Laboratory at Xerox PARC, where most of the tools for personal computing and the Internet–the personal computer, the laser printer, the Ethernet, the WYSIWYG (what you see is what you get) graphical user interface–were invented.
He later built a new research lab for Digital Equipment Corp. and oversaw other groundbreaking developments, including new workstations, the electronic book and an early search engine. Taylor retired in 1996.
In 1999, he was awarded the National Medal of Technology. The citation reads, “For visionary leadership in the development of modern computing technology, including initiating the ARPAnet project–forerunner of today’s Internet–and advancing ground-breaking achievements in the development of the personal computer and computer networks.”
In 2004 he and three colleagues were awarded the Draper Prize, the highest award of the National Academy of Engineering. Its citation reads: “For the vision, development, and conception of the principles for, and their effective integration in, the world’s first practical networked personal computers.”
Computer Sciences Professor J Strother Moore worked for Taylor at Xerox PARC and remembers it as a “dream job” in a freewheeling and creative environment.
“On the one hand I felt free to do what I thought needed to be done, and on the other, my work was only part of a grander vision and was supported by the work of many others,” Moore says.
“You have to understand how computers were perceived in the 1960s and before: giant, incredibly valuable machines reserved for important scientific tasks and operated by a near priesthood,” Moore says. “To even imagine that one of these machines would sit on your desk, much less your lap, and be used for nothing more serious than composing a letter or drawing a picture or writing a memo to the office was a giant leap. But that is the vision Taylor had in the ’60s and he assembled the people at PARC to make it happen.”
That a student of psychology had such a vision is remarkable but also quite natural, says Brad Love, an associate professor of psychology.
“There’s a trend throughout time that people interested in how the mind works always look at the most complex machine that’s out there,” he says, pointing to the clock and the telephone switchboard as tools once viewed as metaphors for the brain. “In the 1960s, the computer metaphor dominated and it’s still the most dominant.”
Love, who teaches a course on the psychology of how products are designed, says the discipline provides a strong scientific foundation and also nurtures a curiosity about how people act. That allows psychologists to talk the language of engineers and apply the language to everyday life.
“There’s a human in the loop in all these things and a psychologist could bring a lot to it,” he says.
Although his vision has become reality, Taylor remains surprised by how long it took.
“We began using it ourselves every day in our work in our lab beginning about 1974, 1975,” he says. “So why should it take until 1995 for the Internet to begin to become widespread?”
The answer, he says, is multifold: Xerox never saw the commercial opportunities in the technology he and his team developed. (The company famously gave away some of its ideas to Apple founder Steve Jobs.) He met resistance from a computer industry tied to its batch processing profit model. And no one was around in the early years to fill that gap. IBM and AT&T rejected invitations to join the ARPAnet experiment.
“The elements of today’s Internet, the products, had to be created by lots of companies. Those companies had not yet come into being,” he says, noting that some were created by his colleagues and protégés. “The large companies were not interested.”
Taylor uses the Internet today mostly to check email or run Google searches.
“I’m not an esoteric user,” he says.
Like many common users, he’s frustrated by spam emails, viruses and security breaches but knows his successors in Silicon Valley and elsewhere are working on those problems.
As he reflects on his 35-year career, he says he’s most proud that he was able to assemble very creative people who invented and then put together the different pieces of the Internet and personal computing universe. And he’s still pleased that he helped the world use the computer to help build communities and not just crunch numbers or switch devices on and off.
“The notion of a stored program working its way through a problem with your help,” he says, “is still very exciting.”