Combination of technologies led to the digital revolution
For more than a decade, beginning in the early 1970s, the development of networks and that of home computers proceeded separately from one another. They finally began coming together in the late 1980s with the advent of modems, online services, and the Web. Just as combining the steam engine with ingenious machinery drove the Industrial Revolution, the combination of the computer and distributed networks led to a digital revolution that allowed anyone to create, disseminate, and access any information anywhere.
Industrial revolution and assembly lines
The Industrial Revolution was based on two grand concepts that were profound in their simplicity. Innovators came up with ways to simplify endeavors by breaking them into easy, small tasks that could be accomplished on assembly lines.
Balance of people
Eckert and Mauchly served as counterbalances for each other, which made them typical of so many digital-age leadership duos. Eckert drove people with a passion for precision; Mauchly tended to calm them and make them feel loved.
Interplay of ideas and team
This latter approach tries to show that what may seem like creative leaps—the Eureka moment—are actually the result of an evolutionary process that occurs when ideas, concepts, technologies, and engineering methods ripen together. Neither way of looking at technological advancement is, on its own, completely satisfying. Most of the great innovations of the digital age sprang from an interplay of creative individuals (Mauchly, Turing, von Neumann, Aiken) with teams that knew how to implement their ideas.
Focus yet ability to change mind
The same traits that make them inventive, such as stubbornness and focus, can make them resistant to change when new ideas come along. Steve Jobs was famously stubborn and focused, yet he dazzled and baffled colleagues by suddenly changing his mind when he realized he needed to think different. Aiken lacked that agility.
Like Xerox PARC and other corporate research satellites that followed, Bell Labs showed how sustained innovation could occur when people with a variety of talents were brought together, preferably in close physical proximity where they could have frequent meetings and serendipitous encounters.
Interior design of innovative workplaces
The corridors were extremely long, more than the length of two football fields, and designed to promote random meetings among people with different talents and specialties, a strategy that Steve Jobs replicated in designing Apple’s new headquarters seventy years later. Anyone walking around Bell Labs might be bombarded with random ideas, soaking them up like a solar cell.
Price is a key for innovation
Transistors were being sold in 1954 to the military for about $16 apiece. But in order to break into the consumer market, Haggerty insisted that his engineers find a way to make them so that they could be sold for less than $3. They did. He also developed a Jobs-like knack, which would serve him then and in the future, for conjuring up devices that consumers did not yet know they needed but would soon find indispensable.
But Haggerty understood the importance of spawning new markets rather than merely chasing old ones. He convinced a small Indianapolis company that built TV antenna boosters to join forces on what would be called the Regency TR-1 radio. Haggerty made the deal in June 1954 and, typically, insisted that the device be on the market by that November. It was.
Innovation in other fields
Indeed, there was a symbiotic relationship between the advent of the transistor radio and the rise of rock and roll. Elvis Presley’s first commercial recording, “That’s All Right,” came out at the same time as the Regency radio. The rebellious new music made every kid want a radio.
Throughout the 1950s Terman, who went on to become Stanford’s provost, grew the industrial park by encouraging its occupants to have a symbiotic relationship with Stanford; employees and executives could study or teach part-time at the university, and its professors were given leeway to advise new businesses. Stanford’s office park would end up nurturing hundreds of companies, from Varian to Facebook.
When to push, when to heed
One useful leadership talent is knowing when to push ahead against doubters and when to heed them. Shockley had trouble striking this balance. One case arose when he devised a four-layer diode that he thought would be faster and more versatile than a three-layer transistor.
How to know if the idea is right?
Worse yet, Shockley’s infatuation with the four-layer diode turned out to be misplaced. Sometimes the difference between geniuses and jerks hinges on whether their ideas turn out to be right. If Shockley’s diode had proved practical, or if he had evolved it into an integrated circuit, he might have again been regarded as a visionary. But that didn’t happen.
It was hard to get money, especially from established corporations, to start a completely independent company. The idea of seed funding for startups was not yet well established; that important innovation would have to wait, as we shall see, until the next time Noyce and Moore leaped into a new venture.
Transistors for pocket radios
The traitorous eight who formed Fairchild Semiconductor, by contrast, turned out to be the right people at the right place at the right time. The demand for transistors was growing because of the pocket radios that Pat Haggerty had launched at Texas Instruments, and it was about to skyrocket even higher.
Rapid daily iteration, not a single breakthrough
“I don’t remember any time when a light bulb went off and the whole thing was there,” conceded Noyce. “It was more like, every day, you would say, ‘Well, if I could do this, then maybe I could do that, and that would let me do this,’ and eventually you had the concept.” After this flurry of activity he wrote an entry in his notebook, in January 1959: “It would be desirable to make multiple devices on a single piece of silicon.”
Inventing and finding use cases
One aspect of innovation is inventing new devices; another is inventing popular ways to use these devices. Haggerty and his company were good at both. Eleven years after he had created a huge market for inexpensive transistors by pushing pocket radios, he looked for a way to do the same for microchips. The idea he hit upon was pocket calculators.
Team of three - Noyce, Moore, Grove
In contrast to Noyce’s sweet gentility, Grove had a blunt, no-bullshit style. It was the same approach Steve Jobs would later use: brutal honesty, clear focus, and a demanding drive for excellence. “Andy was the guy who made sure the trains all ran on time,” recalled Ann Bowers. “He was a taskmaster. He had very strong views about what you should do and what you shouldn’t do and he was very direct about that.”
Authority and culture
In addition to being a recruiting tool, the culture at Atari was a natural outgrowth of Bushnell’s personality. But it was not simply self-indulgent. It was based on a philosophy that drew from the hippie movement and would help define Silicon Valley. At its core were certain principles: authority should be questioned, hierarchies should be circumvented, nonconformity should be admired, and creativity should be nurtured.
Innovation requires having at least three things: a great idea, the engineering talent to execute it, and the business savvy (plus deal-making moxie) to turn it into a successful product.
Making of research labs
A few corporate research centers, most notably Bell Labs, existed before the war. But after Bush’s clarion call produced government encouragement and contracts, hybrid research centers began to proliferate. Among the most notable were the RAND Corporation, originally formed to provide research and development (hence the name) to the Air Force; Stanford Research Institute and its offshoot, the Augmentation Research Center; and Xerox PARC. All would play a role in the development of the Internet.
They instinctively isolated and routed around any node that tried to claim more significance than the others. The Internet was born of an ethos of creative collaboration and distributed decision making, and its founders liked to protect that heritage. It became ingrained in their personalities—and in the DNA of the Internet itself.
Origin of packet switching
That led to an unstable hair-trigger situation; a nation was more likely to launch a preemptive strike if it feared that its communications and ability to respond would not survive an attack. “The origin of packet switching is very much Cold War,” he said.
The 1960s… music, space, computers
It was thus that in the second half of 1969—amid the static of Woodstock, Chappaquiddick, Vietnam War protests, Charles Manson, the Chicago Eight trial, and Altamont—the culmination was reached for three historic enterprises, each in the making for almost a decade. NASA was able to send a man to the moon. Engineers in Silicon Valley were able to devise a way to put a programmable computer on a chip called a microprocessor. And ARPA created a network that could connect distant computers.
The early Internet
It was still a gated community, open primarily to researchers at military and academic institutions. It wasn’t until the early 1980s that civilian counterparts to ARPANET were fully opened, and it would take yet another decade before most ordinary home users could get in.
The mother of all demos
They even were able to create hypertext links together. In short, Engelbart showed, back in 1968, nearly everything that a networked personal computer does today. The demo gods were with him, and to his amazement there were no glitches. The crowd gave him a standing ovation. Some even rushed up to the stage as if he were a rock star, which in some ways he was.
Homebrew club and first meeting
This first meeting of the Homebrew Computer Club was held on a rainy Wednesday, March 5, 1975, in Gordon French’s Menlo Park garage. It occurred just when the first truly personal home computer became available, not from Silicon Valley but from a sagebrush-strewn strip mall in a silicon desert.
Hardware as commodity and software as profits
What Gates and Allen set out to do on that December day in 1974 when they first saw the Popular Electronics cover was to create the software for personal computers. More than that, they wanted to shift the balance in the emerging industry so that the hardware would become an interchangeable commodity, while those who created the operating system and application software would capture most of the profits.
Programming language for a chip
That summer Gates and Allen became enchanted by Intel’s new 8008 microprocessor, a powerful upgrade of its 4004 “computer on a chip.” They were so excited by a story on it in Electronics Magazine that years later Gates would remember the page number it was on. If the chip really could act like a computer and be programmed, Allen asked Gates, why not write a programming language for it, specifically a version of BASIC?
For fun, he would study the manuals of the office computers made by Hewlett-Packard and DEC and then try to redesign them using fewer chips. “I have no idea why this became the pastime of my life,” he admitted. “I did it all alone in my room with my door shut. It was like a private hobby.” It was not an activity that made him the life of the party, so he became pretty much a loner, but that talent to save chips served him well when he decided to build a computer of his own.
Learning to do a smaller successful product first
The escapade ended after they got ripped off at gunpoint trying to sell one in a pizza parlor, but from the seeds of the adventure a company would be born. “If it hadn’t been for the Blue Boxes, there wouldn’t have been an Apple,” Jobs later reflected. “Woz and I learned how to work together.” Wozniak agreed: “It gave us a taste of what we could do with my engineering skills and his vision.”
Simplicity + focus
Devices should not need manuals. “That simplicity rubbed off on him and made him a very focused product person,” said Ron Wayne, who worked with Jobs at Atari. In addition, Bushnell was able to help mold Jobs into an entrepreneur.
Designing and selling
Jobs also started accompanying Wozniak to Homebrew meetings, carrying the television set and conducting the demonstrations, and he came up with a plan to sell circuit boards preprinted with Wozniak’s design. It was typical of their partnership. “Every time I’d design something great, Steve would find a way to make money for us,” said Wozniak. “It never crossed my mind to sell computers. It was Steve who said, ‘Let’s hold them in the air and sell a few.’ ”
UX principle - least surprise
“The goal was to give the user a conceptual model that was unsurprising,” Frankston explained. “It was called the principle of least surprise. We were illusionists synthesizing an experience.”
Execution > Prototype > Idea
But Jobs was the first to become obsessed with the idea of incorporating PARC’s interface ideas into a simple, inexpensive, personal computer. Once again, the greatest innovation would come not from the people who created the breakthroughs but from the people who applied them usefully.
The first GUI
What caught his attention was the graphical user interface featuring a desktop metaphor that was as intuitive and friendly as a neighborhood playground. It had cute icons for documents and folders and other things you might want, including a trash can, and a mouse-controlled cursor that made them easy to click. Not only did Jobs love it, but he could see ways to improve it, make it simpler and more elegant.
Instead of trying to market what he had produced, he decided simply to offer it publicly. He had recently gone with a friend to hear a lecture by Stallman, who had become an itinerant global preacher for the doctrine of free software. Torvalds didn’t actually get religion or embrace the dogma: “It probably didn’t make a huge impact on my life at that point. I was interested in the technology, not the politics — I had enough politics at home.” But he did see the practical advantages of the open approach.
Stallman and Linus
Torvalds admitted to “not exactly being a huge fan” of Stallman, explaining, “I don’t like single-issue people, nor do I think that people who turn the world into black and white are very nice or ultimately very useful. The fact is, there aren’t just two sides to any issue, there’s almost always a range of responses, and ‘it depends’ is almost always the right answer in any big question.”
Coexisting all approaches
Each model had its advantages, each had its incentives for creativity, and each had its prophets and disciples. But the approach that worked best was having all three models coexisting, along with various combinations of open and closed, bundled and unbundled, proprietary and free. Windows and Mac, UNIX and Linux, iOS and Android: a variety of approaches competed over the decades, spurring each other on—and providing a check against any one model becoming so dominant that it stifled innovation.
Computers vs network
As a kid growing up on the edge of London in the 1960s, Tim Berners-Lee came to a fundamental insight about computers: they were very good at crunching step by step through programs, but they were not very good at making random associations and clever links, the way that an imaginative human could.
Places and people for innovation
During his Oxford years, microprocessors became available. So, just as Wozniak and Jobs had done, he and his friends designed boards that they tried to sell. They were not as successful as the Steves, partly because, as Berners-Lee later said, “we didn’t have the same ripe community and cultural mix around us like there was at the Homebrew and in Silicon Valley.” Innovation emerges in places with the right primordial soup, which was true of the Bay Area but not of Oxfordshire in the 1970s.
Instead Berners-Lee insisted that the Web protocols should be made available freely, shared openly, and put forever in the public domain. After all, the whole point of the Web, and the essence of its design, was to promote sharing and collaboration.
Display and not editing
The emphasis on display rather than editing tools nudged the Web into becoming a publishing platform for people who had servers rather than a place for collaboration and shared creativity.
Sanger turned out to be wrong. The uncredentialed crowd did not run off the experts. Instead the crowd itself became the expert, and the experts became part of the crowd.
To tell the truth or not
Page told his advisor Winograd that, according to his rough estimate, his Web crawler would be able to accomplish the task in a few weeks. “Terry nodded knowingly, fully aware it would take much longer but wise enough to not tell me,” Page recalled. “The optimism of youth is often underrated!”
UX - no user blaming
“Why are you giving people garbage?” Page said. The answer he got was that the poor results were his fault, that he should refine his search query. “I had learned from my human-computer interaction course that blaming the user is not a good strategy, so I knew they fundamentally weren’t doing the right thing. That insight, the user is never wrong, led to this idea that we could produce a search engine that was better.”
Page and Brin proceeded to refine PageRank by adding more factors, such as the frequency, type size, and location of keywords on a Web page. Extra points were added if the keyword was in the URL or was capitalized or was in the title.
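The core idea behind PageRank, which those extra factors were layered on top of, can be sketched as a simple iterative computation. The following Python is a toy illustration only, not Google's implementation; the link graph, damping factor usage, and the keyword-boost weights are invented for the example.

```python
# Toy sketch of the PageRank idea: a page's rank is the share of rank
# flowing to it from the pages that link to it, iterated to convergence.
DAMPING = 0.85  # standard damping factor from the original PageRank paper

def pagerank(links, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # every page gets a small baseline, plus shares from its in-links
        new_rank = {p: (1 - DAMPING) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = DAMPING * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical version of the extra signals the passage mentions:
# boost a page's score if the keyword appears in its title or URL.
# The weights here are made up for illustration.
def score(page, rank, keyword, title, url):
    boost = 1.0
    if keyword.lower() in title.lower():
        boost += 0.5   # invented weight for a title match
    if keyword.lower() in url.lower():
        boost += 0.25  # invented weight for a URL match
    return rank[page] * boost

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
```

In this toy graph, page "c" ends up ranked highest because both "a" and "b" link to it; the keyword boosts then reorder results among pages with similar link-based rank.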
Most of the successful innovators and entrepreneurs in this book had one thing in common: they were product people. They cared about, and deeply understood, the engineering and design. They were not primarily marketers or salesmen or financial types; when such folks took over companies, it was often to the detriment of sustained innovation.