Collaboration across various industry sectors

The birth of the digital age drew on a research ecosystem that was nurtured by government spending and managed by a military-industrial-academic collaboration. Intersecting with that was a loose alliance of community organizers, communal-minded hippies, do-it-yourself hobbyists, and homebrew hackers, most of whom were suspicious of centralized authority.

Combination of technologies

For more than a decade, beginning in the early 1970s, the development of networks and that of home computers proceeded separately from one another. They finally began coming together in the late 1980s with the advent of modems, online services, and the Web. Just as combining the steam engine with ingenious machinery drove the Industrial Revolution, the combination of the computer and distributed networks led to a digital revolution that allowed anyone to create, disseminate, and access any information anywhere.

Connecting arts and sciences

Finally, I was struck by how the truest creativity of the digital age came from those who were able to connect the arts and sciences. They believed that beauty mattered.

How the innovations were made

Emblematic of the time, it provided a unified sense of the extraordinary endeavors of discovery that were under way. She proclaimed in her opening sentence, “The progress of modern science, especially within the last five years, has been remarkable for a tendency to simplify the laws of nature and to unite detached branches by general principles.”

What is imagination?

“What is imagination?” she asked in an 1841 essay. “It is the Combining faculty. It brings together things, facts, ideas, conceptions in new, original, endless, ever-varying combinations.”

Babbage’s general purpose computer

Babbage’s new idea, which he conceived in 1834, was a general-purpose computer that could carry out a variety of different operations based on programming instructions given to it. It could perform one task, then be made to switch and perform another. It could even tell itself to switch tasks—or alter its “pattern of action,” as Babbage explained—based on its own interim calculations. Babbage named this proposed machine the Analytical Engine. He was one hundred years ahead of his time.

Industrial revolution and assembly lines

The Industrial Revolution was based on two grand concepts that were profound in their simplicity. Innovators came up with ways to simplify endeavors by breaking them into easy, small tasks that could be accomplished on assembly lines.

Timing is right

The idea of sending a man to the moon was proposed right when the progress of microchips made it possible to put computer guidance systems into the nose cone of a rocket.

Another synergy - digital and logic

A fundamental trait of the computer revolution was that it was based on digital, not analog, computers. This occurred for many reasons, as we shall soon see, including simultaneous advances in logic theory, circuits, and electronic on-off switches that made a digital rather than an analog approach more fruitful.

Combination

Innovation occurs when ripe seeds fall on fertile ground. Instead of having a single cause, the great advances of 1937 came from a combination of capabilities, ideas, and needs that coincided in multiple places.

Quantum and humans

He believed (at least while he was young) that this uncertainty and indeterminacy at the subatomic level permitted humans to exercise free will—a trait that, if true, would seem to distinguish them from machines. In other words, because events at the subatomic level are not predetermined, that opens the way for our thoughts and actions not to be predetermined.

Calculators from mechanical to relays

The Complex Number Calculator, as it was called, was completed in 1939. It had more than four hundred relays, each of which could open and shut twenty times per second. That made it both blindingly fast compared to mechanical calculators and painfully clunky compared to the all-electronic vacuum-tube circuits just being invented. Stibitz’s computer was not programmable, but it showed the potential of a circuit of relays to do binary math, process information, and handle logical procedures.
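
To make the binary-math point concrete, here is a toy sketch, not Stibitz's actual relay circuit, showing how simple on/off elements composed into logic gates can add binary numbers:

```python
# A toy illustration (not Stibitz's circuit) of how on/off switches, composed
# into AND/OR/XOR gates, can do binary arithmetic: a ripple-carry adder.

def full_adder(a, b, carry_in):
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

def add_binary(x_bits, y_bits):
    """Add two equal-length bit lists, least significant bit first."""
    carry, out = 0, []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out + [carry]

# 6 (011, LSB-first) + 3 (110, LSB-first) = 9 (1001, LSB-first)
print(add_binary([0, 1, 1], [1, 1, 0]))  # [1, 0, 0, 1]
```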

Expense and innovation

A college friend who was helping him, Helmut Schreyer, urged that they make a version using electronic vacuum tubes rather than mechanical switches. Had they done so right away, they would have gone down in history as the first inventors of a working modern computer: binary, electronic, and programmable. But Zuse, as well as the experts he consulted at the technical school, balked at the expense of building a device with close to two thousand vacuum tubes.

Exhaustive research on the current field

When he decided to build a vacuum-tube computer of his own, Mauchly did what good innovators properly do: he drew upon all of the information he had picked up from his travels.

All ideas are built upon each other

When people take insights from multiple sources and put them together, it’s natural for them to think that the resulting ideas are their own—as in truth they are. All ideas are born that way.

Balance of people

Eckert and Mauchly served as counterbalances for each other, which made them typical of so many digital-age leadership duos. Eckert drove people with a passion for precision; Mauchly tended to calm them and make them feel loved.

Innovation in a group

How you rank the historic contributions of the others depends partly on the criteria you value. If you are enticed by the romance of lone inventors and care less about who most influenced the progress of the field, you might put Atanasoff and Zuse high. But the main lesson to draw from the birth of computers is that innovation is usually a group effort, involving collaboration between visionaries and engineers, and that creativity comes from drawing on many sources.

Interplay of ideas and team

This latter approach tries to show that what may seem like creative leaps—the Eureka moment—are actually the result of an evolutionary process that occurs when ideas, concepts, technologies, and engineering methods ripen together. Neither way of looking at technological advancement is, on its own, completely satisfying. Most of the great innovations of the digital age sprang from an interplay of creative individuals (Mauchly, Turing, von Neumann, Aiken) with teams that knew how to implement their ideas.

Focus yet ability to change mind

The same traits that make them inventive, such as stubbornness and focus, can make them resistant to change when new ideas come along. Steve Jobs was famously stubborn and focused, yet he dazzled and baffled colleagues by suddenly changing his mind when he realized he needed to think different. Aiken lacked that agility.

Store vs load computer programs

At the beginning of 1944, Mauchly and Eckert realized that there was a good way to make computers easily reprogrammable: store the programs inside the computer’s memory rather than load them in every time. That, they sensed, would be the next great advance in computer development. This “stored-program” architecture would mean that a computer’s tasks could be changed almost instantly, without manually reconfiguring cables and switches.
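
As an illustration of why a stored-program design makes retasking nearly instant, here is a minimal sketch of a toy machine (not any historical architecture) whose instructions are just data held in memory, so switching tasks means loading a different program rather than rewiring anything:

```python
# A minimal sketch of the stored-program idea: instructions live in ordinary
# memory, so "rewiring" the machine for a new task is just loading a
# different list of instructions.

def run(program, x):
    """Interpret a tiny instruction set: each step is (opcode, operand)."""
    acc = x
    for op, arg in program:          # the program is data held in memory
        if op == "ADD":
            acc += arg
        elif op == "MUL":
            acc *= arg
        elif op == "NEG":
            acc = -acc
    return acc

double_and_add = [("MUL", 2), ("ADD", 3)]   # one task...
negate = [("NEG", None)]                    # ...swap in another instantly

print(run(double_and_add, 10))  # 23
print(run(negate, 10))          # -10
```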

Proprietary - spent more money on lawsuits

In the seventy years since von Neumann effectively placed his “Draft Report” on the EDVAC into the public domain, the trend for computers has been, with a few notable exceptions, toward a more proprietary approach. In 2011 a milestone was reached: Apple and Google spent more on lawsuits and payments involving patents than they did on research and development of new products.

Open source == faster development?

Instead, beginning in the 1950s, innovation in computing shifted to the corporate realm, led by companies such as Ferranti, IBM, Remington Rand, and Honeywell. That shift takes us back to the issue of patent protections. If von Neumann and his team had continued to pioneer innovations and put them in the public domain, would such an open-source model of development have led to faster improvements in computers?

Software - open source? Hardware - proprietary?

Or did marketplace competition and the financial rewards for creating intellectual property do more to spur innovation? In the cases of the Internet, the Web, and some forms of software, the open model would turn out to work better. But when it came to hardware, such as computers and microchips, a proprietary system provided incentives for a spurt of innovation in the 1950s.

Birth of artificial intelligence

Ada Lovelace had argued, in her final “Note” on Babbage’s Analytical Engine, that machines could not really think. If a machine could modify its own program based on the information it processed, Turing asked, wouldn’t that be a form of learning? Might that lead to artificial intelligence?

Computers: brute force vs learning by practice

Turing approached the problem not by thinking of ways to use brute processing power to calculate every possible move; instead he focused on the possibility that a machine might learn how to play chess by repeated practice. In other words, it might be able to try new gambits and refine its strategy with every new win or loss.
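
A minimal sketch of that learning-by-practice idea (not Turing's actual proposal) might keep a running score for each opening, nudging it up after a win and down after a loss:

```python
# Toy illustration: refine a strategy from wins and losses by keeping a score
# per opening and preferring the best-scoring one, with occasional exploration.
import random

scores = {"gambit_a": 0.0, "gambit_b": 0.0, "gambit_c": 0.0}
LEARNING_RATE = 0.1

def choose(explore=0.2):
    if random.random() < explore:          # occasionally try something new
        return random.choice(list(scores))
    return max(scores, key=scores.get)     # otherwise play the best so far

def update(opening, won):
    outcome = 1.0 if won else -1.0
    scores[opening] += LEARNING_RATE * (outcome - scores[opening])

# Simulated practice: suppose gambit_b actually wins 70% of the time.
win_rate = {"gambit_a": 0.4, "gambit_b": 0.7, "gambit_c": 0.5}
for _ in range(500):
    g = choose()
    update(g, random.random() < win_rate[g])

print(max(scores, key=scores.get))  # usually "gambit_b"
```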

Workplace innovation

Like Xerox PARC and other corporate research satellites that followed, Bell Labs showed how sustained innovation could occur when people with a variety of talents were brought together, preferably in close physical proximity where they could have frequent meetings and serendipitous encounters.

3 stages from labs to consumers

Indeed, throughout the digital age, the two approaches went together. Creative geniuses (John Mauchly, William Shockley, Steve Jobs) generated innovative ideas. Practical engineers (Presper Eckert, Walter Brattain, Steve Wozniak) partnered closely with them to turn concepts into contraptions. And collaborative teams of technicians and entrepreneurs worked to turn the invention into a practical product.

Interior design of innovative workplaces

The corridors were extremely long, more than the length of two football fields, and designed to promote random meetings among people with different talents and specialties, a strategy that Steve Jobs replicated in designing Apple’s new headquarters seventy years later. Anyone walking around Bell Labs might be bombarded with random ideas, soaking them up like a solar cell.

Sitting together

There was initially no separate office for Bardeen, so he ensconced himself in Brattain’s lab space. It was a smart move that showed, once again, the creative energy generated by physical proximity. By sitting together, the theorist and the experimentalist could brainstorm ideas face-to-face, hour after hour.

Birth of transistor

Indeed, the transistor was one of the most important discoveries of the twentieth century. It came from the partnership of a theorist and an experimentalist working side by side, in a symbiotic relationship, bouncing theories and results back and forth in real time.

Price is key to innovation

Transistors were being sold in 1954 to the military for about $16 apiece. But in order to break into the consumer market, Haggerty insisted that his engineers find a way to make them so that they could be sold for less than $3. They did. He also developed a Jobs-like knack, which would serve him then and in the future, for conjuring up devices that consumers did not yet know they needed but would soon find indispensable.

New markets

But Haggerty understood the importance of spawning new markets rather than merely chasing old ones. He convinced a small Indianapolis company that built TV antenna boosters to join forces on what would be called the Regency TR-1 radio. Haggerty made the deal in June 1954 and, typically, insisted that the device be on the market by that November. It was.

Innovation in other fields

Indeed, there was a symbiotic relationship between the advent of the transistor radio and the rise of rock and roll. Elvis Presley’s first commercial recording, “That’s All Right,” came out at the same time as the Regency radio. The rebellious new music made every kid want a radio.

Leadership

One problem with successful teams, particularly intense ones, is that sometimes they break up. It takes a special type of leader—inspiring yet also nurturing, competitive yet collaborative—to hold such teams together.

Leadership of letting go

Another skill of great team leaders is the ability to instill a nonhierarchical esprit de corps. Shockley was bad at that as well. He was autocratic, often snuffing out spirit by quashing initiative. The great triumph of Brattain and Bardeen had come when Shockley was offering up a few suggestions but not micromanaging or bossing them.

How Silicon Valley got started

Beckman wanted it located in the Los Angeles area, where most of his other divisions were. But Shockley insisted that it be located in Palo Alto, where he had been raised, so that he could be near his aging mother. They doted on each other intensely, which some found weird but which had the historic significance of helping to create Silicon Valley. Palo Alto was still, as it had been in Shockley’s childhood, a small college town surrounded by orchards.

Flexible university

Throughout the 1950s Terman, who went on to become Stanford’s provost, grew the industrial park by encouraging its occupants to have a symbiotic relationship with Stanford; employees and executives could study or teach part-time at the university, and its professors were given leeway to advise new businesses. Stanford’s office park would end up nurturing hundreds of companies, from Varian to Facebook.

Shockley recruits Robert Noyce

Shockley tried to recruit some of the researchers he had worked with at Bell Labs, but they knew him too well. So he set about compiling a list of the best semiconductor engineers in the country and calling them cold. The most notable of them all, destined to be a momentous choice, was Robert Noyce, a charismatic Iowa golden boy with a doctorate from MIT, who was at the time a twenty-eight-year-old research manager at Philco in Philadelphia.

Innovation - when to push, when to heed

One useful leadership talent is knowing when to push ahead against doubters and when to heed them. Shockley had trouble striking this balance. One case arose when he devised a four-layer diode that he thought would be faster and more versatile than a three-layer transistor.

How to know if the idea is right?

Worse yet, Shockley’s infatuation with the four-layer diode turned out to be misplaced. Sometimes the difference between geniuses and jerks hinges on whether their ideas turn out to be right. If Shockley’s diode had proved practical, or if he had evolved it into an integrated circuit, he might again have been regarded as a visionary. But that didn’t happen.

Starting companies

“You’re better off to go out and start your own company and fail than it is to stick at one company for thirty years. But that wasn’t true in the 1950s. It must’ve been scary as hell.”

Another innovation - venture capital

It was hard to get money, especially from established corporations, to start a completely independent company. The idea of seed funding for startups was not yet well established; that important innovation would have to wait, as we shall see, until the next time Noyce and Moore leaped into a new venture.

Transistors for pocket radios

The traitorous eight who formed Fairchild Semiconductor, by contrast, turned out to be the right people at the right place at the right time. The demand for transistors was growing because of the pocket radios that Pat Haggerty had launched at Texas Instruments, and it was about to skyrocket even higher.

Space and smaller computers - when demand matches innovation

It also helped assure that the development of these two technologies became linked. Because computers had to be made small enough to fit into a rocket’s nose cone, it was imperative to find ways to cram hundreds and then thousands of transistors into tiny devices.

Birth of microchip and manufacturing

It was, instead, part of a recipe for an innovation. The need to solve this growing problem coincided with hundreds of small advances in ways to manufacture semiconductors. This combination produced an invention that occurred independently in two different places, Texas Instruments and Fairchild Semiconductor. The result was an integrated circuit, also known as a microchip.

Birth of microchip

It was not the most elegant device. In the models that Kilby built that fall of 1958, there were a lot of tiny gold wires connecting some of the components within the chip. It looked like expensive cobwebs sticking out of a silicon twig. Not only was it ugly; it was also impractical. There would be no way to manufacture it in large quantities. Nevertheless, it was the first microchip.

Rapid daily iteration

“I don’t remember any time when a light bulb went off and the whole thing was there,” conceded Noyce. “It was more like, every day, you would say, ‘Well, if I could do this, then maybe I could do that, and that would let me do this,’ and eventually you had the concept.” After this flurry of activity he wrote an entry in his notebook, in January 1959: “It would be desirable to make multiple devices on a single piece of silicon.”

Giving credit where due

There is an inspiring lesson in how Kilby and Noyce personally handled the question of who invented the microchip. They were both decent people; they came from tight-knit small communities in the Midwest and were well grounded. Unlike Shockley, they did not suffer from a toxic mix of ego and insecurity. Whenever the topic of credit for the invention came up, each was generous in praising the contributions of the other. It soon became accepted to give them joint credit and refer to them as coinventors.

Apollo microchip

The Apollo program, as it became known, needed a guidance computer that could fit into a nose cone. So it was designed from scratch to use the most powerful microchips that could be made. The seventy-five Apollo Guidance Computers that were built ended up containing five thousand microchips apiece, all identical, and Fairchild landed the contract to supply them.

Inventing and finding use cases

One aspect of innovation is inventing new devices; another is inventing popular ways to use these devices. Haggerty and his company were good at both. Eleven years after he had created a huge market for inexpensive transistors by pushing pocket radios, he looked for a way to do the same for microchips. The idea he hit upon was pocket calculators.

Build the same thing smaller, cheaper and more efficient

Build a handheld calculator that can do the same tasks as the thousand-dollar clunkers that sit on office desks. Make it efficient enough to run on batteries, small enough to put into a shirt pocket, and cheap enough to buy on impulse.

Innovation and industry

That became the pattern for electronic devices. Every year things got smaller, cheaper, faster, more powerful. This was especially true—and important—because two industries were growing up simultaneously, and they were intertwined: the computer and the microchip. “The synergy between a new component and a new application generated an explosive growth for both,” Noyce later wrote.

Symbiotic industries

There was a key lesson for innovation: Understand which industries are symbiotic so that you can capitalize on how they will spur each other on.

New services are also innovations

Innovations come in a variety of guises. Most of those featured in this book are physical devices, such as the computer and the transistor, and related processes, such as programming, software, and networking. Also important are the innovations that produce new services, such as venture capital, and those that create organizational structures for research and development, such as Bell Labs.

Vision vs details

Gordon Moore was similarly unpretentious, nonauthoritarian, averse to confrontation, and uninterested in the trappings of power. They complemented each other well. Noyce was Mr. Outside; he could dazzle a client with the halo effect that had followed him since childhood. Moore, always temperate and thoughtful, liked being in the lab, and he knew how to lead engineers with subtle questions or (the sharpest arrow in his quiver) a studied silence. Noyce was great at strategic vision and seeing the big picture; Moore understood the details, particularly of the technology and engineering.

Learning the art of management

Years later, after Grove had learned to appreciate this, he read Peter Drucker’s The Practice of Management, which described the ideal chief executive as an outside person, an inside person, and a person of action. Grove realized that instead of being embodied in one person, such traits could exist in a leadership team. That was the case at Intel, Grove said, and he made copies of the chapter for Noyce and Moore. Noyce was the outside guy, Moore the inside, and Grove was the man of action.

Workplace dynamics

Noyce had a theory that he developed after bridling under the rigid hierarchy at Philco. The more open and unstructured a workplace, he believed, the faster new ideas would be sparked, disseminated, refined, and applied. “The idea is people should not have to go up through a chain of command,” said one of Intel’s engineers, Ted Hoff. “If you need to talk to a particular manager you go talk to him.”

Team of three - Noyce, Moore, Grove

In contrast to Noyce’s sweet gentility, Grove had a blunt, no-bullshit style. It was the same approach Steve Jobs would later use: brutal honesty, clear focus, and a demanding drive for excellence. “Andy was the guy who made sure the trains all ran on time,” recalled Ann Bowers. “He was a taskmaster. He had very strong views about what you should do and what you shouldn’t do and he was very direct about that.”

Building a general-purpose computer on a chip

He realized that it was wasteful and inelegant to design many types of microchips that each had a different function, which Intel was doing. A company would come in and ask it to build a microchip designed to do a specific task. Hoff envisioned, as did Noyce and others, an alternative approach: creating a general-purpose chip that could be instructed, or programmed, to do a variety of different applications as desired. In other words, a general-purpose computer on a chip.

Innovation that makes technology accessible to ordinary people

Microprocessors began showing up in smart traffic lights and car brakes, coffeemakers and refrigerators, elevators and medical devices, and thousands of other gizmos. But the foremost success of the microprocessor was making possible smaller computers, most notably personal computers that you could have on your desk and in your home.

Hacker

Members of the Signals and Power Subcommittee embraced the term hacker with pride. It connoted both technical virtuosity and playfulness, not (as in more recent usage) lawless intrusions into a network.

What is the hacker culture?

Spacewar highlighted three aspects of the hacker culture that became themes of the digital age. First, it was created collaboratively. “We were able to build it together, working as a team, which is how we liked to do things,” Russell said. Second, it was free and open-source software. “People asked for copies of the source code, and of course we gave them out.” Of course—that was in a time and place when software yearned to be free. Third, it was based on the belief that computers should be personal and interactive.

Engineering + business

Innovation can be sparked by engineering talent, but it must be combined with business skills to set the world afire.

Key partnerships

Many of the key partnerships in the digital age paired people with different skills and personalities, such as John Mauchly and Presper Eckert, John Bardeen and Walter Brattain, Steve Jobs and Steve Wozniak. But occasionally the partnerships worked because the personalities and enthusiasms were similar, as was the case with Bushnell and Alcorn.

Authority and culture

In addition to being a recruiting tool, the culture at Atari was a natural outgrowth of Bushnell’s personality. But it was not simply self-indulgent. It was based on a philosophy that drew from the hippie movement and would help define Silicon Valley. At its core were certain principles: authority should be questioned, hierarchies should be circumvented, nonconformity should be admired, and creativity should be nurtured.

Innovation ingredients

Innovation requires having at least three things: a great idea, the engineering talent to execute it, and the business savvy (plus deal-making moxie) to turn it into a successful product.

Collaboration with government, laboratories and universities

He added that his “most significant innovation was the plan by which, instead of building large government laboratories, contracts were made with universities and industrial laboratories.”

Science and innovation

Although subsequent waves of science historians sought to debunk the linear model for ignoring the complex interplay between theoretical research and practical applications, it had a popular appeal as well as an underlying truth. The war, Bush wrote, had made it “clear beyond all doubt” that basic science—discovering the fundamentals of nuclear physics, lasers, computer science, radar—“is absolutely essential to national security.”

Triangular relationship

The creation of a triangular relationship among government, industry, and academia was, in its own way, one of the significant innovations that helped produce the technological revolution of the late twentieth century.

Making of research labs

A few corporate research centers, most notably Bell Labs, existed before the war. But after Bush’s clarion call produced government encouragement and contracts, hybrid research centers began to proliferate. Among the most notable were the RAND Corporation, originally formed to provide research and development (hence the name) to the Air Force; Stanford Research Institute and its offshoot, the Augmentation Research Center; and Xerox PARC. All would play a role in the development of the Internet.

Computers - Augment humans and not replace them

Unlike some of his MIT colleagues, Wiener believed that the most promising path for computer science was to devise machines that would work well with human minds rather than try to replace them. “Many people suppose that computing machines are replacements for intelligence and have cut down the need for original thought,” Wiener wrote. “This is not the case.” The more powerful the computer, the greater the premium that will be placed on connecting it with imaginative, creative, high-level human thinking.

Time sharing and computing

“The invention of interactive computing through time-sharing was even more important than the invention of computing itself,” according to Bob Taylor. “Batch processing was like exchanging letters with someone, while interactive computing was like talking to them.”

Humans and computers

The sensible goal was to create an environment in which humans and machines “cooperate in making decisions.” In other words, they would augment each other. “Men will set the goals, formulate the hypotheses, determine the criteria, and perform the evaluations. Computing machines will do the routinizable work that must be done to prepare the way for insights and decisions in technical and scientific thinking.”

Meetings and ideas

Ideas are often sparked by the exchanges at meetings, and one popped up at the end of the Michigan session that helped to defuse opposition to the network.

Distributed

Instead, control should be completely distributed. In other words, each and every node should have equal power to switch and route the flow of data. This would become the defining trait of the Internet, the ingrained attribute that would allow it to empower individuals and make it resistant to centralized control.
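
As a toy illustration of that principle (simplified; real routers build their tables with protocols such as OSPF or BGP), each node below keeps its own map of the network and decides the next hop locally, with no central switch in charge:

```python
# Fully distributed routing sketch: every node computes its own next hop and
# packets are forwarded hop by hop, so no single node is in control.
from collections import deque

links = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"], "D": ["B", "C"]}

def next_hop(node, dest):
    """Each node finds its own next hop by breadth-first search of its map."""
    queue, parent = deque([node]), {node: None}
    while queue:
        cur = queue.popleft()
        for nb in links[cur]:
            if nb not in parent:
                parent[nb] = cur
                queue.append(nb)
    step = dest
    while parent[step] != node:      # walk back toward the source
        step = parent[step]
    return step

def send(src, dest):
    path, cur = [src], src
    while cur != dest:
        cur = next_hop(cur, dest)    # every hop decides locally
        path.append(cur)
    return path

print(send("A", "D"))  # e.g. ['A', 'B', 'D']
```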

Innovator’s dilemma

When it was over, the AT&T executives asked Baran, “Now do you see why packet switching wouldn’t work?” To their great disappointment, Baran simply replied, “No.” Once again, AT&T was stymied by the innovator’s dilemma.

The work of 1000 people

Paul Baran, who did deserve to be known as the father of packet switching, came forward to say that “the Internet is really the work of a thousand people,” and he pointedly declared that most people involved did not assert claims of credit. “It’s just this one little case that seems to be an aberration,” he added, referring disparagingly to Kleinrock.

Creative collaboration

They instinctively isolated and routed around any node that tried to claim more significance than the others. The Internet was born of an ethos of creative collaboration and distributed decision making, and its founders liked to protect that heritage. It became ingrained in their personalities—and in the DNA of the Internet itself.

Origin of packet switching

That led to an unstable hair-trigger situation; a nation was more likely to launch a preemptive strike if it feared that its communications and ability to respond would not survive an attack. “The origin of packet switching is very much Cold War,” he said.

The birth of RFCs

“I hit upon this silly little idea of calling every one of them a ‘Request for Comments’—no matter whether it really was a request.” It was the perfect phrase to encourage Internet-era collaboration—friendly, not bossy, inclusive, and collegial. “It probably helped that in those days we avoided patents and other restrictions; without any financial incentive to control the protocols, it was much easier to reach agreement.”

The 1960s… music, space, computers

It was thus that in the second half of 1969—amid the static of Woodstock, Chappaquiddick, Vietnam War protests, Charles Manson, the Chicago Eight trial, and Altamont—the culmination was reached for three historic enterprises, each in the making for almost a decade. NASA was able to send a man to the moon. Engineers in Silicon Valley were able to devise a way to put a programmable computer on a chip called a microprocessor. And ARPA created a network that could connect distant computers.

Collaboration of Kahn and Vint Cerf

They began by organizing a meeting at Stanford in June 1973 to gather ideas. As a result of this collaborative approach, Cerf later said, the solution “turned out to be the open protocol that everybody had a finger in at one time or another.” But most of the work was done as a duet by Kahn and Cerf, who holed up for intense sessions at Rickeys Hyatt House in Palo Alto or at a hotel next to Dulles Airport. “Vint liked to get up and draw these spider drawings,” Kahn recalled.

IP and TCP

The result was an Internet Protocol (IP) that specified how to put the packet’s destination in its header and helped determine how it would travel through networks to get there. Layered above it was a higher-level Transmission Control Protocol (TCP) that instructed how to put the packets back together in the right order, checked to see if any of them was missing, and requested retransmission of any information that had been lost. These became known as TCP/IP.
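
A greatly simplified toy model of that division of labor, not the real protocol stack, might look like this: an "IP layer" that just moves addressed packets unreliably, and a "TCP layer" that numbers them, reassembles them in order, and retransmits whatever went missing:

```python
# Toy model of the IP/TCP split described above (illustrative only).
import random

def ip_send(packets):
    """Deliver packets unreliably: drop some, shuffle the rest."""
    delivered = [p for p in packets if random.random() > 0.2]  # ~20% loss
    random.shuffle(delivered)
    return delivered

def tcp_transfer(message, chunk=4):
    segments = {i: message[i:i + chunk] for i in range(0, len(message), chunk)}
    received = {}
    while len(received) < len(segments):
        missing = [seq for seq in segments if seq not in received]
        packets = [{"dst": "10.0.0.2", "seq": seq, "data": segments[seq]}
                   for seq in missing]              # retransmit only what's lost
        for pkt in ip_send(packets):
            received[pkt["seq"]] = pkt["data"]
    return "".join(received[seq] for seq in sorted(received))  # reassemble in order

print(tcp_transfer("packets can arrive out of order or not at all"))
```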

People interactions based on common interests

“Life will be happier for the on-line individual because the people with whom one interacts most strongly will be selected more by commonality of interests and goals than by accidents of proximity,” they wrote in a visionary 1968 paper titled “The Computer as a Communication Device.”

The early Internet

It was still a gated community, open primarily to researchers at military and academic institutions. It wasn’t until the early 1980s that civilian counterparts to ARPANET were fully opened, and it would take yet another decade before most ordinary home users could get in.

Innovations that propelled personal computers

The personal computer was made possible by a number of technological advances, most notably the microprocessor, a circuit etched on a tiny chip that integrated all of the functions of a computer’s central processing unit.

Conjunction

Yes, the Trips Festival’s conjunction of drugs, rock, and technology — acid and a.c. outlets!—was jarring. But it turned out to be, significantly, a quintessential display of the fusion that shaped the personal computer era: technology, counterculture, entrepreneurship, gadgets, music, art, and engineering. From Stewart Brand to Steve Jobs, those ingredients fashioned a wave of Bay Area innovators who were comfortable at the interface of Silicon Valley and Haight-Ashbury.

Picture of the whole earth

He resolved to convince NASA to take such a picture. So, with the offbeat wisdom that comes from acid, he decided to produce hundreds of buttons so that people in the pre-Twitter age could spread the word. “Why haven’t we seen a photograph of the whole Earth yet?”

Man and machine combination

Instead he argued that the intuitive talents of the human mind should be combined with the processing abilities of machines to produce “an integrated domain where hunches, cut-and-try, intangibles, and the human ‘feel for a situation’ usefully co-exist with powerful concepts, streamlined terminology and notation, sophisticated methods, and high-powered electronic aids.”

The mother of all demos

They even were able to create hypertext links together. In short, Engelbart showed, back in 1968, nearly everything that a networked personal computer does today. The demo gods were with him, and to his amazement there were no glitches. The crowd gave him a standing ovation. Some even rushed up to the stage as if he were a rock star, which in some ways he was.

What Engelbart did

He had already seen and embraced Engelbart’s ideas, but the drama of the demonstration struck him like a clarion call. “To me he was Moses opening the Red Sea,” Kay said. “He showed us a promised land that needed to be found, and the seas and rivers we needed to cross to get there.”

Merging humanities and engineering

In his doctoral thesis he described some of its traits, most notably that it should be simple (“It must be learnable in private”) and friendly (“Kindness should be an integral part”). He was designing a computer as if he were a humanist as well as an engineer. He drew inspiration from an Italian printer in the early sixteenth century named Aldus Manutius, who realized that personal books would need to fit into saddlebags and thus produced ones of the size now common.

Sharpening ideas

Taylor had another leadership skill that he had refined at his meetings with ARPA researchers and graduate students: he was able to provoke “creative abrasion,” in which a team of people can question each other, even try to eviscerate each other’s ideas, but then are expected to articulate the other side of the dispute. Taylor did that at what he called “Dealer” meetings (evoking people trying to beat the dealer at blackjack), in which one person had to present an idea while others engaged in constructive (usually) criticism. Taylor was not a technology wizard himself, but he knew how to get a group of them to sharpen their sabers in friendly duels.

Computers that are friendly and simple

By keeping children (of all ages) in mind, Kay and his colleagues advanced Engelbart’s concepts by showing that they could be implemented in a manner that was simple, friendly, and intuitive to use.

Mature companies and innovation

“That was my revelation that Xerox would never get the personal computer.” Instead, more entrepreneurial and nimble innovators would be the first to foray into the personal computer market. Some would eventually license or steal ideas from Xerox PARC. But at first the earliest personal computers were homebrewed concoctions that only a hobbyist could love.

Homebrew club and first meeting

This first meeting of the Homebrew Computer Club was held on a rainy Wednesday, March 5, 1975, in Gordon French’s Menlo Park garage. It occurred just when the first truly personal home computer became available, not from Silicon Valley but from a sagebrush-strewn strip mall in a silicon desert.

Advent of personal computing

“The era of the computer in every home—a favorite topic among science-fiction writers—has arrived!” the lede of the Popular Electronics story exclaimed. For the first time, a workable and affordable computer was being marketed to the general public. “To my mind,” Bill Gates would later declare, “the Altair is the first thing that deserves to be called a personal computer.”

Hacker ethic vs commercial ethic

By the time his query was published, Gates had been thrown into a more fundamental dispute with the Homebrew Computer Club. It became archetypal of the clash between the commercial ethic that believed in keeping information proprietary, represented by Gates, and the hacker ethic of sharing information freely, represented by the Homebrew crowd.

Hardware as commodity and software as profits

What Gates and Allen set out to do on that December day in 1974 when they first saw the Popular Electronics cover was to create the software for personal computers. More than that, they wanted to shift the balance in the emerging industry so that the hardware would become an interchangeable commodity, while those who created the operating system and application software would capture most of the profits.

Fuzzy vs precise

The computer terminal became to him what a toy compass had been to the young Einstein: a mesmerizing object that animated his deepest and most passionate curiosities. In struggling to explain what he loved about the computer, Gates later said it was the simple beauty of its logical rigor, something that he had cultivated in his own thinking. “When you use a computer, you can’t make fuzzy statements. You make only precise statements.”

Operating system

Gates and Allen came to appreciate the importance of the computer’s operating system, which was akin to its nervous system. As Allen explained, “It does the logistical work that allows the central processing unit to compute: shifting from program to program; allocating storage to files; moving data to and from modems and disk drives and printers.”

Programming language for a chip

That summer Gates and Allen became enchanted by Intel’s new 8008 microprocessor, a powerful upgrade of its 4004 “computer on a chip.” They were so excited by a story on it in Electronics Magazine that years later Gates would remember the page number it was on. If the chip really could act like a computer and be programmed, Allen asked Gates, why not write a programming language for it, specifically a version of BASIC?

Waiting

Gates dismissed the 8008 as not being up to such a task. “It would be dog-slow and pathetic,” he replied. “And BASIC by itself would take up almost all the memory. There’s just not enough horsepower.” Allen realized that Gates was right, and they agreed to wait until, in accordance with Moore’s Law, a microprocessor twice as powerful came out in a year or two.

Squeezing into 3.2K

By late February 1975, after eight weeks of intense coding, they got it down, brilliantly, into 3.2K. “It wasn’t a question of whether I could write the program, but rather a question of whether I could squeeze it into under 4k and make it super fast,” said Gates. “It was the coolest program I ever wrote.” Gates checked it for errors one last time, then commanded the Aiken Lab’s PDP-10 to spew out a punch tape of it so Allen could take it to Albuquerque.

Fanaticism

Of them all, Gates was the prime example of the innovator’s personality. “An innovator is probably a fanatic, somebody who loves what they do, works day and night, may ignore normal things to some degree and therefore be viewed as a bit imbalanced,” he said. “Certainly in my teens and 20s, I fit that…

Bill Gates

In fairness to Gates, he was the person who, by then, was actually running the fledgling company. Not only was he writing much of the code, but he also was in charge of sales, making most of the calls himself. He would kick around ideas about product strategy with Allen for hours, but he was the one who made the final decisions on which versions of Fortran or BASIC or COBOL would be built.

Steve Woz

For fun, he would study the manuals of the office computers made by Hewlett-Packard and DEC and then try to redesign them using fewer chips. “I have no idea why this became the pastime of my life,” he admitted. “I did it all alone in my room with my door shut. It was like a private hobby.” It was not an activity that made him the life of the party, so he became pretty much a loner, but that talent to save chips served him well when he decided to build a computer of his own.

Learning by building a smaller product first

The escapade ended after they got ripped off at gunpoint trying to sell one in a pizza parlor, but from the seeds of the adventure a company would be born. “If it hadn’t been for the Blue Boxes, there wouldn’t have been an Apple,” Jobs later reflected. “Woz and I learned how to work together.” Wozniak agreed: “It gave us a taste of what we could do with my engineering skills and his vision.”

Simplicity + focus

Devices should not need manuals. “That simplicity rubbed off on him and made him a very focused product person,” said Ron Wayne, who worked with Jobs at Atari. In addition, Bushnell was able to help mold Jobs into an entrepreneur.

Act like it

“He was interested not just in engineering, but also the business aspects,” Bushnell recalled. “I taught him that if you act like you can do something, then it will work. I told him, pretend to be completely in control and people will assume that you are.”

Woz and sharing

But he was so proud of his design that he loved standing in the back, showing it off to any who gathered around, and handing out the schematics. “I wanted to give it away for free to other people.” Jobs thought differently, just as he had with the Blue Box. And as it turned out, his desire to package and sell an easy-to-use computer—and his instinct for how to do it—changed the realm of personal computers just as much as Wozniak’s clever circuit design did.

Designing and selling

Jobs also started accompanying Wozniak to Homebrew meetings, carrying the television set and conducting the demonstrations, and he came up with a plan to sell circuit boards preprinted with Wozniak’s design. It was typical of their partnership. “Every time I’d design something great, Steve would find a way to make money for us,” said Wozniak. “It never crossed my mind to sell computers. It was Steve who said, ‘Let’s hold them in the air and sell a few.’ ”

From kits to fully assembled ones

By the time Jobs had finished his pitch, Terrell had agreed to order fifty of what became known as the Apple I computer. But he wanted them fully assembled, not just printed boards with a pile of components. It was another step in the evolution of personal computers. They would not be just for solder-gun-wielding hobbyists anymore.

From blue box to Apple I and Apple II

But the Apple II was the first personal computer to be simple and fully integrated, from the hardware to the software. It went on sale in June 1977 for $1,298, and within three years 100,000 of them were sold.

Controlling the user experience

The Apple II also established a doctrine that would become a religious creed for Steve Jobs: his company’s hardware was tightly integrated with its operating system software. He was a perfectionist who liked to control the user experience end to end.

UX principle - least surprise

“The goal was to give the user a conceptual model that was unsurprising,” Frankston explained. “It was called the principle of least surprise. We were illusionists synthesizing an experience.”

Oxymoron taglines

“In fact, our tagline in our ad had been ‘We set the standard,’ ” Gates recalled with a laugh. “But when we did in fact set the standard, our antitrust lawyer told us to get rid of that. It’s one of those slogans you can use only when it’s not true.”

Execution > Prototype > Idea

But Jobs was the first to become obsessed with the idea of incorporating PARC’s interface ideas into a simple, inexpensive, personal computer. Once again, the greatest innovation would come not from the people who created the breakthroughs but from the people who applied them usefully.

The first GUI

What caught his attention was the graphical user interface featuring a desktop metaphor that was as intuitive and friendly as a neighborhood playground. It had cute icons for documents and folders and other things you might want, including a trash can, and a mouse-controlled cursor that made them easy to click. Not only did Jobs love it, but he could see ways to improve it, make it simpler and more elegant.

GUI as a breakthrough for personal computing

Jobs had one major worry about Microsoft: he didn’t want it to copy the graphical user interface. With his feel for what would wow average consumers, he knew that the desktop metaphor with point-and-click navigation would be, if done right, the breakthrough that would make computers truly personal.

Integrated vs licensing

The primary reason for Microsoft’s success was that it was willing and eager to license its operating system to any hardware maker. Apple, by contrast, opted for an integrated approach. Its hardware came only with its software and vice versa.

From Minix to Linux

He read a book on operating systems by a computer science professor in Amsterdam, Andrew Tanenbaum, who had developed MINIX, a small clone of UNIX for teaching purposes. Deciding that he would replace MS-DOS with MINIX on his new PC, Torvalds paid the $169 license fee (“I thought it was outrageous”), installed the sixteen floppy disks, and then started to supplement and modify MINIX to suit his tastes.

Linus’ approach

Instead of trying to market what he had produced, he decided simply to offer it publicly. He had recently gone with a friend to hear a lecture by Stallman, who had become an itinerant global preacher for the doctrine of free software. Torvalds didn’t actually get religion or embrace the dogma: “It probably didn’t make a huge impact on my life at that point. I was interested in the technology, not the politics — I had enough politics at home.” But he did see the practical advantages of the open approach.

Using openness for innovation

Torvalds decided to use the GNU General Public License, not because he fully embraced the free-sharing ideology of Stallman (or for that matter his own parents) but because he thought that letting hackers around the world get their hands on the source code would lead to an open collaborative effort that would make it a truly awesome piece of software.

How to solve software bugs

In his book The Cathedral and the Bazaar, Eric Raymond, one of the seminal theorists of the open software movement, propounded what he called “Linus’s Law”: “Given enough eyeballs, all bugs are shallow.”

Best work + passion

The hacker corps that grew up around GNU and Linux showed that emotional incentives, beyond financial rewards, can motivate voluntary collaboration. “Money is not the greatest of motivators,” Torvalds said. “Folks do their best work when they are driven by passion. When they are having fun. This is as true for playwrights and sculptors and entrepreneurs as it is for software engineers.”

Stallman and Linus

Torvalds admitted to “not exactly being a huge fan” of Stallman, explaining, “I don’t like single-issue people, nor do I think that people who turn the world into black and white are very nice or ultimately very useful. The fact is, there aren’t just two sides to any issue, there’s almost always a range of responses, and ‘it depends’ is almost always the right answer in any big question.”

Coexisting all approaches

Each model had its advantages, each had its incentives for creativity, and each had its prophets and disciples. But the approach that worked best was having all three models coexisting, along with various combinations of open and closed, bundled and unbundled, proprietary and free. Windows and Mac, UNIX and Linux, iOS and Android: a variety of approaches competed over the decades, spurring each other on—and providing a check against any one model becoming so dominant that it stifled innovation.

Modem - another innovative link

The little device that finally created a connection between home computers and global networks was called a modem. It could modulate and demodulate (hence the name) an analog signal, like that carried by a telephone circuit, in order to transmit and receive digital information.
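
As a rough sketch of the modulate/demodulate idea (real modems are far more elaborate), frequency-shift keying turns each bit into a short burst of one of two tones, and the receiver recovers the bit by checking which tone correlates better:

```python
# Minimal frequency-shift-keying sketch: one tone per bit value, with a
# correlation-based receiver. Illustrative only, not a real modem standard.
import math

RATE, BAUD = 8000, 100                 # samples/sec, bits/sec
F0, F1 = 1200.0, 2200.0                # tone for 0, tone for 1
SAMPLES_PER_BIT = RATE // BAUD

def modulate(bits):
    signal = []
    for bit in bits:
        freq = F1 if bit else F0
        signal += [math.sin(2 * math.pi * freq * n / RATE)
                   for n in range(SAMPLES_PER_BIT)]
    return signal

def demodulate(signal):
    bits = []
    for i in range(0, len(signal), SAMPLES_PER_BIT):
        chunk = signal[i:i + SAMPLES_PER_BIT]
        energy = {}
        for freq in (F0, F1):
            c = sum(x * math.cos(2 * math.pi * freq * n / RATE)
                    for n, x in enumerate(chunk))
            s = sum(x * math.sin(2 * math.pi * freq * n / RATE)
                    for n, x in enumerate(chunk))
            energy[freq] = c * c + s * s     # how strongly this tone is present
        bits.append(1 if energy[F1] > energy[F0] else 0)
    return bits

data = [1, 0, 1, 1, 0, 0, 1]
print(demodulate(modulate(data)) == data)  # True
```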

Technology and people

In almost every decade of the Digital Revolution, the amused and amusing Stewart Brand found a way to stand at the locus where technology overlapped with community and the counterculture.

Online software vs hardware they run on

The software approach pioneered by Bill Gates would apply to the online realm as well: online services would be unbundled from the hardware and would work on all computer platforms.

Case and AOL

With no marketing dollars, Case needed a name that clearly described what the service did. And the name America Online accomplished that. AOL, as it became known, was like going online with training wheels. It was unintimidating and easy to use. Case applied the two lessons he had learned at Procter & Gamble: make a product simple and launch it with free samples.

Social media and community

As Case understood, the secret sauce was not games or published content; it was a yearning for connection. “Our big bet, even back in 1985, was what we called community,” he recounted. “Now people refer to it as social media. We thought the killer app of the Internet was going to be people. People interacting with people they already knew in new ways that were more convenient, but also people interacting with people they didn’t yet know, but should know because they had some kind of shared interest.”

Computers vs network

As a kid growing up on the edge of London in the 1960s, Tim Berners-Lee came to a fundamental insight about computers: they were very good at crunching step by step through programs, but they were not very good at making random associations and clever links, the way that an imaginative human could.

Tim Berners-Lee’s hobby

“You buy these little bags of microchips with your pocket money and you’d realize you could make the core of a computer.” Not only that, but you could understand the core of the computer because you had progressed from simple switches to transistors to microchips and knew how each worked.

Places and people for innovation

During his Oxford years, microprocessors became available. So, just as Wozniak and Jobs had done, he and his friends designed boards that they tried to sell. They were not as successful as the Steves, partly because, as Berners-Lee later said, “we didn’t have the same ripe community and cultural mix around us like there was at the Homebrew and in Silicon Valley.” Innovation emerges in places with the right primordial soup, which was true of the Bay Area but not of Oxfordshire in the 1970s.

Ideas floating around

Later he would realize a truth about innovation: New ideas occur when a lot of random notions churn together until they coalesce. He described the process this way: “Half-formed ideas, they float around. They come from different places, and the mind has got this wonderful way of somehow just shoveling them around until one day they fit. They may fit not so well, and then we go for a bike ride or something, and it’s better.”

Web protocols

Instead Berners-Lee insisted that the Web protocols should be made available freely, shared openly, and put forever in the public domain. After all, the whole point of the Web, and the essence of its design, was to promote sharing and collaboration.

Rapid, tight feedback loop

Mosaic was popular because it could be installed simply and enabled images to be embedded in Web pages. But it became even more popular because Andreessen knew one of the secrets of digital-age entrepreneurs: he fanatically heeded user feedback and spent time on Internet newsgroups soaking up suggestions and complaints. Then he persistently released updated versions.

Display and not editing

The emphasis on display rather than editing tools nudged the Web into becoming a publishing platform for people who had servers rather than a place for collaboration and shared creativity.

Advertising

“The whole business of using advertising to fund communication on the Internet is inherently self-destructive. If you have universal backlinks, you have a basis for micropayments from somebody’s information that’s useful to somebody else.” But a system of two-way links and micropayments would have required some central coordination and made it hard for the Web to spread wildly, so Berners-Lee resisted the idea.

Evan Williams and blogger

One of the basic lessons for innovation is to stay focused. Williams knew that his first company had failed because it tried to do thirty things and succeeded at none. Hourihan, who had been a management consultant, was adamant: Williams’s blogger scripting tool was neat, but it was a distraction. It could never be a commercial product. Williams acquiesced, but in March he quietly registered the domain name blogger.com . He couldn’t resist. “I have always been a product guy, and am just always thinking about products and thought this would be a cool little idea.”

Patience and stubbornness

But ingrained in Williams’s hardscrabble heritage was the patience of a corn farmer and the stubbornness of an entrepreneur. He had an abnormally high level of immunity to frustration. So he persevered, testing that hazy borderline between persistence and cluelessness, remaining placid as problems bombarded him. He would run the company by himself, from his apartment. He would tend to the servers and the coding himself.

Early signs of Wikipedia

“Suddenly I felt like I was back in grad school, and it was very stressful. I realized that the way we had set things up was not going to work.” That was when Wales and Sanger discovered Ward Cunningham’s wiki software. Like many digital-age innovations, the application of wiki software to Nupedia in order to create Wikipedia—combining two ideas to create an innovation—was a collaborative process involving thoughts that were already in the air.

Crowd intelligence

Sanger turned out to be wrong. The uncredentialed crowd did not run off the experts. Instead the crowd itself became the expert, and the experts became part of the crowd.

Web crawler

There was one obvious problem: with the number of websites increasing tenfold each year, there was no way to keep a directory updated by hand. Fortunately, there was a tool that was already being used to ferret out information that resided on FTP and Gopher sites. It was called a crawler, because it crawled from server to server on the Internet compiling an index.
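
A minimal sketch of that crawling-and-indexing idea might look like the following; the seed URL is hypothetical, and a real crawler would also need politeness delays, robots.txt handling, and deduplication at scale:

```python
# Toy crawler: fetch pages, record which words appear on which pages, and
# follow the links found on each page to discover new ones.
import re
from collections import defaultdict
from urllib.parse import urljoin
from urllib.request import urlopen

def crawl(seeds, limit=20):
    index = defaultdict(set)               # word -> set of pages containing it
    queue, seen = list(seeds), set()
    while queue and len(seen) < limit:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", errors="ignore")
        except Exception:
            continue
        for word in re.findall(r"[a-z]{3,}", html.lower()):
            index[word].add(url)
        for link in re.findall(r'href="(http[^"]+)"', html):
            queue.append(urljoin(url, link))
    return index

# index = crawl(["https://example.com/"])   # hypothetical seed URL
# print(sorted(index["domain"]))
```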

Hand-curated list vs web crawler

The Yahoo! team believed, mistakenly, that most users would navigate the Web by exploring rather than seeking something specific. “The shift from exploration and discovery to the intent-based search of today was inconceivable.”

To tell the truth or not

Page told his advisor Winograd that, according to his rough estimate, his Web crawler would be able to accomplish the task in a few weeks. “Terry nodded knowingly, fully aware it would take much longer but wise enough to not tell me,” Page recalled. “The optimism of youth is often underrated!”

UX - no user blaming

“Why are you giving people garbage?” Page recalled asking. The answer he got was that the poor results were his fault, that he should refine his search query. “I had learned from my human-computer interaction course that blaming the user is not a good strategy, so I knew they fundamentally weren’t doing the right thing. That insight, the user is never wrong, led to this idea that we could produce a search engine that was better.”

Refining

Page and Brin proceeded to refine PageRank by adding more factors, such as the frequency, type size, and location of keywords on a Web page. Extra points were added if the keyword was in the URL or was capitalized or was in the title.
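
PageRank itself started from link analysis: a page matters more when important pages link to it. Below is a minimal sketch of that basic idea in its simple power-iteration form (an illustration, not Google's production ranking); the keyword factors mentioned above would then be blended with a score like this.

```python
# Basic PageRank power iteration over a tiny hypothetical link graph.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            share = rank[page] / len(outgoing) if outgoing else 0
            for target in outgoing:
                new_rank[target] += damping * share   # pass importance along links
        rank = new_rank
    return rank

web = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
print(sorted(pagerank(web).items(), key=lambda kv: -kv[1]))  # "c" ranks highest
```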

Getting user feedback

Page and Brin pushed to make Google better in two ways. First, they deployed far more bandwidth, processing power, and storage capacity to the task than any rival, revving up their Web crawler so that it was indexing a hundred pages per second. In addition, they were fanatic in studying user behavior so that they could constantly tweak their algorithms. If users clicked on the top result and then didn’t return to the results list, it meant they had gotten what they wanted. But if they did a search and returned right away to revise their query, it meant that they were dissatisfied and the engineers should learn, by looking at the refined search query, what they had been seeking in the first place.
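
A toy illustration of that click signal (not Google's actual pipeline, and with hypothetical session data) could count a query as satisfied when the user clicks the top result and does not immediately refine, and otherwise keep the refined query as a hint about what was really wanted:

```python
# Hypothetical session log: (query, action) pairs in time order.
log = [
    ("jaguar", "click_top"),
    ("jaguar car price", "refine"),
    ("python tutorial", "click_top"),
    ("python tutorial", "done"),
]

satisfied, refinements = 0, []
for (query, action), nxt in zip(log, log[1:] + [(None, "done")]):
    if action == "click_top" and nxt[1] != "refine":
        satisfied += 1                        # got what they wanted
    elif nxt[1] == "refine":
        refinements.append((query, nxt[0]))   # original query -> refined query

print(satisfied)      # 1
print(refinements)    # [('jaguar', 'jaguar car price')]
```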

Start a company when you are compelled to

“Those companies were worth hundreds of millions or more at the time,” Page later said. “It wasn’t that significant of an expense to them. But it was a lack of insight at the leadership level. A lot of them told us, ‘Search is not that important.’ ” As a result, Page and Brin decided to start a company of their own.

Key innovations

Machines such as these emerged in the 1950s, and during the subsequent thirty years there were two historic innovations that caused them to revolutionize how we live: microchips allowed computers to become small enough to be personal appliances, and packet-switched networks allowed them to be connected as nodes on a web.

Brute force win

But as she was the first to admit, these were not true breakthroughs of humanlike artificial intelligence. Deep Blue won its chess match by brute force; it could evaluate 200 million positions per second and match them against 700,000 past grandmaster games. Deep Blue’s calculations were fundamentally different, most of us would agree, from what we mean by real thinking.
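
For contrast with learning approaches, here is a minimal sketch of the brute-force game-tree idea, plain minimax over a tiny stand-in game; Deep Blue layered enormous custom hardware, pruning, and opening and endgame databases on top of search like this:

```python
# Plain minimax: search every line of play to a fixed depth and back up scores.

def minimax(state, depth, maximizing, moves, apply_move, evaluate):
    options = moves(state)
    if depth == 0 or not options:
        return evaluate(state)
    scores = (minimax(apply_move(state, m), depth - 1, not maximizing,
                      moves, apply_move, evaluate) for m in options)
    return max(scores) if maximizing else min(scores)

# Tiny stand-in game: players alternately add 1 or 2 to a counter; the
# evaluation simply prefers even totals for the maximizing player.
moves = lambda total: [1, 2] if total < 10 else []
apply_move = lambda total, m: total + m
evaluate = lambda total: 1 if total % 2 == 0 else -1

print(minimax(0, 6, True, moves, apply_move, evaluate))
```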

Guided information for humans

“Our early experience was with wary physicians who resisted by saying, ‘I’m licensed to practice medicine, and I’m not going to have a computer tell me what to do.’ So we reprogrammed our system to come across as humble and say, ‘Here’s the percentage likelihood that this is useful to you, and here you can look for yourself.’ ” Doctors were delighted, saying that it felt like a conversation with a knowledgeable colleague. “We aim to combine human talents, such as our intuition, with the strengths of a machine, such as its infinite breadth,”

Collaboration with clear vision

The most successful endeavors in the digital age were those run by leaders who fostered collaboration while also providing a clear vision. Too often these are seen as conflicting traits: a leader is either very inclusive or a passionate visionary. But the best leaders could be both. Robert Noyce was a good example. He and Gordon Moore drove Intel forward based on a sharp vision of where semiconductor technology was heading, and they both were collegial and nonauthoritarian to a fault.

Product people

Most of the successful innovators and entrepreneurs in this book had one thing in common: they were product people. They cared about, and deeply understood, the engineering and design. They were not primarily marketers or salesmen or financial types; when such folks took over companies, it was often to the detriment of sustained innovation.