‘Woz,’ Jobs Planted Seeds of a Revolution
Once, when the word “revolutionary” described a computer product, it was more than a marketing cliché.
A quarter-century ago two friends working in a Silicon Valley garage began a business that became Apple Computer, and it transformed the world’s relationship to computing.
Steve Wozniak was a 25-year-old underappreciated engineering grunt at technology giant Hewlett-Packard and a mainstay of a Silicon Valley hobbyist group, the Homebrew Computer Club.
He became friends with fellow club member Steve Jobs, a 20-year-old college dropout and occasional employee of computer-game pioneer Atari.
“Woz,” an unassuming engineering genius, found his efforts to build a personal computer rebuffed by HP as unmarketable. Jobs had relatively pedestrian engineering skills but understood that Woz’s designs could become the first practical computer for individual users.
Working on a shoestring, they created the Apple I in 1976--a crude precursor to the PC--and sold it for $666.66 to computing zealots in the Bay Area. Apple sold only a handful of Apple I’s, but those sales funded a successor that would build a major corporation.
*
In 1977 the $1,295 Apple II was born. It was the first computer to display color images, the first to come with a keyboard and the first to offer a “killer app”--VisiCalc, a spreadsheet program that automated routine tasks like budget planning and made personal computing a business advantage for the first time.
By 1981, the company had sold 300,000 Apple IIs; by early 1985, about 2.5 million had been sold. As games and other software were written for the Apple II, it gained a huge share of the fast-growing home and school markets and eclipsed offerings from its chief competitors of the day, Radio Shack and Commodore.
Apple’s success also forced mainframe computer giant IBM to release its own PC in 1981. It became an instant success due to Big Blue’s marketing muscle but did not match the Apple II’s capabilities.
When Apple made its initial offering of stock to the public in 1980, Jobs’ shares were worth $217 million and Wozniak saw his stake soar to $116 million. Three years later, Apple joined the Fortune 500.
While Woz provided engineering elegance, Jobs contributed a contempt for convention. His pursuit of innovation led to a 1979 visit to Xerox Corp.’s Palo Alto Research Center, or PARC.
Years before, PARC scientists had invented (or borrowed from Douglas Engelbart, a pioneering scientist at the Stanford Research Institute) many of the underlying technologies upon which the future of computing was built: the graphical interface, networking, the mouse pointing device, the laser printer and the ability to display any shape, rather than just text and numbers.
That visit dramatically altered Apple, and all computing. It sparked a five-year marathon to produce a marketable improvement on PARC’s ideas.
*
That race was the Macintosh project.
Jobs hammered his Macintosh team with the idea that they could save the world from the colorless mediocrity of mainstream computing and create an experience that was both functional and compelling.
He drove the developers relentlessly, alternating charismatic leadership--a quality that ultimately fed a cultish movement of Macintosh devotees--with infamous tirades. He castigated top engineers with obscenity-laced diatribes and alienated colleagues, partners and customers alike.
Yet more than anyone else, Jobs ended the era of text-based systems, those obscure number and letter commands that formed the basis of nearly all computing before the Mac. Apple’s mouse and point-and-click system of visual metaphors on the screen--the desktop, folders, clocks--formed the basis of modern computing iconography, including that of the Internet.
The new era was launched with a stunning attack on IBM via a TV commercial. A giant, Orwellian head preached totalitarian propaganda to drone-like onlookers. It was shattered with a sledgehammer as the narrator spoke: “On Jan. 24, Apple Computer will announce Macintosh. And you’ll see why 1984 won’t be like 1984.”
Unfortunately, the pricey $2,495 Mac could run only a handful of software programs--compared with thousands for the Apple II and the IBM PC, which by then had millions of buyers.
The Mac was saved by its own killer app: PageMaker. With the Macintosh and Apple’s LaserWriter printer, the program created desktop publishing, a simple method of composing text and graphics on screen that irrevocably altered the publishing industry.
Mac sales rose rapidly, and Apple’s graphical approach began to look like computing’s inevitable future.
Jobs and his successors believed that by keeping the Mac’s design proprietary they could corner the market on graphical computing. That assumption nearly buried the company.
After years of effort, Microsoft borrowed from the Mac’s design and created a more usable version of its own graphical software, Windows 3.0, in 1990. Though widely viewed as inferior knockoffs of Apple’s technology, Windows-based computers easily outsold Macs because Microsoft licensed its software to all comers.
Apple sued Microsoft, claiming copyright infringement. But long before Apple’s suit failed in 1995, Bill Gates was on his way to becoming the most powerful businessman of the computer age.
Long before that, Jobs himself had been forced out of Apple. In 1983 he had recruited John Sculley from PepsiCo to bolster Apple’s marketing; two years later, Sculley pushed him out.
Wozniak left Apple in 1981 while recovering from a plane crash. He returned to school to earn his bachelor’s degree in engineering and launched an abortive career as a rock-festival promoter. Though Woz returned to Apple briefly, he ultimately became a grade-school computer teacher and administrator.
In 1997 Jobs sold his firm, NeXT, to Apple. Then the wayward founder returned to save his old company--in steep decline after a succession of mediocre products.
Amazingly, Apple once again shamed the rest of the industry with its creativity and nerve. Radical designs--most notably the fruit-colored iMac--broke new ground and again shattered competitors’ complacency, provoking many to strive for a form of computing that people can love.