“In 1969, the Neiman Marcus catalog offered the first home PC, a stylish stand-up model called the Honeywell Kitchen Computer, priced at $10,600. The picture shows an aproned housewife caressing the machine, with this tag line: “If she can only cook as well as Honeywell can compute.” That image should be on every cubicle in Silicon Valley; it's a testament both to what technologists get right and what they get badly wrong.
To their credit, they understood that Moore's law would bring computing within the reach of regular people. But they had no idea why anyone would want it. Despite countless brainstorming sessions and meetings on the subject, the only application the Honeywell team could think of for a home computer (aside from the perennial checkbook balancing) was recipe card management. So the Kitchen Computer was aimed at housewives and featured integrated counter space. Those housewives would, however, require a programming course (included in the price), since the only way to enter data was with binary toggle switches, and the machine's only display was binary lights. Needless to say, not a single Kitchen Computer is recorded as having sold.
Today, of course, we have computers in every home—and in every pocket and car and practically everywhere else. But one of the few things the average person doesn't use them for is managing recipe cards.
Don't blame Honeywell—blame the computing world of the 1960s. In those days, computers were expensive mainframes. Because processing power was so scarce and valuable, it was reserved for use by IT professionals, mostly working for big companies and the government. Engineers both built the computers and decided how to use them—no wonder they couldn't think of nonengineering applications.
But as the Kitchen Computer hinted, computers would soon get smaller and cheaper. This would take them out of the glass boxes of the mainframe world—and away from the IT establishment—and put them in the hands of consumers. And the real transformation would come when those regular folks found new ways to use computers, revealing their true potential.
All this was possible because Alan Kay, an engineer at Xerox's Palo Alto Research Center in the 1970s, understood what Moore's law was doing to the cost of computing. He decided to do what writer George Gilder calls "wasting transistors." Rather than reserve computing power for core information processing, Kay used outrageous amounts of it for frivolous stuff like drawing cartoons on the screen. Those cartoons—icons, windows, pointers, and animations—became the graphical user interface and eventually the Mac. By 1970s IT standards, Kay had "wasted" computing power. But in doing so he made computers simple enough for all of us to use. And then we changed the world by finding applications for them that the technologists had never dreamed of.
This is the power of waste. When scarce resources become abundant, smart people treat them differently, exploiting them rather than conserving them. It feels wrong, but done right it can change the world.”