The value of Non-Fungible Tokens (NFTs) has decreased around 90% since September 2021, dovetailing with a decline in the larger crypto market that has seen entire currencies wiped out. On this week’s Now & Then episode, “The Mirage of Money (or, NFTs, WTF?),” Heather Cox Richardson and Joanne Freeman discussed the societal impact of past American booms and busts, from the collapse of the securities market in 1792, to the Yukon Gold Rush of the 1890s, to the dot-com bust of the early 2000s. A two-year downturn in the computer industry beginning in 1985 led to another one of these large-scale reckonings, ultimately morphing into a philosophical discourse about the role of technology in everyday life. 

For its 1984 holiday party, Silicon Valley computer chip manufacturer Advanced Micro Devices Inc. (AMD) hosted a $700,000 bash with the band Chicago and a 50-piece orchestra. By mid-1985, the company had closed its factory for two weeks to stem losses and had initiated an austerity program called “STAUNCH,” or “Stress Those Actions Urgently Needed to Check Hemorrhaging.”

In a June 1985 New York Times article entitled “Computer Makers in a Severe Slump,” reporter Andrew Pollack profiled AMD and sought explanations from tech CEOs for the sudden downturn in the computer industry, which plummeted from 15% yearly growth in 1984 to 1.8% over the first eight months of 1985, even as the stock market continued to rise.

Hewlett-Packard President John A. Young simply told Pollack, “We’re having a hard time correlating what’s going on.”

“What we’re going through is a large-scale hangover from a speculative orgy,” added tech investor Bill Hambrecht, who had helped organize Apple and Genentech’s frenzied 1980 Initial Public Offerings. 

Notwithstanding Young and Hambrecht’s broad assessments, most analysts agreed on a series of specific causes for the slump. Most central was the strength of the U.S. dollar; computer sales in Europe, whose currencies were far weaker, remained high, while many American corporations began searching for cheaper equipment abroad.

Additionally, the sheer number of computer startups had oversaturated the still-youthful market. “The user population was like a starving man sitting down at a banquet,” International Data Corporation analyst Aaron Goldberg told Pollack. “They’ve feasted for two years and now can’t eat any more.”

Computer companies had also arguably over-prioritized hardware, offering countless new models without adequately improving software to make the user experience easier. “The customers don’t want bigger machines,” Gartner Group research firm president Thomas Crotty told the Los Angeles Times in April 1986, a year into the slump. “They want software to make the machines do the job.”

Syndicated Washington Post columnist Art Buchwald, a notorious satirist of the computer market, wrote a March 30th, 1985 op-ed capturing the finger-pointing nature of the collapse: “The retailers blame the manufacturers for advertising products that do not exist. The manufacturers fault their sales force for failing to move the machines out of the warehouse. The salespeople blame the market research departments for predicting everyone in America was dying for a home computer, and the market research people say the public lied to them.” 

Whoever was to blame, the human cost of the downturn was becoming clear by the end of June 1985. Apple Computer announced plans to lay off 1,200 employees. Andrew Grove’s Intel looked to cut 1,000 staffers, while Massachusetts-based Data General prepared to let go another 1,300.

The slump was dramatized by the personal plight of 30-year-old Apple co-founder Steve Jobs. As Apple’s woes intensified in May 1985, Jobs talked to reporters outside Tavern on the Green in New York. “I think Henry Ford probably had a few bad quarters in the 1920s,” Jobs told the press. 

Jobs’s dismissive remarks, however, belied a rapidly worsening power struggle with his hand-picked CEO, PepsiCo veteran John Sculley, who Jobs felt had sidelined him. Jobs quit on September 16th, 1985, writing in his resignation letter, “I am but 30 and want still to contribute and achieve.” The New York Times even wrote a feature on Jobs that almost read like an obituary, entitled “Apple Computer Entrepreneur’s Rise and Fall.” 

By 1986, the slump had sparked an existential reckoning over the ideal purposes of microprocessing. In July 1986, Washington Post columnist Peter Behr penned a mournful op-ed called “Pity the Poor Computer.” Behr synthesized the questions swirling in the industry: “Researchers are looking anew at the computer revolution, asking some sobering questions: ‘Has the promise of the technology been oversold? Is the information computers assemble expanding too rapidly for human operators to absorb? Do investments in technology really justify the costs?’”

As Behr wrote, struggling Silicon Valley firms were beginning to court the Department of Defense, a revenue source that had mostly been dominated by the Massachusetts-based computer firms clustered in offices around Route 128. In January 1986, Newsweek’s Michael Rogers profiled Rational, a company whose main product was a $600,000 development computer useful in automating space stations. In sharp contrast to dropouts and counterculture-adjacent entrepreneurs like Jobs, Rational founders Paul Levy and Mike Devlin were straitlaced Air Force Academy graduates. 

Rogers ended his profile with a rumination on the shifting priorities of the sector: “Dreams and visions have always been Silicon Valley’s raw material, and so the current search for identity may be simply a scaling-down of the dream. If the consummate Silicon Valley vision of the ‘70s was to grow wealthy by changing the world, the goal of the ‘80s may be to play the niches—and prosper.”

The willingness of Silicon Valley firms to make pacts with the military reflected uncertainty not only about the tech economy, but also about the appeal of working in tech at all. In 1986 alone, Northeastern University in Boston saw a 40% dip in computer science enrollments, while a nationwide survey found that 3.5% of entering college freshmen were interested in majoring in computer science, down from 8.5% in 1982. 

“We are riding the coattails of a new trend away from science and engineering and more to law and business, which seem less arduous,” Joseph Flaherty, the chairman of the computer science department at Rensselaer Polytechnic Institute, told the Boston Globe in January 1987. 

By 1987, however, the worst had passed. IBM and other top companies were reporting higher-than-expected earnings. Tech stocks were rising twice as fast as the rest of the market. New ways of networking computers and sleeker desktop interfaces began to open the collective imagination toward the Internet.

Computer game enthusiast and children’s book author Dan Gutman summed up the boom and bust cycle in the December 1987 edition of COMPUTE’s Apple Applications. Of the dry spell, Gutman wrote, “The computer slump began and all those industry analysts who had predicted a computer in every home had changed their minds. Suddenly everybody was saying that the home computer was a fad, just another hula hoop.” 

The 1987 reversal, Gutman argued, was just the beginning: “Apple is selling Macs and Apple IIs like frozen daiquiris on a hot day at the beach…I guess the home computer isn’t just another Pet Rock after all.” 

At a triumphant February 1987 Personal Computer Forum in Phoenix, John Sculley breathlessly proclaimed that the Jobs-less Apple was on the upswing. Sculley, conscious of the software critiques that had proliferated during the downturn, discussed the launch of Apple’s new Open Mac, or Macintosh II, which could more easily incorporate outside operating systems. “I am more optimistic about the outlook for the industry than at any time since I have been in it,” Sculley told the 400 assembled computer executives. 

The durability of the recovery was demonstrated in October 1987, when the market collapse on Black Monday did little to affect the computer sector. At the 100,000-attendee Computer Dealers’ Exhibition (COMDEX) in Las Vegas, keynote speaker James Manzi, the CEO of software giant Lotus Development, captured the renewed optimism: “Looking around at COMDEX, I can see that we have everything going for us to do gangbuster business in the coming year, recession be damned.”

Amid the renewed perception of a boom, however, some analysts believed that a certain idealism and innocence was forever gone. “The mystique is out of computers,” Morgan Stanley computer consultant Ulric Weil told the Wall Street Journal in early 1987. “Regrettably, it’s not as much of a fun industry to follow as it used to be. It’s more painful now.” 

Whether the cryptocurrency and NFT marketplace is simply in a maturation phase remains to be seen. But the hubbub surrounding the loss of value brings many of the same pronouncements of doom that accompanied the computer slump–and raises the possibility of new political and economic adjustments to come. As Joanne said at the end of this week’s episode, “I love the idea that the same kind of mirage of money, and energies, and wishes, and desires that push people forward in these efforts have an outcome that–at least in part–is progress.”

For a colorful take on the highs and lows of 1980s Silicon Valley, read Robert X. Cringely’s 1992 Accidental Empires: How the Boys of Silicon Valley Make Their Millions, Battle Foreign Competition, and Still Can’t Get a Date.

And head to the Twitter account of author and Now & Then Editorial Producer David Kurlander for supplemental archival threads on each Time Machine piece: @DavidKurlander.

To receive Time Machine articles in your inbox, sign up for the CAFE Brief newsletter, sent every Friday.

The Time Machine Archive  

Catch up on some recent Time Machine deep dives into history: