Bits, Bytes, and Money!
The clip above (from 1981) is overflowing with predictions that came true. It was filmed when the personal computer was still in its infancy.
It is astonishing how many of the predictions alluded to here became our reality not long after this segment originally aired. The piece features CBS News reporter Barry Peterson covering the rise of the personal computer’s popularity in the early 1980s. Here we are watching documentation of the birth of new ideas and new words. Viewed from today’s vantage point, this clip has a prophetic quality to it.
Peterson and his interviewees come to the very brink of using terms we all take for granted today. However, those terms were not yet in the collective vocabulary at the time of Peterson’s report. It is strange to watch because we know all these ideas did come to fruition and what they are now called. Terms like “graphic design,” “laptop,” “MIDI music,” “Internet,” “World Wide Web,” and “computer-generated imagery” are all alluded to but never spoken. These words seem to be on the tip of everyone’s tongue, yet they were not yet part of the collective lexicon.
It is interesting to hear Peterson use the words “Personal Computer.” You can tell it is a new term to him. He tries the phrase on for size during the first part of his report but then switches to “small computer” to describe his subject by the end of it. This may even be the first time the phrase “Personal Computer” was uttered in a television news report.
Dan Rather starts off the segment looking dapper. Steve Jobs, already a multimillionaire, sports a full beard in the clip. Note Jobs’s reference to Orwell’s 1984! Even then he had it on his mind. Later, he would introduce Apple’s “1984” ad during the Super Bowl, a spot that is world-renowned today.

I love the shots of the low-bit graphics we catch glimpses of on the computer screens shown in this clip. The old blocky digital images take me back. My brother and I played the Intellivision game console in our house growing up. My Dad and Mom were into it as well. It was the world’s first 16-bit game console. Intellivision was the biggest competitor to the Atari game system and touted “better graphics” and “more sophisticated” gameplay. I loved our Intellivision. I remember being unbeatable at Intellivision’s Baseball. I even got into a physical fight with a friend over the game. It infuriated him to be struck out every turn at bat. Shutouts are great in real life, but they tick off your average 11-year-old.
All of those old games and the graphics seen on the monitors in this report look so childlike when compared to today’s systems. They are the primitive drawings on our digital cave walls. Imagine what gaming will be like in five, ten, or 50 more years. I foresee a completely immersive virtual experience (like Star Trek’s Holodeck) in video gaming, and in computing in general; it is not far off. When it happens, we will plug in and simply seem to be somewhere else. We are getting close with Apple Vision Pro and Meta Quest.
I noticed in the segment Peterson says “Jobs and a partner founded Apple Computer.” The “partner” of course is the great Steve Wozniak (inventor of the Apple I), who has since become a household name. Wozniak is now credited with inventing the personal computer as we know it today. There is no mention of the Altair in this segment. It was one of the first home offerings to techie hobbyists. There were computer kits available before the Altair, such as the Mark-8. However, the Altair is regarded as the first true personal computer; assembly was minimal when compared to its predecessors. Anyone who has seen Bob Cringely’s three-part documentary “Triumph of the Nerds” on PBS knows the Altair’s significance to personal computer history. However, it is Steve Wozniak who took us all to the next level. He has been called the Mozart of computer chip technology.

This video clip even documents for us the rollout of the IBM personal computer. Back then, IBM was attempting to tap into the market that Apple had found (or created) and ruled as kings. Those first IBM home computers were then “cloned” by many other manufacturers wanting to get in on the action as well. This cloning of the IBM PC caused the Apple Computer world to become flooded by a sea of PCs. The cloning was not illegal because IBM did not have a product created from the ground up; they instead had an assembled computer made from readily available parts. The pieces IBM did design themselves, most famously the BIOS firmware, were reverse-engineered and then copied by the cloners. IBM had little legal means to stop anyone who wanted to clone their product.
Apple was no longer the only “small computer” to be found, but they were still about 10 years ahead of the competition. Arguably, that gap has not been closed to this day. Apple is still a company that other manufacturers look at to see what direction the industry should be heading. There were in fact so many clones made of the IBM PC that IBM eventually had to close the doors on its own personal computer facilities. Some successful IBM clone makers were Dell, Hewlett-Packard, and Gateway. Hence, IBM’s own product was outsold by the clones of itself!
This is how Bill Gates became the richest person on Earth. The IBM PC and, subsequently, ALL the clones ran Bill’s operating system, MS-DOS. Bill Gates’s OS ran on practically every computer being sold except Apple’s. Gates then moved to the Windows OS, running it on top of DOS. This is ironic because he got the Windows GUI-style OS from Apple! Windows was a knockoff of what Apple had already been doing. Apple sued Microsoft over it and lost. The court ruled that Apple couldn’t copyright the “look and feel” of computing. In the lawsuit, it was also determined that Apple had already licensed much of the GUI platform to Microsoft in a previous agreement. I think they “gave” it to them so the Microsoft programmers could port the Office suite over to the Mac OS, not so they could take it. I’m not a judge, though, so what do I know?
Apple did keep much stricter control over the cloning of their computers, however. Apple designed its computers with legally copyright-protected firmware built right into the ROM chips. This suppressed the clone makers, as it made cloning an Apple both too difficult and too expensive, and it put any would-be cloners at great legal risk.
Going even further, Apple then began to sell licenses to clone their products to anyone willing to pay a royalty. This gave Apple more control over who, if anyone, could clone them. There were some successful Apple clones made during this time. Some people believe Apple should have encouraged the cloning of their computers even more. Steve Jobs later decided the clone market was not helping Apple and that cloning was not the way to go. He raised the royalty fee, and the Apple clone makers dropped out of sight. A short time after that, Apple released the first iMac, one of the most successful personal computers ever sold. In the year of its release, the iMac outsold previous Apple computers and PCs as well.
Apple was not as successful at legally holding onto their OS designs, however. Some argue that Apple was handed an unfair judgment in that case. Others believe a GUI operating system was an inevitable step in computing. In one episode of Seinfeld, Jerry and George Costanza are talking in the coffee shop about which explorer they most admire. George says “I like de Soto” because “he discovered the Mississippi.” Jerry is not impressed and says something like, “Oh, like they wouldn’t have found that anyway...it runs through the entire country!” For some, a GUI computing environment seems like it just had to happen sooner or later as well. Whatever you believe, we must credit the folks at Apple for being the first to introduce the world to computing the way it should be. They were the first to gaze into the crystal ball and run with what they saw. They were the first to develop the GUI into something we all would use.
Apple and Microsoft eventually began a new partnership. In August of 1997, Microsoft founder Bill Gates came up on the big screen at the Boston Macworld Expo. Steve Jobs had just re-taken the helm at Apple, and he announced that Microsoft would invest $150 million in Apple’s future. Many in the crowd booed at seeing Gates appear on the screen. Others cheered. Steve Jobs, who had invited Gates, thanked him. Following this announcement, Apple’s iTunes and iPods were eventually ported to work on PCs. Microsoft’s Office (a favored software suite on Apple computers) would remain available on the Mac for the foreseeable future. Bill Gates and Steve Jobs had forged new deals that mutually benefited each other’s interests. Steve Wozniak has also come out publicly and said Microsoft is not the enemy. The "Pirates of Silicon Valley" made peace.

The “crystal ball” that Jobs gazed into is said to be the Xerox Palo Alto Research Center (PARC). So the story goes that some of the building blocks that became the first Macintosh Computer were shown to Jobs on a visit to the PARC. According to former Apple employee Jef Raskin (who has been called the father of the Macintosh), it was incorrectly reported time and again that the Macintosh project was started because of what Jobs saw at the PARC. Raskin said the Macintosh team was already assembled and working hard on the project before Jobs even went to visit the PARC. In fact, according to Steve Wozniak, it was Raskin who first went to the PARC and then he told Jobs to go as well.
Raskin wrote: “For example, in Stross’s book (Steve Jobs and the NeXT Big Thing) he speaks of Xerox’s Palo Alto Research Center (PARC), ‘...like Old Testament genealogy, every important development in personal computers traces back to this same single source.’ To be sure, PARC’s influence was broad, deep, and beneficial, but it was by no means the ‘single source’ of ‘every important development.’ Stross’s blanket claim ignores the influence of Sutherland’s far earlier Sketchpad system, Engelbart’s prior conception of the mouse and windows, that the all-important invention of the microprocessor itself did not take place at PARC, and that the people who created the early personal computers (Apple I, SOL, Poly 88, Heath H8, IMSAI, Altair, PET, etc.) generally knew nothing of and took nothing from PARC. Many significant examples of influential software that did not derive from PARC’s work, such as the systems written by Bill Gates, Gary Kildall, and Steve Wozniak, also come to mind.”

Raskin then goes on to say: “Strangely, by misattributing everything to PARC, the true contribution of PARC (insofar as we can evaluate it at such a small historical distance) is also diminished. A blanket ‘everything’ often leaves the impression that what you see on the Mac and Windows is the sum of what PARC did. But the people at PARC have done much more than that, not only with regard to interfaces, but in many other independent and collateral areas of computer science, and they continue to do significant and pioneering work...If Stross or Levy had gone back and read the works I had written before PARC was founded, or even interviewed the people I had known at PARC, they would have learned that many of the Mac’s key concepts had had an independent genesis...A good story will often beat out the dull facts into print.” These excerpts are from Jef Raskin’s article “Holes in the Histories.”
Much of today’s personal computer technology was budding at PARC. They had developed a computer using these technologies, called the Xerox Alto. The Alto fit under a desk and was about the size of a small refrigerator. Its cost was between $10,000 and $40,000, depending on who you ask. Xerox could have become Apple, IBM, Microsoft, Hewlett-Packard, and Epson combined with what they had at the PARC. However, the executives at Xerox did not see the wisdom in the continued development of these technologies.
People at Apple such as Raskin and Jobs saw the huge potential in the GUI (Graphical User Interface) if it could be developed further and then mass-marketed. With the advent of the Macintosh, Apple successfully upgraded the GUI and married it to a very small computer. The personal computer industry was changed forever. Apple continued this tradition of innovation following the release of the Macintosh in 1984. The Apple computer’s hardware was always “married” to the OS at every point possible. With the release of OS X (pronounced "O-S ten") in March of 2001, Apple once again raised the bar on what an operating system could be for end users.
The current operating systems for both Macs and PCs are classified as “modern” operating systems. I foresee a “Zooming User Interface,” or “ZUI” for short, coming next. I loved OS X (and I love macOS 15 Sequoia), but the ZUI ideas seem made for VR integration. I predict that a ZUI is going to be part of the next evolution in personal computer technologies and gaming. It has already begun. Raskin was working toward that end (among other things) after he left Apple. Raskin wrote the book “The Humane Interface,” and he built a software project called “The Humane Environment,” or THE (later renamed Archy). Both deal with the same subject: how we all interact with our computers. Raskin worked constantly to improve the prevailing models of interacting with computers.
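To make the ZUI idea concrete, here is a minimal sketch in Python of the one mechanic every zooming interface shares: an infinite canvas of items, and a viewport that zooms about a point so the spot under your cursor stays put as the world scales. This is only my own illustration of the concept; the names and numbers are hypothetical, and it is not Raskin’s actual design.

# A toy Zooming User Interface model (hypothetical names, my own sketch).
from dataclasses import dataclass

@dataclass
class Item:
    name: str
    x: float     # position on the infinite canvas (world units)
    y: float
    size: float  # extent of the item in world units

@dataclass
class Viewport:
    cx: float = 0.0    # world point currently at the center of the screen
    cy: float = 0.0
    zoom: float = 1.0  # screen pixels per world unit

    def zoom_about(self, px: float, py: float, factor: float) -> None:
        """Scale the view while keeping the world point under screen
        position (px, py) fixed -- the way map apps zoom toward the cursor."""
        wx = self.cx + px / self.zoom   # world point under (px, py) right now
        wy = self.cy + py / self.zoom
        self.zoom *= factor
        self.cx = wx - px / self.zoom   # re-center so it stays under (px, py)
        self.cy = wy - py / self.zoom

    def to_screen(self, item: Item) -> tuple[float, float, float]:
        """Project an item into screen coordinates (x, y, apparent size)."""
        return ((item.x - self.cx) * self.zoom,
                (item.y - self.cy) * self.zoom,
                item.size * self.zoom)

# Zoomed far out, every document is a speck; zoom in and one of them
# fills the screen. No windows to open, no menus to hunt through.
canvas = [Item("essay.txt", 0, 0, 1.0), Item("photos", 500, 200, 1.0)]
view = Viewport()
view.zoom_about(0, 0, 8.0)  # dive toward the canvas origin
for item in canvas:
    print(item.name, view.to_screen(item))

The whole interface reduces to that one transform. Swap the 2-D canvas for a 3-D scene and the same zoom-about-a-point math is what a headset does when you lean in, which may be why the ZUI ideas feel so well suited to VR.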
No one can tell where we will end up, but as Barry Peterson’s news segment above shows so well, we can probably read the tea leaves. I'll take a stab at making some predictions of my own. I’m confident students will soon patch into camera-mounted robots on the surfaces of the Moon, Mars, and other planets. The kids will be able to explore them virtually. To them (and to us), it will feel like being there. The same will be true for exploring the ocean floor, climbing Mount Everest, and visiting famous tourist attractions. You could still buy that plane ticket, but you won’t need to anymore. Do it from the comfort of your classroom or living room. Cool! I also think it isn’t a stretch at all to claim that seeing a concert or sporting event virtually will become an option. Promoters can sell thousands more tickets, and the “viewers” will be court-side, ringside, and perhaps even right up on the stage with the performers. The audio will be high definition, and the visual experience will, in some instances, surpass actually being there. Plus, no fighting traffic after the big event! This innovation won’t do away with “going out” to see a show. It will just be another option.

Raskin was developing a ZUI before he passed away. There was a sample demo of the feel of Raskin’s Zooming User Interface on the web, but it was removed. Even the beginning stages of it gave me a glimpse of the potential. Of course, to get the full effect of a futuristic Zooming User Interface, we will have to wear the (currently) stupid-looking virtual reality headsets. What we are collectively waiting on is the “fix” to the problem of the cumbersome goggles and headgear we are currently being offered. Most people are rightfully repelled by how uncool and ugly these things are. They are still so cringe! Will the fix come in the form of a pair of Bono-style glasses? Will it be VR contact lenses? Someday, perhaps. Think how the very first computers were the size of a large room. Then computers became the size of a washer or dryer, then a refrigerator, then a desktop tower, then a laptop, and finally, a computer that fits into our pockets! If the virtual environments and the augmented reality experience are good enough, consumers will flock to the “cringe-fix” when it becomes available.
Interestingly, many of the proposed uses for the “small computers” in Peterson’s television report have not changed much from when this segment aired. If anything, the original uses have been developed even further over time. The makers of the software may have grown or changed since then but we are still using computers to teach, to make music, for gaming, in finance, for graphic arts, and more.
Even when the personal computer was brand spanking new, all eyes were fixed on its future. Predictions about what the new invention would become were made with stunning accuracy. The father of the IBM PC, Don Estridge, even mentions in this historic clip that the computer might before long become “the world’s largest backyard fence to talk over.” I noticed he says this with a big smile, as if he thinks what he just said is a great-sounding idea. Well, today’s internet, texting, video conferencing (and the proliferation of social networks) prove Estridge was right on the money and then some. He was a giant in this industry who genuinely looked into the future. In the clip, Peterson also refers to people who talk about computers one day being “the size of a book” as “visionaries.” Nowadays it seems everyone and their grandmother has an iPad, a laptop, or both. We kind of take it all for granted today, don’t we? We tend to forget just how visionary all this stuff truly is.
Peterson goes on to talk about designers creating “computer pictures” on the “small computers.” Today that seems like a quaint way to describe it: graphic design is now a multi-billion-dollar industry. Moviemaking uses CGI as the norm. In the music industry, everything from the creation of songs to selling them is done on computers. Personal computers have pervaded almost every facet of our lives and have transformed countless industries. Multiple new industries have sprung up as a direct result of personal computer technology, employing millions of people around the world. People can connect in ways we never could before personal computers. The world’s economies are morphing toward digital commerce. This industry has come a very long way in a very short time. The “small computer” has probably impacted us more profoundly than any other invention in any of our lifetimes.
We are all indebted to the men, women, and organizations who developed this phenomenon. I have mentioned only a few of them. “Computer pictures made on small computers...” Isn’t that still a nice-sounding phrase? Isn’t that still a magical idea in a world that can be as harsh as ours? This historic news piece by Barry Peterson is a time capsule of the personal computer industry before the lingo was even established. That lingo continues to grow, doesn’t it? So much of the speculation presented in Peterson’s segment came true in spades. In that light, it is most prophetic of all that Peterson ended his segment with the quote he chose. He credits a mysteriously unnamed “industry observer” as saying: “I have seen the future, and it computes.” That observer might as well have been from the future. The future did compute, Mr. Peterson. I also know an industry observer who thinks it still does.
The great and powerful WOZ!
Steve Wozniak, co-founder of Apple, explains what it was really like in the early days of the company.
August 1997 Macworld Expo in Boston
Steve Jobs returned to the helm of Apple and introduced Bill Gates to the stunned crowd.
The "Mother of All Demos" - Dec. 9th, 1968
Witness the birth of modern computing. Douglas Engelbart gave a 90-minute demonstration of his (and his team's) astonishing work. He introduced the world to the computer mouse, graphical user interfaces, hypertext and links, collaborative editing, word processing and formatting, file search and organization, and even video conferencing.
__________________________________________
