As a computer science student, I use computers for nearly the whole day, often several different types. For example, I work from my desktop computer (with lovely dual monitors), my Chromebook, and sometimes even my phone. However, I recently learned about supercomputers while studying the history of computers, which piqued my curiosity. So I spent some time learning about what a computer is, as well as the different types of computers. And let me tell you, there are quite a few!
A computer is a device that takes in some form of input data, processes it, then produces logical output. Computers used to be mechanical machines, but in recent history they’ve transformed into electrical devices. The earliest computers were simply calculators designed to assist in scientific computation. Since then, computers have evolved to process data at incredible rates, even storing data and program instructions in their internal memory.
Within the last 60 years, computers have gone from taking up entire rooms and costing millions of dollars to being the size of a credit card and costing a mere $35. I’m referring to the first supercomputer, the CDC 6600, and the Raspberry Pi single-board computer, respectively. Not only is the Raspberry Pi nearly a million times less expensive and many times smaller, but it’s also more than ten times faster.
“What a computer is to me is, it’s the most remarkable tool that we’ve ever come up with. It’s the equivalent of a bicycle for our minds.” – Steve Jobs
If you’re unsure how computers work, they probably seem like magic to you. That’s how computers seemed to me before I lifted the veil and discovered their inner workings. There are four basic functions of computers that define how they work:

1. Input – taking in data from the outside world
2. Processing – performing operations on that data
3. Storage – keeping data and instructions in memory
4. Output – returning the results to the user
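These four functions, commonly listed as input, processing, storage, and output, can be sketched as a toy Python pipeline. Every name here is illustrative, not from any real system:

```python
# A toy model of the four basic functions of a computer:
# input -> processing -> storage -> output.
# All names in this sketch are illustrative.

def run_computer(raw_input: str) -> str:
    # 1. Input: accept data from the outside world
    data = raw_input

    # 2. Processing: transform the data (here, just uppercase it)
    processed = data.upper()

    # 3. Storage: keep the result in memory for later use
    memory = [processed]

    # 4. Output: return the result to the user
    return memory[-1]

print(run_computer("hello"))  # HELLO
```

A real computer cycles through these same four steps billions of times per second; the sketch just makes the order of operations visible.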
There used to be only a few different types of computers, but today there are at least 15 types of computers in the world. These include analog computers, digital computers, hybrid computers, PCs, tablets, mainframes, servers, supercomputers, minicomputers, quantum computers, smartphones, smartwatches, and more. Additionally, as computers continue to shrink, a growing number of appliances are coming online, a trend referred to as the Internet of Things (IoT).
| Computer Type | Description | Peak Popularity |
| --- | --- | --- |
| Analog Computers | Mechanical computing devices of varying functions | 1950s – 1960s |
| Digital Computers | Nearly all computers in operation today are digital | 1940s – Present |
| Hybrid Computers | A computer combining analog and digital qualities | 1960s – 1980s |
| Quantum Computers | Utilizes concepts from quantum physics | 1990s – Present |
| Mainframe Computers | Large, powerful computers for bulk data processing | 1960s – 1980s |
| Server Computers | Provide additional functionality for other computers | 1990s – Present |
| Supercomputers | The fastest digital computers on the planet | 1980s – Present |
| Minicomputers | Smaller than mainframes yet larger than PCs | 1960s – 1970s |
| Personal Computers | A computer intended for personal use | 1980s – Present |
| Desktop Computers | A PC that remains atop a desk while in use | 1980s – 2000s |
| Laptop Computers | A portable PC that the user can place on their lap | 1990s – Present |
| Smartphones | A mobile phone with the functionality of a computer | 2000s – Present |
| Tablet Computers | A mobile, flat, touchscreen computer | 2010s – Present |
| Wearable Computers | A computer that is worn on the body | 2010s – Present |
| IoT Devices | Devices and appliances with embedded computers | 2010s – Present |
Analog computers have been around for at least 2,000 years, dating back to the Antikythera Mechanism (pictured below). Their popularity peaked sometime around the 1950s. Even afterward, analog computers flew inside the Saturn V rocket and assisted in the Apollo Moon landings.
The inventions of the transistor, the integrated circuit, and the microprocessor led to much faster, smaller, and less expensive digital computers. Since then, analog computers haven’t disappeared entirely, but they’ve become far less popular, with very few still in use today.
Analog computers are a type of computer that uses continuously changing physical mechanisms and displays output data in an analog fashion. For example, an analog watch has many complex gears that turn precisely and continuously, displaying the time with rotating hands. A digital watch, on the other hand, has electronic components that compute the time and display it in a static, digital fashion.
Nearly every type of computer in the world today is classified as digital. This includes all of our personal computers and wearables, supercomputers and minicomputers, and IoT devices. Digital computers process information differently from analog computers. Rather than processing continuously changing data, digital computers process the simplest language in the world: binary.
Binary, a base-2 number system, is referred to as ‘machine language’ because it’s the language that computers understand. It contains only two digits: 0 and 1. With these two simple digits, a computer’s components can take on two states: ‘off’ and ‘on.’ The beauty of modern digital computers is that they can process enormous streams of binary input in a very short time.
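To see base-2 in action, Python can convert between decimal and binary directly. This is a small illustration of the number system itself, not tied to any particular machine:

```python
# Converting between decimal and binary, the base-2 'machine language'.
n = 42
binary = bin(n)          # '0b101010'
back = int("101010", 2)  # 42

# Manually: each binary digit is a power of two, switched 'on' (1) or 'off' (0).
# 101010 = 1*32 + 0*16 + 1*8 + 0*4 + 1*2 + 0*1 = 42
value = sum(int(bit) * 2**i for i, bit in enumerate(reversed("101010")))

print(binary, back, value)  # 0b101010 42 42
```

Every piece of data a digital computer handles, from text to video, ultimately reduces to strings of these 0s and 1s.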
Digital computers process digital data, most often in binary format. Technically, the abacus, invented more than 4,000 years ago, was the first digital computer. However, we typically think of digital computers as the modern electronic computing powerhouses of today. Digital computers typically have input devices such as keyboards and mice and output devices such as screens and speakers. The ‘brain’ of a digital computer is its CPU, or Central Processing Unit.
The history of hybrid computers dates back to the 1960s. In fact, the first hybrid computer, the HYCOMP 250, was created in 1961. Other hybrid computers came about in the 1960s, such as the HYDAC 2400 in 1963, but they never quite became mainstream devices. They were still being made even in the 1980s, when Marconi Space and Defense System Limited came out with their Starglow Hybrid Computer. Around that time, hybrid computers dwindled in popularity.
Hybrid computers combine aspects of both digital and analog computers. Essentially, you get the high speed of analog computers combined with the accuracy of digital computers. The digital components of a hybrid computer often act as controllers.
Quantum computers are a mysterious new type of computer, separate from both digital and analog computers. However, they do borrow principles from digital computers, taking the binary system and extending it to include qubits. The first quantum computer, a 2-qubit device, was created fairly recently, in 1998, by three leading quantum computer scientists. The field has since made tremendous progress.
Just two years later, in 2000, a functioning 4-qubit quantum computer was created by David Wineland and others at the U.S. National Institute of Standards and Technology (NIST). Only a week later, a 7-qubit quantum computer was completed by another group of researchers, who utilized trans-crotonic acid in the development of the device.
In the last 20 years, quantum computers have made… (dare I say it?) quantum leaps. Some of the leaders in the space today include IBM, Google, and the world’s first quantum computing company, D-Wave Systems Inc. In 2015, D-Wave broke the 1,000-qubit barrier when it developed a 1,000-qubit quantum annealing processor chip, opening up a world of possibilities.
Qubit is short for quantum bit. A classical bit is either a ‘0’ or a ‘1’ and is the basis of all digital computers. A qubit, however, can maintain a state of ‘0’, ‘1’, or both simultaneously. This seemingly magical phenomenon of two states occurring simultaneously stems from quantum theory and is commonly referred to as superposition.
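The standard way to describe superposition is with two amplitudes, one for each classical state; squaring an amplitude gives the probability of measuring that state. A few lines of plain Python illustrate the arithmetic. This is a toy illustration, not a real quantum simulator:

```python
import math

# A toy model of one qubit as a pair of amplitudes (alpha, beta).
# The probability of measuring 0 is |alpha|^2, of measuring 1 is |beta|^2,
# and the two probabilities must sum to 1.
alpha = 1 / math.sqrt(2)   # amplitude for state |0>
beta = 1 / math.sqrt(2)    # amplitude for state |1>

p_zero = abs(alpha) ** 2   # probability of measuring 0
p_one = abs(beta) ** 2     # probability of measuring 1

print(round(p_zero, 2), round(p_one, 2))  # 0.5 0.5
```

With equal amplitudes, as above, the qubit is in an even superposition: a measurement returns 0 or 1 with a 50/50 chance, which is the "both at once" behavior the paragraph describes.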
Quantum computers are a type of computer that utilizes concepts from quantum physics, such as superposition. Unlike digital computers, which rely on bits, quantum computers use qubits, or quantum bits. Because qubits can exist in superposed states, quantum computers are able to perform at unprecedented speeds and are expected to soon attain quantum supremacy, leaving digital computers in the dust.
Some of the first digital computers were large mainframe computers, dubbed “big iron” for their bulky origins. The first mainframe, the Harvard Mark I, dates back to the 1940s. It cost $200,000 to develop and was as large as a room, weighing five tons!
Mainframe computers took off in the 1960s and 1970s. However, demand began to shrink in the 1980s: in 1984, sales of personal computers surpassed those of mainframe computers, shortly after the release of the Apple II and IBM’s PC, the IBM Model 5150.
Although mainframe computers have dwindled in popularity, they’re still widely used today and will continue to be relevant in the future. To this day, roughly 70% of Fortune 500 businesses use mainframes in some regard. Additionally, innovations are still being made: the IBM z13 (shown below) was released in 2015, and the Rockhopper (shown next to the z13) arrived in late 2018.
Mainframe computers are also known simply as ‘mainframes’ or as ‘big iron,’ a name earned by their extreme size and power. The main function of mainframes is processing extremely large amounts of data very quickly. Although their popularity has dwindled in recent times, they’re still very useful today, especially in enterprise applications.
Servers have played a major role in computing ever since IBM launched the first list server, running on its VM machines, in 1981. There was also the first web server, created in 1991, which launched the World Wide Web. In more recent years, physical servers have waned, and virtual cloud servers have quickly become the market leader, hosting most of today’s web pages and applications.
Over the years, many types of server computers have come into existence. In fact, several types of servers are in use today in addition to list servers, web servers, and virtual cloud servers: mail servers, file servers, database servers, DNS servers, and proxy servers, to name a few.
Server computers are also known simply as ‘servers.’ In general, a server is a computer (or program) that provides additional functionality for other computers, referred to as ‘clients.’ Perhaps the most familiar are web servers, which deliver web pages to clients such as PCs (Personal Computers) over the internet.
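To make the client and server roles concrete, here is a minimal round trip using only Python’s standard library. The handler name, address, and message are arbitrary choices for this sketch, not part of any product mentioned in this article:

```python
# A minimal client-server round trip: the server answers GET requests,
# and the client fetches a page from it, just as a browser would.
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class HelloHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"hello from the server"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), HelloHandler)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# The 'client' asks the server for a resource:
with urlopen(f"http://127.0.0.1:{server.server_port}/") as resp:
    reply = resp.read().decode()

print(reply)  # hello from the server
server.shutdown()
```

The server’s only job is to wait for requests and provide a service; the client does nothing until it needs that service. That division of labor is the essence of every server type listed above.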
Supercomputers are actually some of the earliest digital computers, the first being the CDC 6600, built back in 1964. It was a highly sought-after computer for scientists who needed to run complex computations. Seymour Cray, who designed the CDC 6600, went on to build several other supercomputers into the 1970s, including the Cray-1, followed by the liquid-cooled Cray-2 in the 1980s.
Through the 1990s and early 2000s, supercomputers continued advancing until, in 2008, the IBM Roadrunner broke the petaFLOPS barrier. A petaFLOPS is a measure of computing speed: one thousand million million (10^15) floating-point operations per second. In other words, it’s fast. Yet, as unthinkably fast as the IBM Roadrunner was, it pales in comparison to the latest supercomputers, and it became obsolete just five years after it was made.
The Fugaku supercomputer, successor to the 2011 Fujitsu K computer (shown below), became operational in June 2020. The amazing thing about Fugaku is that it reaches speeds of up to 415 petaFLOPS. That’s more than three times faster than the next-fastest supercomputer, the IBM Summit, which runs at up to 122 petaFLOPS. It won’t be long now until supercomputers reach exaFLOPS (1,000 petaFLOPS) territory.
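As a quick sanity check on the numbers above (using the approximate peak figures quoted in this article), the arithmetic works out like this:

```python
# 1 petaFLOPS = 10**15 floating-point operations per second.
PETA = 10 ** 15

fugaku = 415 * PETA  # ~415 petaFLOPS, figure quoted above
summit = 122 * PETA  # ~122 petaFLOPS, figure quoted above

print(round(fugaku / summit, 1))   # 3.4 -> "more than three times faster"
print(1000 * PETA == 10 ** 18)     # one exaFLOPS is 10**18 FLOPS: True
```

So the "more than three times faster" claim checks out, and the exaFLOPS milestone sits roughly 2.4x beyond Fugaku's quoted peak.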
Supercomputers are the fastest digital computers on the planet, rivaled only by quantum computers. Many claim that supercomputers are similar to mainframes because of their size and structure, but mainframes don’t come close in terms of processing speed. Supercomputers are most often used for scientific work.
Minicomputers, also referred to as ‘minis,’ first appeared in the 1960s, the first being the DEC PDP-8 (shown below). By today’s standards, these computers were anything but mini. However, compared to the previous generation of computers in the 1950s, which used vacuum tubes and occupied entire rooms, they were indeed very small.
What made minicomputers possible was the invention of the transistor in 1947 and the integrated circuit in 1958. These new inventions replaced vacuum tubes, making computers smaller and cheaper. The DEC PDP-8 weighed 250 pounds and cost $20,000, which made it smaller and cheaper than most computers available at the time.
Through the 1960s and 1970s, computers continued to make consistent strides, as described by Moore’s Law. With the inventions of the personal computer and the laptop, demand for minicomputers quickly dwindled. The decline began in the 1980s and accelerated in the 1990s as newer computers utilized microprocessors, spelling the end for minicomputers.
A minicomputer, or mini, is a computer that’s smaller and less powerful than a mainframe yet larger and more powerful than a personal computer. According to a 1970 article in The New York Times, a minicomputer by definition must also cost less than $25,000. Unlike personal computers, which are very much general-purpose, minicomputers were often designed for a specific function.
Many claim that the first-ever Personal Computer, the Kenbak-1, was created in 1971 by John Blankenbaker. This first Personal Computer, or PC, cost a reasonable $750 and had a whopping 256 bytes of RAM. The concept caught on, and in 1977 the Apple II was released, becoming the first mass-produced personal computer.
Flash forward to today, and PCs have reached a whole new plateau. They come in several shapes and sizes: desktops, laptops, tablets, smartphones, and even wearable computers such as smartwatches. Personal Computers in all their variety have empowered people like never before. We’re all more connected than ever and have boundless opportunities.
Anything we want to learn is just a click away, including coding. Anyone can learn to code and launch their own product or website, just like this one. The Personal Computer, especially those on the market today, is the single most empowering invention in modern history.
A Personal Computer, more commonly known as a PC, is a computer intended for personal use. PCs are general-purpose and are highly capable devices of varying types. Desktop computers, laptops, tablets, smartphones, and smartwatches all classify as Personal Computers.
The first-ever desktop computer was the Programma 101 (shown below), invented by Pier Giorgio Perotto in 1964. However, it was nothing like the desktop computers of today. The Programma 101, also known as the P101, didn’t have a monitor for an output device, nor did it have a mouse for an input device.
For input, it had a small keyboard consisting of numbers, a few letters, and a few arithmetic operators; its output was printed onto a small roll of paper. One of the amazing things about this initial desktop computer is that it was capable of playing a simple mathematical dice game, which became the first game ever to run on a desktop.
Truth be told, the Programma 101 was more of a calculator than a computer by today’s standards. However, it was remarkable at the time, and it helped push computers forward into the modern era of computing. It can be likened to the grandfather of the Apple II, which is in turn the grandfather of modern desktop computers.
The lineage of desktop computers runs deeper than you might have expected. Desktops saw a massive boost in popularity during the 1980s as they became cheaper and more practical for the average person. However, since around the mid-2000s, the laptop has overtaken the desktop in popularity.
A desktop computer, also referred to simply as a desktop, is a type of personal computer intended to sit atop a desk. Modern desktops have a monitor, a keyboard, and a mouse as output and input devices. Desktops differ from laptops in that laptops are more mobile and compact and can sit atop the user’s lap.
Laptop Computers have been around since the early 1980s, but they really took off during the 1990s. In fact, one laptop in particular ended the 1990s with a lot of flash, style, and performance. Apple has paved the way for the best new personal computers since the 1980s, and the Apple iBook (shown below) is no exception.
The iBook dazzled with its looks and its groundbreaking wireless technology, Apple’s AirPort. It was the first laptop of its kind: suddenly, there was a compact, sleek laptop computer that could wirelessly surf the web and send emails.
Today, as advanced as laptops are, they’re decreasing in popularity as a newer, smaller computer has taken over. I’m referring, of course, to smartphones. However, even though laptops are no longer the most popular computer, they’re still the first choice for many who need more capability than today’s smartphones can offer.
A laptop computer, or simply laptop, is a portable personal computer that can rest on the user’s lap or on a desk. Laptops are more portable than desktops yet offer very similar performance, making them extremely popular with students and enterprises. Modern laptops are all Wi-Fi enabled, adding to their portability.
The first smartphone was the IBM Simon Personal Communicator, created in 1994. In the nineties, you were the coolest person alive if you had one of these bricks. Fast forward to 2007, however, and there was a brand-new hot product sweeping the market: the iPhone.
Smartphones had been able to access the internet since earlier in the 2000s, but the iPhone greatly improved the experience. Also, while other smartphones at the time had built-in apps, the iPhone had an app store. The truly groundbreaking thing about the iPhone’s relation to applications is that Apple opened its App Store up to third-party developers.
Suddenly, a whole new industry was created, along with a whole new class of developers: mobile application developers. Before long, we had social media apps and awesome games like Angry Birds (don’t even get me started on Flappy Bird!). With most people carrying a smartphone in their pocket, it has quickly become the most popular personal computer in the world and remains so.
A smartphone is a mobile phone with nearly all the same functionality as a desktop or laptop computer. Unlike older generations of mobile phones, smartphones also have large touchscreens that function as both input and output. Not only can smartphones connect to the internet wirelessly, but they can also access a wide range of applications that provide additional functionality.
Tablet computers are still pretty new relative to the other types of computers on this list. The prototype tablets were various PDAs (Personal Digital Assistants), such as Apple’s Newton MessagePad in 1993. However, Microsoft coined the phrase “tablet computer” when it released arguably the first true tablet in the year 2000: the Microsoft Tablet PC.
Only 10 years later, in 2010, the legendary Steve Jobs presented the Apple iPad (shown below) and once again stunned the crowd. Unlike other tablets, Apple’s iPad had access to the App Store and all of the applications within it. It was also simply an amazing new personal computer with the look, feel, and performance you would expect from any Apple product.
Just a year later, Apple launched the iPad 2 with even more features, including a front-facing camera for FaceTime video calls. It was also thinner and more powerful. Competitors have released fantastic tablets as well, such as Amazon’s Kindle Fire tablet, of which I personally own one. These two tablets sit at opposite ends of the pricing spectrum, yet both offer a lot of value.
A tablet computer, also known simply as a tablet, is a flat, mobile computer with a touchscreen display. Tablets are very similar to smartphones, though they’re typically larger and faster, and most lack the capability of making phone calls.
The newest and smallest PCs don’t reside in your pocket; they live on your wrist and even on your face. I’m referring to smartwatches and smartglasses. The two leaders in the space are, to no one’s surprise, Apple and Google: Google released its smartglasses, the Google Glass (shown below), on a limited basis in 2013, and Apple released the Apple Watch in 2015.
Wearable computers haven’t been around long, but they’re certainly here to stay, and they will continue to evolve. In fact, the next generation of personal computers may reside in us rather than on us: Elon Musk’s company Neuralink is developing computers that interface directly with the human brain. Prototypes are already functioning inside pigs, and human trials are right around the corner.
A wearable computer, also known as a wearable, is a type of computer that is worn on the body. Smartwatches such as the Apple Watch and smartglasses such as the Google Glass are two prime examples of wearable computers. Wearables are changing the way people interface with computers.
The Internet of Things (IoT) refers to the growing number of items with embedded computers and internet access. The term IoT was coined by Kevin Ashton in 1999, during the internet boom. The following year, LG announced its first smart fridge, with a large digital touchscreen display on the front.
IoT devices continued to grow in popularity at an extraordinary rate through the 2000s and in 2008, the number of “things” online surpassed the number of people on the internet. In 2009, Google began testing self-driving cars that recorded and relayed sensory data via the internet.
You probably remember the Nest thermostat (shown below), which took the internet by storm in 2011. What was once an ugly, albeit small, appliance was now shiny, cool, and cost-effective, saving money on heating bills. And the wave was just rolling in for the world of IoT; many tech companies rode it brilliantly.