The history of the humble computer
You can’t point to one person in history who is responsible for inventing the computer. The development of the computer, from its origins to its not-so-humble place in today’s society, involved many clever people and many technological advances.
The First Computer
Charles Babbage is usually credited with designing the first computer: a machine, conceived in the 1830s, that could process input data supplied on punch cards and deliver its output through a printer, a curve plotter and a bell. This was almost 100 years before the technology advanced to anything resembling the computers of today.
The principles of the modern computer were developed by a man named Alan Turing, who during the Second World War worked for the British government breaking the messages encrypted by the famous Enigma machine that the Germans used to keep their communications secret. Turing’s main contribution was the theoretical underpinning of computing, which was left to others to put into practice once the technology had advanced far enough.
The next major breakthrough happened when the technology moved from mechanical and electromechanical components to purely electronic circuits, which increased the speed and reduced the size of computers. Around the 1950s, transistors began to replace vacuum tubes, and computers started to resemble the machines we know today.
The next major development came when Jack Kilby demonstrated integrated circuit technology in the late 1950s. This further increased computing power and reduced size, but the game really changed when a competitor of Kilby’s built these circuits on silicon. It was these silicon chips that heralded a rapid increase in computing power and began bringing computers out of the labs and into the homes of ordinary people. It was around this time that technology developed to the point where we could travel into space and land on the Moon.
The next few decades saw computing power increase according to what became known as Moore’s Law, which is often stated as computing power doubling roughly every 18 months. By the early 1990s, computers were ubiquitous in modern homes and businesses, and some people thought technology was as advanced as it was ever going to get.
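To get a feel for what that doubling rate implies, here is a minimal sketch of the arithmetic, taking the “doubles every 18 months” formulation at face value (the function name and numbers are illustrative, not a real benchmark):

```python
def relative_power(months: float, doubling_period: float = 18.0) -> float:
    """Relative computing power after `months`, assuming one
    doubling every `doubling_period` months (Moore's Law as
    stated in the article)."""
    return 2 ** (months / doubling_period)

# After one doubling period, power has doubled.
# Over a single decade (120 months), 2 ** (120 / 18) is
# roughly a hundredfold increase.
one_period = relative_power(18)
one_decade = relative_power(120)
```

Compounding is what makes the growth feel so dramatic: a factor of two every year and a half turns into roughly two orders of magnitude per decade.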
But as we now know, the Internet was the real game changer, arguably the biggest technological advance since the printing press. What started as a small network for military and scientific computers to talk to each other became the globalised behemoth we see today.
The modern computer
Computers are everywhere: in your phone, in your car, in your household appliances. They are becoming more affordable all the time; visit any online computer store and you can see the incredible range of models on offer. Moore’s Law has largely held, and computers continue to develop at a massive clip. The big tech companies are working tirelessly to feed our society’s appetite for innovation, and ever more capable computers will soon find their way into all aspects of our lives.
The Internet of things
We are now entering a new age that technology experts call the Internet of Things, which refers to the idea that our gadgets are gaining internet connectivity. What this means is that our devices and appliances will soon start talking to each other: for instance, once you get into your car and head home, your smartwatch could tell the heating in your house to turn on so it’s toasty warm when you arrive.
Twenty years ago, nobody could have imagined what our lives would be like now, so there really is no telling what we’ll be doing in another twenty years. There are reasons to worry, but plenty of reasons to be excited too!