Created | Updated Mar 8, 2014
It is very possible that you are using a computer to view this information you are reading. In fact, it'd be nigh-on impossible to view it without one in some way, shape or form. Whether it be your home PC, office network, laptop, PDA or even mobile phone1 it's more than likely you are readily familiar with this all too common, in fact some would argue essential, piece of modern technology - but you can never be too careful. Knowing your enemy is half the battle.
A Not So Brief History
Over time, computers have progressed from the amazingly simple abacus to semi-autonomous piles of electronics designed to remove the workload of their users, while retaining the unique ability of requiring exponentially as much work to keep them operational. Originally, a 'computer' was the name given to anyone who performed mathematical calculations. So, the brain in these humans was the 'computer'. Data (information) was inputted, basic instructions were followed and answers or further information was produced. The first computers were therefore pretty ordinary humans, who for want of a better name were accountants. They were soon aided in their calculations by more modern machinery (the abacus) but it wasn't until around 1833 when one Englishman, Charles Babbage, decided that a mechanical machine for the Industrial Age could perform the task of adding things up much more quickly than a lackey hunched over a desk with quill, ink and parchment. Babbage (and friend Ada Lovelace) came up with some design plans for a computing machine, but due to lack of funds and the necessary parts (unfortunately in the 1800s the microchip and floppy drive weren't about), his creation never took flight. That means, of course, that it didn't get built; he was designing a computer, not an aircraft. Some blokes called the Wright Brothers cottoned onto that little idea in the early 1900s.
During the first part of the 20th Century, however, machinery became even more complex, and war (as ever) aided in the progress of technologies. During the 1930s and 1940s the use of digital electronics found its way into the computing machines, and it is arguable as to what the 'first' ever digital computer was (and who invented it). Conceived in 1937, the Atanasoff-Berry Computer used valve-driven (vacuum tube) computation and binary numbers, and had regenerative memory. In 1941 Konrad Zuse's electro-mechanical 'Z machines' featured automatic binary arithmetic and the ability to be programmed to perform basic tasks. Then there was the British 'Colossus' computer2 of 1944 which had limited programmability, but demonstrated that a device using thousands of valves could be made reliable and reprogrammed electronically.
After the Second World War a decimal-based computer built by Presper Eckert and John Mauchly for the US Army, the ENIAC (Electronic Numerical Integrator Analyzer & Computer), had the prestigious title of being the first 'general purpose electronic computer'. In 1948, the Manchester 'Baby' (the Small-Scale Experimental Machine), built by Frederic Williams and Tom Kilburn, became the first computer to run a program stored electronically in its memory, and further advances in electronics meant that transistor-based computers soon replaced the valve design. Semiconductor-based electronics enabled machines to be smaller, quicker at working out problems due to the speed of electrical signals, inexpensive and much more reliable - thus allowing them to be mass-produced.
During the 1960s big and small business alike jumped at the chance to have all their accounting and filing needs managed by something other than themselves, so the computer became a common sight in the office. It was often a huge hulking mass secluded in the basement and maintained by a large team of engineers - or in most cases a janitor who dusted it every second Tuesday. As time went on, the huge computers as seen in early James Bond films gave way to smaller varieties due to the development of the silicon chip - a small electronic circuit about the size of a postage stamp that could do the work of a whole bucketload of transistors. So, come the early 1970s, the ability of computers to work faster and get information out quicker meant the business world flourished with digital aid. The office world soon went from being a place of paper-shuffling to one with the steady hunt-and-peck clicking of people typing on their 'Qwerty' keyboards.
The Home PC
In the late 1970s, the adoption of integrated circuit technology enabled computers to be produced at a low enough cost for it to be feasible that the machines be moved from the office environment into the average home! In 1981, IBM (International Business Machines) brought out the first true PC (Personal Computer) and in 1984 Apple made the Macintosh. These desktop computers (so-called because, unlike their predecessors, they could actually fit on a standard desktop instead of taking up much of your garage, attic and spare room) were rapidly cloned by many companies, such that by the late 1990s most western households did indeed have their own 'IBM compatible' computer system. At the turn of the 21st Century it was common for families to own more computers than cars, as possession of two or more of the most modern PCs3 was something of a status symbol.
Once a computer has crept silently into a home, it isn't long before members of the household are introduced to the Internet, sometimes through a modem. This quickly leads to the playing of games, or other activities, when 'surfing' the web - and a degree of anti-social behaviour. People found they could do their work from home, chat to other Internet users, pay their bills, do their shopping, choose their next holiday, ask for advice, watch their favourite television programmes and get their favourite music from other computers. In fact, with a computer and access to the Internet, there was no longer the need to leave the house. Unless of course it was on fire, or relatives were visiting5.
So, fancy going out and buying a computer because you don't have one?6 Don't forget by the time you've gone down the shop and brought it home (or had it delivered) it's probably out of date. Computers are continually being upgraded, redesigned and made increasingly smaller. So even if you do buy one, make sure you don't forget where you left it. Of course, if you feel particularly brave you could always build your very own! A bit of balsa wood, some glue and the inside of next-door's trashed microwave oven are not the ingredients for a good Personal Computer though.
Hard, Soft and Floppy Bits
Computers are made of lots of different pieces. Much like a sandwich. There's the outer layer (bread), then an inner layer (filling) and other added elements (sauces). Okay, they're not really like sandwiches at all. But computers have (in general):
- Hardware - A computer's hardware is its machinery. In other words, the parts that make it work like the processor, memory and power source that gives 'life' to the ever present flashing lights, wires and chips and all the other bits and bobs inside that make it whirr and clunk.
- Software - This is the information that tells a computer what to do. The most common variety of computer software is known as the O/S (Operating System); these programs can really only be accessed by interpreting information on the computer monitor, often through icons. Software can range from anything that will create a letter for you, to games that mimic the flight of 'TIE' Fighters from Star Wars - right down to the cool zooming noise they make when they whizz past.
- Peripherals - If you actually want to use a computer for something other than computing, you will more than likely need assorted peripherals. That's just it - peripherals are bits you add on, but don't really need for the computer to do its basic job. Included in the vast array of peripherals available are monitors, hard drive(s), floppy drive(s)7, keyboards, mice, scanners, webcams, printers, speakers, joysticks, Bluetooth, wireless connections, dongles, IT Support Workers - the list is, to some degree, almost endless.
Other Uses for a Computer
Other than make-shift coffee tables to stack your books on, computers have a wide and varied use. From aiding in the manufacture of various day-to-day things in the world, to creating some of the best films, the computer can be found inside the smallest child's toy through to the most destructive weapons on Earth. Some would argue that they are one and the same, but let's not splice atoms about it.
Computers aren't without flaws. Some are in really silly colours. Some talk to you in that tone of voice. Others you find you want to like, but there's just something about them. Others just refuse to work and give you a blank stare, usually from what has been termed a 'blue-screen'. Computers also have their fair share of bugs. No, not the creepy-crawly variety - 'bugs' are problems that most often only arise when the user presses the right combination of buttons and whispers the magic word backwards while rubbing their stomach. The end result of a 'bug' is often the desire to exact some sort of physical revenge upon the computer - such as throwing it out of a window or off the top of a tall building.
With the advent of the Internet, computers were then prone to something even more terrifying than junior sticking a banana in the floppy drive - the virus. Computer users were quickly introduced to email after the world wide web took hold and the information superhighway careered off across the globe. And then with email came 'spam', and the more sinister 'hackers' - people who wanted your information to be their information. They would invade your computer's personal space, violating it, but measures were soon developed to keep your computer safe from viruses and hackers, such as 'firewalls' and 'security software'. And then of course there was Y2K. But that was quickly forgotten about.
It's not always the computer's fault though. Don't be fooled by people who tell you that your enormous electricity bill, or the 263 garden gnomes delivered to your house, was down to a computer error. Computers cannot think for themselves (yet). They need information to be given to them, and need the instructions that they follow to be correct too. If there's any mistake in those, usually made by a human, the computer is simply following orders. Which in time of war is not a good excuse, so really the computer and the computer programmer should be taken out and beaten with a big stick. Best for all involved that way.
Some Famous Computers
Computers have been about for a while, but they were once only the stuff of science fiction. Some memorable fictional computers include 'Eddie' the shipboard Computer, 'Deep Thought' and of course 'Earth' from The Hitchhiker's Guide to the Galaxy, 'HAL' from the film 2001: A Space Odyssey, 'KITT' from the TV series Knight Rider and 'Holly' from cult sci-fi comedy Red Dwarf. In reality, computers don't find fame very easily, but one named 'Deep Blue' did, after defeating champion chess-player Garry Kasparov at his own game in 1997.
A glossary of computer terminology, so when you need one to be fixed you can nod sagely at the repairman and knowingly understand when he waffles on about RAM, USBs, PCBs and buses.
- CPU - Central Processing Unit. Just a fancy term for the main bit of the computer, where all the workings live.
- PCB - Printed Circuit Board. These are what some of the electronic components are put on. There are many in a computer, but the 'Motherboard' is the hub of it all. 'Daughterboards' are attached to the motherboard to add extra functions, such as sound or graphics.
- Ports - These are little holes in a computer where you can plug in various peripherals or other hardware. USB or Universal Serial Bus8 is one of the most common varieties of port9, but there are many others, such as SCSI, FireWire, AGP and Serial Ports - the ever confusing list is astounding!
- ROM - Read Only Memory. Basically the computer's long-term memory. It contains the programs that tell the computer how to work.
- RAM - Random Access Memory. The computer's short-term memory, it stores information and instructions while the computer is working.
- Bytes - Computer information or data is stored in 'bits', or binary digits. Eight bits make a 'byte', and computer processors have 16-bit, 32-bit or even 64-bit wide registers, so processors after the 8086 handle data in multiples of 8 bits. A computer's memory size is measured in bytes, with 1,024 bytes making a Kilobyte (KB), 1,048,576 bytes making a Megabyte (MB) and 1,024 Megabytes making a Gigabyte (GB). Bytes carry data around a computer, some telling it what to do (or what not to do).
- Buses - Buses are simply many parallel conduits (or metal 'tracks') for bytes, so maybe they should really be called trains. There are three main types of buses in a computer: the data bus, which carries information between the CPU and the memory; the control bus, which carries information from the CPU to other parts of the computer; and the address bus, which carries information about locations in the computer's memory.
- Drivers - A program that tells the computer how to work with different parts of its hardware or software.
- RSI - Repetitive Strain Injury. A quasi-medical term referring to the fact that if you stay at a computer too long, you're sure to strain something - your eyes, wrists, knees, spine or brain.
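If you fancy checking the byte sums in the glossary above for yourself, here is a minimal sketch in Python (the unit constants and the `describe` function are just illustrative names, not anything your computer comes with):

```python
# Binary memory units, as quoted in the glossary: 1 KB = 1,024 bytes.
KB = 1024          # Kilobyte: 1,024 bytes
MB = 1024 * KB     # Megabyte: 1,048,576 bytes
GB = 1024 * MB     # Gigabyte: 1,024 Megabytes

def describe(n_bytes):
    """Describe a size in bytes using the largest unit that fits."""
    for unit, name in ((GB, 'GB'), (MB, 'MB'), (KB, 'KB')):
        if n_bytes >= unit:
            return f"{n_bytes / unit:.1f} {name}"
    return f"{n_bytes} bytes"

print(MB)              # 1048576
print(describe(1536))  # 1.5 KB
```

Feed it the size of anything on your computer and it will nod sagely back at you in the appropriate unit.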
If you still need to know what to do with a computer, try taking up a computer course at a local college where you can meet computers, and people.