Ever since Charles Babbage first proposed his ‘Difference Engine’ calculating machine to the Royal Astronomical Society in 1822, engineers and scientists have sought to create faster and more powerful computing machines.
Our desire to process data more quickly, and to extract more ‘juice’ from our electronic devices generally, has kept us in a constant cycle of computing enhancement and upgrade.
As the race for home PC power reached what many people regard as its zenith during the Nineties, microprocessor manufacturers including Intel and AMD carried forward their research into ever-smaller transistor geometries on the silicon wafer in order to give us more power.
Building CPUs (central processing units) that would run at ever-faster speeds meant that engineers had to find a way of packing more power-hungry transistors onto each sliver of silicon. Eventually, heat and power limits made it impractical to keep pushing clock speeds higher, so home and business machines started using parallel processing techniques to perform multiple computations concurrently, an approach that had in fact been employed within mainframes and supercomputers for many years.
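The divide-the-work idea behind parallel processing can be sketched in a few lines of Python. This is a hedged illustration only: the chunking scheme is invented for demonstration, and threads are used purely to show the concurrent structure.

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(bounds):
    """One worker's share: sum the squares of integers in [start, stop)."""
    start, stop = bounds
    return sum(i * i for i in range(start, stop))

def parallel_sum_of_squares(n, workers=4):
    """Split the range [0, n) into chunks and sum the chunks concurrently.

    Threads illustrate the division of work; because of CPython's global
    interpreter lock, a real CPU-bound job would use processes, or, on a
    supercomputer, thousands of separate nodes.
    """
    step = max(1, n // workers)
    chunks = [(lo, min(lo + step, n)) for lo in range(0, n, step)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))
```

Each worker handles its own slice of the problem and the partial answers are combined at the end, which is exactly the pattern, writ very small, that clustered machines use across thousands of nodes.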
So while your home laptop probably has most of the computing power that the supercomputers of the Sixties struggled to pump out, modern supercomputers have been developed with astonishing processing speeds. These machines can execute algorithmic logic at rates almost impossible for us to comprehend. But scientists still argue that however fast the hardware and software get, a cavernous gap remains between these machines and the massively parallel way the human brain processes signals across its synapses.
So how do these machines work in the real world? The power behind the supercomputer is the CPU or, as it’s often just called, the processor. This unit comes with an on-board degree in mathematics called the arithmetic logic unit (ALU), which it uses to perform mathematical operations including addition, subtraction, multiplication and division. Really speedy modern processors of the type found in supercomputers can handle calculations on large floating-point numbers at incredible speed.
If you don’t remember floating-point numbers from school, put simply these are numbers stored as a fixed set of significant digits together with an exponent, so the decimal point can ‘float’ to wherever it is needed. That lets a single format represent both enormous and tiny values – and the details we can leave to the scientists. What we should know is that floating-point operations are crucial to supercomputers: the speed at which a machine can carry them out gives us the floating-point operations per second (FLOPS) measure by which supercomputers are ranked.
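To make both ideas concrete, here is a hedged Python sketch: `math.frexp` exposes the significand-and-exponent form in which a float is actually stored, and a crude timing loop estimates operations per second. This is nothing like the tuned LINPACK benchmark used to rank real supercomputers; it is only a toy.

```python
import math
import time

# A float is stored as significand x 2**exponent, so the point 'floats':
# 6.0 is held as 0.75 x 2**3.
assert math.frexp(6.0) == (0.75, 3)

def estimate_flops(n=1_000_000):
    """Very rough floating-point-operations-per-second estimate.

    Each loop iteration performs roughly two flops (a multiply and an
    add); interpreter overhead makes this a large underestimate of what
    the underlying hardware can really do.
    """
    x, acc = 1.0000001, 0.0
    start = time.perf_counter()
    for _ in range(n):
        acc += x * x  # one multiply + one add
    elapsed = time.perf_counter() - start
    return (2 * n) / elapsed
```

Pure Python typically reports only tens of megaflops this way; a petaflop-class machine is roughly ten million times faster, which is the scale of gap the rankings are measuring.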
So where has the development of supercomputers brought us in terms of day-to-day technology usage? While many super machines are used for research projects, other machines are being employed by so-called cloud computing providers to build ‘virtual’ computing machines shared by multiple tenants across a network. It may sound complicated, but it’s not really.
“There has always been a lot of jargon associated with supercomputing, computing clusters, computing grids and now we have differing definitions of cloud computing to contend with,” says Ken Hertzler, vice president of product management at Platform Computing. “But essentially, cloud computing is very simple. It is a set of large-scale ‘servers’, which exist to ‘host’ your computing needs on a pay-per-use basis. It’s a bit like throwing an extra log on the fire when it’s snowing outside, ie the extra power is there when you need it. This is virtual computing in its plainest sense; much of the hardware and software exists at the cloud data centre – all you really need is a laptop.”
If we plunge deeper into the supercomputing supernova we start to see where the real beasts reside: dedicated lumps of raw computing power that sit immobile on their indomitable data-driven haunches to carry out mankind’s most complex computational desires. One such beast that most of us will have heard about is the computing facility located on the French-Swiss border at CERN, which carries out the number crunching for the Large Hadron Collider. In fact, CERN’s life force is monitored by something in excess of 2,500 clustered servers running the Linux operating system.
Because the Large Hadron Collider generates such massive volumes of data, the scientists need custom-built supercomputing power if they are to successfully analyze the subatomic particles that get bashed around inside the Collider’s seemingly endless miles of internal structure.
But this is still science, so how do supercomputers work in our world, where we can see and feel their real effects and power? Computing giant IBM set out to demonstrate the real power of its Deep Blue supercomputer in May 1997, when it pitted the machine against the reigning World Chess Champion, Garry Kasparov, in a fascinating six-game match.
After beating every chess computer that IBM had developed for over a decade and a half, Kasparov was finally defeated by Deep Blue. Writing a number of years after suffering his defeat, Kasparov noted that Deep Blue was hardly what the supercomputing pioneers had initially dreamed of. “Instead of a computer that thought and played chess like a human, with human creativity and intuition, they got one that played like a machine, systematically evaluating 200 million possible moves on the chess board per second and winning with brute number-crunching force,” said Kasparov.
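Deep Blue’s actual search ran in custom chess hardware, but the ‘brute number-crunching’ Kasparov describes is essentially exhaustive game-tree search. A minimal, generic minimax sketch in Python follows; the `moves`, `apply_move` and `evaluate` callables are illustrative placeholders, not chess.

```python
def minimax(state, depth, maximizing, moves, apply_move, evaluate):
    """Exhaustively evaluate every line of play down to `depth`.

    `moves(state)` lists the legal moves, `apply_move(state, m)` returns
    the resulting position, and `evaluate(state)` scores a position from
    the maximizing player's point of view. The two sides alternate:
    one picks the highest-scoring reply, the other the lowest.
    """
    legal = moves(state)
    if depth == 0 or not legal:
        return evaluate(state)
    scores = (minimax(apply_move(state, m), depth - 1, not maximizing,
                      moves, apply_move, evaluate)
              for m in legal)
    return max(scores) if maximizing else min(scores)
```

Real engines prune most of this tree with alpha-beta search rather than visiting every node, but the systematic look-at-everything spirit is the same one Kasparov found so inhuman.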
This reality, of course, did not stop IBM using the win to great effect for publicity purposes and indeed the company has continued to label the research arm of its supercomputer business as “Deep Computing” to this day. “Combining these [Deep Computing] capabilities with advances in algorithms, analytic methods, modelling and simulation, visualization, data management and software infrastructures is enabling valuable scientific, engineering and business opportunities,” says IBM.
With all this high-end, big-scale computing discussion, you may think that you personally have no impact on the growth of supercomputers, right? Well, you’d be wrong. The average human is pushing the planet’s data consumption rates up exponentially, every day: every Google search you make, every email, video, picture or sound file you share. This is not just data; this is what the technologists call ‘unstructured data’.
It’s not a simple spreadsheet with ten numerical values in it that could easily be broken down and expressed in its simplest binary form. This is complex, rich data with meta tags and embedded extras, and when we start to exchange it on mobile devices on the go, the volume multiplies again. The era of the green screen is gone; the era of the supercomputer has only just begun.