
Part one: 500 BC to 1983

100 IDEAS THAT CHANGED THE WORLD

Smartphones, video doorbells and bendable screens stand on the shoulders of giants. In the first of a two-part feature exclusive to PC Pro subscribers, we celebrate the ideas that changed the world.


500 BC: The abacus
Without mathematics, there are no computers. So we could point to the work of hundreds of gifted mathematicians in ancient Egypt, Greece, China, Japan, the Roman Empire and many more – instead, we’re going to take a pebble-based shortcut and point to the abacus. This counting device is arguably the first computer, if we simplify the definition to mean a man-made machine that aids calculations, and was found in numerous cultures from 500 BC onwards.

2nd century AD: Binary numbers
The earliest known use of a binary numbering system dates back to the second century AD and a mathematician named Acharya Pingala. His Chandas Shastra used binary numbers to classify musical meters. Pingala formed a matrix to give a unique value to each meter, but wrote from left to right, instead of right to left, as binary is written today. He also started with one rather than zero.

Binary was first discussed in the West in 1666 by Gottfried Wilhelm Leibniz, the legendary German mathematician, who was looking for a way to represent all logical thought through a universal mathematical language. Binary numbers represented opposites for Leibniz, such as black versus white, or yes versus no. He introduced the idea in Dissertatio de arte combinatoria (Dissertation on the Art of Combinations). Leibniz believed that binary numbers represented creation. The number one portrayed God, and zero depicted the void.
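To see the idea in modern dress, here is a minimal sketch (ours, using today’s right-to-left, zero-based notation rather than Pingala’s scheme) of how a handful of numbers map to binary, with each digit acting as one of Leibniz’s yes/no opposites:

```python
# Minimal sketch (ours): modern binary notation, where each bit is a
# yes/no "opposite" and place values double from right to left.
for n in range(6):
    print(n, "->", format(n, "04b"))    # e.g. 5 -> 0101

# Reading 0101 from the right: 1*1 + 0*2 + 1*4 + 0*8 = 5
```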

1823: Babbage’s Difference Engine
In the early 19th century, the process of generating mathematical tables, such as logarithms, was handed to large teams of people performing calculations manually – a process that took a very long time. Because these people were employed solely to compute tables, they became known as “computers”, a term that remained a job description into the 1940s. To speed up this process, British mathematician and inventor Charles Babbage proposed that a machine, called the Difference Engine, be created to perform these tasks. This essentially led to the design of a mechanical “computer” which, instead of employing transistors, used a series of gears to calculate numbers using the mathematical method of differences.
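The method of differences that the Engine mechanised is simple enough to sketch in a few lines (our illustration, not Babbage’s notation). For a polynomial table such as the squares, the second differences are constant, so every new entry can be produced by addition alone – which is exactly what the Engine’s columns of gears did:

```python
# Minimal sketch (ours) of the method of differences: tabulating n^2
# using nothing but repeated addition, as the Engine's gear columns did.
values = [0, 1, 4]                                    # first three squares
d1 = [values[1] - values[0], values[2] - values[1]]   # first differences: 1, 3
d2 = d1[1] - d1[0]                                    # second difference is constant: 2

for _ in range(7):                                    # extend the table by addition alone
    d1.append(d1[-1] + d2)
    values.append(values[-1] + d1[-1])

print(values)   # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```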

Babbage later revised his plan in a design covering an incredible 1,000 square feet of paper, but despite the intricacy of planning, the government decided against building it.

While the Difference Engine was considered a breakthrough in the development of automatic computing devices, Babbage’s next idea – the Analytical Engine – was far more influential. This new device was more comparable to the computers of today. The idea was to use punched cards to control the calculations, with the Analytical Engine able to make decisions based on the results. Babbage then worked with a brilliant mathematician named Ada Lovelace, who created a program for the Engine and is now credited as being the first ever computer programmer.

1854: Boolean logic
It wasn’t until the 19th century that binary numbering was fully realised in a mathematical system by George Boole, a British mathematician. His groundbreaking paper of 1854, An Investigation of the Laws of Thought, on Which Are Founded the Mathematical Theories of Logic and Probabilities, introduced the idea of Boolean logic.

In 1940, American mathematician and electrical engineer Claude Shannon used Boolean logic to analyse and optimise relay-switching circuits in his Master’s thesis for the Massachusetts Institute of Technology, A Symbolic Analysis of Relay and Switching Circuits. This is widely viewed as one of the founding works of American computer science.
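Boole’s algebra survives unchanged as the AND, OR and NOT of every programming language, and Shannon’s insight was that relay circuits compute it: switches in series behave like AND, switches in parallel like OR. A minimal sketch of that correspondence (ours, not Shannon’s notation):

```python
# Minimal sketch (ours) of Boolean logic as switching circuits:
# two relays in series implement AND, two in parallel implement OR.
def series(a: bool, b: bool) -> bool:
    return a and b          # current flows only if both switches close

def parallel(a: bool, b: bool) -> bool:
    return a or b           # current flows if either switch closes

for a in (False, True):
    for b in (False, True):
        print(a, b, "series:", series(a, b), "parallel:", parallel(a, b))
```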

500 BC The abacus: see full entry.
200 AD Binary numbers: see full entry.
1725 First punch card: 300 years ago, a French textile worker created a clever labour-saving device – paper tape with punched holes to control a loom. It still needed an operator, though, so in 1805 Joseph Marie Jacquard invented a mechanism to automate the process.
1823 Babbage’s Difference Engine: see full entry.
1854 Boolean logic: see full entry.
1856 Undersea cabling: see full entry overleaf.
1873 Qwerty keyboard: to avoid jams in his “Type Writer”, newspaper editor Christopher Latham Sholes devises an odd arrangement of letters. Qwerty was born, with E Remington and Sons buying the rights to make the device in 1873.
1873 Electric motor: see full entry overleaf.
1889 Automatic calculator: Herman Hollerith files a patent for a system to automate the American census using punched cards – he would go on to form a company that would eventually become known as International Business Machines Corporation, or IBM.
1903 Printed circuit board: German inventor Albert Hanson describes what would become the multilayer printed circuit board – flat foil conductors laminated to an insulating board – without which modern computers couldn’t exist.

100 ideas timeline

Our thanks to Steve Cassidy, Jon Honeyball, Paul Ockenden and Dick Pountain for their ideas and suggestions



1858: Undersea cabling
Still essential to the way we communicate, undersea cabling dates back to the age of steam. Work on the first cross-pond cable began in 1856, but the first attempt at connecting the two ends in the middle saw them sink without trace. Whoops. Further attempts followed (and failed) but, in 1858, a cable successfully connected Newfoundland to the Irish coast. It lasted for nearly four weeks before being blown out by an operator applying too high a voltage.

Not an auspicious start, but the lessons were learned and within 20 years several thousand miles of undersea cable formed the backbone of the communications network. All intercontinental telegraphic communications data used this method, speeding up the transfer of news from weeks to seconds.

The telegraph brought changes that arguably surpassed those of the telephone or even the present internet revolution: it was the quantum leap in communication speed. Today’s cables are a light year away from those early attempts, but many of the basic principles remain the same. Dual pipes such as FLAG Telecom and Global TeleSystems’ FA-1 connect London, Paris and New York at speeds of 2.4Tbits/sec in each direction. This capacity can carry over 200 hours of digital video per second, 30 million clear voice channels, or over two trillion bits of IP or data traffic per second.
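As a rough sanity check of those figures (ours, assuming a standard uncompressed 64Kbits/sec voice channel):

```python
# Rough sanity check (ours) of the undersea cable capacity quoted above.
capacity_bps = 2.4e12             # 2.4Tbits/sec in each direction
voice_bps = 64e3                  # one uncompressed 64Kbits/sec voice channel
print(capacity_bps / voice_bps)   # ~37.5 million channels, same ballpark as the 30 million quoted
```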

1873: Electric motor
In 1873, the first commercially successful DC motor was demonstrated at an exhibition in Vienna by Zénobe Théophile Gramme, a Belgian electrical engineer. Even in the IT realm alone, we would be at a loss without the electric motor. Despite the increase in the use of solid-state technology, this mechanical device continues to drive the industry. Without the electric motor, we wouldn’t have the benefit of hard disk drives, DVD-ROM drives, floppy drives or any other storage unit that requires angular velocity. Plus, the electric motor has helped keep electronics cool, which has become even more necessary in recent years as processors need more cooling.

1918: Enigma
The German Enigma is surely the best-known cipher machine. Invented in 1918, it was developed as a commercial and military encipherment system. Enigma was an electro-mechanical device that used a stepping-wheel system to “scramble” a plain text message into cipher text via polyalphabetic substitution. The number of possible cipher alphabets is enormous, leading Germany’s military authorities to believe, wrongly as it turned out, in the absolute security of the system.
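The stepping-wheel principle is easier to see in code than in prose. Below is a deliberately toy sketch (ours – the real machine added multiple rotors, a reflector and a plugboard): a single substitution “rotor” that steps after every letter, so identical plaintext letters produce different cipher text each time:

```python
import string

# Toy, single-rotor sketch of polyalphabetic substitution (ours; heavily
# simplified from the real Enigma, which had several rotors, a reflector
# and a plugboard).
ALPHA = string.ascii_uppercase
ROTOR = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"   # the wiring usually quoted for rotor I

def encipher(text: str) -> str:
    out, offset = [], 0
    for ch in text:
        if ch in ALPHA:
            shifted = ALPHA[(ALPHA.index(ch) + offset) % 26]
            out.append(ROTOR[ALPHA.index(shifted)])
            offset += 1                # the rotor steps after every letter
        else:
            out.append(ch)
    return "".join(out)

print(encipher("AAAAA"))   # EKMFL: the same letter enciphers differently each time
```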

1937: Zuse Z1
The first known working binary digital computer was called the Z1 and built by Konrad Zuse. It had a mechanical memory system. A prototype with electromagnetic relays, called the Z2, was built a year later, with storage capacity for 16 words, plus card punch and reader I/O. It had 200 relays and operated with 16-bit integer arithmetic. The idea was to use this basic design in a bigger system like the Z1. The result was the Z3, which had a 64-word memory, floating point arithmetic, a 22-bit word length and 2,400 relays. Of these, 600 were for calculations and 1,800 for memory. Construction was interrupted in 1939 when Zuse was called up for military service, so the Z3 wasn’t completed until 1941. The Z3 was the first fully functioning, program-controlled electromechanical digital computer. The first three Z computers were destroyed during the war, but a fourth, the Z4, eventually ended up in Zurich. The Z4 was started in 1942 and was intended to have a storage capacity of 1,024 words.

1918 Enigma: see full entry.
1920 Fibre optics: the idea of guiding light by refraction dates back to the 1840s, but it took until the early 1920s for information (namely pictures) to be transmitted.
1922 Media streaming: George Owen Squier invents “Wired Radio”, precursor of Muzak – and all things streaming. While we admit that there are several steps from Muzak to Spotify to Netflix, the basic idea of a subscription-based media-streaming service is born.
1937 Zuse Z1: see full entry.
1937 Turing Machine: see full entry overleaf.
1943 Colossus: see full entry overleaf.
1945 ENIAC: see full entry overleaf.
1947 First transistor: a three-man team at Bell Labs created the first “point-contact transistor”, without which processors as we know them wouldn’t exist. They were awarded the Nobel Prize in 1956.
1948 Cybernetics: see full entry overleaf.
1949 Magnetic tape: EDVAC (Electronic Discrete Variable Automatic Computer) was one of the first electronic computers, but it’s celebrated here because it was the first to use magnetic tape.
1949 Viruses concept born: John von Neumann gives lectures on the Theory and Organization of Complicated Automata, widely seen as the birth of the concept of computer viruses.
1950 Digital modem: the US Defense Department begins work on the digital modem.
1951 Tablet: Isaac Asimov gives the world the first idea of a tablet with his “Calculator Pad” in his sci-fi classic, Foundation. It only took 60 years for the idea to truly catch on, courtesy of the Apple iPad.
1953 FORTRAN: John W Backus tells his IBM bosses he has a better way to submit code, then sets to work on this influential language.
1956 Hard drive: the 305 RAMAC (Random Access Method of Accounting and Control) is launched, marking the first hard drive.



1937: Turing Machine
In 1937, while a graduate student, Alan Turing wrote his amusingly entitled On Computable Numbers, with an Application to the Entscheidungsproblem. The premise of Turing’s paper was that some classes of mathematical problems don’t lend themselves to algorithmic representation and so aren’t easily solved by machines. Since Turing didn’t have access to a real computer, because they didn’t exist at the time, he invented his own as an abstract “paper exercise”: a tape divided into squares, each containing a zero or a one, read and written by a machine following simple rules. This theoretical model became known as a “Turing machine”, and is one of the first descriptions of a program operating in a binary computing environment.
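The abstraction is easy to simulate. Here is a minimal sketch (ours, not Turing’s original formalism) of a tape, a read/write head and a table of (state, symbol) rules – a tiny machine that inverts a string of bits and then halts:

```python
from collections import defaultdict

# Minimal Turing machine sketch (ours): rules map (state, symbol) to
# (symbol to write, head movement, next state).
RULES = {
    ("flip", "0"): ("1", +1, "flip"),
    ("flip", "1"): ("0", +1, "flip"),
    ("flip", " "): (" ",  0, "halt"),   # blank square: stop
}

def run(tape_str: str) -> str:
    tape = defaultdict(lambda: " ", enumerate(tape_str))   # tape of squares
    head, state = 0, "flip"
    while state != "halt":
        write, move, state = RULES[(state, tape[head])]
        tape[head] = write
        head += move
    return "".join(tape[i] for i in sorted(tape)).strip()

print(run("010011"))   # -> 101100
```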

1943: Colossus
Alan Turing’s codebreaking work at Bletchley Park paved the way for Colossus – designed by Post Office engineer Tommy Flowers – one of the world’s earliest working programmable electronic digital computers. Colossus was used during the Second World War to break the code created by the German Geheimfernschreiber (secret teleprinter), which was far stronger than Enigma. The Colossus Mark 1 consisted of 1,500 vacuum tubes, and was soon superseded by (and upgraded to) the Mark 2. Colossus read data at 5,000 characters per second and could perform up to 100 Boolean operations simultaneously through each of its five tape channels across a five-character matrix – in 200 microseconds. Although it’s hard to equate this with today’s calculating power, Colossus’ extreme specialisation makes it fast at breaking codes even compared with today’s computers.

1945: ENIAC
Considered by some to be the first electronic digital computer, the massive American Electronic Numerical Integrator And Computer (ENIAC), which had 18,000 tubes, was predated by both Colossus and Konrad Zuse’s first four Z systems.

1948: Cybernetics
Cybernetics is the fancy name for systems theory, which studies the way feedback loops work. It was invented during the 1940s at the Massachusetts Institute of Technology and paved the way for both automation and computing. A multidisciplinary team including Norbert Wiener (a mathematician), Warren McCulloch (a neurophysiologist) and Jay Forrester (an electronics engineer) modelled theories of how living organisms worked on self-regulating mechanical processes, and vice versa. Wiener’s Cybernetics, or Control and Communication in the Animal and the Machine, and The Mathematical Theory of Communication by Claude Shannon and Warren Weaver, both published in 1948, marked the beginning of a new epoch. The latter founded information theory.

1959: The integrated circuit
The invention of the integrated circuit (IC) was central to the industry as we know it. In the late 1950s, electrical engineers were confounded by a problem they called the “tyranny of numbers”. This grandiose title referred to the problem of manufacturing the constituent electrical parts to make increasingly complex circuits from discrete components. The problem was that, as circuit designs improved, the number of components required grew exponentially, far in excess of the number that could actually be physically assembled together.

The solution to this came from two men: Jack Kilby, who was working at Texas Instruments, and Robert Noyce at Fairchild Semiconductor. Their answer was to fabricate complete networks of components onto a single crystal of semiconductor material. This breakthrough, which was called the “monolithic integrated circuit”, enabled devices to be made much smaller, more complex and faster, and is credited as being the discovery that kicked off the computer revolution of the late 20th century. In fact, Noyce went on to be one of the key instigators of this revolution as co-founder of chip giant Intel and gained the nickname “the Mayor of Silicon Valley”.

1958 DARPA created: in response to the Soviets launching Sputnik, President Dwight Eisenhower creates the influential Advanced Research Projects Agency – “Defense” was added later.
1959 The integrated circuit: see full entry.
1960 Hypertext: see full entry overleaf.
1960 Open source: see full entry overleaf.
1963 First mouse: Douglas Engelbart, sadly no longer with us, jotted down some thoughts about a “bug” in his notebook in November 1963. Its key concepts? A “drop point and two orthogonal wheels”. It took a further five years before he demonstrated the mouse in public.
1965 Packet switching: the foundation of transmitting data over digital networks, packet switching gained traction in 1965 thanks to British computer scientist Donald Davies.
1965 Moore’s law: see full entry overleaf.
1965 Touchscreen: the Royal Radar Establishment’s Eric Johnson describes “a novel input/output device for computer systems” that is “sensitive to the touch of a finger”.
1969 Unix born: Bell Labs’ engineers – including Dennis Ritchie of C fame – started development of a single-tasking platform originally called “Uniplexed Information and Computing Service”, pronounced “Eunuchs” for short. How did that become Unix? That’s one of those annoying mysteries lost to the mists of time...
1971 Microprocessor: the first commercial microprocessor, the 4004, designed by Intel’s Marcian Hoff. It contained 2,300 transistors and took advantage of silicon gate technology, a replacement for the previously used aluminium gate.

1972 Pong: see full entry overleaf.
1972 First GUI OS: Xerox starts work on the first GUI-based system, known as the Alto and later the Xerox Star.
1972 C: once again, Bell Labs’ Dennis Ritchie is at the centre of a big development. Literally, this time, with his general-purpose programming language C still thriving today.


1960: Hypertext
Conventional wisdom has it that the internet started off as an American defence and educational network, then Tim Berners-Lee came along and invented the World Wide Web – and the rest is history. But there’s another, far more obscure figure who can be credited with originating the idea of hypertext long before it was ever applied to the internet. As far back as 1960, Ted Nelson was developing an idea for an interconnected network of documents with embedded links to each other. In 1963, he chose the word “hypertext” to describe the system, publishing his theories in 1965’s groundbreaking paper, A File Structure for the Complex, the Changing and the Indeterminate, for the Association for Computing Machinery’s national conference. By 1967, he had chosen the name Xanadu for his hypertext project, from Coleridge’s poem about Kublai Khan. The ideas directly inspired Tim Berners-Lee’s World Wide Web, Ray Ozzie’s Lotus Notes and Bill Atkinson’s HyperCard (the first multimedia system). By inventing the idea of hypertext, Ted Nelson was the underpraised architect of some of the most influential new information structures in the history of computing. See xanadu.com for more information.

1960s and 1970s: Open source
Throughout the 1960s and 1970s – a golden age of computing – open-source software, largely funded by the US government, was the wellspring of creation for the programming industries. Through a combination of key funding agencies, administrative oversight of software standards and government purchasing rules, the US helped stimulate open-source software and open standards for decades. Until Microsoft’s MS-DOS and Windows, almost the entire software market was based on sharing and open source.

Bill Gates has never liked the concept and in 1976, at the age of 20, asked in an angry open letter to computer hobbyists: “One thing you do is prevent good software from being written. Who can afford to do professional work for nothing?”

Open source then faded into the background – until its re-emergence with Linux at the forefront.

1965: Moore’s law
In April 1965, Dr Gordon Moore published a paper entitled Cramming more components onto integrated circuits. In this paper, written when an integrated circuit contained fewer than 100 transistors, Moore made the now-famous prediction that ten years later there would be 65,000 components on a single silicon chip, equating to a doubling of transistor density every year. Remarkably, this prediction held true and the law, eventually revised to the more popular “processor clock speeds double every 18 months”, was one of the factors driving development in the industry for decades. Some people saw it almost as a self-fulfilling prophecy: Moore’s law set the timetable for where manufacturers had to be if they were to compete. It was only in recent years, when we reached manufacturing processes of less than 50nm, that the law eventually foundered.
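Moore’s 1965 extrapolation is just repeated doubling. A rough back-of-the-envelope sketch (ours – the 64-component starting point is an assumption, consistent with the “fewer than 100 transistors” above):

```python
# Rough back-of-the-envelope sketch (ours) of Moore's 1965 extrapolation:
# yearly doubling from an assumed ~64 components in 1965.
components = 64
for year in range(1965, 1976):
    print(year, components)
    components *= 2            # density doubles every year
# By 1975 this reaches 65,536 -- roughly the 65,000 components Moore predicted.
```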

1972: Pong
Pong is widely regarded as the first commercially available computer game. While that’s not strictly true, it was the game that launched a thousand misspent youths.

The game – two-dimensional tennis – is generally credited to Atari, which first marketed the low-spec two-player game in 1972, but an even earlier version of the game was programmed by William Higinbotham. The designer created an interactive tennis game on an oscilloscope back in 1958, 14 years before Atari came into being.

In 1971, Nolan Bushnell and Ted Dabney invented Computer Space, a basic shoot ’em up, and it flopped. Undaunted, the pair – along with Al Alcorn – formed a company called Syzygy, and designed Pong. They renamed the company Atari and put Pong on sale in November 1972 – and the rest is gaming legend. With 100,000 units, Pong turned out to be the biggest-selling video game of the 20th century, only rivalled by the next milestone: Pac-Man.


1973 Ethernet: see full entry overleaf.
1973 GPS: see full entry overleaf.
1973 VoIP: the first example of VoIP – the Network Voice Protocol – transported human speech via packets over a computer network.
1973 Email: proprietary mail systems had existed since the 1960s, allowing early computer users to message one another, but it took ARPANET’s RFC 561 to standardise the concept. Thanks so much, guys.
1975 Microsoft born: Bill Gates and Paul Allen write the first BASIC implementation for the MITS Altair, setting up a little company called Microsoft to trade under.
1975 Smart home: Pico Electronics, of that technology hotbed Glenrothes, produced the X10 protocol to remotely control home appliances. And so the smart home concept was born.
1976 Apple Computer 1: legend Steve Wozniak hand-built the original Apple 1, with 200 eventually going on sale. This was the product that gave birth to Apple.
1976 5.25in floppy: 5.25in floppy drives introduced by Shugart Associates.
1978 Laser printer: Xerox introduces the 9700, the first commercially available laser printer.
1980 3.5in floppy: Sony releases the first 3.5in floppy, with Apple debuting a drive in its 1984 Macs. Sony stopped making floppies in 2010.
1980 Spreadsheet: SuperCalc, the first spreadsheet for the popular micro OS CP/M, is launched.
1980 Sinclair computers: see full entry overleaf.
1981 IBM PC: having spent a few years watching Apple, Commodore and Tandy make money via personal computers, the world’s biggest mainframe maker decided to have a go. It cost $1,565 and included 16KB of RAM and a colour graphics adapter. Rumour has it that it sold 40,000 units on the first day.


1973: GPS
Where would we be today without GPS? We’re not just thinking of satnav, but all of the applications that currently depend on knowing our location, and all the future applications that will. Take geofencing, a key feature of everything from security products to drones. Or augmented reality, cartography, disaster emergency services, fleet tracking – the list is virtually endless. And whatever you think about driverless cars, there’s no way such technology could work without GPS.

As is so often the case in computing, GPS was born to meet a military requirement, with the US Department of Defense launching the GPS project back in 1973. The idea dates back to 1957, though, when two American physicists realised they could track the exact location of Sputnik via its radio transmissions. This led to the creation of Transit, a navigation system used by the US Navy, which was first successfully tested in 1960 and used five satellites to provide a fix.

1973: Ethernet
The first experimental Ethernet system, the Alto Aloha Network, was developed by Dr Robert M Metcalfe at Xerox PARC. It was designed to interconnect a selection of Xerox Altos, personal workstations with a graphical user interface. It also linked them to servers and laser printers. The data transmission rate was 2.94Mbits/sec. Metcalfe then wrote his dissertation on a reworked model of AlohaNet, which he called EtherNet. In 1973, this became Ethernet. He’d changed the name from AlohaNet to indicate that the system was capable of supporting any computer, not just the Altos, and had evolved beyond the original Aloha system.

He chose to base the name on the word “ether”, the mysterious fifth element that Greek philosophers thought filled the heavens. Later, in the 19th century, “ether” was used as a name for the medium that was thought to carry electromagnetic and gravitational waves. In a similar way, the Ethernet medium (for example, a cable) carries bits to all stations. Ethernet was patented (US patent 4,063,220) on 13 December 1977 by Metcalfe, who went on to join 3Com in June 1979.

1980: Sinclair computers
Producing everything from electric cars to home computers, Clive Sinclair was the 1980s embodiment of British eccentric genius. While America’s kids whiled away their free time on Atari and Commodore boxes, a whole generation of future programmers instead proudly turned to England’s alternative – the Sinclair.

It all kicked off with the ZX80, designed for hobbyists in kit form, or ready-built for just £99 – a groundbreaking price for the time. This bought you a paltry 1KB of RAM, a 3.25MHz NEC processor and a 32 x 24 character mono display – that’s right, not even any graphics.

This was followed up a year later with the similar ZX81, which was expandable to up to 64KB of RAM, but the best was yet to come. 1982 saw the arrival of the UK’s most influential computer, the ZX Spectrum. It had 48KB of RAM, a sleek black shell with grey rubber keys and up to eight glorious colours at a resolution of 256 x 192. A whole generation of home gamers and programmers was born, which is possibly the reason we have so many programmers in the UK. It didn’t have the Commodore 64’s more advanced graphics, hardware sprites or complex sound, but we still loved it all the same.

1983: LCD screens and laptops
The development of the LCD can be traced back to 1888, when Austrian botanist Friedrich Reinitzer discovered liquid crystals, but it wasn’t until 1968 that scientists at RCA developed the first display using the technology. Billions of pounds of investment over the next 30 years aided advances in brightness, contrast, colour, viewing angle, response time and cost. However, after years of use in watches and calculators, one implementation in particular captured the minds of computing enthusiasts.

In 1983, Tandy launched the TRS-80 (which was unflatteringly nicknamed the “Trash-80”) Model 100, a portable computer that featured an eight-row by 40-column reflective LCD screen. It’s now regarded by many as being the first mainstream laptop.

1981 MS-DOS: hired by IBM to develop an OS for its personal computers, Microsoft cannily bought the pre-made 86-DOS from one of its newest recruits – Tim Paterson, who had developed the software for his previous company. MS-DOS became the OS of choice as PC-compatible designs grew more popular.
1981 WIMP: the Xerox 8010 (“Star”) System, the first to use a WIMP (“windows, icons, menus, pointer”) graphical user interface, is launched.
1981 BBC Micro: the BBC Micro went on sale in December 1981 and helped inspire a generation of British programmers.

1982 First virus: a virus, Elk Cloner, appears in the wild for the first time. Spread by Apple II floppies, the virus was made by a ninth grader.
1982 Commodore 64: the Commodore 64 was king of home computers, at least in the US market, in the mid-1980s. Comparatively cheap, its popularity spread due to the 10,000 software titles available – including many games.
1982 TCP/IP: the Transmission Control Protocol/Internet Protocol is established.
1983 Apple Lisa: hot on the heels of the Xerox 8010 and its graphical user interface, Apple produced the bug-ridden but hugely influential Lisa. Ahead of its time in many ways, it originally sold for $9,995 – around $25,000 in today’s money.
1983 Lotus 1-2-3: not the first spreadsheet software, but Lotus 1-2-3 brought the spreadsheet to the masses and is described as one of the IBM PC’s “killer apps”.
1983 LCD and laptops: see full entry.

Coming soon

Part 2: 1984 to today
This feature is inspired by a PC Pro article that dates all the way back to 2001, entitled “100 ideas that changed IT”. Look out for part two, from 1984 to the present day, which will be made available as an exclusive download for PC Pro subscribers.