Kenneth D. Gonzaga, Grade 7 - TVE 5

ASSIGNMENT NO. 1

What is a Computer?

In its most basic form, a computer is any device that aids humans in performing various kinds of computations or calculations. In that respect the earliest computer was the abacus, used to perform basic arithmetic operations.

Every computer supports some form of input, processing, and output. This is less obvious on a primitive device such as the abacus, where input, output and processing are simply the acts of moving the pebbles into new positions, seeing the changed positions, and counting. Regardless, this is what computing is all about, in a nutshell: we input information; the computer processes it according to its basic logic or the program currently running, and outputs the results.

Modern computers do this electronically, which enables them to perform a vastly greater number of calculations in less time. Although we currently use computers to process images, sound, text and other non-numerical forms of data, all of it depends on nothing more than basic numerical calculations. Graphics, sound and so on are merely abstractions of the numbers being crunched within the machine; in digital computers these are the ones and zeros, representing electrical on and off states and endless combinations of them. In other words, every image, every sound, and every word has a corresponding binary code.
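
To make that concrete, here is a minimal Python sketch (an illustration added here, not part of the original source) showing that letters, and even colours in an image, are stored as plain numbers, and numbers as binary on/off patterns:

    # Letters and colours are stored as numbers; numbers are stored as bits.
    text = "Hi"
    numbers = [ord(ch) for ch in text]            # letters become numbers: [72, 105]
    bits = [format(n, "08b") for n in numbers]    # each number becomes an on/off pattern
    red_pixel = (255, 0, 0)                       # one red pixel in an image is just three numbers
    print(numbers)      # [72, 105]
    print(bits)         # ['01001000', '01101001']
    print(red_pixel)    # (255, 0, 0)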

While the abacus may technically have been the first computer, most people today associate the word “computer” with the electronic computers invented in the last century, which have evolved into the modern computers we know today.

First Generation Computers (1940s – 1950s)

The first electronic computers used vacuum tubes, and they were huge and complex. The first general-purpose electronic computer was the ENIAC (Electronic Numerical Integrator And Computer). It was digital, although it didn’t operate with binary code, and it was reprogrammable to solve a complete range of computing problems. It was programmed using plugboards and switches, supporting input from an IBM card reader and output to an IBM card punch.


It took up 167 square meters, weighed 27 tons, and consumed 150 kilowatts of power. It used thousands of vacuum tubes, crystal diodes, relays, resistors, and capacitors.

The first non-general-purpose computer was the ABC (Atanasoff-Berry Computer); other computers of this era included the German Z3, the ten British Colossus computers, LEO, the Harvard Mark I, and UNIVAC.

Second Generation Computers (1955 – 1960)

The second generation of computers came about thanks to the invention of the transistor, which then started replacing vacuum tubes in computer design. Transistor computers consumed far less power, produced far less heat, and were much smaller compared to the first generation, albeit still big by today’s standards.

The first transistor computer was created at the University of Manchester in 1953. The most popular transistor computer was the IBM 1401. IBM also created the first disk drive in 1956, the IBM 350 RAMAC.

Third Generation Computers (1960s)

IBM System/360

The invention of integrated circuits (ICs), also known as microchips, paved the way for computers as we know them today. Making circuits out of single pieces of silicon, which is a semiconductor, allowed them to be much smaller and more practical to produce. This also started the ongoing process of integrating an ever larger number of transistors onto a single microchip. During the sixties microchips started making their way into computers, but the process was gradual, and the second generation of computers still held on.

Minicomputers appeared first; the earliest of them were still based on non-microchip transistors, while later versions were hybrids based on both transistors and microchips, such as IBM’s System/360. They were much smaller and cheaper than the first- and second-generation computers, also known as mainframes. Minicomputers can be seen as a bridge between mainframes and microcomputers, which came later as the proliferation of microchips in computers grew.

Fourth Generation Computers (1971 – present)

The first microchip-based central processing units consisted of multiple microchips for different CPU components. The drive for ever greater integration and miniaturization led towards single-chip CPUs, where all of the necessary CPU components were put onto a single microchip called a microprocessor. The first single-chip CPU, or microprocessor, was the Intel 4004.

The advent of the microprocessor spawned the evolution of microcomputers, the kind that would eventually become the personal computers we are familiar with today.


First Generation of Microcomputers (1971 – 1976)

Altair 8800

The first microcomputers were a weird bunch. They often came in kits, and many were essentially just boxes with lights and switches, usable only by engineers and hobbyists who could understand binary code. Some, however, did come with a keyboard and/or a monitor, bearing somewhat more resemblance to modern computers.

It is arguable which of the early microcomputers should be called the first. The CTC Datapoint 2200 is one candidate, although it didn’t actually contain a microprocessor (being based on a multi-chip CPU design instead) and wasn’t meant to be a standalone computer, but merely a terminal for mainframes. The reason some might consider it the first microcomputer is that it could be used as a de facto standalone computer, it was small enough, and its multi-chip CPU architecture actually became the basis for the x86 architecture later used in the IBM PC and its descendants. Plus, it even came with a keyboard and a monitor, an exception in those days.

However, if we are looking for the first microcomputer that came with a proper microprocessor, was meant to be a standalone computer, and didn’t come as a kit, then it would be the Micral N, which used the Intel 8008 microprocessor.

Popular early microcomputers which did come in kits include the MOS Technology KIM-1, the Altair 8800, and the Apple I. The Altair 8800 in particular spawned a large following among hobbyists and is considered the spark that started the microcomputer revolution, as these hobbyists went on to found companies centered around personal computing, such as Microsoft and Apple.

Second Generation Microcomputers (1977 – present)

Commodore PET2001


As microcomputers continued to evolve they became easier to operate, making them accessible to a larger audience. They typically came with a keyboard and a monitor, or could be easily connected to a TV, and they supported visual representation of text and numbers on the screen.

In other words, lights and switches were replaced by screens and keyboards, and the need to understand binary code diminished as they increasingly came with programs that could be used by issuing more easily understandable commands. Famous early examples of such computers include the Commodore PET, the Apple II, and, in the 80s, the IBM PC.

The nature of the underlying electronic components didn’t change between these computers and the modern computers we know today, but what did change was the number of circuits that could be put onto a single microchip. Intel’s co-founder Gordon Moore predicted that the number of transistors on a single chip would double every two years, which became known as “Moore’s Law”, and this trend has roughly held for over 30 years thanks to advancing manufacturing processes and microprocessor designs.
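
As a rough illustration of what that doubling means (a hedged Python sketch; the starting figure of about 2,300 transistors for the Intel 4004 in 1971 is the commonly cited value, and real chips only follow the trend approximately):

    # Project Moore's Law: the transistor count doubles roughly every two years.
    transistors = 2300          # Intel 4004, 1971 (commonly cited figure)
    year = 1971
    while year < 2001:          # thirty years, i.e. fifteen doublings
        year += 2
        transistors *= 2
        print(year, transistors)
    # By 2001 the projection reaches 2300 * 2**15, roughly 75 million transistors.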

The consequence was a predictable exponential increase in the processing power that could be put into an ever smaller package, which had a direct effect on the possible form factors and applications of modern computers; most of the paradigm-shifting innovations in computing that followed were about exactly that.

Graphical User Interface (GUI)

Macintosh 128k

Possibly the most significant of those shifts was the invention of the graphical user interface, and of the mouse as a way of controlling it. Doug Engelbart and his team at the Stanford Research Institute developed the first mouse and a graphical user interface, demonstrated in 1968. They were still several years ahead of the personal computer revolution later sparked by the Altair 8800, so their idea didn’t take hold at the time.

Instead it was picked up and improved upon by researchers at the Xerox PARC research center, which in 1973 developed the Xerox Alto, the first computer with a mouse-driven GUI. It never became a commercial product, however, as Xerox management wasn’t ready to dive into the computer market and didn’t see the potential of what they had early enough.

It took Steve Jobs negotiating a stock deal with Xerox, in exchange for a tour of their research center, to finally bring the user-friendly graphical user interface, as well as the mouse, to the masses. Steve Jobs was shown what the Xerox PARC team had developed and directed Apple to improve upon it. In 1984 Apple introduced the Macintosh, the first mass-market computer with a graphical user interface and a mouse.

Microsoft later caught on and produced Windows, and the historic competition between the two companies began, resulting in improvements to the graphical user interface that continue to this day.

Meanwhile IBM was dominating the PC market with its IBM PC, and Microsoft was riding on its coattails by producing and selling the operating system for the IBM PC, known as “DOS” or “Disk Operating System”. The Macintosh, with its graphical user interface, was meant to dislodge IBM’s dominance, but Microsoft made this more difficult with its PC-compatible Windows operating system and its own GUI.


Portable Computers

PowerBook 150

As it turned out, the idea of a laptop-like portable computer existed even before it was possible to create one. It was conceived at Xerox PARC by Alan Kay, who called it the Dynabook and intended it for children. The first portable computer actually created was the Xerox NoteTaker, but only 10 were produced.

The first commercialized laptop was the Osborne 1 in 1981, with a small 5″ CRT monitor and a keyboard that sat inside the lid when closed. It ran CP/M (the operating system that DOS was closely modeled on). Later portable computers included the Bondwell 2, released in 1985, also running CP/M, which was among the first with a hinge-mounted LCD display. The Compaq Portable was the first IBM PC-compatible portable computer; it ran MS-DOS but was less portable than the Bondwell 2. Other examples of early portable computers included the Epson HX-20, the GRiD Compass, the Dulmont Magnum, the Kyotronic 85, the Commodore SX-64, the IBM PC Convertible, and the Toshiba T1100, T1000, and T1200.

The first portable computers which resemble modern laptops in features were Apple’s PowerBooks, which first introduced a built-in trackball, and later a trackpad and optional color LCD screens. IBM’s ThinkPad was largely inspired by the PowerBook’s design, and the evolution of the two led to laptops and notebook computers as we know them. PowerBooks were eventually replaced by the modern MacBook Pro.

Of course, much of the evolution of portable computers was enabled by the evolution of microprocessors, LCD displays, battery technology and so on. This evolution ultimately allowed computers even smaller and more portable than laptops, such as PDAs, tablets, and smartphones.

Abacus (3000 BC)

The first manual data-processing device, developed in China about 5,000 years ago; it is the earliest form of computer and was used to add and subtract numbers.

Pascaline (1642)


Invented by Blaise Pascal, it was called a numerical wheel calculator and could add and subtract larger numbers.

Blaise Pascal (1623-1662)

He invented the first commercial calculator, a hand-powered adding machine known as the Pascaline.

Leibniz Calculator (1694)

It could multiply, divide, add and subtract. It was a mechanical device made of copper and steel, and carrying was performed with a stepped wheel, a mechanism still in use today. Unlike Pascal, Leibniz (1646-1716) successfully introduced a calculator onto the market. It was designed in 1673 but took until 1694 to complete. Wheels were placed at right angles and could be displaced by a special stepping mechanism. The speed of calculation for multiplication or division was acceptable, but like the Pascaline, this calculator required the operator to understand how to turn the wheels and to know how to perform calculations with it.

Analytical Engine


The Analytical Engine was a proposed mechanical general-purpose computer designed by English mathematician Charles Babbage. It was first described in 1837 as the successor to Babbage's Difference Engine, a design for a mechanical computer. The Analytical Engine incorporated an arithmetic logic unit, control flow in the form of conditional branching and loops, and integrated memory, making it the first design for a general-purpose computer that could be described in modern terms as Turing-complete. Babbage was never able to complete construction of any of his machines due to conflicts with his chief engineer and inadequate funding. It was not until the 1940s that the first general-purpose computers were actually built.

Tabulating Machine (1889)

The tabulating machine was an electromechanical machine designed to assist in summarizing information and, later, accounting. Invented by Herman Hollerith, the machine was developed to help process data for the 1890 U.S. Census. It spawned a class of machines, known as unit record equipment, and the data processing industry. The term "Super Computing" was first used by the New York World newspaper in 1931 to refer to a large custom-built tabulator that IBM made for Columbia University.

Charles Babbage (1791-1871)

He is regarded as the father of the modern computer; he designed the Analytical Engine and the Difference Engine.


Augusta Ada Byron

She is regarded as the first computer programmer; she worked with Babbage, translated a short paper on his Analytical Engine, and added notes containing what is considered the first computer program.

Hardware

Computer hardware is the collection of physical elements that constitute a computer system. It refers to the physical parts or components of a computer, such as the monitor, keyboard, data storage, hard disk drive, mouse, and CPU, along with graphics cards, sound cards, memory, the motherboard and chips, all of which are physical objects that you can actually touch. In contrast, software is untouchable: it exists as ideas, applications, concepts, and symbols, but it has no substance. A combination of hardware and software forms a usable computing system.

Software

Computer software, or just software, is any set of machine-readable instructions (most often in the form of a computer program) that directs a computer's processor to perform specific operations. The term is used to contrast with computer hardware, the physical objects (processor and related devices) that carry out the instructions. Hardware and software require each other; neither has any value without the other.

Peopleware

Peopleware is a term used to refer to one of the three core aspects of computer technology, the other two being hardware and software. Peopleware can refer to anything that has to do with the role of people in the development or use of computer software and hardware systems, including such issues as developer productivity, teamwork, group dynamics, the psychology of programming, project management, organizational factors, human interface design, and human-machine interaction.

Hard copy

In information handling, a hard copy is a permanent reproduction, or copy, in the form of a physical object, of any media suitable for direct use by a person (in particular paper), of displayed or transmitted data. Examples of hard copy include teleprinter pages, continuous printed tapes, computer printouts, and radio photo prints.[1]

Magnetic tapes, diskettes, and non-printed punched paper tapes are not hard copies. Information sent via fax or email is not a hard copy.

Soft copy

A soft copy is the unprinted digital document file. This term is often contrasted with hard copy. It can usually be viewed through an appropriate editing program, such as word processing programs, database programs, or presentation software, depending on the file type. It can be transported from one computer to another through file transfer/downloading mechanisms such as FTP or HTTP, as an email attachment, or through USB drives and other disk drives. Keeping a digital copy of a document can allow easy editing of it later on. See hard copy for information about printed documents.


Using soft copies of work instead of traditional printed documents eliminates the need for paper and ink. Multiple copies of the same document can be kept in different versions, allowing the user to easily backtrack to an earlier version. Also, soft copies are more easily manipulated by users than hard copies, which can be both an advantage and a disadvantage. When soft copies are kept on storage devices, they conserve office space. Soft-copy documents are also more portable than hard copies, since they are not bulky.

INPUT DEVICES

The input unit is formed by the input devices attached to the computer, for example the keyboard, microphone, etc. An input unit takes the input and converts it into binary form so that it can be understood by the computer.

KEYBOARD

This is the most commonly used input device. Its structure is like that of a typewriter. It contains a number of keys, each of which has a specific ASCII value; for example, ‘A’ has the ASCII value 65. When this key is pressed, it is converted into 65, and this 65 is sent to the CPU in binary form (i.e. 1000001). Operations are then performed on this data.
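
The same idea in a short Python sketch (purely illustrative; it does not model how the keyboard hardware is actually wired):

    # A typed character becomes its ASCII value, which the CPU receives as a binary pattern.
    key = "A"
    ascii_value = ord(key)                      # 'A' -> 65
    binary_form = format(ascii_value, "07b")    # 65 -> '1000001', as mentioned above
    print(key, ascii_value, binary_form)        # A 65 1000001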

MOUSE

This is a pointing device which contains a roller in its base. When the mouse is moved over a surface, the pointer on the screen moves as well. A potentiometer coupled with the roller senses the motion of the mouse and converts it into a digital value. A mouse may have two or three buttons. Nowadays the optical mouse is very popular.
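
A hedged Python sketch of the idea (the sensor electronics are greatly simplified and the movement values are made up): the mouse reports small movement steps, and the computer adds them to the pointer's position while keeping it on the screen:

    # Turn relative mouse movements (dx, dy) into a pointer position on a 1920x1080 screen.
    SCREEN_W, SCREEN_H = 1920, 1080
    x, y = 960, 540                               # start the pointer at the centre of the screen
    movements = [(5, 0), (12, -3), (-200, 40)]    # made-up readings from the motion sensor
    for dx, dy in movements:
        x = min(max(x + dx, 0), SCREEN_W - 1)     # clamp so the pointer stays on the screen
        y = min(max(y + dy, 0), SCREEN_H - 1)
    print(x, y)                                   # final pointer position: 777 577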

BAR CODE READER

This is a device used to read the code printed on products, usually in the form of bars. It contains a light-sensitive detector which identifies the values of the bars on the product and converts them into a numeric code. Bar code readers are used on a very large scale in shopping malls.
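
As an illustration of the numeric-code side (a hedged Python sketch; the optics of reading the bars are not modeled), most retail barcodes are EAN-13 numbers whose last digit is a check digit the reader can verify:

    # Verify the check digit of an EAN-13 barcode number.
    def ean13_is_valid(code):
        digits = [int(c) for c in code]
        # Weight the first 12 digits alternately by 1 and 3, then compare with the 13th digit.
        total = sum(d * (3 if i % 2 else 1) for i, d in enumerate(digits[:12]))
        return (10 - total % 10) % 10 == digits[12]

    print(ean13_is_valid("4006381333931"))    # True for this commonly used example number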


DIGITAL CAMERA

As the name suggests, these cameras store data digitally; the pictures can then be transferred to a computer and kept for a long time. However, the cameras have very limited storage capacity. They are very popular because of their speed and the low cost of photographs.
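
The limited storage can be made concrete with simple arithmetic (a hedged Python sketch; real cameras compress their images, so actual files are much smaller than this uncompressed estimate):

    # Estimate how many uncompressed photos fit on a small memory card.
    width, height = 1600, 1200                      # a 2-megapixel photo
    bytes_per_pixel = 3                             # one byte each for red, green and blue
    photo_bytes = width * height * bytes_per_pixel  # 5,760,000 bytes, about 5.5 MB
    card_bytes = 64 * 1024 * 1024                   # a 64 MB card, typical of early digital cameras
    print(card_bytes // photo_bytes)                # about 11 photos fit on the card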

LIGHT PEN

It is a pointing device which contains a photocell mounted at its tip. It senses the light from the screen when it is brought close to the screen and generates a pulse, so it is very useful for identifying a particular location on the screen. However, it is not in much use these days.

SCANNER

The scanner is an input device, similar to a photocopier machine, which makes an electronic copy of a picture or document that can be further edited.

MICs or MICROPHONES

A microphone is an input device which captures sound and converts it into an electrical signal; the sound card then turns this signal into digital form so that the computer can store and process it.


OUTPUT DEVICES

The devices which are used to present data to the user, either in the form of hard copy or soft copy, are called output devices.

SPEAKERS

Speakers receive sound in the form of an electric current from the sound card and convert it into audible sound. They are used for listening to music, chatting, seminars, presentations, etc.

VDU (Visual Display Unit)

This is also called a monitor. It is used to get the data in the form of a soft copy. Its functioning is similar to that of a television: it contains a CRT which emits electrons to trace a regular pattern of horizontal lines on the screen.

PRINTER

These devices give a hard copy of the output. They come in different types. Impact printers have mechanical contact between the paper and the printing head; non-impact printers have no mechanical contact between the paper and the printing head.

LCD (Liquid Crystal Display)

These screens are used in laptops and notebook-sized PCs. A special type of liquid is sandwiched between two plates; the top plate is clear and the bottom plate is reflective. The molecules in this liquid are normally aligned, and the computer's signals are used to realign them to form the image.


STORAGE DEVICES OR MEMORY

The working place in a computer where all data is stored is called memory. Memory is made up of small cells called bits; in these cells data is stored in the form of 0s and 1s. Memory is measured in bytes. A memory cell may be defined as a device which can store a symbol selected from a set of symbols.
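
The units can be related with a quick calculation (a minimal Python sketch):

    # Relate bits, bytes and the larger memory units.
    bits_per_byte = 8
    one_kb = 1024                  # bytes in a kilobyte (binary convention)
    one_mb = 1024 * one_kb         # bytes in a megabyte
    word = "computer"              # 8 characters, one byte each in ASCII
    print(len(word) * bits_per_byte)        # 64 bits to store the word
    print(one_mb, one_mb * bits_per_byte)   # 1048576 bytes, 8388608 bits in 1 MB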

PRIMARY MEMORY

Primary memory consists of RAM (random access memory) and ROM (read only memory). In RAM, data is not stored permanently but only temporarily: after the computer is turned off, the data is erased. RAM is therefore considered short-term memory.

SECONDARY MEMORY

Secondary memory (or secondary storage) is the slowest and cheapest form of memory. It cannot be processed directly by the CPU; it must first be copied into primary storage (also known as RAM). Secondary memory devices include magnetic disks such as hard drives and floppy disks; optical discs such as CDs and CD-ROMs; and magnetic tapes, which were the first form of secondary memory.
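
A small Python sketch of that idea (the file name is hypothetical): data sitting on the disk is first read into a variable, that is, copied into RAM, before the processor can work on it:

    # Data on disk (secondary storage) must be read into RAM (primary storage) before processing.
    with open("grades.txt", "w") as f:      # write a small file to the disk first
        f.write("85\n90\n78\n")

    with open("grades.txt") as f:           # reading copies the bytes from disk into memory
        numbers = [int(line) for line in f]
    print(sum(numbers) / len(numbers))      # the CPU computes only on the in-memory copy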


PARTS OF A COMPUTER AND THEIR FUNCTIONS