GENERAL COMPUTING

What is a computer?

Whenever you are asked to define or discuss something, there are key terms that capture what that thing is all about. A computer is no exception: the basic key words to consider when defining a computer are "input, process, storage, and output".

A computer, therefore, can be defined as an electronic machine that accepts data as input, processes it, stores it, and produces output as either soft copy or hard copy. The basic terms used in this definition are explained below, and a short program sketch after the list ties the whole cycle together.

a. Input: this is the process of sending data into the computer system, or the process by which the user communicates with the computer system.

b. Process: this is the act of changing raw facts, data, or unprocessed information into meaningful information.

c. Storage: this is the process of preserving or keeping processed data for future use.

d. Output: this refers to the result produced after processing; the result can be in hard copy or soft copy.
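
Here is a minimal sketch of the input-process-storage-output cycle in Python (the file name records.txt and the doubling step are arbitrary choices made for illustration, not part of any standard definition):

    # A minimal sketch of the input-process-storage-output cycle.
    def run_cycle():
        raw = input("Enter a number: ")      # INPUT: the user communicates data to the system
        result = int(raw) * 2                # PROCESS: raw data is changed into meaningful information
        with open("records.txt", "a") as f:  # STORAGE: the processed result is kept for future use
            f.write(str(result) + "\n")
        print("Result:", result)             # OUTPUT: the result is returned as soft copy on the screen

    run_cycle()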

HISTORY OF COMPUTER

Computer technology started as early as 1642 with the French philosopher and mathematician Blaise Pascal. His machine had a series of ten-toothed wheels, each tooth representing a digit from 0 to 9; numbers were entered by turning the wheels. The machine was used for addition.

The computer was improved upon by the German philosopher and mathematician Gottfried Wilhelm von Leibniz, who in the 1670s devised a machine that could also multiply. Computers underwent many further improvements until the 1880s, when the American statistician Herman Hollerith introduced the idea of using a perforated card, called a punched card and similar to Jacquard's loom cards, for processing data. Using this method, he was able to help the United States with its census in 1890.

The improvements continued into the 19th century, when the British mathematician and inventor Charles Babbage worked out the principles of the modern digital computer. He introduced ideas for a machine that could solve complicated mathematical problems. Historians believe that Babbage and his group were the main originators of the modern digital computer, although they did not have the equipment to build it.

Analog machines made the approximation of numerical equations possible. During the First and Second World Wars, such analog computers were used in submarines and in bombsight controllers in aircraft.

In the 1940s, Howard Aiken, a mathematician at Harvard University, created what is usually considered the first digital computer. The sequence of instructions to be used in solving a mathematical problem was fed into the machine on a roll of punched paper tape rather than being stored in the computer.

GENERATIONS OF COMPUTER

FIRST GENERATION OF COMPUTER (1944-1948): Computers of this generation were very slow, consumed a great deal of power, had a short life span, occupied a very large space, were tedious to work with, and used vacuum tubes for storage. They generated a lot of heat and were used in commercial and business environments. Examples of these computers are as follows:

ENIAC – Electronic Numerical Integrator and Computer.

EDVAC – Electronic Discrete Variable Automatic Computer.

UNIVAC – Universal Automatic Computer. Manufacturers of this generation included IBM (International Business Machines) in the USA.

SECOND GENERATION (1948-1959): In this generation, scientists led by William Shockley at Bell Laboratories invented a device called the transistor. Replacing vacuum tubes with transistors led to significant changes in computer systems: computers became less expensive, consumed less power, generated far less heat, and used magnetic disk storage devices, e.g. ATLAS, IBM 7070, and the IBM 7000 series. This was an improvement over the first generation of computers.

THIRD GENERATION (1957-1971): This generation marked the invention of the integrated circuit (IC), which consists of a series of interconnected transistors and capacitors attached to a single silicon chip. These computers were much smaller than those of the second generation, more efficient in power consumption, generated less heat, were cheaper, and used multi-user operating systems that introduced the time-sharing method of data processing, e.g. IBM 360, ICL 2903, CDC 1700.

FOURTH GENERATION (1971-1981): Here, computers reached a high level of technology with LSI (Large-Scale Integration) chips, which pack thousands of transistors or other components onto a single chip, and later VLSI (Very Large-Scale Integration) chips, which pack millions. With these two levels of integration, the computer's size was greatly reduced compared with the third generation, power consumption dropped, and machines became efficient and more reliable because the computer uses a microprocessor, e.g. Intel processors such as the Pentium.

FIFTH GENERATION (1982-DATE): Computers of this generation were developed, notably by the Japanese, to mark a high level of technological development. They have the capacity to reason and to handle complex decision-making processes automatically because of inbuilt artificial intelligence known as KIPS (Knowledge Information Processing System). They also use many central processing units that work together to produce results, and they are built with ULSI (Ultra Large-Scale Integration) chips.

DEVELOPMENT OF COMPUTER SCIENCE

Computer science as an independent discipline dates to only about 1960, although the electronic digital computer that is the object of its study was invented some two decades earlier. The roots of computer science lie primarily in the related fields of electrical engineering and mathematics. Electrical engineering provides the basics of circuit design—namely, the idea that electrical impulses input to a circuit can be combined to produce arbitrary outputs. The invention of the transistor and the miniaturization of circuits, along with the invention of electronic, magnetic, and optical media for the storage of information, resulted from advances in electrical engineering and physics. Mathematics is the source of one of the key concepts in the development of the computer—the idea that all information can be represented as sequences of zeros and ones. In the binary number system, numbers are represented by a sequence of the binary digits 0 and 1 in the same way that numbers in the familiar decimal system are represented using the digits 0 through 9. The relative ease with which two states (e.g., high and low voltage) can be realized in electrical and electronic devices led naturally to the binary digit, or bit, becoming the basic unit of data storage and transmission in a computer system.
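
As a small illustration of the binary idea, the sketch below (Python, chosen only for readability) converts a decimal number to its binary digits by repeated division by two:

    # Convert a decimal number to binary by repeated division by 2.
    def to_binary(n):
        if n == 0:
            return "0"
        bits = ""
        while n > 0:
            bits = str(n % 2) + bits   # the remainder is the next binary digit
            n //= 2
        return bits

    print(to_binary(13))    # "1101", since 1*8 + 1*4 + 0*2 + 1*1 = 13
    print(int("1101", 2))   # 13, reading the binary string back as a number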

The Boolean algebra developed in the 19th century supplied a formalism for designing a circuit with binary input values of 0s and 1s (false or true, respectively, in the terminology of logic) to yield any desired combination of 0s and 1s as output. Theoretical work on computability, which began in the 1930s, provided the needed extension to the design of whole machines; a milestone was the 1936 specification of the conceptual Turing machine (a theoretical device that manipulates an infinite string of 0s and 1s) by the British mathematician Alan Turing and his proof of the model's computational power. Another breakthrough was the concept of the stored-program computer, usually credited to the Hungarian-American mathematician John von Neumann. This idea—that instructions as well as data should be stored in the computer's memory for fast access and execution—was critical to the development of the modern computer. Previous thinking was limited to the calculator approach, in which instructions are entered one at a time.
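
A short sketch can make the Boolean idea concrete. Here Python's integers 0 and 1 stand in for low and high voltage, and the gates are combined into a half adder, a standard textbook circuit used here only as an illustration:

    # Model logic gates as functions on the binary values 0 and 1.
    def AND(a, b): return a & b
    def OR(a, b):  return a | b
    def NOT(a):    return 1 - a
    def XOR(a, b): return OR(AND(a, NOT(b)), AND(NOT(a), b))

    # A half adder wires gates together to add two one-bit numbers.
    def half_adder(a, b):
        return XOR(a, b), AND(a, b)   # (sum bit, carry bit)

    for a in (0, 1):
        for b in (0, 1):
            print(a, "+", b, "->", half_adder(a, b))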

The needs of users and their applications provided the main driving force in the early days of computer science, as they still do to a great extent today. The difficulty of writing programs in the machine language of 0s and 1s led first to the development of assembly language, which allows programmers to use mnemonics for instructions (e.g., ADD) and symbols for variables (e.g., X). Such programs are then translated by a program known as an assembler into the binary encoding used by the computer. Other pieces of system software known as linking loaders combine pieces of assembled code and load them into the machine's main memory unit, where they are then ready for execution. The concept of linking separate pieces of code was important, since it allowed “libraries” of programs to be built up to carry out common tasks—a first step toward the increasingly emphasized notion of software reuse. Assembly language was found to be sufficiently inconvenient that higher-level languages (closer to natural languages) were invented in the 1950s for easier, faster programming; along with them came the need for compilers, programs that translate high-level language programs into machine code. As programming languages became more powerful and abstract, building efficient compilers that create high-quality code in terms of execution speed and storage consumption became an interesting computer science problem in itself.
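
The translation step an assembler performs can be sketched in a few lines. The three mnemonics and the binary encodings below are invented for illustration; real instruction sets are far larger and machine-specific:

    # A toy assembler: translate mnemonics and symbols into (invented) binary encodings.
    OPCODES = {"LOAD": "0001", "ADD": "0010", "STORE": "0011"}   # hypothetical opcodes
    SYMBOLS = {"X": "00", "Y": "01"}                             # hypothetical variable addresses

    def assemble(line):
        mnemonic, operand = line.split()
        return OPCODES[mnemonic] + SYMBOLS[operand]

    for line in ["LOAD X", "ADD Y", "STORE X"]:
        print(line, "->", assemble(line))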

Increasing use of computers in the early 1960s provided the impetus for the development of operating systems, which consist of system-resident software that automatically handles input and output and the execution of jobs. The historical development of operating systems is summarized below under that topic. Throughout the history of computers, the machines have been utilized in two major applications: (1) computational support of scientific and engineering disciplines and (2) data processing for business needs. The demand for better computational techniques led to a resurgence of interest in numerical methods and their analysis, an area of mathematics that can be traced to the methods devised several centuries ago by physicists for the hand computations they made to validate their theories. Improved methods of computation had the obvious potential to revolutionize how business is conducted, and in pursuit of these business applications new information systems were developed in the 1950s that consisted of files of records stored on magnetic tape. The invention of magnetic-disk storage, which allows rapid access to an arbitrary record on the disk, led not only to more cleverly designed file systems but also, in the 1960s and '70s, to the concept of the database and the development of the sophisticated database management systems now commonly in use. Data structures, and the development of optimal algorithms for inserting, deleting, and locating data, have constituted major areas of theoretical computer science since its beginnings because of the heavy use of such structures by virtually all computer software—notably compilers, operating systems, and file systems. Another goal of computer science is the creation of machines capable of carrying out tasks that are typically thought of as requiring human intelligence. Artificial intelligence, as this goal is known, actually predates the first electronic computers in the 1940s, although the term was not coined until 1956.
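
The insert, delete, and locate operations mentioned above can be sketched with a sorted list of keyed records and binary search, a simple stand-in for the far more elaborate structures that real file systems and databases use:

    import bisect

    records = []   # (key, record) pairs kept sorted by key

    def insert(key, value):
        bisect.insort(records, (key, value))

    def locate(key):
        i = bisect.bisect_left(records, (key,))
        if i < len(records) and records[i][0] == key:
            return records[i][1]
        return None

    def delete(key):
        i = bisect.bisect_left(records, (key,))
        if i < len(records) and records[i][0] == key:
            records.pop(i)

    insert(42, "payroll record")
    insert(7, "inventory record")
    print(locate(42))   # payroll record
    delete(7)
    print(locate(7))    # None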

Computer graphics was introduced in the early 1950s with the display of data or crude images on paper plots and cathode-ray tube (CRT) screens. Expensive hardware and the limited availability of software kept the field from growing until the early 1980s, when the computer memory required for bit-map graphics became affordable. (A bit map is a binary representation in main memory of the rectangular array of points [pixels, or picture elements] on the screen. Because the first bit-map displays used one binary bit per pixel, they were capable of displaying only one of two colours, commonly black and green or black and amber. Later computers, with more memory, assigned more binary bits per pixel to obtain more colours.) Bit-map technology, together with high-resolution display screens and the development of graphics standards that make software less machine-dependent, has led to the explosive growth of the field. Software engineering arose as a distinct area of study in the late 1970s as part of an attempt to introduce discipline and structure into the software design and development process. For a thorough discussion of the development of computing, see computers, history of.
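
The one-bit-per-pixel idea can be sketched directly: each row of the bitmap below is a string of 0s and 1s, and the two display characters stand in for the two colours an early display could show (the pattern and characters are arbitrary):

    # A tiny one-bit-per-pixel bitmap: 1 lights a pixel, 0 leaves it dark.
    bitmap = [
        "01100110",
        "01100110",
        "00000000",
        "10000001",
        "01111110",
    ]
    for row in bitmap:
        print("".join("#" if bit == "1" else "." for bit in row))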

CLASSIFICATION OF COMPUTER

Computers can be classified according to three different aspects: type, size, and purpose.

BY TYPE

1. Analog computer: This is a hydraulic or electronic device built to handle input in the form of hydraulic pressure or voltage levels rather than numeric figures. In other words, this type of computer works in response to pressure or wave vibration: the greater the force or pressure applied, the greater the response it indicates. It is mostly used by scientists and engineers. It does not carry out digit-by-digit arithmetic, e.g. thermometers, meters, speedometers, etc.

2. Digital computer: This type of computer works with binary numbers, called codes, at very high speed, and it receives data in a form humans understand. It is built with the ability to determine whether a switch or gate is ON or OFF; a "gate" is a logic circuit that takes in two binary digits and produces a logical output. The computer can recognize only ON or OFF states in its microprocessor or microcircuits (smaller circuits).

3. Hybrid computer: This type of computer possesses the features of both digital and analog computers. It is used in hospitals, weather forecasting, space programs, etc.

BY SIZE

1. Mainframe computer: The mainframe computer is the biggest computer technology ever produced. Multiple users share the computer, connecting to it through its terminals. The most powerful mainframe computer is the supercomputer. Mainframes can perform complex work, though the early machines, which used vacuum tubes and big electronic components, were slow by modern standards. A mainframe of that era was as big as a house and very costly; mainframes are used mainly by big organizations.

2. Minicomputer: The minicomputer is also big, but smaller in size than the mainframe and not as costly. It can be used in networking and for transactions, and it can be connected to a mainframe computer. It uses transistors and capacitors rather than electronic tubes.

3. Microcomputer: This computer came about through developments in computer technology. It is the smallest computer technology; its CPU is a single small chip called an IC (Integrated Circuit), and its capacitors and resistors are all very small components, quite unlike vacuum tubes. The IBM Corporation was the first company to produce the personal computer (PC). The microcomputer is portable, easy to move, and not very costly.



BY PURPOSE

1. Special purpose computer: This is a computer designed to perform a specific function. It uses a fixed program that cannot be changed, and the program works only for that particular task. Examples can be found in aircraft, microwave ovens, cameras, automobiles, and calculators.

2. General purpose computer: A general purpose computer is constructed to perform many functions, such as graphics, communication, entertainment, and typing; it is not restricted to a particular job.

Sizes of General Purpose Computer

a. Desktop computer: The desktop microcomputer is the one that has its monitor sitting on top of the system unit.

b. Mini tower computer: The mini tower computer is a general purpose microcomputer whose system unit is at roughly the same height as the monitor and stands beside it.

c. Full tower computer: A full tower general purpose microcomputer is a computer system whose system unit is taller than the monitor and stands beside it.

d. Laptop computer: The laptop computer is smaller than all the computer sizes named above. It is portable and can be moved easily. The laptop runs on batteries that last for a period of time, and its power usage is low, unlike the others. Its screen also differs from those of desktop, mini tower, and full tower computers: it uses an LCD (Liquid Crystal Display) screen, while the others use cathode-ray tubes. The laptop uses a pointing stick, trackball, or touchpad as its mouse.


