What is a Computer?


DEFINITION.

A computer is a device that accepts information (in the form of digitized data) and manipulates it for some result based on a program, a sequence of instructions on how the data is to be processed. Complex computers also include the means for storing data (including the program, which is also a form of data) for some necessary duration. A program may be invariable and built into the computer (and called logic circuitry, as it is on microprocessors), or different programs may be provided to the computer (loaded into its storage and then started by an administrator or user). Today's computers have both kinds of programming.
Most histories of the modern computer begin with the Analytical Engine envisioned by Charles Babbage and with the mathematical ideas of George Boole, the mathematician who first stated the principles of logic inherent in today's digital computer. Babbage's assistant and collaborator, Ada Lovelace, is said to have introduced the ideas of program loops and subroutines and is sometimes considered the first programmer. Apart from mechanical calculators, the first really usable computers began with the vacuum tube, accelerated with the invention of the transistor, which then became embedded in large numbers in integrated circuits, ultimately making possible the relatively low-cost personal computer.
Modern computers inherently follow the ideas of the stored program laid out by John von Neumann in 1945. Essentially, the program is read by the computer one instruction at a time, an operation is performed, and the computer then reads in the next instruction, and so on. More recently, computers and programs have been devised that allow multiple programs (and computers) to work on the same problem at the same time, in parallel. With the advent of the Internet and higher-bandwidth data transmission, programs and data that are part of the same overall project can be distributed over a network and embody the Sun Microsystems slogan: "The network is the computer."



  • Information. Information is a stimulus that has meaning in some context for its receiver. When information is entered into and stored in a computer, it is generally referred to as data. After processing (such as formatting and printing), output data can again be perceived as information. In information technology, knowledge is, to an enterprise or an individual, the possession of information or the ability to quickly locate it. This is essentially what Samuel Johnson, compiler of the first comprehensive English dictionary, said when he wrote:
    "Knowledge is of two kinds: we know a subject ourselves, or we know where we can find information upon it." In the context of the business enterprise or the personal computer user, knowledge tends to connote possession of experienced "know-how" as well as possession of factual information or where to get it. Enterprises have recently begun to treat their accumulated knowledge as an asset and to develop knowledge managemet plans and applications. A new kind of application, called data mining, attempts to develop knowledge from a company's accumulated business transactions and other data.
    In philosophy, the theory of knowledge is called epistemology and deals with such questions as how much knowledge comes from experience or from innate reasoning ability; whether knowledge needs to be believed or can simply be used; and how knowledge changes as new ideas about the same set of facts arise.


  • Program. In computing, a program is a specific set of ordered operations for a computer to perform. In the modern computer that John von Neumann outlined in 1945, the program contains a one-at-a-time sequence of instructions that the computer follows. Typically, the program is put into a storage area accessible to the computer. The computer gets one instruction, performs it, and then gets the next instruction. The storage area or memory can also contain the data that the instruction operates on. (Note that a program is also a special kind of "data" that tells how to operate on "application or user data.")
    Programs can be characterized as interactive or batch in terms of what drives them and how continuously they run. An interactive program receives data from an interactive user (or possibly from another program that simulates an interactive user). A batch program runs and does its work, and then stops. Batch programs can be started by interactive users who request their interactive program to run the batch program. A command interpreter or a Web browser is an example of an interactive program. A program that computes and prints out a company payroll is an example of a batch program. Print jobs are also batch programs.
    When you create a program, you write it using some kind of computer language. Your language statements are the source program. You then "compile" the source program (with a special program called a language compiler) and the result is called an object program (not to be confused with object-oriented programming). There are several synonyms for object program, including object module and compiled program. The object program contains the string of 0s and 1s called machine language that the logic processor works with.
    The machine language of the computer is constructed by the language compiler with an understanding of the computer's logic architecture, including the set of possible computer instructions and the length (number of bits) in an instruction.
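    As a loose illustration of the source-program and object-program distinction described above, the sketch below uses Python's own compiler: it turns a small, invented source program into a code object and lists the ordered instructions the interpreter follows. This produces Python bytecode rather than native machine language, so it is an analogy only, not the compilation pipeline the paragraph describes.

        # A small sketch: compile a source program, then list its ordered instructions.
        import dis

        # The "source program": four ordinary language statements.
        source = "total = 0\nfor n in (1, 2, 3):\n    total = total + n\nprint(total)\n"

        # "Compile" the source; the resulting code object plays the role of an object program.
        code_object = compile(source, "<example>", "exec")

        dis.dis(code_object)   # the one-at-a-time instruction sequence (Python bytecode)
        exec(code_object)      # running the compiled program prints 6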


  • Microprocessor.
    A microprocessor is a computer processor on a microchip. It's sometimes called a logic chip. It is the "engine" that goes into motion when you turn your computer on. A microprocessor is designed to perform arithmetic and logic operations that make use of small number-holding areas called registers. Typical microprocessor operations include adding, subtracting, comparing two numbers, and moving numbers from one area to another. These operations are the result of a set of instructions that are part of the microprocessor design. When the computer is turned on, the microprocessor is designed to get the first instruction from the basic input/output system (BIOS) that comes with the computer as part of its memory. After that, either the BIOS, or the operating system that BIOS loads into computer memory, or an application program is "driving" the microprocessor, giving it instructions to perform.
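    A toy model can make this fetch-and-execute cycle concrete. The sketch below simulates a made-up processor with two registers and a single memory that holds both the instructions and the numbers they operate on (the stored-program idea discussed elsewhere in this article). The instruction set is invented purely for illustration and does not correspond to any real microprocessor.

        # A toy fetch-and-execute loop with a made-up instruction set.
        def run(memory):
            registers = {"A": 0, "B": 0}    # small number-holding areas
            pc = 0                          # program counter: address of the next instruction
            while True:
                op, *args = memory[pc]      # fetch the instruction at the current address
                pc += 1
                if op == "LOAD":            # LOAD reg, addr: copy a number from memory into a register
                    reg, addr = args
                    registers[reg] = memory[addr]
                elif op == "ADD":           # ADD dst, src: add one register into another
                    dst, src = args
                    registers[dst] += registers[src]
                elif op == "STORE":         # STORE reg, addr: copy a register back into memory
                    reg, addr = args
                    memory[addr] = registers[reg]
                elif op == "HALT":
                    return memory

        # Program and data share one memory: addresses 0-4 hold the program, 5-7 hold numbers.
        memory = [
            ("LOAD", "A", 5),
            ("LOAD", "B", 6),
            ("ADD", "A", "B"),
            ("STORE", "A", 7),
            ("HALT",),
            2,        # address 5
            3,        # address 6
            0,        # address 7: the result is written here
        ]
        print(run(memory)[7])   # prints 5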


  • Analytical Engine. The Analytical Engine was, or would have been, the world's first general-purpose computer. Designed in the 1830s by the English mathematician and inventor Charles Babbage, the Analytical Engine introduced a number of computing concepts still in use today. Features included a store and mill, analogous to today's memory and processor. Input and output were provided using punched cards, based on the invention by Jacquard in the early 1800s.
    Babbage began his work on the Analytical Engine in 1834. He envisaged a computer constructed with brass fittings and powered by steam. It was never built, since the government of the day was unwilling to fund its construction, having already sunk 17,000 English pounds into Babbage's fruitless project to build an earlier invention, the Difference Engine.
    Babbage was assisted in his endeavours by Ada Augusta, Countess of Lovelace (and daughter of the poet Byron), who is regarded as the world's first computer programmer for her work with Babbage. She developed a punched-card program to calculate the Bernoulli numbers.
    While Babbage's earlier Difference Engine was finally constructed in 1991, his Analytical Engine, the origin of several important concepts in computing, remains unrealized.


  • Charles Babbage. If John von Neumann is the father of modern computing, then the English mathematician and inventor Charles Babbage can be considered its grandfather. Babbage designed, though never built, a Difference Engine and an Analytical Engine, the world's first computing machines.
    Babbage worked as a mathematician at Cambridge University, where he received his MA in 1817 and later, like Newton, whose mathematical principles he espoused, occupied the Lucasian chair in mathematics. As a scientist, Babbage was obsessed with facts and statistics and lived in a rationalistic world where it was assumed that if all facts, past and present, could be known, then all future events were determinable. His statistical publications include "Table of the Relative Frequency of the Causes of Breaking of Plate Glass Windows" and "Table of Constants of the Class Mammalia," the minutiae of which included the heart rate of the pig. Babbage founded the Statistical Society in 1834.
    A prolific disseminator of ideas and an eclectic inventor, Babbage produced a varied range of inventions that reflected the diversity of his interests. Fascinated by the railroad, which was invented in his native England in 1823, Babbage devised a standard rail gauge width as well as a cowcatcher (for the front of trains). He also recorded inventions related to lighthouse signalling, code breaking, and the postal system. He founded the British Association for the Advancement of Science and the (Royal) Astronomical Society.
    Although remembered today primarily for his calculating engines, Babbage left a legacy in the fields of political theory (he was an ardent industrialist) and operations research (where his 1832 publication, "On the Economy of Machinery and Manufactures," cataloged the manufacturing processes of the day).
    Charles Babbage died in London on October 18, 1871.


  • George Boole. George Boole (1815-1864) was a British mathematician and is known as the founder of mathematical logic. Boole, who came from a poor family and was essentially a self-taught mathematician, made his presence known in the world of mathematics in 1847 after the publication of his book, "The Mathematical Analysis of Logic". In his book, Boole successfully demonstrated that logic, as Aristotle taught it, could be represented by algebraic equations. In 1854, Boole firmly established his reputation by publishing "An Investigation of the Laws of Thought, on Which Are Founded the Mathematical Theories of Logic and Probabilities", a continuation of his earlier work.
    In 1855 Boole, the first professor of mathematics at Queen's College, Cork, Ireland, married Mary Everest, who is now known as a mathematician and teacher in her own right. Mary, who was 18 years younger than Boole, served as sounding board and editor for her husband throughout their nine years of marriage. Unfortunately, Mary's poor choice of medical treatment may have hastened Boole's death. After getting caught in the rain and catching a cold, Boole was put to bed by his wife, who dumped buckets of water on him based on the theory that whatever had caused the illness would also provide the cure. (It seemed logical to her.) George and Mary had five daughters; the third daughter, Alicia Boole Stott, became well known for her work in the visualization of geometric figures in hyperspace.
    Boole's work in symbolic logic, collectively known as "Boolean algebra", is widely regarded as building on the work of the earlier mathematician G.W. Leibniz. Although Boole's work was well received during his lifetime, it was considered to be "pure" mathematics until 1938, when Claude Shannon published his thesis at MIT. Shannon demonstrated that Boole's symbolic logic, as it applies to the representation of TRUE and FALSE, could be used to represent the functions of switches in electronic circuits. This became the foundation for digital electronic design, with practical applications in telephone switching and computer engineering.
    Today, when using a search engine on the Internet, we use Boole's mathematical concepts to help us locate information by defining a relationship between the terms we enter. For instance, searching for George AND Boole would find every article in which both the word George and the word Boole appear. Searching for George OR Boole would find every article in which either the word George or the word Boole appears. We call this a Boolean search.
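    A minimal sketch of that AND/OR logic in Python, over an invented list of article titles (a real search engine indexes whole words and vastly more data, so this is only an illustration):

        # Toy Boolean search over a made-up article list.
        articles = [
            "George Boole and the laws of thought",
            "George Washington crosses the Delaware",
            "Boolean algebra in digital circuit design",
        ]

        def contains(article, term):
            # Simple case-insensitive substring test; a real engine matches whole words.
            return term.lower() in article.lower()

        # George AND Boole: both terms must appear.
        and_hits = [a for a in articles if contains(a, "George") and contains(a, "Boole")]

        # George OR Boole: either term may appear.
        or_hits = [a for a in articles if contains(a, "George") or contains(a, "Boole")]

        print(and_hits)   # only the first title
        print(or_hits)    # all three titles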


  • Augusta Ada King, countess of Lovelace. Augusta Ada King, countess of Lovelace, née Byron, was an English mathematician often credited as the first computer programmer for her writings about Charles Babbage's Analytical Engine. She was born in 1815 in Middlesex (now part of London) and died in London in 1852.
    Ada, as she was called, was the daughter of the famous poet Lord Byron and of Annabella Milbanke Byron, who was herself an accomplished mathematician. Ada was rigorously trained in the arts and sciences by a succession of tutors and through self-education. She married William King, 8th Baron King, in 1835 and became countess of Lovelace in 1838 when her husband was made an earl.
    Ada had met Babbage when she was still in her teens and asked him to serve as her tutor several years later. While translating Luigi Menabrea's Elements of Charles Babbage's Analytical Engine from its original French, the countess contributed so many annotations that the result was three times the length of the original manuscript. In her annotations, Ada described a method by which the Analytical Engine could be made to compute Bernoulli numbers, which is why she's been called the first programmer.
    Because her parents separated when she was an infant, and her father subsequently left the country, Ada did not have a personal relationship with Lord Byron. Nevertheless, she seems to have inherited something of his poetic sensibility along with her mother's mathematical talent. Among the countess' additions were such comments as "...the Analytical Engine weaves algebraic patterns, just as the Jacquard-loom weaves flowers and leaves."


  • Vacuum tube.
    Also see cathode ray tube (CRT), the specialized kind of vacuum tube that is in most desktop display monitors.
    A vacuum tube (also called a VT, electron tube or, in the UK, a valve) is a device sometimes used to amplify electronic signals. In most applications, the vacuum tube is obsolete, having been replaced decades ago by the bipolar transistor and, more recently, by the field-effect transistor. However, tubes are still used in some high-power amplifiers, especially at microwave radio frequencies, and in some hi-fi audio systems.
    Tubes operate at higher voltages than transistors. A typical transistorized amplifier needs 6 to 12 volts to function; an equivalent tube-type amplifier needs 200 to 400 volts. At the highest power levels, some tube circuits have power supplies delivering several kilovolts.
    Vacuum tubes are making a comeback among audiophiles who insist that tubes deliver better audio quality than transistors. These old-fashioned components are more electrically rugged than their solid-state counterparts; a tube can often withstand temporary overload conditions and power-line transients that would instantly destroy a transistor.
    The major disadvantages of tubes include the fact that they require bulky power supplies, and the high voltages can present an electric shock hazard.


  • John von Neumann. John von Neumann was the scientist who conceived a fundamental idea that serves all modern computers - that a computer's program, and the data that it processes, do not have to be fed into the computer while it is working but can be kept in the computer's memory - a notion generally referred to as the stored-program computer. In his short life, von Neumann became one of the most acclaimed and lauded scientists of the 20th century. He left an indelible mark on the fields of mathematics, quantum theory, game theory, nuclear physics, and computer science.
    Born in Budapest, von Neumann was a child prodigy who went on to study chemistry in Berlin and Zurich, where he earned a diploma in chemical engineering in 1926. His doctorate in mathematics (on set theory) from the University of Budapest followed in the same year. After lecturing at Berlin and Hamburg, von Neumann emigrated to the US in 1930, where he worked at Princeton and was one of the founding members of the Institute for Advanced Study.
    At Princeton, von Neumann lectured in the nascent field of quantum theory, and through his work on rings of operators (later renamed von Neumann algebras) he helped develop the mathematical foundations of that theory, which were set out in the book "Mathematische Grundlagen der Quantenmechanik" (1932). His seminal publication on game theory, "Theory of Games and Economic Behaviour", was published in 1944 with co-author Oskar Morgenstern.
    Spurred by an interest in hydrodynamics and the difficulty of solving the non-linear partial differential equations involved, von Neumann turned to the emerging field of computing. His first introduction to computers was Howard Aiken's Harvard Mark I. As a consultant to Eckert and Mauchly on the ENIAC, he devised a concept for computer architecture that remains with us to this day. Known subsequently as the "von Neumann architecture", the stored-program computer (where both the instructions and the data they operate upon reside together in memory), with its central controller, I/O, and memory, was outlined in his 1945 "First Draft of a Report on the EDVAC" and paved the way for the modern era of computing.
    Von Neumann was constantly busy with both his extensive consulting career and his varied research interests. He was a pioneer in the field of cellular automata (an n-dimensional array of cells where the contents of a cell depend on the contents of neighbouring cells) and also popularized the binary digit as the unit of computer memory. Among his employers was the U.S. military, for whom he worked on the development of the hydrogen bomb. He received the Enrico Fermi Award in 1956, the latest in a long line of honors (including 7 honorary doctorates and 2 Presidential Awards). John von Neumann died on February 8, 1957, in Washington, D.C.


  • Bandwidth. 1) In computer networks, bandwidth is often used as a synonym for data transfer rate - the amount of data that can be carried from one point to another in a given time period (usually a second). This kind of bandwidth is usually expressed in bits (of data) per second (bps). Occasionally, it's expressed as bytes per second (Bps). A modem that works at 57,600 bps has twice the bandwidth of a modem that works at 28,800 bps. In general, a link with a high bandwidth is one that may be able to carry enough information to sustain the succession of images in a video presentation.
    It should be remembered that a real communications path usually consists of a succession of links, each with its own bandwidth. If one of these is much slower than the rest, it is said to be a bandwidth bottleneck.
    2) In electronic communication, bandwidth is the width of the range (or band) of frequencies that an electronic signal uses on a given transmission medium. In this usage, bandwidth is expressed in terms of the difference between the highest-frequency signal component and the lowest-frequency signal component. Since the frequency of a signal is measured in hertz (the number of cycles of change per second), a given bandwidth is the difference in hertz between the highest frequency the signal uses and the lowest frequency it uses. A typical voice signal has a bandwidth of approximately three kilohertz (3 kHz); an analog television (TV) broadcast video signal has a bandwidth of six megahertz (6 MHz) -- some 2,000 times as wide as the voice signal.
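    A short worked example of the data-transfer sense of bandwidth, using the modem speeds quoted above; the 1-megabyte file size is an assumption chosen only to make the arithmetic concrete:

        # Comparing the two modem speeds mentioned above.
        fast_modem_bps = 57_600            # bits per second
        slow_modem_bps = 28_800

        file_bits = 1_000_000 * 8          # a hypothetical 1-megabyte file (8 bits per byte)

        print(fast_modem_bps / slow_modem_bps)   # 2.0 -> twice the bandwidth
        print(file_bits / slow_modem_bps)        # about 278 seconds on the slower modem
        print(file_bits / fast_modem_bps)        # about 139 seconds on the faster modem

        # A path is only as fast as its slowest link: the bandwidth bottleneck.
        links_bps = [1_000_000, 57_600, 28_800]
        print(min(links_bps))                    # 28_800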


  • Analog computing. Analog computing is a term used by Paul Saffo of the Institute for the Future in Palo Alto, California, to describe silicon-based microsensors that sense and react to external (natural) stimuli in something that approximates the rhythm of reality rather than the "artificial" binary behavior of digital computing. Saffo foresees that, by implanting tiny machines including sensors and actuators in the same materials used to manufacture digital memory and processors (and by using some of the same manufacturing techniques), the next decade will increasingly find uses for "intelligent" material that responds to its environment in analog or dynamically responding fashion. Examples include packages that can "talk back" to their handlers; airplane wings that can reshape themselves as they meet turbulence; chairs that can mold themselves into the best supporting shape for each person.
    Saffo's analog computers also go by the names of MEMS (micro-electromechanical systems) and smart matter.


  • Analog. 1) In telecommunications, an analog signal is one in which a base carrier's alternating current frequency is modified in some way, such as by amplifying the strength of the signal or varying the frequency, in order to add information to the signal. Broadcast and telephone transmission have conventionally used analog technology.
    An analog signal can be represented as a series of sine waves. The term originated because the modulation of the carrier wave is analogous to the fluctuations of the human voice or other sound that is being transmitted.
    2) Analog describes any fluctuating, evolving, or continually changing process.
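    A numeric sketch of the sine-wave picture above: a carrier wave whose amplitude is varied (modulated) by a slower wave standing in for a voice. The frequencies and modulation depth are arbitrary values chosen only for illustration:

        # Amplitude-modulating a carrier sine wave with a slower "voice" sine wave.
        import math

        carrier_hz = 1000.0    # base carrier frequency
        voice_hz = 50.0        # slower signal carrying the information
        sample_rate = 8000.0   # samples per second

        samples = []
        for n in range(200):
            t = n / sample_rate
            voice = math.sin(2 * math.pi * voice_hz * t)      # the information
            carrier = math.sin(2 * math.pi * carrier_hz * t)  # the base carrier
            samples.append((1 + 0.5 * voice) * carrier)       # amplitude modulation

        print(max(samples), min(samples))   # the envelope swings roughly between -1.5 and +1.5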
[ ... ]
 
