#6
11-02-2008
دانه کولانه (کوروش نعلینی), Site Administrator
Join date: Jun 2007
Location: Kermanshah
Posts: 12,700

See this link as well:
http://www.mrob.com/pub/comp/computer-history.html
=============================================
The history of computing began with an analog machine. In 1623 German scientist Wilhelm Schickard invented a machine that used 11 complete and 6 incomplete sprocketed wheels that could add and, with the aid of logarithm tables, multiply and divide.
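
The logarithm trick behind Schickard's machine reduces multiplication to addition: since log(ab) = log(a) + log(b), one looks up the two logarithms in a printed table, adds them on the machine, and looks up the antilog. A minimal Python sketch of the idea (the sample numbers are illustrative):

import math

# Multiplication via logarithm tables: log(a*b) = log(a) + log(b).
# The adding machine performs only the addition; the log and antilog
# lookups were done by hand in printed tables.
def multiply_via_logs(a, b):
    log_sum = math.log10(a) + math.log10(b)  # add the two table entries
    return 10 ** log_sum                     # take the antilog

print(multiply_via_logs(37, 54))  # ~1998.0 (the exact product is 1998)
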
French philosopher, mathematician, and physicist Blaise Pascal invented a machine in 1642 that added and subtracted, automatically carrying and borrowing digits from column to column. Pascal built 50 copies of his machine, but most served as curiosities in parlors of the wealthy. Seventeenth-century German mathematician Gottfried Leibniz designed a special gearing system to enable multiplication on Pascal’s machine.

In the early 19th century French inventor Joseph-Marie Jacquard devised a specialized type of computer: a silk loom. Jacquard’s loom used punched cards to program patterns that helped the loom create woven fabrics. Although Jacquard was rewarded and admired by French emperor Napoleon I for his work, he fled for his life from the city of Lyon pursued by weavers who feared their jobs were in jeopardy due to Jacquard’s invention. The loom prevailed, however: When Jacquard died, more than 30,000 of his looms existed in Lyon. The looms are still used today, especially in the manufacture of fine furniture fabrics.

Another early mechanical computer was the Difference Engine, designed in the early 1820s by British mathematician and scientist Charles Babbage. Although never completed by Babbage, the Difference Engine was intended to be a machine with a 20-decimal capacity that could solve mathematical problems. Babbage also made plans for another machine, the Analytical Engine, considered the mechanical precursor of the modern computer. The Analytical Engine was designed to perform all arithmetic operations efficiently; however, Babbage’s lack of political skills kept him from obtaining the approval and funds to build it.

Augusta Ada Byron, countess of Lovelace, was a personal friend and student of Babbage. She was the daughter of the famous poet Lord Byron and one of only a few women mathematicians of her time. She prepared extensive notes concerning Babbage’s ideas and the Analytical Engine. Lovelace’s conceptual programs for the machine led to the naming of a programming language (Ada) in her honor. Although the Analytical Engine was never built, its key concepts, such as the capacity to store instructions, the use of punched cards as a primitive memory, and the ability to print, can be found in many modern computers.

Herman Hollerith, an American inventor, used an idea similar to Jacquard’s loom when he combined the use of punched cards with devices that created and electronically read the cards. Hollerith’s tabulator was used for the 1890 U.S. census, and it made the computational time three to four times shorter than the time previously needed for hand counts. Hollerith’s Tabulating Machine Company eventually merged with two companies to form the Computing-Tabulating-Recording Company. In 1924 the company changed its name to International Business Machines (IBM).

In 1936 British mathematician Alan Turing proposed the idea of a machine that could process equations without human direction. The machine (now known as a Turing machine) resembled an automatic typewriter that used symbols for math and logic instead of letters. Turing intended the device to be a “universal machine” that could be used to duplicate or represent the function of any other existing machine. Turing’s machine was the theoretical precursor to the modern digital computer. The Turing machine model is still used by modern computational theorists.
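
The model is easy to make concrete. Below is a minimal Turing machine simulator in Python; the transition table, which increments a binary number, is an invented example, not anything from Turing's paper:

# A Turing machine: a finite control reads one tape cell at a time,
# writes a symbol, and moves left or right. "_" marks a blank cell.
RULES = {
    ("right", "0"): ("right", "0", +1),  # scan right to the end of the number
    ("right", "1"): ("right", "1", +1),
    ("right", "_"): ("carry", "_", -1),  # hit a blank: start carrying leftward
    ("carry", "1"): ("carry", "0", -1),  # 1 + carry = 0, carry continues
    ("carry", "0"): ("done",  "1",  0),  # 0 + carry = 1, finished
    ("carry", "_"): ("done",  "1",  0),  # carry past the leftmost digit
}

def run(tape):
    cells = dict(enumerate(tape))
    state, head = "right", 0
    while state != "done":
        state, symbol, move = RULES[(state, cells.get(head, "_"))]
        cells[head] = symbol
        head += move
    return "".join(cells.get(i, "_") for i in range(min(cells), max(cells) + 1)).strip("_")

print(run("1011"))  # -> "1100" (11 + 1 = 12)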

In the 1930s American mathematician Howard Aiken developed the Mark I calculating machine, which was built by IBM. This electromechanical calculating machine used relays and electromagnetic components in place of purely mechanical ones. In later machines, Aiken used vacuum tubes and solid-state transistors (tiny electrical switches) to manipulate binary numbers. Aiken also introduced computers to universities by establishing the first computer science program, at Harvard University in Cambridge, Massachusetts. Aiken obsessively mistrusted the concept of storing a program within the computer, insisting that the integrity of the machine could be maintained only through a strict separation of program instructions from data. His computer had to read instructions from punched cards, which could be stored away from the computer. He also urged the National Bureau of Standards not to support the development of computers, insisting that there would never be a need for more than five or six of them nationwide.

At the Institute for Advanced Study in Princeton, New Jersey, Hungarian-American mathematician John von Neumann developed one of the first computers used to solve problems in mathematics, meteorology, economics, and hydrodynamics. Von Neumann's 1945 design for the Electronic Discrete Variable Automatic Computer (EDVAC)—in stark contrast to the designs of Aiken, his contemporary—was the first electronic computer design to incorporate a program stored entirely within its memory. This machine led to several others, some with clever names like ILLIAC, JOHNNIAC, and MANIAC.
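
The stored-program idea is easy to sketch: instructions and data occupy the same memory, and a fetch-decode-execute loop drives the machine. The four-opcode instruction set below is invented for illustration and is not EDVAC's actual order code:

# A toy stored-program machine in the von Neumann style.
def run(memory):
    acc, pc = 0, 0                      # accumulator and program counter
    while True:
        op, arg = memory[pc]            # fetch the next instruction
        pc += 1
        if op == "LOAD":                # decode and execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Instructions live in cells 0-3 and data in cells 4-6 of the same memory.
memory = {
    0: ("LOAD", 4), 1: ("ADD", 5), 2: ("STORE", 6), 3: ("HALT", 0),
    4: 2, 5: 3, 6: 0,
}
print(run(memory)[6])  # -> 5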

American physicist John Mauchly proposed the electronic digital computer called ENIAC, the Electronic Numerical Integrator And Computer. He helped build it along with American engineer John Presper Eckert, Jr., at the Moore School of Engineering at the University of Pennsylvania in Philadelphia. ENIAC was operational in 1945 and introduced to the public in 1946. It is regarded as the first successful general-purpose digital computer. It occupied 167 sq m (1,800 sq ft), weighed more than 27,000 kg (60,000 lb), and contained more than 18,000 vacuum tubes. Roughly 2,000 of the computer’s vacuum tubes were replaced each month by a team of six technicians. Many of ENIAC’s first tasks were for military purposes, such as calculating ballistic firing tables and designing atomic weapons. Since ENIAC was initially not a stored-program machine, it had to be physically rewired for each new task.

Eckert and Mauchly eventually formed their own company, which was then bought by Remington Rand. They produced the Universal Automatic Computer (UNIVAC), which was used for a broader variety of commercial applications. The first UNIVAC was delivered to the United States Census Bureau in 1951. By 1957, there were 46 UNIVACs in use.

Between 1937 and 1939, while teaching at Iowa State College, American physicist John Vincent Atanasoff built a prototype computing device called the Atanasoff-Berry Computer, or ABC, with the help of his assistant, Clifford Berry. Atanasoff developed the concepts that were later used in the design of the ENIAC. Atanasoff’s device was the first computer to separate data processing from memory, but it is not clear whether a functional version was ever built. Atanasoff did not receive credit for his contributions until 1973, when a lawsuit regarding the patent on ENIAC was settled.

In 1948, at Bell Telephone Laboratories, American physicists Walter Houser Brattain, John Bardeen, and William Bradford Shockley developed the transistor, a device that can act as an electric switch. The transistor had a tremendous impact on computer design, replacing costly, energy-inefficient, and unreliable vacuum tubes.

In the late 1960s integrated circuits (tiny transistors and other electrical components arranged on a single chip of silicon) replaced individual transistors in computers. Integrated circuits resulted from the simultaneous, independent work of Jack Kilby at Texas Instruments and Robert Noyce of the Fairchild Semiconductor Corporation in the late 1950s. As integrated circuits became miniaturized, more components could be designed into a single computer circuit. In the 1970s refinements in integrated circuit technology led to the development of the modern microprocessor, integrated circuits that contained thousands of transistors. Modern microprocessors can contain more than 40 million transistors.

Manufacturers used integrated circuit technology to build smaller and cheaper computers. The first of these so-called personal computers (PCs)—the Altair 8800—appeared in 1975, sold by Micro Instrumentation Telemetry Systems (MITS). The Altair used an 8-bit Intel 8080 microprocessor, had 256 bytes of RAM, received input through switches on the front panel, and displayed output on rows of light-emitting diodes (LEDs). Refinements in the PC continued with the inclusion of video displays, better storage devices, and CPUs with more computational abilities. Graphical user interfaces were first designed by the Xerox Corporation, then later used successfully by Apple Computer, Inc. Today the development of sophisticated operating systems such as Windows, the Mac OS, and Linux enables computer users to run programs and manipulate data in ways that were unimaginable in the mid-20th century.

Several researchers claim the “record” for the largest single calculation ever performed. One large single calculation was accomplished by physicists at IBM in 1995. They solved one million trillion mathematical subproblems by continuously running 448 computers for two years. Their analysis demonstrated the existence of a previously hypothetical subatomic particle called a glueball. Japan, Italy, and the United States are collaborating to develop new supercomputers that will run these types of calculations 100 times faster.

In 1996 IBM challenged Garry Kasparov, the reigning world chess champion, to a chess match with a supercomputer called Deep Blue. The computer had the ability to compute more than 100 million chess positions per second. In a 1997 rematch Deep Blue defeated Kasparov, becoming the first computer to win a match against a reigning world chess champion with regulation time controls. Many experts predict these types of parallel processing machines will soon surpass human chess playing ability, and some speculate that massive calculating power will one day replace intelligence. Deep Blue serves as a prototype for future computers that will be required to solve complex problems. At issue, however, is whether a computer can be developed with the ability to learn to solve problems on its own, rather than one programmed to solve a specific set of tasks.
__________________
Better that my head lie hidden beneath a stone than that my name be raised in disgrace.
If I die with a good name, so be it; it is a good name I need, for the body belongs to death.


