#1 | 11-02-2008 | دانه کولانه (کوروش نعلینی), site administrator, Kermanshah
Also see this link:
http://www.mrob.com/pub/comp/computer-history.html
=============================================
The history of computing began with an analog machine. In 1623 German scientist Wilhelm Schickard invented a machine that used 11 complete and 6 incomplete sprocketed wheels that could add and, with the aid of logarithm tables, multiply and divide.
French philosopher, mathematician, and physicist Blaise Pascal invented a machine in 1642 that added and subtracted, automatically carrying and borrowing digits from column to column. Pascal built 50 copies of his machine, but most served as curiosities in parlors of the wealthy. Seventeenth-century German mathematician Gottfried Leibniz designed a special gearing system to enable multiplication on Pascal’s machine.

In the early 19th century French inventor Joseph-Marie Jacquard devised a specialized type of computer: a silk loom. Jacquard’s loom used punched cards to program patterns that helped the loom create woven fabrics. Although Jacquard was rewarded and admired by French emperor Napoleon I for his work, he fled for his life from the city of Lyon pursued by weavers who feared their jobs were in jeopardy due to Jacquard’s invention. The loom prevailed, however: When Jacquard died, more than 30,000 of his looms existed in Lyon. The looms are still used today, especially in the manufacture of fine furniture fabrics.

Another early mechanical computer was the Difference Engine, designed in the early 1820s by British mathematician and scientist Charles Babbage. Although never completed by Babbage, the Difference Engine was intended to be a machine with a 20-decimal capacity that could solve mathematical problems. Babbage also made plans for another machine, the Analytical Engine, considered the mechanical precursor of the modern computer. The Analytical Engine was designed to perform all arithmetic operations efficiently; however, Babbage’s lack of political skills kept him from obtaining the approval and funds to build it.

Augusta Ada Byron, countess of Lovelace, was a personal friend and student of Babbage. She was the daughter of the famous poet Lord Byron and one of only a few women mathematicians of her time. She prepared extensive notes concerning Babbage’s ideas and the Analytical Engine. Lovelace’s conceptual programs for the machine led to the naming of a programming language (Ada) in her honor. Although the Analytical Engine was never built, its key concepts, such as the capacity to store instructions, the use of punched cards as a primitive memory, and the ability to print, can be found in many modern computers.

Herman Hollerith, an American inventor, used an idea similar to Jacquard’s loom when he combined the use of punched cards with devices that created and electronically read the cards. Hollerith’s tabulator was used for the 1890 U.S. census, and it made the computational time three to four times shorter than the time previously needed for hand counts. Hollerith’s Tabulating Machine Company eventually merged with two companies to form the Computing-Tabulating-Recording Company. In 1924 the company changed its name to International Business Machines (IBM).

In 1936 British mathematician Alan Turing proposed the idea of a machine that could process equations without human direction. The machine (now known as a Turing machine) resembled an automatic typewriter that used symbols for math and logic instead of letters. Turing intended the device to be a “universal machine” that could be used to duplicate or represent the function of any other existing machine. Turing’s machine was the theoretical precursor to the modern digital computer. The Turing machine model is still used by modern computational theorists.
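The "universal machine" idea described above is concrete enough to sketch in a few lines of code. The simulator and the tiny transition table below (a unary increment machine) are illustrative assumptions for this sketch, not anything drawn from Turing's 1936 paper:

```python
# Minimal Turing-machine sketch: a state, a tape, and a transition table.
# The example machine (states and symbols are made up for illustration)
# appends a '1' to a unary string, i.e. it increments a unary number.

def run_turing_machine(tape, transitions, state="start", accept="halt"):
    """Run until the accept state. `transitions` maps (state, symbol)
    to (new_state, write_symbol, move), with move in {-1, +1}."""
    tape = dict(enumerate(tape))      # sparse tape; blank cells read as '_'
    head = 0
    while state != accept:
        symbol = tape.get(head, "_")
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += move
    return "".join(tape[i] for i in sorted(tape)).strip("_")

# Transition table: scan right past the 1s, write one more 1, then halt.
INCREMENT = {
    ("start", "1"): ("start", "1", +1),  # move right over existing 1s
    ("start", "_"): ("halt",  "1", +1),  # write a 1 on the first blank
}

print(run_turing_machine("111", INCREMENT))  # prints "1111"
```

Any other machine can be represented the same way by swapping in a different transition table, which is the sense in which the model is "universal."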

In the 1930s American mathematician Howard Aiken developed the Mark I calculating machine, which was built by IBM. This electromechanical calculating machine used relays and electromagnetic components to replace purely mechanical components. In later machines, Aiken used vacuum tubes and solid state transistors (tiny electrical switches) to manipulate the binary numbers. Aiken also introduced computers to universities by establishing the first computer science program at Harvard University in Cambridge, Massachusetts. Aiken obsessively mistrusted the concept of storing a program within the computer, insisting that the integrity of the machine could be maintained only through a strict separation of program instructions from data. His computer had to read instructions from punched cards, which could be stored away from the computer. He also urged the National Bureau of Standards not to support the development of computers, insisting that there would never be a need for more than five or six of them nationwide.

At the Institute for Advanced Study in Princeton, New Jersey, Hungarian-American mathematician John von Neumann developed one of the first computers used to solve problems in mathematics, meteorology, economics, and hydrodynamics. Von Neumann's 1945 design for the Electronic Discrete Variable Automatic Computer (EDVAC)—in stark contrast to the designs of Aiken, his contemporary—was the first electronic computer design to incorporate a program stored entirely within its memory. This machine led to several others, some with clever names like ILLIAC, JOHNNIAC, and MANIAC.

American physicist John Mauchly proposed the electronic digital computer called ENIAC, the Electronic Numerical Integrator And Computer. He helped build it along with American engineer John Presper Eckert, Jr., at the Moore School of Engineering at the University of Pennsylvania in Philadelphia. ENIAC was operational in 1945 and introduced to the public in 1946. It is regarded as the first successful general-purpose digital computer. It occupied 167 sq m (1,800 sq ft), weighed more than 27,000 kg (60,000 lb), and contained more than 18,000 vacuum tubes. Roughly 2,000 of the computer’s vacuum tubes were replaced each month by a team of six technicians. Many of ENIAC’s first tasks were for military purposes, such as calculating ballistic firing tables and designing atomic weapons. Since ENIAC was initially not a stored-program machine, it had to be rewired for each task.

Eckert and Mauchly eventually formed their own company, which was then bought by Remington Rand. They produced the Universal Automatic Computer (UNIVAC), which was used for a broader variety of commercial applications. The first UNIVAC was delivered to the United States Census Bureau in 1951. By 1957, there were 46 UNIVACs in use.

Between 1937 and 1939, while teaching at Iowa State College, American physicist John Vincent Atanasoff built a prototype computing device called the Atanasoff-Berry Computer, or ABC, with the help of his assistant, Clifford Berry. Atanasoff developed the concepts that were later used in the design of the ENIAC. Atanasoff’s device was the first computer to separate data processing from memory, but it is not clear whether a functional version was ever built. Atanasoff did not receive credit for his contributions until 1973, when a lawsuit regarding the patent on ENIAC was settled.

In 1948, at Bell Telephone Laboratories, American physicists Walter Houser Brattain, John Bardeen, and William Bradford Shockley developed the transistor, a device that can act as an electric switch. The transistor had a tremendous impact on computer design, replacing costly, energy-inefficient, and unreliable vacuum tubes.

In the late 1960s integrated circuits (tiny transistors and other electrical components arranged on a single chip of silicon) replaced individual transistors in computers. Integrated circuits resulted from the simultaneous, independent work of Jack Kilby at Texas Instruments and Robert Noyce of the Fairchild Semiconductor Corporation in the late 1950s. As integrated circuits became miniaturized, more components could be designed into a single computer circuit. In the 1970s refinements in integrated circuit technology led to the development of the modern microprocessor, integrated circuits that contained thousands of transistors. Modern microprocessors can contain more than 40 million transistors.

Manufacturers used integrated circuit technology to build smaller and cheaper computers. The first of these so-called personal computers (PCs)—the Altair 8800—appeared in 1975, sold by Micro Instrumentation Telemetry Systems (MITS). The Altair used an 8-bit Intel 8080 microprocessor, had 256 bytes of RAM, received input through switches on the front panel, and displayed output on rows of light-emitting diodes (LEDs). Refinements in the PC continued with the inclusion of video displays, better storage devices, and CPUs with more computational abilities. Graphical user interfaces were first designed by the Xerox Corporation, then later used successfully by Apple Computer, Inc. Today the development of sophisticated operating systems such as Windows, the Mac OS, and Linux enables computer users to run programs and manipulate data in ways that were unimaginable in the mid-20th century.

Several researchers claim the “record” for the largest single calculation ever performed. One large single calculation was accomplished by physicists at IBM in 1995. They solved one million trillion mathematical subproblems by continuously running 448 computers for two years. Their analysis demonstrated the existence of a previously hypothetical subatomic particle called a glueball. Japan, Italy, and the United States are collaborating to develop new supercomputers that will run these types of calculations 100 times faster.

In 1996 IBM challenged Garry Kasparov, the reigning world chess champion, to a chess match with a supercomputer called Deep Blue. The computer had the ability to compute more than 100 million chess positions per second. In a 1997 rematch Deep Blue defeated Kasparov, becoming the first computer to win a match against a reigning world chess champion with regulation time controls. Many experts predict these types of parallel processing machines will soon surpass human chess playing ability, and some speculate that massive calculating power will one day replace intelligence. Deep Blue serves as a prototype for future computers that will be required to solve complex problems. At issue, however, is whether a computer can be developed with the ability to learn to solve problems on its own, rather than one programmed to solve a specific set of tasks.
#2 | 10-14-2009 | رزیتا, senior moderator
History of the Personal Computer

In the mid-1940s, early computers were the size of houses and as expensive as battleships, but they had none of the computational power or ease of use that are common in modern PCs. The miniaturization of electronic circuitry and the invention of integrated circuits and microprocessors enabled computer makers to combine the essential elements of a computer onto tiny silicon computer chips, thereby increasing computer performance and decreasing cost. The first microprocessor, the Intel 4004, created in 1971 by Intel Corporation, was originally designed to be the computing and logical processor of calculators and watches. Modern microprocessors evolved from this simple design.
Microsoft ® Encarta ® Reference Library 2005. © 1993-2004 Microsoft Corporation. All rights reserved.
The Altair 8800, developed in 1975 by Micro Instrumentation Telemetry Systems (MITS), is considered to be the first PC. The Altair was built from a kit and programmed by using switches. Information from the computer was displayed by light-emitting diodes on the front panel of the machine. The Altair appeared on the cover of Popular Electronics magazine in 1975 and inspired many computer enthusiasts who would later establish companies to produce computer hardware and software.

American computer designers Steven Jobs and Stephen Wozniak created the Apple II in 1977. The Apple II was one of the first PCs to incorporate a color video display and a keyboard that made the computer easy to use. Jobs and Wozniak incorporated Apple Computer, Inc., the same year.

In 1981 International Business Machines Corporation (IBM) introduced the IBM PC. It was designed with an open architecture that enabled other computer manufacturers to create similar machines, or clones, that could also run software designed for the IBM PC. The design of the IBM PC and its clones soon became the PC standard, and an operating system developed by Microsoft Corporation became the dominant software running PCs.

A graphical user interface (GUI)—a visually appealing way to represent computer commands and data on the screen—was first developed in 1983 when Apple introduced the Lisa, but the new user interface did not gain widespread notice until 1984 with the introduction of the Apple Macintosh. The Macintosh GUI combined icons (pictures that represent files or programs) with windows (boxes that each contain an open file or program). A pointing device known as a mouse controlled information on the screen. Inspired by earlier work of computer scientists at Xerox Corporation, the Macintosh user interface made computers easy and fun to use and eliminated the need to type in complex commands (see User Interface). Today, software available for IBM PCs and clones, as well as for most other popular computer platforms, also features a GUI.

Since the early 1970s, computing power has doubled about every 18 months due to the creation of faster microprocessors, the incorporation of multiple microprocessor designs, and the development of new storage technologies. Ongoing research is focused on creating computers that use light and biological molecules instead of—or in combination with—conventional electronic computer circuitry. These technological advances, coupled with new methods for interconnecting computers, such as the proposed Internet2, an advanced Internet under development by universities, industry, and the government, promise to make PCs even more powerful and useful.
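The 18-month doubling rate cited above compounds quickly; the arithmetic it implies can be sketched directly (the intervals and printed figures below are illustrative round numbers, not measured benchmarks):

```python
# "Doubling about every 18 months" means that after y years, relative
# computing power grows by a factor of 2 ** (y / 1.5).

def doublings(years, period_years=1.5):
    """Relative performance multiplier after `years`, given a doubling period."""
    return 2 ** (years / period_years)

print(round(doublings(15)))  # 10 doublings over 15 years: 1024x
print(round(doublings(30)))  # 20 doublings over 30 years: 1048576x
```

The exponential form makes clear why even a modest change in the doubling period has a dramatic effect over decades.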
