Product Description
A compact and accessible history of computing, from punch cards and calculators to ENIAC and UNIVAC, the personal computer, Silicon Valley, and the Internet.
The history of computing could be told as the story of hardware and software, or the story of the Internet, or the story of “smart” hand-held devices, with subplots involving IBM, Microsoft, Apple, Facebook, and Twitter. In this concise and accessible account of the invention and development of digital technology, computer historian Paul Ceruzzi offers a broader and more useful perspective. He identifies four major threads that run throughout all of computing's technological development: digitization―the coding of information, computation, and control in binary form, ones and zeros; the convergence of multiple streams of techniques, devices, and machines, yielding more than the sum of their parts; the steady advance of electronic technology, as characterized famously by “Moore's Law”; and the human-machine interface.
Ceruzzi guides us through computing history, telling how a Bell Labs mathematician coined the word “digital” in 1942 (to describe a high-speed method of calculating used in anti-aircraft devices), and recounting the development of the punch card (for use in the 1890 U.S. Census). He describes the ENIAC, built for scientific and military applications; the UNIVAC, the first general-purpose computer designed for commercial use; and ARPANET, the Internet's precursor. Ceruzzi's account traces the world-changing evolution of the computer from a room-size ensemble of machinery to a “minicomputer” to a desktop computer to a pocket-sized smartphone. He describes the development of the silicon chip, which stored ever-increasing amounts of data and enabled ever-smaller devices. He visits that hotbed of innovation, Silicon Valley, and brings the story up to the present with the Internet, the World Wide Web, and social networking.
Customers Who Bought This Item Also Bought
- Re-Coded
- Machine Learning: The New AI (The MIT Press Essential Knowledge Series)
- Data Science (MIT Press Essential Knowledge series)
- Computational Thinking (The MIT Press Essential Knowledge series)
- Deep Learning (The MIT Press Essential Knowledge series)
- Cloud Computing (The MIT Press Essential Knowledge Series)
- The Internet of Things (MIT Press Essential Knowledge series)
- Information and Society (The MIT Press Essential Knowledge Series)
- Metadata (The MIT Press Essential Knowledge series)
- Understanding Digital Literacies: A Practical Introduction