
Natural Science - Year II

Unit 66: The Technological Revolution


History Weblecture for Unit 66



History Lecture for Unit 66: Information Science and the Age of Computers

For Class

Lecture:

Information Science

Information science is a specialized area that covers how we record information, store it, organize it, and retrieve it. As we become more dependent on information as a commodity — something that we buy or sell — our ability to store and access information easily becomes a critical component not only of our scientific efforts, but of our daily lives as well.

Early Attempts to Automate Calculations

Look at the History of Information Technology and Systems to see pictures of these pioneers of the information age, and of the inventions they built, as you read through the web lecture below.

In the ancient world, information was recorded on tablets or papyrus scrolls. It was time-consuming labor, and produced a fragile record that could easily break or burn or become mildewed. If you wanted to look something up, you had to look at every tablet in your collection, or unroll and skim through an entire scroll, which was also time-consuming and inaccurate (you could easily miss the thing you were looking for).

You may remember that in our first year, we discussed the conditions necessary for "science" to happen. While individuals can make scientific observations using scientific methods, we really don't make advances in science unless that information is communicated, and observations from many different people are compared and correlated.

The invention of the codex or book, which used individual pages that could be accessed immediately, and the invention of the index, started a revolution in how information was recorded and retrieved. If you knew what page your information was on, you could go directly to that page. When Johannes Gutenberg combined the idea of movable type (which the Chinese had been using for some time) with a hand-driven press and created the mechanical printing press, it became much easier to copy data precisely, and books suddenly became much cheaper.

The encyclopedists of the early Middle Ages (like Cassiodorus and Isidore of Seville) tried to record as much information as possible, and were forced to consider how to organize it. These early encyclopedias are generally organized by topic, but that meant that you had to know how different ideas were related in order to look in the right chapter if you wanted to know about a specific event or thing. Later medieval philosophers, like Richard Kilwardby, created an elaborate divisio scientiarum, a hierarchical classification of knowledge, so that they could better teach basic subject matter before more advanced topics. This made it easier to see how the areas of knowledge were related, but more difficult to find specific information.

During the Enlightenment in France, Denis Diderot and other philosophers collected the vast influx of information from voyages of exploration to Africa, the Pacific, and the New World, along with what they already knew of Europe and the Middle East, in a new Encyclopedia, this time writing short articles on different topics and arranging them in alphabetical order. This made it much easier to look up a specific topic, but much harder to see how individual areas fit together. In the 1950s, the Encyclopedia Britannica tried to remedy this by adding a volume called the Propaedia to their alphabetically-organized encyclopedia set. The Propaedia is a gigantic outline of knowledge, a kind of guide to reading the individual articles in the best order to learn a particular subject.

At the same time that the encyclopedists were trying to deal with general knowledge, mathematicians were trying to deal with growing quantities of numerical information. The first mechanical device for computation was probably the abacus. This aid to arithmetic was popular in China and Japan, and although not widely used, it was known throughout Europe.

In the 17th century, a Scottish mathematician named John Napier realized that the relationships in exponential equations could be calculated and transferred to scales. An exponential equation has the form y = a^x. You have probably calculated the value of y for simple exponents, like 2^3 = 2 × 2 × 2 = 8. Napier was interested in "going the other way" and answering the question: "to what power must 2 be raised in order to get 8?" He came up with the notation log₂ 8 = 3. Then he wondered, "to what power must 2 be raised in order to get 9?" This is a more difficult value to determine, but he recognized that if numbers were plotted on a logarithmic scale rather than an arithmetic scale, it would be possible to calculate exponents and logarithms easily. Following Napier's logic, William Oughtred made the first slide rules to facilitate complex calculations.
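
As a quick check on the arithmetic, here is a minimal Python sketch (standard-library calls only; the sample numbers are just illustrative) showing both directions of Napier's question, and the slide rule's trick of turning multiplication into addition of logarithms:

    import math

    # "To what power must 2 be raised in order to get 8?"
    print(math.log(8, 2))    # 3.0

    # "To what power must 2 be raised in order to get 9?"
    print(math.log(9, 2))    # about 3.17

    # The slide rule's principle: log(a * b) = log(a) + log(b),
    # so multiplying numbers becomes adding lengths on a logarithmic scale.
    a, b = 7, 12
    print(10 ** (math.log10(a) + math.log10(b)))    # about 84.0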

Blaise Pascal, a French philosopher, was also a mathematical genius. He built a calculating device, something like a mechanically-driven abacus, that allowed him to add and subtract numbers quickly, and to multiply and divide by repeated addition and subtraction.

[Image: Pascal's Machine]

Using the same kind of mechanical logic, Joseph Marie Jacquard, a French textile manufacturer, created a mechanical loom capable of producing patterned silk based on instructions encoded in punched cards. This was probably the first "real-time computing system," in which a complex machine is driven by interchangeable sets of instructions. In the early 19th century, the power for these machines came from steam-driven engines, but towards the end of the century, more machines were driven by electrical power.

Charles Babbage, an English Fellow of the Royal Society, tried to adapt Pascal's mechanism to solve more complex mathematical problems than just adding and subtracting. His "difference engines" used iterative addition and subtraction to perform multiplication and division. For example, to solve x = 30/5, he would count the number of times he could subtract 5 from 30, arriving at 6 as the answer.
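
A short Python sketch of that iterative idea (an illustration of the method, not of Babbage's actual mechanism):

    def divide_by_subtraction(dividend, divisor):
        """Divide by counting how many times divisor can be subtracted."""
        count = 0
        while dividend >= divisor:
            dividend -= divisor
            count += 1
        return count    # whatever remains in dividend is the remainder

    print(divide_by_subtraction(30, 5))    # 6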

Although the British government gave Babbage ample funds to build his machine, he was not successful, because the machine required around 25,000 working parts. His second design was also never completed in his lifetime, but a working model was actually built at the London Science Museum over 150 years after it was originally designed.

Despite these failures, Babbage attempted to design a more general "analytical engine" that would use punched cards, like the Jacquard looms, so that different problems could be solved. Babbage envisioned instructions that would use iterative looping (using the results of one calculation to perform the next calculation), sequential control, and branching (choosing the next step on the basis of the results of the current step).

What Babbage lacked was a logic of pure mathematics, an understanding of how numbers are related. That mathematical system was the product of George Boole's investigations into logic and human reasoning. Boole toyed with the concept of a mathematical system with only two quantities, 0 and 1, something akin to Aristotle's idea of "A" and "not-A". On this basis he built an entire system of algebra, but his work was published too late for Babbage to use it in the design of the "analytical engine".
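
A small Python sketch of the kind of two-valued algebra Boole described (Python's operators stand in for Boole's notation; the restriction to the two quantities 0 and 1 is the point):

    A, B = 1, 0    # Boole's two quantities: 1 ("A") and 0 ("not-A")

    print(A and B)         # 0 : true only if both are true (AND)
    print(A or B)          # 1 : true if either is true (OR)
    print(1 - A)           # 0 : negation, "not-A"

    # In Boole's algebra, AND behaves like multiplication of 0s and 1s,
    # and OR like addition capped at 1:
    print(A * B)           # 0, the same as A and B
    print(min(A + B, 1))   # 1, the same as A or B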

The Twentieth Century and Electronic Computers

Jacquard's looms gave Herman Hollerith an idea. Hollerith was a German-American engineer who worked on the 1880 US Census. All of the data collection was done by hand (as much still is), but so was all of the data correlation, and all of the calculations used to add up the number of people in a given Congressional district. Accuracy and timeliness were critical, since the Constitution of the United States requires that representatives come from districts apportioned by population, and key political decisions could depend on how the districts were allocated. But it took nearly ten years to analyze all the data from the 1880 census, and there was no way to guarantee the accuracy of the calculations.

Hollerith decided that he could build a machine using instructions on punched cards, like Jacquard's loom, but for processing information instead of weaving silk. His designs won the competition to support the U.S. Census for 1890, and he produced a complete count in three months instead of ten years. In his patent application for the machine (granted in 1889), he wrote:

The herein described method of compiling statistics which consists in recording separate statistical items pertaining to the individual by holes or combinations of holes punched in sheets of electrically non-conducting material, and bearing a specific relation to each other and to a standard, and then counting or tallying such statistical items separately or in combination by means of mechanical counters operated by electro-magnets the circuits through which are controlled by the perforated sheets, substantially as and for the purpose set forth.

— (Patent No. 395,782)

Hollerith added an automatic card feed to his mechanism, and created a key punch machine to make the cards. In 1906 he built a new machine called the Type 1 Tabulator. This machine had a wiring panel that could be removed and rewired to do different tasks. In 1911, he merged his Tabulating Machine Company with three other companies to form the Computing Tabulating Recording Corporation. CTR made time clocks that employees could punch to record their time on the job. Expanding the vision of what CTR could do, a new president of the company, Thomas Watson, renamed it International Business Machines in 1924. Although Hollerith didn't make the conceptual connection, his punched cards, with their holes punched or not punched, were perfectly suited to implementing Boolean logic to perform calculations and determine states on which to base branching decisions.

While Hollerith was making progress in building electromechanical calculating devices to process census data and keep track of employee hours, some mathematical engineers began to consider how these devices could be made to do other kinds of work. At Harvard University, Howard Aiken decided to complete his graduate work in differential equations by creating a machine, based on Charles Babbage's difference engine, to do the thousands of cumbersome complex calculations he required. He took his proposal to the math department at Harvard, which realized that it did not have the ability to build the engine, so it contracted with IBM. His proposal also drew the attention of the United States Navy, then heavily engaged in the war in the Pacific, which sent Grace Hopper to work with Aiken in designing and building the machine. The Mark I went into operation in 1944, using engineering techniques developed at IBM, mathematical principles developed by Aiken, and the rudiments of a computer programming language developed by Grace Hopper. Excited by their work, IBM funded a sequence of projects so that Aiken and Hopper could build three more machines, developing new hardware for each one, including magnetic drum storage for the Mark IV.

At the end of the research project, Hopper (now retired) worked for Remington-Rand on the new Univac computer, writing the first compiler, craftily named "Compiler A", which could translate assembly language into machine code. Hopper believed that programs could be written in a language close to English, and directed the committee that eventually produced the COBOL programming language. Called back into active duty, she helped the Navy test programs in FORTRAN and COBOL and served as a consultant to Digital Equipment Corporation (DEC).

While Aiken and IBM were building the machines, and Hopper was inventing a language, mathematicians such as John von Neumann, Alan Turing, and (later) Donald Knuth were determining methods to program computers. Von Neumann applied set theory to programming techniques to reduce the number of instructions a computer had to perform when doing thousands of iterative (repeated) calculations. His ideas were critical to the success of the Manhattan Project, the U.S. Government's project to build a nuclear weapon. Alan Turing was a British mathematician who performed critical cryptanalysis work for the British government during World War II, and was instrumental in cracking the German Enigma codes. Turing defined a test which has become a standard in artificial intelligence: a machine is "intelligent" if a trained investigator cannot tell whether he is talking to a person or a program. Donald Knuth devised methods of analyzing mathematical algorithms, or ways of solving problems, so that they could be translated into program instructions, a fundamental requirement for writing programs and the compilers to interpret them.

Personal Computers

By the 1970s, computers were in widespread use not only for military applications, but also for banking and insurance functions, where many thousands of rapid calculations are required just to maintain the day-to-day business. These computers were so large, and required so much expert maintenance, that companies like IBM and DEC did not sell their computers, but rented them to their customers. Such equipment was, of course, far too expensive for most individuals.

That didn't stop Ed Roberts, who ran a small business in New Mexico. Micro Instrumentation Telemetry Systems sold kits for assembling electronic devices, and was facing stiff competition in the calculator business from a rival company called Texas Instruments. In 1974, Roberts put together a kit for assembling a home computer based on a new chip from Intel, the 8080. The kit was the Altair (named after a destination in a Star Trek episode), and it became extremely popular, despite the fact that it had no keyboard, no video display, and only 256 bytes of memory. Two young programmers, William Gates (then a Harvard student) and Paul Allen, called Roberts to tell him they had a program that could run the BASIC programming language on the Altair (they didn't actually have it written; they had only the basic architecture designed). When Roberts agreed to a demonstration, the two sat down and wrote the software in about six weeks, flew to New Mexico, and demonstrated the program. When Roberts approved the program, Gates and Allen formed Microsoft to sell "MBasic" for programming small computers. Roberts also engineered the first standards for recording data to cassette tapes.

Atari began building arcade games that used more sophisticated computer graphics but were not programmable by their owners. Steve Wozniak (working for Hewlett-Packard) and Steve Jobs (working for Atari on computer games) were two old high school buddies who put their engineering experience together to build their own computer, the Apple I. After making some improvements, they put the Apple II, with the first personal computer graphics interface, on the market in 1977. Banking on increasing miniaturization of computer components, drives, memory, and screens, Lee Felsenstein and Adam Osborne produced the Osborne I, the first portable computer, which they marketed in 1981. The Osborne I also introduced the idea of "bundling" hardware and software with the computer itself; it included a built-in screen, two drives, an MBasic compiler, a word processor (WordStar 3), and a spreadsheet (SuperCalc), and ran the original microcomputing operating system CP/M (Control Program/Monitor).

The Internet and the Web

With the growing number of computers at different government, military, and university sites, a new problem arose: how could computers easily exchange information? The first step was to create a physical connection between the different computers. Work at MIT and the Rand Corporation (a think tank in Santa Monica, California) envisioned a network of computers, all talking to each other. The first problem was to create a kind of "traffic control" for bits of information. The solution was the idea of packet switching: each packet carried the address of its destination as the first part of the information stream. When it arrived at a physical phone system switch, it would be forwarded toward the correct destination along a path that might take it through multiple switches. The Defense Advanced Research Projects Agency of the United States Department of Defense created a network, ARPANET, using packet switching to connect computers in the Defense Department's programs.
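
A toy Python sketch of that idea (the switch names and routing tables are invented for illustration): each switch reads only the destination address on the packet and forwards it one hop closer.

    # Hypothetical network: each switch knows the next hop toward each destination.
    routing_tables = {
        "switch_A": {"host_X": "switch_B", "host_Y": "switch_C"},
        "switch_B": {"host_X": "host_X",   "host_Y": "switch_C"},
        "switch_C": {"host_X": "switch_B", "host_Y": "host_Y"},
    }

    def route(packet, current="switch_A"):
        """Forward a packet switch by switch until it reaches its destination."""
        path = [current]
        while current != packet["destination"]:
            current = routing_tables[current][packet["destination"]]
            path.append(current)
        return path

    packet = {"destination": "host_Y", "payload": "hello"}
    print(route(packet))    # ['switch_A', 'switch_C', 'host_Y']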

[Image: ARPANET, 1977]

As the network grew in complexity, it became important to split off different groups of computers and establish gateways through which an internal cluster of computers could access the main connection to the ARPANET "backbone". All the organizations had to agree to use the same communications protocols: the Transmission Control Protocol (TCP) and the Internet Protocol (IP). IP addresses use a four-part numerical code (like 204.57.184.211) to identify the location of a particular machine on a network. This early standardization is what made the Internet possible. New groups joined the network, with universities forming USENET, and other countries forming internal networks with gateways to the established ARPANET or USENET networks.
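
A short Python sketch of what that four-part code really is: four numbers from 0 to 255, packed into a single 32-bit machine address (the function name is just illustrative):

    def ip_to_int(address):
        """Pack a dotted-quad IPv4 address into its 32-bit integer form."""
        parts = [int(p) for p in address.split(".")]
        assert len(parts) == 4 and all(0 <= p <= 255 for p in parts)
        return (parts[0] << 24) | (parts[1] << 16) | (parts[2] << 8) | parts[3]

    print(ip_to_int("204.57.184.211"))    # 3426334931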

In 1984, the number of hosts on the network reached 1000 machines. In order to better manage IP addresses, ARPA created the Domain Name System (DNS), a method of mapping conventional names (like www.scholarsonline.org) to IP addresses, and the network split into a military component (MILNET) and a general component (ARPANET). As use of the network increased, new hardware and software methods were developed to handle the load, and new methods of exchanging information appeared, including a protocol for exchanging messages instantly between two computers, Internet Relay Chat (IRC). Social components started showing up on the network, because people don't just do work when they are together; they also exchange jokes (rec.humor), compare movies (rec.arts.movies) and books (rec.sf-lovers). By 1990, the U.S. Government suspended support for the ARPANET project: the international network (Internet) had become self-sustaining.
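
You can watch DNS perform this name-to-number translation yourself, using Python's standard library (a sketch; the address returned depends on the network you run it from):

    import socket

    # Ask the system resolver (and ultimately DNS) for the address behind a name.
    print(socket.gethostbyname("www.scholarsonline.org"))
    # prints a dotted-quad IP address, like the example above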

During the 20th century, as we shall see, the standard practice for publishing new scientific discoveries began with submitting the experiment report to a journal, where it was sent out to other scientists for review. Publishing the results of an important discovery could take two to three years, while in the meantime, others might waste resources by attempting to perform the same research. It was easy to save a report on a server on the Internet, but hard for other researchers to find and read it.

To solve these problems, physicists at CERN, the European Organization for Nuclear Research, created a language called HyperText Markup Language (HTML), which allowed them to display information with simple character-string tags to set up paragraphs, italics, boldface, colors, and formulae. The collection of HTML documents, made available through browser programs that interpreted the HTML instructions to display each document, became a "web" of knowledge sitting on servers across the international network (Internet). Files could be identified and located by giving them a Uniform Resource Locator (URL), which identifies the server by its domain name and gives the server's hosting program instructions on how to find the file itself.
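
A minimal sketch of such a document (the content is invented, but the tags for paragraphs, italics, and boldface are real HTML):

    <html>
      <body>
        <p>This is a paragraph.</p>
        <p>Tags mark <i>italics</i>, <b>boldface</b>, and other formatting;
           the browser, not the author, decides exactly how to draw them.</p>
      </body>
    </html>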

For example, the URL

"https://www.dorthonion.com/drcmcm/NATURAL_SCIENCE2/Lessons/Lectures/wk66_TechRev/NSH.php"

tells your browser to ask its Domain Name Server for the IP address of www.dorthonion.com, and then to send your request (along with your own IP address) to that machine. When the web server running on www.dorthonion.com sees the request, it looks for the file "drcmcm/NATURAL_SCIENCE2/Lessons/Lectures/wk66_TechRev/NSH.php" in the appropriate part of its web-accessible content. Since this is a PHP file, the web server reads the file, executing the PHP programming instructions and creating an HTML page, which it sends back to your computer using the IP address your browser included in its request. Your browser then follows the HTML instructions to display the page. [If you want to see what your browser gets, use the "View Source" option, usually under "View" in your browser's tool bar.]
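
You can replay the whole round trip (DNS lookup, request, and returned HTML) from Python's standard library; this sketch assumes the server above is reachable from your machine:

    import urllib.request

    url = ("https://www.dorthonion.com/drcmcm/NATURAL_SCIENCE2"
           "/Lessons/Lectures/wk66_TechRev/NSH.php")

    # urlopen looks up the server's IP address, connects, and sends the request;
    # the server runs the PHP and returns the finished HTML page.
    with urllib.request.urlopen(url) as response:
        html = response.read().decode("utf-8", errors="replace")

    print(html[:200])    # the first HTML instructions, as in "View Source"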

Voilà! You are now not only reading about the Information Age; you are part of the story!

Study/Discussion Questions:

On your own