
A Brief History of Information Technology

YEAR DESCRIPTION
1957 Planar transistor developed by Jean Hoerni
With this technology the integrated circuit became a reality. The planar process diffuses selected impurity atoms into an otherwise pure piece of silicon; these impurities, or dopants, create the conducting and control structures of the transistors on the chip. Microscopic circuits could now be laid out on the flat silicon surface, allowing them to be compacted onto integrated circuits.
1958 First integrated circuit
In 1957, a group of eight electronics engineers and physicists formed Fairchild Semiconductor. The next year, Jack Kilby of Texas Instruments demonstrated the first working integrated circuit; shortly afterward, Fairchild co-founder Robert Noyce independently produced a planar integrated circuit that was practical to manufacture commercially.
1960's ARPANET developed by the U. S. Department of Defense
Originally intended as a network of government, university, research, and scientific computers, the Advanced Research Projects Agency NETwork was designed to enable researchers to share information. This government project eventually grew into the Internet as we know it today. According to the popular account, the networking technology and topology were designed to survive a nuclear attack; this was during the Cold War era, when many scientists expected that the USA would someday be subject to such an attack. The design required that the network route traffic and data flow around any damage. This robustness enabled the Internet to grow at incredible speed, until today it serves up billions of web pages.
1962 Licklider's "Galactic Network" concept
The first recorded description of the social interactions that could be enabled through networking was a series of memos written by J.C.R. Licklider of MIT in August 1962 discussing his "Galactic Network" concept. He envisioned a globally interconnected set of computers through which everyone could quickly access data and programs from any site. In spirit, the concept was very much like the Internet of today. Licklider was the first head of the computer research program at DARPA, starting in October 1962. While at DARPA he convinced his successors there, Ivan Sutherland and Bob Taylor, and MIT researcher Lawrence G. Roberts, of the importance of this networking concept.
1969 UNIX Operating System Developed
Developed at AT&T's Bell Labs by engineers Ken Thompson and Dennis Ritchie, UNIX was an early minicomputer operating system capable of multitasking and, eventually, networking. It was soon rewritten in the C programming language - then a new high-level language offering power and flexibility. Other operating systems of the day were usually written in assembly language for speed and efficiency, but C proved a natural language for writing an operating system. Today, both C and UNIX are available for a wider variety of computer hardware platforms than any other programming language or operating system, and this portability keeps UNIX popular even still.
1971 First microprocessor chip
Robert Noyce and Gordon Moore founded Intel in 1968 to produce computer memory chips, with Andrew Grove joining them at the company's start. In 1971, the 4004 microprocessor chip, designed by a team under the leadership of Federico Faggin, was introduced to replace the central processing units that had heretofore been constructed from discrete components. The microprocessor chip was born. Intel's later products, from the 8080 through the 8088 to the Pentium 4, all descended from the 4004.
1972 Optical laserdisc
Back in 1972, music was sold on vinyl records - large platters with spiral grooves cut in them. The music was stored in the grooves by controlling the depth and direction of the cutting machine, but the grooves eventually wore, resulting in decreased fidelity. Philips created the laserdisc to correct this problem. Instead of grooves, microscopic pits were pressed into a reflective surface; a laser beam either reflected off the flat surface or was scattered by a pit, so nothing touched the disc and nothing wore out. The early laserdiscs were the same size and shape as vinyl records, but they could hold both video and audio on their reflective plastic platter. The information had to be read by a laserdisc player, which was initially expensive, but in time this became a popular medium for home movies.
1974 Motorola microprocessor chip
Motorola's 6800 was the forerunner of the 68000, the chip used in the original Macintosh computer system, where it provided the horsepower to run a graphical user interface, or GUI. Although the Intel microprocessor line would come to dominate desktop computing, Apple stayed with Motorola's 68000 family and then with the PowerPC chips Motorola co-developed with Apple and IBM.
1975 Altair Microcomputer Kit
The Altair was the first personal computer available to the general public; it made the cover of Popular Electronics in January 1975 and was the first computer marketed to the home enthusiast. It came as a kit, so it was most suitable for people with electrical engineering backgrounds. The front panel consisted of a row of toggle switches and small, red light-emitting diodes, and the user could list and run programs written in machine language. The program listing and the results of a run were read off this display as binary numbers; it was up to the programmer to interpret them. The programmer loaded a new program by setting the switches for each machine-language code and depositing the binary number into memory at a given location. Needless to say, this was a time-consuming process, but it represented the first time that home enthusiasts could get their hands on real computer hardware.
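As a rough illustration of that front-panel workflow, here is a minimal Python sketch (purely hypothetical - the real Altair 8800 used toggle switches wired to an Intel 8080 bus, not software) of depositing switch settings into memory and reading results back as LED patterns:

    # Toy model of an Altair-style front panel: deposit binary switch
    # settings into memory, then read a location back as an LED pattern.
    memory = [0] * 256  # a tiny 256-byte memory

    def deposit(address, switches):
        """Store the 8-bit value encoded by the toggle switches."""
        memory[address] = int(switches, 2)

    def examine(address):
        """Return the LED pattern (a binary string) at a location."""
        return format(memory[address], "08b")

    deposit(0, "00111110")  # one instruction byte, entered by hand
    deposit(1, "00101010")  # its operand, entered the same way
    print(examine(1))       # the programmer reads back: 00101010

Entering even a short program this way meant dozens of switch settings, one byte at a time, which is why the process was so time-consuming.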
1977 Radio Shack introduces the first pre-built personal computer with built-in keyboard and display
This was the first non-kit personal computer to be marketed to the general public. In 1977, Brad Roberts bought one of these Tandy/Radio Shack computers, known as the TRS-80. It came with a simple cassette tape player for loading and saving programs, which allowed Brad to do word processing using programs like CopyArt. It also produced a revolution in thinking that gradually took hold and gained momentum during the next decade. No longer would the computer be seen as an expensive mathematical tool of large scientific, military, and business institutions, but as a communication and information management tool accessible to everyone.
1977 Apple Computer begins delivery of the Apple II computer
The Apple II came fully assembled, with a built-in keyboard and operating system software in ROM; it connected to an ordinary monitor or television set. The first Apple IIs used a cassette tape to store programs, but a floppy disk drive was soon available. With the ease the floppy disk brought to storing and running programs, the Apple II became the first computer suitable for use in elementary school classrooms.
1984 Apple Macintosh computer
The Macintosh was the first commercially successful computer to come with a graphical user interface and a mouse pointing device as standard equipment. With the coming of the Mac, the personal microcomputer began to undergo a major revolution in its purpose and use. No longer a tool for just scientists, bankers, and engineers, the microcomputer became the tool of choice for many graphic artists, teachers, instructional designers, librarians, and information managers. Its basic metaphor of a user desktop, with its little folders and paper documents, hit home with these users, many of whom had never seen a big computer mainframe. The Macintosh would eventually develop standardized symbols for use by humans in communicating with the machine and ultimately contribute to the World Wide Web's metaphor of a virtual world. The Macintosh GUI also paved the way for the development of multimedia applications: the hardware obstacles that had prevented hypermedia from becoming a reality were no more.
Mid 1980's Artificial intelligence develops as a separate discipline from information science.
Artificial Intelligence (AI) is a broad field that covers many areas. With the development of computer programming involving ever-increasing levels of complexity, inheritance, and code re-use, culminating in object-oriented programming, the software foundations for AI were laid. Other developments in cybernetics, neural networks, and human psychology added their contributions. Some practical but as yet imperfect implementations of AI include expert systems, management information systems (MIS), information searching using fuzzy logic, and human speech recognition. Artificial Intelligence today is best defined as a collection of electronic information processing tools that can be applied in a myriad of innovative ways to existing information technologies. Many scientists doubt that a machine can ever be built to replicate the human mind and emotions; rather, AI will be used to do more and more of the tedious labor of finding and presenting the appropriate information in humanity's vast, ever-growing collection of data.
1987 Hypercard developed
In August of 1987, Apple Computer introduced HyperCard to the public by bundling it with all new Macintosh computers. Hypermedia was a reality at last, with the hardware and software now in place to bring it into being. HyperCard made hypertext linking possible for the average person, who could build an information network out of any electronic documents entered or pasted into a HyperCard stack. Based on the metaphor of index cards in a recipe box, it was easy enough for even young students to use, yet powerful enough to become the software tool used to create the Voyager educational multimedia titles. HyperCard also had provisions for displaying graphics and controlling an external video device - ideally a laserdisc player.
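The card-and-link model is simple enough to sketch in a few lines. The following Python fragment is an illustration only (HyperCard itself was scripted in its own HyperTalk language, not Python): a stack holds cards, and a button on one card links to another.

    # Minimal card-and-link model in the spirit of a HyperCard stack.
    class Card:
        def __init__(self, name, text):
            self.name = name
            self.text = text
            self.links = {}  # button label -> destination Card

        def link(self, label, destination):
            self.links[label] = destination

    index = Card("Recipes", "Index of family recipes")
    soup = Card("Soup", "Stock, vegetables, simmer one hour")
    index.link("Go to Soup", soup)

    # Clicking the button follows the link to the destination card.
    print(index.links["Go to Soup"].text)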
1991 450 complete works of literature on one CD-ROM
In 1991, two major commercial events took place which put the power of CD-ROM storage technology and computer-based search engines in the hands of ordinary people. World Library Incorporated produced a fully searchable CD-ROM containing 450 (later expanded to 953) classical works of literature and historic documents, demonstrating the power of the CD-ROM to take the text content of several bookshelves and concentrate it on one small piece of circular plastic. The other product was the electronic version of Grolier's Encyclopedia, which actually contained a few pictures in addition to text. Both products were originally marketed through the Bureau of Electronic Publishing, a distributor of CD-ROM products. Many saw this as the ultimate in personal data storage and retrieval; they didn't have to wait long for much greater things in the world of multimedia. Though both titles initially sold for several hundred dollars, by 1994 they could be found at electronics flea markets selling for a dollar or two each. Technological advances had occurred so rapidly in this area that the Multimedia PC standard and the Macintosh multimedia system extensions made both products obsolete within a couple of years.
1991 PowerPC announced
Working together, Motorola, Apple, and IBM announced an alliance in 1991 to develop the PowerPC RISC processor for Apple Computer's new Power Macintosh line. The family grew to include the 601, 603, and 604 microprocessors. These chips are designed around a reduced instruction set computing (RISC) architecture, intended to produce more compact, faster-executing code - an assertion devotees of Intel's CISC chip architecture heartily dispute. The result is that the consumer benefits from the intense competition to develop a better computer chip.
1991 The World Wide Web is born
The World Wide Web was introduced by Tim Berners-Lee, with assistance from Robert Cailliau, while both were working at CERN. Berners-Lee saw the need for a standard linked information system accessible across the range of different computers in use. It had to be simple, so that it could work on both dumb terminals and high-end graphical X Window platforms. He got some pages up and was able to access them with his 'browser'.
1993 Internet access and usage grow exponentially, as tools become more available and easier to use. People begin referring to the Internet as the information superhighway.
1995 Term Internet is formally defined
On October 24, 1995, the Federal Networking Council (FNC) unanimously passed a resolution defining the term Internet. This definition was developed in consultation with members of the Internet and intellectual property rights communities.

RESOLUTION: The Federal Networking Council (FNC) agrees that the following language reflects our definition of the term "Internet". "Internet" refers to the global information system that --
(i) is logically linked together by a globally unique address space based on the Internet Protocol (IP) or its subsequent extensions/follow-ons;
(ii) is able to support communications using the Transmission Control Protocol/Internet Protocol (TCP/IP) suite or its subsequent extensions/follow-ons, and/or other IP-compatible protocols; and
(iii) provides, uses or makes accessible, either publicly or privately, high level services layered on the communications and related infrastructure described herein.
1995 CD-ROM capacity increases
A popular claim of the day held that a handful of CD-ROM discs had the capacity to store all the knowledge and memories of an average person's lifetime.
1999 Y2K Bug Feared
When computers were first built, memory was a precious resource. To conserve it, dates were stored in a compressed form that used every bit (a bit being a single binary digit, 1 or 0). Not surprisingly, years were stored as two decimal digits, 00 through 99. As the end of the millennium approached, fears arose as to what would happen to computer systems when the year 2000 started. Early tests showed that many computers improperly handled the transition from 1999 to 2000, so this became known as the Y2K bug.
A massive effort was undertaken to avert this doomsday scenario; people feared that planes would fall out of the sky. Computer source code was reviewed, and fixes were designed for the problem areas. Some were band-aids, simply offsetting the stored date by, say, 50 years; others were massive rewrites of source code that had been running successfully for 30 years. Engineers who had worked on that code in the 1960s were called out of retirement.
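The "offsetting" band-aid mentioned above is usually called windowing: a pivot year decides which century a stored two-digit year belongs to. A minimal Python sketch, with the pivot value of 50 assumed purely for illustration:

    # Windowing: expand a stored two-digit year using a pivot.
    PIVOT = 50  # assumed cutoff: 00-49 -> 2000s, 50-99 -> 1900s

    def expand_year(yy):
        """Expand a two-digit year to four digits."""
        return 2000 + yy if yy < PIVOT else 1900 + yy

    # The naive comparison behind the scare: "00" sorts before "99",
    # so the year 2000 looked earlier than 1999 to old code.
    assert 0 < 99
    # With windowing, chronological order is restored.
    assert expand_year(0) > expand_year(99)  # 2000 > 1999

Windowing worked only because the data spanned less than a century; systems with older records needed the full rewrite instead.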
2000 On January 1, 2000, everyone held their breath. Although there were some problems, the general population never saw them. The Y2K worst-case scenario had been averted.
2002 DVD goes mainstream
The Compact Disc was by now in every home, but a CD could hold only megabytes of data - enough for audio, not for full-length video. A newer medium, the Digital Versatile Disc, or DVD, took over the market. The DVD could store video as well as audio, with a capacity measured in gigabytes (4.7 GB on a single layer) against the CD's roughly 700 MB. This technological development made it possible for consumers to buy home movies again. The DVD works like a laserdisc, reading the pits in the media via a laser beam without making physical contact; hence, there is virtually no wear and tear on a DVD.
2003 Broadband takes off
Broadband is the name for high-capacity connections between the home and the public Internet. In 2003, broadband became readily available in most metropolitan areas of the US, making it possible for PC users to download files megabytes in size in just a few minutes, rather than taking hours over a modem connection. This rapid increase in capacity enabled all sorts of new applications, including music file sharing.
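The difference is easy to check with back-of-the-envelope arithmetic. The Python sketch below assumes a 50 MB file, a 56 kbps modem, and a 3 Mbps broadband tier; the specific numbers are illustrative, not a claim about any particular service.

    # Rough download-time comparison: dial-up modem vs. early broadband.
    file_bits = 50 * 8 * 1_000_000   # a 50 MB file, in bits

    modem_bps = 56_000               # 56 kbps dial-up
    broadband_bps = 3_000_000        # 3 Mbps cable/DSL

    print(f"Modem:     {file_bits / modem_bps / 3600:.1f} hours")
    print(f"Broadband: {file_bits / broadband_bps / 60:.1f} minutes")
    # Modem: ~2.0 hours; Broadband: ~2.2 minutes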
2004 A/V CPU Chips Arrive
PCs have always depended on the integration of circuits - the legacy of the IC chip of 1958. New CPU chips combine the processor with dedicated capabilities for processing audio and video. The result is a new set of computers with built-in support for high-definition television (HDTV) and 7.1 surround sound. This will further reduce costs while yielding even smaller packages.
2005 Blogs Take Off
Blogs are personal web spaces where individuals can share their own thoughts and ideas. Corporations began to incorporate blogs into their networks, allowing employees to use sophisticated web server technology to talk about their work. Unfortunately, this sometimes led to problems, as employees shared more than they were supposed to; but on the whole, most employees found a new, creative outlet for self-expression.
2006 HD DVD vs. Blu-ray
Players and game consoles were introduced for both new high-definition video formats, in a contest reminiscent of the war between VHS and Betamax. Who will win? Only time will tell; but the winning format promises to capture the DVD market for years to come, so billions are at stake. The consumer wins in the long run as the disc takes on additional resolution and quality. The amazing thing is that a single disc in these formats can store 25 GB or more. Still, we are a long way from routinely making computer backups on such media.
2007 Blu-ray Wins
Game consoles finally settled the format war, with Blu-ray winning; Toshiba formally conceded HD DVD in early 2008. Now game developers and players alike can concentrate on the new games that this format allows.
2008 Memory
Memory prices continue to come down, enabling smaller and smaller electronic gadgets. Where will it all end? Perhaps with a postage-stamp-sized memory circuit holding all the memory you will ever need; researchers at IBM are working on just that. Stay tuned.
2009 Smartphones
Smartphones became more and more popular as companies like Apple and Google came on board. In one sense, these devices have become handheld computers, integrating personal information management applications. Besides business applications like email and instant messaging, these phones now provide entertainment. As 3G technologies grow in popularity, data traffic also increases - to the point of becoming a problem for the major network carriers. The cell phone has become the indispensable device that everyone has and uses everywhere.
2010 Social Networking
Social networking sites really take off in popularity. Who would have thought that something as simple as uploading a picture to a web site could become so popular? Yet millions are doing just that on sites like Facebook, while others broadcast their every move on services like Twitter so their friends can find them wherever they are.
2011 Tablets Take Off
The Apple iPad raises the bar for functionality and style in tablet computers. These handheld devices, about the size of a book, can access the Internet, display the pages of thousands of books, play music, and play videos. Apps become ubiquitous.
2012 Storage Explodes
Hard disk storage finally exceeds a terabyte (TB) per drive, driving the cost per GB below $1. Computers now seem to come with more storage than anyone can use.
2013 Internet Of Things
The physical and digital worlds began to fuse. Real-time traffic reports appeared, based on live feeds from cameras mounted along roadways. Self-driving cars appeared on highways in Nevada. Cutting-edge apps increased the social flow of information for everyone.
2014 Internet Of Everything
The Internet now connects everything from cars to skyscrapers. The challenge is to extend this connectivity in ways that allow people in specific places to share information with their social groups. This immediate flow of information drove social changes such as the Hong Kong protests.
 