The computer systems we use today are the result of 50 years of evolution. Computers have come a long way, from machines as simple as the abacus to machines as complex as IBM and Apple servers. However, the foundation of modern computer systems was laid during the 80s and 90s. This was the era in which Apple and IBM produced desktop systems and Microsoft began releasing Windows-based GUI operating systems.
In the 80s, computers were fairly simple, with processors running at around 4.77 MHz and as little as 2 KB of RAM. Programmers had to enter programs line by line, often with no convenient way of saving their work, and debugging was one heck of a task. Over the course of the decade, however, these problems would be solved through significant advances in microprocessors, leading directly to the microcomputer, or personal computer, revolution.
In the 80s, IBM was the prime computer manufacturer, with system prices hovering around $1,000. The Commodore 64, released by Commodore in 1982, was one of the first truly powerful affordable machines of that era. It would be unfair not to mention Apple as well; Apple systems were used mostly in schools back in those days.
The company’s 1983 release of the Apple Lisa marked something significant in the computer world that would be copied by rivals: the Lisa was the first commercially sold personal computer with a graphical user interface (GUI). It was expensive, however, so it was not very successful in terms of sales.
With new microprocessors available, Apple went on to release the first Macintosh desktop and then, in 1989, the battery-powered Macintosh Portable, a predecessor of our current laptops. Around the same time, Intel introduced the Touchstone Delta supercomputer, which demonstrated just how far microprocessor technology had advanced.
This supercomputer had 512 microprocessors and was used in projects such as real-time processing of satellite images and building molecular models for various fields of research. On the software side, neither Apple's operating system nor DOS enjoyed universal success in those days: Apple's OS could run only on Apple hardware, while IBM's machines typically ran DOS and BASIC programs.
Come the 90s, the computer software and hardware industries really started to take off. In the early 90s, the Intel 286 and 386 ruled the hardware market, and Microsoft's Windows 3.1, running on top of DOS, had taken over as the most successful and widely used OS. With decent hardware and a sound knowledge of DOS, a user could perform most basic to advanced tasks. DOS could run only one program at a time, while Windows provided cooperative multitasking: in practice, pausing one task and starting another. A 14.4K modem over a landline phone was used to connect to the internet.
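The cooperative model used by Windows 3.x can be illustrated with a toy scheduler. This is a minimal sketch in modern Python (not period-accurate code): each task runs until it voluntarily yields, and the scheduler resumes the next one in round-robin order.

```python
# Toy cooperative multitasking: each task must voluntarily yield control,
# mirroring how Windows 3.x applications had to give up the CPU themselves.
def task(name, steps):
    for i in range(steps):
        # Each yield hands control back to the scheduler.
        yield f"{name}: step {i}"

def run_cooperatively(tasks):
    """Round-robin over tasks until all have finished; returns the run log."""
    queue = list(tasks)
    log = []
    while queue:
        current = queue.pop(0)
        try:
            log.append(next(current))  # resume the task until its next yield
            queue.append(current)      # re-queue it for another turn
        except StopIteration:
            pass                       # task finished; drop it from the queue

    return log

print(run_cooperatively([task("A", 2), task("B", 2)]))
# → ['A: step 0', 'B: step 0', 'A: step 1', 'B: step 1']
```

The weakness of this scheme is visible in the sketch: a task that never yields would freeze every other task, which is exactly why later operating systems moved to pre-emptive multitasking.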
In the later 90s, Pentium processors were introduced, and with the advent of Windows 95 and later Windows 98, things took a sharp turn. A key development was the Google search engine, which began as a research project by Sergey Brin and Larry Page at Stanford University in 1996.
Multitasking was implemented in its true sense, and operating systems became much more robust. It is safe to say that the astonishing progress we see today in computer software and hardware is the result of seeds sown in the 80s and 90s.
If the foundations of current computer systems were laid during the 80s and 90s, then the years following the turn of the millennium firmly established them. Most people now have a laptop, a tablet, a smartphone, or all three.
A milestone of the modern era came in 2001, when Apple revealed the Mac OS X operating system, with features such as pre-emptive multitasking and a protected memory architecture. The same year, Microsoft launched Windows XP with its new graphical user interface. Two years later, AMD's Athlon 64, the first 64-bit processor aimed at consumer desktops, became available on the global market.
In 2005, Google acquired Android, the maker of a Linux-based mobile phone operating system. In 2006, Apple reached another milestone by releasing the MacBook Pro and an Intel-based iMac; the MacBook Pro was Apple's first Intel-based, dual-core mobile computer. The next year, Apple capitalized on its momentum and released the iPhone, with the advertised purpose of bringing computer functionality to the smartphone.
Microsoft launched Windows 7 in 2009. Windows 7 offered the ability to pin applications to the taskbar as well as improvements in touch and handwriting recognition, and it was generally well received as a successor to Windows Vista.
Between 2010 and 2015, tablet computers became nearly ubiquitous after Apple unveiled the iPad in 2010, sparking a change in how consumers viewed the tablet. With improved technology and loaded features, it quickly became a must-have and changed the way consumers consumed media. Google released its first Chromebook, a laptop running Google's Chrome OS, in 2011.
Apple tried to be a forerunner once again, this time in the wearable devices market, by launching the Apple Watch in 2015; the product was once again a success. Microsoft also released Windows 10 in 2015.
Major advancements are made between 2015 and our current time, some of these are on par with science fiction. For instance, the first reprogrammable quantum computer was created in 2016 with the capability to process new algorithms into their system. Also he Defense Advanced Research Projects Agency (DARPA) starts developing a new “Molecular Informatics” program that uses molecules as computers in 2017.