The computer systems that you use today are the result of decades of evolutionary success. Computers have evolved from machines as simple as the abacus to machines as complex as IBM and Apple server systems. However, the foundation of current computer systems was laid during the 80s and 90s. This was the era in which Apple and IBM produced desktop systems and Microsoft started releasing Windows-based GUI operating systems.
Building on the success of microprocessors from the 70s, which made computers relatively affordable for personal use, significant strides were made in the 80s. This development led to a considerable spike in the number of people with home computers across America in the earlier part of the decade. Over 600,000 computers were purchased for homes within the country, more than in any other nation in the world. Each system was priced at a little over $500, which was considered cheap at the time.
In the 80s, computers were pretty simple, with 4.7 MHz processors and as little as 2 KB of RAM. Programmers had to write computer programs line by line, and there was no easy way of saving a program. Debugging, too, was one heck of a task. However, throughout the decade these problems were solved through significant advances in microprocessors, leading directly to the microcomputer, or personal computer, revolution.
In the 80s, IBM was the prime computer manufacturer, with system prices hovering around $1,000. The Commodore 64, introduced by Commodore in 1982, was one of the first truly powerful and affordable home machines of the era. It would be unfair not to mention Apple: Apple systems were used mostly in schools back in those days.
The company’s 1983 release of the Apple Lisa marked something significant in the computer world that would be copied by rivals. The Lisa was the first commercially sold personal computer with a graphical user interface (GUI). It was expensive, though, so it was not very successful in terms of sales.
With new microprocessors available, Apple would go on to release the first Macintosh desktop and then, in 1989, the Macintosh Portable, which was battery-powered and a predecessor to today's laptops. Around the same time, Intel was developing the Touchstone Delta supercomputer, which showed just how capable microprocessors had become.
This supercomputer had 512 microprocessors and was used in projects such as real-time processing of satellite images and building molecular models for various fields of research. However, neither Apple's OS nor Microsoft's DOS was very successful in those days: Apple's OS could only run on Apple hardware, while IBM's hardware could run BASIC-language programs.
The Amstrad CPC 464 is also worth mentioning. Created in 1984, it was the company's first try at a home computer. It came with a choice of screens: a green-screen monitor and a color monitor. The green-screen monitor was cheaper at £199 ($285); the color monitor cost £299. An estimated 2 million units of this home computer were sold within Europe, a success that made it the best-performing microcomputer of that period. One of the advantages of this computer was that anybody could easily set up the system.
Come the 90s, the computer software and hardware industries really started to take off. In the early 90s, the 286 and 386 processors were ruling the hardware industry, and Microsoft's Windows 3.1, coupled with DOS, had taken over as the most successful and widely used OS. With good hardware, enough memory, and a sound knowledge of DOS, a user could perform most basic to advanced tasks. Under DOS, only one program could run at a time, while Windows provided cooperative multitasking: one task was paused so another could start. A 14.4K modem over a landline phone was used to connect to the internet.
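That "pause one task, start the other" model is cooperative multitasking: each program must voluntarily hand control back before anything else can run, which is how Windows 3.x-era applications shared the CPU. A minimal sketch of the idea in Python, using generators as toy tasks (the task names and round-robin scheduler are illustrative, not how Windows actually worked internally):

```python
# Cooperative multitasking sketch: each task runs until it
# voluntarily yields. If a task never yields, everyone else starves.
log = []

def task(name, steps):
    """A toy task: do one unit of work, then hand control back."""
    for i in range(steps):
        log.append(f"{name}: step {i}")
        yield  # the voluntary yield point

def run(tasks):
    """Round-robin scheduler: resume each task until all finish."""
    queue = list(tasks)
    while queue:
        current = queue.pop(0)
        try:
            next(current)          # resume until the next yield
            queue.append(current)  # give it another turn later
        except StopIteration:
            pass                   # task finished; drop it

run([task("A", 2), task("B", 2)])
print(log)  # the two tasks interleave: A, B, A, B
```

Commenting out the `yield` would let one task hog the scheduler until it finishes, which is exactly why a single misbehaving program could freeze the whole system in that era.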
In the later ’90s, Pentium processors were introduced, and with the advent of Windows 95 and then later Windows 98, things started to take a sharp turn. A key introduction to the computer world was the development of the Google search engine at Stanford University by Sergey Brin and Larry Page in 1996.
Multitasking was implemented in its essence, and the OSs were much more robust. It is safe to say that the astonishing progress we are seeing today in computer software and hardware is a result of the seeds sown in the 80s and 90s.
If the foundations of current computer systems were laid during the 80s and 90s, then the years following the turn of the millennium firmly established them. Most people now have a laptop, a tablet, a smartphone, or all three.
IBM, on the other hand, had a challenging decade. In the early 90s, the company was unable to innovate with respect to the personal computer. This failure cost it dearly, because people were generally moving toward cheaper personal computers rather than expensive mainframe systems. Nonetheless, IBM was still able to release the ThinkPad 701 in 1995. This popular laptop had an expanding keyboard that was bigger than the frame of the computer; it was nicknamed the "butterfly" due to its shape.
Despite the popularity of IBM's microcomputers, clone laptop makers cut into its revenue. Microsoft and Intel, however, were able to make significant gains from software and chip sales.
Apple’s progress in the 90s wasn’t remarkable either. In fact, the lack of significant improvement carried over from the 80s, and the release of Windows 3.0 in 1990 further amplified Apple's lack of impact. The PowerBook series, released in 1991, did little to change the situation, and Apple didn't make a comeback until the late 90s, when Steve Jobs returned to the company. In 1998, Apple released the iMac G3, which brought it back onto the personal computer scene. The machine had a decent 4GB HDD, a 233MHz processor, 32MB of RAM, and a CD-ROM drive. With a built-in 15-inch monitor, the desktop came in a range of colorful, high-quality finishes.
A milestone in the modern era came in 2001, when Apple revealed the Mac OS X operating system, with features such as pre-emptive multitasking and a protected memory architecture. The same year, Windows XP, with its new graphical user interface, was launched by Microsoft. Two years later, the first consumer 64-bit x86 processor, AMD's Athlon 64, became available on the global market.
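Unlike the cooperative model of the Windows 3.x era, pre-emptive multitasking means the operating system interrupts running programs on a timer; no program has to volunteer to give up the CPU. A rough illustration in Python (the actual scheduling below is done by the host OS's thread scheduler; the workload and names are hypothetical):

```python
import threading

# Pre-emptive multitasking sketch: neither loop ever yields control
# explicitly -- the OS scheduler interrupts and switches the threads
# on its own timer, so both still make progress.
counter = {"a": 0, "b": 0}

def work(key):
    for _ in range(100_000):
        counter[key] += 1  # each thread updates only its own key

ta = threading.Thread(target=work, args=("a",))
tb = threading.Thread(target=work, args=("b",))
ta.start(); tb.start()
ta.join(); tb.join()
print(counter)
```

Because each thread touches only its own key, the final counts are deterministic even though the interleaving chosen by the scheduler is not, which is the whole point: a misbehaving loop can no longer freeze every other program.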
In 2005, Google acquires Android, a Linux-based mobile phone operating system. In 2006, Apple reaches another milestone by releasing the MacBook Pro and an Intel-based iMac; the MacBook Pro is Apple's first Intel-based, dual-core mobile computer. The next year, Apple capitalizes on its momentum and releases the iPhone, with the advertised purpose of bringing computer functionality to the smartphone.
Microsoft launches Windows 7 in 2009. Windows 7 offers the ability to pin applications to the taskbar as well as improvements in touch and handwriting recognition. It is generally a well-received successor to Windows Vista.
Between 2010 and 2015, tablet computers become nearly ubiquitous after Apple unveils the iPad in 2010. With improved technology and loaded features, the tablet quickly becomes a must-have and changes the way consumers view media. Google releases its first Chromebook, a laptop that runs Google's Chrome OS, in 2011.
Apple tries to be a forerunner once again, this time in the wearable devices market, by launching the Apple Watch in 2015. This product is once again successful. Microsoft also releases Windows 10 in 2015.
Major advancements are made between 2015 and our current time, some of them on par with science fiction. For instance, the first reprogrammable quantum computer, capable of running newly loaded algorithms, was created in 2016. Also, in 2017 the Defense Advanced Research Projects Agency (DARPA) starts developing a new "Molecular Informatics" program that uses molecules as computers.
For a while, Intel was the leading CPU manufacturer for consumer PCs and laptops. This came after AMD's CPUs suffered from a series of underperforming designs in the early 2010s, which almost bankrupted AMD.
However, AMD has since overcome those challenges and has now successfully countered Intel’s CPUs.
And unlike in the past, when it focused on offering lower-priced CPUs to compete with Intel, AMD's CPUs now compete with Intel's on performance alone.
In a bid to hold onto its market share, Intel released its 10th-gen "Comet Lake" desktop chips in 2020. The flagship CPU has 10 cores, two more than its predecessor, the 8-core 9900K, but it is still built on the aging 14nm transistor process.
In response, AMD introduced Zen 3, an improved version of its desktop CPU design that supports up to 16 cores. Zen 3 doesn't stand out because of its number of cores, but because of its higher number of instructions per clock (IPC).
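Why IPC matters can be seen with a little arithmetic: single-core throughput is roughly IPC times clock speed, so a core that retires more instructions per cycle wins even with no clock-speed advantage. A quick sketch with hypothetical figures (the numbers below are illustrative, not measured benchmarks of any real chip):

```python
def throughput(ipc, clock_ghz):
    """Approximate instructions retired per second on one core."""
    return ipc * clock_ghz * 1e9

# Two hypothetical cores running at the same 4.7 GHz clock.
old_core = throughput(ipc=1.00, clock_ghz=4.7)
new_core = throughput(ipc=1.19, clock_ghz=4.7)  # ~19% higher IPC

speedup = new_core / old_core
print(f"speedup at equal clocks: {speedup:.2f}x")
```

Since the clock term cancels out, the IPC gain translates directly into the speedup, and it does so without the extra heat and power that chasing higher clocks usually costs.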
Moreover, its power consumption didn't increase, which makes Zen 3 ideal for portable laptops. Even more exciting, it's still built for the same socket design as its predecessor.
And for 2022, AMD is likely to switch to a redesigned motherboard socket as well as a new manufacturing process. It's also likely to add support for high-speed DDR5 memory in the same release, which is quite exciting for PC and laptop gamers.
For its part, Intel's 10th-gen mobile lineup was split between "Comet Lake" chips, built on the older 14nm process, and "Ice Lake" chips, built on a new 10nm architecture. Ice Lake was efficient but still suffered from slow clock speeds.
Intel fixed that shortcoming with "Tiger Lake", its 11th-gen CPU, which offers faster clock speeds and comes with an upgraded integrated GPU capable of playing games such as Overwatch or Fortnite.
As you would expect, Apple has also stirred the market with its M1 chips. These aren't based on any existing laptop or desktop chips; instead, Apple based the design on its iPhone and iPad chips. Their main performance feature is super-fast speeds while consuming just a few watts of power.
And there is speculation that Apple plans to release chips with up to eight times the number of CPU and GPU cores in the M1 chip.
When you also consider that RISC-V Linux systems may offer better performance than existing PC instruction sets, we can expect 2022 and beyond to be exciting times for the consumer computing landscape.