CPU History

Welcome to the CPU Guru tour of the history of CPUs, starting with the earliest Intel processors, right through to today's dual core 64 bit architectures. If stats are your thing, then the CPU comparison table may prove useful. If you have any questions or comments, or spot any errata, then please feel free to send me an email. Enjoy...

In the beginning...

In this section we will look at some of the first processors used in computers. We'll start with the Intel 4004, a 4 bit CPU, and progress until the end of the 16 bit era. Feel free to skip this section and look at 32 bit CPUs.

Intel 4004, 8008 and 8080

Intel 4004

In 1971, Intel released the world's first general-purpose processor: the 4004. I refer to it as 'general-purpose' because it could perform many different instructions, reacting in a pre-designed way to electrical input. Until this time, such a complex single integrated circuit had never been seen. The 4004 was a 4-bit CPU assembled from some 2300 transistors and ran at roughly 0.1 MHz, i.e. 'ticking over' a little over 100 000 times per second. It could perform some 60 thousand instructions per second.

While not sophisticated enough to run a real 'computer', the purpose of the 4004 was to drive the Busicom calculator. This was the first time all the functions of a calculator had been provided by a single chip. Hence it was described by Electronics magazine as the 'calculator on a chip'.

One year later, Intel released the 8008, an 8-bit version of the 4004 that ran at nearly twice the clockspeed. With 8 bits to work with, the microprocessor could expand beyond the realms of calculator-like functionality. It had a 14-bit address bus, allowing it to access 16kB of memory.

This chip was the centre of what some consider to be the world's first 'home computer': the Mark-8, designed by Jon Titus. However, such a machine was very difficult to assemble, expensive, had limited application and was not designed for a large market.

In 1974, Intel released the 8080. This was also an 8-bit microprocessor, but was built from nearly twice as many transistors as the 8008. It had a 16-bit address bus and ran at 2MHz, several times the clockspeed of the 8008, resulting in benchmarks of up to 1 million instructions per second.

Initially, this processor found a home as a traffic-light controller! However, it is more famous for being the processor of the world's first true personal computer: the MITS Altair.

The 16-bit address bus of the 8080 allowed it to access 64kB of memory, which was certainly a lot in those days. Those first 'PCs' usually had far less memory than the address bus allowed. To make use of this available memory, a suitable operating system was required, and the OS of choice turned out to be Control Program for Microcomputers, aka CP/M-80, created by Digital Research Inc.

Intel 8086, 8088 and 8087

1978 was a pivotal time for Intel when they released the 16-bit 8086. This pure CISC microprocessor was a huge evolutionary step over previous chips. It was assembled from 29000 transistors, initially ran at 5MHz (later versions reached 10MHz), had a 16-bit data bus, and with 20-bit address pathways could access a whopping one megabyte of memory! This was a huge leap over the 8080 which, with its 16-bit address bus, could only see 64kB.

Furthermore, the 8086 instruction set proved so complete and robust that it has remained at the core of the instruction set of every x86 CPU for the last twenty years!

Intel 8086

One year later, Intel released a variant of the 8086 called the 8088. While still running at the same clockspeed and with the same capacity for memory addressing, this processor had only an 8-bit data bus. This made sense at the time: 8-bit support chips and peripherals were still cheap and plentiful, while 16-bit parts were not.

Intel 8088

It turned out that releasing this 'cut-down' version was a very good move by Intel. The 8088 was subsequently chosen by IBM to be the core of their IBM PC. This PC proved extremely popular, immediately making Intel and IBM two of the largest companies in the world. The IBM PC set the standard for PCs to come, and that standard has remained in place, to some degree, until today. (Although that is not necessarily a good thing...)

Compared to modern CPUs, these chips were pretty simple. There was only one integer unit (IU), and therefore no such thing as superscalar architecture. There was no floating point unit (FPU), so floating point numbers (i.e. numbers with a decimal portion) had to be manipulated by splitting them up and using integer operations only. To circumvent this problem, an optional math coprocessor (the 8087) could be plugged into another socket on the motherboard if desired. The 8087 contained an FPU dedicated to performing what are now known as x87-type instructions. Number-crunching programs that could make use of the co-processor were estimated to run some ten times faster than when using the 8086 alone. It's also worth mentioning that the clever architecture of the 8087 allowed it to be used with both the 8-bit 8088 and the 16-bit 8086.
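To give a flavour of the kind of workaround this forced, here is a minimal fixed-point sketch (Python purely for illustration; the actual software floating-point libraries of the era were far more involved). Fractional values are scaled up to whole numbers, manipulated with integer arithmetic only, and scaled back down at the end:

```python
# Fixed-point arithmetic using integers only: values are scaled by 2**8,
# so 1.0 is represented as the integer 256. This loosely mimics how
# fractional numbers were handled on CPUs with no FPU.
SCALE_BITS = 8
SCALE = 1 << SCALE_BITS  # 256

def to_fixed(x):
    """Convert a fractional value to its scaled-integer representation."""
    return int(round(x * SCALE))

def fixed_mul(a, b):
    """Multiply two fixed-point numbers, rescaling to keep 8 fraction bits."""
    return (a * b) >> SCALE_BITS

def to_float(f):
    """Convert a scaled integer back to a fractional value for display."""
    return f / SCALE

a = to_fixed(3.25)                # 832 (i.e. 3.25 * 256)
b = to_fixed(2.0)                 # 512
print(to_float(fixed_mul(a, b)))  # 6.5
```

Every operation above is pure integer arithmetic, which is exactly why an 8087 could speed such workloads up so dramatically: the coprocessor did the scaling, rounding and arithmetic in hardware instead.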

There were no such things as level 1 or level 2 cache and no multistage pipeline. These all came later, so keep reading to find out when. (If you don't know what these terms mean, please refer to the CPU Optimisation section.)

One last useful piece of information for you... Ever heard people talk about real mode and protected mode? These terms refer to the memory mode within which a CPU is operating. CPUs such as the 8086 and 8088 could only run in real mode, or more accurately, the real mode segmented model. Essentially, this model came about to allow these CPUs, with their 20-bit address buses, to mimic the 16-bit addressing of the older 8080 chips. (In short, for those that care... Memory is viewed as a collection of overlapping 64kB segments. A 16-bit segment value, multiplied by 16, gives the starting address of a segment, and a 16-bit offset then selects a byte within that segment. Together, this segment:offset pair can describe any 20-bit address.) In this way, many existing programs written for the CP/M-80 OS could be easily ported onto the 8086. The downside of this approach is that real mode stuck around much longer than it should have, making memory addressing beyond 16 bits much more difficult than it should have been.
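The segment:offset calculation can be sketched in a couple of lines (Python here purely for illustration, of course — real mode is a hardware mechanism, not software):

```python
def real_mode_address(segment, offset):
    """Compute the physical address from a 16-bit segment and 16-bit offset.

    The segment value is shifted left by 4 bits (i.e. multiplied by 16) and
    the offset added, yielding a 20-bit physical address.
    """
    return (segment << 4) + offset

# Note that different segment:offset pairs can name the same physical byte,
# because segments overlap every 16 bytes:
print(hex(real_mode_address(0x1000, 0x0000)))  # 0x10000
print(hex(real_mode_address(0x0FFF, 0x0010)))  # 0x10000
```

A curious side effect: a pair like 0xFFFF:0xFFFF actually produces a 21-bit result, slightly above 1MB — which is where the A20-Gate of the 286 era comes in.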

Intel 80286

The 286, introduced in 1982, was a significant step up from the 8088/8086. It had a 16-bit data bus like the 8086, but also had a 24-bit address bus, permitting it to access 16MB of memory. However, the 286 had to remain backward compatible with programs written for the 8088/8086, some of which relied on addresses wrapping around at the 1MB boundary. To preserve that behaviour, the 21st address line could be switched off by way of what is called the A20-Gate; memory above 1MB is only reachable when the gate is enabled.
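The wraparound behaviour that the A20-Gate preserves can be sketched as follows (an illustrative model, not how the hardware is actually wired): with A20 disabled, any address bit above bit 19 is simply masked off.

```python
def physical_address(segment, offset, a20_enabled):
    """Model real-mode addressing with and without the A20 line.

    segment:offset pairs near the top of memory produce 21-bit results;
    with the A20 line gated off (as on the 8086, which had no such line),
    bit 20 is dropped and the address wraps to the bottom of memory.
    """
    linear = (segment << 4) + offset
    if not a20_enabled:
        linear &= 0xFFFFF  # keep only 20 bits, mimicking 8086 wraparound
    return linear

# 0xFFFF:0x0010 points to exactly 1MB (0x100000):
print(hex(physical_address(0xFFFF, 0x0010, a20_enabled=True)))   # 0x100000
print(hex(physical_address(0xFFFF, 0x0010, a20_enabled=False)))  # 0x0
```

Some 8086-era programs deliberately exploited that wraparound, which is precisely why the 286 could not simply leave A20 on all the time.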

Intel 80286

The primary reason that the 286 was such a major jump from the previous generation is that it could make use of protected mode, a reference to a particular state the CPU can execute instructions within. Protected refers to the fact that more than one program can exist simultaneously, sharing available memory (and hardware), but not "stepping on each other's toes" - hence protected from each other. Thus the 286 was the first x86 chip to support true multitasking, i.e. where more than one program can run at the same time. The previous generation of chips could only run in real mode. Indeed, all generations of CPUs are in the real mode state when a PC is initially booted; they will only switch to protected mode if the appropriate software is available.

However, during the era of the 286, DOS was still the primary operating system. The versions of DOS available at the time offered little support for memory above 1MB and could not multitask. Therefore little use was made of protected mode, and the 286 served more as a super-8086. Even so, the IBM AT ('Advanced Technology'), which featured the 286 processor, proved to be a very popular model.

More on protected mode

As already mentioned, protected mode prevents applications from trying to grab the same system resources. The operating system is the only software that has full control over the CPU. All applications should be controlled by the OS, which allocates memory and system resources as required.

In fact, it is the CPU itself that enforces protected mode. The x86 CPUs can run applications in one of "four rings of protection". The innermost ring is ring 0 and this is where the OS sits. All applications should run in ring 3 which has restricted access to system resources, as dictated by the CPU and the operating system. As yet, not much use is made of the other two rings.

Now move on to look at 32 bit CPUs.