Itanium

The Itanium 64-bit microprocessor architecture has been developed jointly by Hewlett-Packard and Intel.

Itanium represents the goal of producing a "post-RISC era" high-performance microprocessor architecture. It is referred to as a VLIW (Very Long Instruction Word) architecture. Its native instruction set is IA-64, but it can run x86 code slowly in a firmware emulation mode, and it has hooks for migration from the PA-RISC family.

At the most basic level the Itanium design is similar to RISC: the core logic consists of a small set of instructions designed to run very fast. The CPU includes several of these cores and can thus run a number of instructions in parallel, a design known as a superscalar processor.

Where the Itanium breaks with current RISC design philosophy is in how it feeds instructions into those core units. In a traditional design a complex decoder examines the instructions as they flow through the pipeline and determines which ones can be dispatched to run in parallel. This is a very complex task, one the circuitry cannot guarantee to "get right" in any reasonable amount of time. Instead the CPU has "good enough" circuitry for the task, but when it does get it wrong there is a large performance hit.

Itanium instead relies on the compiler to perform this task when generating the code. Compilers already tend to take a while to run, so this additional work is not much to ask of them, and they can spend a considerable amount of time on it. When the compiler has finished examining the code, it packs the instructions it knows can run in parallel into a single large bundle, which is where the term "very long" comes from.
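
As a rough illustration of the analysis involved, consider the following C fragment. This is an illustrative sketch only: the grouping happens inside the compiler's back end and is never visible at the source level, and the function name is invented for the example.

 void bundle_example(int *a, int *b, int *c, int *d)
 {
     /* These three operations touch different data and do not depend on
        one another, so the compiler can place them in the same bundle
        and the CPU can issue them in the same cycle. */
     int x = a[0] + 1;
     int y = b[0] * 2;
     int z = c[0] - 3;

     /* This chain is serial: each step needs the previous result, so no
        amount of bundling lets these instructions run in parallel. */
     int w = x + y;
     w = w + z;
     d[0] = w;
 }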

This can greatly simplify the chip design itself. The decoder stage needs considerably less logic: it basically has to "open up" the bundle and send each instruction to the proper core. The chip space saved can then be spent on other tasks.
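
A deliberately simplified, hypothetical model of such a decode stage is sketched below in C. A real IA-64 bundle is a 128-bit word holding three 41-bit instruction slots plus a 5-bit template that names the unit type for each slot; the type names and the issue callback here are invented for the illustration.

 #include <stdint.h>

 enum unit { UNIT_MEMORY, UNIT_INTEGER, UNIT_FLOAT, UNIT_BRANCH };

 struct bundle {
     enum unit slot_unit[3];  /* stands in for the template field */
     uint64_t  slot_insn[3];  /* three instructions the compiler has
                                 already verified are independent */
 };

 /* Dispatch needs no dependency checking: simply route each slot to the
    execution unit the compiler chose for it. */
 void dispatch(const struct bundle *b,
               void (*issue)(enum unit u, uint64_t insn))
 {
     for (int i = 0; i < 3; i++)
         issue(b->slot_unit[i], b->slot_insn[i]);
 }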

Of course there is a downside: a running program's behaviour is not always obvious from the code used to generate it. That means it is possible for the compiler to "get it wrong", perhaps (in theory) even more often than the same logic placed on the CPU. Thus the design relies heavily on the performance of the compilers.
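
Pointer aliasing is a common example of run-time behaviour the compiler cannot always see. In the hypothetical C routine below, whether the loop iterations are independent depends entirely on the arguments the caller passes at run time.

 void scale(int *dst, int *src, int n)
 {
     for (int i = 0; i < n; i++) {
         /* If dst and src never overlap, every iteration is independent
            and the compiler can bundle several of them together.  If the
            caller passes overlapping arrays, that schedule would be
            wrong, so a conservative compiler must assume the worst.  A
            hardware scheduler sees the actual addresses at run time and
            does not have to guess. */
         dst[i] = src[i] * 2;
     }
 }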

Performance also suffers if the user runs the binary code on a processor with a different microarchitecture to the one for which the binary was compiled; this primarily affects proprietary software that is distributed in a binary-only form.

The original idea behind this new architecture was to cut microprocessor hardware complexity by increasing compiler software complexity. As of 2002, there are two main problems with this idea. One is that the Itanium is seemingly as complicated as many more traditional designs. The other, more important, problem is that while dynamic scheduling in hardware has been done many times, designing an Itanium-friendly compiler is a new art.

Software support for the Itanium is a work in progress, but Linux is a shipping platform, and work on NetBSD will begin when Itanium-based hardware ships. Proprietary operating systems being ported include Microsoft Windows, HP-UX, Tru64, OpenVMS, and AIX. It remains to be seen how they will overcome the limitations of microarchitecture-specific scheduling.

The Compaq/DEC Alpha, the HP PA-RISC family, and the SGI MIPS UNIX lines will eventually be retired in favor of Itanium hardware. With the exception of SGI's IRIX, the operating systems running on these machines will remain similar.

So far there are no concrete plans to introduce Itanium into the home PC market.

The first Itanium chips, released in 2001, returned mostly disappointing performance numbers. The second-generation Itanium, predicted to be much more impressive, was launched in July 2002.


Critics of the Itanium processor have labeled it the "Itanic". Intel will be in a difficult position if the Itanium proves a disappointment, as the need for a 64-bit architecture in commodity servers is now pressing, and the need for one in personal computers is only a few years away.

A real architectural threat to Intel now exists in the form of AMD's x86-64 architecture. AMD's x86-64 follows Intel's own earlier practice of extending a single architecture, first from the 16-bit 8086 to the 32-bit 80386 and beyond, without ever removing backwards compatibility. The x86-64 architecture extends the 32-bit x86 architecture by adding 64-bit registers, with full 32-bit and 16-bit compatibility modes for earlier software. Pre-release versions of both Linux and Microsoft Windows are now available for x86-64, together with early test silicon.

The failure of Itanium would also have a substantial impact on manufacturers such as HP, which have announced that they will abandon their proprietary CPU architectures (the Alpha, in HP's case) in favor of the Itanium.


External links: