Trusted Computing
Trusted computing is a proposal to make Internet commerce less cumbersome and therefore more efficient and commonplace by increasing the security of networked personal computers. The basic system concepts are:
- More positive identification of each client and their operators,
- Secure encryption of all data between the consumer and the point of sale.
The idea is controversial because it is at odds with the concept of secure computing, where anonymity, rather than disclosure, is the main concern. Internet freedom advocates characterize a "trusted system" as one you are forced to trust rather than one which is particularly trustworthy. As described by trusted computing opponents, the new systems would come at a high cost: networked computers would be entrusted to controlling authorities rather than to the individuals who own them. Proponents argue that the need for more efficient e-commerce outweighs most privacy concerns, and that consumers will still have a choice between systems, based on their individual needs.
The TCG project is known by a number of names. "Trusted computing" was the original one, and is still used by IBM, while Microsoft calls it "trustworthy computing". Intel has just started calling it "safer computing". Prior to May 2004 the TCG was known as the TCPA.
Trusted systems advocates claim that their needs require changes to current systems at the hardware level. Others argue that the additional security promised by "trusted computing" could be achieved without users relinquishing root control over their computers. Trusted computing architects, however, claim that the name means that a computer can be trusted with respect to its hardware and software configuration, a prerequisite for treating that computer as a trusted client.
According to cryptographer Bruce Schneier "A 'trusted' computer does not mean a computer that is trustworthy. The DoD's definition of a trusted system is one that can break your security policy; i.e., a system that you are forced to trust because you have no choice."
Basics
A variety of initiatives fall under the heading of trusted computing: Microsoft is working on a project called NGSCB. An industry consortium including Microsoft, Intel, IBM, HP and AMD formed the Trusted Computing Platform Alliance (TCPA), since succeeded by the Trusted Computing Group (TCG), which is designing a Trusted Platform Module (TPM). Intel is working on a version called LaGrande Technology (LT), while AMD's is called Secure Execution Mode (SEM). But essentially, these are proposals for four new features provided by new hardware, which require new software (including new operating systems and applications) to take advantage of them. Each feature serves a different purpose, although they can be used together. The features are:
- Secure I/O
- Memory curtaining
- Sealed storage
- Remote attestation
Secure I/O
Secure input and output (I/O) provides a secure path between your computer and devices like your keyboard and screen. This prevents programs from getting access to what's being typed or displayed in other programs. (Malicious programs sometimes watch these things to spy on users.)
It will also be able to distinguish a physically-present user from a program impersonating a user and prevent some programs from misleading the user by modifying another program's output.
Memory curtaining
Memory curtaining has the hardware keep programs from reading or writing each other's memory (the space where the programs store information they're currently working on). Even the operating system doesn't have access to curtained memory, so the information would be secure from an intruder who took control of the OS.
Something very similar can be achieved with new software, but doing it in hardware requires less code to be rewritten.
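The access rule described above can be sketched as a toy model. This is purely illustrative: real curtaining is enforced by the CPU and chipset, not by software, and the program names here are invented for the example.

```python
class CurtainedMemory:
    """Toy model of memory curtaining: each address is tagged with its
    owning program, and reads by any other program are refused, even
    by the operating system."""

    def __init__(self):
        self.pages = {}  # address -> (owner, value)

    def write(self, owner: str, addr: int, value: bytes):
        self.pages[addr] = (owner, value)

    def read(self, requester: str, addr: int) -> bytes:
        owner, value = self.pages[addr]
        if requester != owner:
            # In hardware this would be a fault, not a Python exception.
            raise PermissionError("curtained: not the owning program")
        return value

mem = CurtainedMemory()
mem.write("diary-app", 0x1000, b"secret entry")
assert mem.read("diary-app", 0x1000) == b"secret entry"
try:
    mem.read("os-kernel", 0x1000)  # even the OS is refused
except PermissionError:
    pass
```

The point of the model is the last two lines: unlike ordinary memory protection, there is no privileged requester that bypasses the check.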
Sealed storage
Sealed storage protects private information by allowing it to be encrypted using a key derived from the software and hardware being used. This means the data can be read only by the same combination of software and hardware.
For example, if you keep a private diary on your computer, you wouldn't want other programs or other computers able to read it. Currently, a virus could search for your diary, read it, and send it to someone else. (The SirCam virus did something similar.) Even if your diary was protected by a password, the virus could try most common passwords—on a modern computer, this is pretty fast. Or the virus could modify your diary software to have it leak the text once you unlocked your diary. With sealed storage, the diary is securely encrypted so that only the unmodified diary program on your computer can read it.
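The sealing idea can be sketched in a few lines: derive a key from measurements of the software and hardware, so that any change to either yields a different key. This is a deliberately simplified model (the XOR keystream and the measurement strings are stand-ins; a real TPM binds proper symmetric encryption to platform configuration registers).

```python
import hashlib

def platform_key(software_hash: bytes, hardware_id: bytes) -> bytes:
    """Derive a sealing key from the identity of the running software
    and the hardware (toy model, not the real TPM scheme)."""
    return hashlib.sha256(software_hash + hardware_id).digest()

def seal(data: bytes, key: bytes) -> bytes:
    # Toy XOR "encryption" with a hash-derived keystream, purely for
    # illustration; real sealed storage uses a proper cipher.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

unseal = seal  # XOR with the same keystream is its own inverse

diary = b"Dear diary..."
key = platform_key(b"diary-app-v1", b"machine-42")
sealed = seal(diary, key)

# The same software on the same hardware recovers the plaintext:
assert unseal(sealed, platform_key(b"diary-app-v1", b"machine-42")) == diary
# A modified program derives a different key and gets garbage:
assert unseal(sealed, platform_key(b"evil-virus", b"machine-42")) != diary
```

This is why a virus that modifies the diary software defeats itself: the modification changes the measurement, the derived key changes, and the sealed diary becomes unreadable.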
Remote attestation
Remote attestation allows changes to your computer to be detected by you and others. That way, you can avoid having private information sent to or important commands sent from a compromised or insecure computer. It works by having the hardware generate a certificate stating what software is currently running. The user can present this certificate to a remote party to show that their computer hasn't been tampered with.
Remote attestation is usually combined with public-key encryption so that the information sent can only be read by the programs that presented and requested the attestation, and not by snoops listening in on the conversation.
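The attest-then-verify exchange can be sketched as follows. For simplicity this model uses a symmetric key shared with the verifier; real attestation uses asymmetric keys held by the TPM and certificate chains, and the key and software names here are invented.

```python
import hashlib
import hmac

# Hypothetical key burned into the chip; real TPMs use asymmetric keys.
TPM_KEY = b"secret-key-in-the-chip"

def attest(software_image: bytes) -> tuple:
    """The hardware measures the running software and signs the measurement."""
    measurement = hashlib.sha256(software_image).digest()
    signature = hmac.new(TPM_KEY, measurement, hashlib.sha256).digest()
    return measurement, signature

def verify(measurement: bytes, signature: bytes, expected_software: bytes) -> bool:
    """A remote party checks that the signature is genuine AND that the
    measured software is the software it expects."""
    expected_sig = hmac.new(TPM_KEY, measurement, hashlib.sha256).digest()
    if not hmac.compare_digest(expected_sig, signature):
        return False
    return measurement == hashlib.sha256(expected_software).digest()

m, s = attest(b"trusted-diary-reader-v1")
assert verify(m, s, b"trusted-diary-reader-v1")
assert not verify(m, s, b"modified-diary-reader")
```

The second check is the one with policy consequences: the verifier does not merely confirm the certificate is genuine, it decides which software it is willing to talk to.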
To take our diary example again, your diary software could send your diary to other machines, but only if they could attest that they were running a secure copy of the diary software. Combined with the other technologies, this provides a completely secured path for your diary: secure I/O protects it as it's entered on the keyboard and displayed on the screen, memory curtaining protects it as it's being worked on, sealed storage protects it when it's saved to the hard drive, and remote attestation protects it from unauthorized software even when it is used on other computers.
Drawbacks
Opponents of trusted computing argue that the security features that protect computers from viruses and attackers also protect computers from their owners. This makes new anti-competitive and anti-consumer techniques possible, potentially hurting people who buy trusted computers.
Cambridge cryptographer Ross Anderson has concerns that "TC can support remote censorship. In general, digital objects created using TC systems remain under the control of their creators, rather than under the control of the person who owns the machine on which they happen to be stored (as at present). So someone who writes a paper that a court decides is defamatory can be compelled to censor it - and the software company that wrote the word processor could be ordered to do the deletion if she refuses. Given such possibilities, we can expect TC to be used to suppress . . . writings that criticise political leaders." He goes on to state that
" . . . software suppliers can make it much harder for you to switch to their competitors' products. At a simple level, Word could encrypt all your documents using keys that only Microsoft products have access to; this would mean that you could only read them using Microsoft products, not with any competing word processor.
The . . . most important, benefit for Microsoft is that TC will dramatically increase the costs of switching away from Microsoft products (such as Office) to rival products (such as OpenOffice). For example, a law firm that wants to change from Office to OpenOffice right now merely has to install the software, train the staff and convert their existing files. In five years' time, once they have received TC-protected documents from perhaps a thousand different clients, they would have to get permission (in the form of signed digital certificates) from each of these clients in order to migrate their files to a new platform. The law firm won't in practice want to do this, so they will be much more tightly locked in, which will enable Microsoft to hike its prices."
Anderson summarizes the case by saying "The fundamental issue is that whoever controls the TC infrastructure will acquire a huge amount of power. Having this single point of control is like making everyone use the same bank, or the same accountant, or the same lawyer. There are many ways in which this power could be abused."
Users can't change software
Take our diary example from before. Sealed storage protects my diary from malicious programs like viruses, but it doesn't distinguish between those and useful programs, like one that will convert my diary to a new format, or let me search my diary in a new way. If I wanted to switch to a competing diary program, it would be impossible for that new program to read my old diary—my information would be "locked in" to the old program. It also makes it impossible for you to ever read or modify your own diary except as specifically permitted by the diary software. If you are given diary software with no edit or delete option then it is forever impossible for you to change or delete previous entries, short of deleting the entire diary.
Remote attestation presents similar problems. Currently web sites can be visited using a number of web browsers. For example, even when Microsoft's MSN website briefly refused to serve pages to non-Microsoft browsers, users could instruct their browsers to masquerade as a Microsoft browser. Remote attestation makes this kind of masquerading impossible; MSN could demand a certificate stating the user was actually running an Internet Explorer browser.
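Today, masquerading takes one line: any client can claim any identity, because the User-Agent header is just self-reported text the server cannot check. The URL below is a placeholder for illustration.

```python
import urllib.request

# Any client can claim to be Internet Explorer simply by setting the
# User-Agent header; without attestation the server cannot tell.
req = urllib.request.Request(
    "http://msn.example/",  # placeholder URL, never actually fetched here
    headers={"User-Agent": "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"},
)
assert "MSIE" in req.get_header("User-agent")
```

Remote attestation closes exactly this loophole: the claim about what software is running would come signed from the hardware, not typed in by the user.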
Thus someone with a popular website could use their popularity to take away freedom from users. Or web sites that depend on ad revenue might require you to prove you're running a browser that hasn't been modified to remove advertisements. This use of remote attestation has nothing to do with increasing security, the stated goal of trusted computing, and only serves other motivations.
The same thing could be done on a larger scale. Microsoft could make its web servers only talk to its web browsers. Or the file server built into Windows could refuse to share files with competing operating systems, like Linux or Mac OS X. Or AOL could insist on license fees from anyone who wanted to create compatible instant messaging software, and instruct their client to refuse to talk to any clients whose authors hadn't paid up.
Users don't control information they receive
One of the early motivations behind trusted computing was a desire to support stricter Digital Rights Management (DRM): technology to prevent users from sharing and using copyrighted or private files without permission. And Microsoft has announced a DRM technology that it says will make use of trusted computing.
All the elements of trusted computing can be used for DRM. Take the example of downloading a music file from Metallica: First, Metallica will come up with some rules for how their music can be used. (For example, they might only want you to play the file three times a day without paying more money.) Then they'll use remote attestation to only send their music to a music player that enforces their rules. Sealed storage prevents you from opening the file with another player that doesn't enforce the restrictions. Memory curtaining prevents you from making an unrestricted copy of the file while it's playing. Secure output prevents you from capturing what's sent to the speakers.
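The hypothetical "three plays per day" rule from the example is trivial for a trusted player to enforce once the other mechanisms guarantee the file can only be opened by this player. A toy sketch (the class and rule are invented for illustration):

```python
from datetime import date

class RestrictedPlayer:
    """Toy model of a DRM-enforcing player with the hypothetical rule
    'at most three plays per day without paying more'."""
    MAX_PLAYS_PER_DAY = 3

    def __init__(self):
        self.plays = {}  # date -> number of plays that day

    def play(self, song: str, today: date) -> bool:
        if self.plays.get(today, 0) >= self.MAX_PLAYS_PER_DAY:
            return False  # rule enforced; paying would raise the limit
        self.plays[today] = self.plays.get(today, 0) + 1
        return True

player = RestrictedPlayer()
d = date(2004, 1, 1)
assert [player.play("song.drm", d) for _ in range(4)] == [True, True, True, False]
```

Nothing here is sophisticated; the point of trusted computing is that sealed storage, curtaining, and attestation prevent you from substituting a player without this `if` statement.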
Without remote attestation, you wouldn't have this problem. You could simply download the song with a player that doesn't enforce Metallica's restrictions or one that lets you convert the song to an unrestricted format like MP3.
Users don't control their applications
A similar combination of trusted computing tools can do other bad things. For example, if you upgrade computers, sealed storage can prevent you from moving all your music files to your new computer; instead, you could be forced to buy all the songs again. It can also be used to enforce spyware-like behavior: the music files would only be given to you if your machine attests that it will tell the artist every time you play the song.
Worse, these technologies can be used for a form of remote control. Imagine that to download news articles from Microsoft's newsmagazine, Slate, you need to attest that you use MS Reader. MS Reader could be programmed not to let you view the news story you downloaded without asking Microsoft if a change has been made. This could allow Microsoft to "rewrite history" by changing or deleting certain articles. Even though you saved the original article on your computer, the software would refuse to let you view it once a change had been announced. This is eerily reminiscent of George Orwell's 1984, where the government changed everything ever archived to make it seem like their predictions were always correct.
Proposed owner override for TC
All these problems come up because trusted computing protects programs against everything, even the owner. A simple solution is to let the owner of the computer override these protections. This is called Owner Override; at present it is only a proposal, not part of the TCPA standard.
When you activate Owner Override, the computer will use the secure I/O path to make sure you're physically present and actually the owner. Then it will bypass the protections. So with remote attestation, you can force the computer to generate false attestations: certificates that say you're running Internet Explorer when you're really running Opera. Instead of reporting when your software has been changed, remote attestation will report when the software has been changed without your permission.
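The change Owner Override makes to attestation can be sketched by extending the measurement step with an owner-controlled branch. As before, this is a toy model with invented names, not a real TPM interface.

```python
import hashlib

def attest(running_software: bytes,
           owner_present: bool = False,
           claim: bytes = b"") -> bytes:
    """Toy attestation with Owner Override: a physically present owner
    may choose what the attestation reports; otherwise the hardware
    reports an honest measurement of the running software."""
    if owner_present and claim:
        return hashlib.sha256(claim).digest()      # deliberate false attestation
    return hashlib.sha256(running_software).digest()  # honest measurement

honest = attest(b"Opera")
spoofed = attest(b"Opera", owner_present=True, claim=b"Internet Explorer")
assert spoofed == hashlib.sha256(b"Internet Explorer").digest()
assert honest != spoofed
```

The `owner_present` check is the crucial design choice: a remote attacker who controls the software cannot trigger the override, so the owner keeps the anti-malware benefits while regaining control over what is reported.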
While it would seem that the idea of Owner Override would be met with praise, some Trusted Computing Group members have instead decried it as the biggest potential downfall of the TC movement. Owner Override defeats remote attestation, the entire idea of being able to trust other people's computers. Owner Override continues to offer all of the security and enforcement benefits to an owner on his own machine, but loses any ability to ensure that another owner cannot waive rules or restrictions on his own computer. Once you send data to someone else's computer, whether it is your diary, a DRM music file, or a joint project, that person controls what security, if any, their computer will enforce on their copy of that data.