Trusted Computing
Trusted computing (TC) refers to a controversial technology from the Trusted Computing Group (TCG) which its promoters claim allows computers and servers to offer improved computer security and protection from computer viruses and similar threats.
Many computer security experts disapprove of trusted computing because it gives computer manufacturers and software authors increased ability to monitor and dictate what users are able to do with their computers. Microsoft refers to "TC" as trustworthy computing; Intel calls it safer computing; Richard Stallman and some other critics suggest the backronym treacherous computing.
For policy purposes, trusted computing is a form of implementation of a trusted system.
Synopsis
The basic system concepts in trusted computing, illustrated in the code sketch after this list, are:
- Each machine/CPU is uniquely identified using certificates;
- Encryption is performed in the hardware;
- Data can be signed with the machine's identification;
- Data can be encrypted with the machine's secret key.
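The sketch below makes these four concepts concrete using ordinary cryptographic primitives, assuming the Python "cryptography" package. It is only an illustration: in real trusted computing the machine's private key is generated inside the TPM and never leaves the hardware.

```python
# A minimal sketch of the four concepts above, assuming the Python
# "cryptography" package. In real trusted computing the machine's
# private key never leaves the TPM; here it lives in ordinary
# process memory purely for illustration.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# A unique machine key pair; in practice the public half would be
# wrapped in a manufacturer-signed certificate identifying the machine.
machine_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

data = b"document contents"

# Sign data with the machine's identity.
signature = machine_key.sign(
    data,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)

# Encrypt data so that only this machine's secret key can recover it.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
ciphertext = machine_key.public_key().encrypt(data, oaep)
assert machine_key.decrypt(ciphertext, oaep) == data
```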
The nature of trust
Trust means something different to security experts than it does to laypersons. For example, the United States Department of Defense defines a trusted system as one that can break your security policy; i.e., "a system that you are forced to trust because you have no choice." Cryptographer Bruce Schneier observes, "A 'trusted' computer does not mean a computer that is trustworthy." By these definitions, a video card is trusted by its users to correctly display images. Trust in security parlance is always a kind of compromise or weakness: sometimes inevitable, but never desirable as such.
As an analogy, your best friend cannot share your medical records, since he or she does not have them. Your doctor, on the other hand, does, and can (legal issues with doing so aside). You may trust your doctor because you think he or she is a great person; it is equally possible that there is only one doctor in your town, so you are forced to trust him or her.
The main controversy around trusted computing concerns this meaning of trust. The Trusted Computing Group describes "technical trust" as follows: "an entity can be trusted if it always behaves in the expected manner for the intended purpose." Critics characterize a trusted system as one you are forced to trust rather than one that is particularly trustworthy.
Critics of trusted computing are further concerned that they cannot look inside trusted computing hardware to see whether it is properly implemented or whether it contains backdoors, which would pose a serious risk to national security, company secrets, and privacy. The trusted computing specifications are open and available for anyone to review, but the actual implementations are not. Many are also concerned that cryptographic designs and algorithms will eventually become obsolete, which may result in the forced obsolescence of TC-enabled computers. For example, recent versions of the trusted computing specifications added, and require, the AES encryption algorithm.
While proponents claim that trusted computing increases security, critics counter that not only will security not be helped, but trusted computing will facilitate mandatory digital rights management (DRM), harm privacy, and impose other restrictions on users. Entrusting networked computers to controlling authorities rather than to individuals may create digital imprimaturs. Contrast trusted computing with secure computing, in which anonymity, not disclosure, is the main concern. Advocates of secure computing argue that the additional security can be achieved without shifting control over the computer away from its users.
Proponents of trusted computing argue that privacy complaints are baseless since consumers will retain a choice between systems, based on their individual needs. Moreover, trusted computing advocates claim that some needs require changes to the current systems at the hardware level to enable a computer to act as a trusted client.
Background
A variety of controversial initiatives fall under the heading of trusted computing. Microsoft is working on a project called NGSCB. An industry consortium including Microsoft, Intel, IBM, HP and AMD has formed the Trusted Computing Group (TCG), which is designing a trusted platform module (TPM). Intel's implementation is called LaGrande Technology (LT), while AMD's is called Secure Execution Mode (SEM), also known as Presidio. Essentially, these are proposals for four new features provided by new hardware, which new software (including new operating systems and applications) would be required to take advantage of. Each feature serves a different purpose, although they can be used together. The features are:
- Secure I/O
- Memory curtaining
- Sealed storage
- Remote attestation
Secure I/O
Secure input and output (I/O) uses checksums to verify that the software performing the I/O has not been tampered with; malicious software injecting itself into this path could thus be identified.
This would not defend against a hardware-based attack, such as a key-capture device placed physically between the user's keyboard and the computer.
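The following sketch shows the checksum idea in ordinary Python, using only the standard library. The driver path and the recorded "known good" digest are hypothetical placeholders; real secure I/O measures the code in hardware before it runs.

```python
# A minimal sketch of checksum-based I/O verification, standard
# library only. KNOWN_GOOD_SHA256 stands for a digest recorded when
# the driver was installed; both it and the path are hypothetical.
import hashlib

KNOWN_GOOD_SHA256 = "0123abcd..."  # hypothetical recorded digest

def measure(path: str) -> str:
    """Return the SHA-256 digest of the file at `path`."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def io_path_untampered(driver_path: str) -> bool:
    # Malware that injects itself into the keyboard/display path
    # changes the measured digest, so the comparison fails.
    return measure(driver_path) == KNOWN_GOOD_SHA256
```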
Memory curtaining
Memory curtaining has the hardware keep programs from reading or writing each other's memory (the space where the programs store information they're currently working on). Even the operating system doesn't have access to curtained memory, so the information would be secure from an intruder who took control of the OS.
Sealed storage
Sealed storage protects private information by encrypting it with a key derived from the software and hardware being used, so the data can be read only by the same combination of software and hardware. For example, users who keep a private diary on their computer do not want other programs or other computers to be able to read it. Currently, a virus can search for the diary, read it, and send it to someone else; the Sircam virus did something similar to this. Even if the diary were protected by a password, the virus might run a dictionary attack. Alternatively, the virus might modify the user's diary software so that it leaks the text once the user unlocks the diary. With sealed storage, the diary is encrypted so that only the unmodified diary program on the user's own computer can read it, as sketched below.
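A minimal sketch of the idea follows, assuming the Python "cryptography" package. The key is derived from measurements of the hardware and the software, so changing either yields a different key and decryption fails. The identifiers are hypothetical; a real TPM derives and guards the key inside the hardware.

```python
# Sealed storage sketch: the encryption key depends on the exact
# hardware/software combination. Identifiers are hypothetical; a
# real TPM derives and protects this key in hardware.
import base64
import hashlib
from cryptography.fernet import Fernet, InvalidToken

def sealing_key(hardware_id: bytes, software_hash: bytes) -> bytes:
    digest = hashlib.sha256(hardware_id + software_hash).digest()
    return base64.urlsafe_b64encode(digest)  # Fernet keys are url-safe base64

diary = b"Dear diary..."
sealed = Fernet(sealing_key(b"machine-1234", b"diary-app-v1")).encrypt(diary)

# The same program on the same machine derives the same key and can read it.
assert Fernet(sealing_key(b"machine-1234", b"diary-app-v1")).decrypt(sealed) == diary

# A modified program measures differently, derives a different key, and fails.
try:
    Fernet(sealing_key(b"machine-1234", b"tampered-app")).decrypt(sealed)
except InvalidToken:
    print("unseal refused: software measurement does not match")
```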
Remote attestation
Remote attestation allows changes to the user's computer to be detected by him and others. That way, he can avoid having private information sent to or important commands sent from a compromised or insecure computer. It works by having the hardware generate a certificate stating what software is currently running. The user can present this certificate to a remote party to show that their computer hasn't been tampered with.
Remote attestation is usually combined with public-key encryption so that the information sent can only be read by the programs that presented and requested the attestation, and not by an eavesdropper.
To take the diary example again, the user's diary software could send the diary to other machines, but only if they could attest that they were running a secure copy of the diary software. Combined with the other technologies, this provides a more secured path for the diary: secure I/O protects it as it's entered on the keyboard and displayed on the screen, memory curtaining protects it as it's being worked on, sealed storage protects it when it's saved to the hard drive, and remote attestation protects it from unauthorized software even when it is used on other computers.
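A minimal sketch of such an exchange, again assuming the Python "cryptography" package: the attestation key stands in for a hardware key whose public half the remote party already trusts, and the measurement is a hypothetical digest of the running diary program. The nonce binds the certificate to one request so that an old attestation cannot be replayed.

```python
# Remote attestation sketch. attestation_key stands in for a TPM key
# whose public half the verifier already trusts; the measurement is a
# hypothetical digest of the running software.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

PSS = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)
attestation_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

def quote(measurement: bytes, nonce: bytes) -> bytes:
    # The hardware signs "this software is running", bound to the
    # verifier's fresh nonce to prevent replay of an old quote.
    return attestation_key.sign(measurement + nonce, PSS, hashes.SHA256())

def verify_quote(public_key, measurement: bytes, nonce: bytes, sig: bytes) -> bool:
    try:
        public_key.verify(sig, measurement + nonce, PSS, hashes.SHA256())
        return True
    except InvalidSignature:
        return False

# The remote party sends the diary only if the quote checks out.
nonce = b"fresh-verifier-nonce"
sig = quote(b"diary-app-v1", nonce)
assert verify_quote(attestation_key.public_key(), b"diary-app-v1", nonce, sig)
```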
Criticism
Opponents of trusted computing point out that the security features that protect computers from viruses and attackers also restrict the actions of their owners. They argue that this makes new anti-competitive techniques possible, which may hurt the people who buy trusted computers.
Cambridge cryptographer Ross Anderson warns that "TC can support remote censorship. In general, digital objects created using TC systems remain under the control of their creators, rather than under the control of the person who owns the machine on which they happen to be stored (as at present). So someone who writes a paper that a court decides is defamatory can be compelled to censor it—and the software company that wrote the word processor could be ordered to do the deletion if she refuses. Given such possibilities, we can expect TC to be used to suppress . . . writings that criticise political leaders." He goes on to state that:
- " . . . software suppliers can make it much harder for you to switch to their competitors' products. At a simple level, Word could encrypt all your documents using keys that only Microsoft products have access to; this would mean that you could only read them using Microsoft products, not with any competing word processor.
- "The . . . most important, benefit for Microsoft is that TC will dramatically increase the costs of switching away from Microsoft products (such as Office) to rival products (such as OpenOffice). For example, a law firm that wants to change from Office to OpenOffice right now merely has to install the software, train the staff and convert their existing files. In five years' time, once they have received TC-protected documents from perhaps a thousand different clients, they would have to get permission (in the form of signed digital certificates) from each of these clients in order to migrate their files to a new platform. The law firm won't in practice want to do this, so they will be much more tightly locked in, which will enable Microsoft to hike its prices."
Anderson summarizes the case by saying "The fundamental issue is that whoever controls the TC infrastructure will acquire a huge amount of power. Having this single point of control is like making everyone use the same bank, or the same accountant, or the same lawyer. There are many ways in which this power could be abused."
Users can't change software
In the diary example, sealed storage protects the diary from malicious programs like viruses, but it doesn't distinguish between those and useful programs, like ones that might convert the diary to a new format or provide new ways to search within it. A user who wanted to switch to a competing diary program might find it impossible for the new program to read the old diary, as the information would be "locked in" to the old program. It could also make it impossible for the user to read or modify his diary except as specifically permitted by the diary software. If he were using diary software with no edit or delete option, it could be impossible to change or delete previous entries.
Remote attestation could cause other problems. Currently, websites can be visited using a number of web browsers, though certain sites may be formatted (intentionally or not) such that some browsers cannot decipher their code. Some browsers get around that problem by emulating other browsers. For example, when Microsoft's MSN website briefly refused to serve pages to non-Microsoft browsers, users could still access it by instructing their browsers to emulate a Microsoft browser. Remote attestation could make this kind of emulation ineffective, as sites like MSN could demand a certificate stating that the user was actually running an Internet Explorer browser.
Users don't control information they receive
One of the early motivations behind trusted computing was a desire by media and software corporations for stricter Digital Rights Management (DRM): technology to prevent users from freely sharing and using potentially copyrighted or private files without explicit permission. Microsoft has announced a DRM technology that it says will make use of trusted computing.
Trusted computing can be used for DRM. Consider downloading a music file from a band: the band could set rules for how their music may be used; for example, they might want the user to play the file only three times a day without paying additional money. They could also use remote attestation to send their music only to a music player that enforces their rules. Sealed storage would prevent the user from opening the file with another player that did not enforce the restrictions, memory curtaining would prevent the user from making an unrestricted copy of the file while it plays, and secure output would prevent capturing what is sent to the sound system.
Once digital recordings are converted to analog signals, that (perhaps degraded) signal could be recorded by conventional means, such as by connecting an audio recorder, instead of speakers, to the card, or by recording the produced sound with a microphone.
Without remote attestation, this problem would not exist. The user could simply download the song with a player that did not enforce the band's restrictions, or one that lets him convert the song to a normal "unrestricted" format such as MP3.
Users don't control their data
If a user upgrades his or her computer, sealed storage could prevent them from moving their music files to the new machine. It could also be used to enforce spyware: music files might be supplied only to users whose machines attest to notifying the artist or record company every time the song is played. In a similar vein, a news magazine could require that, to download its news articles, a user's machine attest to using a specific reader. The mandated reader software could then be programmed not to display original news stories that had since been changed on the magazine's website. Such "newest version" enforcement would allow the magazine to "rewrite history" by changing or deleting articles. Even if a user saved the original article on his computer, the software might refuse to display it once a change had been announced.
Loss of Internet Anonymity
Because a TC-equipped computer can uniquely attest to its own identity, vendors and others who are able to use the attestation feature will be able to zero in on the identity of the user of TC-enabled software with a high degree of certainty.
Such a capability is contingent on the reasonable chance that the user at some time provides user-identifying information, whether voluntarily or indirectly. One common way that information can be obtained and linked is when a user registers a computer just after purchase. Another common way is when a user provides identifying information to the website of an affiliate of the vendor.
While proponents of TC point out that online purchases and credit transactions could potentially be more secure as a result of the remote attestation capability, this may cause the computer user to lose expectations of anonymity when using the Internet.
Critics point out that this could have a chilling effect on political free speech, the ability of journalists to use anonymous sources, whistleblowing, political blogging and other areas where the public needs protection from retaliation through anonymity.
Proposed owner override for TC
All of these problems arise because trusted computing protects programs against everything, even the owner. A simple solution is to let the owner of the computer override these protections. This is called Owner Override, and it is currently only outlined as a suggested fix.
When the owner activates Owner Override, the computer uses the secure I/O path to verify that he or she is physically present and actually the owner, and then bypasses the protections. With remote attestation, for example, the owner could force the computer to generate false attestations — certificates that say the machine is running Internet Explorer when it is really running Opera. Instead of reporting whenever the software has been changed, remote attestation would then report only changes made without the owner's permission, as sketched below.
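A minimal sketch of the change, extending the remote-attestation sketch above (it reuses that sketch's attestation_key, PSS, and hashes): with the override active, the hardware signs the measurement the physically present owner chooses rather than the one it observed, so the quote still verifies but no longer reveals what is really running. All names are hypothetical.

```python
# Owner Override sketch, reusing attestation_key, PSS, and hashes from
# the remote-attestation sketch above. A real implementation would
# first confirm the owner's physical presence via the secure I/O path.
from typing import Optional

def quote_with_override(actual: bytes, nonce: bytes,
                        override: Optional[bytes] = None) -> bytes:
    # With Owner Override active, report the owner's chosen measurement
    # instead of the measured one; the signature remains valid, so the
    # remote party cannot tell the difference.
    reported = override if override is not None else actual
    return attestation_key.sign(reported + nonce, PSS, hashes.SHA256())

# Attest to running "internet-explorer" while actually running Opera:
sig = quote_with_override(b"opera", b"nonce-1", override=b"internet-explorer")
```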
While it might seem that Owner Override would be met with praise, some Trusted Computing Group members have instead called it the biggest potential downfall of the TC movement, because it defeats the central idea of remote attestation: being able to trust other people's computers. Owner Override continues to offer all of the security and enforcement benefits to an owner on his own machine, but loses any ability to ensure that another owner cannot waive rules or restrictions on his own computer. Once data is sent to someone else's computer, whether it is a diary, a DRM music file, or a joint project, that person controls what security, if any, their computer will enforce on their copy of those data.
External links
- Trusted Computing Group (TCG) — Trusted computing standards body, previously known as the TCPA.
- 'Trusted Computing' Frequently Asked Questions — Anti-TC FAQ by Cambridge University security director and professor Ross Anderson.
- TrouSerS — the open-source TCG Software Stack, with a good FAQ
- TCPA Misinformation Rebuttal and Linux drivers — from the IBM Watson Research Global Security Analysis Lab
- Experimenting with TCPA/TCG Hardware, Or: How I Learned to Stop Worrying and Love The Bear — Technical Report TR2003-476, CS, Dartmouth College, December 2003, and the "Enforcer" Linux Security Module
- Next-Generation Secure Computing Base (NGSCB) — Microsoft's trusted computing architecture
- Palladium and the TCPA — from Bruce Schneier's Crypto-Gram newsletter.
- Against-TCPA
- Interesting Uses of Trusted Computing
- Can you trust your computer? — essay by the FSF
- Technically Speaking blog's "Microsoft Meeting" article — explains "sealed storage" in more depth than this article, without going into all the mathematics
- Trust Computing: Promise and Risk, a paper by EFF (Electronic Frontier Foundation) staff technologist Seth Schoen.
- Microsoft's Machiavellian manoeuvring (ZDNet UK) by Bruce Schneier
- LAFKON — a movie opposing Trusted Computing
- "The Trusted Systems Problem: Security Envelopes, Statistical Threat Analysis, and the Presumption of Innocence," Homeland Security — Trends and Controversies, IEEE Intelligent Systems, Vol. 20 No. 5, pp. 80-83 (Sept./Oct. 2005) (discussing trusted systems more generally as a security strategy for homeland security). See also The Trusted Systems Project, part of the Global Information Society Project (GISP), a joint research project of the World Policy Institute (WPI) and the Center for Advanced Studies in Sci. & Tech. Policy (CAS), which examines the policy implications of using trusted systems strategies for security or social control.