[4105] in cryptography@c2.net mail archive
Re: Cryptoprocessors and reverse engineering
daemon@ATHENA.MIT.EDU (Dave Emery)
Sat Jan 30 13:00:57 1999
Date: Sat, 30 Jan 1999 01:16:50 -0500
From: Dave Emery <die@die.com>
To: Markus Kuhn <Markus.Kuhn@cl.cam.ac.uk>
Cc: cryptography@c2.net
Reply-To: die@die.com
Mail-Followup-To: Markus Kuhn <Markus.Kuhn@cl.cam.ac.uk>,
cryptography@c2.net
In-Reply-To: <E106GK8-00004u-00@heaton.cl.cam.ac.uk>; from Markus Kuhn on Fri, Jan 29, 1999 at 03:59:10PM +0000
On Fri, Jan 29, 1999 at 03:59:10PM +0000, Markus Kuhn wrote:
>
> I think we are talking about *very* different things here. You talk
> about decompilation in order to steal ("liberate" ;-) know-how and
> technology, while we were more concerned about full security evaluations
> without the cooperation of the vendor.
>
Some years ago I had personal experience disassembling and
reverse engineering DEC and Sun diagnostic firmware and low-level OS
code, in order to understand exactly what the boot-time diagnostic
tests were testing, so that cheaper and more cost-effective
independently designed third-party memory and peripheral controller
hardware could be made to reliably pass the diagnostics and work in
these systems (getting the OS to work in normal mode with our hardware
was almost always straightforward and rarely required reverse
engineering of code). It was not clear that this was conventional
theft: the customers who bought our products got a better deal than the
grossly overpriced equipment from the original manufacturers; the
picky, minute details of what exactly the diagnostics did in special
cases hardly constituted important proprietary technology; and we did
not use any of the reverse-engineered diagnostic code, algorithms, or
strategies directly in our products. We did the reverse engineering
solely to make our hardware compatible with theirs and to compete with
them by offering our independently engineered products at a lower price.
One can of course argue about whether such reverse engineering
is fair and ethical ("steal" is a very loaded word), but one can also
raise serious questions about deliberate restraint of competition and
the protection of proprietary markets in which exorbitant prices were
charged for inferior hardware. Certainly the OEMs may have suffered
erosion of their lucrative add-in markets, but the world in general
got much lower memory prices as a result of our (and others') competition.
We did see evidence from time to time that tests had been
deliberately added to later revisions of the code to detect our hardware
and refuse to work with it, and it is clear that some of the obscurity
about what was tested, and how, was included primarily to make it
harder to design compatible hardware.
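To make that pattern concrete, here is a minimal sketch in C of
what such a "detect the competitor and fail" check might look like. The
ID values and function names are entirely hypothetical, invented for
illustration rather than taken from any real DEC or Sun diagnostic; in
actual firmware the ID read would be a volatile access to a
memory-mapped register rather than the stand-in function used here.

    #include <stdio.h>
    #include <stdint.h>

    /* Hypothetical OEM board ID -- invented for illustration. */
    #define OEM_BOARD_ID 0x0D1Eu

    /* Stand-in for reading a memory-mapped board ID register. */
    static uint16_t read_board_id_reg(uint16_t simulated_id)
    {
        return simulated_id;
    }

    /* A "diagnostic" that fails based not on whether the memory
     * works, but on who made the board. */
    static int memory_board_diag(uint16_t board_id)
    {
        if (board_id != OEM_BOARD_ID)
            return -1;   /* third-party board: report failure */
        /* ... the actual memory tests would follow here ... */
        return 0;        /* OEM board: pass */
    }

    int main(void)
    {
        printf("OEM board:         %s\n",
               memory_board_diag(read_board_id_reg(OEM_BOARD_ID))
                   ? "FAIL" : "PASS");
        printf("third-party board: %s\n",
               memory_board_diag(read_board_id_reg(0x1234))
                   ? "FAIL" : "PASS");
        return 0;
    }

Nothing about such a check looks visibly different from a legitimate
test; buried among hundreds of real tests it is hard enough to spot,
and behind bus encryption it would be invisible.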
One hates to see a new technology introduced (encrypted-bus
processors) that could go further and make developing compatible
replacement or add-on products essentially impossible, if an OEM
chooses to make it so. One can envision products in which the sole
important "trade secret" protected by the encryption is the nasty
magic-trick code designed to make it impossible to build compatible
products that will work reliably with the OEM gear or software: the
test-for-the-competitor's-product-and-fail code. This means that even
careful and complete reverse engineering, followed by independent
design and implementation of a compatible product, won't work, because
of special trick routines or tests hidden by the crypto that were added
for no purpose but to defeat reverse engineers. And given the past
history in which many such tricks have been tried for just this
purpose, it seems reasonable to assume that many will be tried in the
future.
After all, given the black hole that encrypted code provides,
why not incorporate such nasties? Determining what they are would be
very, very difficult and expensive, while putting them in would be very
cheap, and the chance of one's competition discovering and neutralizing
them would be relatively low, given the black hole and the absolute
unpatchability of the encrypted code.
Of course this crypto-CPU technology is being introduced precisely
as a mechanism to allow the creators of technology to make it impossible
to interface with it in ways they don't want (namely, running it on
unlicensed CPUs or obtaining access to unauthorized copies of proprietary
intellectual property), so using it to bar reverse engineering for the
purpose of creating legitimate compatible products is just a natural
follow-on...
> If the goal of the investigation is however to exclude with high
> probability that the vendor or someone in the development and
> distribution pipeline has built a backdoor into the code that allows
> unauthorized violation of documented security mechanisms, you do *not*
> know what you are looking for. There are zillions of ways and places to
> hide backdoors and I have seen proposals for extremely clever ones. You
> can't usually narrow down the search to a limited area, because the trap
> door could be hidden almost anywhere.
Markus's points are *very* well taken, but his solution does have
the frightening property that even if one has some reason (rumors and
leaks about the possible existence of a certain backdoor or trojan,
mysterious enciphered network traffic, indications of unusual behavior,
hints of leakage of secret information, etc.) to suspect that a
particular, specific backdoor exists and is being used, there is no easy
and effective way to find it or prove that it exists. This has the
strong side effect of making it *much* more attractive for all the nasty
creatures from the shadows, afraid of the light, to sneak in many *more*
backdoors and trojans, knowing, as they would, that even if substantial
suspicion existed that a particular backdoor or trojan was present, it
would be nearly impossible to find and document.
And the very small software companies and grassroots developers
whom he identifies as the big beneficiaries of this technology are
unfortunately extremely fertile ground for someone planning to plant a
trojan: they are often obscure, with little reputation to lose, shallow
pockets, and marginal capitalization, and are readily subject to
corruption and pressure; and hidden alliances between such usually
privately held companies and larger, shadowy intelligence or criminal
organizations would be very hard to trace. A big software company has a
lot to lose if someone determines that its code contains deliberate
backdoors or trojans, and may actually take the possibility seriously
enough to implement some controls and checks, but who knows about the
little garage shop? And the CIA or NSA or whatever can afford to fund a
lot of such little operations out of their black budgets, as can many
of their foreign competitors.
I am sure that the day Microsoft is sued because someone snuck a
backdoor or trojan into its code, and a big company loses a lot of money
as a result, is the day that careful review and vetting of programmers
and others in contact with code becomes common. And having the binary
and its execution open enough that one could find and prove the hole
exists goes some distance toward ensuring that that day comes.
>
> So I certainly stay with my assessment that bus-encryption processors
> will not significantly reduce our protection against malicious
> backdoors.
The present situation of ten-million-transistor chips,
multi-megabyte application binaries, and tens-of-megabytes OS binaries
and DLL libraries does indeed make it effectively impossible to
guarantee that there are no security holes or deliberate backdoors, but
it *does* allow a suitably alarmed and determined user with significant
resources to determine whether a particular rumored trojan or backdoor
exists. And this provides quite significant protection, because it
means that anyone exploiting a backdoor has to do so in such total
secrecy that no hint of the backdoor's existence ever leaks out.
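As a small illustration of why plaintext binaries matter here:
once a rumor supplies something concrete to look for, say a suspected
trigger byte sequence, a targeted search is straightforward. The sketch
below in C scans a binary for one such (entirely invented) signature;
against a bus-encrypted processor no comparable search is possible,
because the code is never visible in the clear.

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical trigger signature from the rumor -- invented
     * for illustration, not from any real product. */
    static const unsigned char sig[] = { 0xDE, 0xAD, 0xBE, 0xEF };

    int main(int argc, char **argv)
    {
        unsigned char buf[1 << 16];
        size_t n;
        long off = 0;
        FILE *f;

        if (argc < 2) {
            fprintf(stderr, "usage: %s <binary>\n", argv[0]);
            return 2;
        }
        if ((f = fopen(argv[1], "rb")) == NULL) {
            perror(argv[1]);
            return 2;
        }
        while ((n = fread(buf, 1, sizeof buf, f)) >= sizeof sig) {
            for (size_t i = 0; i + sizeof sig <= n; i++)
                if (memcmp(buf + i, sig, sizeof sig) == 0)
                    printf("signature at offset %ld\n", off + (long)i);
            /* Re-read the last sig-1 bytes so matches spanning
             * chunk boundaries are not missed. */
            off += (long)(n - (sizeof sig - 1));
            fseek(f, -(long)(sizeof sig - 1), SEEK_CUR);
        }
        fclose(f);
        return 0;
    }

Of course a real hunt would involve disassembly and instrumented
execution rather than a byte scan, but even this trivial level of
inspection is foreclosed once the binary only ever exists enciphered.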
--
Dave Emery N1PRE, die@die.com DIE Consulting, Weston, Mass.
PGP fingerprint = 2047/4D7B08D1 DE 6E E1 CC 1F 1D 96 E2 5D 27 BD B0 24 88 C3 18