[1850] in cryptography@c2.net mail archive
Re: Signature Certificates
daemon@ATHENA.MIT.EDU (John Kelsey)
Mon Nov 17 14:50:23 1997
From: "John Kelsey" <kelsey@plnet.net>
To: "Perry's Crypto List" <cryptography@c2.net>
Date: Mon, 17 Nov 1997 06:55:23 -0600
-----BEGIN PGP SIGNED MESSAGE-----
[ To: Perry's Crypto List ## Date: 11/17/97 ##
Subject: Re: Signature Certificates ]
>Date: Fri, 14 Nov 1997 14:36:13 -0600
>To: Larry Layten <larry@ljl.com>, cryptography@c2.net
>From: Rick Smith <smith@securecomputing.com>
>Subject: Re: Signature Certificates
>At 4:01 PM -0600 11/13/97, Larry Layten wrote:
>> ...
>>I really don't like the idea of allowing my signature to be used
>>for anything other than a security product that specifically
>>allows <me> to sign something. Hence, I really don't want
>>to give a general purpose communications routine or a Java
>>enabled browser to be using it without telling me each time
>>that I am signing something -- which makes it unusable
>>for authentication purposes. ????
>>
>>Is this where I need an attribute certificate that identifies
>>my PC, not me?
>My own opinion is that Yes, you will need separate
>certificates for separate purposes, and that "separate
>purposes" includes their use in environments with greater or
>lesser risk of abuse. Today, for example, one could use a
>different credit card for Internet purchases than you use
>for other purposes.
First, let me put in a nitpick: You need separate keys *and*
certificates for different functions. It's not enough to
have separate certificates, since if an attacker has your
contract-signing key, he probably doesn't also need your
contract-signing certificate in order to defraud you. I am
almost certain that you meant ``certificate and key'' where
you said ``certificate,'' but I think the distinction is
worth making.
Beyond the obvious reasons for this (if I write a virus to
take control of your desktop machine, I shouldn't be able to
sign your name to a contract for a million dollars), there
are also cut-and-paste attacks, and the chosen-protocol
attack, which Dave Wagner, Bruce Schneier and I published at
this year's Security Protocols Workshop. (There's a copy of
the paper available at http://www.counterpane.com ).
The basic idea behind our attack is this: Suppose you're
using a given cryptographic key for some high-value purpose,
such as signing multi-million dollar contracts. I can
always write a new ("chosen") protocol that carries out
some other legitimate purpose (such as verifying that you're
over 18 before granting you access to a website), but is
constructed so that if you run the chosen protocol with me
using the same key as your contract-signing protocol, I end
up with a signed contract from you, while you think you've
just proved to me that you were old enough to see the
Playboy website. This protocol doesn't need to be obviously
flawed (like giving me a signing oracle) to work. It's
possible to come up with design principles for protocols
that, if followed, appear to make this attack unworkable.
The best defense, however, is simply to use a different
keypair for each different function. (This is a very short
summary--see the paper for details.)
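To make the shape of the attack concrete, here is a toy sketch. Everything in it is illustrative, not the paper's construction: the ``signature'' is just a keyed hash standing in for a real signature scheme, and the key names and messages are made up. The only point it demonstrates is that when both protocols sign the same 32-byte digest with the same key, a response in the ``harmless'' protocol is bit-for-bit a valid signature in the high-value one--and that a separate key per function breaks the equivalence.

```python
import hashlib
import hmac

def toy_sign(key: bytes, digest: bytes) -> str:
    # Stand-in for a hash-then-sign scheme: both protocols ultimately
    # sign a 32-byte digest with the holder's key.
    return hmac.new(key, digest, hashlib.sha256).hexdigest()

victim_key = b"victim contract-signing key"  # hypothetical key material

# Protocol A (high value): sign the hash of a contract.
contract = b"I agree to pay Mallory one million dollars."
contract_sig = toy_sign(victim_key, hashlib.sha256(contract).digest())

# Protocol B (chosen by the attacker): "prove you're over 18 by signing
# this random-looking challenge."  The challenge is secretly the
# contract's hash, which looks like any other 32 random bytes.
challenge = hashlib.sha256(contract).digest()
age_proof = toy_sign(victim_key, challenge)

# The "age proof" is a valid contract signature.
assert age_proof == contract_sig

# The defense from the text: a different key per function.  The same
# chosen challenge no longer yields anything useful.
age_key = b"victim age-check key"  # hypothetical second key
safe_proof = toy_sign(age_key, challenge)
assert safe_proof != contract_sig
```

Note that protocol B never looks like a signing oracle: the victim sees only an opaque challenge, which is exactly why signing raw, caller-supplied digests with a multi-purpose key is dangerous.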
>But I think you're also touching on an emerging political
>issue: are programmers supposed to control users' desktops
>or are users supposed to control them?
This seems to come up a lot in copyright protection and GAK
systems, where the user's machine is expected to
more-or-less act as an agent of some third party, who
doesn't *want* the user to control, or even understand,
what's going on in all cases.
>I tend to look at successful security as a set of properties
>that give the user a consistent lack of surprise in
>important matters ("don't steal my credit card number, don't
>wipe out my copies of X-Files .gifs, don't put checks in my
>Quicken output queue, etc.") and, equally important,
>confidence that this lack of surprising events will
>continue. Our confidence *should* be shaken each time we
>introduce new software into our desktop. After all, that's
>what we learn from software "upgrades." Yet this is the
>brave new world promised us by Java and ActiveX.
Yes. I always wonder about this with those ``remote upgrade
over the internet'' utilities I keep hearing about. When my
computer changes its behavior over time, I want to know why!
Silently upgrading my word processor to the latest version
(just like the previous version, but with more bugs, and
requiring twice as much RAM) is a bad idea. And then there
are all the *wonderful* security vulnerabilities of such a
system--if one becomes widespread, it is literally the
fattest target in the world for virus writers, people
wanting to make political statements, companies wanting to
force upgrades from their competitors' products to their
own, etc. This strikes me as being way too far over on the
``convenience'' side of the convenience/security tradeoff.
Still, lots of people seem to like the tradeoff.
>Rick.smith@securecomputing.com
--John Kelsey, Counterpane Systems, kelsey@counterpane.com
PGP 2.6 fingerprint = 4FE2 F421 100F BB0A 03D1 FE06 A435 7E36
-----BEGIN PGP SIGNATURE-----
Version: 2.6.2
iQCVAwUBNHBalkHx57Ag8goBAQHlNQP9EqycM+P966+RQnV2/2e/mijNVnn1G3Lq
G3gIr9hPwkrb2JTxiepw/RsAHcgUVXNoIHSMTklzU2XoJtzEUPbMWzoOR7b8cCN3
HL+7qz1RU1cHEce67n/2pjF2YgrBsF6kElyEhnaRh5RbItT1S0OqPea29ESJwZfC
u14ZjyYYfHU=
=Ifkl
-----END PGP SIGNATURE-----