digital signatures : Java Glossary


digital signatures
A way of digitally signing a file or program to ensure it has not been tampered with and that the author is who he claims to be. Digital signatures support non-repudiation: a sender can’t later deny sending a message or writing a program that he actually did. A third party can’t forge a message. Digital signatures, however, don’t block snooping. That requires encryption.

Steven C. Den Beste graciously allowed me to take his newsgroup post on digital signatures and work it up into this essay.

This essay is an under-the-hood background piece on how digital signatures work. It is not a how-to on what tools to use to sign and verify files.

Digital signatures allow you to be confident a file or program came from the author claimed and that the file has not been tampered with. There are various schemes in widespread use: X.509 v3 certificates, Microsoft’s Authenticode ActiveX unforgeable signatures, Microsoft’s J++ cab files, PGP (Pretty Good Privacy) signing, RSA (Rivest, Shamir and Adleman), Lucent and Java’s signed Applets.

In Java, when an Applet is signed by an author you trust, you may relax some of the usual security constraints you would place on wild Applets you randomly encounter surfing the web. For example, you might allow a trusted Applet to write to your local hard disk. Just because a program is digitally signed is no guarantee it is bug-free or harmless. It is not rocket science to sign a file. See Daniel Griscom’s FAQ or Ted Landry’s FAQ on how to do it.

This essay will give you a broad overview of how digital signatures work, partly to give you some confidence in them and partly to show you why they are not 100% secure. It gets a fair way into the details of why they are secure. I suggest you read the sections in the Java & Internet Glossary on certificates, PGP and X.509 v3 before tackling this.

A digitally signed file or message may or may not be encrypted itself. The signature is just an appendage that cannot be forged. The appendage ensures the file itself has not been changed and that it came from whom it was claimed to be from.


An asymmetric cipher (also known as a public key cipher, or a trapdoor cipher) can be used for digital signing if it has the following characteristics: There are two algorithms (functions) p(x) (the public cipher) and s(x) (the secret cipher) based on keys P and S such that:
  1. Given an arbitrary bitstream x, p(s(x))==x and s(p(x))==x.

    That is just the way mathematicians say that you must have a way to encrypt and decrypt to get precisely back to where you started. You must both be able to encrypt with a secret key and decrypt with a public one and also to encrypt with a public key and decrypt with a secret one.

  2. Knowing the public key P does not permit you to derive the secret key S in any reasonable way. However, knowing S permits you to easily derive P.
These are based on certain mathematical problems which are easy to do one way but hard to do the other way (known as trapdoor problems). The most famous is the prime/composite trapdoor.

It is pretty simple with modern computers to find two arbitrary and very large prime numbers. It is trivially easy to multiply them together yielding a composite number. It is grossly difficult to take the resulting huge composite number and factor it, yielding the two original prime numbers. Thus the public key P corresponds to the composite number and the secret key S corresponds to the two primes. In fact, one of RSA’s patented ciphers is based on exactly this and it’s the one used for signature verification.
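The asymmetry described above is easy to see with java.math.BigInteger. This is a minimal sketch, not part of any real signing scheme; the 512-bit size and class name are arbitrary choices for illustration:

```java
import java.math.BigInteger;
import java.security.SecureRandom;

public class TrapdoorDemo {
    public static void main(String[] args) {
        SecureRandom rnd = new SecureRandom();
        // Finding two large primes is fast with modern computers.
        BigInteger p = BigInteger.probablePrime(512, rnd); // roughly 155 decimal digits
        BigInteger q = BigInteger.probablePrime(512, rnd);
        // Multiplying them together is trivially easy.
        BigInteger n = p.multiply(q);
        // With one factor in hand, recovering the other is instant division…
        System.out.println(n.divide(p).equals(q)); // prints true
        // …but recovering p and q from n alone means factoring a
        // 1024-bit composite, which is the grossly difficult direction.
    }
}
```

Division here plays the role of knowing the secret: anyone holding p gets q instantly, while an outsider holding only n faces the full factoring problem.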

Certifying Authority

The signature algorithm requires a certifying authority, whose public key is PC and whose secret key is SC. You can purchase object-signing certificates from certifying authorities listed under certificates in the Java & Internet Glossary.

They come in various flavours and costs, depending on how much checking the authority does on you and what you intend to use the certificate for, e.g. signing files for Internet Explorer, Solaris, email, or posting to newsgroups. The public key PC has to be built into the browser/whatever when you receive it. This is the point of biggest risk, but only to the degree that, like any native program, you have to be careful where you get it. If I download the browser from Microsoft’s website, the risk is very low that I’m actually getting a phony version with a bogus path and key for the certification authority. It would require someone to get into the DNS (Domain Name Service) database (the database used to convert web domain names to dotted-quad numerical addresses) and redirect accesses intended for Microsoft to some other website that looked enough like Microsoft’s to fool you, and for it never to be discovered and publicized afterwards. I consider that risk to be vanishingly small. Redirection at the DNS is conceivable (it’s happened), but I don’t believe it would go unnoticed. The guy who did this is currently being prosecuted.

When an Applet is created, the vendor creates a secret key SV and a public key PV. A large checksum c (in practice a 128-bit or larger cryptographic message digest; a plain CRC (Cyclic Redundancy Check) would be too easy to forge) gets calculated for the Applet and it is then enciphered using the secret cipher sv(c), yielding a bit stream c' which anyone can decipher using pv(c') to yield the checksum c again. This is only true if they have the ability to get the correct public key PV and know it’s right.
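The JDK’s java.security.Signature class bundles the two steps (compute the digest c, encipher it with SV) into one call. This is a sketch of the vendor-side and verifier-side operations, with a made-up byte array standing in for the Applet; the class and method names are mine:

```java
import java.security.*;

public class SignDemo {
    // Vendor side: hash the data and encipher the digest with SV, yielding c'
    static byte[] sign(byte[] data, PrivateKey sv) throws GeneralSecurityException {
        Signature s = Signature.getInstance("SHA256withRSA");
        s.initSign(sv);
        s.update(data);
        return s.sign();
    }

    // Anyone's side: recompute the digest and check it against pv(c')
    static boolean verify(byte[] data, byte[] sig, PublicKey pv) throws GeneralSecurityException {
        Signature v = Signature.getInstance("SHA256withRSA");
        v.initVerify(pv);
        v.update(data);
        return v.verify(sig);
    }

    public static void main(String[] args) throws GeneralSecurityException {
        KeyPairGenerator kpg = KeyPairGenerator.getInstance("RSA");
        kpg.initialize(2048);
        KeyPair vendor = kpg.generateKeyPair(); // the vendor's (SV, PV)

        byte[] applet = "the applet bytes".getBytes(); // stand-in for the real file
        byte[] cPrime = sign(applet, vendor.getPrivate());
        System.out.println(verify(applet, cPrime, vendor.getPublic())); // prints true
    }
}
```

Note that verification only proves anything if you obtained PV by a trusted route, which is exactly the gap the certification authority fills below.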

How a Digital Signature Lets You Run an Applet

That’s where the certification authority comes in. The exact sequence of events goes as follows:
  1. The user’s browser downloads the Applet, including the enciphered checksum c' and a vendor ident value.
  2. The user’s browser contacts a certification authority for which PC (the certification authority’s public key) is known. The user’s browser requests a copy of the info packet for that vendor ident. In a less secure system, this step may be bypassed and the factory-supplied information in the browser trusted as up to date.
  3. The certification authority sends the info packet back, enciphered using sc().
  4. The user’s browser deciphers the info packet using pc() and thus knows that the info packet was not forged. All information in it is as trusted as the certification authority is.
  5. Part of the info packet is the true name of the vendor, who registered with the certification authority. Part is his public key PV.
  6. The user’s browser calculates the same checksum that the vendor was supposed to calculate.
  7. The user’s browser deciphers the enciphered checksum which was included with the file, using the PV obtained from the certification authority.
  8. If they do not match, the file is corrupt and will not be run.
  9. If they do match, the user is prompted with the true identity of the vendor (as sent by the certification authority) and asked whether the vendor is trusted.
  10. If the user says yes, then the app is run.
Now it’s not so much that this system can’t be beaten, as that it can’t be beaten for long, surreptitiously. Anyone who manages to break this (and it’s not easy) will be found and punished. (It’s a violation of federal law to write and distribute deliberately harmful software. [No, bugs in Microsoft code don’t count.] Many other countries have similar laws.)

Passwords: Establishing Identity

The problem with passwords is they must be passed back and forth. They are susceptible to snooping. Trapdoor encryption allows you to prove you know a secret, without actually revealing that secret. You don’t have to actually transmit the password to prove you are who you say you are. Here is how it works: Party A sends a random challenge string to party B. Party B encrypts the string with its private key. Party B sends it back to party A. Party A decodes it with B’s public non-secret key. If the decrypted string is the same as originally sent, then B has established their identity. Only a holder of B’s private key could perform that magic feat. Even if evil party C snooped on the exchange, they can’t use what they discovered to spoof being party B in future, because the next challenge will be a different random string.
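In modern JDK terms the "encrypt with the private key" step is done with a signature, which amounts to the same proof. A sketch of the exchange, with names of my own choosing:

```java
import java.security.*;

public class ChallengeResponse {
    // B's side: prove identity by signing A's random challenge with B's private key
    static byte[] respond(byte[] challenge, PrivateKey bPrivate) throws GeneralSecurityException {
        Signature s = Signature.getInstance("SHA256withRSA");
        s.initSign(bPrivate);
        s.update(challenge);
        return s.sign();
    }

    // A's side: check the response against the challenge using B's public key
    static boolean check(byte[] challenge, byte[] response, PublicKey bPublic) throws GeneralSecurityException {
        Signature v = Signature.getInstance("SHA256withRSA");
        v.initVerify(bPublic);
        v.update(challenge);
        return v.verify(response);
    }

    public static void main(String[] args) throws GeneralSecurityException {
        KeyPairGenerator g = KeyPairGenerator.getInstance("RSA");
        g.initialize(2048);
        KeyPair b = g.generateKeyPair(); // party B's key pair

        byte[] challenge = new byte[32];
        new SecureRandom().nextBytes(challenge); // A's fresh random challenge
        byte[] response = respond(challenge, b.getPrivate());
        System.out.println(check(challenge, response, b.getPublic())); // prints true
        // A snooped (challenge, response) pair is useless to an eavesdropper,
        // because the next session starts with a different random challenge.
    }
}
```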


What if two parties on the Internet want to have a private conversation, but they have never talked before? They can exchange public keys and use those public keys to encrypt data for the other. Only the holder of the corresponding private key can decrypt the messages. Unfortunately, this sort of encryption is extremely slow. So it is used only to exchange secret keys for a faster encryption technique that takes over for the remainder of the session. The most common way of doing this is called Diffie-Hellman. Diffie-Hellman is susceptible to a man-in-the-middle attack, but not to plain snooping.
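The JDK exposes Diffie-Hellman through javax.crypto.KeyAgreement. A minimal sketch of two parties arriving at the same shared secret (party names are mine; note Bob must generate his pair over Alice’s parameters):

```java
import javax.crypto.KeyAgreement;
import javax.crypto.interfaces.DHPublicKey;
import java.security.*;
import java.util.Arrays;

public class DhDemo {
    public static void main(String[] args) throws GeneralSecurityException {
        // Alice generates DH parameters and her key pair
        KeyPairGenerator ga = KeyPairGenerator.getInstance("DH");
        ga.initialize(2048);
        KeyPair alice = ga.generateKeyPair();

        // Bob generates his pair over the same public parameters
        KeyPairGenerator gb = KeyPairGenerator.getInstance("DH");
        gb.initialize(((DHPublicKey) alice.getPublic()).getParams());
        KeyPair bob = gb.generateKeyPair();

        // Each side combines its own private key with the other's public key
        KeyAgreement ka = KeyAgreement.getInstance("DH");
        ka.init(alice.getPrivate());
        ka.doPhase(bob.getPublic(), true);
        byte[] aliceSecret = ka.generateSecret();

        KeyAgreement kb = KeyAgreement.getInstance("DH");
        kb.init(bob.getPrivate());
        kb.doPhase(alice.getPublic(), true);
        byte[] bobSecret = kb.generateSecret();

        // Both arrive at the same secret, which can then seed a fast
        // symmetric cipher; a snooper who saw only the public keys cannot.
        System.out.println(Arrays.equals(aliceSecret, bobSecret)); // prints true
    }
}
```

Nothing here authenticates either party, which is precisely why plain Diffie-Hellman falls to a man in the middle who runs one exchange with each side.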

Certification Authority’s Security Measures

Certification authorities take their jobs seriously and they are careful about what they do. So, for instance, they won’t let you register with a name which might easily be mistaken for someone else’s. (So you can forget about registering as Micorsoft; it ain’t gonna happen.) There are various levels of checking the signing authority may do to prove the author really exists, the lowest (and cheapest) being that they check whether they can send and receive email from him, the next that his credit card charge does not bounce.

If you register and then deliberately distribute a harmful program, it won’t take long for someone to connect their grief with your app. Word will get back to the certifying authority who will do two things:

  1. They’ll instantly stop certifying you, which means that thereafter anyone trying to load your app will fail. (In the process above, step 4 becomes a Nyet response and the process stops.)
  2. They’ll try to verify the failure and then will contact the proper authorities. And you won’t be able to plead innocent, either, because it will be possible to prove mathematically that the harmful feature was in your app when it left your hands!

Hacking To Defeat the System

Well, you could hack the app and then change the ident on it to your own. In that case either you leave the checksum alone (in which case the comparison in step 8 above will fail and it won’t load) or you recalculate the checksum using your own private key (in which case you can be traced — remember that it will be your ident which gets sent to the certification authority and your name which will be shown to the user in step 10).

So how about stealthily making your harmful app look like it came from someone else? In other words, change the app, recalculate the checksum but set the ident to someone else’s. Well, that requires that you have their SV, the secret key of the vendor you’re trying to mimic. All you have to do is break his cipher. That’s all. Easy, right?

Huh-uh. Lots of work and maybe not practical in any reasonable time period. When RSA originally issued the prime/composite cipher, they published a message in it and offered a prize for anyone who could crack it.

Someone eventually did, in one of those swarm-of-hobbyists-working-across-the-net efforts. But that particular cipher only used prime numbers with fewer than 70 digits. The problem is a lot bigger when the primes are 130 or 150 digits. (Remember that the problem is not twice as big with 140-digit primes; it’s something like 10^70 times bigger.) It would be essentially impossible to use this kind of approach without word getting out and your success (assuming you succeeded) would be very fragile because the vendor you’re trying to mimic could choose a new key and recertify in a day or two. Why spend years rounding up support to crack a cipher which can be changed in 24 hours?

In fact, espionage is the only really practical solution if you’re bound to create mischief. It’s a lot easier to steal the key than to get it by brute force. And espionage is definitely not easy and again the success would be very fragile since it would be so easy to change the key.

Keep in mind that a file being signed, by itself, proves nothing (e.g. the PGP trailer you see on some newsgroup posts). Some evil person could get the original file, modify it, re-checksum it and re-sign it with a different digital signature and pass it off as the original. The recipient would be none the wiser unless he checked the digital signature against the one expected from that author. The recipient either must have requested that expected key directly from the author in some secure way, or from the certifying authority over the insecure Internet, using the certifying authority’s digital signature to prevent tampering.

The Catch

The big problem with a digital signature is someone can steal a copy of your private key and you would never know it. It is not like stealing a passport, where you notice it is missing. They can then forge your signature.

Digital Signatures and Postage Stamps has a scheme to let you print your own postage using an ordinary laser printer. It uses a digital signature scheme to prevent forgery. When you buy postage, the government gives you some digitally signed documents you can then print as two-dimensional bar codes (called an indicium) as part of the postage stamp. Since only the government knows the private key, they are the only ones who can make up valid patterns. I don’t know what they do to prevent you from making several copies of the same stamp and improperly reusing it.

cacerts vs. .keystore

  1. You cannot sign with the certificates in cacerts because they are not yours and because they do not contain private keys. You can, however, use them to validate certificates issued by certificate authorities.
  2. You can sign with your certificates in .keystore. You may not export their private keys, just the public ones.
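You can see the difference programmatically with java.security.KeyStore. This sketch lists the cacerts entries and shows none of them is a key entry; the file location is an assumption that fits modern JDKs (older ones keep it under jre/lib/security):

```java
import java.io.FileInputStream;
import java.security.KeyStore;
import java.util.Collections;

public class ListCacerts {
    public static void main(String[] args) throws Exception {
        // Assumed location for a modern JDK install; adjust for yours
        String path = System.getProperty("java.home") + "/lib/security/cacerts";
        KeyStore ks = KeyStore.getInstance(KeyStore.getDefaultType());
        try (FileInputStream in = new FileInputStream(path)) {
            ks.load(in, null); // null password: read certificates without an integrity check
        }
        for (String alias : Collections.list(ks.aliases())) {
            // Every cacerts entry is a trusted certificate; none holds a private key,
            // which is exactly why you cannot sign with them.
            System.out.println(alias + " keyEntry=" + ks.isKeyEntry(alias));
        }
    }
}
```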