dark room : Java Glossary


dark room
This is my idea, aimed primarily at preventing software piracy and viruses. It is also useful in other encryption applications such as private email, file privacy and migrating desktop applications. Unfortunately, it requires silicon, and not just any silicon: it must be built right into the CPU (Central Processing Unit) chip. It will eventually happen; it is only a matter of time. Here is how it works.

Every CPU has a totally secret unique private key. Nobody, not Intel, not Microsoft, not even the owner of the CPU, knows the secret key. There is no way to get the CPU to divulge it. However, the CPU will happily divulge the corresponding public key to pretty well anyone who asks for it.
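
To make the key asymmetry concrete, here is a minimal Java sketch. It is only a software stand-in: in the real scheme the key pair would be burned into the silicon and the private half would be physically unreadable. The class name and the choice of 2048-bit RSA are my illustrative assumptions, not part of the proposal.

```java
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.NoSuchAlgorithmException;
import java.security.PublicKey;

/** Hypothetical software stand-in for a CPU’s burned-in key pair. */
public class CpuKeys
    {
    private final KeyPair pair;  // the private half is never exposed

    public CpuKeys() throws NoSuchAlgorithmException
        {
        KeyPairGenerator g = KeyPairGenerator.getInstance( "RSA" );
        g.initialize( 2048 );
        pair = g.generateKeyPair();
        }

    /** The CPU happily divulges its public key to anyone who asks. */
    public PublicKey getPublicKey()
        {
        return pair.getPublic();
        }
    // Deliberately no getPrivateKey(): in the real scheme the private
    // key never leaves the silicon.
    }
```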

People or software vendors can encrypt messages (e.g. hunks of programs) using the widely known public key. They do this encrypting on a server, so that pirates never get to see the code before it is encrypted. As you will see later, they never get to see the code after it is decrypted either. Only that particular targeted CPU can decrypt the messages.

Further, the CPU decrypts them in a dark room. This is a little piece of RAM (Random Access Memory) on the CPU chip itself. When you load the dark room with data or instructions, the CPU automatically decrypts it and leaves the decrypted form in the dark room. However, you can’t peek at the decrypted results. They stay only inside the dark room.

The CPU can execute instructions in the dark room, starting only at the beginning of the dark room code. Those instructions may deliberately export data from the dark room, but other than that, there is no way data can leak out of the dark room to the outside world.

How then is this tool used? Most commonly the software vendor loads a small encrypted DES or Blowfish key into the dark room and exports the decrypted symmetric key. That key is then used to unlock the full software package. Public/private keys are quite slow in comparison with symmetric keys.
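
That hybrid step can be sketched with the JDK’s standard crypto classes. Everything here is an assumption for illustration: a freshly generated RSA pair stands in for the CPU’s burned-in key, and AES stands in for the DES or Blowfish key mentioned above.

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.util.Arrays;

public class DarkRoomUnwrap
    {
    /**
     * The vendor wraps a symmetric key with the CPU’s public key;
     * the dark room unwraps it with the private key and exports it.
     * Returns true if the round trip recovers the identical key.
     */
    public static boolean roundTrip() throws Exception
        {
        // Software stand-in for the CPU’s burned-in key pair.
        KeyPairGenerator g = KeyPairGenerator.getInstance( "RSA" );
        g.initialize( 2048 );
        KeyPair cpu = g.generateKeyPair();

        // Vendor side: generate a fast symmetric key and wrap it
        // with the widely known public key.
        SecretKey symmetric = KeyGenerator.getInstance( "AES" ).generateKey();
        Cipher wrapper = Cipher.getInstance( "RSA" );
        wrapper.init( Cipher.WRAP_MODE, cpu.getPublic() );
        byte[] wrapped = wrapper.wrap( symmetric );

        // Dark-room side: only the private key can unwrap it.
        Cipher unwrapper = Cipher.getInstance( "RSA" );
        unwrapper.init( Cipher.UNWRAP_MODE, cpu.getPrivate() );
        SecretKey recovered =
            ( SecretKey ) unwrapper.unwrap( wrapped, "AES", Cipher.SECRET_KEY );

        return Arrays.equals( symmetric.getEncoded(), recovered.getEncoded() );
        }
    }
```

The symmetric key then decrypts the bulk of the package at full speed; the slow public-key operation touches only a few bytes.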

When you install a software package, a small encrypted piece of it is tuned to run only on your CPU. The decrypted forms of those crucial pieces never leave the dark room, where pirates could otherwise examine them. Further, the dark room code could time-limit when you could run the software. Your license may limit you to one month’s use, or to running the software only outside office hours, etc.
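
The checks themselves are trivial once they can run where pirates cannot patch them out. A hypothetical sketch of what the dark-room code might enforce; the method names and the 9-to-5 office-hours boundary are my assumptions:

```java
import java.time.LocalDate;
import java.time.LocalTime;

/** Hypothetical license rules the dark-room code could enforce. */
public class LicenseCheck
    {
    /** One-month-style limit: valid through the expiry date. */
    public static boolean withinTerm( LocalDate today, LocalDate expiry )
        {
        return ! today.isAfter( expiry );
        }

    /** Off-office-hours limit: allowed only before 09:00 or after 17:00. */
    public static boolean offOfficeHours( LocalTime now )
        {
        return now.isBefore( LocalTime.of( 9, 0 ) )
            || now.isAfter( LocalTime.of( 17, 0 ) );
        }
    }
```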

If you move the software to a new machine, the server must generate you a new unlocking piece of code, encrypted with the public key of the new CPU. I am presuming 24-hour Internet access will become common, so this will not be perceived as an inconvenience.

The software vendor can thus control precisely when and where his software is run. It is still possible for a hacker to replace the encrypted piece of code with code he generates out of thin air. To combat this, software vendors need to install updates automatically on a weekly basis, even if those updates make almost no change at all. If updates have an expiry date burned into them in a million places, the hacked code will stop working after a week anyway. With frequent automatic updates it becomes possible to quickly patch any security hole the pirates are breaching.

You might try to get away with a cheapie dark room in the form of a CPU instruction that decrypts a block of public RAM using a microcoded routine and the private key. However, the decrypted results are then out in the open, easy pickings for pirates. With the dark room, pirates never get to see what the decrypted code or data looks like. You can completely hide any truly clever, crucial or tricky code inside the dark room.

How big should this dark room be? Ideally, as big as possible. I was thinking along the lines of 64K to 32 MB. That is not much chip real estate. This is just ordinary DRAM (Dynamic RAM).

Ideally there would be a dark room mode, where the CPU prefetched 1024-byte chunks in parallel from RAM into its SRAM (Static Random Access Memory) cache, decrypting each block on the fly using dedicated parallel hardware. The entire on-chip SRAM cache then acts as a dark room. This way, code written in dark room mode could be arbitrarily large, even the entire program and all its data! The approach would not be secure unless the CPU fetched and decrypted sufficiently large chunks of code at a pop. Tim Tyler came up with this idea. You would also want automatic re-encryption of modified blocks. You might want a Harvard-style architecture to cleanly separate code/data/encrypted/unencrypted.

To go along with this, you need a computer architecture where each program runs in its own secure compartment where no other program can possibly hassle it. The compartment includes not only RAM, but also hard disk space.

The CPU’s private key has another use: digital signing. When your CPU digitally signs email, programs, posts or legal documents with its private key, no other computer but yours could have done it. This makes it extremely difficult for someone to forge your emails or posts.
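
Java’s standard Signature class shows the shape of this. As before, a freshly generated RSA pair stands in for the CPU’s burned-in keys, and SHA256withRSA is my choice of algorithm for the sketch, not part of the proposal.

```java
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.Signature;

public class CpuSign
    {
    /** Sign with the private key, then verify with the public key. */
    public static boolean signAndVerify( byte[] message ) throws Exception
        {
        // Software stand-in for the CPU’s key pair.
        KeyPairGenerator g = KeyPairGenerator.getInstance( "RSA" );
        g.initialize( 2048 );
        KeyPair cpu = g.generateKeyPair();

        // Signing uses the private key only the CPU holds.
        Signature signer = Signature.getInstance( "SHA256withRSA" );
        signer.initSign( cpu.getPrivate() );
        signer.update( message );
        byte[] sig = signer.sign();

        // Anyone can verify with the freely divulged public key.
        Signature verifier = Signature.getInstance( "SHA256withRSA" );
        verifier.initVerify( cpu.getPublic() );
        verifier.update( message );
        return verifier.verify( sig );
        }
    }
```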

How do you defend against a pirate who builds a phony dark room in software, one where the private key is known and the entire contents of the dark room can be examined? Here are five defenses:

  1. Speed differential

    A virtual dark room would presumably be much slower than one implemented in hardware. The software vendor could, as part of determining the CPU’s public key, determine if the dark room was able to solve a puzzle sufficiently quickly to have been done in hardware.
  2. Frequent updates

    The pirate could never find all the time bombs in the code that stop it from working, or from working properly, when it is not updated frequently.
  3. Single step foiling

    Code behaves differently when single-stepped than when run at full speed.
  4. Watermarking

    Hide telltale signs in the code so that, if it is pirated, you can trace it back to the CPU public key to which the software was issued.
  5. Factory key escrow

    The CPU could have two private/public key pairs. All CPUs (Central Processing Units) of a given model would also share a common published public key. You could prove the CPU/dark room were genuine by challenging the CPU to decrypt a challenge phrase encrypted with that public key. A clever pirate could perhaps gang together a genuine CPU and a virtual one, and squeak by a test that was not cleverly enough constructed. If the pirate fails the test even once while debugging his virtual dark room, the vendor is alerted to his presence. The factory would not track any private keys. The individual public keys would act as serial numbers, which could be used to track stolen CPUs.
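
The challenge in defense 5 is an ordinary public-key challenge-response. A sketch, again with a software RSA pair standing in for the real silicon keys:

```java
import javax.crypto.Cipher;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.SecureRandom;
import java.util.Arrays;

public class GenuineCheck
    {
    /**
     * The vendor encrypts a random challenge with the published public
     * key; only a CPU holding the matching private key can echo it back.
     */
    public static boolean passesChallenge() throws Exception
        {
        // Software stand-in for the genuine CPU’s key pair.
        KeyPairGenerator g = KeyPairGenerator.getInstance( "RSA" );
        g.initialize( 2048 );
        KeyPair cpu = g.generateKeyPair();

        byte[] challenge = new byte[ 16 ];
        new SecureRandom().nextBytes( challenge );

        // Vendor side: seal the challenge with the public key.
        Cipher enc = Cipher.getInstance( "RSA" );
        enc.init( Cipher.ENCRYPT_MODE, cpu.getPublic() );
        byte[] sealed = enc.doFinal( challenge );

        // Genuine-CPU side: decrypt and echo the challenge back.
        Cipher dec = Cipher.getInstance( "RSA" );
        dec.init( Cipher.DECRYPT_MODE, cpu.getPrivate() );
        byte[] echoed = dec.doFinal( sealed );

        return Arrays.equals( challenge, echoed );
        }
    }
```

A pirate’s virtual dark room, lacking the private key, could not produce the echoed bytes, and a fresh random challenge each time defeats replay.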
Moving from software purchase to rental, just in itself, would take a big bite out of piracy, even without dark rooms to enforce the agreements.

Microsoft and Intel may have read my essay, or perhaps the ideas were in the air. Everyone is quite alarmed simply because it is Microsoft pushing similar ideas, under the name Palladium, and Microsoft has a history of being up to no good; therefore, the thinking goes, anything they propose must be evil. The paranoia should instead be expended in making sure such standards are open and that Microsoft has no special keys or policing powers. Just because a burglar points out the need for locks is not an argument against locks. It is a reminder to take safeguards to make sure burglars don’t get the master keys to all the locks. Microsoft dropped the name Palladium for the term next-generation secure computing base, hoping to shake the big-brother image.

Vendors of software should have the right to lock it against unauthorised use. This will bring down prices for legitimate users. Users should have the right to lock spammers and hackers out of their machines. Users have the right to demand that their mail be delivered reliably and without anyone snooping on it. Programmers should have the right to lock their software against tampering by rival companies.

These things can’t happen without dark-room sorts of encryption technology becoming the default.
