“A nearly impenetrable thicket of geekitude…”

November 2006

Federations 101

Posted on November 30, 2006 at 18:10

In more UK Federation-related news, I’ve been invited to give a short presentation next week as part of a panel session at the Fall 2006 Internet2 Members Meeting in Chicago.

I’ve been asked to keep the impenetrable geekitude down to non-toxic levels by sticking to a description of policy issues rather than implementation and technology. You can get the other stuff from me pretty much any time.


UK Federation Launched

Posted on November 30, 2006 at 17:50

Today was the official launch of the UK Federation, or the UK Access Management Federation for Education and Research to give it its Sunday name. This is a huge deal for everyone involved, myself included: some people have been working towards this point since around 2000 (I’m a relative newcomer, having only put a couple of years into it so far).

In the longer term, this will be a fairly important system for many more people: after all, the UK Federation is a federated identity framework for the whole of the UK education and research sectors, which I’m told involve perhaps 18 million people. If we do our job well over the next few years, though, the best case is that, like all good infrastructure, it will just sink below the point where people even notice it. That’s a hard job, and we’ve only just started on it.

Second Life Goo and Dancing Pigs

Posted on November 22, 2006 at 10:21

The virtual world of Second Life has recently been suffering from a series of attacks involving what has been referred to as “grey goo”, a term which is a direct reference to the scenario of uncontrolled exponential growth in nanotech replicators. The result of a grey goo attack is that the world fills up with junk that prevents anyone from getting anything done.

I haven’t covered this before because it is well known, to the point of infuriation, to most people connected with Second Life. What’s been more interesting recently is that people outside that community have started picking up issues like this from Second Life, particularly people more commonly associated with security in general. For example, Ed Felten wrote a couple of articles recently about the “copybot”, which allows you to make a copy of anything you can see in-world without paying for it (with some limitations, which aren’t relevant to this discussion). Professor Felten is perhaps best known for his work on the SDMI challenge, US v. Microsoft, and, more recently, the (in-)security of electronic voting machines.

Directly on point to the grey goo attacks is Eric Rescorla’s Rescorla-goo; again, this is a bit off what most people would think of as Eric’s normal beat.

But that’s my point: if you’re involved, however peripherally, in security systems, you walk into something like Second Life and see a lot of problems waiting to happen; as Ed Felten puts it, these are really issues “from the It-Was-Only-a-Matter-of-Time file”. New systems should be learning from the mistakes of the past, not blundering through a series of unworkable solutions until they arrive at something that holds up, at least until the next bad guy comes along. Unfortunately, that doesn’t seem to be how the world operates. Ed Felten has another appropriate quote for this: “Given a choice between dancing pigs and security, users will pick dancing pigs every time.”

If you’re interested in a bit more comment about the grey goo problem per se, I attach the comment I added to Eric Rescorla’s article below.

RUSE MET LORD CURT REEL ION

Posted on November 21, 2006 at 22:38

I learned the difference between haphazard and random a long time ago, on a university statistics course. Since then, I’ve been wary of inventing passwords by just “thinking random” or using an obfuscation algorithm on something memorable (“replace Es by 3s, replace Ls by 7s”, or whatever). The concern is that there is really no way to know how much entropy there is in such a token (in the information theoretic sense), and it is probably less than you might think. People tend to guess high when asked how much entropy there is in something; most are surprised to hear that English text is down around one bit per letter, depending on the context.
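
To put rough numbers on that (my own illustration, not from the original post): symbols chosen uniformly at random carry log2(alphabet size) bits each, so eight random hex digits are worth far more than eight letters of prose:

public class EntropyEstimate {

    // Bits of entropy in 'length' symbols drawn uniformly at random
    // from an alphabet of the given size.
    static double bits(int alphabetSize, int length) {
        return length * (Math.log(alphabetSize) / Math.log(2));
    }

    public static void main(String[] args) {
        System.out.printf("8 uniformly random hex digits: %.1f bits%n", bits(16, 8));
        System.out.printf("8 uniformly random lowercase letters: %.1f bits%n", bits(26, 8));
        System.out.println("8 letters of English prose: around 8 bits (rule of thumb)");
    }
}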

If you know how much information entropy there is in your password, you have a good idea of how much work it would take for an attacker to guess it by brute force: N bits of entropy means they have to try up to 2^N possibilities. One way to get a password with a known amount of entropy, which I’ve used for several years, is to take a fixed amount of real randomness and express it in hexadecimal. For example, I might say this to get a password with 32 bits (4 bytes) of entropy:

$ dd if=/dev/random bs=1 count=4 | od -t x1
...
0000000    14  37  a8  37

A password like 1437a837 is probably at the edge of memorability for most people, but I know that it has 32 bits worth of strength to it. So, what is one to do if there is a need for a stronger password, say one containing 64 bits of entropy? Certainly d4850aca371ce23c isn’t the answer for most of us.
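
For scale, 32 bits means roughly 4.3 billion brute-force guesses, while 64 bits pushes that to 2^64 (about 1.8 × 10^19). If dd and od aren’t to hand, the same fixed-amount-of-randomness trick can be sketched in Java; this is my own illustration (class name and details assumed, not from the original post), using SecureRandom to draw the full 8 bytes:

import java.security.SecureRandom;

public class HexPassword {
    public static void main(String[] args) {
        byte[] raw = new byte[8];                  // 8 bytes = 64 bits of entropy
        new SecureRandom().nextBytes(raw);         // randomness from the operating system
        StringBuilder hex = new StringBuilder();
        for (byte b : raw) {
            hex.append(String.format("%02x", b));  // two hex digits per byte
        }
        System.out.println(hex);
    }
}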

When I was faced with a need to generate a higher entropy — but memorable — password recently, I remembered a technique used by some of the one-time password systems and described in RFC 2289. This uses a dictionary of 2048 (2^11) short English words to represent fragments of a 64-bit random number; six such words suffice to represent the whole 64-bit string with two bits left over for a checksum. In this scheme, our unmemorable d4850aca371ce23c becomes:

RUSE MET LORD CURT REEL ION
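
To make the layout concrete, here is a minimal sketch of the bit-splitting just described (the class and method names are mine and purely illustrative; the 2048-word dictionary is omitted, so it prints the six 11-bit indices you would look up in the RFC’s word list rather than the words themselves):

public class SixWordLayout {

    // RFC 2289 checksum: sum the 32 bit-pairs of the key, keep the low 2 bits.
    static int checksum(long key) {
        int sum = 0;
        for (int i = 0; i < 64; i += 2) {
            sum += (int) ((key >>> i) & 0x3);
        }
        return sum & 0x3;
    }

    // Split key-plus-checksum (66 bits, most significant bits first)
    // into six 11-bit dictionary indices.
    static int[] wordIndices(long key) {
        int[] indices = new int[6];
        for (int i = 0; i < 5; i++) {
            indices[i] = (int) ((key >>> (64 - 11 * (i + 1))) & 0x7FF);
        }
        // The last group is the final 9 key bits followed by the 2 checksum bits.
        indices[5] = (int) (((key & 0x1FF) << 2) | checksum(key));
        return indices;
    }

    public static void main(String[] args) {
        long key = Long.parseUnsignedLong("d4850aca371ce23c", 16);
        for (int index : wordIndices(key)) {
            System.out.print(index + " ");  // each index selects one short word
        }
        System.out.println();
    }
}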

I couldn’t find any code that allowed me to go from the hexadecimal representation of a random bit string to something based on RFC 2289, so I wrote my own. You can download SixWord.java if you’d like to see what I ended up with, or if you need something like this yourself.

The code is dominated by an array holding the RFC 2289 dictionary of 2048 short words, and another array holding the 27 test vectors given in the RFC. When run, the program first checks itself against the test vectors, then prompts for a hex string. You can use spaces in the input if you’re pasting something you got out of od, for example. The result should be a six-word phrase you might have a chance of remembering; if you put 64 bits worth of randomness in, you know that the phrase is just as strong a password as the hex gibberish was.
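
The input handling is simple enough to sketch separately (again my own illustration, not the actual SixWord.java code): strip the whitespace od inserts, then parse what remains as a 64-bit hex value:

import java.util.Scanner;

public class HexPrompt {
    public static void main(String[] args) {
        System.out.print("hex: ");
        String line = new Scanner(System.in).nextLine();
        String hex = line.replaceAll("\\s+", "");    // tolerate od-style spacing
        long key = Long.parseUnsignedLong(hex, 16);  // up to 16 hex digits
        System.out.printf("parsed: %016x%n", key);
    }
}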

Moving a FC5 Root Partition to LVM

Posted on November 19, 2006 at 20:10

One of my servers is pretty old, dating back to the dawn of recorded history, or Red Hat 7 as we now call it. Most of its storage was migrated onto a new hard disk, and under the Logical Volume Manager, some time ago. Finally removing the original hard drive requires a few tricky operations: moving the root partition, moving the boot partition, and moving the boot loader.

These operations are turning out to be tricky enough to make it worth writing down the procedure in case I ever need to use it again. So, here’s an aide-mémoire on moving a Fedora Core 5 root partition onto LVM.

PGP/GPG Keys

Posted on November 13, 2006 at 12:47

I generated my first PGP RSA keypair way back in 1993. Some friends and I played around with PGP for e-mail for a while, but at the time few people knew about encryption and even fewer cared: the “no-one would want to read my mail” attitude meant that convincing people they should get their heads round all of this was a pretty hard sell. The fact that the software of the day was about as user-friendly as a cornered wolverine didn’t help either.

The PGP software had moved forward a fair bit both technically and in terms of usability (up to “cornered rat”) by 2002, when I generated my current DSS keypair. By this time, it was pretty common to see things like security advisories signed using PGP, but only the geekiest of the geeks bothered with e-mail encryption.

Here we are in 2006: I still use this technology primarily to check signatures on things like e-mailed security advisories (I use Thunderbird and Enigmail), but I’ve finally found a need to use my own key, and it isn’t for e-mail.

Over the years, PGP (now standardised as OpenPGP) has become the main way of signing open source packages so that downloaders have a cryptographic level of assurance that the package they download was built by someone they trust. Of course, the majority of people still don’t check these signatures, but systems like RPM often do so on their behalf behind the scenes.

I’ve agreed to take on some limited package build responsibilities for such a project recently, so I’ve installed the latest versions of everything and updated my about page so that people can get copies of my public keys. Of course, there is no particular reason anyone should trust those keys; this is where the web of trust is supposed to come in, by allowing someone to build a path to my keys through a chain of people they trust (directly or indirectly). Unfortunately, my current public key is completely unadorned by useful third-party signatures. If you think you can help change that (i.e., you already know me, already have an OpenPGP keypair, and would be willing to talk about signing my public key), please let me know.