Why emailing passwords is a bad idea.

Updated: Sat Feb 7 20:12:36 2015

You may be reading this because somebody has got in touch and complained about you sending them their password as part of your signup process or 'forgotten password' function. I hope I can explain why this is not a good idea.

Times Change

Like everything else to do with computers, the world of internet security is constantly evolving. Ideas that seemed great just a few years ago turn out not to be such good ideas, and unfortunately this is one of them.

Best Practice

There are a number of reasons why emailing passwords to people is now considered unwise.

  1. Email was not originally designed to be secure. Emails are often delivered in clear text over the internet and stored as easily readable files on servers. Messages can also bounce to unexpected places when something goes wrong and people sometimes share email accounts. If you email a password, consider that an unauthorised person might gain access to your service. That is probably not something that you want, especially if that can result in reputation damage or credit card chargebacks.

  2. Humans aren't very good at remembering good passwords, so people often use the same password for many services. This means that if the password is revealed other services may be accessed. If this includes internet banking, social media or email account information, the consequences could be significant. When you accept a password and other personal information from a customer, you are taking responsibility for keeping that information safe.

  3. If you are able to email a password, it likely means that you are storing the password in plain text. This means that if your site is compromised attackers can potentially make off with the email addresses and passwords belonging to your users. This annoys customers and leads to bad publicity, which will be something you want to avoid. If your site is storing passwords insecurely, there is an increased likelihood that it has other security issues. You may believe that your site is secure, but with successful attacks against massive names like Adobe, Snapchat and Yahoo leaking customer passwords, it is best not to take the risk.

What should I do?

As of early 2015, you should consider the following at minimum:

  1. Don't store passwords in plain text. You (or the product you choose) should use a one-way hash with key strengthening, such as bcrypt or PBKDF2. This may sound complex, but it is a way of turning a password into a form where you can verify that the right password has been used, but you can't tell what the original password is. A bonus side effect of these 'hash functions' is that they permit passwords of any length.

  2. You should also not encrypt the password in a manner that means it can be decrypted later on, as this is likely to be inadequate - think of it like using a padlock, but keeping the key next to it.

  3. Don't email a copy of the password when somebody signs up.

  4. The best way to handle forgotten passwords is to send the customer a link that will allow them to set a new password. It should be valid only for a short period (say 24 hours), and must stop working after the password has been changed.

  5. Passwords should always be transmitted securely - this means your site uses HTTPS and the little padlock appears in the browser.
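To make points 1 and 4 concrete, here is a minimal Python sketch. It is illustrative only - the function names are my own, and for a real site you would lean on a maintained library (such as bcrypt) or your framework's built-in support:

```python
import hashlib
import hmac
import os
import secrets
import time

# Point 1: a one-way hash with key strengthening (PBKDF2 via the stdlib).
def hash_password(password, iterations=200_000):
    salt = os.urandom(16)  # a random salt means identical passwords hash differently
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, iterations, digest

def verify_password(password, salt, iterations, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

# Point 4: a single-use, time-limited reset token to send in a link.
def issue_reset_token(ttl_seconds=24 * 3600):
    token = secrets.token_urlsafe(32)    # unguessable random token
    expires = time.time() + ttl_seconds  # valid for, say, 24 hours
    return token, expires

def token_usable(presented, stored, expires):
    return time.time() < expires and hmac.compare_digest(presented, stored)
```

You can verify that the right password was used, but there is no way back from the stored digest to the original password - which is exactly the property you want.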

I don't understand this. I just sell things through my website.

If your site does any of the following, then it is likely that it has aspects that are not adequately secure:

Hopefully you can ask the people responsible for your site, or the vendor of the product that you use, to help you out. Alternatively, please consider engaging the services of somebody who does understand the detail on this page.

Two-factor time based (TOTP) SSH authentication with pam_oath and Google Authenticator

Updated: Thu Sep 19 21:34:14 2013

Two-factor authentication (2FA) is becoming an increasingly useful way of providing an extra layer of security to services above and beyond passwords.

OATH is an open mechanism for generating either event-based or time-based One Time Passwords, and there are a number of hardware tokens and software implementations available, which makes it ideal for a small-scale deployment without requiring lots of infrastructure or expense.

While setting up a simple trial to add 2FA to a remote access server, using Google Authenticator as a software token, I thought it would be useful to document the bits that I glued together.

These instructions are for RHEL/CentOS 6, and you'll need the EPEL repo for the oath packages (or install the packages and their dependencies directly). If you're not using RHEL/CentOS, pam_oath (and its documentation) is available directly, or it might be provided by your OS distribution.

You should leave a session logged in as root while you test this, in case you break anything and need to undo it.

Install the relevant packages, and symlink the pam_oath module into the right place:

# yum install pam_oath oathtool
# ln -s /usr/lib64/security/pam_oath.so /lib64/security/pam_oath.so

Enable ChallengeResponse auth in /etc/ssh/sshd_config:

ChallengeResponseAuthentication yes
PasswordAuthentication no
UsePAM yes

and restart sshd:

# service sshd restart

If you're using a software token, you'll want to generate a random seed. A good way to generate a random string of an appropriate size and format:

# head -10 /dev/urandom | md5sum | cut -b 1-30

Set up your oath seed in /etc/users.oath:

HOTP/T30/6  yourusername    -   15ad027b56c81672214f4659ffb432

You can add as many users as you need, one line at a time. You should also secure that file appropriately, as these strings are effectively a password:

# chmod 600 /etc/users.oath
# chown root /etc/users.oath

You can generate an OTP using oathtool. Run this with the -v option and your chosen key. The Base32 version of the secret is the one that you will need for the Google Authenticator smartphone app. You can type this in, or generate a QR code later...

#  oathtool --totp -v 15ad027b56c81672214f4659ffb432
Hex secret: 15ad027b56c81672214f4659ffb432
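If you're curious what oathtool is doing under the hood, HOTP/T30/6 means HMAC-SHA1 over a 30-second counter, truncated to 6 digits (RFC 4226/6238). A Python sketch - the function names are mine, and it assumes a hex seed like the one above:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(hex_secret, now=None, step=30, digits=6):
    key = bytes.fromhex(hex_secret)
    counter = int((time.time() if now is None else now) // step)  # T30: 30s steps
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def base32_secret(hex_secret):
    # The Base32 form of the same seed is what Google Authenticator expects
    return base64.b32encode(bytes.fromhex(hex_secret)).decode()
```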

Since you probably don't want OTP enabled all the time for all users, create /etc/security/access-local.conf - you can set differing options depending on your requirements.

This configuration would allow access without requiring an OTP from a trusted network (substitute your own network range for the example here):

+ : ALL : 192.168.0.0/24
- : ALL : ALL

This configuration only requires an OTP for members of the 'otpusers' unix group. This might be useful for selectively enabling 2FA on user accounts as part of a gradual rollout, or you might decide to only require 2FA for users who have permission to su to root.

- : (otpusers) : ALL
+ : ALL : ALL

You can be quite creative with these rules - they follow the standard pam_access syntax, so check the documentation for that.

Finally, I added the following lines to /etc/pam.d/system-auth-ac and /etc/pam.d/password-auth-ac (This is a RHEL/CentOS-ism) - where you put them will depend on your pam configuration and OS. The pam_access entry is optional, but it does make the above choices possible.

auth [success=1 default=ignore] pam_access.so accessfile=/etc/security/access-local.conf
auth required pam_oath.so usersfile=/etc/users.oath window=30

Now you can ssh into your server (don't close the root session you currently have open in case you've broken something!). You can generate your OTP using oathtool:

# oathtool --totp 15ad027b56c81672214f4659ffb432

Log in quickly (before that token expires), and you should find it lets you in:

username@host:~$ ssh securehost
One-time password (OATH) for `username':
Last login: Wed Jul 10 22:38:53 2013 from somehost.example.com

To set up the Google Authenticator smartphone app, you can take your Base32 formatted secret, and either enter it manually or generate a QR code. To make a QR code, you need a URL formatted string, as below. The example of 'username@securehost' is a simple description, so it can be anything you like.

otpauth://totp/username@securehost?secret=YOUR_BASE32_SECRET

Feed this into a QR code generator that you trust (remember, this is effectively a password), and scan the code using the app.

With the secret saved into your smartphone app you should now be able to log in using the codes that it generates.

Extra things

The /etc/users.oath file gets updated every time you log in, which can make this a challenge to manage centrally across multiple hosts. It is possible to update this with a custom augeas lens if you're using puppet. I've also got an ANSI escape commandline QR code/seed generator. These are a bit of a bodge, but do seem to work. If there's demand I'll see about sticking a copy of them and the relevant puppet manifest up somewhere.

I've also got a script to decrypt Gemalto PSKC v1 files for the IDProve 100 / Easy OTP v3 tokens.

Update, 2013-09-19

Fixed typo in symlink, thanks to Andreas Ott for spotting it.

Picasa.ini files not properly updated

Updated: Thu May 17 00:00:00 2012

Like many people I've been using the wonderful (and free) Picasa to manage my photos. One of the huge benefits of Picasa aside from its fast and friendly user interface is that it doesn't write changes to your images. Instead it stores a record of changes that are made to each original image in a Picasa.ini file in each directory. This means you can make changes to your images in Picasa, such as adjusting the contrast and brightness or cropping (or marking with a star), and you don't need to worry about it overwriting your original images.

In order to keep performance reasonable it stores a cache of these adjusted thumbnails and the changes in your Local Settings directory too.

I discovered a problem with Picasa's method of updating these Picasa.ini files though - if you make changes to a file and then move it to a different folder within Picasa, it doesn't update the original or the new .ini file. This means there's a record of the changes still in the old directory, but not the new one. It doesn't seem to matter at first, because Picasa tracks this information in the Local Settings database.

The problem comes if you lose the database, or (as I did) intentionally delete it. Picasa will happily trawl back through your pictures directory and rebuild most of the information from these .ini files - unless you've moved the images to a different folder after editing, in which case your changes will be lost.

The frustrating thing is that this information is still available; Picasa just doesn't know where it is.

To fix this with my photos, I wrote a short perl script to trawl all the Picasa.ini files to pull data out of them, then write them to folders where this information is missing. There are a couple of caveats with this though:

  1. If your camera doesn't keep track of file names (they return to 0001.JPG after emptying your memory card) this almost certainly won't work properly.
  2. If you've made changes to the image in the new folder too, they won't be updated or merged. It will warn you if there's filename duplication though.
  3. I ran this on a Linux computer. It should work on Windows too, using something like ActivePerl, but I've not tested it.

If you do find it useful, please do let me know, and remember, back up your files before using this script. It works for me, but I make no guarantees that it won't mess up your images.

You can get the script here

To run it simply give it the full path to the directory that contains your photos. e.g.:

./picasa.pl /home/bcc/photos

After running it, you'll need to clear out Picasa's database for it to pick up the changes. Hold down ctrl-alt-shift as Picasa starts and it will ask if you want to do this. You will lose any labels you've applied to images, but if you need this script then that's probably already happened...
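For anyone curious about the approach (my script is in Perl, but the idea translates): walk the photo tree, collect the per-image sections from every Picasa.ini, then write any missing sections into the .ini of the folder that now holds the image. A hypothetical Python sketch, with all names my own:

```python
import configparser
import os

def collect_edits(photo_root):
    # Map image filename -> (source dir, recorded edits) from every Picasa.ini
    edits = {}
    for dirpath, _, files in os.walk(photo_root):
        if "Picasa.ini" not in files:
            continue
        ini = configparser.ConfigParser()
        ini.read(os.path.join(dirpath, "Picasa.ini"))
        for section in ini.sections():
            if section in edits:
                print(f"warning: duplicate filename {section}")
            edits[section] = (dirpath, dict(ini[section]))
    return edits

def restore_missing(photo_root):
    # Write each recorded section into the Picasa.ini of the folder that now
    # contains the image, if that folder's .ini doesn't already mention it
    edits = collect_edits(photo_root)
    for filename, (src_dir, values) in edits.items():
        for dirpath, _, files in os.walk(photo_root):
            if filename in files and dirpath != src_dir:
                target = os.path.join(dirpath, "Picasa.ini")
                ini = configparser.ConfigParser()
                ini.read(target)  # fine if the file doesn't exist yet
                if not ini.has_section(filename):
                    ini[filename] = values
                    with open(target, "w") as fh:
                        ini.write(fh)
```

As with the script itself: back up first, and expect it to misbehave if your filenames aren't unique.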

Dev8D 2010

Updated: Mon Mar 8 21:31:32 2010

The Event

I hadn't expected to get to go to Dev8D 2010. After the success of our entry in 2009, it was agreed that other people in the department should get the opportunity to go. It came as a pleasant surprise to be invited to join the DevCSI Developer Focus group - intended to help foster a development community based around UK HE, and carrying on the work started at Dev8D 2009. Among other responsibilities, this meant helping with some of the preparation and running of the dev8D 2010 event.

I arrived earlyish, and set up in Base Camp where I started putting together a handful of slides for my lightning talk on list8D. Matt Spence and I had prepared a demo the day before, but I wanted to give a bit of a talk about how the dev8D prototype from last year had turned into a proper funded project, how our management had supported the development, and how agile development had helped us maintain realistic expectations. Most excitingly this would include the first demo of the shiny new theme thanks to some amazing last minute work by Matt.

I had also been roped into taking photos for the event and as more people started to turn up I wandered around getting some pictures.

Lunch was excellent, and with the Linked Data event running at the same time on the first day there were around 500 people in the ULU building.

In the afternoon I had an interesting conversation about cloud computing with a couple of other folks. Consensus seemed to be that it's a useful tool where appropriate, but not always the right answer. Services such as content delivery and compute-on-demand are definitely of value, but it's not mature enough for core service provision yet. Feels a bit like virtualisation did 5 years ago - useful but not quite there yet.

I wandered through to the expert zone to prepare for my talk on list8D which for the most part went well. Minor networking issues meant I couldn't completely demo the addition of new items, but it was nice to show off the new theme and the brilliant work put in by Matt and Simon in getting list8D ready for real use.

I also watched the excellent lightning talks by Joss Winn on Wordpress, the Eprints guys talking about their challenge, and a demo of OpenGL development on android.

Wednesday evening was set aside as 'Games Night'. In addition to a collection of the usual and not-so-usual board games, we played Developer Bingo where you had to find other developers who could sign off a specific item on your sheet. These were things like "has been slashdotted", "coded in fortran" or "is a GNU maintainer" -- based off the signup details and a number of 'likely' other suggestions. This was a brilliant icebreaker, and the prizes of lego boardgames were similarly well received with people playing with their prizes with people they'd only met that evening. Once again, the food was excellent, although the ULU bar could have done with some proper beer.

On Thursday morning (having stayed up finishing my slides later than I probably should have) I gave another lightning talk on Web Security which seemed to go down well. It's a lot of material to cover in 15 minutes and not really in any depth, but the major aim was to give people enough information to go and do some further research themselves. Judging from a couple of conversations I had later on in the day, it seems that at least a couple of dev8Ders will do just that so I consider that a success.

I also watched a brilliant lightning talk by Stephen Johnston on using the Microsoft Azure cloud computing platform to calculate satellite collision probabilities. Very cool stuff, and well suited to the 'compute power on demand' model.

After this I went off to see the RepRap 3D printer which had been set up and was busy printing a coathook. The buzz around this device was amazing - nobody could quite believe this thing was printing physical objects. Adrian Bowyer gave a great talk back in the expert zone on how RepRap came to be, why it was open source and how he hoped it would revolutionise the ability to make things. What's really impressive is that the RepRap device can print about 50% of its own parts, and they're constantly working to improve that percentage. They also encourage the improvement of the design of individual bits and the contribution of those changes back to the central project.

I really can't describe how cool RepRap is, and how much excitement there was at the event - you really got the feeling that RepRap is a game changer in the same way that the internet allowed anyone to publish - this gives people the ability to manufacture. Best of all, it only costs about £300 to build one from scratch, which puts it well within the reach of individuals and communities.

Thursday afternoon meant the Cloud Computing workshop which had Dave Tarrant covering Amazon EC2 and myself talking about Linode. The workshop room was pretty much full for this which only added to the pressure. Dave did a brilliant job going through the basics of EC2 and most people in the room had a working EC2 instance running Apache and MySQL. The Linode demos went pretty well, and I was happy to show off the recovery console and the new StackScripts, and a number of attendees signed up for some of the free instances that Linode had generously provided for the event.

In the evening (entertain yourself evening), despite the horrible rain a few of us went to Ciao Bella for some tasty italian food, then on to the Jeremy Bentham pub for the Shambrarian meetup which was excellent. Good to find another pub in London that has decent beer on tap and a good whiskey selection.

Friday was finally a day where I could relax a bit, so I spent one session in the genetic algorithms workshop by Richard Jones. This is a novel approach to using multiple generations of virtual creatures to solve problems that are non-trivial to work out through conventional means. Using a set of simple rules and a fitness function, you test each set of 'DNA' against the fitness function, pick the best ones, breed them, then run them again. Over a number of generations, you should end up with a pool of creatures that get better and better at solving the problem.
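The loop described in the workshop can be sketched in a few lines of Python - a toy version with names of my own, where the 'creatures' are just bit strings:

```python
import random

def evolve(fitness, length=16, pop_size=30, generations=60):
    # Random initial population of bit-string 'DNA'
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]  # pick the best ones
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, length)
            child = a[:cut] + b[cut:]   # breed via single-point crossover
            if random.random() < 0.1:   # occasional mutation
                child[random.randrange(length)] ^= 1
            children.append(child)
        pop = parents + children        # then run them again
    return max(pop, key=fitness)
```

With fitness=sum (the 'all ones' toy problem) this converges to a string of mostly ones within a few dozen generations.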

A great visual example of this is the evolution of Mona Lisa demo.

This was a great introduction to an area I knew nothing about, and although I missed part of the workshop due to helping sort out the nominations for the awards dinner, I really enjoyed getting the chance to play with this alternative approach to solving complicated problems.

I also spent a bit of time on Friday putting together a simple list8D API to LTI bridge, for our entry for the LTI challenge which Steve had noticed would be a perfect fit.

Friday evening was the awards dinner which was a lot of fun - we got to give away some cool awards (best newcomer, best leap-of-faith and best t-shirt were my favourites) and the meal was brilliant. I was taking photos of the presentation of the certificates and while there was a convenient balcony, my flash wasn't really strong enough to reach comfortably which was a shame, since the photos taken from the side of the stage weren't as good as I'd hoped.

On Saturday morning (feeling rather blurry from the very late night) I gave my web security lightning talk again, as it had been asked for. Again, a good number of questions and another chat with someone after the talk suggests it was worthwhile.

I spent the rest of Saturday helping to judge some of the entries for the challenges. I was amazed at the number and quality of the submissions. Clearly a lot of work had gone into many of them, even only over a few days.

Finally with the close of Dev8D came the awarding of the bounty/challenge prizes (again, as photographer-monkey, but the light was rather better this time), then heading home, exhausted.

The Good

The Bad

The Shiny


Well done, if you've read this far. Here's some stuff that may be of interest:

You may also be interested in joining the DevCSI Developer Contact group.


Thanks to Mahendra, David F, the UKOLN events team, and anyone else involved in running Dev8D. It was an amazing event and I had a brilliant time.

Fake Drugs being sold from .ac.uk sites

Updated: Mon Mar 8 14:47:39 2010

At the end of last week, BBC News reported that a number of .ac.uk sites are being used to sell counterfeit drugs. I wish I could say this surprised me, but knowing how complicated the issues are in sorting out web security at the university where I work, I can't say it's come as a massive shock.

At a university it is often the case that a department is responsible for its own web presence - usually looked after by someone for whom it is not a priority, and who may know nothing about the technical issues involved. Sometimes a department will have had a third party company supply a site or content management system without realising it needs to be kept up to date. Even where there is a good level of centralised support for web publishing, some departments may do their own thing for historical reasons.

We've been fairly proactive at working with departments and getting our own house in order, but it's certainly been a challenge to have security taken seriously across the institution. While incidents like this are unfortunate, they do have the positive side-effect of raising the profile of these issues, and longer term this can only be a good thing.

Finally I'll share a tip for anyone working in academia. Set up some google site alerts for the following:

These will alert you to any new pages that appear on your site with those terms. It's not perfect, but it will alert you to some compromised pages, or even comment spam on wiki pages/blog posts that should be dealt with.

Driving 8x8 LED Displays with an Arduino

Updated: Sun Feb 7 22:13:00 2010

After playing around a bit, I moved on to connecting the 8x8 displays. I spent a bit of time thinking about how best to do it, and had come to the conclusion that using a 595 shift register to drive the anodes of each display was the way forward. I'd ordered some ULN2803A darlington transistor arrays, which can sink up to 500mA of current. This is more than I was planning to draw through the 24 LEDs that make up a row, so the plan was to connect the cathodes of all the LED matrices to this chip. Again, a 595 shift register controls the 2803, so it means I can directly address each row and column in the same way.

Once I'd got one 8x8 display working, it seemed sensible to check it all worked properly before building the rest.

It did, so I moved the current limiting resistors over to the 'y-axis' board, and started building the other 2 display boards.

Each board joins directly onto the next, so the wiring isn't ridiculous on any of them, but there's still an awful lot of extremely fiddly connections to make, and it's used up pretty much all of my 150 bits of wire. I wouldn't plan on doing this again in a hurry...

Once I'd connected the other 2 displays, I changed the display code to push out 2 additional sets of bytes on the X-axis with some slightly different display patterns and we were in business:

Here you can see (from the top) one of the 595 shift registers, the 2803 darlington array and 8 current limiting resistors. These collectively make up the Y-axis board, which controls the cathodes of all the displays. Each of these is operated in turn very quickly, lighting up an entire row. These are scanned quickly enough that the image on the display seems to be complete, thanks to the persistence-of-vision effect.

I had considered driving this 595 off separate pins, but decided not to. This is the first one that is connected, so it keeps the last byte of 4 that is sent out. This has the advantage that the latch of all 4 shift registers is operated at the same time, ensuring there's no lag between changing the column and row data. This would probably not be noticeable, but it would annoy me knowing there was a slight lag :)

Putting all this work together, I still had to make the software driving the display useful, rather than just pushing out hard-coded bitmaps.

I wrote some code to turn a 2d boolean array into a series of bytes for direct output. There's an intermediate stage which updates the cached bytes from the boolean array for performance, so the continuous display scanning/multiplexing isn't slowed down by excessive data shuffling. A quick demo later, and we have something that actually shows off the displays as one single screen:
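The packing step looks something like this - a language-agnostic Python sketch of the idea (the real code runs on the Arduino; the names are mine, and MSB-first is an assumption about the shift-out order):

```python
def pack_rows(grid):
    # grid: rows of booleans (24 columns here) -> 3 bytes per row, one per 595
    packed = []
    for row in grid:
        row_bytes = []
        for start in range(0, len(row), 8):
            value = 0
            for bit, lit in enumerate(row[start:start + 8]):
                if lit:
                    value |= 1 << (7 - bit)  # MSB first
            row_bytes.append(value)
        packed.append(row_bytes)
    return packed
```

Caching these bytes means the scanning loop just shifts them straight out instead of re-deriving them from the boolean array on every pass.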

Finally, I need to run some of the LEDs at a different brightness level. I modified my code to maintain 2 arrays and 2 byte caches. One contains the 'bright' LEDs, and one the 'dim', and these are lit alternately for different periods to create this effect. Again, the refresh rate needs to be fast enough that it's not obvious to the human eye, and that's where I started to run into problems. Switching between the 2 display layers for the 24x8 display, multiplexing the rows, and varying the duty cycle of different LEDs seemed to be getting too much. I couldn't do all that fast enough to keep the refresh rate sufficiently high - the dim LEDs were showing horrible signs of flickering.
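The two-layer trick amounts to giving the dim layer fewer time slots. A hypothetical sketch of the per-slot combination (names mine, not the actual Arduino code):

```python
def row_bytes_for_slot(bright, dim, slot, dim_slots=1, total_slots=4):
    # Bright LEDs are lit in every slot; dim LEDs only in some of them, so
    # over a full cycle the dim layer gets a shorter duty cycle and appears dimmer
    show_dim = slot % total_slots < dim_slots
    return [b | d if show_dim else b for b, d in zip(bright, dim)]
```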

It turns out the shiftOut and digitalWrite functions provided by the Arduino software are pretty slow, and this becomes a problem when you're pushing a lot of data. My clever byte caching wasn't actually making a difference, since it seems the shiftOut function turns that back into individual bits for output, which I could have done myself without the intermediate layer.

Fortunately it seems I'm not the only person who's had this problem, and thanks to the extremely clever MartinFick on this forum post, I replaced the shiftOut and digitalWrite calls with shiftRaw and fastWrite. The difference in performance is staggering - I have much more control over the duty cycle again, and there's no sign of flicker.

I think it's fair to say this has been a successful weekend. I've got a reasonably sane bit of code for driving the display with both dim and bright LEDs 'simultaneously', and it's run off a data structure that should be dead easy to implement the game of life on top of. All I'm missing now is an RTC to keep time, and actually porting the code over...

More Simple Arduino Goodness

Updated: Sat Feb 6 22:13:00 2010

After a little more fiddling on Friday night, I had a bunch of LEDs connected to one of the 595 shift registers following one of the Oomlout example circuits.

directly driving LEDs

I added a second shift register chained off the first to run another 8 LEDs, again following an example circuit, but this time from the Earthshine Arduino Guide.

Driving with a shift register

This naturally meant cool lighting effects.

Then I had a go at driving the LEDs at different duty cycles to vary their brightness. This is something I'll need to do with the 8x8 displays, so it seemed like a sensible plan to have a go with a simple circuit. It turns out it's not that hard to do:

Duty cycle demo

Arduino Goodness

Updated: Thu Feb 4 22:13:00 2010

So, my plans to build the Game of Life Clock took a step closer to reality today with the arrival of my order of stuff from Oomlout following the recommendation of a couple of people. Everything turned up within 24 hours of placing the order. Very impressed.

Arduino bits

In addition to the 8x8 LED matrices I needed, I bought a new Arduino Duemilanove, since my old NG only has an ATmega8, with 8k of flash. This has been fine for tinkering, but was looking a bit tight for running the game, RTC and matrix driver chips. The Duemilanove has 32k of flash, which is tonnes more than I need.

Old and New

It also gave me a chance to order the ARDX starter kit, which in addition to the Duemilanove has a bunch of extra stuff to play with. Given the Arduino-heavy nature of some of the dev8D workshops this year, it seemed like it'd be worth having some extra bits to play with.

ARDX kit

New arduino and breadboard

I've not really done much this evening other than have a play with the first starter kit circuit, and get the latest arduino software up and running. It is worth noting that the 10mm LED that ships as part of the starter kit is a bit "argh, my eyes".

One thing that did come as a bit of a surprise was not having to hit the reset button to upload a new sketch. That'll take some getting used to.


Next up is driving an 8x8 LED matrix off a pair of 595 shift registers. I'm still torn between using shift registers or the much more sophisticated MAX7219 LED driver. Both have their advantages and disadvantages, so I think the best bet is to have a play and see...

Debenhams Payment Form - Design Gone Wrong

Updated: Wed Jul 1 23:10:26 2009

I just bought a gift for a friend who is getting married shortly, and navigated my way through the Debenhams wedding site, which was fine. I finally got to the payment page though, and felt compelled to rant about it. This is basically the email I sent them...

There are so many issues with this form that I'm not sure where to start, so I'll begin at the top.

The form

1) Horrible JPEG compression on card images and the text around them at the top. There's no ALT text for that image, so a screen reader for the blind wouldn't see that information.

2) "Notified terms and conditions apply" - What does this even mean? I haven't been notified of any T&C at this point. If this is supposed to count for the notification, where are the terms and conditions?

3) "Security Card Number" - What security card? Card Security Number might make more sense. If you're going to use a term that requires explanation, you may as well use one of the standard terms, such as Card Verification Value or Card Security Code. This field doesn't line up with the label text.

4) "Card holders name as it appears on the card:". This should be "Card holder's name". This text is too long, wraps on to the next line, and is redundant. Surely most people understand that the name on the card should go here? Failing that, simply "Name as it appears on the card:", or even "Name on card:" - shorter, avoids the grammar pitfall, and reduces repetition.

5) Make the 'if other' title text box bigger, at least to line up with the right edge of the other fields. If you've got to type 'Brigadier General' into the box, it'd be nice if you can see an entire word at a time.

6) The Card number, expiry date, security card number, and address boxes don't line up.

7) Card number (omit spaces): There are no words for how much I hate this behaviour in payment forms. It is such a simple thing to automatically remove spaces when the user clicks 'confirm'. Why not let the user enter the number as they feel comfortable and sort it out for them? If for some inexplicable reason this can't be fixed, at least the wording could be improved; "omit spaces" is a horrible phrase. "Without spaces" would be much more friendly.

9) The gift card image and balance check thing - why is that there? Was it positioned using some sort of 'pin the tail on the donkey' game? Putting it at the top of the page, before someone is expected to have made a decision about which card to use, would be much better.

10) I've already mentioned that the fields don't line up vertically, but have the 'expiry date' and 'switch issue' fields and labels been out for a heavy night on the beer, stumbled home and collapsed?

11) "The Debenhams Storecard does not require an expiry date. (Excluding Debenhams Mastercard)" -- Why say this? I assume the store card doesn't actually have an expiry date on it, so there's not one for people to enter? Even if they do manage to enter something, why not just ignore it if you don't need it?

12) Billing address - why have a big editable box with an explanation as to why you can't edit it? Why not put the 'find address' button (which clearly doesn't need all that explanatory text) where the textarea is, and make that not look like it's an editable field.

13) Why have a 'confirm' button at both the top and the bottom? I could understand having one after the 'select a card' bit, and another at the bottom of the 'new card' form, but the positioning at the top looks really odd. Why are the confirm buttons outside the frame around the form?

14) It is impossible to operate this form by keyboard only - you can't trigger the 'find address' or 'confirm order' buttons without using a mouse.
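The space-stripping fix from point 7 really is a one-liner. A minimal sketch of what I mean (the function name is mine, not anything from the actual Debenhams form), which also forgives the hyphens some people type:

```typescript
// Hypothetical helper: normalise whatever the user typed into the card
// number field before validating or submitting it.
function normaliseCardNumber(input: string): string {
  // Strip all whitespace and hyphens, leaving just the digits the
  // payment processor actually wants.
  return input.replace(/[\s-]/g, "");
}

console.log(normaliseCardNumber("4111 1111 1111 1111"));
```

Run this on the value when the user clicks 'confirm' and nobody ever needs to be told to "omit spaces" again.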

I work for a large organisation, so I know how things like this evolve over time with the input of various people. When things change gradually over time, the decline in customer experience is often overlooked until someone points it out.

In this case, I think this form provides a pretty bad user experience, and that will not encourage people to shop online with Debenhams.

London to Brighton Bike Ride 2009

Updated: Tue Jun 30 23:25:23 2009

On Sunday the 21st of June 2009 I took part in the annual London to Brighton Bike Ride for the British Heart Foundation.

We'd stayed the night with a family member who lives in Mitcham, around 4 miles from the start. I ate breakfast, showered, dressed and left at 6:45 for my 7:30 start, arriving with plenty of time to spare at 7:15. As the route passes the end of the road we were staying in, I followed the stream of cyclists in reverse but needed to hit the pavement in a few places to avoid the completely closed roads.

Unsurprisingly, there were lots of cyclists on Clapham Common, with the 7:00 starters still leaving at 7:30. I got through the start gate (and had my card stamped) at 7:45.

It took a full hour to cover the 4 miles back to Mitcham, riding past the road I'd started from, due to lots of stop/start for traffic and cyclist-related congestion. After a relatively uneventful ride, we eventually made it up the hill to Woodmansterne (where I got married), 12 miles into the actual ride, at 9:30, where I met my wife, brother in law and mother in law. I refilled my water and ate some food.

Getting back on the bike, there was a nice fast run down Rectory Lane before dismounting and walking up How Lane, again due to congestion. I stopped off at rest stop D, 20 miles in, for a bacon and sausage sandwich, and to rest my legs briefly.

I left stop D after a decent rest expecting to gently spin my way up Church Lane, only to discover massive congestion. Not even a walking pace - a few steps at a time, taking 30 minutes or so to cover maybe 1/4 of a mile. It turns out the delay was due to letting cars past so cyclists could cross the A25 road a few at a time. Eventually we got past and carried on through a reasonably flat section of the route. I was starting to run low on water about 27 miles in, so called in at stop F for some water. This turned out to be a mistake, as Burstow scouts were insisting on a minimum donation of 20p for a refill of tap water. It even tasted odd, the little gits.

I moved on quickly after getting the water, with the intention of my next stop being Turners Hill. It was a pretty good and mostly uneventful run up to Turners Hill, although I walked part of the hill as my legs were starting to tire. I stopped again briefly for water, which was being handed out by the extremely energetic kids from the local church, and decided to pass on this very busy rest stop.

It's a nice fast run down to Ardingly, where I made a proper stop. This turned out to be the right decision, as it's a nice location for a rest, with a good BBQ and decent cups of tea on offer from Ardingly Scouts. I was definitely starting to feel the tiredness at this point, so the rest was welcome.

I was expecting a nice easy run to the bottom of Ditchling Beacon at this point, but the route profile we were given lies a fair bit. Lindfield was pretty, but the extremely long hill up through Haywards Heath is unpleasant and extremely draining. I gently spun my way up this, trying not to wear my legs out.

I stopped at Wivelsfield for more water and a brief rest, then on to the bottom of Ditchling Beacon. As you approach the bottom of the Beacon there's a few miles that are surprisingly lumpy and gently uphill, which doesn't help. I stopped at the last stop before the Beacon for more water, a hotdog, a banana and a rest before tackling The Hill.

Ditchling Beacon, at 700 feet of climb in just over a mile, was exactly as hard as I'd heard. While I suspect I could (slowly) cycle my way up if I were fresh, after riding over 50 miles I had almost nothing left in my legs. I walked up slowly, just like most other people.

I eventually made it to the rest stop at the top, where I stopped for a quick cup of tea, a banana and to appreciate the amazing view before the run down the hill. Leaving the top of the Beacon there's a reasonably gentle downhill at first that gets steeper. Finally, you come round a corner for the big descent. I set a new personal speed record of 42.9 mph at this point, and that was going 'slowly' on the brakes due to the 'slow down' warning signs. I could have gone significantly faster given how quiet the road was at that point -- I almost wish I had.

The last 3-4 miles through Brighton are fairly frustrating, with a lot of stop/start for traffic, especially as you know that you're so close, but it is thankfully all flat. Finally I made it to Madeira Drive on the sea front, the finish line in sight. It was an amazing feeling crossing it after so much effort. I got my card stamped and collected my medal, grinning like an idiot.

Then I realised I had about 20 minutes to get to the coach back, and I had no idea where it was. I asked one of the marshals, who directed me back past the 2 piers and on to Hove sea front. The paperwork said it was about a mile away - in reality it was closer to 3, which I could have done without, especially as it involved navigating round hordes of pedestrians and tired cyclists. I made it with a few minutes to spare, loaded my bike onto the lorry, and collapsed exhausted on the coach, a total of 61 miles down. I'm a bit annoyed that I didn't have enough spare time to make it a metric century.

It was extremely hard (for me, anyway), but an amazing amount of fun. I'm also extremely pleased to say that so far I've raised over £700 for the BHF.

Me, after the ride


Contact: site@spod.cx