The BBC, it seems, is upsetting tech lovers left, right and centre at the moment. First there was the debacle that enraged both us and the nerdy public at large, and now there's the decision to delete half the content from the BBC's Web servers without first saving an online backup.
The latter of those two problems has been comprehensively solved by an anonymous geek, who set a program to crawl the 172 sites marked for deletion and downloaded them all to a cheap webhost rented for $3.99 (£2.50). He or she then made a torrent of the files and is hosting it for everyone to share, assuring the content's long-term survival.
So why doesn't the BBC just make a backup itself? Well, we aren't entirely convinced that it hasn't. This particular Craver has extensive experience in the corporation and doubts that any of these sites would truly be lost forever. Some of them will sit in content-management systems, and almost everything will have a development server -- usually two -- that would host a non-public mirror for staff to check changes before pushing any files live.
The content may not be lost, but it also may not be saved in a single location. Unless, of course, the BBC has a Web server backup strategy that would keep a copy of the files. Certainly, the BBC homepage has a long-running archive that staff can use to see old iterations of the Beeb's main landing page.
But whatever is really going on here, it sounds to us like the BBC is trying to make a political point. It's certainly not the cost of the backup process that's the problem. Instead, it seems more likely that Auntie wants everyone to know that cutting her budget will hurt us all.