Feature: Fake websites and the new media
A spoof website with the Baltimore mayor’s angry riposte to shadow home secretary Chris Grayling saw several news sites run with the story. Web 2.0 is challenging journalists like never before.
By Ian Dunt
Earlier in the week, shadow home secretary Chris Grayling compared parts of the UK to hit HBO show The Wire. So far, so laughable. And then journalists found a way to keep the story going. The mayor of Baltimore had put out a statement on her website lambasting him for linking the city and the show.
“To present a television show as the real Baltimore is to perpetuate a fiction that dishonours our city,” she wrote. “It is as pointless as boasting that Baltimore has a per capita homicide rate a fraction of that in the popular UK television show Midsomer Murders.”
Except, of course, she didn’t. The site was a fake, albeit a very good one, by naughty political blogger Recess Monkey, who is holidaying in Baltimore. Cue red faces all round. “The mayor of Baltimore did not make the statements attributed to her in the story below – we were caught out by a hoax,” the Guardian wrote in a disclaimer to the story on the web. The Independent and the city’s own Baltimore Sun had made the same mistake. And so too, for that matter, had politics.co.uk.
So what does the Baltimore hoax say about British journalism, and the effect of the internet on news reporting?
One problem lies in the ease with which people can set up very professional-looking websites. Simon Ruda, of Fired and Inspired, a website design company, says such sites can be put together very quickly, especially when, as in this case, most of the look can simply be lifted by copying the original site.
Once upon a time, it was corporations that were the target. “It has been around since the start,” he says. “Initially it was a trademark issue. Someone bought cocacola.co.uk before they did and tried to sell it to them. But Coca Cola argued the name was a trademark so you’ve got to give it to them.
“Someone who sets up a website against a corporation would get heavily sued so they are usually cautious. But the internet isn’t policed in any real way. Anyone can do anything.”
The debate over the trustworthiness of internet sources is becoming increasingly pronounced. Moments after Michael Jackson died, a tweet, apparently from foreign secretary David Miliband, went out expressing sadness and stating: “Michael Jackson RIP”. It was carried by a host of media outlets, including almost every broadsheet in London. Political sites like politics.co.uk loved it, because it offered a political angle on a story which was dominating the news agenda. Broadsheets loved it because it lent what was still ultimately an entertainment story some weight.
Unfortunately, it was fake. The Foreign Office issued a press release reminding journalists of the fact that Miliband doesn’t have a Twitter account, and we were only saved by the prime minister’s statements, later that morning, expressing sadness at the death. Editorial teams across the country started having a debate which will become increasingly common. Do you delete the story, put up a disclaimer admitting your fault, or change the angle of the story to highlight how many of us were fooled? politics.co.uk, for its part, went with the last of these options.
Similar problems were afoot just this week, when the Met established a dedicated Twitter feed to keep the media up to date with its tactics on the first day of Climate Camp, in London. Before midday another account – cO11MetPolice, with a lower-case “o” in place of the capital, rather than CO11MetPolice – had been set up, spilling out fake information. None of it was funny or loopy enough to be obviously written off.
Wikipedia has always been at the forefront of these issues, because of its philosophy of user-generated content. The site is therefore a good place to look for an indication of the way the wind is blowing. This week, the site finally ended the ability of users to edit items themselves, leaving a dedicated team of experts to approve changes. “We are no longer at the point that it is acceptable to throw things at the wall and see what sticks,” said Michael Snow, chairman of the Wikimedia board. The move was prompted, in part, by the frenzied and irresponsible behaviour of political activists, who amended the entries of candidates or leaders either to mock them (Tony Blair’s middle name was changed to “Whoop-de-do”) or trash their reputation.
The debate over the trustworthiness of the internet is relatively new, but the one over the trustworthiness of journalists is much older. It has been given added impetus, however, by the recession, and the flagging income among media outlets. With fewer sales comes less money, and with less money come mass redundancies. Those journalists still left with jobs are being forced to write more and more copy, but in just the same amount of time at work. One of the first victims of this phenomenon – ably mapped out by Nick Davies in his excellent book Flat Earth News – is the process by which journalists verify the source of a story. This is doubly true when the source is an internet item itself – such as a website or a tweet.
“We should treat new media with some scepticism in the same way we would a conversation in a pub,” says Stephen Ward, director of NoSweat journalism training school in London. “The stringent rules we adhere to are getting blurred.
“Professional journalism is trying to capture the internet market. It’s trying to have a presence on the internet market and so the two get blurred because they’re on the same platform.”
The internet’s effect on journalism has been immense, threatening the very existence of newspapers and forcing new, previously unimagined financial models on a sector which is struggling to stay afloat at all. And now it is entering a second phase, where the prevalence of user-generated content, such as Twitter and blogs, threatens to make the veracity of a source even harder to establish. It looks as if we’re experiencing the birth-pangs. Things will probably get worse before they get better.