Thursday, April 18, 2013 – 9:00 am

Thatcher funeral – why Twitter sentiment analysis is nonsense


UPDATE: See the detailed methodological comment from Francesco at Pulsar TRAC for evidence that the Thatcher Funeral Twitter sentiment analysis referred to here really is super accurate.


We still haven’t quite managed collectively to leave Margaret Thatcher’s death behind (don’t worry – I’m sure it will pass eventually). At the moment we’re in the weird meta analysis-of-the-analysis stage that will hopefully soon disappear up itself.

But of course I’m going to add to it while it’s still here – I wouldn’t want to disappoint you.

So – Twitter sentiment. Today’s post on Campaign’s Wall Blog tells us that “social media remained negative during Thatcher’s funeral” and comes complete with some Twitter sentiment analysis infographics from digital meejah agency Face.

Twitter sentiment analysis is the kind of data analysis that journalism loves. It looks terribly serious and authoritative, but is actually just a record of random wittering by anyone at all. And because it’s all on Twitter you don’t even have to go out and vox pop strangers in the high street.

But here’s the problem. How do you actually tell what the sentiment of the tweets is?

The Thatcher funeral analysis is all a PR effort by Face to push its shiny new Pulsar TRAC social meejah analytics tool (oh, all right – I wasn’t going to put in the link, as I know it’s what they’re angling for, but here it is to save you the search).

It seemed to find a lot of negative sentiment – even, if the graphic above is to be believed, among Sun readers. I find this dubious.

As an experiment, to see how well such things work, I used a low-rent alternative – the free Tweetfeel sentiment analysis site – to analyse the sentiment of Tweets matching the search term “Thatcher funeral”. How did that go?

[Screenshot: Tweetfeel results for “Thatcher funeral”]

Like this: 30 positive and 34 negative, making up (I’m guessing – the site is a bit vague about this) 53% of all Tweets meeting the search criteria.

But hang on – what are those Tweets actually saying?

Here’s one that gets a “negative” from Tweetfeel:

[Screenshot: Tweet marked “negative” by Tweetfeel]

“Disappointing: Obama will not send a representative to thatcher funeral”

Uh – that sounds like a negative to lefty Obama, not the Great Leaderene. Sounds like that should have been a big thumbs up for the funeral.

Or this one that gets a “positive”:

[Screenshot: Tweet marked “positive” by Tweetfeel]

“@frankieboyle commentary on #thatcher funeral is funny as fuck!”

You know, I’m imagining that’s not going to be as respectful as Tweetfeel seems to indicate.
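This failure mode is easy to reproduce. Here’s a deliberately naive keyword-based classifier – a hypothetical sketch of my own, not Tweetfeel’s actual algorithm – that makes exactly the same calls on those two Tweets:

```python
import re

# Hypothetical word lists for illustration only -- not Tweetfeel's real lexicon.
POSITIVE_WORDS = {"funny", "great", "love", "brilliant"}
NEGATIVE_WORDS = {"disappointing", "sad", "hate", "awful"}

def naive_sentiment(tweet):
    """Score a tweet by counting positive vs. negative keywords."""
    words = set(re.findall(r"[a-z]+", tweet.lower()))
    score = len(words & POSITIVE_WORDS) - len(words & NEGATIVE_WORDS)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

# "Disappointing" scores as negative, even though the tweeter is annoyed at
# Obama and presumably pro-funeral:
print(naive_sentiment("Disappointing: Obama will not send a "
                      "representative to thatcher funeral"))  # negative

# "funny" scores as positive, even though Frankie Boyle mockery is hardly
# a mark of respect for the occasion:
print(naive_sentiment("@frankieboyle commentary on #thatcher "
                      "funeral is funny as fuck!"))  # positive
```

The keywords register, the context doesn’t – which is exactly the problem: the word that carries the “sentiment” tells you nothing about what the sentiment is *about*.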


In all, out of the 64-Tweet sample, 9 of the 34 Tweets marked as negative by Tweetfeel were actually positive about the funeral, while 8 of the 30 marked as positive were really negative. (Here’s a link to a grab of the actual results so you can check yourself. If you have no life.)

I know what some of you may be thinking: “that kind of evens it out – the analysis is really accurate!” But data shouldn’t work that way. Two errors cancelling in the headline total doesn’t mean the classifier got either Tweet right. If more than a quarter of the individual Tweets were analysed incorrectly, that should put the kibosh on Twitter sentiment analysis being anything other than a waste of time.
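Spelling out the arithmetic (a quick sketch using the figures from my Tweetfeel sample above) shows why the aggregate looks fine while the per-Tweet classification is junk:

```python
# Figures from the Tweetfeel sample described above.
marked_negative, marked_positive = 34, 30
wrong_negatives = 9   # marked negative by Tweetfeel, actually pro-funeral
wrong_positives = 8   # marked positive by Tweetfeel, actually negative

total = marked_negative + marked_positive    # 64 Tweets classified
errors = wrong_negatives + wrong_positives   # 17 misclassified

# Net "negative sentiment" as reported vs. after correcting the errors:
net_reported = marked_negative - marked_positive                       # 4
net_corrected = (marked_negative - wrong_negatives + wrong_positives) \
              - (marked_positive - wrong_positives + wrong_negatives)  # 2

print(f"misclassified: {errors}/{total} = {errors / total:.0%}")
print(f"net negative: reported {net_reported}, corrected {net_corrected}")
```

The headline “net negative” barely moves (4 versus 2), which is why the errors appear to “even out” – but 17 of 64 Tweets, over a quarter, were simply wrong.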

Now, as you will have spotted, I wasn’t using the shiny Pulsar TRAC tool – maybe it has fantastic algorithms to interpret what those 140 characters really mean, so it’s as accurate as all get out. But maybe it isn’t. Bear that in mind when you see journalists going squee over the next Twitter sentiment infographic…
