historical stuff by Gareth Millward




Bias is a source’s inclination, deliberate or not, towards a particular group or person.

Every source does this.

All sources are created by people, or, at the very least, interpreted by people. And all people are biased. We all have things we believe; things we know; things we cannot know. To dismiss a source because it is “biased” is therefore rather pointless. As George Gosling writes:

[I]n my seminars I recommend students ditch the term ‘bias’ altogether. There is no person, no document (no historical witness or source) that is not biased in some way or another. [I]t’s meaningless. The problem here is that labelling a source as biased sounds like you’ve actually said something when you haven’t, making it all too easy to move on to the next point without actually having made one at all. Instead, identify the perspective from which they see events or from which a source is written. That really can tell us something.

Take for example a news story in The Times from the 1960s. If an event is reported there, we can broadly assume “it happened”. The Times prided itself as the paper of record and had a reputation to uphold. However, a paper like The Times had biases about what it would report on and how it reported. What constituted “news” for a (typically) centre-right newspaper based in England? How did it choose to report on it? What had it chosen not to report on? To help us find out we will – of course – look at other evidence, other newspapers, and begin to construct an idea of “what actually happened” from multiple points of view. Why did the Daily Mirror spend more time on this event? Why did The Telegraph not cover it at all? What about Le Monde? Or Pravda? Or The Des Moines Register?

The term “bias” seems to have gained currency online in recent years, particularly in debates about sport and politics. The “mainstream media” is, apparently, biased. Of course it is – but how? Little further analysis appears to be needed beyond stating the fact.

This is damaging not because “the MSM” aren’t biased – they are, and sometimes in very dangerous ways to proper and healthy public discourse – but because this can be used as an excuse for nihilism. Everything is biased. Nothing can be trusted. There is no such thing as fact. So let’s make it all up.

Except, there is something even more dangerous than nihilism – gullibility. Believing everything that agrees with one’s predetermined position, or because it comes from a “trusted” source.

Very often, the accusation of bias is not levelled at sources that agree with us. Or rather, it seems that our biases are seen as acceptable, natural or even morally good. To dismiss, for example, everything BBC News says because of “bias” but to be willing to swallow everything posted by The Canary makes absolutely no sense on a rational level, given that one of these has much more rigorous fact-checking and professionalism.

(You can choose which.)

“Bias” is also different to “lying”. It is possible to genuinely and sincerely believe in something and to be – from a different perspective – wrong. There is a massive difference between portraying the world in a particular way to influence people and intentionally setting out to deceive. The historian has to be able to make judgements about which is which. And they have to do it within the context of the times in which the source was written – not just by “reading back” based on presentist morality.

As historians, we may well look first at more “objective” sources. If, for example, we wanted to find out what people thought about a political leader we might turn to opinion polling. We know that questions and questioners have particular biases. We know that polling in itself is not a flawless process. But we also know that there is a method and validation procedures that all polls have to go through. When we take a poll and triangulate that information with other sources (such as diaries, interviews, press cuttings, official documents, etc.), we can begin to reconstruct a picture of the past based on evidence. It may well be that the polling data were “wrong” in this greater context – but without proper debate and rigour we cannot make that claim.

But we don’t dismiss the biased ones either. Like George said – it’s how sources are biased that can be useful to historians in itself. If we want to tell the history of the EU referendum, the diaries and utterances of Boris Johnson and Nigel Farage will be incredibly fruitful. What they choose to say and why they choose to say it will be fantastic material for historians to get their teeth into.

There is an apocryphal and oft-repeated quote I see online (apparently attributable to George Orwell) that “just because it’s printed in the Daily Mail doesn’t mean it isn’t true”. Perhaps that should be our starting point. To find the truth in the evidence we find – not to simply dismiss that which does not fit our preconceived ideas.


Anti-vaccination and Corbyn: a discussion


Anti-vaccinationism is fascinating. It mixes elements of truth, facts taken entirely out of context and a smattering of outright falsehoods. But what’s really interesting is how its supporters treat all of this as a cohesive whole. It doesn’t matter which bits are true, false or somewhere in the middle. What matters is the central message. And all information can be refracted through that same lens: vaccination must be stopped.

It contains within it an elegant and quite brilliant defence strategy – any and all criticism must be based on ignorance, malice or slander. All evidence that is not “anti-vax” approved is dismissed as biased. The medical industry manufactures vaccines – therefore any evidence with the slightest connection to pharmaceuticals is immediately dodgy. Government funding? Well, they have vested interests too. The very fact they have so much evidence shows just how much they fear what anti-vaccinationists are saying – look how hard they’re trying to keep them silent! People only criticise the anti-vaccination community because they fear that if the masses knew the truth the establishment would lose their ability to make money from vaccines.

The same level of critical cynicism, however, is not levelled at evidence which agrees with the anti-vax position. If a single suspected case of vaccine damage can be found, this is taken as bona fide fact. If a single study suggests increased risks of disease within a population, this is seized upon. A vaccinated kid gets the disease they were immunised against? You’d better believe that’s going in the file.

What I find intriguing about all of this is that it purports to use the same tools of rational debate that the humanities, social sciences and hard sciences have used since they were professionalised towards the end of the eighteenth century. Find evidence to support your hypothesis, lay out these facts in a convincing manner, and spread the word amongst your community so that they can build upon your findings. Of course, the academic disciplines have rules and peer review to continually critique those findings. Anti-vaccinationists, however, have their own echo chamber, reinterpreting and reinforcing the same set of information and ideas and shielding it from any criticism. Remember – anyone who criticises them is either working for big pharma or has been taken in by their propaganda.

That is irrational. And it flies in the face of how experts traditionally talk to one another.

Meme posted by vactruth.com on Facebook. A typical anti-vaccination theme is to cast suspicion on the profit motives of vaccine manufacturers and suppliers.

Now. What on Earth could this have to do with Jeremy Corbyn?


Corbyn’s supporters also have their network of truths and half-truths, some of which I’ll delve into in a minute. But what seems abundantly clear is there is a very well-constructed defence edifice that has been built around this “social movement”. Jeremy fights against corporations, Conservatives, the right-wing of the Labour Party and those who benefit most from the status quo. Anyone who disagrees with him must therefore either have swallowed these vested interests’ lies or be actively working for them. Any evidence they provide is tainted by association, and must be rejected as suspicious. They only do it because they fear that the world will see the truth and sweep away the old establishment. By the same token, any evidence put forward by Corbyn’s supporters, regardless of context, is yet more proof of how important the cause must be.

Meme posted on Twitter by @LabourEoin. It emphasises Corbyn’s fight against vested interests, claiming that his critics attack him only because they fear he will end their dominance.

Caveat box! I am not saying all supporters of Corbyn behave in this manner. There are rational reasons to support him, and plenty of evidence that his enemies are dangerous to our interests.

I found a cache of letters written to the Minister of Health in 1942 at The National Archives. They show that many of the techniques used by anti-vaccination campaigners haven’t changed much over the decades. There’s a shift in nuance here, a change of target there. But broadly it’s the same. Vaccination is unnatural, unsafe and unnecessary. The only reason to support it is because too many people make money from it. And once people hear the truth, they will reject it all, sweeping away those who exploited them.

Let’s just take a few areas as examples. First, the use of statistics. In the letters, anti-vaccinationists rightly point out that ‘2,380 children who had received a course of immunisation developed diphtheria’.1 What they neglected to mention was that a child’s chances of getting diphtheria were significantly reduced by vaccination. Unvaccinated children were four times more likely to get the disease, and twenty times more likely to die.2

There’s no lie here. The anti-vaccination campaigners were entirely correct. But put in the proper epidemiological context, the claim was bizarre. At face value, relatively convincing; with proper analysis it actually strengthened the argument of the medical authorities.
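The sleight of hand here is easy to demonstrate with some arithmetic. The sketch below uses invented population figures (only the 2,380 case count comes from the letters; the group sizes are assumptions for illustration) to show how equal raw case counts can hide a large difference in risk:

```python
# Hypothetical illustration -- the population figures below are invented,
# not the 1942 data. Raw case counts mislead unless you also know the
# size of each group.

def attack_rate(cases, population):
    """Cases per person in a group."""
    return cases / population

# Assume a mostly-vaccinated child population (assumed numbers):
vaccinated_pop = 4_000_000
unvaccinated_pop = 1_000_000

# Equal raw counts, as a campaigner's letter might cite:
vaccinated_cases = 2_380
unvaccinated_cases = 2_380

relative_risk = (attack_rate(unvaccinated_cases, unvaccinated_pop)
                 / attack_rate(vaccinated_cases, vaccinated_pop))

# Same headline number of cases, but an unvaccinated child in this
# hypothetical is four times more likely to catch the disease.
print(relative_risk)  # 4.0
```

The campaigners’ figure is true on its face; only the missing denominators turn it into evidence for the other side.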

The same can be applied to election results. Every seat held by Labour is put down to Corbyn’s broad and unquestionable appeal. Yet the local election results of 2016 were seen as underwhelming for the party by all but the most Corbyn-optimistic pundits. Corbyn held most of the seats he was supposed to (which is better than most predicted), but he failed to make gains where his “new politics” was meant to make a difference. He ‘performed woefully in Scotland’, for example.

Again – his supporters aren’t lying when they say Corbyn did well. He didn’t lose many seats. But six years into Cameron’s premiership, this wasn’t the performance of a man with the momentum to win a Parliamentary majority in 2020. Far from it.

Meme posted at DailyAlternative.co.uk.

‘In spite of all the misrepresentation of facts and figures by medical and other interested apologists, the case against [vaccination] becomes more impressive and condemning’ wrote one correspondent whose signature I can’t decipher.3 This too reflects a key component of anti-vaccinationism’s in-built confidence. The people are “waking up” to the truth, despite the propaganda. And eventually the people will win.

Once more, this isn’t necessarily a lie. It’s not really statistically provable, but its repetition in itself strengthens the cause. Because doctors and the government were engaging in propaganda around vaccines. A war was on – but even for years after 1945, much of the poster and film work around promoting health policy was called “propaganda”.

The government made claims about vaccination being perfectly safe, painless and protecting your child from disease.4 Yet we all know that complications do happen (albeit very, very rarely), that it isn’t “painless”, and (as we’ve already seen) that vaccinated children can catch the disease. Why is the government exaggerating? What is it hiding? And if I’m noticing this, surely others must be too?

So, back to Corbyn. It’s true, the press has never given him a fair shake.5 Nor did many in his Parliamentary party. Why do they try so hard to discredit him? What are they worried about us knowing or finding out? Since all my friends are seeing this, it can’t be long before the whole nation embraces him.

But… there is always a but. Vaccination is safe. Complications are so rare as to be statistically negligible (though clearly devastating for any family affected by them). The needle does prick and cause discomfort, especially in small children, but it is nowhere near as painful as actually getting diphtheria. And we already know your chances of getting the disease are slim if you get the jab.

Similarly, maybe the press is out to get Corbyn. But he has done and said some pretty questionable things. Taking money from Iranian state TV, for example, while claiming to be pro-gay rights and civil liberties seems to be contradictory. It at least requires addressing if you’re going to attack people for previous employment choices. He clearly does have a problem delegating and managing a large team, regardless of whose interests it is for that story to come out. Just because someone is exaggerating and has an interest doesn’t mean – necessarily – that they’re wrong.

Image from David Icke, a well-known conspiracy theorist. Corbyn is shown here on RT, Russia’s state TV channel – also not the most “freedom friendly”.

I could go on.

The point here is not to say that Corbyn’s supporters are liars. I don’t think they make stuff up. Nor is it to say that there is no truth or power to their cause. Rather, it is that their selective use of evidence, and the self-proving mythology about why and how they are criticised, function in much the same way as the anti-vaccination movement’s.

Both groups have evidence on their side – and again, for clarity, Corbyn has far more truth on his side than anti-vaccinationists. But this doesn’t really matter. Context is largely irrelevant, and criticism is blunted when the rules of rational debate are subverted. Vested corporate interests are destroying our communities. New Labour did stifle debate and the voice of the working classes. We should have leaders who are willing to challenge established institutions so that they work more fairly for the majority and not just for the privileged few.

And most importantly of all, the Parliamentary Labour Party is a fucking mess that couldn’t organise a piss up in a brewery.

But cherry-picking evidence within that framework to prove Corbyn is the man to change all that doesn’t serve those purposes. It leaves us without a rational defence against the Tories. It leaves us without an internal rational debate within the left as to how best to build coalitions and partnerships with those sympathetic – but not yet converted – to our cause. In short, it leaves us with irrational politics. Voids in the extremes to be filled by fascists and populist racists to the right and dogmatic anti-establishment cults of personality on the left.

And here I am – the academic in his ivory tower – stuck in the middle with you.

Title image from Wikipedia. Photograph by Garry Knight, 4 August 2014.

  1. The National Archives (hereafter TNA): MH 55/1754, Ada Henderson to Ministry of Health, 13 October 1942.
  2. Summary Report of the Ministry of Health for the year ended 31st March, 1943 (Cmd. 6468).
  3. TNA: MH 55/1754, ??? to Ministry of Health, 19 October 1942.
  4. TNA: BN 10/229, Ministry of Health & Ministry Information, “Immunisation against Diphtheria Campaign, 1942”.
  5. Bart Cammaerts, ‘Our report found that 75% of press coverage misrepresents Jeremy Corbyn – we can’t ignore media bias anymore’, Independent (19 July 2016) < http://www.independent.co.uk/voices/jeremy-corbyn-media-bias-labour-mainstream-press-lse-study-misrepresentation-we-cant-ignore-bias-a7144381.html > (accessed 22 July 2016).

2010 – Wikileaks


Sensitive documents have always been leaked. Sometimes these come from “whistle blowers”, unhappy at the way institutions or government departments have acted. Sometimes these can be deliberately managed by institutions to try to control the media narrative or support their cause. Social media and mass use of the internet, however, have meant that the volume of such leaks has exploded exponentially.

In 2010, Wikileaks released a series of documents relating to the Afghan and Iraq wars. It then proceeded to publish a swathe of diplomatic cables between the United States and her allies. Chelsea (née Bradley) Manning was eventually court-martialled for her role in providing the material to Wikileaks.1 The US government still wants to get its hands on the founder, Julian Assange. He is currently holed up in the Ecuadorian Embassy on an unrelated matter.2

In 2013, Edward Snowden was also involved in the publication of a tranche of National Security Agency (NSA) documents, detailing how the United States spied on its own and foreign citizens in ways that, to put it mildly, appeared to stretch the law and the US Constitution to breaking point.3 Both the Wikileaks and Snowden leaks sparked a worldwide public debate on the limits of privacy, government confidentiality and state power vis-à-vis the people.

One of the most powerful political interviews of recent memory was conducted by British comedian John Oliver for his HBO programme Last Week Tonight. In it, Oliver managed to convey just how serious the NSA’s programmes were while simultaneously showing that a) high-volume leaks have security consequences and b) not everyone actually cares. The deflated look on Snowden’s face when Oliver showed him vox pops from Times Square was worth a thousand words.

Where people have discussed the issue, the simplification of the matter in much of the public discourse has been highly problematic. Perhaps the medium of satire allowed Oliver to explore the argument in a more rounded way. Yet if Twitter and Facebook are anything to go by, it seems very difficult to see either Assange or Snowden as anything other than heroes or villains. One cannot deny that exposing these activities was in the public interest (rather than simply “shit the public is interested in”). But it is also undeniable that people have been sacrificed for this noble cause. Chelsea Manning is currently in jail for espionage offences while Assange remains free (albeit in self-imposed confinement in the Ecuadorian Embassy). Snowden is being protected by the Russian state, but the mishandling of his material compromised the positions and intelligence of spies across the globe.4

There is a question about how confidential anything can be in the digital world. In the past, the existence of insecure data did not necessarily result in wider public circulation. It is not enough that something is said or written – it needs to be reproduced and disseminated in order to be read. Enough media outlets need to report on the contents. Enough people need to care about it for it to become news. There is also a massive difference between the individual’s right to privacy and the rights and needs of government departments to restrict access to certain information.

Personal privacy is a minefield for another time. However, governments have always had issues with controlling potentially compromising “facts”. For example, the Black Report into health inequalities was deliberately published with little fanfare and in limited copies. The new Thatcher government did not want the details becoming wider knowledge, as the idea of social medicine and the impact of wealth inequality on health outcomes ran counter to their political ideology.5 Regardless of the rights or wrongs of that approach, campaigners were able to circulate the report by making their own copies and disseminating it amongst researchers and journalists. Later reports confirmed that Black was correct. The attempted suppression failed. Nowadays the report can be easily downloaded and found in digitised archives.6

Whether by accident or design, the other way of keeping uncomfortable information hidden is to bury it under mounds of data. Moves towards “open government” in the United Kingdom have been very useful to researchers like me. I can very easily gain access to pretty much any government report from the past ten years, as well as regular statistical digests from departments of state. At the same time, this information requires expertise and – most importantly of all – time and resources to read, parse and analyse with any degree of accuracy and confidence. The Spartacus Report is just one example of how citizen activists can use such material.7 But full critique and opposition to government policy nowadays demands a level of meticulousness that the average lone citizen cannot manage.

It seems that mass leaks of documents will become a fact of life in the twenty-first century. As long as the press and citizen activists are able to read, digest and communicate this information accurately and effectively, it will help to scrutinise government activity. However, at the same time as such information becomes available, the staffs of journalistic institutions are shrinking. We are entering a paradoxical age in which we have more access to information than ever – but perhaps a decreasing capacity to understand it. Time will tell how we as a society deal with these issues.

Cover image courtesy of Wikicommons. Photograph by nick.hider.
This post was originally published on 26 August 2016.
  1. ‘United States diplomatic cables leak’, Wikipedia < https://en.wikipedia.org/wiki/United_States_diplomatic_cables_leak >(accessed 26 August 2016).
  2. OR IS IT?!?!?!?! Yes. It is. He’s accused of rape in Sweden. Supporters of Assange claim this is a politically-motivated charge. I will not be touching the debate with a ten-foot barge pole. At least not without getting explicit consent first. Google it. Or see: Nick Davies, ’10 days in Sweden: the full allegations against Julian Assange’, The Guardian (17 December 2010, 9.30pm GMT) < https://www.theguardian.com/media/2010/dec/17/julian-assange-sweden > (accessed 26 August 2016).
  3. See: ‘The NSA Files’ section at The Guardian < https://www.theguardian.com/us-news/the-nsa-files > (accessed 26 August 2016).
  4. ‘British spies “moved after Snowden files read”’, BBC News (14 June 2015) < http://www.bbc.co.uk/news/uk-33125068 > (accessed 26 August 2016).
  5. Virginia Berridge (ed.), The Black Report and The Health Divide (London : LSHTM, 1999).
  6. For example, see: ‘The Black Report 1980’, Socialist Health Association < http://www.sochealth.co.uk/national-health-service/public-health-and-wellbeing/poverty-and-inequality/the-black-report-1980/ > (accessed 26 August 2016).
  7. Diary of a Benefit Scrounger, S J Campbell, Anon, Sue Marsh, Kaliya Franklin, Declan Gaffney, Anon, Mason Dixon, Leigh James, Sam Barnett-Cormack, Rhydian Fon-James, Dawn Willis and Anon, Responsible Reform: A Report on the Proposed Changes to Disability Living Allowance.

2009 – H1N1


Overall … I consider [the British government’s] response to have been proportionate and effective. There is much good practice on which to build. I heard nothing but praise and admiration from those interviewed for the health service and health protection staff right across the UK who led the response to the pandemic. Their dedication and professionalism … despite the additional pressures of the pandemic must be acknowledged and congratulated.

Deirdre Hine1

Swine ‘flu caused quite a panic in 2009. The associated virus – H1N1 – had been responsible for the great 1918 pandemic that killed an estimated 50 to 100 million people in the wake of the First World War.2 After almost a century without such a devastating pandemic, there were legitimate concerns from health authorities that the disease could cause similar levels of destruction.

Of course, we know that this didn’t happen. Around 18,000 people died directly of the disease, with an unknown number of others dying of related complications. Because Armageddon never came, the World Health Organisation and other authorities were accused of fear-mongering. Even its own advisers accused WHO of wasting ‘huge amounts of money by investing in pandemic scenarios whose evidence base is weak’.3

The H1N1 virus under an electron microscope.

And yet, as quoted at the beginning of this piece, it turned out that caution was probably advisable. Influenza is a killer; and certain strains of the virus are more virulent than others. We are, if history is any indicator, “overdue” a mass pandemic. Understandably, given the monitoring systems in place, WHO gets a bit jittery when such virulent strains appear to be making a comeback.

There is a common-sense belief in “the boy who cried wolf” – if authorities continue to predict doom and nothing dramatic ever happens, then the public will not listen to warnings when disaster actually arrives. Yet a 2007 study into tornado alarms seems to contradict this assertion. People do respond to alarms “just in case”; and when one factors in “near misses” and other data, the rates of “false alarms” are probably not as high as originally supposed.4

(We see similar attitudes towards opinion polling, especially given its supposed “inaccuracy” in the 2015 General Election and 2016 European Union Referendum. In both cases, the margins of error in the polling data were around 1.5 to 2 per cent5 – and when these are factored in, the polls were, in fact, broadly accurate.)
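For anyone curious where such margin-of-error figures come from, the standard formula for a sample proportion can be sketched as follows. The sample sizes below are assumptions for illustration, not the actual poll samples:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% confidence margin for a sample proportion p
    from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# The margin is widest at p = 0.5. Sample sizes here are illustrative:
# n = 2,000 gives roughly +/- 2.2 points, n = 4,000 roughly +/- 1.5.
for n in (2000, 4000):
    print(f"n={n}: +/- {100 * margin_of_error(0.5, n):.1f} points")
```

Note too that a headline *lead* between two parties compounds the errors on both shares, which is why a lead within a couple of points is effectively a statistical tie.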

H1N1, then, represented a number of issues in a modern, globalised Britain. A distrust – or at least scepticism – about the ability of experts to communicate risk to the public. Underlying concerns about the vulnerability of an interconnected world to pandemic infectious disease. And governments treading the line between unduly panicking their citizens and protecting them from infection.

In June 2009, WHO declared H1N1 to be a Phase 6 pandemic – that is, it was present and infecting people in most regions of the world.6 However, it did declare the outbreak to be “moderate” – that is:

1. Most people recover from infection without the need for hospitalization or medical care.
2. Overall, national levels of severe illness … appear similar to levels seen during local seasonal influenza periods … .
3. … Hospitals and health care systems in most countries have been able to cope … .7

The UK put in place a number of pandemic measures during 2009 and 2010 in order to cope. It had been preparing for the possibility of an influenza pandemic since 2002. The British health authorities had in mind a more virulent strain – possibly avian flu, which had caused significant damage in Asia during the mid-2000s.8 The preparations included stockpiling the anti-flu drug Tamiflu, which has since received immense criticism for being ineffectual against most of the more virulent forms of the virus (including H1N1).9

Dame Deirdre Hine’s 2010 report, however, shows that without the benefit of hindsight the Labour government probably did the right thing. As she noted, this was the first UK-wide crisis that required co-ordination across the four devolved governments. There were also two ways to attack a potential crisis – prepare for the worst (i.e. over-prepare “just in case”) or prepare for the most likely outcome. The government went for the ‘reasonable worst-case scenario’, which meant they were probably “over prepared”, but had enough slack in the system to cope if the situation had been worse. Where they had caused public confusion was by publishing their estimates and suggestions before the final plans were put in place.10

The question remains, however – how will people respond to the next pandemic? Ebola may have been dramatic, but it did not really affect most Westerners. Similarly, the Zika virus is an unknown quantity. The press coverage and government responses have mostly been cautious. When there is another flu pandemic, how bad will it be? Will people take the threat seriously? Will governments under-prepare, having been burnt in the past (and operating under externally and self-imposed resource constraints)? Only time will tell.

With special thanks to Sue Taylor at the Centre for History in Public Health for pointing me in the direction of useful material on the crisis.
This post was written on 17 August 2016.
Banner image from Wikicommons.
  1. Deirdre Hine, The 2009 Influenza Pandemic: An Independent Review of the UK Response to the 2009 Influenza Pandemic (London : Cabinet Office, 2010) < https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/61252/the2009influenzapandemic-review.pdf > (accessed 17 August 2016).
  2. See: ‘1918 flu pandemic’, Wikipedia < https://en.wikipedia.org/wiki/1918_flu_pandemic > (accessed 17 August 2016); ‘2009 flu pandemic’, Wikipedia < https://en.wikipedia.org/wiki/2009_flu_pandemic > (accessed 17 August 2016).
  3. Smitha Mundasad and Paul Rodgers, ‘Billions wasted on swine flu pandemic that never came’, Independent on Sunday, 16 May 2010, 36.
  4. Lindsey R. Barnes, et al., ‘False alarms and close calls: a conceptual model of warning accuracy’, Weather and Forecasting 22 (2007), 1140-1147.
  5. Anthony Wells, ‘Understanding margin of error’, YouGov (21 November 2011, 11.55) < https://yougov.co.uk/news/2011/11/21/understanding-margin-error/ > (accessed 17 August 2016).
  6. ‘2009 flu pandemic’.
  7. ‘What is phase 6?’, World Health Organisation (updated 11 June 2009) < http://www.who.int/csr/disease/swineflu/frequently_asked_questions/levels_pandemic_alert/en/ > (accessed 17 August 2016).
  8. Hine, The 2009 Influenza Pandemic; ‘Global spread of H5N1’, Wikipedia < https://en.wikipedia.org/wiki/Global_spread_of_H5N1 > (accessed 17 August 2016).
  9. James Gallagher, ‘Tamiflu: Millions wasted on flu drug, claims major report’, BBC News (10 April 2014) < http://www.bbc.co.uk/news/health-26954482 > (accessed 17 August 2016).
  10. Hine, The 2009 Influenza Pandemic, esp. 4-8.

2008 – The Great Recession


29 September 2008 – New York

The economic climate of the early twenty-first century will be explained with reference to the stock market crash of 2008, because exactly the same circumstances that allowed the economy to grow so much over the 2000s led to the recession that continues to affect the Western world.

Boom and bust cycles are nothing new. They are one of those inevitable laws of capitalism – whether you’re a Marxist or not. Speculation creates bubbles, and those bubbles can burst. It’s happened with tulips, South American trade monopolies, and good old-fashioned stocks and shares. This time, it was through ‘sub-prime mortgages’.

This is a technical term for lending money to people to purchase their own homes when you know:

a) their houses are worth fuck all; and
b) they don’t have the money to pay you back.

The idea was that these debts could be sold on, creating enough money that if someone defaulted on their payments, the owner of the debt could repossess the house and start the cycle again.

Who could have possibly predicted that if the economy slowed, people with no money and houses worth nothing would default?

As Andy Zaltzman put it (to paraphrase):

It’s like slamming your testicles in a car door. Until you actually do it, you don’t know that it will hurt.

Similar schemes for speculation, using weird financial “products”, had been used for years to prop up the financial sector. That wouldn’t necessarily have been a problem – except that many Western nations had become increasingly reliant upon finance to keep their economies expanding. Successive governments, especially in the UK and US, had relaxed financial rules, allowing even greater speculation.

Moreover, it had become deliberate policy to move what had been public services into the private sector as a means of reducing the national budget and making the economy more “competitive”. This meant that many countries required private investment in “public” projects – and that meant trying desperately to keep the financial markets happy by lowering taxes, reducing burdensome paperwork, and handing contracts to the lowest bidders.

Caroline Lucas, Jeremy Corbyn and Nigel Farage. Each, in different ways, enjoying electoral success as an "alternative" to the pre-2008 status quo. (Sources: left | centre | right)


This has had a somewhat interesting effect on politics in many of these nations. Some, like Greece, were almost completely wiped out, lacking enough of a financial base to cope with the storm. Others were able to bail out their banks and continue investment to drag themselves out of the hole, like Germany and the United States. Still others jailed those who had caused the crash, devalued their currency and made a strong recovery, like Iceland. Britain… kinda just kept doing what it had been doing.

After years without a true recovery – at least one that those in the middle of the country could notice in terms of rising wages and higher standards of living – the traditional party political allegiances have started to fall apart. This, if anything, shows us that Fukuyama was wrong. History is still alive and kicking.

Let’s look at Britain.

On the left, parties promoting greater public involvement in the economy and in matters of social justice have enjoyed a resurgence. Rejecting the “Blairite” model of gradual reform within a technocratic, liberal economy, the Green Party has made major strides, allying more traditional leftism with environmental and identity politics. Labour has just elected a left-leaning leader too, much to the chagrin of the old guard.

On the right, a general sense that liberalism has eroded British values and – perhaps more pertinently – the living standards of the middle classes has led to the rise of anti-establishment voices. While their leader may be a middle-class financier, UKIP’s core support has come from those who feel that the European Union and “Thatcherite” mainstream politics have left them vulnerable to outside threats.

For the next decade, the political landscape in the West will be very interesting. Clearly, the post-1980 consensus is breaking down. But quite how it will resolve itself is anyone’s guess. In any event, the 2008 crash is one of the pivotal moments in that history.

This post was originally published on 14 September 2015. The publication date has been changed to maintain the post order.

2007 – The iPhone


29 June 2007

I find it quite hard to believe that the smart phone is less than a decade old. I remember not having a mobile phone, even when many of my friends had them. I didn’t really see the point. Now I’m like most people in their twenties: glued to my screen.

Of course, I probably don’t use my device anywhere near its fullest potential. I’m not like other people at the London School of Hygiene and Tropical Medicine who use theirs for detecting eye diseases. But I can tweet. And check football scores. Which is basically the same thing.

If it's not an iPhone, it's... oh. Wait. It is. (Source)


While camera phones and “feature” phones had been around for a good number of years before, it wasn’t until the iPhone that we got the first real “smart phone”: a handheld computer that could connect to the internet using the mobile phone network, and could install third-party applications to perform a multitude of tasks while on the move.

So while “Personal Digital Assistants” (PDAs), camera phones, mobile e-mail devices, messengers and satellite navigation systems were already commonplace, the iPhone was the first to successfully bring these products together into one device.1

Twat taking a selfie.


Historians may come to see the rise of the smart phone as a blessing and a curse.

On the one hand, we have countless images of daily life in the Western World; and certainly more photographic evidence of life elsewhere than at any other point in history. As people post their lives onto the internet, collating and annotating as they go, we have a rich source of information that will keep cultural historians occupied for decades.

On the other, there is simply too much information. Moreover, we are seeing a world refracted through the lens of the mobile phone camera. To be sure, we have always had to deal with the distorted nature of historical evidence. But it’s becoming unclear where the borders between the “reality” of life and the smart-phone-inflected social media world can be drawn. Perhaps – indeed – that is the point of our times.

Still, smart phones were a key part of what became known as the Arab Spring (c. 2010-13). They have helped document brutality by repressive regimes across the world. And it is not trivial to compare the coverage of beatings in Iran with the shooting of civilians by American police forces in recent years. Amateur still and video footage has become a key part of the rolling news cycle, and not simply because it provides easy and cheap content for broadcasters and newspapers. Rather, it supplies fragments of evidence upon which a more rounded journalistic story can be built.2 It may turn out that this is how the history of our time is reconstructed, too.

Smart phones may reflect our place in history, but they’re also helping to create it. When information can be spread so quickly, it gives news and current events the chance to reach people like never before. This may in some way amplify scandals beyond where they might have reached in the past. “Wrongdoing”, if it captures the mood of the world at a particular moment, can explode into something inescapable. But then, so can acts of kindness and hope. The problem for historians is that these stories used to play out over days, months or even years. Now they can flare, disappear and resurface in a matter of hours.

In fairness, however, this may be perception. It could turn out that each of these microscandals plays into a much longer and more mature narrative than we can appreciate at this moment. Because we are caught in exactly the same hyperbolic cycle. Will these records of the past – which the iPhone has made possible – give us new perspectives? And is what we are doing really new? Or is it simply the massive scale upon which this pub gossip spreads around the planet that makes things seem so much more important than they “really” are?

The author currently owns a Samsung S5. If any manufacturer wants to offer him a new phone for free, he won’t say no.

Cover image by Arnob017. (Source)
  1. Certainly, it was the first to popularise it. I really don’t want to get into this argument with people. Feel free to have a slanging match in the comments, though. See ‘History of the iPhone’, Wikipedia < https://en.wikipedia.org/wiki/History_of_the_iPhone > (accessed 25 August 2015).
  2. Which is not to say that the media don’t lean heavily on this footage and use it as an opportunity to slack off on occasion…

2006 – Pluto


24 August 2006 – Prague

Two and two is four. Apples fall when you drop them. The Battle of Hastings was in 1066. And there are nine planets.

Facts, all of them, on 23 August 2006. Then Neil deGrasse Tyson and his cronies ruined it.

The reaction to Pluto being downgraded from planet to mere dwarf planet got a lot of people angry. Really angry. But, why does it even matter? Whether Pluto is a planet or not doesn’t have a material impact upon our daily lives. It’s still there, orbiting the Sun like it always did. We just don’t think of it as one of the eight planets.

There’s something important about a “fact”. We’re living in a world where there are fewer and fewer of these “facts” that we can hang onto. For an Englishman a couple of hundred years ago it was a fact that there was a God. Now, even the “fact” that men and women should have their own bathrooms is becoming less solid.

Pluto as captured recently by New Horizons. Photo from NASA. (Source)


This isn’t going to descend into the overly simplistic speech from Men In Black. But there is something quite interesting about how facts are socially constructed.

We teach undergraduates pretty early on that there’s no such thing as a fact in history. Sure, we can roughly agree that something called the Second World War ran from 1939 to 1945. But it depends on your definitions. The Italian invasion of Abyssinia could mark the start. Or the Japanese invasion of Manchuria. Or, even, the declaration of war by the United States. And when did it end? When Germany surrendered? When Japan surrendered? Or did the Soviet occupation of Eastern Europe up until the early 1990s count as part of the war?

We’re worthless humanities scholars though. We revel in telling scientists that they’re making it all up. Sort of. The kid at the back of the class thinking they’re cool by scoffing at the teacher.

Yet, the sciences are supposed to deal with facts. Little nuggets of information that are universally true. We spend hours in chemistry classes heating copper sulphate to turn it white – then putting water on it to turn it blue again. This means water is blue, or something.1

We find pretty quickly as we go on, however, that even the scientific world is a little more complicated than it appears in high school classes. Aeroplanes fly by making the air go faster over the top of the wing, reducing the air pressure and sucking it up. (Except that’s not quite true.) The Earth’s gravity pulls things down at 10 m/s². (It doesn’t.)2

The public does hold on to certain social “facts” that keep us going. A kind of “common sense” (or senso comune as Gramsci called it, but he was Italian). Pluto being a planet was one of those things. And so, there was a lot of public anger when this fact was taken away. There didn’t seem much appetite, for example, to upgrade Ceres from asteroid to planet; or to include Makemake and Eris as new planets. Pluto appeared to warrant a place among the planets simply because it was discovered before we realised how much messier the Solar System is than we once believed.

The enduring popularity of shows like QI demonstrates, however, that we sometimes like our facts to be challenged. Partly this gives us an illusion of some sort of inside knowledge that “most” people can’t access. Partly it allows us to explore beyond what we (think we) know. But the big ones don’t tend to go without a fight.

I’ll end with wild speculation. We find truth in the stars. We have done for as long as we have recorded history. The stars were our calendar. Our compass. We on Earth were the centre of that cosmos. Over the years, bloody battles have been fought over the predictions of various forms of astrology; over whether we truly are the centre of the solar system; and later what our place is within a vast, vast universe. Pluto, then, was more than a rock spinning around the Sun. It was the latest in a long line of truths in the heavens that was being taken away.

Either that, or some people REALLY need to get out more. Dude. It’s a rock. Suck it up.

  1. Have I succeeded in making a scientist have a heart attack yet? #TrollMode
  2. Both of these things were taught when I was at school.

2005 – Hurricane Katrina


29 August 2005 – New Orleans

It’s been ten years since Katrina tore through the South East of the United States, killing nearly two thousand people and causing billions of dollars’ worth of damage. The impact on people’s lives is being felt to this day.

Photo by Infrogmation. (Source)


Natural disasters such as this are, for obvious reasons, highly publicised in the media. There is something awesome (in the literal sense of the word) about an unstoppable force, causing so much devastation and producing such dramatic images for television and print. That is not to say the media should be blamed for this.1 These events capture the public imagination, and the appetite to hear more about them is clearly there.

Now. This raises a number of questions for historians about how people react to disasters. Living in a country that is so rarely affected by earthquakes, tropical storms, volcanoes or tsunami, my focus is often on the observers. How do those in remote locations deal with the news of disasters?

This matters, because sometimes those people in remote locations are the ones with the power to act: indirectly, through charitable donations, logistical support and international co-operation; and directly, as heads of government with jurisdiction. What made Katrina so iconic in the popular consciousness was not just the devastation it wrought – it was that the richest country on the planet was completely unable to rebuild one of its most important cities, or provide it with the support that it clearly required.

So many disasters occur in parts of the world that already have myriad issues with their political, economic and transport infrastructure. When, just a few months previously, the Boxing Day Tsunami hit the South East Asian coast, there was a massive reaction from people across the world. British people alone donated over £390 million of private money through organisations such as the Disasters Emergency Committee (DEC), and the government pledged a further £75 million.2

The aftermath of the 2004 tsunami on Aceh, Indonesia. Photo by AusAID. (Source)


At the same time, we often do very little (in terms of a percentage of the public finances) to build infrastructure so that these disasters have less of a long-term impact. The foreign aid budget remains a controversial topic, with a not-insignificant proportion of the population subscribing to the mantra “charity begins at home”. Even when we do give, it is often in a paternalistic relationship, based on a very Western idea of “humanitarianism” towards those “other” parts of the world.3

This is not a condemnation – dramatic events often provoke more of a reaction than the general, mundane grind of international poverty. But as a historian, these things matter. They uncover one of those paradoxes of charity, especially in England. We will (as a public) donate our time and/or money to a soup kitchen or food bank – but we won’t commit to the sorts of economic reforms that would provide the levels of social security, housing, employment and health care that would render those charitable acts moot. As one commentator put it in the 1960s, the welfare state is ‘the ambulance waiting at the bottom of the cliff’.4

The VAHS will tell you all about these sorts of nuances, which I don’t have time for here. Suffice to say, Katrina broke a number of the stereotypes. Because this happened in a country that was rich enough and had the infrastructure to clean up New Orleans. And yet for so many political reasons it didn’t.

The criticisms of President Bush, the Federal Emergency Management Agency, the State of Louisiana and the City of New Orleans are widely known. Poor management and planning at all levels of the federal system in the United States led to what can only be accurately described as a clusterfuck.

What is intriguing for historians, however, is the way it exposed on a local level what we often see on the international stage. New Orleans was a poor(er) city, with economic and social problems that extended way beyond the damage inflicted by the hurricane. When Kanye West declared “Bush doesn’t care about black people”, it struck a nerve because it represented decades of neglect of poorer (often black) areas of the country. While the nation united in philanthropic donations and condemnation of the governments’ responses, many of the structural economic problems remain in the Southern United States.

And on that cheery note – don’t take this as an excuse not to donate to DEC appeals. The work they do is vital. But we need to be more critical of the systems which continue to allow natural disasters to do so much damage and last so long when we have the technological know-how to fix many of these problems.

  1. We’ll have plenty of opportunity to do that in future articles, I’m sure…
  2. Saleh Saeed, ‘DEC 50th Anniversary: 2004 Asian Tsunami’, DEC (16 December 2013) < http://www.dec.org.uk/articles/dec-50th-anniversary-2004-asian-tsunami > (accessed 25 August 2015); ‘Humanitarian response to the 2004 Indian Ocean earthquake’, Wikipedia < https://en.wikipedia.org/wiki/Humanitarian_response_to_the_2004_Indian_Ocean_earthquake > (accessed 25 August 2015).
  3. Kevin O’Sullivan, ‘Humanitarian encounters: Biafra, NGOs and imaginings of the Third World in Britain and Ireland, 1967–70’, Journal of Genocide Research 16(2/3) (2014), 299-315.
  4. Megan du Boisson, founder and head of the Disablement Income Group, speaking in an interview: The Times, 1 February 1969.

2004 – The Facebook


4 February 2004 – Cambridge, MA

Social media is… no… social media are everywhere. But one true platform rules them all. At least in the West.

Facebook’s reach is rather remarkable compared to other platforms. At the end of 2014, it had 1.4 billion users. By comparison, Twitter – the darling of academics and journalists – had only half a billion.1 That allows a great number of people to communicate easily across the entire world. This can cover everything from organised resistance against oppressive governments to cat pictures. In my comfy little corner of the ivory tower, it’s usually the latter.

Gratuitous cat.


These new networks have certainly changed the way I communicate with colleagues and friends. Twitter has allowed me to maintain contact with other historians that, once the hangover of the conference has worn off, would have been much more difficult to keep up. I know for a fact that I would have lost contact with many of my school friends entirely. Luckily for us, Facebook launched in the United Kingdom very soon after we left for our respective universities.

We had tools to do this when we were teenagers. Internet forums were a way to meet new people, as were the various chat rooms available through mainstream sites and over the Internet Relay Chat protocol. Incidentally, if parents are worried today about what their wee bairns are up to on Snapchat, then imagine what your kids would have been up to on a totally insecure and anonymous chat protocol that your parents weren’t internet savvy enough to understand. Sorry, mum and dad. Don’t worry. My virginity wasn’t taken by a 43-year-old lorry driver.2



But this isn’t about my dalliance with Frank in the glorious summer of ’01. This is about history. And social media provide some tricky problems for historians. They are usually hidden behind security measures. Facebook, for instance, has myriad privacy settings, and most people can only read (or are likely to find) content posted and linked to by their friends.

At the RESAW conference at Aarhus this year, this was explored in detail. Historians of the internet are now starting to use the archived material of the web. But social media aren’t necessarily the web. Apps are very often used to access the data held on the services’ servers. While tweets, for example, may be public, you need to read them in a specific context: people consume them as feeds of individuals’ “microblogs”. The Library of Congress can hold all this information, but how on earth are we going to make sense of it?

So much has been lost. Of course, history has also lost the verbal conversations of working-class people down the pub, and the discussions held late into the night in the eighteenth-century coffee house. What is more frustrating is that we KNOW people wrote and sent these messages to each other. All we can ever read of them are the occasional snippets that happen to survive in blogs, journals or personal data files.

The Bulletin Board Systems of the 1980s have been mostly lost – though we do have histories that can be told.3 Geocities has been shut down – though we do have an archive we can begin to analyse.4 But the meat of the content is gone, and won’t be coming back. How much of the stuff we have now will go the same way?

We are trying to record this stuff. But as a historian of post-war Britain, I am more interested in a larger question – how has social media changed, or how will it change, the way Britons behave? What has changed in our personal relationships; the way we meet; the way we part; the ways we vote, organise, and understand the universe? Having lived through it, I can’t tell whether I’ve changed the way I behave because I’m getting older, because of the technology and social fabric of Britain, or – more likely – because of the relationship between the two.

This may be a question we can only answer with some historical distance. But it’s worth asking now. Perhaps my 30-for-60 in 2045 will be able to give a more useful conclusion…

The eagle-eyed amongst you will note this piece was written and published on 9 August 2015. The publication date on this WordPress entry has been changed so that the weekly update pattern is maintained in the database, and the post appears in the right order.
  1. ‘Facebook’, Wikipedia < https://en.wikipedia.org/wiki/Facebook > (accessed 9 August 2015); ‘Twitter’, Ibid. < https://en.wikipedia.org/wiki/Twitter > (accessed 9 August 2015).
  2. Despite his best efforts.
  3. See the work of Kevin Driscoll at his personal site.
  4. Ditto Ian Milligan.

2003 – The Iraq War Protests


15 February 2003 – Various

Despite the numbers, the war went ahead anyway. The images over the following years became almost as iconic as those of the millions marching through London and other cities. Bush in front of the “Mission Accomplished” sign; the toppling of the Saddam statue; the subsequent trial and execution. The question is, then – what was the fucking point?

The protest failed to achieve its main goal, but it is beginning to be historicised into a wider narrative of mass protest and voluntary action. It was in many ways one of the first “internet” demonstrations, with millions of protesters brought together through digital technologies such as e-mail and websites. (This was before Facebook and Twitter – but more on those in upcoming weeks.) Movements such as Occupy may have had similar headline “failures”, but they have acted as a focal point for protest against the dominant neo-liberal political framework in the Western world.

Indeed, the breakdown of the party systems in Britain and America especially has made this sort of extra-Parliamentary form of protest increasingly potent and necessary. For while the Labour Party and Conservative Party differ on a number of ideological points, many of the key decisions about how to run foreign and domestic affairs have converged. Neither supports nationalisation of key public services; both believe in a strong military, including a nuclear arsenal; both play the realpolitik game of getting close to dictatorships in various parts of the world in return for good arms contracts and a steady supply of oil. Crucially, both supported the Iraq War, even if there were dissenting members from the parties at the time and subsequently.

This has been going on for a while, however. Voluntary organisations and charities have always been politically driven – you cannot set out to solve a social problem without taking a political position. While many of the larger institutions have, in the past, steered well clear of party politics, there has often been a direct or indirect moral cajoling of those in local and national government to enact policies that will help charities get on with their vital work.

In the 1960s, however, we began to see more assertive groups coming forward. Charities that did not necessarily provide services themselves, but deliberately spent their money on researching the social problems of the day and lobbying the government to fix them. The Child Poverty Action Group, the Disablement Income Group, Shelter and many others appeared during this time. They were willing and able to use the growing mass media to present their cases in increasingly sophisticated ways. And, to varying degrees, they have had success with governments of both parties right across the late-twentieth and into the twenty-first century.

The growing professionalism of those groups in this new political climate, however, meant that they became specialised. Social campaigners may have had many concerns, but the charities themselves were often very narrowly focused. The big questions – traditionally the preserve of the political parties – were beginning to be diffused through a range of institutions and organisations, few of which would ever hold direct power in Westminster or City Hall.

The Iraq protest, then, represented more than just the war. For many, it was the first time in a generation that people had been able to broadly agree on a particular action and – crucially – had the tools to mobilise quickly and effectively across the world. Occupy, and the struggles of the 99% have been similarly branded. They represent growing disquiet on, predominantly, the political left with the party system and the post-war levers and apparatus that are supposed to impose democratic will on the law of the land. That they have been unsuccessful may say more about the increasing distance between the machinery of government and the people than it does about the protesters themselves.
