historical stuff by Gareth Millward


Anti-vaccination and Corbyn: a discussion

22/07/2016

Anti-vaccinationism is fascinating. It takes elements of truth, facts lifted entirely out of context and a smattering of outright falsehoods. But what’s really interesting is how its supporters treat all of these as a cohesive whole. It doesn’t matter which bits are true, false or somewhere in the middle. What matters is the central message. And all information can be refracted through that same lens: vaccination must be stopped.

It contains within it an elegant and quite brilliant defence strategy – any and all criticism must be based on ignorance, malice or slander. All evidence that is not “anti-vax” approved is dismissed as biased. The medical industry manufactures vaccines – therefore any evidence with the slightest connection to pharmaceuticals is immediately dodgy. Government funding? Well, they have vested interests too. The very fact they have so much evidence shows just how much they fear what anti-vaccinationists are saying – look how hard they’re trying to keep them silent! People only criticise the anti-vaccination community because they fear that if the masses knew the truth the establishment would lose their ability to make money from vaccines.

The same level of critical cynicism, however, is not levelled at evidence which agrees with the anti-vax position. If a single suspected case of vaccine damage can be found, this is taken as bona fide fact. If a single study suggests increased risks of disease within a population, this is seized upon. A vaccinated kid gets the disease they were immunised against? You’d better believe that’s going in the file.

What I find intriguing about all of this is that it purports to use the same tools of rational debate that the humanities, social sciences and hard sciences have used since they were professionalised towards the end of the eighteenth century. Find evidence to support your hypothesis, lay out these facts in a convincing manner, and spread the word amongst your community so that they can build upon your findings. Of course, the academic disciplines have rules and peer review to continually critique those findings. Anti-vaccinationists, however, have their own echo chamber, reinterpreting and reinforcing the same set of information and ideas and shielding it from any criticism. Remember – anyone who criticises them is either working for big pharma or has been taken in by their propaganda.

That is irrational. And it flies in the face of how experts traditionally talk to one another.

Meme posted by vactruth.com on Facebook. A typical anti-vaccination theme is to cast suspicion on the profit motives of vaccine manufacturers and suppliers.

Now. What on Earth could this have to do with Jeremy Corbyn?

Francis Underwood looks at the camera, House of Cards.

Corbyn’s supporters also have their network of truths and half-truths, some of which I’ll delve into in a minute. But what seems abundantly clear is there is a very well-constructed defence edifice that has been built around this “social movement”. Jeremy fights against corporations, Conservatives, the right-wing of the Labour Party and those who benefit most from the status quo. Anyone who disagrees with him must therefore either have swallowed these vested interests’ lies or be actively working for them. Any evidence they provide is tainted by association, and must be rejected as suspicious. They only do it because they fear that the world will see the truth and sweep away the old establishment. By the same token, any evidence put forward by Corbyn’s supporters, regardless of context, is yet more proof of how important the cause must be.

Meme posted on Twitter by @LabourEoin. It emphasises Corbyn’s fight against vested interests, claiming that his critics attack him only because they fear he will end their dominance.

Caveat box! I am not saying all supporters of Corbyn behave in this manner. There are rational reasons to support him, and plenty of evidence that his enemies are dangerous to our interests.

I found a cache of letters written to the Minister of Health in 1942 at The National Archives. They show that many of the techniques used by anti-vaccination campaigners haven’t changed much over the decades. There’s a shift in nuance here, a change of target there. But broadly it’s the same. Vaccination is unnatural, unsafe and unnecessary. The only reason to support it is because too many people make money from it. And once people hear the truth, they will reject it all, sweeping away those who exploited them.

Let’s just take a few areas as examples. First, the use of statistics. In the letters, anti-vaccinationists rightly point out that ‘2,380 children who had received a course of immunisation developed diphtheria’.1 What they neglected to mention was that your chances of getting diphtheria were significantly reduced if you were vaccinated. Unvaccinated children were four times more likely to get the disease, and twenty times more likely to die.2

There’s no lie here. The anti-vaccination campaigners were entirely correct. But put in the proper epidemiological context, the claim was bizarre. At face value it was relatively convincing; with proper analysis it actually strengthened the argument of the medical authorities.
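
For the statistically minded, here’s a minimal sketch of why a raw case count misleads without denominators. The cohort sizes and the unvaccinated case count below are invented purely for illustration – the letters give us the 2,380 figure and the Ministry’s relative risks, not the full tables:

```python
# Hypothetical cohorts, invented for illustration. Only the 2,380 vaccinated
# cases and the "four times more likely" relative risk come from the sources
# cited in the footnotes; every other number here is made up.

def attack_rate(cases: int, population: int) -> float:
    """Cases per 1,000 children in a cohort."""
    return 1000 * cases / population

vaccinated_cases, vaccinated_pop = 2_380, 2_000_000
unvaccinated_cases, unvaccinated_pop = 4_760, 1_000_000

rate_v = attack_rate(vaccinated_cases, vaccinated_pop)      # 1.19 per 1,000
rate_u = attack_rate(unvaccinated_cases, unvaccinated_pop)  # 4.76 per 1,000

print(f"Vaccinated:   {rate_v:.2f} per 1,000")
print(f"Unvaccinated: {rate_u:.2f} per 1,000")
# Relative risk is 4.0x, even though the vaccinated group produces more
# raw cases simply because it is the larger group.
print(f"Relative risk: {rate_u / rate_v:.1f}x")
```

The 2,380 cases are real; what the campaigners dropped was the denominator.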

The same can be applied to election results. Every seat held by Labour is put down to Corbyn’s broad and unquestionable appeal. Yet the local election results of 2016 were seen as underwhelming for the party by all but the most Corbyn-optimistic pundits. Corbyn held most of the seats he was supposed to (which is better than most predicted), but he failed to make gains where his “new politics” was meant to make a difference. He ‘performed woefully in Scotland’, for example.

Again – his supporters aren’t lying when they say Corbyn did well. He didn’t lose many seats. But six years into Cameron’s premiership, this wasn’t the performance of a man with the momentum to win a Parliamentary majority in 2020. Far from it.

Meme posted at DailyAlternative.co.uk.

‘In spite of all the misrepresentation of facts and figures by medical and other interested apologists, the case against [vaccination] becomes more impressive and condemning’ wrote one correspondent whose signature I can’t decipher.3 This too reflects a key component of anti-vaccinationism’s in-built confidence. The people are “waking up” to the truth, despite the propaganda. And eventually the people will win.

Once more, this isn’t necessarily a lie. It’s not really statistically provable, but its repetition in itself strengthens the cause. Because doctors and the government were engaging in propaganda around vaccines. A war was on – but even for years after 1945, much of the poster and film work around promoting health policy was called “propaganda”.

The government made claims about vaccination being perfectly safe, painless and protecting your child from disease.4 Yet we all know that complications do happen (albeit very, very rarely), that it isn’t “painless”, and (as we’ve already seen) that vaccinated children can catch the disease. Why is the government exaggerating? What is it hiding? And if I’m noticing this, surely others must be too?

So, back to Corbyn. It’s true, the press has never given him a fair shake.5 Nor did many in his Parliamentary party. Why do they try so hard to discredit him? What are they worried about us knowing or finding out? Since all my friends are seeing this, it can’t be long before the whole nation embraces him.

But… there is always a but. Vaccination is safe. Complications are so rare as to be statistically negligible (though clearly devastating for any family affected by them). The needle does prick and cause discomfort, especially in small children, but it is nowhere near as painful as actually getting diphtheria. And we already know your chances of getting the disease are slim if you get the jab.

Similarly, maybe the press is out to get Corbyn. But he has done and said some pretty questionable things. Taking money from Iranian state TV while claiming to be pro-gay rights and civil liberties, for example, seems contradictory. It at least requires addressing if you’re going to attack people for their previous employment choices. He clearly does have a problem delegating and managing a large team, regardless of whose interests it serves for that story to come out. Just because someone is exaggerating and has an interest doesn’t mean – necessarily – that they’re wrong.

Image from David Icke, a well-known conspiracy theorist. Corbyn is shown here on RT, Russia’s state TV channel – also not the most “freedom friendly”.

I could go on.

The point here is not to say that Corbyn’s supporters are liars. I don’t think they make stuff up. Nor is it to say that there is no truth or power to their cause. Rather, it is that their selective use of evidence and self-proving mythology about why and how they are criticised work in much the same way as the anti-vaccination movement’s.

Both groups have evidence on their side – and again, for clarity, Corbyn has far more truth on his side than anti-vaccinationists. But this doesn’t really matter. Context is largely irrelevant, and criticism is blunted when the rules of rational debate are subverted. Vested corporate interests are destroying our communities. New Labour did stifle debate and the voice of the working classes. We should have leaders who are willing to challenge established institutions so that they work more fairly for the majority and not just for the privileged few.

And most importantly of all, the Parliamentary Labour Party is a fucking mess that couldn’t organise a piss up in a brewery.

But cherry-picking evidence within that framework to prove Corbyn is the man to change all that doesn’t serve those purposes. It leaves us without a rational defence against the Tories. It leaves us without an internal rational debate within the left as to how best to build coalitions and partnerships with those sympathetic – but not yet converted – to our cause. In short, it leaves us with irrational politics. Voids in the extremes to be filled by fascists and populist racists to the right and dogmatic anti-establishment cults of personality on the left.

And here I am – the academic in his ivory tower – stuck in the middle with you.

Title image from Wikipedia. Photograph by Garry Knight, 4 August 2014.

  1. The National Archives (hereafter TNA): MH 55/1754, Ada Henderson to Ministry of Health, 13 October 1942.
  2. Summary Report of the Ministry of Health for the year ended 31st March, 1943 (Cmd. 6468).
  3. TNA: MH 55/1754, ??? to Ministry of Health, 19 October 1942.
  4. TNA: BN 10/229, Ministry of Health & Ministry Information, “Immunisation against Diphtheria Campaign, 1942”.
  5. Bart Cammaerts, ‘Our report found that 75% of press coverage misrepresents Jeremy Corbyn – we can’t ignore media bias anymore’, Independent (19 July 2016) < http://www.independent.co.uk/voices/jeremy-corbyn-media-bias-labour-mainstream-press-lse-study-misrepresentation-we-cant-ignore-bias-a7144381.html > (accessed 22 July 2016).

2009 – H1N1

31/08/2015

Overall … I consider [the British government’s] response to have been proportionate and effective. There is much good practice on which to build. I heard nothing but praise and admiration from those interviewed for the health service and health protection staff right across the UK who led the response to the pandemic. Their dedication and professionalism … despite the additional pressures of the pandemic must be acknowledged and congratulated.

Deirdre Hine1

Swine ‘flu caused quite a panic in 2009. The associated virus – H1N1 – had been responsible for the great 1918 pandemic that killed an estimated 50 to 100 million people in the wake of the First World War.2 After almost a century without such a devastating pandemic, there were legitimate concerns from health authorities that the disease could cause similar levels of destruction.

Of course, we know that this didn’t happen. Around 18,000 people died directly of the disease, with an unknown number of others dying of complications related to it. Because Armageddon never came, the World Health Organisation and other authorities were accused of fear-mongering. Even its own advisers accused WHO of wasting ‘huge amounts of money by investing in pandemic scenarios whose evidence base is weak’.3

The H1N1 virus under an electron microscope. (Source)

And yet, as quoted at the beginning of this piece, it turned out that caution was probably advisable. Influenza is a killer; and certain strains of the virus are more virulent than others. We are, if history is any indicator, “overdue” a mass pandemic. Understandably, given the monitoring systems in place, WHO gets a bit jittery when such virulent strains appear to be making a comeback.

There is a common-sense belief in “the boy who cried wolf” – if authorities continue to predict doom and nothing dramatic ever happens, then the public will not listen to warnings when disaster actually arrives. Yet a 2007 study into tornado alarms seems to contradict this assertion. People do respond to alarms “just in case”; and when one factors in “near misses” and other data, the rates of “false alarms” are probably not as high as originally supposed.4

(We see similar attitudes towards opinion polling, especially given their supposed “inaccuracy” in the 2015 General Election and 2016 European Union Referendum. In both cases, the margins for error in the polling data were around 1.5 to 2 per cent5 – and when these are factored in, the polls were, in fact, broadly accurate.)
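
As an aside, the margin quoted by pollsters comes from a standard textbook formula for a simple random sample, and a rough sketch shows where figures like “1.5 to 2 per cent” come from. The sample sizes below are illustrative, not taken from any particular poll:

```python
from math import sqrt

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95% margin of error for a sample proportion under simple random sampling."""
    return z * sqrt(p * (1 - p) / n)

# p = 0.5 is the worst case; larger samples shrink the margin.
for n in (1_000, 2_500, 5_000):
    print(f"n={n}: +/- {100 * margin_of_error(0.5, n):.1f} points")
# n=1000: +/- 3.1 points
# n=2500: +/- 2.0 points
# n=5000: +/- 1.4 points
```

Real polls use weighting and quota sampling, so their true uncertainty is, if anything, larger than this textbook figure – a one- or two-point “miss” is well within the method’s expected noise.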

H1N1, then, represented a number of issues in a modern, globalised Britain. A distrust – or at least scepticism – about the ability of experts to communicate risk to the public. Underlying concerns about the vulnerability of an interconnected world to pandemic infectious disease. And governments walking the line between unduly panicking their citizens and protecting them from infection.

In June 2009, WHO declared H1N1 to be a Phase 6 pandemic – that is, it was present and infecting people in most regions of the world.6 However, it did declare the outbreak to be “moderate” – that is:

1. Most people recover from infection without the need for hospitalization or medical care.
2. Overall, national levels of severe illness … appear similar to levels seen during local seasonal influenza periods … .
3. … Hospitals and health care systems in most countries have been able to cope … .7

The UK put in place a number of pandemic measures during 2009 and 2010 in order to cope, building on preparations for a possible influenza pandemic that had been under way since 2002. The British health authorities had in mind a more virulent strain – possibly avian flu, which had caused significant damage in Asia during the mid-2000s.8 This preparation included the stockpiling of the anti-flu drug “Tamiflu”, which has since received immense criticism for being ineffectual against most of the more virulent forms of the virus (including H1N1).9

Dame Deirdre Hine’s 2010 report, however, shows that without the benefit of hindsight the Labour government probably did the right thing. As she noted, this was the first UK-wide crisis that required co-ordination across the four devolved governments. There were also two ways to attack a potential crisis – prepare for the worst (i.e. over-prepare “just in case”) or prepare for the most likely outcome. The government went for the ‘reasonable worst-case scenario’, which meant they were probably “over prepared”, but had enough slack in the system to cope if the situation had been worse. Where they had caused public confusion was by publishing their estimates and suggestions before the final plans were put in place.10

The question remains, however – how will people respond to the next pandemic? Ebola may have been dramatic, but it did not really affect most Westerners. Similarly, the Zika virus is an unknown quantity. The press coverage and government responses have mostly been cautious. When there is another flu pandemic, how bad will it be? Will people take the threat seriously? Will governments under-prepare, having been burnt in the past (and operating under externally and self-imposed resource constraints)? Only time will tell.

With special thanks to Sue Taylor at the Centre for History in Public Health for pointing me in the direction of useful material on the crisis.
This post was written on 17 August 2016.
Banner image from Wikicommons.
  1. Deirdre Hine, The 2009 Influenza Pandemic: An Independent Review of the UK Response to the 2009 Influenza Pandemic (London: Cabinet Office, 2010) < https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/61252/the2009influenzapandemic-review.pdf > (accessed 17 August 2015).
  2. See: ‘1918 flu pandemic’, Wikipedia < https://en.wikipedia.org/wiki/1918_flu_pandemic > (accessed 17 August 2016); ‘2009 flu pandemic’, Wikipedia < https://en.wikipedia.org/wiki/2009_flu_pandemic > (accessed 17 August 2016).
  3. Smitha Mundasad and Paul Rodgers, ‘Billions wasted on swine flu pandemic that never came’, Independent on Sunday, 16 May 2010, 36.
  4. Lindsey R. Barnes, et al., ‘False alarms and close calls: a conceptual model of warning accuracy’, Weather and Forecasting 22 (2007), 1140-1147.
  5. Anthony Wells, ‘Understanding margin of error’, YouGov (21 November 2011, 11.55) < https://yougov.co.uk/news/2011/11/21/understanding-margin-error/ > (accessed 17 August 2016).
  6. ‘2009 flu pandemic’.
  7. ‘What is phase 6?’, World Health Organisation (updated 11 June 2009) < http://www.who.int/csr/disease/swineflu/frequently_asked_questions/levels_pandemic_alert/en/ > (accessed 17 August 2016).
  8. Hine, The 2009 Influenza Pandemic; ‘Global spread of H5N1’, < https://en.wikipedia.org/wiki/Global_spread_of_H5N1 > (accessed 17 August 2016).
  9. James Gallagher, ‘Tamiflu: Millions wasted on flu drug, claims major report’, BBC News (10 April 2014) < http://www.bbc.co.uk/news/health-26954482 > (accessed 17 August 2016).
  10. Hine, The 2009 Influenza Pandemic, esp. 4-8.

2008 – The Great Recession

24/08/2015

29 September 2008 – New York

The economic climate of the early twenty-first century will be explained with reference to the stock market crash in 2008. Because exactly the same circumstances that allowed the economy to grow so much over the 2000s led to the recession that continues to affect the Western world.

The boom and bust cycle is nothing new. It is one of those inevitable laws of capitalism – whether you’re a Marxist or not. Speculation creates bubbles, and those bubbles can burst. It’s happened with tulips, South American trade monopolies, and good old-fashioned stocks and shares. This time, it was through ‘sub-prime mortgages’.

This is a technical term for lending money to people to purchase their own homes when you know:

a) their houses are worth fuck all; and
b) they don’t have the money to pay you back.

The idea was that these debts could be sold on, creating enough money that if someone defaulted on their payments, the owner of the debt could repossess the house and start the cycle again.

Who could have possibly predicted that if the economy slowed, people with no money and houses worth nothing would default?

As Andy Zaltzman put it (to paraphrase):

It’s like slamming your testicles in a car door. Until you actually do it, you don’t know that it will hurt.

Similar schemes for speculation, using weird financial “products”, had been used for years to prop up the financial sector. Which wouldn’t necessarily have been a problem – except many Western nations had become increasingly reliant upon finance to keep their economies expanding. Successive governments, especially in the UK and US, had relaxed financial rules, allowing even greater speculation.

Moreover, it had become deliberate policy to move what had been public services into the private sector as a means of reducing the national budget and making the economy more “competitive”. This meant that many countries required private investment in “public” projects – and that meant trying desperately to keep the financial markets happy by lowering taxes, reducing burdensome paperwork, and handing contracts to the lowest bidders.

Caroline Lucas, Jeremy Corbyn and Nigel Farage. Each, in different ways, enjoying electoral success as an “alternative” to the pre-2008 status quo. (Sources: left | centre | right)

This has had a somewhat interesting effect on politics in many of these nations. Some, like Greece, were almost completely wiped out, lacking enough of a financial base to cope with the storm. Others were able to bail out their banks and continue investment to drag themselves out of the hole, like Germany and the United States. Still others jailed those who had caused the crash, devalued their currency and made a strong recovery, like Iceland. Britain… kinda just kept doing what it had been doing.

After years without a true recovery – at least one that those in the middle of the country could notice in terms of rising wages and higher standards of living – the traditional party political allegiances have started to fall apart. This, if anything, shows us that Fukuyama was wrong. History is still alive and kicking.

Let’s look at Britain.

On the left, parties promoting greater public involvement in the economy and on matters of social justice have received a resurgence. Rejecting the “Blairite” model of gradual reform within a technocratic, liberal economy, the Green Party has made major strides, allying more-traditional leftism with environmental and identity politics concerns. Labour has just elected a left-leaning leader too, much to the chagrin of the old guard.

On the right, a general sense that liberalism has eroded British values and – perhaps more pertinently – the living standards of the middle classes has led to the rise of anti-establishment voices. While their leader may be a middle-class financier, UKIP’s core support has come from those who feel that the European Union and “Thatcherite” mainstream politics has left them vulnerable to outside threats.

For the next decade, the political landscape in the West will be very interesting. Clearly, the post-1980 consensus is breaking down. But quite how it will resolve itself is anyone’s guess. In any event, the 2008 crash is one of the pivotal moments in that history.

This post was originally published on 14 September 2015. The publication date has been changed to maintain the post order.

2007 – The iPhone

17/08/2015

29 June 2007

I find it quite hard to believe that the smart phone is less than a decade old. I remember not having a mobile phone, even when many of my friends had them. I didn’t really see the point. Now I’m like most people in their twenties: glued to my screen.

Of course, I probably don’t use my device anywhere near its fullest potential. I’m not like other people at the London School of Hygiene and Tropical Medicine who use them for detecting eye diseases. But I can tweet. And check football scores. Which is basically the same thing.

If it’s not an iPhone, it’s… oh. Wait. It is. (Source)

While camera phones and “feature” phones had been around for a good number of years before, it wasn’t until the iPhone that we got the first real “smart phone”. A handheld computer that could connect to the internet using the mobile phone network, and could install third-party applications to perform a multitude of tasks while on the move.

So while “Personal Digital Assistants” (PDAs), camera phones, mobile e-mail devices, messengers and satellite navigation systems were already commonplace, the iPhone was the first to successfully bring these products together into one device.1

Twat taking a selfie.

Historians may come to see the rise of the smart phone as a blessing and a curse.

On the one hand, we have countless images of daily life in the Western World; and certainly more photographic evidence of life elsewhere than at any other point in history. As people post their lives onto the internet, collating and annotating as they go, we have a rich source of information that will keep cultural historians occupied for decades.

On the other, there is simply too much information. Moreover, we are seeing a world refracted through the lens of the mobile phone camera. To be sure, we have always had to deal with the distorted nature of historical evidence. But it’s becoming unclear where the borders between the “reality” of life and the smart-phone-inflected social media world can be drawn. Perhaps – indeed – that is the point of our times.

Still, they were a key part of what became known as the Arab Spring (c. 2010-13). They have helped document brutality by repressive regimes across the world. And it is not trivial to compare the coverage of beatings in Iran with the shooting of civilians by American police forces in recent years. Amateur still and video footage have become key parts of the rolling news cycle, and not simply because it provides easy and cheap content for broadcasters and newspapers. Rather, they act as fragments of evidence upon which a more rounded journalistic story can be told.2 It may turn out that this is how the history of our time is reconstructed, too.

Smart phones may reflect our place in history, but they’re also helping to create it. When information can spread so quickly, news and current events can reach people like never before. This may in some way amplify scandals beyond where they would have reached in the past. “Wrongdoing”, if it captures the mood of the world at a particular moment, can explode into something inescapable. But then, so can acts of kindness and hope. The problem for historians is that these stories used to play out over days, months or even years. Now they can flare, disappear and resurface in a matter of hours.

In fairness, however, this may be a matter of perception. It could turn out that each of these microscandals plays into a much longer and more mature narrative than we can appreciate at this moment, because we are caught in exactly the same hyperbolic cycle. Will these records of the past – which the iPhone has made possible – give us new perspectives? And is what we are doing really new? Or is it simply the massive scale on which this pub gossip spreads around the planet that makes things seem so much more important than they “really” are?

The author currently owns a Samsung S5. If any manufacturer wants to offer him a new phone for free, he won’t say no.

Cover image by Arnob017. (Source)
  1. Certainly, it was the first to popularise it. I really don’t want to get into this argument with people. Feel free to have a slanging match in the comments, though. See ‘History of the iPhone’, Wikipedia < https://en.wikipedia.org/wiki/History_of_the_iPhone > (accessed 25 August 2015).
  2. Which is not to say that the media don’t lean heavily on this footage and use it as an opportunity to slack off on occasion…

2006 – Pluto

10/08/2015

24 August 2006 – Prague

Two and two is four. Apples fall when you drop them. The Battle of Hastings was in 1066. And there are nine planets.

Facts all. On 23 August 2006. Then Neil deGrasse Tyson and his cronies ruined it.

The reaction to Pluto being downgraded from planet to mere dwarf planet got a lot of people angry. Really angry. But why does it even matter? Whether Pluto is a planet or not doesn’t have a material impact upon our daily lives. It’s still there, orbiting the Sun like it always did. We just don’t think of it as one of the eight planets.

There’s something important about a “fact”. We’re living in a world where there are fewer and fewer of these “facts” that we can hang onto. For an Englishman a couple of hundred years ago it was a fact that there was a God. Now, even the “fact” that men and women should have their own bathrooms is becoming less solid.

Pluto as captured recently by New Horizons. Photo from NASA. (Source)

This isn’t going to descend into the overly simplistic speech from Men In Black. But there is something quite interesting about how facts are socially constructed.

We teach undergraduates pretty early on that there’s no such thing as a fact in history. Sure, we can roughly agree that something called the Second World War ran from 1939 to 1945. But it depends on your definitions. The Italian invasion of Abyssinia could mark the start. Or the Japanese invasion of Manchuria. Or, even, the declaration of war by the United States. And when did it end? When Germany surrendered? When Japan surrendered? Or did the Soviet occupation of Eastern Europe up until the early 1990s count as part of the war?

We’re worthless humanities scholars though. We revel in telling scientists that they’re making it all up. Sort of. The kid at the back of the class thinking they’re cool by scoffing at the teacher.

Yet, the sciences are supposed to deal with facts. Little nuggets of information that are universally true. We spend hours in chemistry classes heating copper sulphate to turn it white – then putting water on it to turn it blue again. This means water is blue, or something.1

We find pretty quickly as we go on, however, that even the scientific world is a little more complicated than it appears in high school classes. Aeroplanes fly by making the air go faster over the top of the wing, reducing the air pressure and sucking it up. (Except that’s not quite true.) The Earth’s gravity pulls things down at 10m/s2. (It doesn’t.)2

The public does hold on to certain social “facts” that keep us going. A kind of “common sense” (or senso comune as Gramsci called it, but he was Italian). Pluto being a planet was one of those things. And so there was a lot of public anger when this fact was taken away. There didn’t seem much appetite, for example, to upgrade Ceres from asteroid to planet; or to include Makemake and Eris as new planets. Pluto appeared to warrant a place among the planets simply because it was discovered before we realised how much messier the Solar System was than we had previously believed.

The enduring popularity of shows like QI demonstrates, however, that we sometimes like our facts to be challenged. Partly this gives us an illusion of some sort of inside knowledge that “most” people can’t access. Partly it allows us to explore beyond what we (think we) know. But the big ones don’t tend to go without a fight.

I’ll end with wild speculation. We find truth in the stars. We have done for as long as we have recorded history. The stars were our calendar. Our compass. We on Earth were the centre of that cosmos. Over the years, bloody battles have been fought over the predictions of various forms of astrology; over whether we truly are the centre of the solar system; and later what our place is within a vast, vast universe. Pluto, then, was more than a rock spinning around the Sun. It was the latest in a long line of truths in the heavens that was being taken away.

Either that, or some people REALLY need to get out more. Dude. It’s a rock. Suck it up.

  1. Have I succeeded in making a scientist have a heart attack yet? #TrollMode
  2. Both of these things were taught when I was at school.

2005 – Hurricane Katrina

03/08/2015

29 August 2005 – New Orleans

It’s been ten years since Katrina tore through the South East of the United States, causing thousands to lose their lives and billions of dollars worth of damage. The impact on people’s lives is being felt to this day.

Photo by Infrogmation. (Source)

Natural disasters such as this are, for obvious reasons, highly publicised in the media. There is something awesome (in the literal sense of the word) about an unstoppable force causing so much devastation and producing such dramatic images for television and print. That is not to say the media should be blamed for this.1 These events capture the public imagination, and the appetite to hear more about them is clearly there.

Now. This raises a number of questions for historians about how people react to disasters. Living in a country that is so rarely affected by earthquakes, tropical storms, volcanoes or tsunami, my focus is often on the observers. How do those in remote locations deal with the news of disasters?

This matters, because sometimes those people in the remote locations are the ones with the power to act. Indirectly through charitable donations, logistical support and international co-operation; and directly as the heads of governments with direct jurisdiction. What made Katrina so iconic in the popular consciousness was not just the devastation it wrought – it was that the richest country on the planet was completely unable to rebuild one of its most important cities, or provide it with the support that it clearly required.

So many disasters occur in parts of the world that already have myriad issues with their political, economic and transport infrastructure. When the Boxing Day Tsunami hit the South East Asian coast just a few months previously, there was a massive reaction from people across the world. British people alone donated over £390 million of private money through organisations such as the Disasters Emergency Committee (DEC), and the government pledged a further £75 million.2

The aftermath of the 2004 tsunami in Aceh, Indonesia. Photo by AusAID. (Source)

At the same time, we often do very little (as a percentage of the public finances) to build infrastructure so that these disasters have less of a long-term impact. The foreign aid budget remains a controversial topic, with a not-insignificant proportion of the population subscribing to the mantra “charity begins at home”. Even when we do give, it is often within a paternalistic relationship, based on a very Western idea of “humanitarianism” towards those “other” parts of the world.3

This is not a condemnation – dramatic events often provoke more of a reaction than the general, mundane grind of international poverty. But as a historian, I think these things matter. They uncover one of those paradoxes of charity, especially in England. We will (as a public) donate our time and/or money to a soup kitchen or food bank – but we won’t commit to the sorts of economic reforms that would provide the levels of social security, housing, employment and health care that would render those charitable acts moot. As one commentator put it in the 1960s, the welfare state is ‘the ambulance waiting at the bottom of the cliff’.4

The Voluntary Action History Society (VAHS) will tell you all about these sorts of nuances, which I don’t have time for here. Suffice it to say, Katrina broke a number of the stereotypes. Because this happened in a country that was rich enough and had the infrastructure to clean up New Orleans. And yet for so many political reasons it didn’t.

The criticisms of President Bush, the Federal Emergency Management Agency, the State of Louisiana and the City of New Orleans are widely known. Poor management and planning at all levels of the federal system in the United States led to what can only be accurately described as a clusterfuck.

What is intriguing for historians, however, is the way it exposed on a local level what we often see on the international stage. New Orleans was a poor(er) city, with economic and social problems that extended way beyond the damage inflicted by the hurricane. When Kanye West declared “Bush doesn’t care about black people”, it struck a nerve because it represented decades of neglect of poorer (often black) areas of the country. While the nation united in philanthropic donations and condemnation of the governments’ responses, many of the structural economic problems remain in the Southern United States.

And on that cheery note – don’t take this as an excuse not to donate to DEC appeals. The work they do is vital. But we need to be more critical of the systems which continue to allow natural disasters to do so much damage and last so long when we have the technological know-how to fix many of these problems.

  1. We’ll have plenty of opportunity to do that in future articles, I’m sure…
  2. Saleh Saeed, ‘DEC 50th Anniversary: 2004 Asian Tsunami’, DEC (16 December 2013) < http://www.dec.org.uk/articles/dec-50th-anniversary-2004-asian-tsunami > (accessed 25 August 2015); ‘Humanitarian response to the 2004 Indian Ocean earthquake’, Wikipedia < https://en.wikipedia.org/wiki/Humanitarian_response_to_the_2004_Indian_Ocean_earthquake > (accessed 25 August 2015).
  3. Kevin O’Sullivan, ‘Humanitarian encounters: Biafra, NGOs and imaginings of the Third World in Britain and Ireland, 1967–70’, Journal of Genocide Research 16(2/3) (2014), 299-315.
  4. Megan du Boisson, founder and head of the Disablement Income Group, speaking in an interview: The Times, 1 February 1969.

2004 – The Facebook

27/07/2015

4 February 2004 – Cambridge, MA

Social media is… no… social media are everywhere. But one true platform rules them all. At least in the West.

Facebook’s reach is rather remarkable compared to other platforms. At the end of 2014, it had 1.4 billion users. By comparison, Twitter – the darling of academics and journalists – had only half a billion.1 That allows a great number of people to communicate easily across the entire world. This can cover everything from organised resistance against oppressive governments to cat pictures. In my comfy little corner of the ivory tower, it’s usually the latter.

Gratuitous cat.

These new networks have certainly changed the way I communicate with colleagues and friends. Twitter has allowed me to maintain contact with other historians – contact that, once the hangover of the conference has worn off, would otherwise have been much more difficult to keep up. I know for a fact that I would have lost contact completely with many of my school friends. Luckily for us, Facebook launched in the United Kingdom very soon after we left for our respective universities.

We had tools to do this when we were teenagers. Internet forums were a way to meet new people, as were the various chat rooms available through mainstream sites and over the Internet Relay Chat protocol. Incidentally, if parents are worried today about what their wee bairns are up to on Snapchat, imagine what their kids would have been up to on a totally insecure and anonymous chat protocol that their parents weren’t internet-savvy enough to understand. Sorry, mum and dad. Don’t worry. My virginity wasn’t taken by a 43-year-old lorry driver.2

a/s/l?

But this isn’t about my dalliance with Frank in the glorious summer of ’01. This is about history. And social media provide some tricky problems for historians. They are usually hidden behind security measures. Facebook, for instance, has myriad privacy settings, and most people will only be able to read (or are likely to find) content posted and linked to by their friends.

At the RESAW conference at Aarhus this year, this was explored in detail. Historians of the internet are now starting to use the archived material of the web. But social media aren’t necessarily the web. Apps are very often used to access the data held on the services’ servers. While tweets, for example, may be public, you need to read them in a specific context: people use feeds of individuals’ “microblogs”. The Library of Congress can hold all this information, but how on earth are we going to make sense of it?

So much has been lost. Of course, history has also lost the verbal conversations of working-class people down the pub, and the discussions held late into the night in the eighteenth-century coffee house. What is more frustrating is that we KNOW people wrote and sent these messages to each other. All we can ever read of them are the occasional snippets that happen to survive in other forms of blog, journal or personal data files.

Bulletin Board Systems in the 1980s have been mostly lost – though we do have histories that can be told.3 Geocities has been shut down – though we do have an archive we can begin to analyse.4 But the meat of the content is gone, and won’t be coming back. How much of the stuff we have now will go the same way?

We are trying to record this stuff. But as a historian of post-war Britain, I am more interested in a larger question – how have social media changed (or how will they change) the way Britons behave? What has changed in our personal relationships; the way we meet; the way we part; the ways we vote, organise, and understand the universe? Having lived through it, I can’t tell if I’ve changed the way I behave because I’m getting older, because of the technology and social fabric of Britain, or – more likely – because of the relationship between the two.

This may be a question we can only answer with some historical distance. But it’s worth asking now. Perhaps my 30-for-60 in 2045 will be able to give a more useful conclusion…

The eagle-eyed amongst you will note this piece was written and published on 9 August 2015. The publication date on this WordPress entry has been changed so that the weekly update pattern is maintained in the database, and the post appears in the right order.
  1. ‘Facebook’, Wikipedia < https://en.wikipedia.org/wiki/Facebook > (accessed 9 August 2015); ‘Twitter’, Ibid. < https://en.wikipedia.org/wiki/Twitter > (accessed 9 August 2015).
  2. Despite his best efforts.
  3. See the work of Kevin Driscoll at his personal site
  4. Ditto Ian Milligan.

2003 – The Iraq War Protests

20/07/2015

15 February 2003 – Various

Despite the numbers, the war went ahead anyway. The images over the following years became almost as iconic as those of the millions marching through London and other cities. Bush in front of the “Mission Accomplished” sign; the toppling of the Saddam statue; the subsequent trial and execution. The question is, then – what was the fucking point?

The protest failed to achieve its main goal, but it is beginning to be historicised into a wider narrative of mass protest and voluntary action. It was in many ways one of the first “internet” demonstrations, with millions of protesters brought together through digital technologies such as e-mail and websites. (This is before Facebook and Twitter. But more on these in upcoming weeks). Movements such as Occupy may have had similar headline “failures”, but they have acted as a focal point for protest against the dominant neo-liberal political framework in the Western world.

Indeed, the breakdown of the party systems in Britain and America especially has made this sort of extra-Parliamentary form of protest increasingly potent and necessary. For while the Labour Party and Conservative Party differ on a number of ideological points, many of the key decisions about how to run foreign and domestic affairs have converged. Neither supports nationalisation of key public services; both believe in a strong military, including a nuclear arsenal; both play the realpolitik game of getting close to dictatorships in various parts of the world in return for good arms contracts and a steady supply of oil. Crucially, both supported the Iraq War, even if there were dissenting members from the parties at the time and subsequently.

This has been going on for a while, however. Voluntary organisations and charities have always been politically driven – you cannot set out to solve a social problem without being political. While many of the larger institutions have, in the past, steered well clear of party politics, there has often been a direct or indirect moral cajoling of those in local and national government to enact policies that will help charities get on with their vital work.

In the 1960s, however, we began to see more assertive groups coming forward. Charities that did not necessarily provide services themselves, but deliberately spent their money on researching the social problems of the day and lobbying the government to fix them. The Child Poverty Action Group, the Disablement Income Group, Shelter and many others appeared during this time. They were willing and able to use the growing mass media to present their cases in increasingly sophisticated ways. And, to varying degrees, they have had success with governments of both parties right across the late twentieth century and into the twenty-first.

The growing professionalism of those groups in this new political climate, however, meant that they became specialised. Social campaigners may have had many concerns, but the charities themselves were often very narrowly-focused. The big questions – traditionally the preserve of the political parties – were beginning to be diffused through a range of institutions and organisations, few of whom would ever hold direct power in Westminster or City Hall.

The Iraq protest, then, represented more than just the war. For many, it was the first time in a generation that people had been able to broadly agree on a particular action and – crucially – had the tools to mobilise quickly and effectively across the world. Occupy and the struggles of the “99%” have been similarly branded. They represent growing disquiet, predominantly on the political left, with the party system and the post-war levers and apparatus that are supposed to impose democratic will on the law of the land. That they have been unsuccessful may say more about the increasing distance between the machinery of government and the people than it does about the protesters themselves.


2002 – The Salt Lake City Winter Olympics

13/07/2015

My mother once told me of a place,
With waterfalls and unicorns flying.
Where there was no suffering, no pain.
Where there was laughter instead of dying.
I always thought she’d made it up,
To comfort me in times of pain.
But now I know that place is real,
Now I know its name.

~ The Book of Mormon

Why wouldn’t you want to hold a Winter Olympics in Salt Lake City, Utah? Where the warlords are friendly and the goat meat is plentiful? Well, we know why you would hold an Olympics there – flagrant corruption.

The Salt Lake City Olympics bidding process lifted the lid on the systemic nepotism and bribery within the International Olympic Committee (IOC) and the systems for awarding Games to host cities. It resulted in a number of reforms to clean up the system and the IOC’s reputation. Thankfully, nothing like this will ever happen again…

In a completely and utterly unrelated story:

It can be difficult sometimes to justify to people who don’t like sport just why I spend so much of my time watching it. Even if you can explain the attraction of watching people compete and the narratives that flow from that, how exactly do you explain away the rampant commercialism, corruption, sexism, racism, homophobia and various other unsavoury aspects of so many sports and their governing organisations?

Corruption in sport is – shockingly – not new. The “old boys’ networks” from the old British Public School system helped establish the rules for a number of sports in the late nineteenth century, from various versions of football to tennis and beyond. This was part of a Victorian desire to rationalise and standardise sport across the globe, so that everyone was playing by the same rule books. Sport was part of a masculine1 and Christian ideal, supposedly requiring and encouraging self discipline and athletic prowess. By the end of that century and the beginning of the twentieth, international sporting organisations popped up to express these ideals through nationalistic competition.

That nationalism was a key tool for regimes across the twentieth century, some authoritarian, some democratic. Italy “borrowed” a number of soccer players from Argentina to win the 1934 and 1938 FIFA World Cups (and may have slipped a coin or two into the pockets of the referees for good measure). The Nazis made a big play of hosting the 1936 Olympics. After 1945, the USSR and USA used a number of sports, mostly Olympic, to play out the Cold War in the sports stadium. Soccer teams have been integral to the post-1991 independent states in Eastern Europe and their national identities.

This explains why, say, Salt Lake City was keen to have a Winter Olympics. The eyes of the world would be on them, giving the place legitimacy. It’s why Qatar wanted a FIFA World Cup. It’s why the English were so angry when they didn’t get it.

There is another side to the corruption coin, however, which makes uncomfortable reading for countries who were part of that initial elite 100 years or so ago. Because the rest of the world has had to fight hard to be taken seriously alongside the traditional nations of Western Europe and North America. It took until 2002 before a FIFA World Cup went to Asia; 2010 before one went to Africa. We’re yet to have an African Olympics, Summer or Winter. And we’re a year away from the first ever South American one.

In the case of FIFA, the English/European oligarchy was swept aside in the 1970s under the leadership of João Havelange from Brazil. He appealed to the traditionally neglected nations, and built a large power base outside Europe. In some ways, this finally wrested control from the closed shop in Europe. But it was built on giving bonuses to officials with… questionable ethics. No doubt, football investment has improved dramatically in Africa and Asia. But how much more would it have improved if officials weren’t trousering so much of the money?

Now, of course, Sal Tlay Ka Siti was in the United States – not exactly a sporting backwater. But it had been repeatedly overlooked. They believed the reason for this was that they weren’t wining and dining officials as well as their rivals. They may have been right. Though their solution wasn’t. They decided to bribe their way to the Olympics.

Perhaps it was worth it. They got the Games, after all. But it nearly brought down the IOC and resulted in widespread reform.

There’s a question that international sport really needs to tackle, then. It doesn’t want corruption. At the same time, the West isn’t willing to give up its power. The arguments that other nations are not mature enough to be involved, economically or in terms of their business practices, can only go so far. How can they get better if they are never invited to the table?

Similarly, we cannot condone corruption simply because it allows non-traditional nations a shot at the big time. Qatar shouldn’t have a World Cup for the same reason it shouldn’t have a Winter Olympics. The climate isn’t right, its human rights record should see it kicked out of the international community, and the tournament will help none of its citizens; it’s a vanity project for an absolute monarchy trying to buy credibility and prestige with the West.

People in the non-traditional countries deserve more from FIFA, the IOC and their ilk. More than that, though, they deserve more from the people who supposedly represent them.

Title image from Wikicommons.
  1. Although women were similarly encouraged into sport for the good of the race, especially with eugenic theories running wild through the period.

2001 – September 11

06/07/2015

xkcd (Source)

The terrorist attacks on “9/11” were horrific. The sheer scale of the damage, the cultural significance of the targets, and the fact that this exposed the vulnerability of “the most powerful nation on Earth” made most of the Western world uneasy. Whatever the geopolitical position in the Middle East, whatever the West’s role in the historical and continued instability in the region, the hijackings were barbaric acts.

I have already discussed terrorism and racialised attitudes towards it in this series. And while I could probably go on at length on the subject, there is another aspect of the attacks that piques my interest. The endurance of the conspiracy theory.

Of course, 9/11 wasn’t the first event to generate an industry of writers and enthusiasts spreading their own particular hypotheses to explain major events. JFK and the Apollo XI Moon landing come to mind immediately. Then there are various “interesting” positions on vaccination or aircraft vapour trails. And we still find people who believe the Jews run the world, or the Catholics run the world, or the Illuminati run the world, or the Freemasons run the world, or (that’s enough, Ed.)

My own work recently has had to deal with the vaccination issue. And this has been fascinating, partially because it involves so many different positions on what is largely the same base of evidence. It includes everyone from the hardcore “anti-vaxxers” to the hardcore “pro-vaxxers” – and somewhere in between individuals and parents who actively or passively do not get their children vaccinated without really expressing an opinion that survives in the historical record.

So this isn’t about 9/11. It’s a rant.

One of the reasons that the vaccination conspiracies have attracted so much opinion recently is that they have very tangible results. We can see the vaccination rate go up or down; we can see the disease notification rates fluctuate. And it is one of those group behaviours which, we believe, might affect us. Whether another person (or group of people) chooses to vaccinate can lead to a greater risk of disease for others. Or so the “pro-vaxxers” would have you believe. (At this point the author dons his tin-foil hat.)

Ben Goldacre, a science writer, has written about “vaccine scares” in his popular books on Bad Science.1 He notes that these theories – about, for example, hepatitis vaccines causing multiple sclerosis, worries over mercury, or the relationship between MMR and autism – have tended to respect national boundaries.2 And while for the most part he is correct, these ideas did spread (albeit more slowly) in the pre-internet age. The scare over pertussis (whooping cough) vaccination, for example, had pretty much burnt out in England before it flared in the United States; although there was a contemporary (yet completely separate) issue with the vaccine in Japan.3 It took a number of years for Australia and the US to catch onto Wakefield and MMR (despite his work having been discredited), and the UK has never really got interested in thimerosal (mercury compounds). In the internet age, however, ideas spread more quickly, as people are better able to find individuals with similar views, and in turn to share information which confirms these beliefs.

Let’s be clear, however – pro-vaccine people do similar things. The vast majority of those commenting on vaccination (like the vast majority of the world’s population) are not medically trained in the areas of epidemiology and immunisation. This doesn’t make their opinions invalid, but it does make claims about scientific “truth” very difficult to trust. Vaccines are complicated. The science underpinning them is complicated. I would have to contextualise the opinion of, say, a brain surgeon opining on MMR. Much like – even though I have a PhD in history – my views and pronouncements on pre-industrial Oviedo should probably be taken with a pinch of salt. The difference is that overwhelming medical opinion supports vaccination as safe and effective. The real question is – how “safe” is “safe”; and how “effective” is “effective”?

No vaccine is 100% effective. It significantly reduces your chances of getting a disease, which in turn significantly reduces your chances of passing it on – especially if everyone around you is also vaccinated. After a few generations, theoretically, a population can rid itself of the disease, as the pathogen finds fewer hosts and fewer people to infect. This concept of “herd immunity” is a well-established one, even if it is only in recent (historically speaking) times that we have been able to develop statistical and epidemiological models to predict the impact of vaccines on a population.
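
Since the paragraph above leans on the herd immunity concept, here is a minimal sketch of the standard threshold formula. The R0 values are textbook-style illustrations, not figures from the sources cited in this post:

```python
# Herd immunity threshold: cases decline once each infection produces,
# on average, fewer than one new infection, i.e. once 1 - 1/R0 of the
# population is immune.

def herd_immunity_threshold(r0: float) -> float:
    """Fraction of the population that must be immune for cases to decline."""
    return 1 - 1 / r0

def effective_r(r0: float, coverage: float, efficacy: float) -> float:
    """Reproduction number once (coverage * efficacy) of people are immune."""
    return r0 * (1 - coverage * efficacy)

# Illustrative basic reproduction numbers only.
for disease, r0 in [("diphtheria", 6.0), ("measles", 15.0)]:
    print(f"{disease}: ~{100 * herd_immunity_threshold(r0):.0f}% must be immune")

# 95% uptake of a 90%-effective vaccine pushes the illustrative
# diphtheria figure's R below 1:
print(f"effective R: {effective_r(6.0, 0.95, 0.90):.2f}")  # -> 0.87
```

This is also why “no vaccine is 100% effective” doesn’t sink the case: coverage and efficacy multiply, and the disease only needs R to dip below 1 to start dying out.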

And no vaccine is 100% safe. Any procedure – indeed, anything – carries risk, from opening a can to driving a car. Of the billions of vaccines administered, a tiny fraction of recipients have been injured. Health authorities know many of the contraindications which signal heightened risk, and attempt to screen for them to avoid complications. But mistakes happen. That is no comfort to the families affected, but it has meant that over the course of the twentieth century the death and injury toll of TB, polio, smallpox, diphtheria, tetanus, measles, whooping cough, mumps, rubella and others has been significantly reduced.

This gives the conspiracy theorists their “in”, because there are thousands of cases of vaccine damage to point to. Each individual one is a devastating tragedy for that family. There are millions who have been vaccinated yet caught the disease anyway. Each one makes an individual wonder whether the pain was worth it. And, of course, medical and public health science had been getting more effective at preventing and curing illness since the late nineteenth century. Who is to say that vaccines caused the final drop in cases? Couldn’t it just be coincidence? Aren’t we just exposing children to unnecessary risk?

The answer is, of course, peer-reviewed data and analysis. It’s the mass trials conducted by many countries over the course of the past hundred years. It’s the data we have of disease rates in areas where vaccination rates drop. It’s the control experiments and analyses that seek to eliminate other causal factors. Nobody serious would claim vaccination was the only reason for the elimination of smallpox. Nobody serious would claim that it wasn’t a significant factor.

There are two aces up the sleeve of the conspiracy theorists, however, which keep the debate alive. The first is to lie, or to seriously misrepresent data – to make claims like “nobody has ever tested vaccine safety or efficacy”. They have – just look at the Medical Research Council’s trials of pertussis,4 polio and TB5 vaccines as a starting point. While none is without its problems, it is a flat-out lie to suggest they never happened.

The second is to deny the relevance of these findings on the basis that “Big Pharma”™ makes money off the back of vaccines, or that the government wants to exert control. This suggests that if people have ulterior motives, what they say cannot be true, regardless of their evidence. By that standard, it would also discredit those selling alternative therapies to protect people from disease, since they have a vested interest in making you doubt biomedicine. But that’s a debate for another time.

This fails by its own logic. For a start, vaccines are not a big money-maker for pharmaceutical companies compared with their overall turnover. While they are becoming a bigger part of pharmaceutical companies’ strategies – due to emerging markets in the developing world and increased difficulties in bringing new drugs to market – in 2013 the vaccine industry was worth an estimated $24 billion.6 The pharmaceutical industry as a whole, meanwhile, was valued at over $980 billion7 – making vaccines roughly 2.5 per cent of the total. Besides, why would a drug company want to protect you from a disease whose complications it could otherwise sell you drugs to treat?

These arguments build on those made by historians of medicine over the past few decades about the need to question scientific truth claims made by authorities. There are countless examples of the power of the medical profession, and of the increasingly medicalised state, interfering in the lives of citizens. But there is a fundamental flaw in taking this work – meant as a critique of the social, political and economic structures of power – and applying it simply to back up another faulty truth claim about the material reality of the universe. Just because science, scientists and the scientific method are bound up in power structures and the interests of the capitalist state doesn’t make all (or even most) of their conclusions invalid. As humanities scholars we can debate their relevance, but we don’t have the tools to deny their scientific merits. Especially if you are going to appeal to rational science as your basis for anti-vaccinationism.

Credit: Wellcome Library, London. Wellcome Images. (Source)

But if you’re going to lean on my discipline, let’s try this. Let’s assume vaccination is part of a Foucauldian panopticon. It monitors citizens, bringing all children under the surveillance of the state, where their behaviour is controlled through the universal administration of drugs designed to increase the productivity of the whole. Its purpose is at once to exert state power and to discipline the individual into believing that she has an obligation to herself and others to maintain her health for the good of the nation state. Let’s just say we buy that (and I probably do, on an academic level).

Why, then, would the state continue to support a system that (supposedly) injures its citizens, rendering them – and their parents’ nuclear-family economic units – less productive? The state has a vested interest in supporting a system it helped create, if only to save face. But it has a bigger vested interest in finding the most efficient and safest vaccines, so that its citizens grow up to be net producers rather than drains on the system.

There are legitimate questions to be raised here on the moral and political level. Is one death from a vaccine one death too many? Is it right that the state should compel people, through law or societal pressure, to undergo a medical procedure? Fine. We can sit and have that debate. But you don’t get to make up scientific data or ignore the mountain of evidence which contextualises or repudiates your claims.

  1. In the interests of transparency, Goldacre is, like me, a research fellow at the London School of Hygiene and Tropical Medicine. Unlike me, he has medical qualifications and is far more successful. I have never met the guy. I’m sure he’s lovely/a corporate shill (delete as applicable according to personal taste). http://evaluation.lshtm.ac.uk/people/members/ (accessed 5 July 2015).
  2. Ben Goldacre, ‘How vaccine scares respect local cultural boundaries’, Bad Science (24 April 2013) http://www.badscience.net/2013/04/how-vaccine-scares-respect-local-cultural-boundaries/ (accessed 5 July 2015).
  3. Jeffrey P. Baker, ‘The pertussis vaccine controversy in Great Britain, 1974–1986’, Vaccine 21(25-26) (2003), 4003-10.
  4. ‘Vaccination against whooping-cough’, BMJ 4990 (25 August 1956), 454-62.
  5. ‘B.C.G. and vole bacillus vaccines in the prevention of tuberculosis in adolescents’, BMJ 4964 (25 February 1956), 413-27.
  6. Miloud Kaddar, ‘Global vaccine market features and trends’ World Health Organization http://who.int/influenza_vaccines_plan/resources/session_10_kaddar.pdf (accessed 5 July 2015).
  7. Statista, ‘Revenue of the worldwide pharmaceutical market from 2001 to 2013 (in billion U.S. dollars)’ http://www.statista.com/statistics/263102/pharmaceutical-market-worldwide-revenue-since-2001/ (accessed 5 July 2015).