historical stuff by Gareth Millward

2007 – The iPhone

17/08/2015

29 June 2007

I find it quite hard to believe that the smart phone is less than a decade old. I remember not having a mobile phone, even when many of my friends had them. I didn’t really see the point. Now, I’m like most people in their twenties and glued to my screen.

Of course, I probably don’t use my device anywhere near its fullest potential. I’m not like other people at the London School of Hygiene and Tropical Medicine who use them for detecting eye diseases. But I can tweet. And check football scores. Which is basically the same thing.

If it’s not an iPhone, it’s… oh. Wait. It is. (Source)

While camera phones and “feature” phones had been around for a good number of years before, it wasn’t until the iPhone that we got the first real “smart phone”. A handheld computer that could connect to the internet using the mobile phone network, and could install third-party applications to perform a multitude of tasks while on the move.

So while “Personal Digital Assistants” (PDAs), camera phones, mobile e-mail devices, messengers and satellite navigation systems were already commonplace, the iPhone was the first to successfully bring these products together into one device.1

Twat taking a selfie.

Historians may come to see the rise of the smart phone as a blessing and a curse.

On the one hand, we have countless images of daily life in the Western World; and certainly more photographic evidence of life elsewhere than at any other point in history. As people post their lives onto the internet, collating and annotating as they go, we have a rich source of information that will keep cultural historians occupied for decades.

On the other, there is simply too much information. Moreover, we are seeing a world refracted through the lens of the mobile phone camera. To be sure, we have always had to deal with the distorted nature of historical evidence. But it’s becoming unclear where the borders between the “reality” of life and the smart-phone-inflected social media world can be drawn. Perhaps – indeed – that is the point of our times.

Still, smart phones were a key part of what became known as the Arab Spring (c. 2010-13). They have helped document brutality by repressive regimes across the world. And it is not a trivial exercise to compare the coverage of beatings in Iran with the shooting of civilians by American police forces in recent years. Amateur stills and video footage have become a key part of the rolling news cycle, and not simply because they provide easy and cheap content for broadcasters and newspapers. Rather, they act as fragments of evidence from which a more rounded journalistic story can be built.2 It may turn out that this is how the history of our time is reconstructed, too.

Smart phones may reflect our place in history, but they’re also helping to create it. When information can be spread so quickly, it gives news and current events the chance to reach people like never before. This may in some way amplify scandals beyond where they might have reached in the past. “Wrongdoing”, if it captures the mood of the world at a particular moment, can explode into something inescapable. But then, so can acts of kindness and hope. The problem for historians is that these stories used to play out over days, months or even years. Now they can flare, disappear and resurface in a matter of hours.

In fairness, however, this may be perception. It could turn out that each of these microscandals plays into a much longer and more mature narrative than we can appreciate at this moment. Because we are caught in exactly the same hyperbolic cycle. Will these records of the past – which the iPhone has made possible – give us new perspectives? And is what we are doing really new? Or is it simply the massive scale upon which this pub gossip spreads around the planet that makes things seem so much more important than they “really” are?

The author currently owns a Samsung S5. If any manufacturer wants to offer him a new phone for free, he won’t say no.

Cover image by Arnob017. (Source)
  1. Certainly, it was the first to popularise it. I really don’t want to get into this argument with people. Feel free to have a slanging match in the comments, though. See ‘History of the iPhone’, Wikipedia < https://en.wikipedia.org/wiki/History_of_the_iPhone > (accessed 25 August 2015).
  2. Which is not to say that the media don’t lean heavily on this footage and use it as an opportunity to slack off on occasion…

2006 – Pluto

10/08/2015

24 August 2006 – Prague

Two and two is four. Apples fall when you drop them. The Battle of Hastings was in 1066. And there are nine planets.

Facts all. On 23 August 2006. Then Neil deGrasse Tyson and his cronies ruined it.

The reaction to Pluto being downgraded from planet to mere dwarf planet got a lot of people angry. Really angry. But, why does it even matter? Whether Pluto is a planet or not doesn’t have a material impact upon our daily lives. It’s still there, orbiting the Sun like it always did. We just don’t think of it as one of the eight planets.

There’s something important about a “fact”. We’re living in a world where there are fewer and fewer of these “facts” that we can hang onto. For an Englishman a couple of hundred years ago it was a fact that there was a God. Now, even the “fact” that men and women should have their own bathrooms is becoming less solid.

Pluto as captured recently by New Horizons. Photo from NASA.
(Source)

This isn’t going to descend into the overly simplistic speech from Men In Black. But there is something quite interesting about how facts are socially constructed.

We teach undergraduates pretty early on that there’s no such thing as a fact in history. Sure, we can roughly agree that something called the Second World War ran from 1939 to 1945. But it depends on your definitions. The Italian invasion of Abyssinia could mark the start. Or the Japanese invasion of Manchuria. Or, even, the declaration of war by the United States. And when did it end? When Germany surrendered? When Japan surrendered? Or did the Soviet occupation of Eastern Europe up until the early 1990s count as part of the war?

We’re worthless humanities scholars though. We revel in telling scientists that they’re making it all up. Sort of. We’re the kid at the back of the class who thinks they’re cool for scoffing at the teacher.

Yet, the sciences are supposed to deal with facts. Little nuggets of information that are universally true. We spend hours in chemistry classes heating copper sulphate to turn it white – then putting water on it to turn it blue again. This means water is blue, or something.1

We find pretty quickly as we go on, however, that even the scientific world is a little more complicated than it appears in high school classes. Aeroplanes fly by making the air go faster over the top of the wing, reducing the air pressure and sucking the wing up. (Except that’s not quite true.) The Earth’s gravity pulls things down at 10 m/s². (It doesn’t.)2

The public does hold on to certain social “facts” that keep us going. A kind of “common sense” (or senso comune as Gramsci called it, but he was Italian). Pluto being a planet was one of those things. And so, there was a lot of public anger when this fact was taken away. There didn’t seem much appetite, for example, to upgrade Ceres from asteroid to planet; or to include Makemake and Eris as new planets. Pluto appeared to warrant a place among the planets simply because it was discovered before we realised how much messier the Solar System is than we once believed.

The enduring popularity of shows like QI demonstrates, however, that we sometimes like our facts to be challenged. Partly this gives us an illusion of some sort of inside knowledge that “most” people can’t access. Partly it allows us to explore beyond what we (think we) know. But the big ones don’t tend to go without a fight.

I’ll end with wild speculation. We find truth in the stars. We have done for as long as we have recorded history. The stars were our calendar. Our compass. We on Earth were the centre of that cosmos. Over the years, bloody battles have been fought over the predictions of various forms of astrology; over whether we truly are the centre of the solar system; and later what our place is within a vast, vast universe. Pluto, then, was more than a rock spinning around the Sun. It was the latest in a long line of truths in the heavens that was being taken away.

Either that, or some people REALLY need to get out more. Dude. It’s a rock. Suck it up.

  1. Have I succeeded in making a scientist have a heart attack yet? #TrollMode
  2. Both of these things were taught when I was at school.

2005 – Hurricane Katrina

03/08/2015

29 August 2005 – New Orleans

It’s been ten years since Katrina tore through the South East of the United States, claiming well over a thousand lives and causing billions of dollars’ worth of damage. The impact on people’s lives is being felt to this day.

Photo by Infrogmation. (Source)

Natural disasters such as this are, for obvious reasons, highly publicised in the media. There is something awesome (in the literal sense of the word) about an unstoppable force causing so much devastation and producing such dramatic images for television and print. That is not to say the media should be blamed for this.1 These events capture the public imagination, and the appetite to hear more about them is clearly there.

Now. This raises a number of questions for historians about how people react to disasters. Living in a country that is so rarely affected by earthquakes, tropical storms, volcanoes or tsunami, my focus is often on the observers. How do those in remote locations deal with the news of disasters?

This matters, because sometimes those people in the remote locations are the ones with the power to act. Indirectly, through charitable donations, logistical support and international co-operation; and directly, as the heads of the governments with jurisdiction. What made Katrina so iconic in the popular consciousness was not just the devastation it wrought – it was that the richest country on the planet was completely unable to rebuild one of its most important cities, or to provide it with the support that it clearly required.

So many disasters occur in parts of the world that already have myriad issues with their political, economic and transport infrastructure. When the Boxing Day Tsunami hit the South East Asian coast just a few months previously, there was a massive reaction from people across the world. British people alone donated over £390 million of private money through organisations such as the Disasters Emergency Committee (DEC), and the government pledged a further £75 million.2

The aftermath of the 2004 tsunami on Aceh, Indonesia. Photo by AusAID. (Source)

At the same time, we often do very little (in terms of a percentage of the public finances) to build infrastructure so that these disasters have less of a long-term impact. The foreign aid budget remains a controversial topic, with a not-insignificant proportion of the population subscribing to the mantra “charity begins at home”. Even when we do give, it is often within a paternalistic relationship, based on a very Western idea of “humanitarianism” towards those “other” parts of the world.3

This is not a condemnation – dramatic events often provoke more of a reaction than the general, mundane grind of international poverty. But as a historian, these things matter. They uncover one of those paradoxes of charity, especially in England. We will (as a public) donate our time and/or money to a soup kitchen or food bank – but we won’t commit to the sorts of economic reforms that would provide the levels of social security, housing, employment and health care that would render those charitable acts moot. As one commentator put it in the 1960s, the welfare state is ‘the ambulance waiting at the bottom of the cliff’.4

The VAHS (Voluntary Action History Society) will tell you all about these sorts of nuances, which I don’t have time for here. Suffice it to say, Katrina broke a number of the stereotypes. Because this happened in a country that was rich enough, and had the infrastructure, to clean up New Orleans. And yet, for so many political reasons, it didn’t.

The criticisms of President Bush, the Federal Emergency Management Agency, the State of Louisiana and the City of New Orleans are widely known. Poor management and planning at all levels of the federal system in the United States led to what can only be accurately described as a clusterfuck.

What is intriguing for historians, however, is the way it exposed on a local level what we often see on the international stage. New Orleans was a poor(er) city, with economic and social problems that extended way beyond the damage inflicted by the hurricane. When Kanye West declared “Bush doesn’t care about black people”, it struck a nerve because it represented decades of neglect of poorer (often black) areas of the country. While the nation united in philanthropic donations and condemnation of the governments’ responses, many of the structural economic problems remain in the Southern United States.

And on that cheery note – don’t take this as an excuse not to donate to DEC appeals. The work they do is vital. But we need to be more critical of the systems which continue to allow natural disasters to do so much damage, and their effects to last so long, when we have the technological know-how to fix many of these problems.

  1. We’ll have plenty of opportunity to do that in future articles, I’m sure…
  2. Saleh Saeed, ‘DEC 50th Anniversary: 2004 Asian Tsunami’, DEC (16 December 2013) < http://www.dec.org.uk/articles/dec-50th-anniversary-2004-asian-tsunami > (accessed 25 August 2015); ‘Humanitarian response to the 2004 Indian Ocean earthquake’, Wikipedia < https://en.wikipedia.org/wiki/Humanitarian_response_to_the_2004_Indian_Ocean_earthquake > (accessed 25 August 2015).
  3. Kevin O’Sullivan, ‘Humanitarian encounters: Biafra, NGOs and imaginings of the Third World in Britain and Ireland, 1967–70’, Journal of Genocide Research 16(2/3) (2014), 299-315.
  4. Megan du Boisson, founder and head of the Disablement Income Group, speaking in an interview: The Times, 1 February 1969.

2004 – The Facebook

27/07/2015

4 February 2004 – Cambridge, MA

Social media is… no… social media are everywhere. But one true platform rules them all. At least in the West.

Facebook’s reach is rather remarkable compared to other platforms. At the end of 2014, it had 1.4 billion users. By comparison, Twitter – the darling of academics and journalists – had only half a billion.1 That allows a great number of people to communicate easily across the entire world. This can cover everything from organised resistance against oppressive governments to cat pictures. In my comfy little corner of the ivory tower, it’s usually the latter.

Gratuitous cat.

These new networks have certainly changed the way I communicate with colleagues and friends. Twitter has allowed me to maintain contact with other historians – contact that, once the hangover of the conference has worn off, would otherwise have been much more difficult to maintain. I know for a fact that I would have completely lost contact with many of my school friends. Luckily for us, Facebook launched in the United Kingdom very soon after we left for our respective universities.

We had tools to do this when we were teenagers. Internet forums were a way to meet new people, as were the various chat rooms available through mainstream sites and over the Internet Relay Chat protocol. Incidentally, if parents are worried today about what their wee bairns are up to on Snapchat, imagine what their kids could have got up to on a totally insecure and anonymous chat protocol that parents weren’t internet-savvy enough to understand. Sorry, mum and dad. Don’t worry. My virginity wasn’t taken by a 43-year-old lorry driver.2

a/s/l?

But this isn’t about my dalliance with Frank in the glorious summer of ’01. This is about history. And social media provide some tricky problems for historians. They are usually hidden behind security measures. Facebook, for instance, has myriad privacy settings, and most people will only be able to read (or are likely to find) content posted and linked to by their friends.

At the RESAW conference in Aarhus this year, this was explored in detail. Historians of the internet are now starting to use the archived material of the web. But social media aren’t necessarily the web. Apps are very often used to access the data held on the services’ servers. And while tweets, for example, may be public, they need to be read in a specific context: people consume them as feeds of individuals’ “microblogs”. The Library of Congress can hold all this information, but how on earth are we going to make sense of it?

So much has been lost. Of course, history has also lost the verbal conversations of working class people down the pub, or the discussions held late into the night in the eighteenth-century coffee house. What is more frustrating is that we KNOW people wrote and sent these messages to each other. All we can ever read of them are the occasional snippets that happen to survive in blogs, journals or personal data files.

Bulletin Board Systems in the 1980s have been mostly lost – though we do have histories that can be told.3 GeoCities has been shut down – though we do have an archive we can begin to analyse.4 But the meat of the content is gone, and won’t be coming back. How much of the stuff we have now will go the same way?

We are trying to record this stuff. But as a historian of post-war Britain, I am more interested in a larger question – how have social media changed (or how will they change) the way Britons behave? What has changed in our personal relationships; the way we meet; the way we part; the ways we vote, organise, and understand the universe? Having lived through it, I can’t tell if I’ve changed the way I behave because I’m getting older, because of the technology and social fabric of Britain, or – more likely – because of the relationship between the two.

This may be a question we can only answer with some historical distance. But it’s worth asking now. Perhaps my 30-for-60 in 2045 will be able to give a more useful conclusion…

The eagle-eyed amongst you will note this piece was written and published on 9 August 2015. The publication date on this WordPress entry has been changed so that the weekly update pattern is maintained in the database, and the post appears in the right order.
  1. ‘Facebook’, Wikipedia < https://en.wikipedia.org/wiki/Facebook > (accessed 9 August 2015); ‘Twitter’, ibid. < https://en.wikipedia.org/wiki/Twitter > (accessed 9 August 2015).
  2. Despite his best efforts.
  3. See the work of Kevin Driscoll at his personal site
  4. Ditto Ian Milligan.

2003 – The Iraq War Protests

20/07/2015

15 February 2003 – Various

Despite the numbers on the streets, the war went ahead anyway. The images over the following years became almost as iconic as those of the millions marching through London and other cities. Bush in front of the “Mission Accomplished” sign; the toppling of the Saddam statue; the subsequent trial and execution. The question is, then – what was the fucking point?

The protest failed to achieve its main goal, but it is beginning to be historicised into a wider narrative of mass protest and voluntary action. It was in many ways one of the first “internet” demonstrations, with millions of protesters brought together through digital technologies such as e-mail and websites. (This was before Facebook and Twitter – but more on those in upcoming weeks.) Movements such as Occupy may have had similar headline “failures”, but they have acted as a focal point for protest against the dominant neo-liberal political framework in the Western world.

Indeed, the breakdown of the party systems in Britain and America especially has made this sort of extra-Parliamentary form of protest increasingly potent and necessary. For while the Labour Party and Conservative Party differ on a number of ideological points, many of the key decisions about how to run foreign and domestic affairs have converged. Neither supports nationalisation of key public services; both believe in a strong military, including a nuclear arsenal; both play the realpolitik game of getting close to dictatorships in various parts of the world in return for good arms contracts and a steady supply of oil. Crucially, both supported the Iraq War, even if there were dissenting members from the parties at the time and subsequently.

This has been going on for a while, however. Voluntary organisations and charities have always been politically driven – you cannot set out to solve a social problem without taking a political position. While many of the larger institutions have, in the past, steered well clear of party politics, there has often been a direct or indirect moral cajoling of those in local and national government to enact policies that will help charities get on with their vital work.

In the 1960s, however, we began to see more assertive groups coming forward. Charities that did not necessarily provide services themselves, but deliberately spent their money on researching the social problems of the day and lobbying the government to fix them. The Child Poverty Action Group, the Disablement Income Group, Shelter and many others appeared during this time. They were willing and able to use the growing mass media to present their cases in increasingly sophisticated ways. And, to varying degrees, they have had success with governments of both parties right across the late-twentieth and into the twenty-first century.

The growing professionalism of those groups in this new political climate, however, meant that they became specialised. Social campaigners may have had many concerns, but the charities themselves were often very narrowly-focused. The big questions – traditionally the preserve of the political parties – were beginning to be diffused through a range of institutions and organisations, few of whom would ever hold direct power in Westminster or City Hall.

The Iraq protest, then, represented more than just the war. For many, it was the first time in a generation that people had been able to broadly agree on a particular action and – crucially – had the tools to mobilise quickly and effectively across the world. Occupy, and the struggles of the 99% have been similarly branded. They represent growing disquiet on, predominantly, the political left with the party system and the post-war levers and apparatus that are supposed to impose democratic will on the law of the land. That they have been unsuccessful may say more about the increasing distance between the machinery of government and the people than it does about the protesters themselves.


2002 – The Salt Lake City Winter Olympics

13/07/2015

My mother once told me of a place,
With waterfalls and unicorns flying.
Where there was no suffering, no pain.
Where there was laughter instead of dying.
I always thought she’d made it up,
To comfort me in times of pain.
But now I know that place is real,
Now I know its name.

~ The Book of Mormon

Why wouldn’t you want to hold a Winter Olympics in Salt Lake City, Utah? Where the warlords are friendly and the goat meat is plentiful? Well, we know why you would hold an Olympics there – flagrant corruption.

The Salt Lake City Olympics bidding process lifted the lid on the systemic nepotism and bribery within the International Olympic Committee (IOC), and the systems for awarding Games to host cities. It resulted in a number of reforms to clean up the system and the IOC’s reputation. Thankfully, nothing like this will ever happen again…

In a completely and utterly unrelated story:

It can be difficult sometimes to justify to people who don’t like sport just why I spend so much of my time watching it. Even if you can explain the attraction of watching people compete and the narratives that flow from that, how exactly do you explain away the rampant commercialism, corruption, sexism, racism, homophobia and various other unsavoury aspects of so many sports and their governing organisations?

Corruption in sport is – shockingly – not new. The “old boys’ networks” of the British public school system helped establish the rules for a number of sports in the late nineteenth century, from various versions of football to tennis and beyond. This was part of a Victorian desire to rationalise and standardise sport across the globe, so that everyone was playing by the same rule books. Sport was part of a masculine1 and Christian ideal, supposedly requiring and encouraging self-discipline and athletic prowess. By the end of that century and the beginning of the twentieth, international sporting organisations popped up to express these ideals through nationalistic competition.

That nationalism was a key tool for regimes across the twentieth century, some authoritarian, some democratic. Italy “borrowed” a number of soccer players from Argentina to win the 1934 and 1938 FIFA World Cups (and may have slipped a coin or two in the pockets of the referees for good measure). The Nazis made a big play of hosting the 1936 Olympics. After 1945, the USSR and USA used a number of sports, mostly Olympic, to play out the Cold War in the sports stadium. Soccer teams have been integral to the post-1991 independent states in Eastern Europe and their national identities.

This explains why, say, Salt Lake City was keen to have a Winter Olympics. The eyes of the world would be on them, giving the place legitimacy. It’s why Qatar wanted a FIFA World Cup. It’s why the English were so angry when they didn’t get it.

There is another side to the corruption coin, however, which makes uncomfortable reading for countries who were part of that initial elite 100 years or so ago. Because the rest of the world has had to fight hard to be taken seriously alongside the traditional nations of Western Europe and North America. It took until 2002 before a FIFA World Cup went to Asia; 2010 before one went to Africa. We’re yet to have an African Olympics, Summer or Winter. And we’re a year away from the first ever South American one.

In the case of FIFA, the English/European oligarchy was swept aside in the 1970s under the leadership of João Havelange from Brazil. He appealed to the traditionally neglected nations, and built a large power base outside Europe. In some ways, this was a way to finally wrest control from the closed shop in Europe. But it was built on giving bonuses to officials with… questionable ethics. No doubt, football investment has improved dramatically in Africa and Asia. But how much more would it have improved if officials weren’t trousering so much of the money?

Now, of course, Sal Tlay Ka Siti was in the United States – not exactly a sporting backwater. But it had been repeatedly overlooked. They believed the reason for this was that they weren’t wining and dining officials as well as their rivals. They may have been right. Though their solution wasn’t. They decided to bribe their way to the Olympics.

Perhaps it was worth it. They got the Games, after all. But it nearly brought down the IOC and resulted in widespread reform.

There’s a question that international sport really needs to tackle, then. It doesn’t want corruption. At the same time, the West isn’t willing to give up its power. The arguments that other nations are not mature enough to be involved, economically or in terms of their business practices, can only go so far. How can they get better if they are never invited to the table?

Similarly, we cannot condone corruption simply because it allows non-traditional nations a shot at the big time. Qatar shouldn’t have a World Cup for the same reason it shouldn’t have a Winter Olympics. The climate isn’t right, the human rights record should see it kicked out of the international community, and it will help none of its citizens; it’s a vanity project for an absolute monarchy trying to buy credibility and prestige with the West.

People in the non-traditional countries deserve more from FIFA, the IOC and their ilk. More than that, though, they deserve more from the people who supposedly represent them.

Title image from Wikicommons.
  1. Although women were similarly encouraged into sport for the good of the race, especially with eugenic theories running wild through the period.

2001 – September 11

06/07/2015

xkcd (Source)

The terrorist attacks on “9/11” were horrific. The sheer scale of the damage, the cultural significance of the targets, and the fact that this exposed the vulnerability of “the most powerful nation on Earth” made most of the Western world uneasy. Whatever the geopolitical position in the Middle East, whatever the West’s role in the historical and continued instability in the region, the hijackings were barbaric acts.

I have already discussed terrorism and racialised attitudes towards it in this series. And while I could probably go on at length on the subject, there is another aspect of the attacks that piques my interest. The endurance of the conspiracy theory.

Of course, 9/11 wasn’t the first event to generate an industry of writers and enthusiasts spreading their own particular hypotheses to explain major events. JFK and the Apollo XI Moon Landing come to mind immediately. Then there are various “interesting” positions on vaccination or aircraft vapour trails. And we still find people who believe the Jews run the world, or the Catholics run the world, or the Illuminati run the world, or the Freemasons run the world, or (that’s enough, Ed.)

My own work recently has had to deal with the vaccination issue. And this has been fascinating, partially because it involves so many different positions on what is largely the same base of evidence. It includes everyone from the hardcore “anti-vaxxers” to the hardcore “pro-vaxxers” – and, somewhere in between, individuals and parents who actively or passively do not get their children vaccinated without ever expressing an opinion that survives in the historical record.

So this isn’t about 9/11. It’s a rant.

One of the reasons that the vaccination conspiracies have attracted so much opinion recently is because they have very tangible results. We can see the vaccination rate go up or down; we can see the disease notification rates fluctuate. And it is one of those group behaviours which, we believe, might affect us. Whether other people choose to vaccinate or not can lead to a greater risk of disease for everyone else. Or so the “pro-vaxxers” would have you believe. (At this point the author dons his tin-foil hat.)

Ben Goldacre, a science writer, has written about “vaccine scares” in his popular books on Bad Science.1 He notes that these theories – about, for example, hepatitis vaccines causing multiple sclerosis, worries over mercury, or the relationship between MMR and autism – have tended to respect national boundaries.2 And while for the most part he is correct, these ideas did spread (albeit more slowly) in the pre-internet age. The scare over pertussis (whooping cough) vaccination, for example, had pretty much burnt out in England before it flared in the United States; although there was a contemporary (yet completely separate) issue with the vaccine in Japan.3 It took a number of years for Australia and the US to catch on to Wakefield and MMR (despite his work having been discredited), and the UK has never really got interested in thimerosal (a mercury compound). In the internet age, however, ideas are spreading more quickly as people are more easily able to find individuals with similar views, and in turn are able to share information which confirms these beliefs.

Let’s be clear, however – pro-vaccine people do similar things. The vast majority of those commenting on vaccination (like the vast majority of the world’s population) are not medically trained in the areas of epidemiology and immunisation. This doesn’t make their opinions invalid, but it does make claims about scientific “truth” very difficult to trust. Vaccines are complicated. The science underpinning them is complicated. I would have to contextualise the opinion of, say, a brain surgeon opining on MMR. Much like – even though I have a PhD in history – my views and pronouncements on pre-industrial Oviedo should probably be taken with a pinch of salt. The difference is that overwhelming medical opinion supports vaccination as safe and effective. The real question is – how “safe” is “safe”; and how “effective” is “effective”?

No vaccine is 100% effective. It significantly reduces your chances of getting a disease; which in turn significantly reduces your chances of passing it on. Especially if everyone around you is also vaccinated. After a few generations, theoretically, a population can rid itself of the disease, as the pathogen has fewer hosts and fewer people to infect. This concept of “herd immunity” is a well established one, even if it is only in recent (historically speaking) times that we have been able to develop statistical and epidemiological models to predict the impact of vaccines on a population.
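
For a rough sense of the arithmetic those models rest on, here is a back-of-the-envelope sketch using the textbook herd-immunity threshold. The function name and the figures are purely illustrative – they are not taken from any particular study.

```python
def herd_immunity_coverage(r0, vaccine_effectiveness):
    """Rough vaccination coverage needed to block sustained transmission.

    Uses the textbook threshold 1 - 1/R0 (the share of the population that
    must be immune), scaled up because no vaccine is 100% effective. Real
    epidemiological models are far richer than this.
    """
    threshold = 1 - 1 / r0
    return threshold / vaccine_effectiveness

# Illustrative figures only: a measles-like R0 of 15 and a 95%-effective vaccine.
print(f"{herd_immunity_coverage(15, 0.95):.0%} of the population needs vaccinating")
# -> roughly 98%
```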

And, no vaccine is 100% safe. Any procedure – indeed, anything – carries risk. This goes from opening a can to driving a car. Of the billions of vaccine doses administered, a tiny fraction have caused injury. Health authorities know many of the contraindications which might cause this, and attempt to avoid the complications. But mistakes happen. That is no comfort to the families affected, but it has meant that over the course of the twentieth century the death and injury toll of TB, polio, smallpox, diphtheria, tetanus, measles, whooping cough, mumps, rubella and others has been significantly reduced.

This gives the conspiracy theorists their “in”. Because there are thousands of cases of vaccine damage to point to. Each individual one is a devastating tragedy to that family. There are millions who have been vaccinated, yet caught the disease anyway. Each one makes an individual wonder whether the pain was worth it. And, of course, medical and public health science had become more effective at preventing and curing illnesses from the late nineteenth century. Who is to say that vaccines have caused the final drop in cases? Couldn’t it just be coincidence? Aren’t we just exposing children to unnecessary risk?

The answer is, of course, peer-reviewed data and analysis. It’s the mass trials conducted by many countries over the course of the past hundred years. It’s the data we have of disease rates in areas where vaccination rates drop. It’s the control experiments and analyses that seek to eliminate other causal factors. Nobody serious would claim vaccination was the only reason for the elimination of smallpox. Nobody serious would claim that it wasn’t a significant factor.

There are two aces up the sleeve of the conspiracy theorists, however, which keep the debate alive. The first is to lie, or seriously misrepresent data. To make claims like “nobody has ever tested vaccine safety or efficacy”. They have – just look at the Medical Research Council’s trials of pertussis,4 polio and TB5 vaccines as a starting point. While none is without its problems, it is a flat out lie to suggest they never happened.

The second is to deny the relevance of these findings on the basis that “Big Pharma”™ makes money off the back of vaccines, or that the government wants to exert control. This seems to suggest that if people have ulterior motives, what they say cannot be true, regardless of their evidence. By the same logic, this would be enough to discredit those selling alternative therapies to protect people from disease, who have a vested interest in making you doubt biomedicine. But that’s a debate for another time.

This argument collapses under its own logic. For a start, vaccines are not a big money maker for pharmaceutical companies compared to their overall turnover. While they are becoming a bigger part of pharmaceutical companies’ strategies – due to emerging markets in the developing world and increased difficulties bringing new drugs to the market – in 2013 the vaccine industry was worth an estimated $24 billion.6 Yet the pharmaceutical industry as a whole is valued at over $980 billion.7 Vaccines make up roughly 2.5 per cent of that total. Besides – why would a drug company want to cure you of a disease that can cause complications for which it could sell you drugs to treat?

These arguments build on those made by historians of medicine over the past few decades about the need to question scientific truth claims by authorities. There are countless examples of the power of the medical profession and the increasingly medicalised state interfering in the lives of its citizens. But there is a fundamental flaw in taking this work – meant as a critique of the social, political and economic structures of power – and applying it simply to back up another faulty truth claim about the material reality of the universe. Just because science, scientists and the scientific method are bound up in power structures and the interests of the capitalist state doesn’t make all (or even most) of their conclusions invalid. As humanities scholars we can debate their relevance, but we don’t have the tools to deny their scientific merits. Especially if you are going to appeal to rational science as your basis for anti-vaccinationism.

Credit: Wellcome Library, London. Wellcome Images. (Source)

But if you’re going to lean on my discipline, let’s try this. Let’s assume vaccination is part of a Foucauldian panopticon. It monitors citizens, bringing all children under the surveillance of the state, where their behaviour is controlled through the universal administration of drugs designed to increase the productivity of the whole. Its purpose is at once to exert state power and to discipline the individual into believing that she has an obligation to herself and others to maintain her health for the good of the nation state. Let’s just say we buy that (and I probably do, on an academic level).

Why would the state continue to support a system that (supposedly) injures its citizens, rendering them – and their parents’ nuclear-family economic unit – less productive? The state has a vested interest in supporting a system it helped create in order to save face. But it has a bigger vested interest in finding the most efficient and safest vaccines so that its citizens grow up to be net producers, not drains on the system.

There are legitimate questions to be raised here on the moral and political level. Is one death from a vaccine one death too many? Is it right that the state should compel people through law or societal pressure to undergo a medical procedure? Fine. We can sit and have that debate. But you don’t get to make up scientific data or ignore the mountain of evidence which contextualises or repudiates your claims.

  1. In the interests of transparency, Goldacre is, like me, a research fellow at the London School of Hygiene and Tropical Medicine. Unlike me, he has medical qualifications and is far more successful. I have never met the guy. I’m sure he’s lovely/a corporate shill (delete as applicable according to personal taste). http://evaluation.lshtm.ac.uk/people/members/ (accessed 5 July 2015).
  2. Ben Goldacre, ‘How vaccine scares respect local cultural boundaries’, Bad Science(24 April 2013) http://www.badscience.net/2013/04/how-vaccine-scares-respect-local-cultural-boundaries/ (accessed 5 July 2015).
  3. Jeffrey P. Baker, ‘The pertussis vaccine controversy in Great Britain, 1974–1986’, Vaccine 21(25-26) (2003), 4003-10.
  4. ‘Vaccination against whooping-cough’, BMJ 4990 (25 August 1956), 454-62.
  5. ‘B.C.G. and vole bacillus vaccines in the prevention of tuberculosis in adolescents’, BMJ 4964 (25 February 1956), 413-27.
  6. Miloud Kaddar, ‘Global vaccine market features and trends’ World Health Organization http://who.int/influenza_vaccines_plan/resources/session_10_kaddar.pdf (accessed 5 July 2015).
  7. Statista, ‘Revenue of the worldwide pharmaceutical market from 2001 to 2013 (in billion U.S. dollars)’ http://www.statista.com/statistics/263102/pharmaceutical-market-worldwide-revenue-since-2001/ (accessed 5 July 2015).

2000 – The Millennium Bug

29/06/2015

1 January 1900 – Earth

Technology can become obsolete. That in itself doesn’t seem to be too contentious a statement. Just ask coopers, smiths and thatchers about how business has been going lately. But the furore over the “millennium bug” or “Y2K” showed just how dangerous this can be when every major administrative system in the world relies on an outmoded date format.

In early computing, both memory and processing power were at a premium. It is estimated that one bit cost one dollar in the 1960s.1 There are 8 bits to a byte. Storing a four-digit number requires two bytes; a two-digit number needs only one. That’s a saving of $8 per date. Since most calculations would be for dates within the 20th century, adding the “19” to the beginning of the year seemed redundant. It became convention in a lot of software to simply write my birthday as 10/10/85. Which is fine.

However, software used by air traffic control, international banking and national security took these dates to calculate a number of things. Something simple such as the day of the week becomes complicated once you move over the century boundary. 12 June 1914, for example, was a Friday. 12 June 2014 was a Thursday. And in 1814, it fell on a Sunday.

There are other things that could go wrong, too. Say I wanted to book an appointment with someone seven days after Friday 25 February 2000. No problem – I’ll see them on Friday 3 March. But what if the computer sees that I want to book an appointment seven days after 25/02/00, and thinks it’s 1900? Well, that’s a problem. Because 1900, unlike 2000, wasn’t a leap year, there is no 29 February to count, and it will want me there on 4 March instead.
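
If you want to see the bug in miniature, here is a sketch in Python. The parsing function and its “pivot” trick are my own illustration of the general approach, not any particular system’s code.

```python
from datetime import date, timedelta

def parse_ddmmyy(text, pivot=None):
    """Parse a DD/MM/YY string stored with a two-digit year.

    With no pivot we naively assume 19YY - the shortcut at the heart of the
    millennium bug. A pivot (e.g. 50) treats years below it as 20YY instead.
    An illustrative sketch, not any real system's logic.
    """
    day, month, yy = (int(part) for part in text.split("/"))
    if pivot is not None and yy < pivot:
        year = 2000 + yy
    else:
        year = 1900 + yy
    return date(year, month, day)

# Seven days after "25/02/00", read naively as 1900 (no 29 February that year):
print(parse_ddmmyy("25/02/00") + timedelta(days=7))            # 1900-03-04
# The same calculation with a windowing fix, read as 2000 (a leap year):
print(parse_ddmmyy("25/02/00", pivot=50) + timedelta(days=7))  # 2000-03-03
```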

Retrofitting old software and data to include the correct date formats and calculations was not a trivial exercise. We spent an estimated $300 billion to fix the problem. In the end, it may not have even been that big a deal.2 But it shows how decisions made out of convenience or financial necessity could come to create problems for future generations who get locked into a particular format.

The most famous example of this idea – “path dependence” – comes from Paul David, who described how the QWERTY keyboard came to dominate the Anglophone world. First, it was used for mass-produced typewriters. As typists became trained to work quickly on these machines, any potential benefits from a more ergonomic layout were negated by the massive short-term costs of retraining everybody to use a new system.3

If you’ve ever tried typing an e-mail back to your parents using your French exchange family’s AZERTY monstrosity, you’ll know just how this feels.

Human rights abuse. (Source)

Historiographically, the idea of path dependence is an interesting one. Certainly, you could apply it to bureaucratic and operational decisions made by businesses, government or the collective action of societies.4 The recent furore over Universal Credit, for example, shows that while there may have been the political will to produce a more rational and consistent benefits system, the existing web of payments and tax credits is unfathomably costly to untangle.5

The current social security system is an accident of history. After being overhauled in the 1940s, new schemes have been added as society has changed and holes in the safety net have been identified. No doubt, a single benefit, administered using an overarching computer system and bureaucratic machinery, would be more efficient than what we have now. But if the cost of change – to both the government in terms of infrastructure and claimants in terms of lost income – is higher than the potential gain, it can cause a political crisis. One might argue there is a reason why no other government has been so… “brave”… in its overhaul of the Department for Work and Pensions.

Despite being a massive part of the late 1990s news cycle, Y2K never really caused that many problems. Like much else with the dawning of the new millennium, the really exciting part was that Western society was approaching a nice round number. It’s like watching your odometer tick over to 100,000, or getting 100 lines in Tetris. Objectively, it means little. But there’s something nice about hitting a milestone.

Still. It’s a helpful reminder. However you design any system, eventually it will start to creak and groan under its own contradictions. But fixing it properly may end up being more costly than patching it up with string and sticky back plastic.

  1. ‘Year 2000 problem’, Wikipedia < https://en.wikipedia.org/wiki/Year_2000_problem > (accessed 28 June 2015).
  2. ‘Y2K: Overhyped and oversold?’, BBC News 6 January 2000 < http://news.bbc.co.uk/1/hi/talking_point/586938.stm > (accessed 28 June 2015).
  3. Paul A. David, ‘Clio and the economics of QWERTY’, The American Economic Review 75(2) (1985), 332-7 (copy available at http://www.econ.ucsb.edu/~tedb/Courses/Ec100C/DavidQwerty.pdf as of 28 June 2015).
  4. Paul Pierson, ‘Increasing Returns, Path Dependence, and the Study of Politics’, The American Political Science Review 94 (2000), 251-67.
  5. Asa Bennett, ‘Iain Duncan Smith’s Universal Credit Could Cost More Than Planned, Warns Think-Tank’, Huffington Post (9 September 2014) < http://www.huffingtonpost.co.uk/2014/09/09/ids-universal-credit-welfare_n_5789200.html > (accessed 28 June 2015).

1999 – The Columbine High School Massacre

22/06/2015

20 April 1999 – Columbine

Last week, it was a fortunate coincidence that I had planned to write about Google the weekend after getting back from a conference on the history of the Web. This week, it’s utterly depressing that, in the wake of the Charleston shootings, I should be writing about Columbine.

School shootings are – thankfully – a rare event in Europe. The Dunblane shooting in 1996 was a reminder that these things do happen, though not with the regularity with which they have plagued the United States in recent years. When Dunblane happened, I was 10 years old; with Columbine I was 13. Of course, the chance of being caught up in one of these tragedies was infinitesimally small. But that didn’t stop kids of my age (and, probably, our parents even more) worrying about this sort of thing.

The memorial library at Columbine High School, built after the massacre. (Source)

Columbine was over 15 years ago, and yet mass shootings continue across the United States. Colorado had another incident as recently as 2012, when a gunman attacked a screening of the final Christopher Nolan Batman movie. Sadly, as Jon Stewart put it:

I honestly have nothing other than just sadness once again that we have to peer into the abyss of the depraved violence that we do to each other and the nexus of a just gaping racial wound that will not heal, yet we pretend doesn’t exist. And I’m confident, though, that by acknowledging it, by staring into that and seeing it for what it is, we still won’t do jack shit.1

I could take this opportunity to look at what it is historically about American culture that allows this to keep happening. That particular topic has been done to death over recent days. But we can revisit the greatest hits! Easy access to guns gives people opportunity. The glorification of firearms creates a fantasy culture that those with dangerous personalities get lost in. Not enough support networks exist for those with mental health issues. Racism is still rife. A large section of the community celebrates the names and flags of white supremacists from the nineteenth century. Others hide behind a constitutional amendment designed to allow the country to defend itself. All good stuff, to be repeated ad nauseam next time this happens.

Because, let’s not kid ourselves. There will be many next times.

The historical attachment to gun ownership in America makes sense within a narrow historical interpretation. The country was founded as a modern experiment in liberalism. Individual property was to be protected from the intrusion of the state, and individuals were free to follow their own religious and political lives providing these did not impinge on the rights of others to do the same.

One of the important pillars of this concept was the Constitution – a set of rules governing the government. The state would not be allowed to interfere in the private lives of its citizens except within strict rules of conduct. In order for that to happen, the appropriate levers needed to be created to allow the people to kick out those who would abuse the system.

In the ideal world, of course, this would happen with regular elections to the Senate, House of Representatives and the Presidency. But if those failed, the people needed to be able to defend themselves against injustice: hence, the Second Amendment. The right for people to form militias to protect against tyranny, domestic and foreign. Especially foreign. Those dirty Brits might come back one day…

The idea that firearms will protect the individual from tyranny continues in US politics. See this cartoon, reproduced on NaturalNews.com. (Source)

Within the context of the late eighteenth century, this made perfect sense. This was a new country, just starting to form the sort of economic and political infrastructure required to defend itself and to provide a mature, accountable democracy. As time has gone on, however, history has left this world behind. First, the United States has a wide network of military and police bases, with a highly developed justice system and Supreme Court. While these institutions do not work anywhere near as well as Americans would like, there are myriad constitutional forms of protection and of law enforcement. If these fail, there are also numerous avenues for challenging this power.

Second, firearms are far more deadly and far more prevalent than anyone in the eighteenth century could have assumed. Individuals can, realistically, pull together the firepower to form a militia that would rival that of many middle-income nations.

To many in the States, however, that paranoia – a vital defence mechanism 200 years ago – remains. And it has blended with a fetishisation of guns and a deep mistrust of the federal government. Many believe a gun makes them safer, even though you are more likely to be killed by a gun if you own one.2 In the Southern states, you can combine all this with a nagging suspicion that the economic and political dominance of Northern states and California means that “liberals” are trying to impose a political way of life upon the states that once tried to secede from the Union.

On this historical reading, then, guns are justified because governments can never be trusted to operate within the law. At any moment, the Federal government could seize your property (including your guns).

To Europeans, this sounds like utter nonsense. And, increasingly, it is being ridiculed by middle America too. But just because the Second Amendment is an anachronism doesn’t make it any less potent to many. In fact, when one of the major forces in American politics is named after an act of sabotage in Boston harbour, its roots in the eighteenth century make it even more relevant.

Columbine will happen again. And it will keep happening until those most wedded to gun culture understand that they are being manipulated far more by the arms industry and vested capital interests than they are by the Federal government. For it is the legal framework and protection offered by a government – constrained by the rule of law – that will ultimately make America a safer place.

That’s going to take a long time. Because such a collectivist attitude, relatively common in Europe, is anathema to the individual rights approach at the heart of American politics and history. And we should be honest – collective trust in government hasn’t always worked out so well this side of the pond.

Until then, America will continue to tell the rest of the world – mass shootings are the price we pay for freedom.

Addendum

Last night, a friend posted this to Facebook. If you want a more sweary and entertaining version of the above, see below:

  1. Jon Stewart on The Daily Show, broadcast on Comedy Central. Transcript from ‘Read Jon Stewart’s blistering monologue about race, terrorism and gun violence after Charleston church massacre’, Washington Post, 19 June 2015 < http://www.washingtonpost.com/blogs/style-blog/wp/2015/06/19/read-jon-stewarts-blistering-monologue-about-race-terrorism-and-gun-violence-after-charleston-church-massacre/ > (accessed 21 June 2015).
  2. Linda L Dahlberg, Robin M Ikeda and Marcie-jo Kresnow, ‘Guns in the home and risk of violent death in the home: Findings from a national study’, American Journal of Epidemiology 160(10) (2004), 929-36.

1998 – Google, Inc.

15/06/2015

4 September 1998 – Menlo Park

Hoover, Xerox and Coke have come to mean vacuum cleaner, photocopier and cola in colloquial English. Such was the success of those brands, either as inventors or popularisers of day-to-day products, that we use their trademarks more than the generic terms – even when Vax, Epson and Pepsi are available.

Google is another of those brands. It is the world’s most used search engine, accounting for 88% of the planet’s searches.1 Yet search is no longer all that Google does. It offers a range of services and collects mind-blowing amounts of data, leading many to criticise its dominant position on the internet and World Wide Web.

The Google search page as it looked around the time Google was incorporated.
(http://google.stanford.edu/ [The Internet Archive, site captured 11 November 1998])

The Web is full of information, but it’s relatively useless if you can’t find anything. There are a number of ways you can find stuff, but it generally boils down to one of three things:

  • someone recommends a site to go to;
  • you follow a link on an existing site; or
  • you search for a particular topic in a search engine.

In the late nineties, search wasn’t particularly sophisticated. The main providers would maintain large databases of websites, and then would provide the user with results based on how often a search term appeared. (To very crudely summarise the technology.) Students at Stanford, however, wondered whether an algorithm could deduce relevance by monitoring how often authoritative websites linked to each other on a specific topic. Using these and other metrics, they developed a search engine that eventually became Google. It launched in 1997, and the company was incorporated in September 1998.2
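
To make the link-analysis idea concrete, here is a toy sketch of the kind of calculation involved – a miniature PageRank over an invented four-page web. It illustrates the general principle only; it is not Google’s actual algorithm, and the page names are made up.

```python
# A hypothetical miniature web: each page maps to the pages it links to.
links = {
    "news": ["blog", "wiki"],
    "blog": ["wiki"],
    "wiki": ["news"],
    "zine": ["wiki", "news"],
}

def pagerank(links, damping=0.85, iterations=50):
    """Repeatedly share each page's score among the pages it links to."""
    pages = list(links)
    rank = {page: 1.0 / len(pages) for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / len(pages) for page in pages}
        for page, outgoing in links.items():
            for target in outgoing:
                new_rank[target] += damping * rank[page] / len(outgoing)
        rank = new_rank
    return rank

# Pages that are linked to by other well-linked pages float to the top.
for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```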

My family got the internet in 2000. Google has been practically a constant in my experience of the web since then, as it has been for many others. But there was a Web before Google. And there was an internet before the Web. So the question is – how did we ever find anything?

Yahoo had a search option, but also gave prominence to its directory. Much like a telephone directory or book index, it sorted sites by category and was maintained by the site itself.
(www2.yahoo.com [The Internet Archive, captured 17 October 1996].)

One form of “discovery” came through directories. The example above was on the Yahoo front page in late 1996. Google also maintained a directory in its early years, before neglecting it in favour of other services. While these were helpful, they were also at the mercy of those maintaining them. Humans simply could not read and categorise every single page on the Web.

Another way of maintaining links for like-minded people, then, was to gather in one place. These sorts of internet communities have existed for many years, even before the invention of the Web. At the recent RESAW conference in Aarhus, Kevin Driscoll spoke about the history of Bulletin Board Systems. Much like the modern “internet forum”, people could exchange messages on the internet equivalent of the community bulletin board in the local parish church or community centre. Access came through dialling up BBS providers using a modem and transferring data over the phone line. This is essentially how dial-up Internet Service Providers would later work, only in the days before the Web as we know it. Indeed, a number of BBS providers went on to become ISPs.3

These boards provided information not just on the “real” world goings-on in the cities in which they were based. They also gave people recommendations for other places on the internet to visit.

Other messaging systems such as Usenet provided similar information alongside their core service of facilitating conversations on particular topics. This was brought out in William McCarthy’s paper on the history of troll communities.4

Some users took the geographical analogy and ran with it, contributing to the rise of GeoCities in the late 1990s. Ian Milligan’s research showed that the owners of GeoCities pages tended to settle in the “city” which most reflected their interests. In doing so, they were able to join communities of like-minded people, share information, and then “go” to other cities to learn about other topics. This was a web constructed by amateurs rather than professional content-generating sites, but it allowed people to discover and – crucially – be discovered by their fellow Web consumers.5

xkcd nails it again… (Source)

Google has become a valuable tool for web “discovery”. Alongside the rise in social media, we have been able to share our own content and that of others in ways that would have been difficult or impossible in the 1980s and 1990s. Finding people, places and things has never been easier.

Aside from the political concerns and debates over the “right to forget”, it has also made things tricky for documentary researchers. The intricate details of that are probably worth explaining elsewhere (such as in this paper I gave with Richard Deswarte and Peter Webster last year). Suffice it to say, the sheer volume of data available through Google is overwhelming. So too is the gnawing suspicion that it is almost too easy, leading us to do research on the sorts of topics that serve only to confirm our own prejudices about the world, and to ignore wider and important outside context.

In any case, Google looks like it’s here to stay. Given the way it has revolutionised the Web, which has in turn revolutionised the twenty-first century, the company had to go into my 30-for-30.

  1. ‘Worldwide market share of leading search engines from January 2010 to April 2015’, Statista < http://www.statista.com/statistics/216573/worldwide-market-share-of-search-engines/ > (accessed 14 June 2015).
  2. ‘Google’, Wikipedia < https://en.wikipedia.org/wiki/Google > (accessed 14 June 2015).
  3. Kevin Driscoll, ‘Shaping the social web: Recovering the contributions of bulletin board system operators’ (unpublished conference paper at Web Archives as Scholarly Sources: Issues, Practices and Perspectives, 9 June 2015, Aarhus).
  4. William McCarthy, ‘The Advent of the Online Troll Community: A Digital Ethnography of alt.flame’ (unpublished conference paper at Web Archives as Scholarly Sources: Issues, Practices and Perspectives, 9 June 2015, Aarhus).
  5. Ian Milligan, ‘Welcome to the GeoHood: Using the GeoCities Web Archive to Explore Virtual Communities’ (unpublished conference paper at Web Archives as Scholarly Sources: Issues, Practices and Perspectives, 9 June 2015, Aarhus).