historical stuff by Gareth Millward

1994 – The Rwandan Genocide

18/05/2015

6 April 1994 – Kigali

When an aeroplane carrying the President was shot down over Rwanda’s capital, Kigali, the government responded in brutal fashion. The ruling classes, dominated by an ethnic group called the Hutu, began systematically raping and murdering Tutsis.

The massacre, which lasted around three months, was part of a bloody civil war. Plenty has been written about the genocide, and (as with other topics in this series) I doubt I will be able to do the subject justice. Thankfully, as a historian I can twist the question “what can you tell me about Rwanda ’94?” and talk about something completely different.

Two historically interesting issues surround the narrative of Rwanda. The first is “what constitutes a genocide”? Is it simply a question of numbers? Does there have to be an overt racial element? What about religion or nationality? And do the perpetrators have to be racially homogeneous?

The second is “should ‘we’ have done more to stop it”? Is military intervention in a foreign conflict ever acceptable? Is there such a thing as “neutrality”? Indeed, is non-intervention a conscious and dangerous act in itself? At what point does a stronger power have to put itself at risk to protect the most vulnerable?

These are questions worth asking, because in the twenty years since, both have been prevalent in British politics – and have therefore had a big impact upon the world in which I live.

The term “genocide” or “ethnic cleansing” has been attached to a number of atrocities in the twentieth century. Because it is so emotive, there are some very sensitive issues to confront before using it in any serious assessment of history. For the alleged perpetrators, it carries a stigma that very few would be willing to admit – certainly not if that group still holds power to this day. For the alleged victims, there is great political capital to be made – either from seeking political redress after the fact, or in trying to secure military support from an outside power (on which more later). This is not, of course, to suggest that Kurds, Tutsis, Armenians or Jews have “played the victim” – rather, it shows the various factors that lead to people on both sides becoming so defensive of their relative positions.

Some cases of denialism are less convincing than others. There is a great industry of Holocaust deniers, most notably in the work of David Irving. A British “historian”, Irving used a number of German documents from the Nazi Party to claim that Hitler did not order the final solution, and questioned whether any systematic attempt to wipe out Jewish people ever existed.1 He was proved an incompetent liar in court.2 But even a cursory glance at the seedier sides of the internet shows that this sort of attitude persists.

Attacking National-Socialist Germany is, of course, easier because of its utter defeat in the Second World War. Even if some people will defend it, there is no need for any major company or nation state to negotiate the truth. Turkey, on the other hand, is a different matter. The Turkish Government completely denies that the murder of Armenians during World War One constitutes a genocide. Its allies, including the United States, often try to find euphemistic terms for the killing.3

What’s interesting here is that there is no denial that killing took place. Just as there is little doubt that Saddam Hussein gassed Kurdish people in Northern Iraq; or that there were thousands of people murdered in ethnic wars in Yugoslavia. Rather, their significance is “downgraded” from genocide to less emotive terms. Hussein and Milošević were (no longer) useful allies for Western governments – and were eventually put on trial (with dubious success and/or impartiality).

Turkey, however, as first a safety zone against the encroachment of Communism in Eastern Europe, and then as a secular buffer against unstable dictatorships in the Middle East, is not a country to piss off. Despite the opinions of many historians with more knowledge than I – and, indeed, Barack Obama himself4 – the Office of the POTUS will not say that what happened was a genocide.

While not universally considered a genocide, the potato famine in Ireland is seen by some as such. (Source)

More subtle are mass killings that, for a variety of reasons, are not considered genocides. In some cases this is because the ethnic grouping of the victims is unclear. In others, deaths are caused by indirect means, such as the inadvertent or deliberate exacerbation of famine and abject poverty. For instance, the famine that struck Ireland in the nineteenth century was fuelled by British agricultural and taxation policy, but is not generally considered a genocide. The forced collectivisation of farms in Ukraine under Stalin, similarly, borders on the definition, but is not usually treated as such.

Then there are the practices of European Empires which killed millions, but are usually couched in other terms (if they are confronted at all). The famous stories of smallpox-riddled blankets being “donated” to Native Americans, for example;5 or the systematic destruction of traditional trade networks leading to famines in the Indian subcontinent.6

The United Nations Security Council. (Source)

OK. So even if there are questions about whether a conflict is necessarily a “genocide”, what responsibility do others have to intervene? One of the main criticisms about Rwanda has been that Western Nations could have prevented a catastrophe if they had come to the aid of the Tutsi rebels.7 Since then, military interventions have taken place (or been proposed) in Bosnia, Libya, Iraq, Syria, Sierra Leone and others.

Much like with the question of Europe, the split in opinion in places like Britain has not been across the traditional left/right dividing line. There is a strong anti-war contingent on the left, often painted as anti-imperialism. Opposition to Vietnam and nuclear weapons also extends to an overtly pacifist approach to international relations. On the other hand, there is also a group that professes “solidarity” with common people in foreign countries unable to defend themselves from the forces of dictatorial regimes. Thus, there was left-wing support for intervention in Iraq and Libya in recent years; and, more famously, for the Republican cause in the Spanish Civil War in the 1930s.

Intervention is not always well-received by the electorate of the intervening nation. (Source)

On the right, there are the “hawks” who see military intervention as a way of ensuring stability in regions of the world that might hurt Western interests, either through direct conflict and terrorism, or through disrupting potential markets and supplies of raw materials. More cynically, some see it as a boon for the arms trade. Then there are isolationists who believe that problems in the rest of the world are not our concern. Money spent on wars should be kept at home, with poorer countries left to sort out their own messes. Less Machiavellian is the belief that intervention has unknown consequences, and may lead to power vacuums that, in turn, create bigger and longer-lasting problems down the line. This is a concern shared by the anti-war left, too.

It is certainly the case that intervention tends to happen when there is a benefit to the intervening state.8 But it would be wrong to describe this as solely to do with “oil” (as in the case of Iraq) – there were also genuine humanitarian concerns from the British people about the safety of civilians in a number of conflicts.

Rwanda, then, brings into focus many foreign policy questions which have yet to be resolved. After the Cold War, this sort of global politics seems to have intensified, and been a key part of my adolescent and adult life. No doubt it will continue to reverberate 30 years hence.

  1. For more on this pleasant individual, see ‘David Irving’, Wikipedia < http://en.wikipedia.org/wiki/David_Irving > (accessed 7 May 2015).
  2. Richard Evans, Telling Lies about Hitler: The Holocaust, History and the David Irving Trial (Verso, 2002).
  3. Chris Bohjalian, ‘Why does Turkey continue to deny Armenian genocide’, The Boston Globe (9 March 2015) < http://www.bostonglobe.com/opinion/2015/03/09/why-does-turkey-continue-deny-armenian-genocide/rV7dcOXxrDb7wz01AoXgZM/story.html > (accessed 7 May 2015).
  4. AP, ‘Barack Obama will not label 1915 massacre of Armenians a genocide’, The Guardian (22 April 2015, 4:23 BST) < http://www.theguardian.com/us-news/2015/apr/22/barack-obama-will-not-label-1915-massacre-of-armenians-a-genocide > (accessed 7 May 2015).
  5. K B Patterson and T Runge, ‘Smallpox and the Native American’, American Journal of Medical Science 323(4) (2002), 216-22.
  6. David Arnold, ‘The “discovery” of malnutrition and diet in colonial India’, Indian Economic and Social History Review 31(1) (1994), 1-26.
  7. See, for example, Emily Sciarillo, ‘Genocide in Rwanda: The United Nations’ ability to act from a neoliberal perspective’, Towson University Journal of International Affairs 38(2) (2002), 17-29.
  8. There are dangers in applying this interpretation in all cases, however. See: Philip Spence, ‘Imperialism, anti-imperialism and the problem of genocide, past and present’, History 98(332), 606-22.

1993 – Maastricht and NAFTA

11/05/2015

1 November 1993 – Europe; 8 December 1993 – Washington D.C.

1993 saw the ratification of the Maastricht Treaty in Europe, and the formation of the European Union (EU). Over the pond, the North American Free Trade Agreement (NAFTA) signalled greater economic cooperation between the United States, Canada and Mexico. These agreements, naturally, did not appear overnight – but they signalled the shifting dynamic of economic power, one in which supranational organisations would have an impact upon domestic lawmaking and financial autonomy.

It is an unfortunate quirk of timing that I am writing this before (and publishing it after) the 2015 UK General Election. The rise of the right-wing United Kingdom Independence Party (UKIP) has been built largely upon anti-EU sentiment. After the Cold War, economic integration seemed to offer greater financial security: but at a cost.

Salinas de Gortari, Bush and Mulroney initialing a draft of NAFTA in 1992. (Source)

Between them, the EU and NAFTA account for around 40% of the world’s wealth. Included in that are eight of the fifteen largest national economies.1 The point being that the buying power of these trade blocs is huge, and has a significant effect on the world economy.

In Europe, however, the move from the European Community (EC) to the European Union (EU) was greeted with scepticism. Loyalties were split, although not along the traditional “left-right” axis. In Britain during the 1970s, many in the Labour Party were opposed to the EC because they feared the impact it would have upon trade with the Commonwealth and the ability for a socialist government to control the economy. Conservative support, led by Margaret Thatcher, was needed to offset the Labour rebels. By the 1990s, Labour was the pro-European party against large parts of the Conservative Party who feared a loss of national sovereignty and the impact of European “bureaucracy” on businesses. Today, the leadership of the major parties in Britain support the EU, but a sizeable number of MPs (particularly Conservative) have major reservations and would support another referendum.

At the time of Maastricht, similar splits were evident. For the international left (usually of a middle-class bent), greater political integration was seen as a progressive move towards a post-national society. For some, this was a reaction against the nationalism of the earlier decades which had seen untold destruction across the continent. For those living in large sub-national regions, the EU offered an opportunity for greater autonomy within the national structure under a trans-national regulatory authority. Places such as Brittany, for example, with a local language and culture, saw great potential in the European project. But for those with a more nationalist bent, free trade and freedom of movement were a threat to local job security. Large companies would find it easier to move to places where labour was cheaper, and migrants would be able to undercut local workers’ wages.

On the neoliberal right, the EU offered opportunities for greater cross-national trade and greater profits. It would allow businesses to be more competitive by providing a tariff-free trade zone, a larger labour pool and the ability to be based in multiple sites across the continent. The traditionalist right, however, were worried about the loss of national autonomy. Especially in those countries with a long history and strong national identity, the idea of ceding control to “Brussels” was anathema. France only narrowly passed a referendum vote on the Maastricht Treaty; the Danes initially rejected it.2

The Lisbon Treaty also split European opinion in 2005. (Source) © Stephane Peray, 2005.

Across the pond, NAFTA did not pursue such close political integration. There is no pan-American currency similar to the Euro. But the removal of tariffs and the promotion of free trade have had similar consequences for smaller businesses. Take agriculture, for instance. One of the major bones of contention within the EU has been the Common Agricultural Policy. This provides subsidies to European farmers as a way of combating the threat of cheap food from other areas and ensuring supplies. Some countries have claimed that it benefits large, “industrial” farms at the expense of traditional growers – and that nation states have lost their ability to provide subsidies to their local producers. It has also encouraged over-production.3

The opening up of Mexico’s markets to American agricultural businesses appears to have had a similar effect. Local farms – without the size or technology of the massive food companies “north of the border” – have suffered. Cheap corn, and in particular cheap corn syrup, have flooded the Mexican market, hitting the agricultural sector and contributing to rising obesity and tooth decay in the country.4

Obligatory picture of a Mexican farmer. (Source)

Despite all this, I am a fan of the European Union. But to support it and truly get the best from it, we need to be aware of its strengths as well as its faults.

Both the EU and NAFTA have made a significant contribution to the growth of Western economies over the past 20 years. After a period of recession in the early 1990s, there was near-continuous economic growth until the crash of 2008. In that time, Europeans have enjoyed freedom of movement within the Union, and the somewhat mixed blessings of a single currency (apart from in Britain and Scandinavia). Half of Britain’s trade is conducted within the EU, and the elections held for the European Parliament cover over half a billion people. The European courts over recent years have protected citizens from state overreach, and reversed judgements in breach of human rights legislation. And while this may not mean much to some, freedom to move around the continent and the availability of EU grants are an essential pillar of higher education and research across the continent.

The Union is far from perfect, but it offers the possibility of genuine transnational co-operation in the interests of citizens rather than national governments or large corporations. But it also contains the machinery to outlaw economic controls that would curb the worst excesses of free markets and protect smaller businesses and workers. I firmly believe that these sorts of organisations will become increasingly important in the modern world, a vital check against vested interests which are more powerful than many single nation states. At the same time, they may also create exactly the conditions that would allow billion-dollar companies to force democratically elected governments at the local, national and supranational level to kowtow to their demands.

In any case – Britain will not find protection in some fanciful New Commonwealth or as the 51st State of America. Countries such as Australia and India are already moving towards their own free trade alliances – while the US is tied into NAFTA. Supranational organisation will become inescapable. The question is, do you want to use its powers to democratically provide a better deal for ordinary people? Or accept that power is inevitably and solely in the hands of the businesses with the biggest wallets?

"Nige". (die Quelle)

“Nige”. (die Quelle)

  1. According to the International Monetary Fund in 2014 – USA (largest), Germany (4th), UK (5th), France (6th), Italy (8th), Canada (11th), Spain (14th) and Mexico (15th). ‘List of countries by GDP (nominal)’, Wikipedia < http://en.wikipedia.org/wiki/List_of_countries_by_GDP_%28nominal%29 > (accessed 4 May 2015).
  2. Michael S. Lewis-Beck and Daniel S. Morey, ‘The French “Petit Oui”: The Maastricht Treaty and the French voting agenda’, Journal of Interdisciplinary History 38(1) (2007), 65-87.
  3. There is a long-running debate about CAP which is impossible to summarise here. However, a decent overview is available on the Wikipedia article on the subject: ‘Common Agricultural Policy’, Wikipedia, Criticism < http://en.wikipedia.org/wiki/Common_Agricultural_Policy#Criticism >.
  4. Anjali Browning, ‘Corn, tomatoes, and a dead dog: Mexican agricultural restructuring after NAFTA and rural responses to declining maize production in Oaxaca, Mexico’, Mexican Studies / Estudios Mexicanos 29(1) (2013), 85-119; Sarah E. Clark, Corinna Hawkes, Sophia M. E. Murphy, Karen A. Hansen-Kuhn and David Wallinger, ‘Exporting obesity: US farm and trade policy and the transformation of the Mexican consumer food environment’, International Journal of Occupational and Environmental Health 18(1) (2012), 53-64.

1992 – The Premier League

04/05/2015

15 August 1992 – London, Ipswich, Liverpool, Southampton, Coventry, Leeds and Sheffield

The sport of Association Football was invented in 1992 when the Premier League first kicked off at 3pm, 15 August 1992 in nine grounds around England. It developed out of primitive proto-football games played in local parks, the British Colonies and, for 104 years, as a professional sideshow in England and Wales.

The video below shows some grainy footage of one of these games being played in front of a small audience of circa 107,000.

While the “Sky Sports Generation” is roundly mocked for its insistence on statistics that start with the phrase “in the Premier League era”, it is worth taking my tongue out of my cheek for a moment to note that it has been almost 23 years since the first Premier League game was played. As a result, we are about as far removed from Brian Deane’s goal against Manchester United as Brian Deane’s goal against Manchester United was from the Apollo XI moon landing.

Brian Deane scored the first Premier League goal. Against Manchester United. For Sheffield United.

Now, as with posts about Wrestlemania or Sol Campbell, why should anyone who doesn’t care about sport give a damn about the Premier League? Well, as with ‘Mania this is a story about globalisation. It is about the breaking down of national boundaries and the great benefits and inherent dangers of the commercialisation of working-class culture.

To set my stall out from the beginning – I am a fan of a non-Premier League club. Walsall. A club whose only real claims to fame are beating Arsenal in the days before the mass production of penicillin and being the ground zero of Paul Merson’s managerial career. If you don’t know who Paul Merson is, don’t bother Googling.

I am, however, very much part of the “Premier League Generation”. I have a Sky Sports subscription. My wife is an Arsenal fan. Rightly or wrongly, I view the Champions League Final as the biggest game of the season. And I still believe David Beckham is a flawless human being.

Flawless. (Source)

When professional football started in this country back in the 1880s, it was viewed with suspicion by the public-school-educated men who governed the sport. The Corinthian – amateur – spirit was supposed to pervade. Football was played for the love of the game, to make men manly and to reflect the masculine Christian ideal of the defender of the British Empire. Professionalism sullied that ideal. So what if the factory workers of the North needed to be compensated for the time they took off work to play the game? So what if the money from paying customers went straight into the hands of businessmen, not the athletes competing?

If that sounds like a stupid argument, bear in mind the Rugby Football Union only approved of formal professionalisation in 1995. The NCAA in the United States still forbids it.

As time went on, the business side of football became more entrenched. The British Football Association briefly championed the rights of professional clubs, before the Football Association caved and supported the paying of players in 1885. Clubs in the North of England organised a league in 1888 so that teams could play regular fixtures and therefore guarantee a certain standard of matches for paying customers. Twelve clubs signed up. Over the years, more clubs were added and split into hierarchical divisions. In 1950 the last expansion occurred, creating a league of 92 professional teams in four divisions. They played over 40 games a year, plus various local, national and eventually international cup competitions.

The first league champions, Preston North End. (Source)

League football, then, has always been a business. It exists because owners wanted more reliable income streams. The laws of the game may have changed significantly since 1888, but this has been the one constant. Until 1961, it was against the rules to pay a player more than £20 a week. (£412 or $615 in today’s money.) Owners have always found ways to overcrowd dilapidated stadiums to get as much in ticket revenue as they could – sometimes with disastrous consequences. And so when a new cash cow came along, it was no surprise that the bigger owners tried to exploit it to their own ends.

Between 1990 and 1992 a series of negotiations was held between the biggest clubs in the country and broadcasters to form a breakaway league. Under Football League rules, revenue from televised football was split between all 92 clubs. Those in Division 1 believed this was unfair; since they provided the best quality football, they argued that they should keep the vast majority of the proceeds.

The 22 teams in that division voted to form their own league – The FA Premier League – and negotiated a separate deal with BSkyB, the biggest pay-TV supplier in the country. The structure of English football was retained, with three sides relegated to the second tier every season, and three new arrivals taking their place. Other than one season in which extra relegations reduced the number of sides to 20, this has remained the case. The total number of clubs in the top four divisions in English football is still 92. So, why the fuss?

As satellite television became increasingly popular in the UK, BSkyB were able to bid even more for Premier League football. Off the back of a reasonable performance by England in the 1990 World Cup, the return of European competition following the Heysel ban, and the safety improvements prompted by the Taylor Report, football became a more popular product to attend and view on screen. Rupert Murdoch’s TV empire was built on its ability to provide exclusive access to Bart Simpson, Rachel Green and Eric Cantona. Football fuelled Sky as much as Sky fuelled football.

The size of the television contracts meant that promotion to the Premier League became increasingly lucrative. Those in the division had a ridiculous competitive edge over those outside. Those in the old Division Two – now the top tier of the Football League – were forced either to face a season of losing regularly in the Premier League (and inevitable relegation), or to overspend in the pursuit of Sky money. Others found themselves unable to cope with the loss of income that followed relegation. The knock-on effects were obvious; even lower league clubs were forced to pay inflated prices for average players just to try and compete in the divisions they had historically been associated with. Football fans are well aware of the difficulties of Crystal Palace, Southampton, Portsmouth, Leicester City, Cardiff City, Luton Town, Wrexham, Wimbledon, York City, Port Vale, Boston United, Leeds United, Chester City, Hereford United, Halifax Town…

Liverpool fans oppose the American owners of the club. “Built by Shanks [Bill Shankly, legendary manager of the club who built the success of the 1970s and 1980s], broke by Yanks [Americans]”.
(Source)

There have been other problems. As football has increased in popularity with the middle classes and foreign tourists, ticket prices have increased well above inflation, so that the working-class men and women who would have attended games 20 years ago can no longer afford to do so. The influx of money has also led to the mass importation of the best talent from around the world, cutting off the traditional opportunities for young British players to play at the highest level of English football. For those who do get the opportunity, wages are so over-inflated it can be difficult to keep their egos grounded. Fans of clubs outside the top six or seven have absolutely no hope going into a new season of winning the league – unless, by some miracle, a multi-billionaire shows interest in their team. But that causes its own unique problems.

Sky’s television deals have distorted the competition between the 92 “league” clubs.
(Source)

On the other hand, the quality of football on the field is undoubtedly better than it was before the Premier League. World television deals have improved the facilities and talent pools of all the top clubs across Western Europe (soccer’s traditional powerhouse). Players are stronger, fitter and spend more time honing their technique than ever. Tactically, the teams are better drilled. Sometimes that results in turgid chess battles that are less interesting than watching actual chess. But it also gives us the brilliance of Thierry Henry, Cristiano Ronaldo and Luis Suarez. Stadiums across the country are now safe and comfortable. It is easier than ever to follow your team if you cannot for some reason get to the ground. And for people like me and my wife, who grew up in rural areas without access to a decent professional team, it means that you can consume and follow the sport in a way that would have been difficult in the past.

Of course, as with all change narratives, some will praise the march of progress; others will mourn for a culture lost. But this assumes that footballing culture in Britain was always static. Football had already evolved from a local activity to a national and increasingly ‘Europeanized’ culture by 1990.1 Wages had increased since the maximum wage was scrapped in 1961. And while he was not a multi-millionaire in the same way the average Premier League player is now, it would be a stretch to suggest that someone like Kevin Keegan (active 1968-85) was some sort of romantic local working-class boy who remained part of the local community. Maybe he was more “one of the people” than Wayne Rooney. But it is a crass over-simplification to assume that the game in 1991 was somehow pure and Bohemian.

The rabid opposition expressed by some to what is considered the “over-commercialisation” of football also reflects a wider concern about the encroaching power of globalisation and the destruction of local identities. The “39th Game” proposal, anathema to many, was offered as an example of how modern club owners place profit above the integrity of the sport.2 This would have broken the traditional structure of the league, in which each club plays every other team twice (once at their own stadium and once away). Thus, every team has the same schedule, and can therefore prove over the course of a year which is the strongest. But the opposition also reflected a grave concern that, by playing English league football in the Middle East, clubs traditionally rooted in a local community would be reduced to businesses that happened to be based in England. Fans from the surrounding area would be far less important than anyone willing and able to buy a ticket. A culture was being lost.

Even when English clubs play each other, the prize at stake is often international in nature. Whether it be here, in a UEFA Champions League match; or whether it be as part of the process of finishing in the “top 4” so that the club can qualify for the mega-riches of European competition.
(Source)

The trends seen in the Premier League are not unique to England. Spain’s league has also become less competitive in the wake of the influx of foreign stars, despite possessing arguably the best two teams in world football over the past five years. The increased popularity of the Champions League – an annual competition for the top teams from each of Europe’s leagues – has further widened the financial gap between the top and bottom of each country. “National” football makes increasingly less sense in a global market place, let alone the idea that a club should reflect and be integrated with the city in which it happens to play.

European football as a whole, then, reflects a world in which local and national assets are owned by increasingly powerful and distant entities, and bear little relation to their historical origins. Much like a car factory can simply leave an area for one with cheaper labour, causing local unemployment in its wake, football clubs can price out their traditional fans in the knowledge that there will be enough TV subscribers and tourists willing and able to fork out the price of viewing. Those praising and demonising the Premier League do so within this historical context. One in which quality undoubtedly improves, but the local character and culture are discarded in favour of uniformity and the interests of distant powers.

  1. Mark Turner, ‘From local heroism to global celebrity stardom: a critical reflection of the social cultural and political changes in British football culture from the 1950s to the formation of the premier league’, Soccer and Society, 15(5) (2014), 751-60.
  2. Joel Rookwood and Nathan Chan, ‘The 39th game: fan responses to the Premier League’s proposal to globalize the English game’, Soccer and Society, 12(6) (2011), 897-913.

1991 – The Croatian War of Independence

27/04/2015

1 March 1991 – Pakrac

An old professor of mine during my third year as an undergraduate ran a course on Eastern European history. I enjoyed writing my dissertation on Czechoslovakia and Poland during the Second World War, but his expertise ran much wider than that. We got into a conversation about how the fall of Communism was a good thing because it meant that UEFA got a bunch of great new football teams such as Croatia, Serbia and Ukraine. He grinned, and said that one of his most embarrassing “predictions” in the late 1980s was that Yugoslavia would be one of the only communist states to remain intact. “I said, ‘Oh, Yugoslavia will have to remain together. The bloodshed would be horrific if any of the ethnic groups tried to secede’.” He paused. “I suppose I was half right.”

A disclaimer, which I don’t normally put in these articles: I do not claim to be an expert on Yugoslavia or Balkan politics. My interest as a historian is in the way in which stories about the war have been interpreted and filtered into the popular imagination. And, in particular, how they have been received in the United Kingdom. I don’t care who was “right” and who was “wrong”. And while I sympathise with those affected, it is not my intention here to tell “the real history” of Croatia. I hope the footnotes to this piece offer at least some starting point for those wanting to do their own research.

As tensions increased following the fall of the Berlin Wall, a football riot broke out between Dinamo Zagreb and Red Star Belgrade (respectively Croatia’s and Serbia’s top teams). For some, it marks the informal start of the war of independence.
Footage has been uploaded to YouTube.

The various civil wars in Yugoslavia in the 1990s were horrific. They made “ethnic cleansing” a widely recognised term, a euphemism to describe the mass murder of various ethnic groups. Even today, the independence of Kosovo is disputed, while the tensions between Serbs and other nations have often been expressed in violence at international sporting events.1 The violence in the former Yugoslavia was the first war I remember watching on television, as a child just becoming aware of international events and the news. And its effects are still being felt.

Like my professor, many cite ethnic tension as the root cause of the violence. And in many ways it was. But the source of that tension was historical – not in the sense that the histories of these groups led to inevitable conflict, but that ethnic histories could be manipulated by political leaders to mobilise national groups to war.

Croatia is the land of Marco Polo and the neck tie. (The country is locally known as Hrvatska – which was corrupted in French to “cravat”.)2 In the Middle Ages it was an independent kingdom, but over the Early Modern period it spent time under the rule of Venice, the Hapsburg Empire and the Ottoman Empire. When the Austro-Hungarian state was dismantled following defeat in the First World War (prompted by the assassination of Franz Ferdinand in neighbouring Bosnia), Croatia, along with a number of other Balkan states, formed a new Kingdom of Yugoslavia. Then, after Nazi occupation during the Second World War, Yugoslavia emerged as an independent Communist state.

Unlike many of the other countries in the region, Yugoslavia was not a puppet of Moscow. It was fairly liberal, allowing much more foreign travel and educational opportunities than its neighbours. Indeed, on the eve of the fall of the Berlin Wall, it was expected that the state would become the first Eastern member of the European Economic Community.3 Instead, growing tensions within the Republic following the death of Tito boiled over into bloody violence. The question ever since for political scientists and historians has been – why?

At the time, and in the popular narrative, ethnic tensions between the various groups have been blamed. For the ex-Yugoslavia was by no means ethnically uniform. Slovenians, Croats, Bosniaks, Montenegrins, Albanians, Macedonians and Serbs all lived within the union. The capital, Belgrade, was within Serbia, but there were various constituent republics based around key towns such as Zagreb (Croatia) and Sarajevo (Bosnia). This reflected the country’s origins as an alliance between various ethnic groups within the Balkans. Aside from these national/ethnic differences, there was also a mix of Muslim, Orthodox and Catholic religions.4 Certainly the mass murders in Kosovo and Bosnia were done in the name of “ethnic cleansing”.5 But regardless of how these events were spun, or the way ethnic identity was harnessed to provoke political mass movements, the historical picture is not quite so clear.

The former Yugoslavia and successor states as of 2008. (Source)

Indeed, Dejan Djokic has noted how one of the more interesting arguments during the Slobodan Milošević trial centred on the Croatian Party of Right and the historical figure Vuk Stefanović Karadžić – a nineteenth-century Serb linguistic reformer. In a debate between the ex-leaders of Serbia (Milošević) and Croatia (Stjepan Mesic), the battle was over the true interpretation of historical events. For the Serbs, the Croatians took their political heritage from proto-fascists; for the Croats, they were part of a line of freedom fighters, first against the Austrians and now against the Serbs.6 Others have published studies which show how political discourse at crucial points in Yugoslavian history – 1984-85, 1989-90, 1996 and 2003 – bears the marks of manipulation by political elites. What mattered most, argue Sekulić, Massey and Hodson, was that ethnic identities could be mapped onto political and economic movements. This allowed political and historical narratives to become distorted.7 These histories became necessarily adversarial, as Raymond and Bajic-Raymond discuss with regard to Franjo Tudjman (Croat) and Milošević.8

In the end, national identities are built on history. Key events and public figures are interpreted and used as symbols of good or evil, and held up as role models for particular interests. They proclaim common interests, a shared heritage which is worth defending and building upon. And it is often said that history cannot exist without the nation.

No history is value free, nor can it ever hope to be. But when used as a call to war, it can be tremendously powerful. There is a danger that we lionise the good – “Churchillian spirit” or “the spirit of Dunkirk” destroying the Nazis – and forget that it is all part of the same process that allowed Hitler’s “Third Reich” to be placed in a continuum of great German states. This is why we should always be sceptical of those who try to ‘rehabilitate’ the British Empire9 (as well as those who use it for their own nefarious ends).10

All history is distorted. This is not to say that all history is bad, but simply to make us keenly aware that what we think of as “the truth” is often far from objective. We shouldn’t stop building and retelling national myths necessarily. But we should always be questioning and presenting alternatives. An important lesson to learn as we move forward with the 30-for-30.

  1. See: Ellen Connolly, ‘Balkan fans riot at Australian Open tennis’, The Guardian, 24 January 2009 < http://www.theguardian.com/world/2009/jan/24/australian-open-riot > (accessed 25 March 2015); ‘Serbia and Albania game abandoned after drone invasion sparks brawl’, CNN, 15 October 2014 7:27pm EDT < http://edition.cnn.com/2014/10/14/sport/football/serbia-albania-game-abandoned/ > (accessed 25 March 2015).
  2. ‘Cravat’, Wikipedia < http://en.wikipedia.org/wiki/Cravat > (accessed 25 March 2015).
  3. V. P. Gagnon Jr., ‘Yugoslavia in 1989 and after’, Nationalities Papers 38(1) (2010), 23-39.
  4. See: Wendy Bracewell, ‘The end of Yugoslavia and new national histories’, European History Quarterly 29(1), 149-56.
  5. See particularly: ‘Srebrenica Massacre’, Wikipedia < http://en.wikipedia.org/wiki/Srebrenica_massacre > (accessed 25 March 2015).
  6. Dejan Djokic, ‘Coming to terms with the past: Former Yugoslavia’, History Today 54(6) (2004).
  7. Duško Sekulić, Garth Massey, Randy Hodson, ‘Ethnic intolerance and ethnic conflict in the dissolution of Yugoslavia’, Ethnic & Racial Studies 29(5) (2006), 797-827.
  8. G.G. Raymond, S. Bajic-Raymond, ‘Memory and history: The discourse of nation-building in the former Yugoslavia’, Patterns of Prejudice 31(1) (1997), 21-30.
  9. Seumas Milne, ‘This attempt to rehabilitate empire is a recipe for conflict’, The Guardian, 10 June 2010 8:01 BST < http://www.theguardian.com/commentisfree/2010/jun/10/british-empire-michael-gove-history-teaching > (accessed 25 March 2015).
  10. Mark Tran, ‘Mugabe denounces Britain as “thieving colonialists”’, The Guardian, 18 April 2008 16:09 BST < http://www.theguardian.com/world/2008/apr/18/zimbabwe.independence > (accessed 25 March 2015).

1990 – The Release of Nelson Mandela

20/04/2015

11 February 1990 – Paarl

Writing about Mandela is incredibly difficult. Partly, it’s because so much has been said already. He is almost universally held up as a force for good, a freedom fighter who managed to win a democratic election and defeat Apartheid. I am nowhere near an expert on South African history, and so cannot give the arguments about “terrorism” the nuance and context they deserve. Writing about how the white establishment in Africa and Europe treated him abominably seems like shooting fish in a barrel. Yet that “Long Walk to Freedom” is iconic. Coming so soon after the Berlin Wall, it seemed to herald the end of the repressive political regimes of the twentieth century. I could not choose anything else for 1990.

A recurring theme in this series has been the importance of “myth”. The Berlin Wall was more than a wall. Chernobyl was more than a health and safety snafu. And it stands to reason that Mandela was more than a politician. He represents the death of Apartheid and the birth of modern1 South Africa. But hagiography – the creation of saints – is important. It’s not simply a case of propaganda (though this is part of it). Nor is it entirely cynical or manipulative. It is part of a crucial process of owning our own histories, giving us ideal types against which we can measure our present and guide our futures. In the end, it doesn’t matter if Isaac Newton was a paragon of rationality; if Margaret Thatcher was a great leader; if Gandhi was tolerant of all; if Hitler was pure evil; if Pele was the greatest footballer of all time. What matters is that these historical legends exist. At least in a political sense.

Where history can help is by contextualising both the “real” lives of the Florence Nightingales and Wolfgang Mozarts and the subsequent use of their legends by their apostles and enemies. For instance – did the 1994 election allow Mandela to become the symbol of ‘democratic transition… and reconciliation’?2 Is that more important than Mandela’s “real” life? Or are the two inextricable?

Which is the more important depends on the questions one asks and for what purpose. If we want to understand how a movement could shift from violence to democratic tactics, then the role of Mandela and the political context of 1960s versus 1980s South Africa become important. If, however, we want to understand the political discourse of the 1990s and twenty-first century, then the image of Mandela may offer better explanations.

Indeed, Raymond Suttner has argued that the focus on Mandela as an icon has made it very difficult to write a narrative history about his life. Many spend their energies explaining the meaning of Mandela ‘rather than purely narrating’. Because, clearly, Mandela was never a uniform entity. He and South African politics changed a lot over his lifetime.3 For Western audiences, ‘popular narratives of race and redemption’ are perhaps more palatable than a full consideration of the reasons behind racial governance in sub-Saharan Africa.4 And whatever you do – don’t mention that he was a communist!5

Francois Pienaar receiving the Rugby World Cup from Nelson Mandela. (Source)

Being geographically and politically distant also meant that Mandela and South Africa were refracted through the prism of television. His release from prison was in many ways the first of a series of “episodes”, which included meetings with world leaders and – most notably – the Rugby and FIFA World Cups of 1995 and 2010.6 A ‘safe’ image of Mandela as ‘international statesman’ became vital to the reintegration of South Africa into the post-cold-war international community. Philippe de Brabanter has argued that Time made a conscious effort to avoid references to communism and violence when it chose excerpts of Mandela’s autobiography to publish.7 At the same time, it was clear following his death in 2013 that many right-wingers who had opposed him in the past were unable to express anything but admiration for a man who had come to symbolise liberty, democracy and a post-racial world.8

In the end, this article hasn’t really been about Mandela. What it shows is that myths and legends are an integral part of history. They should not be dismissed simply as distortions. They are driving forces in their own right. Historians of different types ask different questions at different times. Some will be interested in “real” lives and events; others will be more interested in how those events have been refracted through the prism of history. What makes a good historian is the ability to be able to separate these issues and contextualise them. This will become ever more noticeable as this series progresses. Mandela was released 25 years ago. We do at least have the benefit of some historical distance when we come to assess him. This will not be the case with some of the later articles. So, with that disclaimer/excuse for lazy writing – let us proceed with the 30-for-30!

  1. Meaning “present-day” rather than “modern”.
  2. Xolela Mangcu, ‘Nelson Mandela in the history of African modernity – Towards a reappraisal of existing approaches’, Bulletin of the National Library of South Africa, 68(2) (2014), 187-97.
  3. Raymond Suttner, ‘(Mis)Understanding Nelson Mandela’, African Historical Review, 39(2) (2007), 107-30.
  4. Maryann Martin, ‘Throwing off the yoke to carry the man: Deconstructing the myth of Nelson Mandela’, Topia, 12 (2004), 41-62.
  5. Philippe de Brabanter, ‘”Long Walk to Freedom” or how “Time Magazine” manipulates Nelson Mandela into unwittingly forging his own image’, Revue Belge de Philologie et d’Histoire, 73(3) (1995), 725-39.
  6. Martha Evans, ‘Mandela and the televised birth of the rainbow nation’, National Identities, 12(3) (2010), 309-26.
  7. Brabanter, ‘”Long Walk to Freedom”‘.
  8. Julian Borger, ‘The Conservative party’s uncomfortable relationship with Nelson Mandela’, The Guardian, 6 December 2013, 15:03 GMT < http://www.theguardian.com/politics/2013/dec/06/conservative-party-uncomfortable-nelson-mandela > (accessed 16 March 2015); ‘Twitter fact-check: David Cameron didn’t want to “Hang Nelson Mandela” in the 80s’, New Statesman, 6 December 2013, 11:25 GMT < http://www.newstatesman.com/media-mole/2013/12/twitter-fact-check-david-cameron-didnt-want-hang-nelson-mandela-80s > (accessed 16 March 2015).

1989 – The Fall of the Berlin Wall

13/04/2015

9 November 1989 – Berlin

Mauerfall1 is one of two iconic events in my lifetime that might be said to mark the “end” of the twentieth century world.2 I would say it was the most important.

The world before 1989 was a different place. That sounds trite, but it’s true. I’m too young to remember the Cold War, but I grew up alongside people who did. My parents were alive when the wall came up, and they saw it fall again. My grandparents fought in the War that helped establish the capitalist and communist “zones” within Europe. Those of my colleagues teaching undergraduates at the moment will be explaining a geopolitics very different to today. None of this should be underestimated – though I will explain later why the Berlin Wall is perhaps more representative of a number of cultural shifts since the twentieth century than the cataclysmic event that changed everything.

Much like with Chernobyl, the iconic status of the Wall – what its construction, presence and destruction symbolised – was as important as the edifice itself.

I’ve already said that I felt that 9/11/89 was the most important in my lifetime. But that has to be seen as part of the political narrative of the past 25 years. Democracy beat communism, Europe became united, and we now live in a globalised world. But – obviously – the Eastern Bloc did not disintegrate overnight. It would not be until 1991 that the USSR declared its dissolution; the same year Yugoslavia broke into a bloody civil war. And if, indeed, this is a story about how the capitalist West “won”, it wasn’t until 1999 that the Czech Republic joined NATO, and 2004 when it joined the European Union. The Wall was therefore a symbol of change, and reflected political and social shifts on both sides of the Iron Curtain. It came to represent the literal dismantling of a system of government that had lasted since the Second World War. Indeed, for a country like Poland, one could argue that this was the end of the Second World War.

Why was the wall built? Well, as with all things in history, the answer is long and rambling.3

In the run-up to VE Day,4 the American-led armies to the West and the Soviet-led armies to the East were in a race to Berlin – capital of Prussia since the eighteenth century and of a united Germany since 1871. It was the heart of modern Germany. It represented not just the capital of the Nazi Reich, but the very concept of the German nation state. Sure, there was a lot of industrial technology to be looted (such as rocketry5 and mass production techniques), but there was also a sense that the mistakes of the First World War needed to be righted. In short, Germany had to be thoroughly and unequivocally beaten. And marching through the centre of Berlin was the most explicit way of achieving this.

Lloyd-George, Clemenceau and Wilson, the leaders of the UK, France and USA who negotiated the Treaty of Versailles. (Source)

Part of Hitler’s political appeal was based on the idea that the German Empire had been sold out by socialists and cowards in 1918. By signing the Armistice rather than fighting to the bitter end, the German state had been subjected to foreign interference, crippling reparation payments and had its territories in Africa and central Europe seized. The Austro-Hungarian Empire suffered similar dismantling. Many in Vienna and Berlin believed that Bohemia, Poland and Hungary were legitimately “German” lands, taken away by the vengeful French and their accomplices.6 With the Great Depression and hyperinflation in the inter-war Weimar Republic, the Nazi party gained widespread support on the back of promising to restore the Empire and right the injustices of the 1920s. In order to crush fascism in the German-speaking world, the Allies believed that total victory was paramount.

This left Germany cleft in half. The East was occupied by the Soviet Union, and the West by a mix of French, British and American troops. Following the Potsdam Conference, Germany was split into four zones of control. The three western zones eventually became capitalist West Germany, while the eastern zone effectively became a puppet of Moscow. Berlin, wholly in the Soviet zone, was also split into four parts – effectively leaving a capitalist island within East Germany.

Map outlining the zones of control in Germany and Austria following the War. Austria eventually became an independent, democratic country in 1955 after the Soviet Union withdrew. Germany would remain split, however, until 1990. (Berlin is in grid reference C3.) (Source)

Being an exclave during the early Cold War was not particularly helpful. Since Moscow could effectively block all land vehicles entering West Berlin by patrolling the border between East and West Germany, food and supplies were difficult to obtain. A blockade began in 1948 in response to the Western powers’ introduction of a new Deutschmark. For over a year, the only way to keep West Berlin running was to airlift supplies in. The effort proved successful, and eventually the Soviets backed down. But it showed that the tension between the two sides would be a constant threat.7

President Kennedy visiting the Berlin Wall in 1963. Ich bin ein Berliner… (Source)

By 1961, East Germany was concerned at the number of defections to West Berlin. Since it was relatively easy to get across the border and never return, the Democratic Republic8 was losing skilled men and women. Around 3.5 million had managed to leave the country before the wall was erected. Sold as an “anti-fascist” protection against insurgency from the West, the wall effectively ended East-to-West migration, and sealed West Berlin off from the rest of the Russian-occupied zone. Or, perhaps more accurately, it sealed East Germany off from the Western world.

The Communist Bloc was in crisis in the late 1980s. Things came to a head when Hungary dismantled its armed border with Austria in August 1989, allowing “tourists” in the East a quick means of escape to the West. East Germany attempted to impose a travel ban, hoping to stop the flow of migrants. Resentment and protest grew and, following the resignation of Erich Honecker in October, events reached boiling point. On the night of 9 November, protesters began to hack away at the wall, effectively causing a revolution.

Photograph taken by astronaut Chris Hadfield from the International Space Station. The angle is skewed, but you can clearly see the bright white street lamps of Western Berlin towards the top, and the dimmer, yellow lights in Eastern Berlin towards the bottom. This is caused by the lower-quality lighting used in the East – in the West, street lamps were updated, indicating a higher level of investment in infrastructure. (Source)

The Berlin Wall has always been a symbol as much as a physical barrier. While many died trying to get across it, it served an important political purpose for both sides of the political divide. This was as true while it stood as it was when it fell.9 Today, the wall has come to represent a memorial to the past – not to celebrate the division, but as a lesson that ought not to be forgotten. The site of the wall has become an important place for tourists, not just from abroad but from within the German Republic. As Hope Harrison argues, the wall has in some ways been ‘resurrected’ as an important artefact of German history.10 This is a common theme in European history – much like Auschwitz was not razed to the ground, these buildings serve as a stark and important reminder of what people can do when their power remains unchecked.

For those outside Germany, it represented the Rubicon for the anti-communist revolution. But it also reflected a number of other changes that were stirring at the time. In the West, almost-universal ownership of televisions, telephones and automobiles had created a much smaller world than the one of the 1940s. Affordable air travel, the growing reach of multinational corporations and the increasing importance of the European Economic Community were creating a globalised world. 1989 marked the point at which the Communist East became part of this world too, no longer in self-imposed exile on the other side of the Iron Curtain. It was a victory for capitalism, democracy and freedom. Whether or not this “really” happened, the resulting invasion of the East by McDonalds, Pepsi, Nike and David Hasselhoff created a very different geopolitical landscape.

As the articles following this will show, the next 25 years were very different to the previous 25 in so many ways. 1989 marked the beginning of the end of the Cold War; or perhaps the end of the Second World War; or even the end of the twentieth century itself, stretching back to Versailles. Everything else in this series is informed by these events. This is why it had to be included in the 30-for-30.

  1. The German word for the Fall of the Berlin Wall. Replicated here to bolster my hipster credentials. “Europa geht durch mich“, etc.
  2. The “short” twentieth century being roughly World War I (c. 1914) to the fall of the Eastern Bloc (c. 1989-1991). Longer versions (usually Americo-centric) position the “end” with the terrorist attacks on the World Trade Center in New York in September 2001.
  3. (c) Grampa Simpson.
  4. 8 May 1945. “Victory in Europe Day”, known in some countries as Liberation Day, or similar. See: ‘Victory in Europe Day’, Wikipedia < http://en.wikipedia.org/wiki/Victory_in_Europe_Day > (accessed 9 March 2015).
  5. The popular narrative is that this kick started the space exploration programmes in the US and USSR. See: ‘Soviet space program’, Wikipedia < http://en.wikipedia.org/wiki/Soviet_space_program > (accessed 9 March 2015).
  6. Clemenceau is traditionally seen as the more vengeful of the three leaders who negotiated the Treaty of Versailles. Having been stung by the Franco-Prussian War of 1870-71, he wanted revenge on Germany without too much care for the social or political consequences. Lloyd-George was sympathetic, but willing to go along with it, while Wilson was horrified. Thus, another lesson of the First World War was not just that Germany had to be conquered – the country needed to be rebuilt, with an emphasis on democracy and co-operation. German Imperialism had to be thoroughly destroyed. See: ‘Treaty of Versailles’, Wikipedia < http://en.wikipedia.org/wiki/Treaty_of_Versailles > (accessed 9 March 2015).
  7. ‘Berlin Blockade’, Wikipedia < http://en.wikipedia.org/wiki/Berlin_Blockade > (accessed 9 March 2015).
  8. East Germany. I’m just getting bored of typing the word “East”…
  9. Pertti Ahonen, ‘The Berlin Wall and the battle for legitimacy in divided Germany’, German Politics and Society, 29(2) (2011), 40-56.
  10. Hope M. Harrison, ‘The Berlin Wall and its resurrection as a site of memory’, German Politics and Society, 29(2) (2011), 78-106.

1988 – The Global Polio Eradication Initiative

06/04/2015

13 May 1988 – Geneva

Poliomyelitis is, sadly, still with us. But since 1988, global action on the disease has reduced the number of cases from an estimated 350,000 to just 445 in 2013.1 This is pretty remarkable for a disease that only reached epidemic proportions in the twentieth century, and with a vaccine that was developed as recently as the 1950s. Given my own research interests in the history of polio and vaccination, the Global Polio Eradication Initiative (GPEI) is the fourth entry in the 30-for-30 series.

The Forty-first World Health Assembly… declares the commitment of WHO [the World Health Organization] to the global eradication of poliomyelitis by the year 2000.

The GPEI may have missed this ambitious goal, but it was not for want of trying. The number of confirmed cases of poliomyelitis dropped significantly over the 1990s.2 The last bastions of the disease have, however, proved difficult to break down due to a combination of economic, political and medical factors.

Ancient Egyptian depiction of someone affected by polio – although, as historians of medicine, we need to be careful about diagnosing people in the past! (Source)

Polio is, in many ways, a disease whose importance has risen alongside modern biomedicine. Though conditions which might today be thought of as “polio” were recorded in Ancient Egypt, proper classification of the disease’s causes, effects and treatment is definitely modern in origin. It was “discovered” through the work of Jakob Heine and Karl Medin over the mid-to-late nineteenth century. The virus responsible was only isolated in 1909.3

The first outbreaks occurred in Western Europe and the United States at the dawn of the twentieth century. Unlike infectious diseases such as tuberculosis or cholera, which were largely attributed to poverty and poor sanitation, polio seemed to affect everyone equally. There was also no cure, and so wealthier patients were not protected by their usual isolation from the vectors of disease or by their ability to pay for the best treatment. Indeed, there was strong evidence that the greater the level of sanitation in a region, the more likely it was to succumb to an epidemic. The first outbreak of epidemic proportions hit Britain in 1947, and by the early 1950s the Western World had mobilised its efforts to find some sort of medical protection against the poliovirus.

This may seem a very cynical interpretation of the history of polio, but there is no doubt that the battle against the disease was given a massive boost by powerful patrons with a personal interest. Franklin D. Roosevelt, President of the United States and polio survivor, founded the National Foundation for Infantile Paralysis. “The March of Dimes”, a fundraising campaign, attracted support from across the United States, fuelled by “celebrity endorsements” from the likes of Lucille Ball and Louis Armstrong. Much of this money focused on medical research into the causes and prevention of polio.

That came to fruition in 1955 when Jonas Salk announced successful trials of a new injected, inactivated poliomyelitis vaccine (IPV).4 It was quickly rolled out across the United States, and similar versions were used around the world. His rival, Albert Sabin, produced an oral polio vaccine (OPV) which was eventually adopted as a safer and more effective method. For children of my age, the foul-tasting drops on a jelly baby formed our memories of going to see the doctor.

A German doctor administering the oral polio vaccine. (Source)

The story of Salk vs Sabin is fascinating in its own right, and something I may cover at a later date.5 For now, however, all we need to know is that despite the overwhelming success of various vaccination programmes (polio has been nigh-on eradicated in the West since the late 1970s), scientific consensus is never enough to convince people to accept vaccination into their lives.

The Cutter Incident of 1955, for example, almost killed IPV before it had even begun. The Cutter Laboratories in California produced a batch of Salk’s vaccine with live poliovirus, not the inactivated version required to avoid infecting the patient. Hundreds of children caught the disease directly from the vaccine. Salk vehemently protested that the problem was with the manufacturer, not the design, and he was proven right. But the planned vaccination programme in Britain was delayed while the Medical Research Council and Ministry of Health debated how to proceed. The British decided to make their own, despite being completely incapable of producing enough of the vaccine to inoculate the number of children who had been signed up for the programme. When they did cave to demand and began to import Salk’s vaccine, they gave parents the option to opt out. The MRC advised:

This country has an unblemished record and it is strongly felt in some medical quarters that it would be deplorable to run any risk of an accident such as might jeopardise public confidence, not only in the particular vaccine, but in preventive inoculation and vaccination in general.6

Cutter had shown that vaccination was not without its risks – and until the MRC was certain about its safety, the known risk of wild poliovirus was preferred to the unknown risk of Salk’s new invention.

Of course, by the 1980s manufacturing techniques had improved dramatically. New strains of the vaccine had been produced, and the new OPV was not only safer but easier to administer. With WHO backing, many countries adopted polio vaccination, and rates fell dramatically. Why, then, did progress stall in the early 2000s?

There were practical concerns, to be sure. Since there is an unhelpful relationship between sanitation and polio, vaccination is one of the very few public health measures that can have a lasting impact on the disease. (For instance, hygiene and quarantine were used in conjunction with smallpox vaccines to eradicate that particular disease.) Very remote rural regions were hard to access. As were war zones. Refrigeration is also a problem in places without electricity in sub-tropical climates. But there was a growing political opposition to vaccination too. Some of this was due to a post-colonial backlash against white doctors “experimenting” with black bodies. Other legitimate concerns from locals were fanned by groups with a political interest in driving out foreign observation. Attacks on aid workers in Pakistan and Afghanistan, for example, mean that this area is one of the very few where polio remains endemic.

It didn’t help that the Central Intelligence Agency was found to be using vaccination programmes to spy on remote populations.7 In the same way, Cutter and incidents like it have always been used by anti-vaccination campaigners who argue – against all epidemiological evidence – that vaccination is unsafe compared to a perceived “natural immunity”.8 Despite these setbacks, however, India was declared polio-free in 2014. The number of recorded cases has fallen from around 3,000 in the year 2000 to just 445 in 2013. The year for global eradication keeps being pushed back, but I hope that by the time I’m 40 the job will be done.

  1. See various GPEI reports, and the data collated on Wikipedia, ‘Poliomyelitis eradication’ < http://en.wikipedia.org/wiki/Poliomyelitis_eradication > (accessed 15 January 2015); World Health Organization, ‘Polio Case Count’ < https://extranet.who.int/polis/public/CaseCount.aspx > (accessed 15 January 2015)
  2. Ibid.
  3. See Gareth Williams, Paralysed with Fear: The Story of Polio (London: Palgrave Macmillan, 2013).
  4. Thomas Francis, Evaluation of the 1954 Field Trial of Poliomyelitis Vaccine: Final Report (Ann Arbor: University of Michigan, 1957).
  5. Williams, Paralysed with Fear.
  6. The National Archives: FD 23/1058. Sir H Himsworth to Lord Alec Home, ‘Vaccination Against Poliomyelitis. Considerations relating to the possible use of American Salk vaccine in this country’, 25 July 1957.
  7. Saeed Shah, ‘CIA tactics to trap Bin Laden linked with polio crisis, say aid groups’ in The Guardian, 2 March 2012 16:57 GMT < http://www.theguardian.com/world/2012/mar/02/aid-groups-cia-osama-bin-laden-polio-crisis > (accessed 15 January 2015).
  8. Macrobiotic Guide, ‘Family health’ < http://www.macrobiotics.co.uk/familyhealth/childsimmunity.htm > (accessed 15 January 2015).

1987 – Star Trek: The Next Generation

30/03/2015

29 May 1987 – Los Angeles

I grew up with Star Trek. Between 1987 and 2005, a new series1 of the franchise aired on US television, and so I watched quite a bit of it during my primary and secondary school education. Despite its status as a science fiction classic (and the general reputation of science fiction fans), it seeped into the popular consciousness in the way that few series ever have. And while it’s a massive stretch to argue that Star Trek “invented” much of the technology which would go on to be commonplace in the twenty-first century, it probably isn’t that far-fetched to say that many of the technicians and inventors at Google, Apple, Microsoft et al were also glued to their TV screens every week watching Picard, Sisko, Janeway and Archer fight the good fight against the evils of the galaxy.

The Next Generation was announced on 10 October 1986.2 Filming began on a pilot entitled Encounter at Farpoint in May 1987, and the series finally aired in September.3 But, of course, this was not the first crew to boldly go where no crew had gone before.4

The original Star Trek series was created by Gene Roddenberry, and was first broadcast in 1966. It made stars of William Shatner and Leonard Nimoy, while the crew became household names. ‘Beam me up, Scotty’, ‘He’s dead, Jim’ and ‘Live long and prosper’ are phrases as well known as ‘My kingdom for a horse’. And, of course, there was that interracial kiss that broke television taboos during the height of the civil rights movement.

Roddenberry’s vision was of a united Earth, in which petty squabbles over money, religion and political ideology had long-since been consigned to history. Following the Third World War, humans invented a faster-than-light engine that allowed them to explore the stars. First contact with the emotionless Vulcans brought homo sapiens sapiens into a galactic community. Yes, there were dangers from the militaristic Klingons or the duplicitous Romulans, but humanity would face them together. With a multi-ethnic crew, women in senior roles and story lines that tended to explore deep philosophical issues, Star Trek was in many ways the quintessential expression of 1960s optimism. Some day, science and reason would lead humanity to total enlightenment. And we could explore the beauty of space together.

This is what I loved about the Star Trek universe too. Ignoring the deep, tricky questions of “so, what do we do without money?” and “if Earth is united, why is there a disproportionate number of white male Americans in senior positions?”,5 the show did explore some pretty deep problems. Deanna Troi, the Enterprise’s counselor in TNG, was confronted with physical and psychic rape. Jean-Luc Picard, the captain, was assimilated into the Borg collective, and had to deal with severe post-traumatic stress. William Riker got his end away. A lot. And Wesley Crusher had to deal with the pressures and social awkwardness of being a child prodigy.

Since I grew up alongside the “reboot” of Star Trek, it’s been an important part of my cultural heritage. And while the show has obviously reflected the sensibilities of American audiences over the past fifty years, it has always challenged cultural norms. It’s hard to imagine any other show which has put a disabled black man in a position of authority and made him central to the story arc.

Clockwise from top left: T’Pol; Seven of Nine (being held hostage); Dax; Troi. (Source)

In what has to be one of the most fun literature searches I’ve ever done, I found that academics have produced an awful lot of work on the Star Trek universe. In particular, queer studies has found elements of the show which both reflect and challenge Western concepts of gender, race, sexuality and the body. Take, for example, Seven of Nine (Voyager) and T’Pol (Enterprise). Hey, perhaps even throw Dax (Deep Space Nine) and Troi (TNG) in there too. All four women were cast (in part) due to their sexual attractiveness. They were often seen in tight-fitting or revealing garb, and clearly served a… how to put this… aesthetic function.

In each case, however, the women were a vital part of the crew. Seven of Nine possessed knowledge of the region of space in which Voyager was stranded; T’Pol was first officer, sent from the Vulcan high command to aid the inexperienced human crew; Dax had lived several past lives, making her an experienced science officer and confidant of the captain; and Troi was the ship’s counselor, and as an empath was invaluable during diplomatic missions. As Ulrich Scheck has argued, T’Pol and Seven use sarcasm and wit to go beyond their ‘stereotypical body image’.6 And the friendships between Crusher and Troi, Dax and Kira, and Janeway and Torres would easily satisfy the Bechdel test.7

One criticism has been the lack of openly gay characters in Star Trek, although sexuality was often played with during the show’s run.8 Beverly Crusher, the doctor in TNG, falls in love with a Trill – a species whose memories are held within a symbiotic being that lives inside a humanoid host. When the (male) host dies, the symbiont is implanted into a woman. The two remain emotionally in love, but Crusher finds it impossible to reconcile this with the new female body.9 A similar narrative is told with Jadzia Dax, when an ex-lover of Curzon (her previous host) is dragged into a court case. Here the love clearly remains, but is not acted upon.10 One can add to this a number of races whose reproductive cycle does not involve traditional gender roles. Riker, for example, falls for a member of an androgynous species in whose society sexual behaviour is considered a mental illness.11 In another episode, a group of isolated colonists who have developed a form of reproduction involving cloning have to stomach the unpalatable idea that sexual intercourse with another colony will help revitalise their damaged and shallow gene pool.12

While I could write about Star Trek indefinitely, the point of these articles is to show historically important events during my lifetime. Culturally – for me – this is one of the biggest. What I also find so interesting is that I can re-watch the episodes in light of events that have happened since. The terrorism narratives in Deep Space Nine can be quite harrowing, especially with the knowledge that 9/11 was only a few years after the show ended. As I went through various stages of the education system, the historical allegories gained greater nuance. And, as seen above, re-watching some of the episodes from certain political stances can give a variety of new interpretations.

But more than anything else – I fucking love Star Trek. It’s going in the 30-for-30.

  1. I’m English. What Americans call “seasons”, we call “series”. What Americans call “series”, we just tend to refer to as “shows” or “programmes”. Apologies for the confusion.
  2. My first birthday.
  3. ‘Encounter at Farpoint (episode)’, Memory Alpha < http://en.memory-alpha.org/wiki/Encounter_at_Farpoint_%28episode%29 > (accessed 3 March 2015).
  4. The kind of people who get bothered by split infinitives are the worst kind of bores. But if it makes you feel any better, the quote is attributed to Zefram Cochrane, the inventor of the warp drive. He said the engine would allow man ‘to go boldly’. I am aware of how much I need to get a life. See: ‘Broken Bow (episode)’, Memory Alpha < http://en.memory-alpha.org/wiki/Broken_Bow_%28episode%29 > (accessed 3 March 2015).
  5. Allen Kwan, ‘Seeking new civilizations: Race normativity in the Star Trek franchise’, Bulletin of Science, Technology and Society 27(1) (2007), 59-70.
  6. Ulrich Scheck, ‘Where no woman has gone before: Humour and gender crossing Star Trek’s Voyager and Enterprise’, Amsterdamer Beiträge zur Neueren Germanistik 69(1) (2009), 103-118.
  7. I wrote that last sentence with trepidation, but thankfully someone’s done the research for me. Jarrah Hodge, ‘How does your favorite Star Trek series fare on the Bechdel test?’, The Mary Sue (1 September 2014, 10.55am) < http://www.themarysue.com/star-trek-bechdel-test/ > (accessed 9 March 2015).
  8. Stephen Kerry, ‘”There’s Genderqueers on the Starboard Bow”: The Pregnant Male in Star Trek’, Journal of Popular Culture 42(4), 699-714.
  9. ‘The Host (episode)’, Memory Alpha < http://en.memory-alpha.org/wiki/The_Host_(episode) > (accessed 9 March 2015).
  10. ‘Dax (episode)’, Memory Alpha < http://en.memory-alpha.org/wiki/Dax_%28episode%29 > (accessed 9 March 2015).
  11. ‘The Outcast (episode)’, Memory Alpha < http://en.memory-alpha.org/wiki/The_Outcast_(episode) > (accessed 9 March 2015).
  12. ‘Up the Long Ladder (episode)’, Memory Alpha < http://en.memory-alpha.org/wiki/Up_The_Long_Ladder_(episode) > (accessed 9 March 2015). See also: Victor Grech, ‘Infertility in Star Trek’, World Future Review 4(4) (2012), 19-27.

1986 – Chernobyl

23/03/2015

26 April 1986 – Pripyat

The Chernobyl Disaster is one of those iconic events that have permeated many aspects of our society. While it certainly wasn’t the first nuclear disaster (or, indeed, the last), it occurred at a time in which its political, environmental and cultural effects were amplified. Chernobyl, now a byword for catastrophe, has had a lasting impact upon the last three decades. And so, here it is at number 2 in the 30-for-30.

Homer, your bravery and quick thinking have turned a potential Chernobyl into a mere Three-Mile Island. Bravo!

Montgomery C. Burns, The Simpsons, 5 November 1995

The Simpsons plays on the caricature of nuclear power. It is simultaneously the economic centre of Springfield and – on more than one occasion – a potential cause of Armageddon…

As with most historical events, the more fascinating aspects of Chernobyl are not the scientific facts, but the way it came to be represented and reconstructed by various people. However, the reality of how the plant came to its demise is intriguing. One might be forgiven for thinking that Pripyat – the abandoned Ukrainian city built to serve the Chernobyl plant – can never be visited. That a massive mushroom cloud billowed above, leaving a gaping crater below. That the local fish have three eyes, and that anyone not disintegrated by the blast died soon after from horrific radiation burns. Much like the monsters of the early-modern period, though, the myth of Chernobyl is built on elements of truth that have been exaggerated and reinforced in the popular imagination.

Panorama of Pripyat, the city built to house the workers at the Chernobyl Nuclear Power Plant. (Source)

First, the plant did not explode like a Hiroshima-style A-bomb. A power surge, combined with poor safety procedures, produced a fire within one of the reactors.1 This then caused a chemical explosion (it’s not a good idea to expose graphite to fire), which created a cloud of radioactive ash. As a direct result of the explosion, two workers died. A further 28 died within three months as a result of the rescue and containment operation. The prevailing winds meant that much of the fallout landed in the nearby republics of Belarus and Russia rather than in Ukraine itself. While this has significantly raised the risk of cancer in these areas,2 the wider region is still inhabited and, while far from ideal, it is safe enough for people to live there.3 Indeed, while Pripyat and the immediate environs are restricted and abandoned, it is still possible to visit the city. You could even drive through it, if you were so inclined…

This is not, of course, to downplay the scale of the disaster. It was incredibly expensive. Pripyat will remain uninhabited for 20,000 years. And not only are there elevated rates of cancer in Ukraine and Belarus; lives were irreparably disrupted by the relocation of 50,000 citizens from the city. But Chernobyl was never the comic-book Apocalypse it is so often portrayed as. So. Why does this myth keep repeating?

First, the most boring explanation. It’s a good story. The idea of a far-flung place, nuclear explosions, mutant trees and radioactivity. These are the stuff of science fiction and play into fears of engineers playing God. They’ve been around for a long time. The Hulk and Spider-Man both debuted in 1962. That same year, the Cuban Missile Crisis (allegedly) brought the world to the brink of nuclear annihilation. The word “nuclear” is iconic. Mix it in with “explosion” and “Russia” (because all of the USSR was “Russia”), and you’ve got yourself a blockbuster.

Which brings us on to perhaps the most important aspect. The Cold War. Not only did the late-Cold War setting allow the West to use Chernobyl as a sign of Soviet incompetence4 – an almost literal metaphor of how the country was falling apart – it also led to major problems dealing with the aftermath. When the USSR disintegrated in the 1990s, responsibility was spread between the Russian, Belarusian and Ukrainian governments. Consequently, there remain serious social and political problems along the Dnieper river.5 This has served as a constant reminder of the long-term effects of nuclear power if it goes wrong. This is food for the anti-nuclear lobby and, in turn, keeps Chernobyl in the public consciousness. For others, Chernobyl must be held up as the exception, caused by incompetence. Nuclear power is such an important part of many Western countries’ energy infrastructure that all fear must be projected onto Chernobyl and focused away from the potential disasters closer to home.6 Following the fall of the Berlin Wall, Western experts sought to improve safety standards in the East as a way of enforcing their own professional power and to show the world that nuclear was safe when “done properly”.7

When the Fukushima plant went into meltdown following the 2011 earthquake in Japan, comparisons were immediately drawn.8 But this hasn’t captured the imagination in the same way. At the time, there was a great deal of speculation, fuelled again by the “disaster movie” narrative being spun by the rolling news media. Yet the limited fallout and the relatively swift response appear to have nipped it in the bud. It probably helps that Japan is “one of us” – a technologically advanced capitalist nation. Thus, despite being the only other “level 7” nuclear accident, Fukushima is not talked about in the same tones as Chernobyl.

The disaster is one of the most iconic events of the last thirty years. It simultaneously seems to be blown completely out of proportion as a cartoonish Apocalypse; and underplayed as a long-term catastrophe outside of the city of Pripyat itself. With the political situations in Belarus, Russia and Ukraine currently unstable, it is clear that Chernobyl is not over – and the management of the aftermath continues to be a concern. For these reasons, Chernobyl is the entry for 1986.

  1. Marples argues that the disaster was a direct result of complacency on the part of the nuclear industry in the USSR in the 1970s and 1980s. See David R. Marples, ‘The Chernobyl Disaster’ in Current History 86(522) (1987), 325-43.
  2. Adriana Petryna, ‘Biological citizenship: The science and politics of Chernobyl-exposed populations’ in Osiris 19 (2004), 250-65.
  3. International Atomic Energy Agency, Chernobyl +15: Frequently Asked Chernobyl Questions (undated, but presumably c. April 2001) < http://web.archive.org/web/20031220213501/http://www.iaea.org/NewsCenter/Features/Chernobyl-15/cherno-faq.shtml > (captured by The Internet Archive, 2 December 2003) (accessed 3 February 2015)
  4. Nicky Falkoff, ‘Heroes with a Half Life: Teenage Mutant Ninja Turtles and American repression of radiophobia after Chernobyl’ in The Journal of Popular Culture 46(5) (2013), 931-49.
  5. BBC News, ‘Belarus cursed by Chernobyl’ (26 April 2005) < http://news.bbc.co.uk/1/hi/world/europe/4485003.stm > (accessed 3 February 2015); Petryna, ‘Biological citizenship’.
  6. Falkoff, ‘Heroes with a Half Life’.
  7. Thomas R. Wellock, ‘The children of Chernobyl: Engineers and the campaign for safety in Soviet-designed reactors in Central and Eastern Europe’ in History & Technology 29(1) (2013), 3-32.
  8. BBC News, ‘How does Fukushima differ from Chernobyl?’ (16 December 2011) < http://www.bbc.co.uk/news/world-asia-pacific-13050228 > (accessed 3 February 2015).

1985 – WrestleMania

16/03/2015

31 March 1985 – New York City

This was the moment that the modern version of professional wrestling – cartoon characters, big venues, loud music, pyrotechnics and Spandex – went global. Or, at least, Vincent McMahon Jr’s version of it. But despite a relative decline in popularity over recent years, the idea of “predetermined” or “choreographed” fighting, closely associated with the Greco-Roman wrestling seen in the Olympics, has a deep cultural history across Britain and America. It is with this flimsy excuse that I open this series with WrestleMania.

I’m often met with incredulity from work colleagues when I tell them about what I spend my free time doing. Playing computer games. Catching up on TV. Going to the cinema. And watching professional wrestling.

“You do know it’s fake, right?”

Big Daddy (in white) and Giant Haystacks, two of the biggest stars of British wrestling in the 1970s and 1980s. (Source)

Of course, “fake” is a relative term.1 While the outcome is predetermined and the story lines are acted, they are played as if real (“kayfabe” in wrestling lingo). Much like any dramatic art form. But while the strikes, flips, spins and throws are often performed in such a way as to minimise the damage done to the performers (whilst making it look like they are beating the proverbial out of each other), the risks being taken are very real.2

I could write an entire book on why professional wrestling is the best thing ever (despite the casual racism, misogyny, homophobia, drug use, questionable morality, politics, occasional contempt for its audience, lack of safety and security for its performers…). What I want to argue is that wrestling, like sport in general, has been an important part of working class culture around the world. Indeed, the way it plays on tropes within society, and the fact that it is a “fake” sport, is entirely the point.

Cribb v Molineux from 1811. (Source)

Professional wrestling developed alongside professional sport during the industrial revolution. Forms of martial arts such as boxing, Greco-Roman style wrestling, and so forth had become popular attractions at carnivals and fairs. As permanent structures were built to house the “music hall” variety acts (“Vaudeville” in the United States), various forms of football, pedestrianism (forerunner of track and field), and so on, a need grew for star attractions on a regular basis. The nature of fighting, however, is that competitors can only perform once every month or so. Injuries and fatigue build up. Moreover, the most accomplished boxer is not necessarily the most charismatic. As far as the business is concerned, the only good fighter is the one who can “draw” – bring people into the arena to buy tickets.3

The narrative power of sport was popular and profitable. While “legitimate” competition continued to grow, it was clear that not every soccer game was great to watch. Not every boxing bout went the distance (some were over in a few seconds); others were long turgid draws. One way to ensure entertaining events was, therefore, to add the drama of sport without the audience becoming incredulous. Wrestling, more prone than other sports to technically impressive but largely dull affairs, could be extended if the charismatic star was able to deliberately “go easy” on his opponent and extend the contest. Or a foreign star could burst into the auditorium and demand a fight at the end of the night. This would encourage people to come back next week, and allowed the audience to cheer on their local hero against the evil outsider.

Professional wrestling gradually incorporated more and more of these elements. Wrestlers took on characters – or “gimmicks” – to make themselves more attractive. They began performing more spectacular moves, like jumping off ropes and performing flips. These had little impact on their ability to legitimately win a contest, but entertained crowds. To allow them to move from town to town, fighting every night, the winners began to be predetermined, with the “workers” being more gentle with each other to avoid injury and fatigue. Feuds were manufactured to give a reason for people to fight and for an audience to continue to buy tickets. By the 1950s this had become a well-recognised form of entertainment in Britain and the United States, fuelled in the latter case by local TV stations looking for cheap content.4

Blue Demon and El Santo, two of the “big three” luchadores (along with Mil Mascaras), who popularised the lucha libre style of wrestling in Mexico and Latin America. (Source)

While professional wrestling spread across the world, each country adapted the concept to their own local attitudes towards sport. In the United States, the World Wrestling Federation (WWF) became the most popular “promotion” based on larger-than-life characters, and a Hollywood-esque soap opera approach to storytelling. In Britain, fights were based around more-technical holds, and television presented the contests in the same way it would broadcast “legit” sports. In Mexico, the culture of masks and bright costumes was married to high-flying, fast-paced gymnastic moves. The Japanese developed a style which looked and felt more realistic, in some cases putting wrestlers in real – “shoot” – fights similar to modern-day mixed martial arts. This reflected the origins of wrestling in the country – imported by the United States after the Second World War as a replacement for competitive sport, which was banned.5 Australia, Germany and South Africa (among others) put their own spin on it.

"Macho Man" Randy Savage, one of the most recognisable wrestlers of the 1980s.  (Source)

“Macho Man” Randy Savage, one of the most recognisable wrestlers of the 1980s. (Source)

Because of this set-up, most of the biggest stars the “sport” has produced have tapped into the cultural Zeitgeist. Sgt. Slaughter, for example, became WWF champion in the early 1990s after he abandoned his country (kayfabe, darlings) to support Saddam Hussein at the height of the Gulf War.6 Randy Savage was a charismatic “Macho Man” with over-the-top colourful outfits that sum up the 1980s to a tee. Hulk Hogan did it even better, going on to star in multiple (awful) movies. The Rock and Stone Cold Steve Austin were rowdy anti-heroes during the “edgy” 1990s. At the same time, plenty of wrestlers of colour have found themselves on the losing side more often than not; women were often given very stereotypical gimmicks to work with.7 The Hispanic “Los Guerreros” would ‘lie, cheat and steal’. And the less said about the tradition of “midget wrestling” the better.

But this isn’t about the issues with pro wrestling. Like any art form, it reflects the climate of the time. If wrestling is sexist and racist, it’s because it exists as a warped, cartoonish version of reality. If people find its depiction of competition and success distasteful, that is because it reflects our society; one only has to see the sport analogies used by politicians to see that sport, entertainment and politics all borrow from each other on a regular basis.

Anyway. Why does Wrestlemania matter? Well, it began the slow breakdown of the structures which had held up pro wrestling across the world. In many cases, it shows the power of globalisation.

With so many different styles, why is it the WWF’s product that people generally associate with ‘rasslin? Primarily it’s because of the success of Wrestlemania and Vince McMahon Junior’s attempts to make his company an international enterprise. In the 1970s, the wrestling world was split into “territories”. The National Wrestling Alliance (NWA) maintained an international system in which promotions would not actively compete in each other’s geographical area. Even those companies that were not part of the NWA (such as McMahon’s WWF in New York, or Verne Gagne’s American Wrestling Association in Minneapolis) understood that this cartel was good for business. Wrestling was massive on international television, but very much localised. The UK would cheer on Big Daddy on World of Sport; New York would idolise the WWF’s Bruno Sammartino; Memphis loved Jerry Lawler. If a wrestler – usually a “villain” – became stale, he could go and work in another area as an unknown (or, perhaps, with a mythical reputation). Within the United States, however, different areas had different emphases. Some were more hard-hitting; some focused more on storytelling; others went more for athleticism. Regardless, the WWF was not the sine qua non of wrestling.

McMahon Junior bought the WWF from his father in the early 1980s, and planned to take the promotion onto national television. The rise of cable, coupled with new formats for “pay-per-view” events, opened up the possibility of marketing the WWF well beyond New York and New England in a cost-effective way. Against McMahon Senior’s wishes, Junior got his television show on cable across the country, and began signing the biggest stars from other companies (in flagrant violation of the NWA “gentlemen’s agreement”).

Wrestlemania was a massive gamble. McMahon spent big on luring Hulk Hogan away from the AWA in 1983 in preparation, building the company around his star power. Then he invested in hiring venues to show ‘Mania through “closed-circuit television”, and brought in Cyndi Lauper and Mr T as celebrity guests. It was a runaway success, leading eventually to international expansion.

The poster for Wrestlemania. Vince McMahon’s financial gamble paid off and eventually led to global expansion. (Source)

Unable to compete with the production values and celebrity of the expanded WWF, many of the local promotions went into terminal decline. The loss of their best draws to New York didn’t help. World of Sport in the UK went off the air, and replacement shows quickly lost ground to the glitzier and more bombastic McMahon product. By the mid-90s, only World Championship Wrestling (WCW) could seriously compete in the USA, with Extreme Championship Wrestling (ECW) offering a more lo-fi alternative. Japan and Mexico maintained their traditions, but in the latter case stars such as Rey Misterio Jr. and Juventud Guerrera moved north of the border for greater exposure and a bigger pay cheque. Shortly after the millennium, WCW and ECW went bankrupt, having overstretched their resources competing with the WWF.

McMahon’s vision of wrestling had won. It is still by far the most popular version of wrestling across the world. And while local variations continue to exist, the globalisation of the WWF product reflects many changes in the global economy. Everyone has their own version of the hamburger, but the Big Mac is still the most recognisable. The WWF was very 80s. And it’s kept me entertained ever since. That’s why I had to include it as the first entry in 30 for 30.

  1. Isn’t everything to historians…
  2. Chuck Austin, for example, landed on his neck after a move went wrong, paralysing him. He successfully sued the World Wrestling Federation for damages. For this and others, see ‘Worst botched moves in history’, Adam’s Wrestling Blog (19 June 2012) < http://adamswrestling.blogspot.co.uk/2012/06/worst-botched-moves-in-history.html > (accessed 2 March 2015).
  3. For an overview of this from an academic perspective, see the work of Benjamin Litherland at Huddersfield. < http://www.hud.ac.uk/ourstaff/profile/index.php?staffuid=smusbl > (accessed 2 March 2015).
  4. See the story of the first big TV star: ‘Gorgeous George’, Wikipedia < http://en.wikipedia.org/wiki/Gorgeous_George > (accessed 2 March 2015).
  5. The Allies wanted to destroy the culture of Japanese imperialism, and competitive sport was considered part of this. “Puroresu” helped fill the void and kept sport stadiums full during the 1940s. See ‘Puroresu’, Wikipedia < http://en.wikipedia.org/wiki/Puroresu > (accessed 2 March 2015).
  6. It might be a stretch to call Slaughter one of ‘the biggest stars’…
  7. Dion Beary, ‘Pro wrestling is fake, but its race problem isn’t’, The Atlantic (10 July 2014, 8.00am EST) < http://www.theatlantic.com/entertainment/archive/2014/07/the-not-so-fictional-bias-in-the-wwe-world-championship/374042/ > (accessed 2 March 2015).