historical stuff by Gareth Millward


1990 – The Release of Nelson Mandela

20/04/2015

11 February 1990 – Paarl

Writing about Mandela is incredibly difficult. Partly, it’s because so much has been said already. He is almost universally held up as a force for good, a freedom fighter who managed to win a democratic election and defeat Apartheid. I am nowhere near an expert on South African history, and so cannot give the arguments about “terrorism” the nuance and context they deserve. Writing about how the white establishment in Africa and Europe treated him abominably seems like shooting fish in a barrel. Yet that “Long Walk to Freedom” is iconic. Coming so soon after the Berlin Wall, it seemed to herald the end of the repressive political regimes of the twentieth century. I could not choose anything else for 1990.

A recurring theme in this series has been the importance of “myth”. The Berlin Wall was more than a wall. Chernobyl was more than a health and safety snafu. And it stands to reason that Mandela was more than a politician. He represents the death of Apartheid and the birth of modern1 South Africa. But hagiography – the creation of saints – is important. It’s not simply a case of propaganda (though this is part of it). Nor is it entirely cynical or manipulative. It is part of a crucial process of owning our own histories, giving us ideal types against which we can measure our present and guide our futures. In the end, it doesn’t matter if Isaac Newton was a paragon of rationality; if Margaret Thatcher was a great leader; if Gandhi was tolerant of all; if Hitler was pure evil; if Pele was the greatest footballer of all time. What matters is that these historical legends exist. At least in a political sense.

Where history can help is by contextualising both the “real” lives of the Florence Nightingales and Wolfgang Mozarts and the subsequent use of their legends by their apostles and enemies. For instance – did the 1994 election allow Mandela to become the symbol of ‘democratic transition … and reconciliation’?2 Is that more important than Mandela’s “real” life? Or are the two inextricable?

Which is the more important depends on the questions one asks and for what purpose. If we want to understand how a movement could shift from violence to democratic tactics, then the role of Mandela and the political context of 1960s versus 1980s South Africa become important. If, however, we want to understand the political discourse of the 1990s and twenty-first century, then the image of Mandela may offer better explanations.

Indeed, Raymond Suttner has argued that the focus on Mandela as an icon has made it very difficult to write a narrative history about his life. Many spend their energies explaining the meaning of Mandela ‘rather than purely narrating’. Because, clearly, Mandela was never a uniform entity. He and South African politics changed a lot over his lifetime.3 For Western audiences, ‘popular narratives of race and redemption’ are perhaps more palatable than a full consideration of the reasons behind racial governance in sub-Saharan Africa.4 And whatever you do – don’t mention that he was a communist!5

Francois Pienaar receiving the Rugby World Cup from Nelson Mandela. (Source)

Being geographically and politically distant also meant that Mandela and South Africa were refracted through the prism of television. His release from prison was in many ways the first of a series of “episodes”, which included meetings with world leaders and – most notably – the Rugby and FIFA World Cups of 1995 and 2010.6 A ‘safe’ image of Mandela as ‘international statesman’ became vital to the reintegration of South Africa into the post-cold-war international community. Philippe de Brabanter has argued that Time made a conscious effort to avoid references to communism and violence when it chose excerpts of Mandela’s autobiography to publish.7 At the same time, it was clear following his death in 2013 that many right-wingers who had opposed him in the past were unable to express anything but admiration for a man who had come to symbolise liberty, democracy and a post-racial world.8

In the end, this article hasn’t really been about Mandela. What it shows is that myths and legends are an integral part of history. They should not be dismissed simply as distortions. They are driving forces in their own right. Historians of different types ask different questions at different times. Some will be interested in “real” lives and events; others will be more interested in how those events have been refracted through the prism of history. What makes a good historian is the ability to separate these issues and contextualise them. This will become ever more noticeable as this series progresses. Mandela was released 25 years ago. We do at least have the benefit of some historical distance when we come to assess him. This will not be the case with some of the later articles. So, with that disclaimer/excuse for lazy writing – let us proceed with the 30-for-30!

  1. Meaning “present-day” rather than “modern”.
  2. Xolela Mangcu, ‘Nelson Mandela in the history of African modernity – Towards a reappraisal of existing approaches’, Bulletin of the National Library of South Africa, 68(2) (2014), 187-97.
  3. Raymond Suttner, ‘(Mis)Understanding Nelson Mandela’, African Historical Review, 39(2) (2007), 107-30.
  4. Maryann Martin, ‘Throwing off the yoke to carry the man: Deconstructing the myth of Nelson Mandela’, Topia, 12 (2004), 41-62.
  5. Philippe de Brabanter, ‘“Long Walk to Freedom” or how “Time Magazine” manipulates Nelson Mandela into unwittingly forging his own image’, Revue Belge de Philologie et d’Histoire, 73(3) (1995), 725-39.
  6. Martha Evans, ‘Mandela and the televised birth of the rainbow nation’, National Identities, 12(3) (2010), 309-26.
  7. Brabanter, ‘“Long Walk to Freedom”’.
  8. Julian Borger, ‘The Conservative party’s uncomfortable relationship with Nelson Mandela’, The Guardian, 6 December 2013, 15:03 GMT < http://www.theguardian.com/politics/2013/dec/06/conservative-party-uncomfortable-nelson-mandela > (accessed 16 March 2015); ‘Twitter fact-check: David Cameron didn’t want to “Hang Nelson Mandela” in the 80s’, New Statesman, 6 December 2013, 11:25 GMT < http://www.newstatesman.com/media-mole/2013/12/twitter-fact-check-david-cameron-didnt-want-hang-nelson-mandela-80s > (accessed 16 March 2015).

1989 – The Fall of the Berlin Wall

13/04/2015

9 November 1989 – Berlin

Mauerfall1 is one of two iconic events in my lifetime that might be said to mark the “end” of the twentieth century world.2 I would say it was the most important.

The world before 1989 was a different place. That sounds trite, but it’s true. I’m too young to remember the Cold War, but I grew up alongside people who did. My parents were alive when the wall came up, and they saw it fall again. My grandparents fought in the War that helped establish the capitalist and communist “zones” within Europe. Those of my colleagues teaching undergraduates at the moment will be explaining a geopolitics very different to today. None of this should be underestimated – though I will explain later why the Berlin Wall is perhaps more representative of a number of cultural shifts since the twentieth century than the cataclysmic event that changed everything.

Much like with Chernobyl, the iconic status of the Wall – what its construction, presence and destruction symbolised – was as important as the edifice itself.

I’ve already said that I felt that 9/11/89 was the most important in my lifetime. But that has to be seen as part of the political narrative of the past 25 years. Democracy beat communism, Europe became united, and we now live in a globalised world. But – obviously – the Eastern Bloc did not disintegrate overnight. It would not be until 1991 that the USSR declared its dissolution; the same year, Yugoslavia broke into a bloody civil war. And if, indeed, this is a story about how the capitalist West “won”, it wasn’t until 1999 that the Czech Republic joined NATO, and 2004 when it joined the European Union. The Wall was therefore a symbol of change, and reflected political and social shifts on both sides of the Iron Curtain. It came to represent the literal dismantling of a system of government that had lasted since the Second World War. Indeed, for a country like Poland, one could argue that this was the end of the Second World War.

Why was the wall built? Well, as with all things in history, the answer is long and rambling.3

In the run-up to VE Day,4 the American-led armies in the West and the Soviet-led armies in the East were in a race to Berlin – capital of Prussia since the eighteenth century and of the German Empire from 1871. It was the heart of modern Germany. It represented not just the capital of the Nazi Reich, but the very concept of the German nation state. Sure, there was a lot of industrial technology to be looted (such as rocketry5 and mass production techniques), but there was also a sense that the mistakes of the First World War needed to be righted. In short, Germany had to be thoroughly and unequivocally beaten. And marching through the centre of Berlin was the most explicit way of achieving this.

Lloyd-George, Clemenceau and Wilson, the leaders of the UK, France and USA who negotiated the Treaty of Versailles. (Source)

Part of Hitler’s political appeal was based on the idea that the German Empire had been sold out by socialists and cowards in 1918. By signing the Armistice rather than fighting to the bitter end, the German state had been subjected to foreign interference, crippling reparation payments and had its territories in Africa and central Europe seized. The Austro-Hungarian Empire suffered similar dismantling. Many in Vienna and Berlin believed that Bohemia, Poland and Hungary were legitimately “German” lands, taken away by the vengeful French and their accomplices.6 With the Great Depression and hyperinflation in the inter-war Weimar Republic, the Nazi party gained widespread support on the back of promising to restore the Empire and right the injustices of the 1920s. In order to crush fascism in the German-speaking world, the Allies believed that total victory was paramount.

This left Germany cleft in half. The East was occupied by the Soviet Union, and the West by a mix of French, British and American troops. Following the Potsdam Conference, Germany was split into four zones of control. The three western zones eventually became capitalist West Germany, while the eastern zone effectively became a puppet of Moscow. Berlin, wholly in the Soviet zone, was also split into four parts – effectively leaving a capitalist island within East Germany.

Map outlining the zones of control in Germany and Austria following the War. Austria eventually became an independent, democratic country in 1955 after the Soviet Union withdrew. Germany would remain split, however, until 1990. (Berlin is in grid reference C3.) (Source)

Being an exclave during the early Cold War was not particularly helpful. Since Moscow could effectively block all land vehicles entering West Berlin by patrolling the border between East and West Germany, food and supplies were difficult to obtain. A blockade began in 1948 in response to the Western powers’ introduction of a new Deutschmark. For over a year, the only way to keep West Berlin running was to airlift supplies in. It proved successful, and eventually the Soviets backed down. But it showed that the tension between the two sides would be a constant threat.7

President Kennedy visiting the Berlin Wall in 1963. Ich bin ein Berliner… (Source)

By 1961, East Germany was concerned at the number of defections to West Berlin. Since it was relatively easy to get across the border and never return, the Democratic Republic8 was losing skilled men and women. Around 3.5 million had managed to leave the country before the wall was erected. Sold as an “anti-fascist” protection against insurgency from the West, the wall effectively ended East-to-West migration, and sealed West Berlin off from the rest of the Russian-occupied zone. Or, perhaps more accurately, it sealed East Germany off from the Western world.

The Communist Bloc was in crisis in the late 1980s. Things came to a head when Hungary dismantled its armed border with Austria in August 1989, allowing “tourists” in the East a quick means of escape to the West. East Germany attempted to impose a travel ban, hoping to stop the flow of migrants. Resentment and protest grew and, following the resignation of Erich Honecker in October, events reached boiling point. On the night of 9 November, protesters began to hack away at the wall, effectively causing a revolution.

Photograph taken by astronaut Chris Hadfield from the International Space Station. The angle is skewed, but you can clearly see the bright white street lamps of Western Berlin towards the top, and the dimmer, yellow lights in Eastern Berlin towards the bottom. This is caused by the lower-quality lighting used in the East – in the West, street lamps were updated, indicating a higher level of investment in infrastructure. (Source)

The Berlin Wall has always been a symbol as much as a physical barrier. While many died trying to get across it, it served an important political purpose for both sides of the political divide. This was as true while it stood as it was when it fell.9 Today, the wall has come to represent a memorial to the past – not to celebrate the division, but as a lesson that ought not to be forgotten. The site of the wall has become an important place for tourists, not just from abroad but from within the German Republic. As Hope Harrison argues, the wall has in some ways been ‘resurrected’ as an important artefact of German history.10 This is a common theme in European history – much like Auschwitz was not razed to the ground, these buildings serve as a stark and important reminder of what people can do when their power remains unchecked.

For those outside Germany, it represented the Rubicon for the anti-communist revolution. But it also reflected a number of other changes that were stirring at the time. In the West, almost-universal ownership of televisions, telephones and automobiles had created a much smaller world than the one of the 1940s. Affordable air travel, the growing reach of multinational corporations and the increasing importance of the European Economic Community were creating a globalised world. 1989 marked the point at which the Communist East became part of this world too, no longer in self-imposed exile on the other side of the Iron Curtain. It was a victory for capitalism, democracy and freedom. Whether or not this “really” happened, the resulting invasion of the East by McDonalds, Pepsi, Nike and David Hasselhoff created a very different geopolitical landscape.

As the articles following this will show, the next 25 years were very different to the previous 25 in so many ways. 1989 marked the beginning of the end of the Cold War; or perhaps the end of the Second World War; or even the end of the twentieth century itself, stretching back to Versailles. Everything else in this series is informed by these events. This is why it had to be included in the 30-for-30.

  1. The German word for the Fall of the Berlin Wall. Replicated here to bolster my hipster credentials. “Europa geht durch mich“, etc.
  2. The “short” twentieth century being roughly World War I (c. 1914) to the fall of the Eastern Bloc (c. 1989-1991). Longer versions (usually Americo-centric) position the “end” with the terrorist attacks on the World Trade Centre in New York in September 2001.
  3. (c) Grampa Simpson.
  4. 8 May 1945 (9 May in the Soviet Union). “Victory in Europe Day”, known in some countries as Liberation Day, or similar. See: ‘Victory in Europe Day’, Wikipedia < http://en.wikipedia.org/wiki/Victory_in_Europe_Day > (accessed 9 March 2015).
  5. The popular narrative is that this kick started the space exploration programmes in the US and USSR. See: ‘Soviet space program’, Wikipedia < http://en.wikipedia.org/wiki/Soviet_space_program > (accessed 9 March 2015).
  6. Clemenceau is traditionally seen as the more vengeful of the three leaders who negotiated the Treaty of Versailles. Having been stung by the Franco-Prussian War of 1870-71, he wanted revenge on Germany without too much care for the social or political consequences. Lloyd-George was sympathetic, but willing to go along with it, while Wilson was horrified. Thus, another lesson of the First World War was not just that Germany had to be conquered – the country needed to be rebuilt, with an emphasis on democracy and co-operation. German Imperialism had to be thoroughly destroyed. See: ‘Treaty of Versailles’, Wikipedia < http://en.wikipedia.org/wiki/Treaty_of_Versailles > (accessed 9 March 2015).
  7. ‘Berlin Blockade’, Wikipedia < http://en.wikipedia.org/wiki/Berlin_Blockade > (accessed 9 March 2015).
  8. East Germany. I’m just getting bored of typing the word “East”…
  9. Pertti Ahonen, ‘The Berlin Wall and the battle for legitimacy in divided Germany’, German Politics and Society, 29(2) (2011), 40-56.
  10. Hope M. Harrison, ‘The Berlin Wall and its resurrection as a site of memory’, German Politics and Society, 29(2) (2011), 78-106.

1988 – The Global Polio Eradication Initiative

06/04/2015

13 May 1988 – Geneva

Poliomyelitis is, sadly, still with us. But since 1988, global action on the disease has reduced the number of cases from an estimated 350,000 to just 445 in 2013.1 This is pretty remarkable for a disease that only reached epidemic proportions in the twentieth century, and with a vaccine that was developed as recently as the 1950s. Given my own research interests in the history of polio and vaccination, the Global Polio Eradication Initiative (GPEI) is the fourth entry in the 30-for-30 series.

The Forty-first World Health Assembly… declares the commitment of WHO [the World Health Organization] to the global eradication of poliomyelitis by the year 2000.

The GPEI may have missed this ambitious goal, but it was not for want of trying. The number of confirmed cases of poliomyelitis dropped significantly over the 1990s.2 The last bastions of the disease have, however, proved difficult to break down due to a combination of economic, political and medical factors.

Ancient Egyptian depiction of someone affected by polio – although, as historians of medicine, we need to be careful about diagnosing people in the past! (Source)

Polio is, in many ways, a disease whose importance has risen alongside modern biomedicine. Though diseases which may today be thought of as “polio” were recorded in Ancient Egypt, proper classification of its causes, effects and treatment is definitely modern in origin. It was “discovered” through the works of Karl Medin and Jakob Heine over the mid-to-late nineteenth century. The virus responsible was only isolated in 1909.3

The first outbreaks occurred in Western Europe and the United States at the dawn of the twentieth century. Unlike infectious diseases such as tuberculosis or cholera, which were largely attributed to poverty and poor sanitation, polio seemed to affect everyone equally. There was also no cure, and so wealthier patients were not protected by their usual isolation from the vectors of disease or their ability to pay for the best treatment. Indeed, there was strong evidence that the greater the level of sanitation in a region, the more likely it was to succumb to an epidemic. The first outbreak of epidemic proportions hit Britain in 1947, and by the early 1950s the Western World had mobilised its efforts to find some sort of medical protection against the poliovirus.

This may seem a very cynical interpretation of the history of polio, but there is no doubt that the battle against the disease was given a massive boost by powerful patrons with a personal interest. Franklin D. Roosevelt, President of the United States and polio survivor, founded the National Foundation for Infantile Paralysis. “The March of Dimes”, a fundraising campaign, attracted support from across the United States, fuelled by “celebrity endorsements” from the likes of Lucille Ball and Louis Armstrong. Much of this money focused on medical research into the causes and prevention of polio.

That came to fruition in 1955 when Jonas Salk announced successful trials of a new injected, inactivated poliomyelitis vaccine (IPV).4 It was quickly rolled out across the United States, and similar versions were used around the world. His rival, Albert Sabin, produced an oral polio vaccine (OPV) which was eventually adopted as a safer and more effective method. For children of my age, the foul-tasting drops on a jelly baby formed our memories of going to see the doctor.

A German doctor administering the oral polio vaccine. (Source)

The story of Salk vs Sabin is fascinating in its own right, and something I may cover at a later date.5 For now, however, all we need to know is that despite the overwhelming success of various vaccination programmes (polio has been nigh-on eradicated in the West since the late 1970s), scientific consensus is never enough to convince people to accept vaccination into their lives.

The Cutter Incident of 1955, for example, almost killed IPV before it had even begun. The Cutter Laboratories in California produced a batch of Salk’s vaccine with live poliovirus, not the inactivated version required to avoid infecting the patient. Hundreds of children caught the disease directly from the vaccine. Salk vehemently protested that the problem was with the manufacturer, not the design, and he was proven right. But the planned vaccination programme in Britain was delayed while the Medical Research Council and Ministry of Health debated how to proceed. The British decided to make their own, despite being completely incapable of producing enough of the vaccine to inoculate the number of children who had been signed up for the programme. When they did cave to demand and began to import Salk’s vaccine, they gave parents the option to opt out. The MRC advised:

This country has an unblemished record and it is strongly felt in some medical quarters that it would be deplorable to run any risk of an accident such as might jeopardise public confidence, not only in the particular vaccine, but in preventive inoculation and vaccination in general.6

Cutter had shown that vaccination was not without its risks – and until the MRC was certain about its safety, the known risk of wild poliovirus was preferred to the unknown risk of Salk’s new invention.

Of course, by the 1980s manufacturing techniques had improved dramatically. New strains of the vaccine had been produced, and the new OPV was not only safer but easier to administer. With WHO backing, many countries adopted polio vaccination, and rates fell dramatically. Why, then, did progress stall in the early 2000s?

There were practical concerns, to be sure. Since there is an unhelpful relationship between sanitation and polio, vaccination is one of the very few public health measures that can have a lasting impact on the disease. (For instance, hygiene and quarantine were used in conjunction with smallpox vaccines to eradicate that particular disease.) Very remote rural regions were hard to access. As were war zones. Refrigeration is also a problem in places without electricity in sub-tropical climates. But there was a growing political opposition to vaccination too. Some of this was due to a post-colonial backlash against white doctors “experimenting” with black bodies. Other legitimate concerns from locals were fanned by groups with a political interest in driving out foreign observation. Attacks on aid workers in Pakistan and Afghanistan, for example, mean that this area is one of the very few where polio remains endemic.

It didn’t help that the Central Intelligence Agency was found to be using vaccination programmes to spy on remote populations.7 In the same way, Cutter and incidents like it have always been used by anti-vaccination campaigners who argue – against all epidemiological evidence – that vaccination is unsafe compared to a perceived “natural immunity”.8 Despite these setbacks, however, India was declared polio-free in 2014. The number of recorded cases has fallen from around 3,000 in the year 2000 to just 445 in 2013. The year for global eradication keeps being pushed back, but I hope that by the time I’m 40 the job will be done.

  1. See various GPEI reports, and the data collated on Wikipedia, ‘Poliomyelitis eradication’ < http://en.wikipedia.org/wiki/Poliomyelitis_eradication > (accessed 15 January 2015); World Health Organization, ‘Polio Case Count’ < https://extranet.who.int/polis/public/CaseCount.aspx > (accessed 15 January 2015)
  2. Ibid.
  3. See Gareth Williams, Paralysed with Fear: The Story of Polio (London: Palgrave Macmillan, 2013).
  4. Thomas Francis, Evaluation of the 1954 Field Trial of Poliomyelitis Vaccine: Final Report (Ann Arbor: University of Michigan, 1957).
  5. Williams, Paralysed with Fear.
  6. The National Archives: FD 23/1058. Sir H Himsworth to Lord Alec Home, ‘Vaccination Against Poliomyelitis. Considerations relating to the possible use of American Salk vaccine in this country’, 25 July 1957.
  7. Saeed Shah, ‘CIA tactics to trap Bin Laden linked with polio crisis, say aid groups’ in The Guardian, 2 March 2012 16:57 GMT < http://www.theguardian.com/world/2012/mar/02/aid-groups-cia-osama-bin-laden-polio-crisis > (accessed 15 January 2015).
  8. Macrobiotic Guide, ‘Family health’ < http://www.macrobiotics.co.uk/familyhealth/childsimmunity.htm > (accessed 15 January 2015).

1987 – Star Trek: The Next Generation

30/03/2015

29 May 1987 – Los Angeles

I grew up with Star Trek. Between 1987 and 2005, a new series1 of the franchise aired on US television, and so I watched quite a bit of it during my primary and secondary school education. Despite its status as a science fiction classic (and the general reputation of science fiction fans), it seeped into the popular consciousness in the way that few series ever have. And while it’s a massive stretch to argue that Star Trek “invented” much of the technology which would go on to be commonplace in the twenty-first century, it probably isn’t that far-fetched to say that many of the technicians and inventors at Google, Apple, Microsoft et al were also glued to their TV screens every week watching Picard, Sisko, Janeway and Archer fight the good fight against the evils of the galaxy.

The Next Generation was announced on 10 October 1986.2 Filming began on a pilot entitled Encounter at Farpoint in May 1987, and the series finally aired in September.3 But, of course, this was not the first crew to boldly go where no crew had gone before.4

The original Star Trek series was created by Gene Roddenberry, and was first broadcast in 1966. It made stars of William Shatner and Leonard Nimoy, while the crew became household names. ‘Beam me up, Scotty’, ‘He’s dead, Jim’ and ‘Live long and prosper’ are phrases as well known as ‘My kingdom for a horse’. And, of course, there was that interracial kiss that broke television taboos during the height of the civil rights movement.

Roddenberry’s vision was of a united Earth, in which petty squabbles over money, religion and political ideology had long-since been consigned to history. Following the Third World War, humans invented a faster-than-light engine that allowed them to explore the stars. First contact with the emotionless Vulcans brought homo sapiens sapiens into a galactic community. Yes, there were dangers from the militaristic Klingons or the duplicitous Romulans, but humanity would face them together. With a multi-ethnic crew, women in senior roles and story lines that tended to explore deep philosophical issues, Star Trek was in many ways the quintessential expression of 1960s optimism. Some day, science and reason would lead humanity to total enlightenment. And we could explore the beauty of space together.

This is what I loved about the Star Trek universe too. Ignoring the deep tricky questions of “so, what do we do without money?” and “if Earth is united, why is there a disproportionate number of white male Americans in senior positions?”,5 the show did explore some pretty deep problems. Deanna Troi, the Enterprise’s counselor in TNG, was confronted with physical and psychic rape. Jean-Luc Picard, the captain, was assimilated into the Borg collective, and had to deal with severe post-traumatic stress. William Riker got his end away. A lot. And Wesley Crusher had to deal with the pressures and social awkwardness of being a child prodigy.

Since I grew up alongside the “reboot” of Star Trek, it’s been an important part of my cultural heritage. And while the show has obviously reflected the sensibilities of American audiences over the past fifty years, it has always challenged cultural norms. It’s hard to imagine many other shows which have put a disabled black man in a position of authority and made him central to the story arc.

Clockwise from top left: T’Pol; Seven of Nine (being held hostage); Dax; Troi. (Source)

In what has to be one of the most fun literature searches I’ve ever done, there has been an awful lot of work done by academics on the Star Trek universe. In particular, queer studies has found elements of the show which both reflect and challenge Western concepts of gender, race, sexuality and the body. Take, for example, Seven of Nine (Voyager) and T’Pol (Enterprise). Hey, perhaps even throw Dax (Deep Space Nine) and Troi (TNG) in there too. All four women were cast (in part) due to their sexual attractiveness. They were often seen in tight-fitting or revealing garb, and clearly served a… how to put this… aesthetic function.

In each case, however, the women were a vital part of the crew. Seven of Nine possessed knowledge of the region of space in which Voyager was stranded; T’Pol was first officer, sent from the Vulcan High Command to aid the inexperienced human crew; Dax had lived several past lives, making her an experienced science officer and confidante of the captain; and Troi was the ship’s counselor who, as an empath, was invaluable during diplomatic missions. As Ulrich Scheck has argued, T’Pol and Seven use sarcasm and wit to go beyond their ‘stereotypical body image’.6 And the friendships between Crusher and Troi, Dax and Kira, and Janeway and Torres would easily satisfy the Bechdel test.7

One criticism has been the lack of openly gay characters in Star Trek, although sexuality was often played with during the show’s run.8 Beverly Crusher, the doctor in TNG, falls in love with a Trill – a species whose memories are held within a symbiotic being that lives inside the humanoid. When the (male) host dies, the symbiont is implanted into a woman. The two remain emotionally in love, but Crusher finds it impossible to reconcile this with the new female body.9 A similar narrative is told with Jadzia Dax, when an ex-lover of Curzon (her previous host) is dragged into a court case. Here the love clearly remains, but is not acted upon.10 One can add to this a number of races whose reproductive cycle does not involve traditional gender roles. Riker, for example, falls for a member of an androgynous species in whose society sexual behaviour is considered a mental illness.11 In another episode, a group of isolated colonists who have evolved a form of reproduction involving cloning have to stomach the unpalatable idea that sexual intercourse with another colony will help revitalise their damaged and shallow gene pool.12

While I could write about Star Trek indefinitely, the point of these articles is to show historically important events during my life time. Culturally – for me – this is one of the biggest. What I also find so interesting is that I can re-watch the episodes in light of events that have happened since. The terrorism narratives in Deep Space Nine can be quite harrowing, especially with the knowledge that 9/11 was only a few years after the show ended. Having gone through various stages of the education system, the historical allegories gained greater nuance. And, as seen above, re-watching some of the episodes from certain political stances can give a variety of new interpretations.

But more than anything else – I fucking love Star Trek. It’s going in the 30-for-30.

  1. I’m English. What Americans call “seasons”, we call “series”. What Americans call “series”, we just tend to refer to as “shows” or “programmes”. Apologies for the confusion.
  2. My first birthday.
  3. ‘Encounter at Farpoint (episode)’, Memory Alpha < http://en.memory-alpha.org/wiki/Encounter_at_Farpoint_%28episode%29 > (accessed 3 March 2015).
  4. The kind of people who get bothered by split infinitives are the worst kind of bores. But if it makes you feel any better, the quote is attributed to Zefram Cochrane, the inventor of the warp drive. He said the engine would allow man ‘to go boldly’. I am aware of how much I need to get a life. See: ‘Broken Bow (episode)’, Memory Alpha < http://en.memory-alpha.org/wiki/Broken_Bow_%28episode%29 > (accessed 3 March 2015).
  5. Allen Kwan, ‘Seeking new civilizations: Race normativity in the Star Trek franchise’, Bulletin of Science, Technology and Society 27(1) (2007), 59-70.
  6. Ulrich Scheck, ‘Where no woman has gone before: Humour and gender crossing Star Trek’s Voyager and Enterprise’, Amsterdamer Beiträge zur Neueren Germanistik 69(1) (2009), 103-118.
  7. I wrote that last sentence with trepidation, but thankfully someone’s done the research for me. Jarrah Hodge, ‘How does your favorite Star Trek series fare on the Bechdel test?’, The Mary Sue (1 September 2014, 10.55am) < http://www.themarysue.com/star-trek-bechdel-test/ > (accessed 9 March 2015).
  8. Stephen Kerry, ‘”There’s Genderqueers on the Starboard Bow”: The Pregnant Male in Star Trek’, Journal of Popular Culture 42(4), 699-714.
  9. ‘The Host (episode)’, Memory Alpha < http://en.memory-alpha.org/wiki/The_Host_(episode) > (accessed 9 March 2015).
  10. ‘Dax (episode)’, Memory Alpha < http://en.memory-alpha.org/wiki/Dax_%28episode%29 > (accessed 9 March 2015).
  11. ‘The Outcast (episode)’, Memory Alpha < http://en.memory-alpha.org/wiki/The_Outcast_(episode) > (accessed 9 March 2015).
  12. ‘Up the Long Ladder (episode)’, Memory Alpha < http://en.memory-alpha.org/wiki/Up_The_Long_Ladder_(episode) > (accessed 9 March 2015). See also: Victor Grech, ‘Infertility in Star Trek’, World Future Review 4(4) (2012), 19-27.

1986 – Chernobyl

23/03/2015

26 April 1986 – Pripyat

The Chernobyl Disaster is one of those iconic events that has permeated many aspects of our society. While it certainly wasn’t the first nuclear disaster (or, indeed, the last), it occurred at a time in which its political, environmental and cultural effects were amplified. Chernobyl, now a byword for catastrophe, has had a lasting impact upon the last three decades. And so, here it is at number 2 in the 30-for-30.

Homer, your bravery and quick thinking have turned a potential Chernobyl into a mere Three-Mile Island. Bravo!

Montgomery C. Burns, The Simpsons, 5 November 1995

The Simpsons plays on the caricature of nuclear power. It is simultaneously the economic centre of Springfield and – on more than one occasion – a potential cause of Armageddon…

As with most historical events, the more fascinating aspects of Chernobyl are not the scientific facts, but the way it came to be represented and reconstructed by various people. However, the reality of how the plant came to its demise is intriguing. One might be forgiven for thinking that Pripyat – the abandoned Ukrainian city built to serve the Chernobyl plant – can never be visited. That a massive mushroom cloud billowed above, leaving a huge crater below. That the local fish have three eyes, and that anyone not disintegrated by the blast died soon after from horrific radiation burns. Much like the monsters of the early-modern period, though, the myth of Chernobyl is built on elements of truth that have been exaggerated and reinforced in the popular imagination.

Panorama of Pripyat, the city built to house the workers at the Chernobyl Nuclear Power Plant. (Source)

First, the plant did not explode like a Hiroshima-style A-bomb. A power surge, combined with poor safety procedures, produced a fire within one of the reactors.1 This then caused a chemical explosion (it’s not a good idea to expose graphite to fire), which created a cloud of radioactive ash. As a direct result of the explosion, two workers died. A further 28 died within three months as a result of the rescue and containment operation. The prevailing winds meant that much of the fallout landed in the nearby republics of Belarus and Russia rather than in the Ukraine itself. While this has significantly raised the risk of cancer in these areas,2 the wider region is still inhabited and, while far from ideal, it is safe enough for people to live there.3 Indeed, while Pripyat and the immediate environs are restricted and abandoned, it is still possible to visit the city. You could even drive through it, if you were so inclined…

This is not, of course, to downplay the scale of the disaster. It was incredibly expensive. Pripyat will remain uninhabited for 20,000 years. And there are not only significant levels of cancer in Ukraine and Belarus; lives were irreparably disrupted by the relocation of 50,000 citizens from the city. But Chernobyl was never the comic-book Apocalypse that it appears to be portrayed as. So. Why does this myth keep repeating?

First, the most boring explanation. It’s a good story. The idea of a far-flung place, nuclear explosions, mutant trees and radioactivity. These are the stuff of science fiction and play into fears of engineers playing God. They’ve been around for a long time. The Hulk and Spider-Man both debuted in 1962. That same year, the Cuban Missile Crisis (allegedly) brought the world to the brink of nuclear annihilation. The word “nuclear” is iconic. Mix it in with “explosion” and “Russia” (because all of the USSR was “Russia”), and you’ve got yourself a blockbuster.

Which brings us onto perhaps the most important aspect. The Cold War. Not only did the late-Cold War setting allow the West to use Chernobyl as a sign of Soviet incompetence4 – an almost literal metaphor of how the country was falling apart – it also led to major problems dealing with the aftermath. When the USSR disintegrated in the 1990s, responsibility was spread between the Russian, Belarusian and Ukrainian governments. Consequently, there remain serious social and political problems along the Dnieper river.5 This has served as a constant reminder of the long-term effects of nuclear power if it goes wrong. This is food for the anti-nuclear lobby and, in turn, keeps Chernobyl in the public consciousness. For others, Chernobyl must be held up as the exception, caused by incompetence. Nuclear power is such an important part of many Western countries’ energy infrastructure that all fear must be projected onto Chernobyl and focused away from the potential disasters closer to home.6 Following the fall of the Berlin Wall, Western experts sought to improve safety standards in the East as a way of enforcing their own professional power and to show to the world that nuclear was safe when “done properly”.7

When the Fukushima plant went into meltdown following the 2011 earthquake in Japan, comparisons were immediately drawn.8 But this hasn’t captured the imagination in the same way. At the time, there was a great deal of speculation, fuelled again by the “disaster movie” narrative being spun by the rolling news media. Yet the limited fallout and the relatively swift response appear to have nipped it in the bud. It probably helps that Japan is “one of us” – a technologically advanced capitalist nation. Thus, despite being the only other “level 7” nuclear accident, Fukushima is not talked about in the same tones as Chernobyl.

The disaster is one of the most iconic events of the last thirty years. It simultaneously seems to be blown completely out of proportion as a cartoonish Apocalypse; and underplayed as a long-term catastrophe outside of the city of Pripyat itself. With the political situations in Belarus, Russia and Ukraine currently unstable, it is clear that Chernobyl is not over – and the management of the aftermath continues to be a concern. For these reasons, Chernobyl is the entry for 1986.

  1. Marples argues that the disaster was a direct result of complacency on behalf of the nuclear industry in the USSR in the 1970s and 1980s. See David R. Marples, ‘The Chernobyl Disaster’ in Current History 86(522) (1987), 325-43.
  2. Adriana Petryna, ‘Biological citizenship: The science and politics of Chernobyl-exposed populations’ in Osiris 19 (2004), 250-65.
  3. International Atomic Energy Agency, Chernobyl +15: Frequently Asked Chernobyl Questions (undated, but presumably c. April 2001) < http://web.archive.org/web/20031220213501/http://www.iaea.org/NewsCenter/Features/Chernobyl-15/cherno-faq.shtml > (captured by The Internet Archive, 2 December 2003) (accessed 3 February 2015)
  4. Nicky Falkoff, ‘Heroes with a Half Life: Teenage Mutant Ninja Turtles and American repression of radiophobia after Chernobyl’ in The Journal of Popular Culture 46(5) (2013), 931-49.
  5. BBC News, ‘Belarus cursed by Chernobyl’ (26 April 2005) < http://news.bbc.co.uk/1/hi/world/europe/4485003.stm > (accessed 3 February 2015); Petryna, ‘Biological citizenship’.
  6. Falkoff, ‘Heroes with a Half Life’.
  7. Thomas R. Wellock, ‘The children of Chernobyl: Engineers and the campaign for safety in Soviet-designed reactors in Central and Eastern Europe’ in History & Technology 29(1) (2013), 3-32.
  8. BBC News, ‘How does Fukushima differ from Chernobyl?’ (16 December 2011) < http://www.bbc.co.uk/news/world-asia-pacific-13050228 > (accessed 3 February 2015).

1985 – WrestleMania

16/03/2015

31 March 1985 – New York City

This was the moment that the modern version of professional wrestling – cartoon characters, big venues, loud music, pyrotechnics and Spandex – went global. Or, at least, Vincent McMahon Jr’s version of it. But despite a relative decline in popularity over recent years, the idea of “predetermined” or “choreographed” fighting, closely associated with the Greco-Roman wrestling seen in the Olympics, has a deep cultural history across Britain and America. It is with this flimsy excuse I open this series with Wrestlemania.

I’m often met with incredulity from work colleagues when I tell them about what I spend my free time doing. Playing computer games. Catching up on TV. Going to the cinema. And watching professional wrestling.

“You do know it’s fake, right?”


Big Daddy (in white) and Giant Haystacks, two of the biggest stars of British wrestling in the 1970s and 1980s. (Source)

Of course, “fake” is a relative term.1 While the outcome is predetermined and the story lines are acted, they are played as if real (“kayfabe” in wrestling lingo). Much like any dramatic art form. But while the strikes, flips, spins and throws are often performed in such a way as to minimise the damage done to the performers (whilst making it look like they are beating the proverbial out of each other), the risks being taken are very real.2

I could write an entire book on why professional wrestling is the best thing ever (despite the casual racism, misogyny, homophobia, drug use, questionable morality, politics, occasional contempt for its audience, lack of safety and security for its performers…). What I want to argue is that wrestling, like sport in general, has been an important part of working class culture around the world. Indeed, the way it plays on tropes within society, and the fact that it is a “fake” sport, is entirely the point.

Cribb v Molineux from 1811. (Source)

Professional wrestling developed alongside professional sport during the industrial revolution. Forms of martial arts such as boxing, Greco-Roman style wrestling, and so forth had become popular attractions at carnivals and fairs. As permanent structures were built to house the “music hall” variety acts (“Vaudeville” in the United States), various forms of football, pedestrianism (forerunner of track and field), and so on, a need grew for star attractions on a regular basis. The nature of fighting, however, is that competitors can only perform once every month or so. Injuries and fatigue build up. Moreover, the most accomplished boxer is not necessarily the most charismatic. As far as the business is concerned, the only good fighter is the one who can “draw” – bring people into the arena to buy tickets.3

The narrative power of sport was popular and profitable. While “legitimate” competition continued to grow, it was clear that not every soccer game was great to watch. Not every boxing bout went the distance (some were over in a few seconds); others were long, turgid draws. One way to ensure entertaining events was, therefore, to add the drama of sport without the audience becoming incredulous. Wrestling, more prone than other sports to technically impressive but largely dull affairs, could be livened up if the charismatic star deliberately “went easy” on his opponent and extended the contest. Or a foreign star could burst into the auditorium and demand a fight at the end of the night. This would encourage people to come back next week, and allowed the audience to cheer on their local hero against the evil outsider.

Professional wrestling gradually incorporated more and more of these elements. Wrestlers took on characters – or “gimmicks” – to make themselves more attractive. They began performing more spectacular moves, like jumping off ropes and performing flips. These had little impact on their ability to legitimately win a contest, but entertained crowds. To allow them to move from town to town, fighting every night, the winners began to be predetermined, with the “workers” being more gentle with each other to avoid injury and fatigue. Feuds were manufactured to give a reason for people to fight and for an audience to continue to buy tickets. By the 1950s this had become a well-recognised form of entertainment in Britain and the United States, fuelled in the latter case by local TV stations looking for cheap content.4

Blue Demon and El Santo, two of the “big three” luchadores (along with Mil Mascaras), who popularised the lucha libre style of wrestling in Mexico and Latin America. (Source)

While professional wrestling spread across the world, each country adapted the concept to its own local attitudes towards sport. In the United States, the World Wrestling Federation (WWF) became the most popular “promotion”, based on larger-than-life characters and a Hollywood-esque soap opera approach to storytelling. In Britain, fights were based around more technical holds, and television presented the contests in the same way it would broadcast “legit” sports. In Mexico, the culture of masks and bright costumes was married to high-flying, fast-paced gymnastic moves. The Japanese developed a style which looked and felt more realistic, in some cases putting wrestlers in real – “shoot” – fights similar to modern-day mixed martial arts. This reflected the origins of wrestling in the country – introduced by the United States after the Second World War as a replacement for competitive sport, which was banned.5 Australia, Germany and South Africa (among others) put their own spin on it.

"Macho Man" Randy Savage, one of the most recognisable wrestlers of the 1980s.  (Source)

“Macho Man” Randy Savage, one of the most recognisable wrestlers of the 1980s. (Source)

Because of this set-up, most of the biggest stars the “sport” has produced have tapped into the cultural Zeitgeist. Sgt. Slaughter, for example, became WWF champion in the early 1990s after he abandoned his country (kayfabe, darlings) to support Saddam Hussein at the height of the Gulf War.6 Randy Savage was a charismatic “Macho Man” with over-the-top colourful outfits that sum up the 1980s to a tee. Hulk Hogan did it even better, going on to star in multiple (awful) movies. The Rock and Stone Cold Steve Austin were rowdy anti-heroes during the “edgy” 1990s. At the same time, plenty of wrestlers of colour have found themselves on the losing side more often than not; women were often given very stereotypical gimmicks to work with.7 The Hispanic “Los Guerreros” would ‘lie, cheat and steal’. And the less said about the tradition of “midget wrestling” the better.

But this isn’t about the issues with pro wrestling. Like any art form, it reflects the climate of the time. If wrestling is sexist and racist, it’s because it exists as a warped, cartoonish version of reality. If people find its depiction of competition and success distasteful, that is because it reflects our society; one only has to see the sport analogies used by politicians to see that sport, entertainment and politics all borrow from each other on a regular basis.

Anyway. Why does Wrestlemania matter? Well, it began the slow breakdown of the structures which had held up pro wrestling across the world. In many cases, it shows the power of globalisation.

With so many different styles, why is it the WWF’s product that people generally associate with ‘rasslin? Primarily it’s because of the success of Wrestlemania and Vince McMahon Junior’s attempts to make his company an international enterprise. In the 1970s, the wrestling world was split into “territories”. The National Wrestling Alliance (NWA) maintained an international system in which promotions would not actively compete in each other’s geographical area. Even those companies that were not part of the NWA (such as McMahon’s WWF in New York, or Verne Gagne’s American Wrestling Association in Minneapolis) understood that this cartel was good for business. Wrestling was massive on international television, but very much localised. The UK would cheer on Big Daddy on World of Sport; New York would idolise the WWF’s Bruno Sammartino; Memphis loved Jerry Lawler. If a wrestler – usually a “villain” – became stale, he could go and work in another area as an unknown (or, perhaps, with a mythical reputation). Within the United States, however, different areas had different emphases. Some were more hard-hitting; some focused more on storytelling; others went more for athleticism. Regardless, the WWF was not the sine qua non of wrestling.

McMahon Junior bought the WWF from his father in the early 1980s, and planned to take the promotion onto national television. The rise of cable, coupled with new formats for “pay-per-view” events, opened up the possibility of marketing the WWF well beyond New York and New England in a cost-effective way. Against McMahon Senior’s wishes, Junior got his television show on cable across the country, and began signing the biggest stars from other companies (in flagrant violation of the NWA “gentlemen’s agreement”).

Wrestlemania was a massive gamble. McMahon spent big on luring Hulk Hogan away from the AWA in 1983 in preparation, building the company around his star power. Then he invested in hiring venues to show ‘Mania through “closed-circuit television”, and brought in Cyndi Lauper and Mr T as celebrity guests. It was a runaway success, leading eventually to international expansion.

The poster for Wrestlemania. Vince McMahon’s financial gamble paid off and eventually led to global expansion. (Source)

Unable to compete with the production values and celebrity of the expanded WWF, many of the local promotions went into terminal decline. The loss of their best draws to New York didn’t help. World of Sport in the UK went off the air, and replacement shows quickly lost ground to the glitzier and more bombastic McMahon product. By the mid-90s, only World Championship Wrestling (WCW) could seriously compete in the USA, with Extreme Championship Wrestling (ECW) offering a more lo-fi alternative. Japan and Mexico maintained their traditions, but in the latter case stars such as Rey Misterio Jr. and Juventud Guerrera moved north of the border for greater exposure and a bigger pay cheque. Shortly after the millennium, WCW and ECW went bankrupt, having overstretched their resources competing with the WWF.

McMahon’s vision of wrestling had won. It is still by far the most popular version of wrestling across the world. And while local variations continue to exist, the globalisation of the WWF product reflects many changes in the global economy. Everyone has their own version of the hamburger, but the Big Mac is still the most recognisable. The WWF was very 80s. And it’s kept me entertained ever since. That’s why I had to include it as the first entry in 30 for 30.

  1. Isn’t everything to historians…
  2. Chuck Austin, for example, landed on his neck after a move went wrong, paralysing him. He successfully sued the World Wrestling Federation for damages. For this and others, see ‘Worst botched moves in history’, Adam’s Wrestling Blog (19 June 2012) < http://adamswrestling.blogspot.co.uk/2012/06/worst-botched-moves-in-history.html > (accessed 2 March 2015).
  3. For an overview of this from an academic perspective, see the work of Benjamin Litherland at Huddersfield. < http://www.hud.ac.uk/ourstaff/profile/index.php?staffuid=smusbl > (accessed 2 March 2015).
  4. See the story of the first big TV star: ‘Gorgeous George’, Wikipedia < http://en.wikipedia.org/wiki/Gorgeous_George > (accessed 2 March 2015).
  5. The Allies wanted to destroy the culture of Japanese imperialism, and competitive sport was considered part of this. “Puroresu” helped fill the void and kept sport stadiums full during the 1940s. See ‘Puroresu’, Wikipedia < http://en.wikipedia.org/wiki/Puroresu > (accessed 2 March 2015).
  6. It might be a stretch to call Slaughter one of ‘the biggest stars’…
  7. Dion Beary, ‘Pro wrestling is fake, but its race problem isn’t’, The Atlantic (10 July 2014, 8.00am EST) < http://www.theatlantic.com/entertainment/archive/2014/07/the-not-so-fictional-bias-in-the-wwe-world-championship/374042/ > (accessed 2 March 2015).

30-for-30: Introduction

09/03/2015

I turn 30 this year. Which is a round number. So it means something.

Since I needed some motivation to write some history, I thought I would steal an old idea from ESPN: produce thirty pieces of history for the thirty years I have existed. Starting with 1985 and ending in 2014, I’ll be writing one post a week on something that happened in those particular years.

It’s going to be a mix of topics, using academic history alongside personal interpretations and other media. For each year, the subject reflects something that has changed the world I live in. It won’t necessarily be the most important event of that year, but its impact will be something still felt today. As the posts go on, of course, history will blur with journalism. But that just gives people more to argue about.

“1985” will be released next week. Topics will include sport, television, terrorism, communism, globalisation, nuclear explosions, science, public health, evolution, vaccination, Catholicism and civil rights.

In doing the research for this, it’s clear that historians will write about pretty much anything, and get it accepted into academic journals. I’ve learnt quite a lot about how events have shaped the popular history of the 1980s and 1990s. If, for any subject, you find related work that I’ve skipped over, do forward it on. Once this series is over, I plan to do a follow up piece summing up the past three decades.

Enjoy!


What is history?

22/10/2014

E.H. Carr covered this a while back. So there’s no need for an extended essay. However, The Internet linked me to an interesting piece in New York about the perils of predicting how the present will sit in the history of the future.

The fact is that we can’t write history while we’re in it — not even that first draft of history that journalists aspire to write. While 2014 may have a shot at eternal infamy, our myopia and narcissism encourage us to discount the possibility that this year could be merely an inconsequential speed bump on the way to some greater catastrophe or unexpected nirvana. This was brought home to me when, in a quest for both a distraction from and a perspective on our current run of dreadful news, I revisited 1964, the vintage American year that has been serving as an unofficial foil, if not antidote, to 2014.

Frank Rich, ‘Nothing you think matters today will matter the same way tomorrow‘, New York, 21 October 2014, 8.00am EST (accessed 22 October 2014, 10.40am BST).

Immediately, a couple of things jump out. The first being that “we can’t write history while we’re in it”. The author, Frank Rich, might have a point here. However, there’s a question over when history ends and the present begins. There’s a strong case to be made that what happened this morning is already in the past, and therefore the realm of history. My own work falls into the domain of “Contemporary History”, which can often include analyses that take into account events yet to play out.

Second, the idea of an “inconsequential speed bump”. Is history “one fucking thing after another”? A series of events, marking the ebb and flow of human evolution? While steering away from the word that makes historians go into anaphylactic shock – progress1 – there is a tacit idea that history is somehow the story of how we got from A to B; how some things got better and other things got worse; but ultimately it’s the story of how things that exist today came to be. That isn’t, necessarily, how historians approach their subject. Change over time is often an important concept in giving meaning and context to our work, but we often write about things and people that don’t exist today. Or if some remnant of them does, we make it very explicit that the “feminism” of the 1890s was a very different beast to the one of the 1990s – and that to attempt to trace a hard lineage from one to the other would be to impose presentist values of what “feminism” “really” “is”.2

Ultimately, this is the point we definitely agree on. It is certainly difficult to “write history while we’re in it”. History relies on context. It is the contextualisation of the past which allows us to even begin to understand events and the lives of people who lived there. That context might include what happened before and what would happen after – but this is not always so. Sometimes it will require greater understanding of the cultural, social and political situation and how it may have impacted upon our subject matter. Sometimes it will require rejecting presentist labels and attempting to redefine certain concepts using the values of those who would have understood them at the time.3 These things are incredibly hard to do when one has an incomplete set of sources (the events haven’t finished yet), or one is far too close to the subject matter at hand to be able to take a step back and reinterpret this history with a different conceptual framework.

As the piece shows, reading back to the 1960s to try and explain the present’s exceptionalism (in this case, how exceptionally bad today is) is just terrible history. It has to ignore so much of the context of the time that it risks painting an unrealistic picture of our society. Unfortunately, it is a common reaction to troubling times. The rise of UKIP in Britain has relied upon nostalgic visions of a past society in which England benignly ruled the Commonwealth, marriages lasted forever, and people knew their roles in life. This was never the case. Any historian of the nineteenth or twentieth centuries – and in that I include “anyone who has read a book” – can tell you what twaddle this vision is. But it resonates as an easy explanation for the supposed problems of the present, and a template for a better future. That we lack the tools as a society to question it says a lot about the state of history education in this country. We might teach kids facts, but we’re certainly not teaching them how to weigh and evaluate evidence. I believe scientists are having a similar whinge right now.

At the same time, we shouldn’t be afraid about beginning to write histories of the present. Our problems usually stem from trying to “predict the future” or placing today in the grand chronology of stuff what happened. By using historical context to place some of the trends, events and people of the recent past within a larger explanatory framework of human activity, we’re just doing our jobs. And showing how today is just as historically constructed as the past.

  1. The glossary entry on “Progress” explains some of my misgivings about the term.
  2. Yes. The scare quotes are necessary… ahem.
  3. Ethnography.