Anarchy in the UK…Archaeological Sector? A Brief Introduction to an Alternative Approach to Archaeology

One of my goals for 2019 is to try and make my work even more accessible – including conference and journal papers! I know that those can be hard to read due to jargon and the general sleep-inducing nature of the academic writing style, so I’ll be writing accompanying blog posts that are more accessible (and hopefully more fun!) to read with just about the same information. And if you’re a nerd, I’ll also add a link to the original paper too. Today’s blog post comes from a paper I presented at the 2018 Theoretical Archaeology Group Conference – you can find the full text here.

If you think about the word “anarchist”, you probably have a very specific image that comes to mind – some sort of “punk” masked up and dressed all in black, probably breaking windows or setting fires. And while that may be accurate praxis for some who wave the black flag (and also completely valid!), I’d argue that it doesn’t necessarily do the actual concept of “anarchism” justice…although, to be honest, I do love to wear black clothes.

So then…what is anarchism? And how can it relate to archaeology?

A slide from my original TAG 2018 presentation on Anarchism and Archaeology showing various images of what most people consider to be “anarchist”.

To use Alex Comfort’s definition (1996), anarchism is “the political philosophy which advocates the maximum individual responsibility and reduction of concentrated power” – anarchy rejects centralised power and hierarchies, and instead opts for returning agency to the people without needing an authority, such as a government body. Anarchy places the emphasis on communal efforts, such as group consensus (Barclay 1996).

So, how does this work with archaeology? Why would you mix anarchy and archaeology together? For starters – this isn’t a new concept! There have been many instances of “anarchist archaeology” discussions, from special journal issues (Bork and Sanger 2017) to dedicated conference sessions (see the Society for American Archaeology 2015 conference). There have also been a few instances of anarchist praxis put into archaeological practice: for example, there is the Ludlow Collective (2001) that worked as a non-hierarchical excavation team, as well as the formation of a specifically anarchist collective known as the Black Trowel Collective (2016).

To me, an Anarchist Archaeology is all about removing the power structures (and whatever helps to create and maintain these structures) from archaeology as a discipline, both in theory and practice. We often find that the voices and perspectives of white/western, cis-heteronormative male archaeologists are overrepresented. Adopting an anarchist praxis allows us to push back against the active marginalisation and disenfranchisement of others within our discipline. This opens up the discipline to others, whose perspectives were often considered “non-archaeology” and therefore unacceptable for consideration by the “experts” (i.e. archaeologists). In Gazin-Schwartz and Holtorf’s edited volume on archaeology and folklore, this sentiment is echoed by a few authors, including Collis (1999, pp. 126-132) and Symonds (1999, pp. 103-125).

And hey, maybe logistically we’ll never truly reach this level of “equitable archaeology” – after all, this is long, hard work that requires tearing down some of the so-called “fundamental structures” of the discipline that have always prioritised the privileged voice over the marginalised. But adopting an anarchist praxis isn’t about achieving a state of so-called “perfection”; rather, it’s a process of constantly critiquing our theories and assumptions, always looking for ways to make our field more inclusive and to make ourselves less reliant on the problematic frameworks that were once seen as fundamental.

It’s a destructive process for progress…but hey, isn’t that just the very nature of archaeology itself?

Enjoy this poorly Photoshopped emblem of Anarchist Archaeology!

References

Barclay, H. (1996) People Without Government: An Anthropology of Anarchy. Kahn and Averill Publishers.

Black Trowel Collective (2016) Foundations of an Anarchist Archaeology: a Community Manifesto. Savage Minds. Retrieved from https://savageminds.org/2016/10/31/foundations-of-an-anarchist-archaeology-a-community-manifesto/.

Bork, L. and Sanger, M.C. (2017) Anarchy and Archaeology. The SAA Archaeological Record. 17(1).

Collis, J. (1999) Of ‘The Green Man’ and ‘Little Green Men’. In Gazin-Schwartz, A. and Holtorf, C.J. (editors) Archaeology and Folklore. Routledge. pp. 126-132.

Comfort, A. (1996) Preface. In Barclay, H. People Without Government: An Anthropology of Anarchy. Kahn and Averill Publishers.

Ludlow Collective (2001) Archaeology of the Colorado Coal Field War, 1913-1914. Archaeologies of the Contemporary Past. Routledge. pp. 94-107.

Symonds, J. (1999) Songs Remembered in Exile? Integrating Unsung Archives of Highland Life. In Gazin-Schwartz, A. and Holtorf, C.J. (editors) Archaeology and Folklore. Routledge. pp. 103-125.


“Start at the Beginning, and When You Get to the End, Stop” – The Archaeology of Time

At the time of writing this blog post, we are only three days into 2019. I’ll be honest – I’ve experienced 25 years on this planet and I still make New Year’s resolutions. The usual ones, of course: exercise more, consume less sugar, etc. And, of course, these resolutions usually make it until mid-February before I completely ditch them and continue to eat chocolate bars every day without touching my running shoes. I know New Year’s resolutions are silly gimmicks, marketed by gyms and health apps to make lots of money come January 1st. But I have always liked to utilise the New Year as a time for restarting my daily routines, renewing goals – I mean, I have an entire year ahead of me with so many possibilities, right?

So in honour of the New Year, let’s look at how we measure time in archaeology.

A diagram of “typical” archaeological stratigraphy (Image Credit: Crossrail Ltd.)

There are many ways that archaeologists create chronologies, and we often combine several methods to get a better idea of what a site’s timeline was like. Possibly the easiest way to “see” time across a site’s archaeological record is to look at the cross-section of a trench during excavation. The stratigraphy of an archaeological site can usually be seen as a series of “layers”, almost like a cake…if the cake was made out of various soils, organic material, and artefacts. These layers provide us with a general idea of the order in which materials were deposited – this includes both natural and anthropogenic materials. It may be easier to think of archaeological stratigraphy as a sort of “visual starting point” for further developing a chronology for the site (Harris 1989). In an ideal world, we could simply look at the layer on the bottom to determine the “beginning” of the site’s history…but of course, things are never that simple.
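This “visual starting point” is exactly what a Harris matrix formalises: each recorded context is a node, and “deposited earlier than” relationships form a directed graph that can be sorted into a possible sequence. Here’s a minimal sketch in Python – the context names and relationships are entirely invented for illustration, not from any real site:

```python
from graphlib import TopologicalSorter

# Hypothetical stratigraphic record: each context maps to the set of
# contexts directly beneath it (i.e. deposited earlier). In Harris-matrix
# terms, these are the "stratigraphically later than" relationships.
stratigraphy = {
    "topsoil": {"pit fill"},
    "pit fill": {"pit cut"},
    "pit cut": {"occupation layer"},
    "occupation layer": {"natural subsoil"},
    "natural subsoil": set(),
}

# A topological sort yields an earliest-to-latest depositional sequence.
order = list(TopologicalSorter(stratigraphy).static_order())
print(order)
```

Because this toy example is a simple chain, the sort is unambiguous; real sites produce branching graphs with many equally valid orderings, which is precisely why stratigraphy alone gives relative, not absolute, dates.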

During post-excavation, there are numerous methods available to an archaeologist for further dating. Having a typology (read more on typologies here) of a certain artefact, such as pottery, can help an archaeologist get a general idea of what time period they are currently dealing with. Within archaeological science, there are a variety of lab-based methods for dating: radiocarbon, potassium-argon, uranium-series, etc.

Of course, these methodologies aren’t perfect, nor are they definite. In fact, archaeologists differentiate between absolute and relative chronologies. Absolute chronologies provide us with approximate calendar dates, often from lab-based methods such as radiocarbon dating. Relative chronologies, on the other hand (for example, using a typology to determine an approximate period of creation and use), establish general time periods from the relationship between a previously occupied site (and its material remains) and an overall culture (Fagan and Durrani 2016).
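To give a feel for where those “approximate dates” come from: a conventional radiocarbon age is calculated from the fraction of carbon-14 remaining in a sample, using the decay equation with the Libby half-life convention. A minimal sketch (note this gives an *uncalibrated* age in years BP – real dates then need calibration against curves such as IntCal):

```python
import math

LIBBY_MEAN_LIFE = 8033  # years; derived from the Libby half-life of 5568 years

def radiocarbon_age(fraction_modern: float) -> float:
    """Conventional (uncalibrated) radiocarbon age in years BP,
    from the measured fraction of modern 14C left in a sample."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

# A sample retaining half its original 14C dates to one Libby half-life:
print(round(radiocarbon_age(0.5)))  # 5568 years BP
```

This is also why lab dates come with error ranges: the measured fraction itself carries uncertainty, which propagates into the age.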

Additionally, there are many external factors that can affect the recovered context of a site, thereby complicating the timeline – for example, burrowing creatures may cause some artefacts to fall into the contexts of others. There have also been many cases of re-using older artefacts and spaces, which can complicate the timeline further (you can read more on recycling and re-using the past here).

Overall, however, archaeology has been a useful tool for conceptualising the beginnings of things – while we cannot establish with certainty the absolute start of agriculture or domestication, for example, we have been able to develop an approximation of how early humans were practising such concepts.

And let’s be real – time itself is a fascinating concept. While we have this sort of “standardised” method of calculating and measuring time today, we cannot truly account for past perspectives on time. Of course, we can find material evidence that may illustrate the physical act of “keeping time” in the past, but how did people in the past really experience time? Think about how quickly an hour can go by today, just by watching random videos on YouTube or Facebook on your smartphone. Remember how much longer an hour felt when we didn’t always have access to the Internet at all times, prior to smartphones and other such devices? What about someone in the past who has a completely different mindset to us – how did they experience an hour?

…honestly, I could probably prattle on for hours and hours about this (and how would you experience that??).

Anyway, hope you all had an easy transfer from 2018 to 2019 this past New Year. Here’s to another year of writing incoherent, rambling posts that you hopefully find entertaining at the very least. And thank you all for supporting and reading my work last year, too – hope to see you all back again at the end of 2019!

References

Fagan, B.M. and Durrani, N. (2016) In the Beginning: an Introduction to Archaeology. Routledge.

Harris, E.C. (1989) Principles of Archaeological Stratigraphy. Academic Press.

Studies in Skyrim: Vampirism Yesterday and Today

When your Skyrim character becomes a vampire, their skin becomes deathly pale and their eyes turn an otherworldly orange.

In the Elder Scrolls video game series, there are many fantastical creatures and monsters inhabiting the world, both friendly and hostile to the player character. One of these monsters (or perhaps that’s a bit too judgemental?) is the vampire, whose curse (or blessing?) is passed to others through a disease called “sanguinare vampiris”, also known as “porphyric hemophilia” in earlier games. The player character can become infected with this disease and become a “creature of the night”, obtaining all the advantages and disadvantages of vampirism.

Vampire lore was elaborated on extensively in Skyrim, specifically through the downloadable content Dawnguard, which places the player character in the middle of a conflict between vampires and vampire hunters. In this DLC, it is explained that there are many individual clans of vampires across the world, with the most powerful vampires known as “pure-bloods”. A pure-blooded vampire will have been granted their powers from the Daedric Prince (basically one of the Elder Scrolls deities) Molag Bal directly. The DLC also introduced the “Vampire Lord” form – this is considered the ultimate form of vampirism and is usually a power that only pure-blooded vampires have.

The Dawnguard DLC reveals that one can become a “Vampire Lord”, which completely transforms the body into a powerful creature of the night.

The idea of the “vampire” is a relatively old one, of course. In Europe, it seems that vampirism became a topic of interest during the 18th century, with the word “vampire” officially entered into the Oxford English Dictionary in 1734. Many early stories of vampires appear to have originated from German and Slavic folklore, although there are many instances of vampire-like creatures in stories around the world (Barber 1988).

Literature and film eventually created what we may consider today to be the “archetypal vampire” – Polidori’s The Vampyre, Sheridan Le Fanu’s Carmilla, and Bram Stoker’s Dracula provided the textual background for the modern day vampire, while F.W. Murnau’s Nosferatu and Tod Browning’s Dracula ultimately solidified the visual characteristics associated with the monster that are still used to this day. However, we still occasionally get new “twists” on the old formula in popular culture – from “sexy, brooding vampires” (see Anne Rice’s The Vampire Chronicles series or Stephenie Meyer’s Twilight series) to more hilarious takes on vampire culture (see Jemaine Clement and Taika Waititi’s mockumentary What We Do in the Shadows).

A digital recreation of how some “vampire burials” would place a stone or brick into the mouth of the dead (Image Credit: Matteo Borrini)

But what about vampires in archaeology? We already looked at lycanthropes in the archaeological record – can we find vampires as well?

Well…kinda.

When it comes to “deviant” burials, or burials that differ from normative burial practices, it’s easy to draw negative assumptions about the deceased, particularly when combined with “flights of fancy” of local folklore. Among these deviant burials, many have been interpreted as possible “anti-vampire burials”; this term was first used off-handedly in 1971 by Helena Zoll-Adamikowa and eventually popularised throughout Slavic archaeological literature to refer to most burials that defied funerary norms (Hodgson 2013).

The evidence used to support these “anti-vampire burial” interpretations includes protective grave goods (like sickles), stones left atop bodies, stakes or knives driven through the chest, decapitations, and, perhaps one of the more prolific examples, stones or bricks placed within the mouth of the deceased (Barrowclough 2014).

While these “deviant” burials undoubtedly show how pervasive the idea of vampires – or, more generally, the undead – was throughout folklore, we should also be a bit cautious about generalising all non-normative burials in this way, of course! There has been plenty of debate even regarding the evidence mentioned above. But perhaps the most solid thing to come out of all of this archaeological research is how such abstract concepts can ultimately be reflected in the material culture that remains.

Oh, and that apparently if you run into a vampire, you should definitely stuff a brick in their mouth.

References

Barber, P. (1988) Vampires, Burial, and Death: Folklore and Reality. Yale University Press.

Barrowclough, D. (2014) Time to Slay Vampire Burials? The Archaeological and Historical Evidence for Vampires

Bethesda Game Studios. (2011) The Elder Scrolls V: Skyrim.

Hodgson, J.E. (2013) ‘Deviant’ Burials in Archaeology. Anthropology Publications. 58. pp. 1-24.

“I Love Dying and Being Dead” – Late Capitalism and Modern Perceptions of Death

Lately, archaeologists have been a bit concerned about memes. No, not because they’re trying to perfect their comedic skills – rather, there’s been a relatively recent rash of popular memes derived from several big archaeological finds. For example, a nearly complete human skeleton was recovered in Pompeii, originally interpreted to have been crushed to death while fleeing the eruption of Mt. Vesuvius in 79 CE. The image used to publicise this excavation – a skeleton whose head has been obfuscated by a stone slab – ended up being used by many as a meme on social media like Twitter and Facebook. This led to a further discussion by archaeologists across the Internet on respecting human remains and whether or not it was ethical to make memes out of recovered bodies, regardless of their age or unknown identity (Finn 2018).

A Tweet from Patrick Gill (@Pizza_Suplex) commenting on the skeleton recovery that says, “Me to a panicked group of archaeologists moments after I drop a big ass rock on a perfectly preserved Pompeii skeleton: Chill. Let me talk to the press. I’ve got this.”

Although the main concern with this “meme-ification” of the dead is the ethics at play (for more on the ethics of human remains on display, see my blog post on selfies with human remains in the recent Tomb Raider game), I’m more interested in why memes utilising the dead – or associated with death and dying – are so popular these days.

Let’s talk about late capitalism and how it shapes the average young person’s everyday life, shall we?

An image of a tombstone that says “This Space for Rent” – the caption added above it says, “Capitalism even ruins the sweet release of death smh”

Millennials have had the utmost misfortune to reach young adulthood (the “pivotal years”, as many call this time period) during late capitalism. This means that, as a generational group, they are significantly poorer than previous generations (O’Connor 2018), with a growing number unable to even save money (Elkins 2018) from a severe lack of fair wages. This is the generational group that is leaving higher education with high amounts of debt, only to find a feeble job market that demands long hours for little pay. It’s a pretty bleak future that young people seem to have inherited, so it’s honestly hard to blame them for developing such a morbid sense of humour that utilises iconography and imagery associated with death to express such futility in a way that’s become palatable for everyone else.

A meme from Da Share Zone (@dasharezOne), a popular social media presence that makes images using stock photos of skeletons. This one depicts a (fake) human skeleton wearing a fur coat with a flower crown. The text around it says, “Looking Good, Feeling Bad”. Relatable!

What interests me the most as an archaeologist is how this affects our perception of death and dying in modern times. Morbid memes may be contributing to a sort of desensitisation to dying, to the point where it is no longer taboo or frightening to speak of the dead – in fact, people actively make fun of the dead and the concept of dying. I would argue that this runs counter to the Positive Death Movement, which strives to cultivate a more positive and respectful attitude towards death. I think, as archaeologists, we definitely need to push back against the meme-ification of the dead as a violation of ethics – but I also think we should consider why this has become a trend, how the socio-political characteristics of the world at large can cause these things to become popular, and how we can take this approach and apply it to our interpretations of the past.

References

Elkins, K. (2018) A Growing Percentage of Millennials Have Absolutely Nothing Saved. CBNBC. Retrieved from https://www.cnbc.com/2018/02/09/a-growing-percentage-of-millennials-have-absolutely-nothing-saved.html

Finn, E. (2018) Pompeii Should Teach Us to Celebrate People’s Lives, Not Mock Their Death. The Conversation. Retrieved from http://theconversation.com/pompeii-should-teach-us-to-celebrate-peoples-lives-not-mock-their-death-97632

O’Connor, S. (2018) Millennials Poorer than Previous Generations, Data Show. Financial Times. Retrieved from https://www.ft.com/content/81343d9e-187b-11e8-9e9c-25c814761640


OM NOM NOM or Did I Really Use an Old, Bad Joke to Introduce a Post on Gnawing?

Hi, welcome back to the early-to-mid 2000s, where we still use jokes like “om nom nom” unironically!

Just kidding, I won’t subject you to bad jokes like that for this entire post. Anyway, it’s come to my attention that for a blog called “Animal Archaeology”, I don’t really write that much about the archaeology of animals, huh? Well, today will change that! Here is a brief introduction to how we identify gnaw marks on certain bones – because humans aren’t the only species to eat other animals, don’t ya know?

Rodents

Rat skull and mandible

Rodent gnawing is probably the easiest to recognise. Due to those huge incisors of theirs, rodents leave behind a very distinct pattern of closely spaced striations on the bone. Be warned, however! It can be easy to mix these up with cut marks, or vice versa.

Rodent gnaw marks on a Bison bone (Photo Credit: Alton Dooley)

Felines

Domestic Cat skull and mandible

Cats do indeed gnaw on bones! And they have a pretty peculiar way of doing so – when they hold onto a bone, they’ll use their canine teeth, which will often leave a puncture mark! Given their smaller size, these marks will often be a bit small and usually won’t go entirely through the bone (although if you’re dealing with a bigger feline, like a lion, you may find yourself with bigger and deeper puncture marks!). Cats will also do a bit of a “nibble”, leaving behind a very pitted and rough looking texture.

Examples of feline gnawing (with tooth punctures) from experiments (Image Credit: Jennifer A. Parkinson, Thomas Plummer, and Adam Hartstone-Rose)

Canines

Wolf skull and mandible

This is possibly something you can check right now if you have dogs as pets – take another look the next time they chew up a bone. Canine species like dogs and wolves produce gnaw marks similar to felines in that they often leave a puncture hole in the bone with their teeth. However, canine species will usually produce much larger holes in comparison. Another key characteristic is that canine species slobber – when they gnaw on bones, they often produce what can only be described as “an upsetting amount of saliva”. This is great for zooarchaeologists, however, as it can leave behind a very distinct, polished look on the bone. So, the next time you see a beautifully polished archaeological bone…it was probably covered in ancient dog spit.

Some examples of canine gnawing on discarded worked bones (Image Credit: Reuven Yeshurun, Daniel Kaufman, and Mina Weinstein-Evron)

Humans

Replica human skull and mandible (Photo Credit: Bone Clones)

Yes, occasionally we do find human gnaw marks, although now we’re a little bit out of my jurisdiction! Our teeth look weird – well, at least compared to non-human teeth – so the kind of gnaw marks we leave are a bit…wonkier? Is that the right word? Just bite into an apple and see what you leave behind; it’ll depend on how your incisors look, as we often lead with them to bite down onto something. Personally, I have pretty large buckteeth, so I’d hate to be the zooarchaeologist looking at the teeth marks I left behind, trying to figure out what the heck happened!

Human gnaw marks left behind on various sheep bones (Image Credit: Antonio J. Romero)

References

Hays, B. (2016) Volunteers Chew Bones to Help Identify Marks of Earliest Human Chefs. United Press International. Retrieved from https://www.upi.com/Science_News/2016/08/02/Volunteers-chew-bones-to-help-identify-marks-of-earliest-human-chefs/2831470153494/

Parkinson, J.A., Plummer, T., and Hartstone-Rose, A. (2015) Characterizing Felid Tooth Marking and Gross Bone Damage Patterns Using GIS Image Analysis: An Experimental Feeding Study with Large Felids. Journal of Human Evolution. 80. pp. 114-134.

Yeshurun, R., Kaufman, D., and Weinstein-Evron, M. (2016) Contextual Taphonomy of Worked Bones in the Natufian Sequence of the el-Wad Terrace (Israel). Quaternary International. 403. pp. 3-15.

The Perfect Pokemon: A Brief Look at Selective Animal Breeding

Is there a “Perfect Pokemon”? Well, I guess technically there is the genetically engineered Mewtwo…but what about “naturally occurring” Pokemon? Can Trainers “breed” them for battle?

Pokeballs wait to be healed up.

A form of “Pokemon breeding” has been a vital part of the competitive scene for years. Players take advantage of hidden stats known as “Individual Values”, or “IVs”, which influence a Pokemon’s proficiency in battle. These stats can be changed through training and by utilising certain items in-game. To have the most control over a Pokemon’s IVs, it is best if a player breeds a Pokemon from the start by hatching them from an Egg, allowing for modification of stats from the very start. This is in contrast to using caught Pokemon, which are often above Level 1, so some of their important stats have already been changed “naturally” (Tapsell 2017).
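To show why players obsess over IVs: the post doesn’t spell out the maths, but the stat formula as documented by the Pokemon fan community (for non-HP stats, Generation III onward) makes the IV contribution explicit. A hedged sketch – the base stat, EV, and nature values below are invented for illustration:

```python
from math import floor

def battle_stat(base: int, iv: int, ev: int, level: int, nature: float = 1.0) -> int:
    """Non-HP stat formula as documented by the Pokemon community
    (Generation III onward). nature is 0.9, 1.0, or 1.1."""
    return floor((floor((2 * base + iv + floor(ev / 4)) * level / 100) + 5) * nature)

# At level 100, the full 0-31 IV range translates directly into stat points:
print(battle_stat(base=100, iv=0, ev=0, level=100))   # 205
print(battle_stat(base=100, iv=31, ev=0, level=100))  # 236
```

A 31-point swing at level 100 is the margin between winning and losing a mirror match, which is why hatching Eggs for good IVs became such a ritual.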

By looking at the stats of your Pokemon, you can figure out how best to perfect it through training.

But what about real life animal breeding? More specifically, “selective breeding” – this refers to human-influenced or artificial breeding to maximise certain traits, such as better production of certain materials (for example, milk or wool) or better physicality for domestication (stronger builds for beasts of burden, etc.). This is in contrast to natural selection, in which the traits best suited to survival and adaptation are passed on through breeding, although these traits may not be best suited for human use of the animal. Selective breeding is most likely as old as domestication itself, but it’s only recently (at least, in the past few centuries) that humans have more drastically modified animal genetics (Oldenbroek and van der Waaij 2015).

Various dog breeds represented by their skulls – an example of how breeding can dramatically modify the anatomy of an animal (Image Credit: A. Drake, Skidmore Department of Biology)

But can we see selective breeding archaeologically? For the most part, this sort of investigation requires a large amount of data – zooarchaeologists can see dramatic modifications to bred animals by examining large assemblages of animal remains over time. Arguably one of the best examples of this can be seen in looking at dog domestication and how breeding techniques have drastically changed aspects of canine anatomy (Morey 1994).

Zooarchaeological data can be supplemented by other sources of evidence, such as text and material remains. Perhaps the most powerful innovation in archaeological science, however, is DNA analysis – using techniques such as ancient DNA (aDNA), we can identify specific genetic markers to further investigate exact points of change (MacKinnon 2001, 2010).

The most recent additions to the Pokemon video game franchise, Pokemon: Let’s Go Pikachu and Let’s Go Eevee, have not only streamlined gameplay, but have also made the previously “invisible” stats more visible and trackable, to the chagrin of some seasoned Pokemon players. However, for new players this is undoubtedly a welcome change…now if only we could make it just as easy to see in real life zooarchaeology!

References

MacKinnon, M. (2001) High on the Hog: Linking Zooarchaeological, Literary, and Artistic Data for Pig Breeds in Roman Italy. American Journal of Archaeology. 105(4). pp. 649-673.

MacKinnon, M. (2010) Cattle ‘Breed’ Variation and Improvement in Roman Italy: Connecting the Zooarchaeological and Ancient Textual Evidence. World Archaeology. 42(1). pp. 55-73.

Morey, D.F. (1994) The Early Evolution of the Domestic Dog. American Scientist. 82(4). pp. 336-347.

Oldenbroek, K. and van der Waaij, L. (2015) Textbook Animal Breeding and Genetics for BSc Students. Centre for Genetic Resources The Netherlands and Animal Breeding and Genomics Centre. Retrieved from https://wiki.groenkennisnet.nl/display/TAB/Textbook+Animal+Breeding+and+Genetics

Tapsell, C. (2017) Pokemon Sun and Moon Competitive Training Guide. Eurogamer. Retrieved from https://www.eurogamer.net/articles/2017-12-15-pokemon-sun-and-moon-competitive-training-guide-how-to-raise-the-best-strongest-pokemon-for-competitive-play-4925

Studies in Skyrim: Big Game Collector

With the addition of Hearthfire as downloadable content, Skyrim allowed players to build and live in their own customisable homes. One of the options for buildable rooms included a “trophy room”, where players can erect trophy versions of some of the creatures that can be killed in-game. This ranges from real world game like bears to more mythical beasts like dragons. Yes, even in a game where you can kill and mount living tree creatures called “Spriggans”, the very human fascination with animal remains still exists!

A dragon skull mounted in my character’s trophy room.

Hunting trophies appear to be somewhat culturally ubiquitous, and can be found throughout the archaeological record. Although most discussions of prehistoric trophies tend to focus on headhunting and human remains (see Armit 2012), we do have plausible evidence that some animal remains recovered from sites were most likely kept as hunting trophies.

Of course, animal remains were used quite often in prehistoric life in ways that went beyond decor and trophies – modified bones reveal that it was common to create tools (needles, pins, combs, etc.) out of hunted animals. Another common interpretation for animal bones and other associated remains found in more “domestic” contexts is that they may have had some sort of ritual use – for example, there are many instances of animal bones deposited in pits and building foundations (Wilson 1999). Arguably some of the most famous examples of ritual use of animal bones are the Star Carr deer frontlets – these cranial fragments with the antlers still attached were possibly worn as headdresses or masks during rituals, perhaps as a way of evoking a form of transformation by the wearer (Conneller 2004).

A deer “frontlet” that may have been used as a mask or headdress from the Mesolithic site of Star Carr (Photo Credit: Neil Gevaux, University of York)

Hunting trophies as we understand them today were popular as far back as the medieval period, when hunting for sport not only resulted in trophies of animal remains – there were also “living trophies”, in which big game and exotic animals were captured and kept in menageries. The popularisation of natural history exhibits and taxidermy in the 19th and 20th centuries also brought with it a new wave of displaying hunted animals, both for education and for the sake of, well, showing off your hunting skills. However, this wasn’t the only way to display one’s hunted game – it was also quite popular to commission paintings of hunting trophies, which would eventually evolve into the popularisation of photographing one’s kills (Kalof 2007).

After encountering a tough snow bear in the wild, I immediately had to have it displayed in my character’s new home.

Ultimately, if we look at the concept of “trophy animals” as a whole, what can we learn about human-animal interactions throughout history? The concept of a “trophy”, regardless of the method in which it is displayed, is centred around the objectification of the dead animal. It is also often a sign of power and a visual reminder of the sort of hierarchies in place in society – after all, trophy rooms and hunting for sport are often associated with masculinity and elite status. Unsurprisingly, there are also associations with hunting trophies and colonialism, with many photographs showcasing white men in pith helmets next to their “exotic” game in colonised regions of the world (Kalof and Fitzgerald 2003).

But here, in our fantasy video game, our trophies stand – perhaps problematic by nature of their associations in real life – but also as reminders of the system in which Skyrim runs, where I fondly remember how that one snow bear managed to kill me at Level 3 at least a dozen times. And now that snow bear is stuffed in my house. How the tables turn.

Skyrim_20181113000123
Hunting trophies can be found all over Skyrim!

References

Armit, I. (2012) Headhunting and the Body in Iron Age Europe. Cambridge University Press.

Conneller, C. (2004) Becoming Deer: Corporeal Transformations at Star Carr. Archaeological Dialogues 11(1). pp. 37-55.

Kalof, L. and Fitzgerald, A. (2003) Reading the Trophy: Exploring the Display of Dead Animals in Hunting Magazines. Visual Studies 18(2). pp. 112-122.

Kalof, L. (2007) Looking at Animals in Human History. Reaktion Books Ltd.

Wilson, B. (1999) Displayed or Concealed? Cross Cultural Evidence for Symbolic and Ritual Activity Depositing Iron Age Animal Bones. Oxford Journal of Archaeology 18(3). pp. 297-305.

A Lesson in Taphonomy with Red Dead Redemption 2

Note: I struggled with whether or not to write about this game due to the issues surrounding its development and the poor treatment of its workers (for more information, please read this article from Jason Schreier). However, I think it marks an interesting development in the ever-growing world of virtual archaeologies, so I proceeded to write about it. That being said, please show support for the unionisation of game workers by visiting Game Workers Unite.

Red Dead Redemption 2 (Rockstar Studios 2018) has only been out for a short while, but many players have been praising the level of detail that has gone into the game. One of the most striking features, at least to me as an archaeologist, is the fact that bodies actually decay over time. That’s right, video game archaeologists – we now have some form of taphonomy in our virtual worlds!

But wait, what is “taphonomy”? Well, you may actually get a few slightly differing answers from archaeologists – we mostly agree that taphonomy refers to the various processes that affect the physical properties of organic remains, but where the process begins and ends is still a matter of debate. For the purposes of this blog post, I’m going to use a definition from Lyman (1994), which describes taphonomy as “the science of the laws of embedding or burial” – or, to put it another way, the series of processes that create the characteristics of an assemblage as recovered by archaeologists. This includes not only pre-mortem and post-mortem processes, but also processes that occur post-excavation, as identified by Clark and Kietzke (1967).

rdr2-brand-new-1200x450
Promotional Image Credit: Rockstar Games (2018)

Let’s start with the pre-mortem processes, which are often ignored in discussions of overall taphonomy. First, we have biotic processes, which set up the actual conditions of who or what will be deposited in our final resulting assemblage – this can include seasonal characteristics of a particular region, which will draw certain species to inhabit the area (O’Connor 2000), as well as cultural factors, such as exploitation and, unfortunately, colonisation/imperialism (Hesse and Wapnish 1985).

Now, let’s use some poor ol’ cowboys from Red Dead Redemption 2 as examples of post-mortem processes – Content Warning: Images of (digital) human remains in various stages of decay are about to follow, so caution before you read on!

RD1
Image Credit: YouTube user WackyW3irdo (2018)

With our biotic processes providing us with these cowboys who have moved West for a variety of reasons, we now need to determine a cause of death to continue with taphonomy. This falls under thanatic processes, which cause death and the primary deposition of the remains (O’Connor 2000). In our example above, we would probably be able to find osteological evidence of trauma, as the cowboy was shot to death.

RD2
Image Credit: YouTube user WackyW3irdo (2018)

In time, we soon see the work of taphic processes, or the chemical and physical processes that affect the remains – this is also sometimes referred to as “diagenesis” (O’Connor 2000). Much of what we consider to be “decay” when we think of decomposition will fall under this category of processes. Sometimes this will also affect the remaining structure and character of bone that will eventually be recovered.

RD3
Image Credit: YouTube user WackyW3irdo (2018)

Now, imagine we take this body and, as seen in the YouTube video from which these images come, toss it down a hill. Okay, this is a bit of an over-the-top example, but it showcases another category known as perthotaxic processes. These cause movement and physical damage to the remains, through either cultural (butchery, etc.) or natural (weathering, gnawing, trampling, etc.) means. Similar to these are anataxic processes, which cause secondary deposition and further expose the remains to other natural factors that will alter them further (Hesse and Wapnish 1985).

RD4
Image Credit: YouTube user WackyW3irdo (2018)

The above image shows the remains of the cowboy finally reaching his secondary place of deposition after being tossed from the top of the hill and now drawing the attention of scavenger birds – this showcases an example of an anataxic process, as the body is being scavenged due to exposure from secondary deposition.

RD5
Image Credit: YouTube user WackyW3irdo (2018)

At this point, we begin to see how all of the aforementioned processes have affected our current archaeological assemblage-in-progress: we clearly have physical and chemical signs of decay, with physical alteration due to post-mortem trauma (tossing off of a hill) and exposure (including gnawing from other animals). This results in some elements going missing, some being modified, and others being made weaker and more likely to be absent by the time the body is recovered archaeologically.

Now, we also have two processes that occur during and after archaeological excavation that, again, often get overlooked: sullegic processes, which refer to the decisions archaeologists make when selecting samples for further analysis (O’Connor 2000), and trephic processes, which refer to the factors that affect the recovered remains during post-excavation work: curation, storage, recording, etc. These are often ignored, as they don’t necessarily tell us much about the context surrounding the remains, but they are vital to consider if you are working with samples that you did not recover yourself or that have been archived for a long time prior to your work.
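If it helps to see the whole sequence in one place, here is a purely illustrative sketch (not from any real zooarchaeology software – the class, function names, and example skeletal elements are all invented for this post) that models the stages described above as an ordered pipeline acting on an assemblage, with some elements lost along the way:

```python
# Illustrative sketch: the taphonomic stages discussed above, modelled as an
# ordered pipeline. Stage names follow O'Connor (2000) and Hesse and Wapnish
# (1985) as summarised in this post; everything else is invented.

from dataclasses import dataclass, field

# The seven stages, in the order they act on remains.
STAGES = [
    ("biotic", "conditions determining who/what enters the assemblage"),
    ("thanatic", "death and primary deposition"),
    ("taphic", "chemical/physical decay (diagenesis)"),
    ("perthotaxic", "movement and physical damage"),
    ("anataxic", "secondary deposition and re-exposure"),
    ("sullegic", "archaeologists' sampling decisions"),
    ("trephic", "post-excavation curation and storage"),
]

@dataclass
class Assemblage:
    """A set of skeletal elements plus a record of the stages applied."""
    elements: set
    history: list = field(default_factory=list)

    def apply(self, stage, lost=()):
        """Record a stage and remove any elements lost to it."""
        self.elements -= set(lost)
        self.history.append(stage)
        return self

# Our unlucky cowboy: shot (thanatic), then decaying (taphic) with the
# small bones of the hands lost to decay and scavenging.
cowboy = Assemblage(elements={"skull", "femur", "ribs", "phalanges"})
cowboy.apply("thanatic").apply("taphic", lost={"phalanges"})
print(cowboy.elements)  # the phalanges are gone by the time of recovery
print(cowboy.history)
```

The point of the sketch is simply that what archaeologists recover is the end product of every stage in order – each one can remove or alter elements before the next even begins.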

RD6
Image Credit: YouTube user CallOfTreyArch (2018)

Environmental differences will also affect how the overall taphonomic process plays out – for example, wet environments (say, like the body of water seen in the image above) will cause the body to become waterlogged, which may speed up certain taphic processes and result in poorer preservation. More arid environments, like a desert, may lead to better preservation in some cases due to the lack of water that would otherwise damage the bones.

RD7
Image Credit: YouTube user CallOfTreyArch (2018)

Although the game certainly speeds these processes up and streamlines them in a way that removes some of the variables you would see in real life, I’d argue that Red Dead Redemption 2 might currently be the most accurate depiction of taphonomy in a virtual world – one that may present new opportunities for developing models that could further our understanding of how remains decay under certain circumstances.

At the very least, it could make taphonomic experiments easier – and less smelly!

References

CallOfTreyArch. (2018) Red Dead Redemption 2 – In-Game Corpse Decay Timelapse. YouTube Video. Retrieved from https://www.youtube.com/watch?v=5izZ2gv17M8

Clark, J. and Kietzke, K.K. (1967) Paleoecology of the Lower Nodule Zone, Brule Formation, in the Big Badlands of South Dakota. Fieldiana: Geology Memoir. pp. 111-129.

Hesse, B. and Wapnish, P. (1985) Animal Bone Archaeology: From Objectives to Analysis. Taracuxum Inc.

Lyman, R.L. (1994) Vertebrate Taphonomy. Cambridge University Press.

O’Connor, T. (2000) The Archaeology of Animal Bones. Sutton Publishing Ltd.

Rockstar Games. (2018) Red Dead Redemption 2.

WackyW3irdo. (2018) Red Dead Redemption 2 – Decaying NPC Body Timelapse. YouTube Video. Retrieved from https://www.youtube.com/watch?v=D2AoQyynYFM

“Death Positivity” for Pets: Are We Changing Our Attitudes Towards the Death Of Animals?

Content Warning – Today’s blog post will talk at length about animal death and will have some photos of taxidermy animals. Please proceed with caution and feel free to skip the blog post entirely if this is too upsetting.

Caitlin Doughty, founder of the Order of the Good Death, gives a talk at a Death Salon event (Photo Credit: @DeathSalon on Instagram)
The “Death Positivity” movement has truly become part of the mainstream discourse recently, ranging from a general increase in appreciation for all things aesthetically macabre to more organized events that educate others on death and the culture surrounding it. Arguably at the forefront of this movement in the United States is Caitlin Doughty, a mortician who started the Order of the Good Death as a means of engaging with death and dying in a more positive manner and combating the anxieties that surround death in modern society (Troop 2013). Doughty eventually began working with other organizers to create “death salons” – based on 18th century intellectual salons, these events gather academics, professionals, and creatives (such as musicians, artists, performers, and even chefs!) together to discuss aspects of death and the culture around death (Rosenbloom 2013).

But while our attitudes towards human death may be changing, what about our attitudes towards animal death? This may be a more complicated question than I originally thought – after all, given our use of animals for subsistence, for manufactured products, and sometimes as companions, humans constantly find themselves confronting animal death. However, there are two recent trends that I’ve noticed as someone who works with animal remains in their everyday life…

A typical array of “vulture culture” collections, processed and used in artwork by artist and seller Ossaflores (Photo Credit: @Ossaflores on Instagram)

Perhaps one example of changing attitudes towards animal death is the popularisation of “vulture culture” online – this term often refers to enthusiasts who collect animal remains, either as skeletal elements or as taxidermies. Not everyone in the community processes their own remains, but everyone expresses a passion for collecting specimens, whether from online sellers or by finding naturally defleshed remains in the wild. Some enthusiasts are also artists who incorporate animal remains into their artwork. It is usually emphasised that, as part of the community’s ethics, “vulture culture” collections are derived from naturally deceased animals (Miller 2017).

An example of “pet aftercare” in the form of full taxidermy, done by Precious Creature Taxidermy (Photo Credit: @PreciousCreature on Instagram)

Another example of “animal death positivity” can be seen in the rise of pet mortuary businesses that specialise in “alternative aftercare”. This can take the form of a full taxidermy piece, a partial piece (for example, preserved tails or paws), or skeletal remains. Precious Creature Taxidermy, an alternative aftercare and taxidermy business run by Lauren Lysak in California, offers the previously mentioned processes as well as cremation, in lieu of what we might consider “traditional” funerary services (Lysak 2018). Although it may seem a bit macabre to taxidermy one’s pet, you could also consider this a deeper acceptance of death and its constant presence around all of us…in taxidermy form.


So, are we entering a new phase of “death positivity” with regards to animals? Do we even have a right to feel “death positive” towards non-human species – after all, many animal deaths are directly caused by human activities. I think this is a very complicated topic with many layers to it, touching on posthumanism, ethics, agency, and so on – perhaps it requires another, lengthier blog post! However, at least with regards to how humans experience the death of animals, specifically pets, I think we are making strides towards better understanding the processes of death, applying some aspects of human-oriented “death positivity” to our overall understanding of the concept as a whole.

References

Lysak, L. (2018) About Precious Creature Taxidermy. Precious Creature Taxidermy. Retrieved from http://www.preciouscreaturetaxidermy.com/new-page.

Miller, L. (2017) What is Vulture Culture? Vulture Gear Blog. Retrieved from https://vulturegear.com/blogs/vulture-gear-blog/what-is-vulture-culture

Rosenbloom, M. (2013) Death Salon LA…and Beyond! Death Salon. Retrieved from https://deathsalon.org/2013/11/04/death-salon-la-and-beyond/.

Troop, S. (2013) Death Salon Interviews Caitlin Doughty. Death Salon. Retrieved from https://deathsalon.org/2013/10/02/death-salon-interviews-caitlin-doughty/.

Terror and Tradition Over Time: A Look at the Material Culture Of Halloween

Some of my current Halloween decorations – surprisingly low-key compared to previous years!

Oddly enough, I didn’t really expect to run into many significant cultural differences when I first moved from the United States to the United Kingdom. So I was actually a bit surprised when Halloween first came around: I expected streets covered in decorations, but saw only a few pumpkins and paper bats placed here and there. It turns out that Halloween isn’t as big of a deal here as it is in the US; back where I grew up, I was used to seeing houses on my block completely transform into haunted places, complete with loud, scary noises and bloody, horrifying animatronic monsters! I had never considered the differences in the material culture and presentation of the holiday across cultures.

With that in mind, I figured a brief look into the history of Halloween material culture might make an interesting blog post to celebrate the holiday this year! Fair warning: some of these decorations may be pretty spooky.

672a180d2be1f5bc5876ba768f363ebeb71c81c5
A traditional turnip jack-o-lantern from the Museum of Country Life in Ireland (Photo Credit: Rannpháirtí Anaithnid)

 

Most academics seem to agree that our modern celebration of Halloween stems from a pagan tradition, although there tends to be some debate over which one. Many point to Samhain, a Celtic festival that celebrated the end of the harvest season, the preparation for the upcoming winter, and the warding off of spirits by using large bonfires. Others, however, point to Pomona, which was allegedly a festival celebrated in the name of the Roman goddess of fruit and seeds, also named Pomona (Rogers 2002). Unfortunately, we have little textual or archaeological evidence to support either of these theories besides the similarities in timing with modern day Halloween – in fact, we have no evidence of the Pomona Festival ever occurring and no evidence to suggest how widespread Samhain may have been (Moss 2013).

Regardless of the actual origin point of the holiday, we can see that the placement of All Saints Day on the 1st of November in the 8th century (possibly as a means of “Christianizing” Samhain) eventually led to the standardization of many traditions that are still associated with Halloween. This includes perhaps the earliest form of “trick-or-treating”, in which the poor would go from house to house and be given soul cakes (pastries or breads made to honour the dead) in exchange for praying for the dead of the household. Dressing up in costumes, or masquerading, also appears to have become a custom associated with All Saints Day, although it was for honouring the Christian saints rather than terrifying the local neighbours (Bannatyne 1998). And, of course, there is that terrifying tradition of “jack-o-lanterns”, which appears to have originated in Ireland – faces carved into root vegetables like turnips…thankfully, the tradition turned to pumpkins once it was brought over to North America, which is good, because have you seen how horrifying those turnip lanterns are?

1906
A tin Halloween parade stick from the early 1900’s (Image Credit: Mark B. Ledenbach, Halloween Collector)

Halloween and its traditions were introduced to the United States via the influx of immigrants from Ireland and Scotland during the mid-1800s. However, up until the early 1900s, Halloween was mostly an adult-oriented holiday, celebrated with dinner parties. This led to the popularity of home decorations, which were often promoted by booklets and catalogues such as Dennison’s Bogie Book (Mitchell 2017).

na4
A Halloween Die-Cut Sign from the 1930’s (Image Credit: Mark B. Ledenbach, Halloween Collector)

By the 1920s, Halloween was becoming standardized, in practice and in design, into the holiday that we recognize today. Most decorations on offer were “die-cuts” – basically paper decorations – as these were easily disposable. You probably still see die-cuts used to this day – think of the cute paper Halloween decorations hung up around schools. In the 1930s, trick-or-treating was practised more widely around the United States, prompting the popularity of decorations that were more cute than creepy. From then onward, Halloween was more of a children’s holiday (Eddy 2016).

448
A small selection from the 2017 TransWorld Halloween and Attractions Show (Photo Credit: Chelsea T., Haunts.com)

Today, Halloween has become entwined with modern consumerist culture – in fact, Americans spent approximately 9.1 billion dollars on Halloween decorations and costumes in 2017 (Mitchell 2017). And that’s not surprising given today’s emphasis on consumerism, which has tied itself to the concepts of nostalgia and pop culture that now propel many modern-day Halloween traditions – from dressing up as your favourite 90’s television character to hosting a marathon of “classic” horror films. Trends in consumption and aesthetics have also added to the holiday’s general popularity – by 2010, Halloween had become the most popular non-Christian holiday in the United States (Moss 2013).

With these changes in popularity and material trends, there has also been a significant shift in the main demographic for Halloween – although still enjoyed by children and young people, there has been a rise in popularity for adult Halloween costumes and adult-oriented celebrations, like Halloween parties organized at clubs, bars, and pubs (Belk 1990).

This trend can also be seen in the movement towards associating Halloween with the truly terrifying and gory. Due to advances in technology, computer animation, and prosthetics, modern-day horror media has never been more elaborate and realistic in its grim and grisly details. This has also carried over to amateur Halloween decorations, with homemade haunted houses and terrifying attractions taking the place of trick-or-treat spots (for some of the most spectacular-looking Halloween decorations and costumes, check out the TransWorld Halloween Showcase).

So, what can we see from this brief history of Halloween trends and patterns in material culture? Well, it’s hard to say – especially as the origins of the holiday are still widely debated. However, we could argue that Halloween has consistently been a holiday for invoking what is otherwise taboo – whether that’s communicating with spirits and saints, demanding treats and sweets from strangers and neighbours alike, playing pranks, or even just dressing a bit differently than what’s considered “normal”! Like most other popular holidays, Halloween has become entwined with consumerism and tied to pop culture through a variety of tropes and customs. And yet, we could also say that it remains a holiday truly rooted in tradition – from the carving of jack-o-lanterns to trick-or-treating, these traditions have been carried from one continent to another and have lasted hundreds of years…I think it’s safe to say that they don’t seem like they’ll be going away any time soon.

Have a safe and happy Halloween, everyone!

References

Bannatyne, L. (1998) Halloween: An American Holiday, an American History. Pelican Publishing Company.

Belk, R.W. (1990) Halloween: An Evolving American Consumption Ritual. Advances in Consumer Research. pp. 508-517.

Eddy, C. (2016) The History of Modern Halloween, as Seen Through its Decorations. Gizmodo. https://io9.gizmodo.com/the-history-of-modern-halloween-as-seen-through-its-de-1788207372

Ledenbach, M.B. (2018) Halloween Collector. www.halloweencollector.com

Mitchell, N. (2017) Halloween Decorating Hasn’t Been Around as Long as You Think. Apartment Therapy. https://www.apartmenttherapy.com/the-rather-modern-history-of-halloween-decorations-249863

Moss, C. (2013) Halloween: Witches, Old Rites, and Modern Fun. BBC: Religion & Ethics. http://www.bbc.co.uk/religion/0/24623370

Rogers, N. (2002) Halloween: From Pagan Ritual to Party Night. Oxford University Press.