Crystal Dynamics’ new game Tomb Raider (2013) adopts the strategy of the film Star Trek (2009) in using a prequel-cum-reboot to revitalize a franchise that had run out of steam: You play the game as a version of Lara Croft far younger and less experienced than in her previous 10 iterations, whose adventures in the game are meant to be her “formative” experiences—but it’s not at all clear that she’ll turn out to be exactly the same Croft. Unlike J.J. Abrams’ Star Trek film, the new Tomb Raider game doesn’t bend over backwards to explain the discontinuity between it and the previous iterations of the series; such obsessive attention to narrative continuity, it would seem, is not particularly expected within video games. This is perhaps because of the different devices of universe creation endemic to games.
A film and television series like Star Trek creates rules for a fictional world through a succession of texts that are each meant to envelop the spectator as invisible observer. The rules of that world take on something of an intractable solidity both by being relayed via verbal and visual exposition, and by remaining relatively consistent between individual texts. The experiential world of a video game, on the other hand, is produced to a large degree by the nature of the user’s specific interactions with that game. Rather than being narrated to a spectator, then, the rules that govern the universe of a video game are enacted by a user.
This aspect of gaming is most pointedly illustrated by games like Spore (2008), which explicitly thematize the user’s collaboration with the software to create a new world. But it’s present to some degree in every video game, inasmuch as the object of every video game is for the user to affect a virtual world. Tomb Raider is fascinating from its early moments because, like so many recent video games, it makes Croft’s interaction with a natural environment a significant gameplay element. As in Red Dead Redemption (2010), Assassin’s Creed III (2012) or Far Cry 3 (2012), to name just a few, the user is encouraged to spend time stalking animals in the forest with a myriad of hunting appendages and skills, or scrounging for rare plants that provide boosts to the player’s character.
At work here may be a certain nostalgia for a way of life the “developed world” thinks it has lost contact with, for a world that might in fact be waning due to pollution and population growth. While this interpretation certainly has some weight, it doesn’t take into account the particularities of video gaming: The environment of these games is not simply a romanticized, nostalgic image of nature; it is also an interactive ecology. In playing these games, one becomes part of a virtual system of interactions, exchanges and exploitations. What these games are concerned with is not just hunting, but the very question of existing within an ecological system. Their authors have realized that it is a question video games seem particularly adapted—so to speak—to deal with.
Video games can allegorize ecology because their virtual worlds depend on a network of actions and actors, only some (or one) of which are controlled by an actual organic being (the user). Playing a video game means learning how to act toward the different elements of the game—both in what your character does on screen and what your thumbs do on the controller—and understanding how one’s actions alter one’s environment. Such games might compel us to think about our relationship to our environment, meaning the totality of the things that surround us, the interactions which make up our experiential world. It’s true, on the one hand, that in rewarding users with so many points or particular bonuses for killing the highest number of animals or harvesting the highest number of plants, video games can be seen as partaking of a dangerous cultural logic. On the other hand, the same games might actually make us think about the way natural objects and beings are so frequently reduced to quantitative and utilitarian identities in our society.
Although it’s introduced in the tutorial as a fundamental part of gameplay and of the progression of Croft’s character, it quickly becomes clear, strangely, that hunting is not an essential part of Tomb Raider. The game doesn’t have the massive, open-world format of Red Dead or Far Cry, and the bonuses you get from hunting aren’t diverse or vital to Croft’s survival: They’re just a set amount of points. One might question why the game even includes this hunting mechanism, but it seems to me that it’s when a gameplay element becomes pointless—a habitual, generic trope—that we can most clearly see how significant it is. In Tomb Raider, one can’t help but notice how thoroughly rationalized and unnatural this virtual nature is. The stupidity of incorporating hunting into the game but doing so little with it is the stupidity of the meaningless exploitation of nature, and the game makes you feel how ultimately unpleasurable, how experientially circumscribed, this makes human life.
Pat Brown is a graduate student in Film Studies at the University of Iowa. No, that doesn’t mean he makes movies; he just likes them a lot.