Wednesday, September 21, 2011

Win, Lose, or Fail

A bunch of gaming writers have recently cycled back around to one of the most foundational questions of our art.  No matter what perspective each of us prefers, no matter which lens each of us uses, down at the bottom there's a single question even more important than the perennial argument of "Are games art?"

Our core issue is this: what are video games?

Michael Abbott over at The Brainy Gamer launched this most recent salvo with Games Aren't Clocks:

I say it's time to let go of our preoccupation with gameplay as the primary criterion upon which to evaluate a game's merits. It's time to stop fetishizing mechanics as the defining aspect of game design. Designers must be free to arrange their priorities as they wish - and, increasingly, they are. Critics, too, must be nimble and open-minded enough to consider gameplay as one among many other useful criteria on which to judge a game's quality and aspirations.

This caused a nearly instant rejoinder from journalist Dennis Scimeca at his personal blog, Punching Snakes, in which he asserted that actually, Games ARE Clocks:

Video games can afford to suffer some modicum of technical errors and still be playable – we routinely look past the regularly-scheduled bugs in Bethesda titles all the time without letting them ruin our fun – but if their mechanics are so broken so as to preclude play? Without play, there is no game, at which point nothing else matters.

I think the salient aspect of Abbott’s post starts midway through, when he expresses his frustration with the term “video game.” Rather than trying to redefine what the term means, in order to fit everything inside the same, comfortable box, however, I think we need new language entirely.

A few paragraphs later, he continues:

I might argue that The Sims has never been a video game, for the same lack of victory conditions. It is a simulation, a digital sandbox, and winning or losing has nothing to do with it. When competition ceases to be part of the equation, I think an object’s definition as a game should immediately be called into question. We don’t do this because even if we determined that “video game” no longer works as a descriptor, we have no fallback positions or options available.

It's an interesting debate, to me, because I think that in their own ways, both gentlemen are quite right.  Games are more than the sum of their mechanics, to many of us, and the word "game" is also loaded with connotations that may not apply to our modern interactive narratives.

Where I've gotten caught up, though, is in this idea of "winning" and "losing."  I don't think they've been the right terms to discuss game completion for a very long time.  BioShock isn't chess,  Plants vs Zombies isn't basketball, and Tetris isn't poker.  How do you decide if you're "winning" the character arc of Mass Effect, Fallout: New Vegas, or Fable III?

At its most basic, a game is something playable.  Whether it's got a story or not, no matter the genre, system, or type, a game is something that requires player input.  You, the consumer, are in some way integral to this experience.  Whether you push one button or speak a word into a microphone, whether you wave your arms at a motion sensor or deliberately hold still when you could act -- a game requires you to contribute.  That's the sum total of the agreement on our current definition of "gaming," and really that's quite a low bar.  Small wonder, then, that we keep looping through these arguments.

We don't just have a win / lose dichotomy anymore.  We do have completion and backlog; we have sandbox and short story.  But every title I can think of -- every title I've ever played and a thousand more I haven't -- has either a failure state or a success metric, and some have both.  Our metrics aren't necessarily competitive, and they might be imposed by the player rather than intrinsically by the game.  There are little successes and big ones, game-ending failures and completely surmountable ones, but every pixellated problem I've ever pounced on has at least one or the other.

(If at first you don't succeed, you fail.)

Writing about L.A. Noire and death in gaming back to back started me down the path of contemplating the failure state in general.  I hadn't really given it any thought before, but recently I've started to understand just how important it is.  Coupling the failure state with the success state (and no, they are not necessarily binary opposites) creates pretty much our entire dynamic of gaming.

Depending on the sort of player you are, this is either a total failure or a smashing success.

While I was starting to muse aloud on this idea on Twitter, Mattie and Line challenged me with The Sims.  That challenge leads to a critical point: player-determined goals are still crucial goals.  Your Sims can fail at their own little lives: going hungry, getting fired, burning the house down, or getting dumped by SimSpouse.  But it is common to play the game aiming for maximum drama in SimLives -- so, the argument runs, those aren't failure states at all.  They're successes.  That's all well and good, but the players who want SimHouse to burn down still have failure conditions available: the scenario in which the house, in fact, does not burn down.  The standard failure and success metrics, as envisioned by the designers, might be reversed but there are still measurable goals present, waiting to be accomplished. 

To a certain extent, most success goals can be said to be player-determined.  What's true success in Peggle: beating the story mode, or going back for an Ace and a 100% on every level?  What's good enough in Tetris: getting to level 10?  Beating your own old high score?  Beating someone else's?  What's a successful play-through of Mass Effect: paragon, renegade, or somewhere in between?

Even in Minecraft, the most popular sandbox to come along in gaming since dice were first rolled for stat sheets, there are successes and failures.  Both wear many faces, of course.  But success can look like this:


And failure can look (comically) like this:

Creation and destruction are player goals, rather than creator goals, but the game itself is still a set of tools that enables the player to achieve those goals (building a nice house, which is the sum of many smaller goals) or fail in them (committing accidental arson while installing the fireplace).

A huge amount of our gaming, though, is deliberately narrative.  Most of the games that I play certainly are.  This year alone has seen me in Fable III, Portal 2, Enslaved: Odyssey to the West, Fallout: New Vegas, Bastion, L.A. Noire, both Mass Effect titles, and another dozen or two that I can't immediately call to mind.  These are all cinematic stories, designed with beginnings, middles, and ends; the mechanics of their telling are a vehicle to carry us from plot point to plot point, mainly via weaponry.

Stories don't have failure conditions, but they do have endings.  Story-based games, though, often have a clear fail state: the game over screen.  Your character has died, or the setback you face is too severe to be overcome.  Game over, mission failed, you suck at shooting bad guys so your planet is destroyed.  Go back to a save point and try again.

Of course, sometimes they're just kidding about "game over."

But a game like Mass Effect doesn't need to rely as heavily on fail states (though the game over screen most certainly exists), because it's relying on player input to define the character.  We care about keeping Shepard alive in the face of certain doom, but we tend to care more about whether she aims for diplomatic solutions or shoots a guy in the face.  A failure state in Mass Effect 2 doesn't look like the game over screen given to the player when a mission goes bad; it looks like being unable to keep one of your crew members loyal, or being unable to keep one in line.  We're playing to achieve the successes, in whichever form we feel they take, rather than to avoid the failures.

Most narrative games don't take the "define this character for yourself" trajectory that BioWare titles are famous for, of course, but they still rely on that delicate combination of success and failure.  If you're playing Phoenix Wright, the game is completely on rails.  But it has fail states: you can press the wrong statement or present the wrong evidence.  You need to have a decent understanding of what's going on in order to make correct accusations and put the evidence together properly.  And you can get it wrong to the point of seeing a "game over" screen.  (Unless you're me, and save compulsively, and reload if you're doing badly.)  Success in meeting goals -- finding evidence, correctly questioning a witness, or surviving a cross-examination -- will advance the story to the next set of goals. 

Purple's the evil one.
My most beloved games of old literally do not have a fail state.  The classic LucasArts SCUMM-engine adventure games -- Monkey Island 1 and 2, Day of the Tentacle, Loom, and more -- were revolutionary in that the player could not get permanently stuck or die.  (As compared to the Sierra adventure games of the era, which were death-happy, or to older games like Zork, where you could waste hours playing on past the point where you'd already screwed yourself over.)  Rather than ending with failure, these games rely on continued success.  The stories have natural bottlenecks built in: the narrative will not continue until you figure out what Bernard should do with that hamster or how Guybrush can use the rubber chicken with a pulley in the middle.  There are items that need to be found, contraptions that need to be built, and discussions that need to be had in order for the player to progress.

In a sense, these games -- of which you could easily argue L.A. Noire is the most recent descendant -- are very proactive.  Reliance on cut-scenes is low, and non-playable sequences mainly show the consequences of whatever action the player just took.  The absence of a game over screen may remove a certain kind of tension from the story, but it also removes a major source of potential frustration for the player.

With all of this said, it's true that not every game has a visible set of goals, or any available success or failure metrics.  There are titles out there that deliberately subvert the very idea of success and failure states; this is where I would say the avant-garde of gaming truly lies.  From one point of view, The Stanley Parable has six failure states.  From another point of view, it has six success states.  What it actually has are six conclusions and ways to reach them, the ultimate meanings of which are left to the player.  None are particularly desirable (at least, of the ones I saw); nor is any one better or worse than the others.  An existential crisis in every box!

The Path is another art game that subverts the idea of success and failure states.  There are six player characters; each girl has a starting point and is told to go to an ending point via the given path.  The game, such as it is, happens in the experiences along the way; the journey is the destination and the destination is incidental.  Grandmother's house is more of a concept than a crucial place to be.

One of six sisters finding maturity, sexuality, and experiential horror between home and Grandmother's.

The avant-garde exists deliberately to undermine the tropes and tools of our media.  That's what it's for, and I have long thought gaming would truly come into its own as an art form when a thriving independent and avant-garde scene could generate new ideas that would, in time, filter into mainstream development.  Film history and the histories of other arts have evolved along this path, and evolving technology and the ubiquity of distribution venues (i.e. the internet) have now made the production and release of art games common.

Aside from deliberately subversive, arguably non-game experiences like The Stanley Parable (see Line Hollis for links to and reviews of more obscure art games than you can imagine), I don't think I've ever played any interactive digital experience in the "game" category that didn't have either some kind of failure or some kind of success built in.  Even the visual poem Flower partakes: you can't really fail (I was dreadful at using the motion controls, but as I recall you just keep trying, except perhaps for the stormy level), but as with a classic adventure game, you do need to actively succeed in order to continue.

If a game had absolutely no success metrics or failure states in any form, whether intentional or unintentional, direct or subverted, dictated or player-driven, would it still be a game?  Maybe, in the same way Andy Warhol's Empire is still a film.

So, after all of this, we come back around to Dennis and to Michael.  As much as I think Dennis is wrong to assert that these digital experiences we all enjoy aren't "games," he's also right.  That is: we have to use the existing vocabulary for the time being, even if only to transition away from it as our discussion evolves.  We've only got so many words right now, and we -- players, critics, and designers -- need to be on the same plane to communicate.

But is "video game" really the right term for the transcendent, new immersive-media experience Michael seems to covet?  As long as those experiences have discrete goals, and as long as player input determines the failure or success of those goals, I think we can use the words we have.  We have a while yet to revisit our lexicon; I hope we've decided what to call the experience before we get to the point where the Holodeck actually shoots back.


  1. Agreed on both counts. What I see as the great failure of motion-based controllers is that they are anything but seamless. I haven't used Move yet, but I have a Wii and have played Kinect games, as well as previous-gen attempts at similar things (I still have a U-Force), and I've yet to play a game that had seamless motion control. "Mostly seamless" doesn't work for exactly the reasons you mention. (If I want frustration in sports, I'll play for real. I don't need to swing my arm forward and have the Wiimote decide "sorry, you actually let go of the ball already, you got a 3.")

    Yes; with few exceptions, arcade games have failure states whether or not they have victory conditions, and I think that drove a lot of console game development up through the NES era, when we finally started to get some depth.

    Actually, thinking about it, I don't think that's quite right. '80s and '90s games had failure conditions; I think you may have used "conditions" and "states" interchangeably, and maybe this is just the programmer in me (or the English minor; grammar Nazi ftw), but I think there are differences, and I think those differences touch more closely on what you expanded upon.

    Those "classic" games (Asteroids, Pac-Man, and so on) had end-of-game conditions: run out of lives or time or whatever and that was it. There were rarely victory conditions (I never did like Dragon's Lair - I'm not one for memorization, although ironically I do have a good memory), but there were also no intermediate states in the sense that games have them now. Sure, you could beat the cherry stage at Pac-Man and move to strawberry, but that's not quite the same ... I don't think milestones really count as success states. Most games didn't have failure states, either: you couldn't fail and continue. (There were exceptions, like the bonus stages in Galaga, but that was still a purely linear path.)

    Now, many games have success and failure states, but not necessarily win-the-game or lose-the-game conditions. Those states are a large part of gaming, in no small part due to Microsoft's achievements/gamerscore idea: give us little things to do within a game, and suddenly there is that much more of a reason to go back and play it again, or to play off the beaten path, so to speak. So The Sims is definitely a video game because it has these states: burn down the house, have three kids, learn to play a musical instrument, what have you. Fallout 3 and Fable II have win-the-game conditions as well as success and failure states. Most sports games have modes that don't even have end-game conditions ... everything is a state.

    I think that, really, is what is driving the current generation of games, the idea that you can play a game for a while, put it down, and come back later to pick up where you left off ... and then do so as many times as you want. There are still hugely popular games with victory or failure conditions (Bejeweled, for example), but even those have things to play for in addition to the end-game conditions. (In a sense, they become meta-games: within the larger game, each individual game is a goal, or contributes to a goal. Play 100 times, match 10,000 games, score 500,000 points ... the idea is to get you to think of Bejeweled as a series of games rather than a single game.)

    And in all those cases, those really are player-determined goals. Which achievements do I want to get? What kinds of achievements do I want to try for: extremely rare? Grinding? Online? Local multiplayer? Can I create my own in-game goals, even if I track them out-of-game? (I'm sure I'm not the only person who built an Excel pivot table with races, classes, and high scores from playing Dungeon Crawl.) The modern game-playing experience is much more user-controlled than it was before (in more than the literal sense), and that's a very good thing.

  2. My reaction?  Games aren't clocks; they are dogs.

  3. I think that gaming definitely does require at least some focus on gameplay. If player input is what defines the game world, then mechanics can make or break it, or at the very least define replay value.

    However, I do agree that our lexicon needs to be reevaluated. Still, I'm fairly sure there will never be a "unified theory" of gaming, a virtual catch-all that describes or sets a rubric for a good game. Games are an extremely personal experience by nature, and what is a good game for one person may be terrible for another. I personally am usually drawn in by games with some sort of engrossing storyline, whether it's an incredibly strong story (as in the Uncharted series), or a story that requires almost entirely player input for its outcome (such as Dragon Age). I enjoy games without stories, like Minecraft, but generally speaking if I'm not being told a story, I get tired of the game after a while and feel as if I am spinning my wheels. This is the reason I don't play MMOs all that often.

    This is not to say that those games are bad - there are fantastic games with absolutely no storyline. However, because games are such a subjective experience, I wonder if game reviewing can ever be objective at all. There are some objective standards upon which a game can be judged, and mechanics are some of the most concrete things we can point to as being "good" or "bad."


  4. I would argue your biggest problem here is the word "game".  Just because I'm entertaining myself with an interactive program doesn't make it a game.  

    If the kids and I pull out the legos and build random stuff, we may be playing, but few people would call it a game.  In the same vein, if I pop open Minecraft and build random stuff, I'm playing, but it's not really a game.

    Goals as well do not automatically make something a game.  I can set a goal to do the laundry, but when I accomplish this goal, even if I find an entertaining way to do it, it's not a game unless I make it into a game.  If the wife and I decide to race to see who can fold the most clothing, at that point it becomes a game.

    In my opinion, putting "video" before "game" doesn't change the meaning of the word.  If you wouldn't call it a game in a "real world" format, it wouldn't be a game in a digital format either.