Search the Community

Showing results for tags 'playtests'.


Found 6 results

1. Pascal has worked as a freelance game designer and creative director since 1995. He was commissioned by major studios and publishers including Activision, SCEE, Ubisoft, and DICE. In particular, he was Lead Level Designer on the multiplayer versions of both Splinter Cell: Pandora Tomorrow and Chaos Theory, Lead Game Designer on Alone In The Dark: The New Nightmare, and Creative Director on Wanted: Weapons of Fate. Leveraging his console design experience, he also works on mobile games, including freemium ones. His first game for mobile platforms, The One Hope, was published in 2007 by the publisher Gmedia and received the Best In Gaming award at the 2009 Digital Media Awards of Dublin.

Proximity, responsiveness, relevance... these are the watchwords of efficient playtests. In the previous installment of this article, I explored the reasons for the rising importance of playtests in game development. In an industry where games represent increasingly high financial risks for publishers, playtests have come to function as a strong guarantee of quality gameplay. Today I will share my experience with the methodology employed in preparing and conducting them.

Heeding the Clients: The Design Teams

Foremost, one must be aware of a fundamental tenet: the role of playtests is not to redo the design in place of the design teams -- for either game or level design. They are instead conducted to help them. This observation is crucial, because it drives the entire approach to playtests.

Firstly, we must respect the hard work of the design teams. Having had my own responsibilities in game and level design, I know how difficult it is to make "a good game". We must respect those who put their whole hearts into building the best game possible; we must not scorn or undervalue their work.

Secondly, playtests must adapt to the needs of the design teams. Good tuning for maps or gameplay mechanics is often the result of trial and error.
Knowing this, designers should embrace experimentation; playtests can afford them the opportunity to test out their hypotheses regarding design issues, and must therefore adapt to particular needs as they arise. Lastly, playtest results must be made available to the concerned parties as soon as possible, as the time allotted for game development is always short.

Preparing a Playtest Campaign

A playtest campaign generally requires around one month of preparation. We must first define its objectives, because they will determine what types of playtesters we shall have to recruit, the scale of the sessions (1, 2, 4, 8, or 12 players), and their duration (from half a day to a full week). We will also have to attend to the logistics as well as the legal framework (non-disclosure agreements, possible monetary compensation for playtesters when sessions last over half a day, etc.). And we must, of course, prepare the design teams to effectively utilize the playtests.

One does not grow the best crops on dry land; a playtest's effectiveness is rooted in the playtesters themselves. Half the battle in running an effective playtest campaign lies in wisely choosing playtesters, which requires an investment of time, energy, and perhaps a bit of money and patience. Recruiting takes time: we must not only recruit as many candidates as possible (in order to have a solid pool of playtesters), but also evaluate them. The purpose of evaluation is obviously to judge the candidate's gaming competence, but also his capacity for analysis and self-expression.

Evaluation may take several forms. An initial selection can be done through a more or less thorough questionnaire, to be completed by the candidate. The true evaluation, however, must be done during the sessions themselves, where we can observe the candidates at play. We must establish a protocol for obtaining the most consistent results possible.
There is no "all-purpose" evaluation protocol; we must be able to adapt to specific circumstances as the situation mandates. When I built a playtest structure at the Bucharest Ubisoft office, I encountered an interesting problem: we needed playtests for console games, but all the players we could find locally were exclusively PC gamers. I had to set up a specific protocol to evaluate the ease with which our Romanian candidates could adapt to console gaming.

[Image: Ubisoft's Splinter Cell: Chaos Theory]

The protocol consisted of briefly explaining the gameplay controls of a complex game (the multiplayer mode in Splinter Cell: Chaos Theory), and then setting the candidates loose in the game in order to gauge the speed at which they adapted to the gameplay. This selection method proved to be quite efficient.

Candidate selection must therefore be done according to a given playtest campaign's objectives. We may need only extremely skilled players who have already mastered the genre, or we may require novices, if the objective is to playtest the accessibility of the game.

Communication regarding playtests also takes time. Before candidates can turn up on your doorstep, they must first be made aware of your need. In my experience, while recruiting through generic classified ads will yield a high number of candidates, many will be too young (careful of those labor laws!), and most will be only casual gamers. A good way to recruit experienced players is to make use of forums, gaming clans, or specialized stores. It takes much more time, but I have always found great playtesters this way. In playtesting, quality matters more than quantity!

Organizing the Sessions

I shall address three aspects of playtest organization: the composition of the team, the preparation of the playtest protocol, and its logistics. Recruiting must start at least four or five days before the session itself.
At this stage, the playtest manager already has access to a database of candidates who have been evaluated or, at least, identified. He can thereby choose his playtesters according to the session's theme. Invitations are sent by e-mail. At this point, we realize the importance of having a great number of candidates, since most are not available at will. We must therefore engage in mass-mailing to ensure sufficient availability of playtesters come session day. It is also best to invite at least one more playtester than necessary, since last-minute withdrawals are commonplace. And it is usually a good idea to ask playtesters to confirm their presence via e-mail.

Protocol setup is an important part of session preparation. Some playtests are organized near the end of the development cycle, to tune up maps or the game system. The protocol for this type of playtest is often straightforward: we must allow the playtesters to play for a maximum of time, note game statistics, and organize open Q&A sessions. The time when playtests are most useful, however, is during earlier stages of the development cycle, when the game system and maps are still in gestation. Let us not forget that the earlier we detect any issues, the easier and cheaper they will be to correct.

During the development of maps for the multiplayer version of Splinter Cell: Chaos Theory, I organized playtests to evaluate the structure of the then still-embryonic maps. I specifically remember the Aquarius map: by having it tested by highly experienced playtesters, we -- including the level designer who had built the map -- quickly realized that the map was far too large. Having noticed this problem, he immediately rebuilt his map, which took little time as the map was still just a prototype. It took him a few iterations to downsize his map to the optimal size. In the end, Aquarius became one of the game's most popular maps.
[Image: Ubisoft's Splinter Cell: Pandora Tomorrow]

Playtests allow us to shed light on many problems and to validate (or invalidate) hypotheses set by the design team. During the development of the multiplayer version of Splinter Cell: Pandora Tomorrow, specific playtests were undertaken with the purpose of tweaking the characteristics of certain pieces of equipment, such as the smoke grenade. The latter is one of the accessories most used by the spies, since its cloud slows down the spies' opponents (the mercenaries), and it can even put them to sleep if they stay too long in its area of effect. Tuning the smoke grenade's parameters was not so simple: if its range was too wide, it would be an unstoppable weapon for the attackers (they would simply need to employ a single grenade in a corridor to block any access by their opponents). On the other hand, if the grenade's effect zone were too small, the weapon would be completely useless (defenders have vision modes allowing them partial visibility through the cloud). Finding the right values took us a lot of time.

Lastly, to be relevant, protocols must adapt to problems encountered in previous sessions as well as to the test requests put forth by the design team. This responsiveness to the development team's needs is one of the hallmarks of a successful playtest. I shall address this point later on.

Let us now talk about logistics. Good playtests require a stable build of the game without too many bugs. When directing playtests in the middle of the development cycle, this may be easier said than done. Regardless, the game must be sufficiently stable, and maps must be rid of the most detrimental bugs (such as the inability to climb a ladder, for example). A game delivery protocol must be set up with the development team. The latter must deliver a playtest-ready version of the game to the internal debug team, which will rapidly review the game to ensure that the version is playtestable.
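The smoke-grenade trade-off described earlier lends itself to a simple sketch: the viable cloud radius is bounded on both sides. All names and numbers below are hypothetical (the article gives no actual values); the point is only the shape of the tuning problem.

```python
# Hypothetical sketch of the smoke grenade tuning bounds described above.
# Numbers and names are invented for illustration; the article gives no values.

CORRIDOR_WIDTH = 3.0             # metres: a typical chokepoint a grenade might seal
MERC_VISION_THROUGH_SMOKE = 0.5  # how far a mercenary's vision mode penetrates the cloud

def evaluate_cloud_radius(radius: float) -> str:
    """Classify a candidate smoke-cloud radius against both failure modes."""
    if 2 * radius >= CORRIDOR_WIDTH:
        # One grenade spans the corridor: attackers can block any access.
        return "too wide: unstoppable"
    if radius <= MERC_VISION_THROUGH_SMOKE:
        # Vision modes see through the whole cloud: the grenade does nothing.
        return "too small: useless"
    return "worth playtesting"

print(evaluate_cloud_radius(2.0))   # too wide: unstoppable
print(evaluate_cloud_radius(0.4))   # too small: useless
print(evaluate_cloud_radius(1.0))   # worth playtesting
```

A model like this only brackets the range; locating the sweet spot inside it is exactly what the repeated playtest sessions were for.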
When issues arise, cooperation between the debug and development teams will allow for swift corrections and, subsequently, the production of a stable version suitable for playtests. Such organizational finesse requires a lot of discipline from all of the teams involved. Another good practice is to prepare a checklist for the level and graphic designers, so that they can make sure that their own maps are free of blocker bugs. Finally, the playtest session manager himself must make sure that the version is indeed playable.

Playtest Sessions

Playtests are especially instructive when design team personnel attend the sessions; indeed, a game or level designer will base his work on ideas he formulates upon observing the behavior of the players. However, players do not always react as expected, and we must take their diversity into account. By seeing with his own eyes how real players use equipment or navigate a map's topology, and by asking them the reasons for their behavior at the end of the session, the designer can rapidly make optimizing adjustments -- a demonstration is always more efficient than a long speech! It is thus highly recommended to encourage the designers to attend the playtests. That is why I strongly recommend that playtests be conducted on the premises of the development studio itself. Remote playtests are valuable for tweaking map and system settings, but less so for playtests on an embryonic game.

Obviously, playtest observers must follow certain rules: they must not voice their comments or ask any questions until they are authorized by the playtest session manager, in order to preclude influencing the game session or the playtesters' judgement. If it is desirable for designers to attend the playtests, it is simply essential that the playtest session manager do so. He must not simply organize the session and ask his questions at the end; he must actually watch the playtesters at play.
The reason is as follows: early playtests often have a limited number of playtesters, and the problems found are liable to be numerous. This is likely to affect the relevancy of the feedback received, rendering it inconsistent at best and flat-out contradictory at worst. The manager must take all of this into account, evaluating the relevance of the feedback himself.

Note, however, that the involvement of the playtest manager can be a cause for controversy. In some cases, a playtest manager must simply behave as a mere observer; in fact, this is generally the best attitude to have during playtests occurring later in development, when it is time to fine-tune game system settings. The objective at this point is to collect a maximum of statistical data from a high number of playtesters. By contrast, during early playtesting meant to evaluate the strengths and weaknesses of embryonic maps or game systems, the comparatively low quantity and greater heterogeneity of the collected data require a more aggressive, reactive, and direct involvement on the part of the manager. At this point, he must necessarily "get his hands dirty", as he will be working with incomplete data. While there is a risk of error here, my experience has shown me that playtest results are actually more concrete at this stage, and thus more useful. My experience within one of the best development studios in France has taught me that the playtest manager must be wholly invested in the final quality of the game, and must not be content with being a mere observer. This conclusion once again indicates the need for a close relationship between the playtest and development teams.

Debriefing

We thus arrive at the final result of a playtest session. The general idea is to bring the playtest conclusions as quickly as possible to those who most need them -- generally the designers and project leaders. Debriefing may take several forms.
First, design team members who observed the playtests may put their most pressing or immediate questions to the playtesters. They often leave the playtesting room with some strong ideas burning in their minds. Then comes the report, which must make a clear distinction between the facts (statistics, etc.), the opinions of the playtesters, and the manager's own observations and conclusions. Raw data must be provided so that the designers know on what basis the manager drew his conclusions. Putting all the cards on the table is a good way to establish trust with those who will read the report. Let us not forget that the purpose of playtests is to improve the game, not to settle scores. A full-fledged report takes time to compile and to write, so a shorter, intermediary debriefing might be needed if the need for crucial feedback is urgent.

As a final note, I will mention that I had begun to experiment at the Milan Ubisoft studio with a protocol allowing a remote office (in another city or even another country) to obtain a hot report on a map playtest. Named D3 for "Debrief Dynamique à Distance" (Remote Dynamic Debrief), this protocol consists of quickly establishing a list of the main open issues, then organizing an online session that the concerned designers (at the development office) and the playtest session managers (at the playtest office) can log on to. They can then explore the maps while the playtest team explains the issues with precision, and all can work together on developing possible solutions. A playtester may even join them, contributing further to the dialogue.
2. There is nothing new about asking testers for their feedback on a game in development. However, the practice of managing playtests by following near-scientific protocols, and of integrating them very early in the development cycle, is a more recent trend. The spread of real playtests through the game development cycle is probably part of this silent revolution; a revolution profoundly affecting the development environment. How? Playtests force game development to center around the players instead of the hopes of the development team. Let's look at the effects of this shifted focus:

Playtests allow the identification of gameplay or level design flaws that could elude the grasp of normal testers. After all, testers are always seasoned gamers who are not necessarily representative of the target audience. Who better than a casual gamer to pinpoint issues related to the difficulty curve or the overall understanding of the game?

Playtests fulfill a moderator role in situations of disagreement or controversy within the design team. A series of playtests can quickly settle a contested issue by resolving almost any counter-argument or dispute, thereby preventing the disagreement from spiralling into an impasse. Playtesting is also a management tool.
The partnership between playtesting and design can be very constructive. For example, it can be quite instructive for game and level designers to observe gameplay during playtesting, allowing them to immediately determine whether or not particular aspects of their design work as planned. Playtests executed on pre-production mock-ups allow the anticipation of problems very early on, as well as timely corrections of said problems (the earlier a problem is corrected in the development cycle, the less expensive it is to fix). Game development can therefore become truly "player-centric". Depending on the playtest protocol and the selection of playtesters (hardcore, casual, etc.), playtests allow the examination of a specific aspect of the game with heightened acuity: game balance, navigation, understanding of the game objectives, etc.

We have all had the opportunity to play games that display high production values but nonetheless suffer from obvious flaws: an erratic difficulty curve early in the game, navigation issues, an overly complex interface, and so on. Such flaws could often have been easily avoided if they had been identified early enough. Major names in the industry understand this quite well, such as Ubisoft, which possesses qualified teams and invests a lot of resources in this aspect of game development.

What kind of problems might we fix or prevent with playtests? Some examples include:

Accessibility and ease of use (interface, navigation within the game, etc.).

Identification of sure-fire wins, i.e. strategies allowing a player to easily overcome any challenge created by the designers, thereby removing any interest in the game or the current mission. This issue is especially sensitive for multiplayer maps.

Fine-tuning of the game system: experience has shown me that the intensity of use of game features (weapons, equipment, actions, etc.) tends to vary considerably according to a number of factors.
These include player profiles, the time a given player spends familiarizing himself with the game, and of course the game tuning itself. Only through long-term playtests with relevant samples of players can we ensure that the game tuning maintains its balance and relevance even after long hours of gaming.

Analysis of the early reactions of different categories of players during their first session. This will highlight their first impressions and initial frustrations. Some game demos have probably had a negative effect on the marketing of the games they were meant to promote because of accessibility and tuning issues that could easily have been spotted during playtesting.

For multiplayer games, the robustness of the game system and the potential of maps.

I have had several opportunities to delve deeply into playtest management. I built the playtest structure from scratch at the Ubisoft Annecy studio, where the successful multiplayer "versus" modes of Splinter Cell: Pandora Tomorrow and Chaos Theory were developed. I set up the recruiting methods, playtest protocols, and debriefing methods employed in this program. I also set up a playtest cell at the Ubisoft Bucharest office and led playtests there myself. Playtests have changed the way I perceive my job as creative director, so I feel the need to share my experience with everyone.

Let us start with a definition. Playtests consist of analyzing the reactions of a representative pool of players toward gameplay in order to improve the final game and to make sure it matches their expectations. Some will argue that game testing is nothing new. True, but real playtests have nothing to do with the debug testing executed at the end of the development cycle. Traditionally, game designers ask testers for their opinions. Testers are often excellent players and are therefore not always representative of the target demographic, which is often made up of mainstream gamers.
Moreover, testers generally get to know a game so deeply that their knowledge of its strengths and weaknesses profoundly influences the way they play. Therefore, they do not play as someone discovering the game for the first time. Well-executed playtests allow us to evaluate gameplay strengths and weaknesses with great accuracy because they rely on two solid principles: the careful selection of playtesters, and the use of ad-hoc protocols.

The Selection of Playtesters

Just as a farmer needs fertile ground in order to obtain the best yields, good playtests require a group of carefully selected playtesters. I cannot insist enough on the importance of the recruitment and evaluation of playtest candidates. What are the recruiting criteria? This depends, of course, on what kind of playtests we are planning. We may need hardened gamers, beginners, console-only gamers, multiplayer fans, and so on. The candidate's gaming proficiency and overall game culture represent the first criterion. The second is the candidate's ability to analyze and draw conclusions from his gaming experience. Note, however, that it is not mandatory for a playtester to possess a high level of competence on both criteria. Again, the type of playtest will determine the requirements.

I have the utmost respect for the playtesters I have worked with. Their good will and enthusiasm are boundless. Many came to Annecy from distant cities like Lyon, Grenoble, or Belfort simply for an unpaid half-day session! This generosity and enthusiasm are characteristic of our industry; let us nurture them by treating playtesters with the gratitude and respect that they deserve.

The Use of Ad-hoc Protocols

The protocol is the unifying thread of the playtest session, defining the objectives, the allocation of resources, and especially the methods of collecting and parsing information for a given playtest.
The playtest protocol needs to adapt to the specifics of the challenge at hand (game system tuning, navigation, map concept, etc.). During the playtest campaigns that I led, I would prepare a different protocol for each session. Indeed, an important part of those playtests involved multiplayer maps under construction or game system tuning. Each session revealed specific problems to be analyzed in the subsequent session.

I shall conclude this first part by repeating that a playtest campaign must be directed with true scientific rigor if it is to be of any use; one does not conduct playtests simply by bringing over one's buddies for a few hours of fun followed by a session of easygoing Q&As. Each aspect of the session must be carefully tailored in order to best realize the objectives at hand. Managing the session itself requires constant attention, not only because one can learn much by watching the playtesters in action, but also because things do not always go as planned! I shall address concrete aspects of playtests in the second part of this article.
3. Recently, through the mysterious and tenuous connections of social media, I was asked a few questions about the game design of Halo multiplayer. Yes, the first Halo. Combat Evolved. Yes, I know that game came out when dinosaurs still roamed the Earth, but there are still a few things about the development process that might be interesting to designers. One question in particular caught my attention: "Was quick-camo intentional?" Paraphrased, I read the question this way: "When the player picks up the Active Camo powerup, they turn invisible. If they shoot while they're invisible, they become visible for a while. But some weapons seem to make the player fade in and then back out of view faster than others. Was that intentional?"

The answer is related to one of my Universal Truths of Game Design. The Universal Truths are rules that I have figured out throughout my career in game development. I know they're true, because I have followed these rules and succeeded, and I've ignored them (or just been ignorant of them) and failed. In this case, the answer comes from:

UNIVERSAL TRUTH #3: You must create a mental model

That means that, as a designer, you must create a theoretical model that describes how the systems in the game should interact with each other. Game data design and balancing is an incredibly complex task. As anyone who has ever opened up a set of modern game tools knows, there are an overwhelming number of places where a designer can change numbers that affect how the in-game systems behave. Here's an example picture of an open game toolset that I grabbed off the web:

It's a pretty typical screenshot of a set of development tools. There are windows that allow the designer to place objects in 3D space, and along the right side of the screen there are a bunch of folders that hold different types of data that you can fiddle with. And adjusting any of the numbers will change what happens in the game.
I've seen it happen many times: a good game designer is tasked with making the game more fun and, faced with the complexity of that job, gets overwhelmed and doesn't know what to change to make the gameplay better. At best, a designer stuck in that situation is ineffective. At worst, the game sucks because of them.

In my process, I make a mental model of how I think the system should work. It gives me a place to start figuring out what numbers to change, and in what ways I need to change them. From there, I adjust the data values to suit that model. And the more rigorous I am with my mental model, the more confidence I have when I'm adjusting the sea of numbers in front of me.

Let me give you an example. As we were working on Halo, the team lead's first choice was to make the guns work the exact same way in single player and multiplayer. The responsibility for balancing all those numbers had been given to a senior designer on the project, but the general feeling was that his changes were not making the game more fun (see above). I talked things over with Jason Jones (the creative genius at the core of Bungie), and he and I agreed that somebody with more experience in game balance needed to take over the job. Initially, Jason volunteered to handle it all himself. As the man behind the game balance of Myth and the Marathon series of shooters, he was more than capable of the job. But I pointed out that multiplayer would have very different needs for the guns than the single-player team. Weapons in the hands of dumb AI bad guys need to provide fun challenges for the player to overcome, but weapons in the hands of a player are a different matter.

As a quick demonstration, think about the gunfights in Halo. In most cases, encounters have multiple bad guys shooting at you at one time. Each gun can be adjusted to be a little bit weaker in enemy hands so that the player (the hero of the story) doesn't get overwhelmed. But in multiplayer, most decisive fights are one on one.
Guns needed to be unique and powerful. I also pointed out that if we just used one set of data, then as I was changing the gun data for multiplayer I might be damaging the overall balance of the single-player game. Jason agreed, and we decided to "branch" the data and create two versions of the numbers, one for single player and the other for multiplayer. So, starting out, I had a handful of guns with some data already attached to them based on the single-player game. I had the freedom to change whatever I wanted. All I needed to do was figure out how to make the fighting fun. I needed a roadmap to follow. A mental model. But where to start?

Follow this link to read this section of the article:

But just making the Halo multiplayer weapons respect their roles in the matrix really wasn't enough. That's kind of "first-person shooter design 101." In a first-person shooter, weapons are the stars of the show. They need to look good and sound good. They need awesome animations. They need to be effective in their roles, and they have to make the player feel powerful and competent. But perhaps most importantly, they need to reward player mastery. To accomplish that, the design needed depth.

Universal Truth Number Three (Part 2)

I characterize depth as game systems or balance details that are included to enrich the experience of the player, but that are not necessarily explained or documented. They're meant to be discovered and exploited as players' expertise with the game grows. There are lots of great examples of what I'm talking about in all types of games, but I will offer up a couple of made-up examples for illustration:

Example Game 1 is Wizard Warts, a fantasy role-playing game about a cabal of magical toads set deep in a haunted swamp. Pollywogs evolve into acolytes, able to hop, swim, wear armor, and use weapons.
But once they grow strong enough in the shallow waters around their home, they can quest deep into the swamp to find and eat one of the legendary magic Dragon-Flies. Four different types of Dragon-Fly swarms live in the swamp: Fire, Ice, Poison/Acid, and Love. Once a toad gobbles one of them up, the acolyte evolves into a Toadzard, and can thereafter belch spells powered by the type of bug-dragon they gobbled. It's important to note that an acolyte toad can only gobble one type of magic Dragon-Fly in their life, and the choice (and the evolution into Toadzard) is irreversible.

The swamp is filled with a variety of magical monsters. They are all dangerous and hostile, but we can use the data of the game to add more depth to the gameplay. For example, Plant-type monsters are more vulnerable to Fire magic and take x3 damage from any source with that description, while Undead creatures are immune to Poison spells.

Notice that one of the Dragon-Flies has two "type" descriptions: Poison/Acid. I chose to include the "acid" description as part of that spell group because of the depth that I wanted to include in the design. Acquiring spell powers and evolving into a Toadzard would be a big part of the fun in the game. But if the player chooses "poison" spells and finds that they are literally useless against undead monsters, and "poison" was the only type of damage in that spell category, it could leave an entire class of Toadzard useless in some situations. That's a very un-fun outcome for players who chose to build that type of character, and it might make the game unexpectedly difficult. Consider the example of a player who decided to make their Toadzard Poison/Acid and then had to take on a tough mission against Undead bad guys. A player running into that situation might have so much difficulty that they abandon the game, and who could blame them? Dropping some "acid" in helps solve these problems.
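The type data described above can be sketched as a small damage-multiplier table. Only the Fire-vs-Plant x3 bonus and the Undead immunity to poison come from the text; the default multiplier of 1.0 and the rule that a hybrid spell applies its best type are assumptions for illustration.

```python
# Damage-multiplier data implied by the Wizard Warts example. Only the
# Fire-vs-Plant x3 and the Undead immunity to poison come from the article;
# every other pairing defaults to 1.0 (an assumption).

DAMAGE_MULTIPLIER = {
    ("fire",   "plant"):  3.0,  # plants burn: x3 from any Fire source
    ("poison", "undead"): 0.0,  # no nervous or circulatory system: immune
}

def spell_damage(base: float, spell_types: tuple, monster_type: str) -> float:
    """A hybrid spell (e.g. poison/acid) applies its best multiplier,
    so the acid half still lands on the Undead even though poison does not."""
    best = max(DAMAGE_MULTIPLIER.get((t, monster_type), 1.0) for t in spell_types)
    return base * best

print(spell_damage(10, ("poison",), "undead"))        # 0.0  -- pure poison is useless
print(spell_damage(10, ("poison", "acid"), "undead")) # 10.0 -- acid still lands
print(spell_damage(10, ("fire",), "plant"))           # 30.0
```

The interesting design decision lives entirely in that little table: adding "acid" to the poison group changes one lookup, not the combat system.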
"Acid" spells could still damage undead, leaving us the freedom to make "poison" spells useless against them. At this point you might reasonably ask: "Why fight so hard to preserve that part of the design at all?" The answer is that there is a lot of potential drama in the design that occasionally makes spells useless. It aggressively forces the player to adapt their comfortable play patterns, and it might encourage players to explore more of the content in the game. Imagine the player who finds themselves in a scary predicament when the spells and strategies that they've previously counted on suddenly stop working entirely. But, as they dig into the fullness of the spell systems they find that there is a way for them to adapt to the game situation without having to start over from the beginning. A less aggressive way to achieve a similar effect would be to extend the Fire example above, and only give the monsters vulnerability to some types of spells. So, for example, we could include Hate type monsters that were vulnerable to Love magic and Lava type monsters that were vulnerable to Ice magic. Anyone familiar with the Pokemon series of games will recognize this precise design. It doesn't penalize players as harshly as the proposed design above, but it's also not as dramatic in the player's experience. Follow this link to read this section of the article: The interesting, and sometimes wildly frustrating thing about depth in a design is that some players never become aware of the underlying nuances. In fact there are countless examples where depth is built into games, but players don’t understand it or take advantage of it. Multiplayer games suffer the most from this kind of mismatch in player expertise, because the parts of their community that grasp the deeper elements of the design and use them often have a significant advantage over the less-knowledgeable. This can lead to all sorts of hard feelings.
(if you’re a League of Legends fan, last hitting creeps should spring immediately to mind) As I mentioned earlier, depth in the game balance can exist without being documented anywhere else. Players will feel the effects as they play and hopefully they’ll pick up on the subtleties and learn how to exploit the design. But for that to work well the design needs to make some kind of intuitive sense to the player. In the Wizard Warts example, the player would glean that Fire is extra dangerous to plants. That's a common trope in games and, of course, wood burns. But the underlying logic that "poison" wouldn’t have an effect on the Undead since they don’t have a working nervous system or circulatory system is less obvious, and so might never make sense to the player base. If the game is popular enough, the players will learn how the numbers work and "play around" them, but they're liable to think there's some kind of a bug in the game. So to recap: We need a mental model with an underlying design for depth which is (hopefully) intuitive to the player. Which brings me back to the multiplayer weapons design process for Halo. I’ll explain how it all connects in my next post! Universal Truth Number Three (Part 3) I wanted the Halo weapons to have depth, so I began thinking about all the guns that were in the matrix. I needed to understand what they were, and how they fit into the design. The Human weapons were easy to understand. I’m a Human, and I know what we use guns for. But the weapons used by the aliens of the Covenant were another matter. The easiest place to start would be to say that the alien guns were simply analogs to the Human weapons on the matrix. The pistols, assault rifles etc. could be basically the same, only with different visual presentation. Easy, yes. But that seemed like a huge missed opportunity to add depth and richness to the game. So I started thinking: why would the Covenant choose these particular weapons in the first place?
We (Humans) have guns. And once guns were developed, Humans developed systems to protect people from bullets (bullet proof vests, riot shields etc.) And then in the relentless march of progress, people invented ways to kill other people inside of their body armor (armor piercing bullets etc.) Remember that at the time there wasn't a lot of settled "lore" about the game story. I decided that in my model, Human Spartan armor was created as a desperate response to the Covenant attacks. It had similar functions, like a personal shield, but was based on different technology. So how about the Covenant? There were some notes about the bad guys and their guns, but the honest truth was that the aliens shot light-up bolts of energy because they looked a lot more visually impressive coming towards the player on screen. If the bad guys shot nearly invisible bullets and you couldn't see them coming at you, it would be a total drag every time you died. But just knowing that they were colored lights wasn't going to help me balance my combat data. Clearly they had guns. And they had an equivalent to our body armor – personal energy shields. I could imagine Covenant warriors facing off against enemies across the universe with their plasma weapons blazing. Or more specifically, their Plasma Rifles. As a poor man's analog to the Human pistol, the Plasma Pistol was a pretty dull thing, only useful as a desperation choice for one of the two gun slots you were limited to. I stared at the various data fields in the Halo toolset for quite a while, trying to imagine what to do with the Plasma Pistol to make it cool. And then a question occurred to me: What if the Covenant had to fight an enemy with shields like their own? Or what if they had to fight themselves? They’d need their own armor-piercing capability. In the Halo tools, every projectile had a “shield damage” value.
Most were set so that they would damage shields at a rate that matched the damage that their bullets would do to the player's health bar. None of the projectiles were really aggressively balanced against shields. And you know how I feel about data balance in a matrix! I started to experiment with making Plasma Pistol bullets designed to specifically shred shields. It was a snap to make a projectile that blew them off quickly, but then it seemed overpowered to also make those bullets do good levels of “body” damage as well. Then it occurred to me: maybe the shield-shredding effect could be assigned to a different bullet. The one assigned to the secondary fire-mode for the gun – the overcharge. This proved to be very fun. In my early playtests, I'd grab the Plasma Pistol and use the overcharge specifically to blow up the shields on enemies that I ran across. But it was frustrating when I missed the overcharged shot (full disclosure: I am a much better designer than I am a player.) So to compensate, I gave the shield-busting projectile a terrifying amount of magnetism so that it would track towards whatever I shot it at. I loved it – I could overcharge the Plasma Pistol and let the shot fly, and it would whip around corners and blast targets, stripping off their shields just as I came running in behind and mowed 'em down! In the short term, I won a lot of playtest games. Unfortunately, once this tactic became known to other players, battles essentially started with “overcharged salvos” of tracking shots whipping across the battlefield. The only thing you could do was hunker down in cover and wait as the first round of supercharged shots came whipping overhead before you started moving. It was interesting to see how these data adjustments changed player behavior during our playtests, but a bunch of auto-tracking missiles wasn’t very true to the spirit of the Halo combat model which rewards player skill, fire and movement.
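The per-projectile fields described here (body damage, shield damage, magnetism) might be modeled roughly like this. This is a hypothetical sketch to illustrate the separation of shield and body damage, not Halo's actual data layout; all field names and numbers are invented.

```python
# Sketch: separate shield vs body damage per projectile, as described above.
from dataclasses import dataclass

@dataclass
class Projectile:
    body_damage: float    # damage applied to the health bar
    shield_damage: float  # damage applied to energy shields
    magnetism: float      # 0..1 strength of tracking toward the target

# A normal bolt wears shields and health down at matching rates.
plasma_bolt = Projectile(body_damage=10, shield_damage=10, magnetism=0.05)

# The overcharge shreds shields but is weak against bare health,
# with extra magnetism to forgive a near miss.
overcharge = Projectile(body_damage=5, shield_damage=100, magnetism=0.25)

def apply_hit(shields, health, p):
    """Deplete shields first; only unshielded targets take body damage."""
    if shields > 0:
        return max(0.0, shields - p.shield_damage), health
    return shields, max(0.0, health - p.body_damage)
```

The design point is in the data, not the code: a single overcharge strips a full shield (`apply_hit(50, 100, overcharge)` leaves the target unshielded but unhurt), inviting a follow-up with a body-damage weapon.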
So alas, the “super tracker” overcharged shots had to go. But I did keep some tracking, to help reduce the frustration of a player using the overcharge trick but missing the shot entirely. So my mental model of bullets/armor/armor piercing was working to create fun combat. But what else could it do for the game? Follow this link to read this section of the article: I made one other change under the hood of the Human weapons, which many people don't even realize exists at all. Jason Jones had designed the Human pistol to be the weapon of choice for players at medium/long range. The accuracy, high damage and the limited sniper zoom on the pistol made it a powerful choice for dropping enemies right at the edge of their "AI awareness" bubble, enabling players to pick off one or two targets as the enemies startled into their alert state and then came charging into battle. But it was strong. Damn strong. Frankly, it was too strong for multiplayer. I toyed with damage settings that made the multiplayer pistol weaker than its single player counterpart. But to be honest, once it was "nerfed" it became a pale shadow of its single player cousin and using the pistol became a lot less fun. Still, I felt that turning the full power of the pistol loose on the Halo multiplayer "sandbox" unaltered would be opening the door to endless criticism, so I decided to make a subtle change. The single player version of the pistol is "autofire" - meaning that if you hold the trigger down the weapon will repeatedly fire at the precise point you're aiming at. But... that's not true with the multiplayer version of the pistol. I wanted to at least challenge the skill level of players a little more. So the multiplayer version of the Pistol has shot spread. What that means is that, if you simply hold the trigger down and let the pistol automatically fire over and over, each bullet will deviate from the point that you're aiming at. And the amount of deviation will increase with every bullet.
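The accumulating-spread behavior might look something like this in code. A minimal sketch under invented numbers: the growth rate and cap are illustrative, not Halo's actual tuning values.

```python
# Sketch of multiplayer-pistol auto-fire spread: each successive shot in a
# held-trigger burst deviates more than the last, while single tapped shots
# (bursts of length 1) remain perfectly accurate.
import random

def fire_burst(num_shots, spread_per_shot=0.5, max_spread=4.0, rng=random):
    """Return the angular error (in degrees) of each shot in one burst."""
    errors = []
    for i in range(num_shots):
        spread = min(i * spread_per_shot, max_spread)  # grows per shot held
        errors.append(rng.uniform(-spread, spread))
    return errors
```

Because the first shot of every burst has zero spread, a player who masters pulling the trigger once per shot keeps the pistol's full single-player accuracy, which is exactly the mastery reward described above.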
I wanted to make it so that players could still use the badass pistol, and it could retain the fun feeling that it had in the single player game, but only if the player could master the technique of actually pulling the trigger with each individual shot. I still believe that this was a "righteous fix" - meaning that it was justified and the solution was (in my humble opinion) elegant within the restrictions of the established game play. Unfortunately, I lost my nerve a little bit. After all, this was a huge change from the behavior of the single player version of the Pistol. I was worried that players might have to re-train themselves to use the multiplayer version of the gun, which again might lead to huge volumes of outrage from players. So I didn't make the pistol deviate enough while auto-firing. Oh, the shots will spread if you hold the trigger down, but not so much that you can't still get the head shot that you were aiming at. To this day, not adjusting the spread rate of auto-fire on the multiplayer pistol is one of my regrets. I wasn't aggressive enough! But hey, people still seemed to like the game. One of the things that I’m proudest of is how my mental model for Human and Covenant technologies had profound impacts on the single-player game. For example, the high camouflage ping rate of the Human weapons meant that, even late in the campaign, Human guns were ideal for exposing Covenant bad guys that were cloaked in Active Camouflage shields. A second impact was on the AI development of the game. When the mighty Chris Butcher (AI programmer for Oni and Halo) saw the changes to the Plasma Pistol, it gave him the idea to have the Jackals use the Plasma Pistol in its overcharged mode, along with their shields, to greatly differentiate them from the Grunts wielding Plasma Pistols and grenades. I’d like to take a moment here to talk about why I keep using the term “mental model”.
You might ask “Shouldn’t the design document cover all of this?” And my answer would be that my design documents have never captured all the details of the game. I find documents valuable in helping me codify my own thinking, and they can occasionally be good tools for communicating a design to the people that are responsible for implementing it. But I've never encountered a game development team that religiously read every document produced by the game designers. And when you're actually knee-deep in making the game, you rarely have the time to fiddle around with keeping all your design documents up-to-date. So my own process has evolved to be very fluid and organic. I start with some clearly stated intentions as to what I want to accomplish with a design, and then start to build it. But along the way, I watch the design evolve and continually evaluate that process. As things happen I’m constantly deciding, “How is this coming together? Are we going in the right direction, or should we be going another way?” So my paper specs get me started, but beyond that my mental model is constantly evolving. I once read a quote from Tim Schafer that I'm going to have to paraphrase heavily because I can't seem to find the original quote. He described the process of making a video game as building a puzzle out of pieces falling in slow motion. But the pieces fall at different speeds and the shape of the puzzle changes, depending on which pieces you get, and which fit. That is a very poetic and accurate description of what my process looks like: I like to toss the pieces up, and every day take a look to see what’s coming together, what’s falling behind and what shape the final form is going to take. (I apologize, but I can't find the quote out there on the web. If you find it please add it to the comments section and I'll edit this post!) So that brings us full circle, back to the one-sentence blurb question that I got via Twitter: was quick camo intentional?
Yes; entirely intentional. All of the camouflage behaviors are a product of my mental model for Human and Covenant weapons, and my desire to add depth to the gameplay model for players to discover and exploit. Did it work? As I said before: often players will never know all the details included to add depth to a game. The fact that a person on Twitter was asking about that feature proves that, although my mental model was thorough and effective, it wasn’t so intuitive that players completely understood it, even after a decade of playing the game. But here’s the thing: even if an audience doesn’t understand all of the influences that shape their experience with a work of art, those influences still resonate in their mind at some level. That’s called subtext. When I watch a performance of Cirque du Soleil, I don’t know exactly what’s happening in the overall story of the performance. But I know there is a story. And my experience as an audience member is all the richer for it. There are large sections of the above article omitted here. We strongly recommend you read the articles in full.
  4. Introduction

The following is a recap of an article from David Ballard that was posted on 80 Level. Follow the link at the end of this post for the full article. In this article, David walks us through his multiplayer level design process. David explains that he had originally built it for co-op play.

Representation of the Player

In order to understand how players will feel and interact inside a play space, it's critical to put yourself in digital shoes. From there, you must understand and support the overall conceptual goals and approach of the game you're designing for.

Blocking Out the General Space

At the Blockout stage, David worked on things like geometrical focal points, movement options, and scaling. He started off with a drawing, and made adjustments as needed as he transitioned it into a 3D world.

Making Adjustments

As is always the case in a collaborative environment, it's critical to be flexible, and able to develop creative alternatives quickly.

Adding Assets to the Level

It's time to get fancy. After plenty of playtesting and iterating, David's next step was to begin adding assets.

Conclusion

Finally, the level is complete. David looks back at the rewards and lessons that came of it.

Source:
  5. So begins an interview between MP1st and Niclas Astrand, the designer of Canals. The initial questions seek to understand how and why the map made it into the game. Next comes a question on the general approach to level development that was used. Was the popularity of Canals expected? And some final thoughts from Niclas: Source:
  6. In this 2013 article, we hear from some of the designers behind Call of Duty Modern Warfare 2 and Halo 4. In it, they talk about the developmental process levels go through. One of the earliest steps is to get something down on paper: With this basic spec decided, it's time to begin designing. At the outset, paper is the weapon of choice. "To me the layout is the most important aspect of a map, so I might quickly sketch out some patterns and paths on paper to start to figure out how I want the map to play," says Smith. Echoing Smith's sentiment that the layout of the two-dimensional plan is all-important, Clopper explains that the paper design is a crucial reference point when designing the detail in 3D. "It allows us to think back to what the essence of the map is. Sometimes in the 3D realm you can go down a rabbit hole riffing on some of the smaller encounter spaces." From there, they put some additional thought into the map's flow, spawning, and weapon placement, but the primary focus is on getting the level to a playable state: The flow of play and players on a map will not become evident until testing, which begins as soon as a basic 3D model is roughed or "blocked" out, and continues as the design iterates. "I try to get a level playable as soon as possible. Multiplayer layouts need hours and hours of playtime to make adjustments to make sure the map plays well," he adds. "So there is no time to waste theorizing about how it will play; you just need to get on with it." Both teams agree that one of the primary things they look for initially in playtests is engagement distances: "Early on I am looking for the distances at which people meet: where they stop to shoot at other players and if they can even find each other," says Smith. Similarly, Halo 4's designers keep a watchful eye on distance.
"We definitely have standards for the size that something can be and the time it takes from one corner of a map to the other, or one objective site to the other," says Pearson. "It's to make sure we're tuning the experience to keep the time-to-death down, or making sure that your time-to-engagement is enough to give you a breather between dying, but not so long that you're hunting through the map and not finding people." Again, game mechanics have a direct bearing. In Halo 3, sprinting was impossible. In Halo: Reach, sprinting was a selectable armor ability. In Halo 4, everyone's at it, and the maps have grown to compensate. From there, the focus shifts to metrics that are more specific to each game: Call of Duty's multiplayer modes dial up the tension as players try to stay alive to protect their "killstreaks," chains of consecutive kills that see players rewarded with powerful ordnance that can ultimately swing the outcome of a match. "Early on in development I look to see if these locations are being used," says Smith. "If so, is it too strong a position? Can the other team clear the enemy out of the location? Is it too easy to take and no one survives there for very long? You can control the flow of the map this way." Clopper echoes the importance of balancing strongholds. "Skyline has this fantastic center structure, but also out to the wings are these two bases that can also offer a very similar kind of thing. What you'll see is fights moving from the center, flowing around the space, then coming back to the center." "What we're trying to do is sort of facilitate flow between these strongpoints and counter-strongpoints," he adds. "We want to make sure there are multiple areas and multiple strategies to facilitate flow around the map. We don't want people arriving at one strongpoint, camping out there, and then winning the game just sitting in one spot." This is but a taste of the 3 page article. Visit the source to read it in its entirety:
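The corner-to-corner timing standard Pearson describes could be checked mechanically during blockout. The sketch below assumes invented thresholds and speeds purely for illustration; real studio standards and movement values would differ per game.

```python
# Hypothetical blockout sanity check: is corner-to-corner traversal time
# inside a target window, so players get a breather between engagements
# without ending up hunting an empty map? All numbers are invented.

def traversal_time(distance_m, move_speed_mps):
    """Seconds to cross a straight-line distance at a given move speed."""
    return distance_m / move_speed_mps

def within_standards(distance_m, move_speed_mps, lo_s=8.0, hi_s=25.0):
    """True if the traversal time falls inside the target window."""
    return lo_s <= traversal_time(distance_m, move_speed_mps) <= hi_s
```

A check like this also captures the sprint observation above: doubling move speed halves traversal time, so a map sized for non-sprint play can fall below the window once sprint is universal, which is why the Halo 4 maps grew to compensate.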