Agile For Gamedev - A Poor Fit

The release of Cyberpunk 2077 in a poor state last year led to a lot of debate on how game development projects should be managed. Staff were overworked, deadlines were missed, features were cut, and the game came out with serious bugs. Even after the studio threw more money and time at the problem and cut the scope, quality was still low. Calling the launch a failure would not be strictly true - the game has sold many millions of copies and apparently covered its costs - but everyone knows the industry needs to do better than this.

The classic trade-off - quality is constrained by scope, cost, and time

Anyone who's followed game development for a while will know there's nothing new here. 'Crunch time' has been a staple of game development for as long as the industry has had major projects, and was already common knowledge when the famous 'EA Spouse' blog entry was posted 17 years ago. Despite faster computers, better tools, more powerful languages, and growing experience, things don't seem to have improved much in game development.

And yet in the rest of the software development world, things have changed a lot. Perhaps the biggest change in development practices has been the move towards agile development. Agile means many things to many people, but the core principle is to involve the customer and/or the user more closely, in a collaborative and iterative process that discovers requirements as you go and responds to change. It is supposed to lead to a more manageable project.

One commentator claimed that Cyberpunk failed primarily because they did not follow agile project management processes, and particularly did not practice the specific act of 'Continuous Delivery', where the software is 'delivered' on a regular basis (e.g. daily) to be assessed. It's reasonable to say that "bad software engineering" played a part in the problem. However, the claim that this was a failure to adhere to agile principles doesn't add up. The developers CD Projekt Red - like most major game developers and quite a lot of smaller ones - do in fact employ people to perform continuous delivery and 'devops thinking'. It just didn't really make a difference.

I mentioned in my introduction post that I think iterative, 'agile' development can actually cause problems for game development projects, despite being the overwhelmingly dominant paradigm. Why?

The myth of people over processes

'Agile' is a vague term and therefore any criticism of it is often batted away by resorting to some variant of the No True Scotsman fallacy. And there's some merit to that, because while the Agile manifesto talks about "individuals and interactions over processes and tools", what we actually got are Scrum and Sprints and Standups and Epics and Kanban and so on - a variety of ways in which management seeks to have the benefits of Agile without giving up its array of tools and processes to impose some sort of structure on it.

The promise was that we could follow the Agile principles and thereby enable agility among the development team. The reality is that developers work under a new set of constraints and agility is demanded of them.

How Agile helps most software

Maybe it's a price worth paying? For most software, there's a good argument that Agile makes sense, especially the principle "Welcome changing requirements, even late in development". In the world of traditional software development where a program or system is being written for a client, this principle solves a big problem: programmers have to make software that meets the needs of non-technical users and customers. As Patricia Aas said, "The users rarely know how to express what they need. You don’t have the context needed to understand what they say. Only once you’ve built something do you have sufficient understanding to ask the right questions". It's inevitable that the first draft will need extensive reworking, so Agile development embraces this reality - produce that draft as early as possible and have both sides work together in a process of gradual improvement based on early and frequent feedback.

All very sensible. But the explicit assumption here is that users or customers can't adequately express what they need or provide any sort of detailed specification of the software, and that iteration is therefore the only reasonable route that remains. And the implicit assumption that follows is tied to the cost/scope/time triangle above - i.e. that changing requirements mean some functionality will either fall out of scope or increase the project's time and cost, and that the customer can simply select their preferred trade-off.

Do these assumptions hold true in a game development context?

Effective up-front design is possible

The first assumption shouldn't hold in game development. Games are rarely built only by programmers delivering a program for a non-technical outside client. Usually the project includes a whole game design team, who shape the vision for the product in conjunction with a publisher or management. Even if you consider the customer to be the publisher rather than the design team, it's still the case that a game development team has designers on staff whose role is to be experts in analysing, designing, and planning video games.

In particular, a designer should be able to take a particular feature and break that down into clear deliverables with relatively unambiguous specifications, such that a sufficiently competent software engineer is able to implement something very much like that at the first attempt. The need for iteration to discover requirements and solicit feedback should be greatly reduced by the presence of this team of domain experts.

This role played by a 'semi-technical' design team has parallels in other industries - for example, the work done by an architect when designing buildings. By coordinating both with the client and with the teams responsible for the eventual 'construction', an architect can formulate a fairly comprehensive plan that delivers on the eventual user's requirements, stays within the project's constraints, and gives the engineers something concrete to follow.

One snag is that the games industry has, on the whole, failed to train designers to fill this role, and instead focuses them on content creation (mapping the world, writing lore, placing spawn points, specifying items, assigning character attributes, etc) and creative direction (choosing and maintaining design 'pillars', identifying references in other games, selecting key features, communicating the overall vision to artists and programmers, etc). Too often, there's a void in the middle where the actual design would be. It's like having city planners and construction workers, but no architects.

One game I worked on (which will remain nameless) revolved heavily around combat. Senior designers had specified that there would be various weapon types and damage types. Other designers were tasked with implementing these via scripts. It was only after many months and several bug reports that anyone realised that there was no combat or damage system specified anywhere - just various disparate scripts attached to weapons that applied the damage in inconsistent ways. Eventually someone was forced to come up with a coherent system, the necessary engineering work happened, and the various weapons were re-implemented that way.
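
To make the problem concrete, here's a minimal sketch of the kind of coherent system that was eventually built - all names, types, and rules here are hypothetical, not taken from the actual project. The point is that every weapon routes damage through one resolution path, rather than each weapon script applying damage its own way:

```cpp
#include <algorithm>
#include <cstdio>
#include <string>
#include <unordered_map>

enum class DamageType { Physical, Fire, Poison };

struct Character {
    std::string name;
    int health = 100;
    // Per-damage-type resistance, 0.0 (none) to 1.0 (immune).
    std::unordered_map<DamageType, float> resistances;
};

// The single place where damage rules live. Weapon scripts no longer decide
// how resistances, clamping, or death are handled - they just submit requests.
class DamageSystem {
public:
    void ApplyDamage(Character& target, DamageType type, int amount) {
        float resist = 0.0f;
        auto it = target.resistances.find(type);
        if (it != target.resistances.end())
            resist = it->second;
        int dealt = static_cast<int>(amount * (1.0f - resist));
        target.health = std::max(0, target.health - dealt);
        std::printf("%s takes %d damage (%d after resistance), health now %d\n",
                    target.name.c_str(), amount, dealt, target.health);
    }
};

int main() {
    Character troll{"Troll", 100, {{DamageType::Fire, 0.5f}}};
    DamageSystem damage;

    // Every weapon goes through the same system, so a flaming sword and a
    // fire spell can't silently disagree about how fire resistance works.
    damage.ApplyDamage(troll, DamageType::Fire, 40);      // e.g. flaming sword script
    damage.ApplyDamage(troll, DamageType::Physical, 30);  // e.g. arrow script
}
```

With something like this specified up front, the rule lives in exactly one place - which is precisely what the disparate weapon scripts lacked.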

Problems like this are incredibly wasteful from a project management point of view because time is expensive! Although software is much easier to change than, say, a building, it is not free to do so. Worse, this process causes 'technical debt' since programmers may cut corners to find the quickest route to implementation if they know there's a good chance the designers will request future amendments anyway. To quote one commenter on the Continuous Delivery video above, "All delay problems, all software quality problems were coming from basically one source: constant changes by game designers". I wouldn't go that far, as there are many problems in game development that software engineers have entirely inflicted upon themselves, but the impact of designer changes on the timeliness and quality of game software is severe.

So while the statement "the users rarely know how to express what they need" is true in many industries, it's not in ours. For example, we know that a combat-oriented game needs a combat system, and we can know what aspects make up a combat system. It's our job as game developers to understand these concepts.

You can't avoid iteration and feedback loops entirely. Even the best plans have flaws, or a change gets forced upon you for technical or business reasons. You need to be adaptable and you can't eliminate change, but planning and foresight can reduce it and save you time. A changeable plan is better than no plan.

In particular, the cost of design time is usually much lower than the cost of programmer time, so you want the changes to happen as close to the design end as possible. Iterating mentally and verbally will always be cheaper than iterating in documents or spreadsheets. Iterating in documents will always be cheaper than iterating in designer prototypes. Iterating in designer prototypes will always be cheaper than iterating in engineer prototypes, which in turn are cheaper than iterating during production. Changes in production should be the last resort.

Sadly, this focus on agility has emphasised responding to change, and de-emphasised the need to plan things in advance. So it's no surprise that Cyberpunk 2077's development saw people say "we'll figure it out along the way" and that poor planning is blamed for many of the problems. This is endemic to the industry.

The project is rarely as agile as the methodology

So, to the second assumption - just how flexible are the corners of that project management triangle above?

In an Agile project, the idea is that stakeholders regularly evaluate the state of the software, provide feedback, and adjust priorities. As change requests are made and new requirements emerge, less important features might get delayed or shelved entirely, and if development is otherwise going well, the budget might increase to accommodate these features later. The nature of the process assumes that new requirements are going to be discovered during development and therefore either some old requirements fall out of scope or the budget stretches to include the new ones as well.

This is somewhat at odds with the way that many game development projects work. Typically, the publisher is promised a certain set of features, often delivered in a fairly well-defined order, in return for funding to pay for a team to deliver them over a known timescale. These 'milestones' during development might correspond to broad phases - e.g. preproduction, pre-alpha, alpha, beta, etc - or they may be broken down further still, for example into monthly deliverables, each with certain features delivered to an agreed level of completeness.

The milestone system means that even though the project is delivered frequently over several iterations (just as agile proponents would want) it's largely decided ahead of time what gets delivered and when. This reduces risk for the publisher and gives them a very clear overview of the project's progress. However, from the developer's point of view, this incurs a lot of the dangers of a traditional model where you need to hit a fixed deadline with fixed resources, but with much of the overhead of an Agile model where the software must be ready for continuous delivery and evaluation at all times. The worst of both worlds?

It's easy to blame publishers for not being willing to be more flexible. But historically most games have not been profitable. It is a risky business where a handful of weaker releases might be subsidised by one or two hits, and survival as a publisher - and therefore, survival as a game developer funded by that publisher - can revolve around whether the publisher is able to keep costs down. And the publisher, unlike most customers buying software, can't just decide to cut their losses and accept the current state of the software when things start to overrun, because they need a playable and polished game for retail. The Agile principle of "deliver working software frequently" can only go so far.

One of my favourite games, Deus Ex: Mankind Divided, saw its publisher come in for serious criticism from YouTuber Jimquisition over 'meddling' and 'cash-grabs'. Yet we also know the game was delayed at least once and, even when it shipped, was criticised for being 'cut short'. Given that sales were disappointing and development was already over budget, it's not hard to see why the publisher may have insisted on a reduction in scope and some controversial revenue-generating features - they probably suspected they were going to lose money on the title. Granting the developers an extra year or two to make the game they truly wanted would have produced a better product, but probably a greater financial loss too. When development can cost several million dollars a month, there has to be a deadline.

All this adds to the stress on engineers. Each bit of work thrown away because a designer wanted to 'iterate on it' rather than specify it in more detail is time not available to hit these milestones. A related issue (which I want to talk about more in future) is where designers or project managers trickle out features to engineers sequentially over time. What the designer might gain from leaving the subsequent specifications to a later date is often more than lost on the engineering side, as key systems have to be reworked to accommodate each new requirement - rework that could often have been avoided, in less time overall, by specifying the system properly from the start.

The last problem I want to mention is that games are relatively complex systems, even ignoring the software engineering aspect. It's rarely easy to reduce the scope of a game without significant knock-on effects. Imagine a crafting system where players create their own items. Disabling it is only the start of the process, because any other system that assumed players had access to crafted items now needs re-assessing - will players get those items elsewhere? Do we need to add more merchants to the game? Do defeated enemies need to drop more loot? Does the inventory management need to change now that there are fewer items to consider? It's rarely as simple as merely dropping a few features to hit a launch date.
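
As a rough sketch of that coupling - every system, name, and number here is hypothetical - notice how many superficially unrelated tuning values quietly encode the assumption that crafting exists:

```cpp
#include <cstdio>

struct GameConfig {
    bool craftingEnabled = true;
};

// Loot tables assumed players could craft healing items themselves;
// if crafting is cut, drop rates have to compensate.
int HealingItemDropChancePercent(const GameConfig& cfg) {
    return cfg.craftingEnabled ? 5 : 15;
}

// Merchants were only a supplementary source of crafted-tier gear;
// without crafting they become the primary source.
int MerchantStockOfCraftedTierItems(const GameConfig& cfg) {
    return cfg.craftingEnabled ? 2 : 8;
}

// The inventory reserved space for crafting materials that no longer exist.
int InventorySlots(const GameConfig& cfg) {
    return cfg.craftingEnabled ? 60 : 40;
}

int main() {
    GameConfig cut;
    cut.craftingEnabled = false;  // "just disable crafting"...
    // ...and three unrelated-looking systems all need re-balancing decisions:
    std::printf("drop chance %d%%, merchant stock %d, inventory slots %d\n",
                HealingItemDropChancePercent(cut),
                MerchantStockOfCraftedTierItems(cut),
                InventorySlots(cut));
}
```

Each of those branches is a design decision someone now has to make, implement, and test - which is why 'just cut the crafting system' is never just one line of work.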

What should we do?

Would I agree with those who say we should abandon agile development entirely? Probably not. The term 'Agile' carries both too much baggage and too little clarity, to the point of being essentially meaningless now - but much of the philosophy is common sense with clear benefits, and some degree of agility is genuinely useful.

So, my specific recommendations are:

1) We need to be less pessimistic about what can be effectively planned in advance. We don't need 'Big Design Up Front' or hundred-page documents before development starts. But we should be front-loading much more product design, doing more work on (digital) paper first, and when we do hand things off to engineers, do it with clear specifications from designers rather than an expectation that the work will be re-done based on feedback later. Less 'Find the Fun', and more 'Design The Fun'.

2) We need to rein in the heavy-handed project management aspects that crept into Agile development that obstruct developers, and get back to the original promise of a 'light touch' that allows engineers to be more effective. Replace the micro-management of tasks with clearer instructions to engineers and sufficient information for them to be able to carry them out well. (In return developers probably have to get better at estimating task durations, but that's for another day.) A good project manager or producer is worth their weight in gold, but that gold is too easily spent on admin and bureaucracy to 'manage upwards' while impeding actual work.

3) The games industry needs to refocus the design discipline on game system development and the production of good feature specifications. This might need more support from both academia and industry veterans to ensure that we have the effective tools and vocabulary for this, since few of the theories or concepts discussed in the past appear to have gained traction.

Conclusion

Game software development has some unique constraints that other software development often lacks - but it also employs designers who, if effective, can not just improve the product but significantly aid in planning the project. As such, Agile development - at least in the form that it usually ends up taking - is not ideal for games, and an approach that borrows from more traditional "plan-first" methods is likely to be better, especially if the industry learns to play to its strengths.