Every studio I've worked in, and for the most part the wider video game development community, agrees that it's a good idea to dedicate time and energy to the quality of a game. But QA departments incur massive costs in personnel and overtime, and are sometimes vilified in P&L discussions. QA management is often tasked with finding leaner ways to do business: increasing test coverage without impacting development team practices. So far, the answer has been to have more people testing more things, longer. Black-box, or functional, testing has been the usual and customary focus of the gaming industry. This approach can be loaded with unidentified and risky pitfalls.
The quality solution for the video game industry does not lie with the tester or the QA manager. I've spent a decade in the video game industry shipping AAA titles, and before that several years in trans-media development houses. I've had countless discussions with development, QA, and operational oracles in the game industry. Neither I nor anyone I've spoken with has ever seen quality tested into a game they were working on.
Our quality solution resides in the game development team's processes, and should be addressed at that level. If quality solutions were built into the beginning of the development pipeline, independent QA departments could be substantially reduced in size. Instead of functional testing to find errors, these streamlined departments could be tasked with establishing best practices across the organization and managing the ebb and flow of contract personnel for usability testing. QA managers could facilitate the internal transition, and external hiring, of qualified, passionate engineers in test, and their integration into development teams.
Software development houses have been integrating test earlier in the pipeline for years, of course. But game studios lag behind software development best practices, favoring perceived "tried and true" methods. When I first entered the industry, I sat with a game studio executive and asked if the company had a project management tome for best practices that all teams could share. The executive replied with an emphatic "No!" followed by "... and this is why we make the best games."
When agile development gained traction, many game studios liked the idea but floundered on implementation, with development teams adopting differing approaches or making up practices to fill gaps. The dynamic was game-like: each team tried to "win" by coming up with the best agile process, but failed to share successes and lessons with other teams.
Shifting from traditional practices to a new paradigm can be difficult, and often inspires resistance. But when the benefits of collaboration between development and test are weighed against endless hours of overtime and diminishing returns, change becomes more attractive than continuing the pain. Using integrated testing solutions, teams can identify and correct issues within an hour of their creation. Traditional processes will usually surface those issues too, but often more than a year later, during functional testing. More rapid identification and correction translates into greater agility for both developers and testers.
The image below is making the rounds on professional media networks:
Ask anyone in QA, and they'll give you some variation on "we are all working together to make a better game experience." But Development often seems to feel that QA is standing by to "rip apart" any and all game content they encounter. It is not supposed to be an adversarial relationship, but it can become contentious, especially during crunch times.
Changing the Paradigm
QA, in most game studios, lives behind a wall at the end of the production pipeline. Development throws completed features over that wall, and QA conducts functional testing on those features, usually with little to no requirements documentation for traceability. Not the best approach if you are looking to avoid bottlenecks and ship the best game possible. In response, QA tends to dog-pile onto each new feature, conducting extensive manual testing without increasing staff. The result can be considerable overtime.
Imagine the improvements from an integrated testing organization. Risk could be identified, planned for, and managed throughout the development lifecycle. Working closely with developers and programmers, engineers in test would be part of the feature team, sharing its sprints, goals, and expectations. The need for a separate QA department, aside from a center of excellence and a user experience team, could be objectively evaluated. Communication is the usual culprit in any disastrous release. The job of a "tester" should be to identify risks and communicate those risks to all stakeholders for collaborative prioritization. Test coverage is how we mitigate risk. Integrated test is how we get there.
We can realize an application feature lifecycle where testing does not exist solely at the end of the pipeline, but rather permeates the entire development lifecycle. Over the next few installments, I'll offer an approach to make this type of change possible. Topics will include:
This is the first installment of a series on professionalizing Test in video game development. Several topics will be discussed. Most of the positions taken here were developed in both team and individual-contributor contexts. A number of studios were evaluated for these articles, providing a broad view of process and performance.