Does the game you're working on have that general buggy feeling? Are you waiting for that polish period towards the end of a major milestone? How did it get like this, and what is the general standard of your product on a day-to-day basis? Here are five signs that you're suffering from general quality fatigue, and some thoughts on remedies.
Look around your team and ask yourself if there is anyone who embraces quality in the product's output beyond the minimum required for release. Individuals might take the quality of their own work seriously, but is anyone looking at the big picture? This person may or may not come from the QA team, but a quality advocate will help prevent issues and highlight risks much earlier in development. In the absence of such an individual, quality is treated as something to be done in isolation ahead of deadlines, and latent, unknown issues can cripple a product before it gets out the door.
So you have QA representation and are thinking the job is done on the quality front. What is your Dev:QA ratio? There isn't a single correct answer, as each product and company is unique (I tend to say 4:1), but if your development throughput exceeds QA's capacity to process it, then you're compromising your product quality and starting to acquire QA debt. This is not about moving jobs across a Scrum board and completing sprints; this is about the time and attention that both developers and QA can invest in the testing process. Think about the kinds of issues you are looking to unearth and prevent: if you expect the product to stand up to some serious destructive testing, then make sure the QA team has the capacity to perform those tasks.
You've got a QA team with a decent Dev:QA ratio, but how do your sprints typically end? Are you consistently reaching "Done Done" across the board, or do you have a culture of assuming things are "Complete" when work is actually still marked as "in Test"? Presuming quality is a flawed approach, and it is easily exposed the moment a stakeholder interacts with a "Complete" task, finds a problem, and raises it. It's not worth rolling the dice, and this approach also puts your QA team on the back foot: there is inherent pressure to "sign off", and greater resistance to fixing any of those non-functional issues. This usually downgrades testing to mere checking, and in the long term it contributes to an unpolished, awkward user experience.
Your team has the time it needs to test, but how do your quality advocates keep up with the ever-changing landscape of game development? Investing in tooling is a huge factor in the smooth delivery of a product, and it is often a multi-disciplinary effort. When it comes to QA-specific development tasks, do you find yourself asking the QA team to make its own tools? Small requests like debug/cheat commands can unlock a lot of QA time to focus on other tasks, but they are often seen as something the QA team should lead themselves by learning other skills. A successful product will put these tasks into the backlog and assign them to the best-qualified person, not treat them as a side "nice to have" project.
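To make the "small request" concrete: a basic cheat-command registry is the kind of thing a programmer can add in an afternoon and that pays for itself in saved QA time. The sketch below is a minimal, hypothetical example; the `CheatConsole` class and commands like `give_gold` are illustrative assumptions, not taken from any particular engine.

```python
# Minimal sketch of a debug/cheat command registry (hypothetical names).
# QA types a line like "give_gold 500" into a debug console; the registry
# parses it and dispatches to the matching handler.

class CheatConsole:
    """Registry mapping command names to handler functions."""

    def __init__(self):
        self._commands = {}

    def register(self, name):
        """Decorator that registers a handler under a command name."""
        def wrap(fn):
            self._commands[name] = fn
            return fn
        return wrap

    def run(self, line):
        """Parse 'command arg1 arg2 ...' and dispatch to the handler."""
        name, *args = line.split()
        if name not in self._commands:
            return f"unknown command: {name}"
        return self._commands[name](*args)


console = CheatConsole()

@console.register("give_gold")
def give_gold(amount):
    # In a real game this would mutate player state; here it just reports.
    return f"granted {int(amount)} gold"

@console.register("teleport")
def teleport(x, y):
    return f"player moved to ({float(x)}, {float(y)})"

print(console.run("give_gold 500"))   # granted 500 gold
print(console.run("teleport 10 20"))  # player moved to (10.0, 20.0)
```

The point isn't the code itself but who owns it: putting a task like this on the main backlog means it gets built once, properly, by whoever is best placed to do it.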
And finally, where exactly does QA live in your corporate hierarchy? Are you positioning your quality advocates as an independent discipline, or are they a subset of a subset within the company? Real and lasting quality derives from how the company treats, views, and trusts its quality advocates, and it is only as effective as the level they are allowed to operate at within the business. Don't compromise your QA team's remit by putting them in a corporate corner. A good rule of thumb: check whether the leadership structure of your Art, Design, and Programming teams mirrors QA's.