It seems the average console game company's gameplay testing process starts with an email like this...
We’re having a playtest this Saturday afternoon, from 1:00 – 4:00. If any of you have gamer friends that would like to participate, have them fill out this NDA + survey and send it back to me.
People who show up are sat down in front of a dev box with the latest build of the game and promised free pizza if they finish it and fill out the whole questionnaire. If the company is really on top of things, the testers' play is recorded so it can be double-checked and reviewed later by the developers. This testing usually involves fewer than 100 players and is typically done only 3-4 times during the project, at milestones like first playable, alpha, and beta. From this very small sample of data, feedback is given to the development team, and at many publishers it is even used to help estimate pre-order numbers and determine a game's slice of the marketing budget.
It's not that they get no useful data out of this at all, but nothing pulled out of it could be called quantitative, and if someone tried to pass this off as testing in any other industry, they would likely be fired.
It is not the developers' or publishers' fault that there is little to no real focus testing; they do not really have a choice. Console manufacturers only allow pre-release game discs to run on special $10k development boxes that the public is not allowed to have. So even though there are millions of Xbox 360s out there, your testing pool is still only as large as the number of development boxes your studio and publisher can allocate to it.
Even if your publisher has a ton of money to put into hardware, coordinating that many people on site is such a hassle that it can still only be done once in a while. The result of the game not being tested that often is that so many changes get tested all at once that it is hard to isolate which change is affecting the feedback. There is also the problem that we cannot verify what live testers say about their playing habits or any other information they give us.
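For contrast, this is exactly the problem online PC games solve with controlled experiments: split players into cohorts deterministically, change one thing per cohort, and any difference in feedback can be attributed to that change. A minimal sketch of that kind of bucketing, with all names hypothetical:

```python
import hashlib

def assign_cohort(player_id: str, experiment: str, num_cohorts: int = 2) -> int:
    """Deterministically assign a player to a test cohort.

    Hashing the player id together with the experiment name gives each
    experiment an independent split, and a given player always lands in
    the same cohort for the same experiment.
    """
    digest = hashlib.sha256(f"{experiment}:{player_id}".encode()).hexdigest()
    return int(digest, 16) % num_cohorts

# Each cohort plays a build with exactly one change, so differences in
# feedback can be traced back to that change.
cohort = assign_cohort("player-12345", "new-checkpoint-spacing")
```

With only a handful of on-site playtests per project, there are never enough sessions to run this kind of one-variable-at-a-time comparison.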
What this has done is place an undue burden on game designers: they have to come up with new and exciting game designs and get them exactly right, because they will get very few chances for significant unbiased feedback before they ship. With the punishment for unpopular levels being a pink slip, is it any wonder that designers tend to go with tried and true designs they are sure will work? They might be more willing to take risks and move games forward if they had a quick feedback loop to tell them whether they are heading in the right direction.
We need the kind of testing online PC games can pull off:
We could have this kind of testing because the console manufacturers already have everything we need; we just do not have access to it. They need to let developers access the play stats on Xbox Live so we can pick the right testers for our games. Then they need to allow us to release Xbox Live Arcade-style pre-releases of our games so users can playtest them for us.
It could work like this:
Suddenly, with one database programmer and a community manager, you could have access to, and run coordinated tests with, millions of users.
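To make that concrete, tester selection could be a simple query against the console network's play-stats feed, mirrored into a local database by that one programmer. The schema, tags, and thresholds below are all hypothetical, just to show how verified play history replaces self-reported surveys:

```python
import sqlite3

# Hypothetical mirror of the console network's play-stats feed.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE play_stats (
        gamer_tag     TEXT,
        genre         TEXT,
        hours_played  REAL,
        completion    REAL   -- fraction of games finished, 0.0-1.0
    )
""")
conn.executemany(
    "INSERT INTO play_stats VALUES (?, ?, ?, ?)",
    [
        ("HardcoreHank", "shooter", 420.0, 0.9),
        ("CasualCarla",  "shooter",  12.0, 0.2),
        ("RPGRita",      "rpg",     300.0, 0.8),
    ],
)

# Pick testers whose verified play history matches the game's target
# audience -- no questionnaire, no unverifiable self-reporting.
rows = conn.execute("""
    SELECT gamer_tag FROM play_stats
    WHERE genre = 'shooter'
      AND hours_played >= 100
      AND completion >= 0.5
""").fetchall()
testers = [tag for (tag,) in rows]
```

The community manager then invites the selected testers to download the pre-release build, and their play data flows back through the same pipes.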
The potential gains in game quality and developer cost savings are huge, as we currently often do not get feedback until it is way too late or too expensive to do anything about. I do not know if it would be as easy to implement as I think, but it cannot hurt if everyone mentions it to their account rep. Hopefully someone at Microsoft or Sony will look into it.