I have a habit of going back and reading older posts and content from writers and speakers to find what I might have missed. Often I simply re-watch or reread something I've already encountered, to recover lost information or gain a new insight in the ever-changing context of the present. For those who don't know, James Portnow was an adjunct faculty member at DigiPen Institute of Technology and the writer of the ongoing web series Extra Credits. One of the earliest episodes of Extra Credits was titled "Ludus Florentis - Embracing Change in the Games Industry", wherein a number of changing factors in the games industry were examined and some predictions were made about how the industry as a whole would change.
In my constant look at the past, I stumbled upon an article on Gamasutra with the same title, "Ludus Florentis". It too was written by James Portnow, only in 2009, while the video was uploaded to YouTube in 2012; for the most part, the article reads almost word for word like the video. What was interesting to me, however, were the comment responses from the likes of Daniel Cook and Jonathan Blow.
This piqued my interest: how did experts think our industry would turn out, and how have things actually gone? So let's look at the claims that were made. I'll be paraphrasing the essence of the claims rather than quoting directly; you can go read the article here. Without further ado, let's dive in:
1. Game schools are creating a new environment for the formal education of game creation, which will increase the number of qualified developers, more of whom will graduate than the existing big studios can possibly hire.
2. Lower-cost platforms make experimentation economically viable and provide an outlet for these newcomers to develop games for.
3. Graphics are no longer as important as they once were.
4. Game engines like Unreal and Unity have lowered the barrier to entry for making games.
5. People will want more games, of more varied types.
These were the changing factors of the industry. The changes they would bring were left a bit vague, but the overall claim was that games as we knew them would transform: innovation would flourish, and the industry would grow into a medium inhabited by all sorts of new games and developers.
It's worth considering the context in which these claims were made. In 2009, the games industry was still heavily dominated by AAA studios and games. Go back just a few more years and we'd be in a time when graphics were a hot-button issue, especially the now-forgotten debate of "Graphics vs. Story!". Indies were just emerging and getting coverage in the press; it was then that the golden age of indies was blossoming. It was also a time when the debate over games as art was growing, and when many core gamers felt uncomfortable with that notion.
Most of the claims were compounds of factual and speculative clauses. The factual components were indeed correct; for the speculative parts, however, we can now see that most things didn't turn out as rosy as James had predicted.
Back when game-specific university programs started, most of the industry was indifferent to them. In 2009, when James wrote the article, game schools were still a small player, but as he noted, their reputation was improving thanks to some success stories of the time (Portal, and later Magicka). Now, in 2017, game degrees are still discouraged by almost everyone except those with some sort of stake in them. The dream of game schools producing highly qualified candidates for work in the industry didn't materialize.
This is born of two problems. The first, acknowledged in the original article, is that the existing studios didn't have room for all the graduates. This turned out to be true: the demand for people to make games didn't match the eagerness of people enrolling in these programs. The second problem, ongoing since game degrees' inception, is the mismatch between the skills these programs teach and what employers are actually looking for. Add in the inflexibility of those skills compared to more general specializations like computer science, and the courses end up discouraged even by those who have attended and graduated from such programs, even at top schools such as SCAD.
Yes, the indiepocalypse happened; no one can really deny that by now. A few years back, there might still have been a debate: "Oh, good games still succeed", "The increase in Steam releases is all shovelware", "Those who target a niche and stick to it do well". These are all familiar talking points we've all seen, and they no longer hold true. The fact of the matter is that working as an independent developer has become incredibly hard, and a lot of good games have been buried under the weight of all the releases on the stores. This isn't what was foreseen 8 years ago, when indie games seemed to have a bright future.
The causes are largely what the original article and video pointed out. Simply put, more people want to get into game development; the barrier to entry for development is lower thanks to those tools and engines, and the barrier to entry for distribution is lower thanks to all the platforms available to developers, such as Steam, Google Play, the App Store and Itch.io.
The economics of this are predictable. More developers making more games means a greater supply of games in the market in general, and in each store specifically. That in turn means far more competition between game creators, which leads to all sorts of competitive and anti-competitive behaviors and practices that, in the end, make life harder for fellow game creators.
Meanwhile, more games and developers in production doesn't necessarily translate to a bigger pie for everyone to share. The pie is almost the same size as before, but the number of people wanting a share of it is far higher. So why hasn't the pie grown? In Ludus Florentis, it was predicted that innovation would lead to new, undiscovered game types that would draw more people into games and grow our metaphorical pie.
This goes hand in hand with another oft-discussed question, "why are games so violent?", which deserves an article and attention of its own. Still, the two are closely related. Video games (as opposed to games in general) face some very hard technical limitations. Video games are made with game engines, and game engines provide a soft real-time simulation executed by a computer. That's problematic.
Game engines (and by extension computers) can only understand a few things: things that are accurately measurable and can be coded. The tools that game engines provide us developers are collision boxes, 3D space, discrete input systems, decision trees, the creation and removal of objects, and numbers. This is a simplification, but you get the point. Platformers and shooters, for example, are natural applications of these tools: they're instantly measurable and relatively easy to implement. It's not that video games want to be violent; it's that the mechanics these engines provide are most naturally themed with violence. Actor removal as a mechanic can almost only be themed as violent, especially when the coded trigger for that removal is a pair of colliders hitting each other.
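To make that concrete, here's a minimal sketch of the toolkit described above: collision boxes, numbers, and object removal. It's my own illustration, not code from the article, and all the names (Actor, aabb_overlap, step) are invented for the example.

```python
# A minimal sketch of an engine's measurable primitives: axis-aligned
# collision boxes, numbers, and object removal. All names here are
# hypothetical, chosen purely for illustration.
from dataclasses import dataclass

@dataclass
class Actor:
    x: float
    y: float
    w: float
    h: float
    hp: int

def aabb_overlap(a: Actor, b: Actor) -> bool:
    """Axis-aligned bounding-box test: the engine's basic measurable event."""
    return (a.x < b.x + b.w and b.x < a.x + a.w and
            a.y < b.y + b.h and b.y < a.y + a.h)

def step(world: list[Actor], bullet: Actor) -> list[Actor]:
    """When colliders touch, subtract a number; at zero, remove the object.
    Mechanically this is just 'measure overlap, decrement, delete' -- it
    only reads as violence once we name the pieces 'bullet' and 'hp'."""
    for actor in world:
        if aabb_overlap(actor, bullet):
            actor.hp -= 1
    return [a for a in world if a.hp > 0]

enemies = [Actor(0, 0, 1, 1, hp=1), Actor(5, 5, 1, 1, hp=2)]
shot = Actor(0.5, 0.5, 0.1, 0.1, hp=1)
survivors = step(enemies, shot)
print(len(survivors))  # prints 1: the overlapped actor was removed
```

Strip away the hostile naming and nothing in the code is inherently violent; the engine only sees overlaps, counters, and deletions.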
If a game designer wants to evoke emotions such as love, or build mechanics like debating, the tools don't support it, and this isn't a matter of implementation quality: the tools are fundamentally unsuitable. It's possible, for sure, but the implementations end up lacking and disconnected from the ludic interactivity that games provide, leaning instead towards passive decision trees. An example of this kind of implementation can be seen in The Walking Dead series or What Remains of Edith Finch.
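The "passive decision tree" shape can be sketched like this. This is a hedged illustration of the general pattern, not the actual implementation of any of the games mentioned; the Node/play names and the dialogue text are invented.

```python
# A sketch of the 'passive decision tree' shape: narrative choice encoded
# as pre-authored branches. The names and text are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Node:
    text: str
    choices: dict = field(default_factory=dict)  # choice label -> next Node

# The 'debate' reduces to picking a branch; there is no simulated argument,
# only pre-authored outcomes -- which is exactly the limitation at issue.
root = Node("An NPC asks whether you stand by your friend.", {
    "stand by him": Node("He remembers that."),
    "walk away": Node("He will remember that, too."),
})

def play(node: Node, pick) -> str:
    """Walk the tree by repeatedly asking `pick` to choose a branch."""
    while node.choices:
        node = node.choices[pick(node)]
    return node.text

ending = play(root, pick=lambda n: "stand by him")
print(ending)  # prints: He remembers that.
```

However the tree is dressed up, the player's input is a discrete selection among authored outcomes, not an interaction with a simulated emotion or argument.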
The promise of new experiences has existed for some time. Its realization has been held back by the limitations of our input interfaces and the possibilities of game engines.
That said, some things could have gone differently. One of the problems in games academia is a lack of consensus. The underlying theory of games is still very underdeveloped and in its infancy: no new theoretical vocabulary seems to gain acceptance, and no real schools of thought have emerged. If game academics followed a path similar to the one economics or business administration took in the 20th century, we'd at least have some local schools of thought laying the foundations for academic progress within their own paradigms. Furthermore, it would be great if graduates of these schools could create a new occupational field, similar to pundits in communications media: researchers, writers and educators nurturing our next generation of critics and informed audiences.
The proliferation of games and the saturation of stores could also be helped by stricter curation. Honestly, there have been so many suggested improvements, still implementable to this day, that could solve or at least mitigate the problem of discoverability. Still, we're going to need a much larger global gaming audience if the rising number of studios around the world is to sustain itself. Our pie needs to grow, and that takes both time and education, where education means gaming literacy.
Other issues simply couldn't have been avoided. The big AAA studios will always flex their polish and scope muscles as their primary strength, regardless of whether progress in graphical fidelity is slowing. It's not that those budgets are going down; it's that they keep climbing for ever smaller gains.
The technical limitations of computers, though, will always be with us, and we should probably embrace them and knowingly work around them rather than fantasize about something that isn't achievable.