Bringing Dr. Jones to the Infernal Machine: Dealing with Memory Constraints

August 22, 2001

Introduction

The task of porting Indiana Jones and the Infernal Machine from PC to N64 proved to be quite challenging. The Nintendo 64 could handle the polygon counts quite well, but memory use was a huge problem because the developers of the PC original hadn't needed to worry about it at all. While a typical PC memory footprint might be 128 MB, the N64 had to run the game in only 4 MB -- and that memory also had to hold the code, frame buffers, textures, and display lists. The N64 Expansion Pak can provide higher resolution and better performance, but only when it is present.

Looking at these numbers alone, the challenge seems doomed from the start. However, when the project started, the original PC game was much smaller in terms of texture and model usage. As we all know, projects tend to grow during the last phase of production because everybody wants to make the game as good as it can be, and an adventure game like Indy certainly provided enough opportunities for growth.

In the end, it turned out we were hunting for every thousand bytes we could free up, and a number of different methods were used to make the game fit. This article presents a summary of the most important methods for dealing with memory constraints, most of which are applicable and beneficial to other projects. Reducing memory usage also helps to increase performance, because data is more likely to be local and therefore more likely to be found in one of the CPU's data caches.

One could argue that developers today no longer need to concentrate on memory constraints because of new systems like the Sony PlayStation 2 and Nintendo GameCube. However, while their rendering capabilities have improved steadily, their memory subsystems have not changed very much, which makes memory organization and efficient memory usage an even more pressing issue.

Scenario

The approach used in porting Indy to the N64 shares many similarities with the data path used in producing an original title from scratch. Normally, the art and level data are exported into the game engine's specific data format using a combination of export tools; the results are then bound together into a binary resource which is loaded on the target. To reduce the size of the binary resource, we were left with two options: rewriting the export tools so that they output smaller, more optimized structures, or reusing the existing PC game's startup and initialization code to load the binary resource.

We chose the second approach because it offered a quick start: all the code for loading and initializing the data was already in place. All we needed to do was take the game and strip out its rendering loop. From there, we could rewrite and optimize all of the game's data structures with relatively little effort (Figure 1), familiarizing ourselves with the game's internal structure in the process. We called this abused executable the munger, and extended it to optimize and rewrite all data structures. It also performed additional consistency checks, which were required to ensure that our changes to the data structures didn't cripple the data, for example by casting 32-bit values down to eight bits.

Figure 1: Data-Path Setup
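To give a flavor of the kind of consistency check the munger performed, here is a minimal sketch of a checked narrowing cast. The name munge_u8 is hypothetical and not the actual munger API; the point is simply that any value that would be truncated fails loudly in the tool, on the PC, instead of silently wrapping around in the game.

    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* Narrow a 32-bit source value to 8 bits, aborting the munge run
       if the value would not survive the truncation. */
    static uint8_t munge_u8(uint32_t value, const char *field)
    {
        if (value > UINT8_MAX) {
            fprintf(stderr,
                    "munger: field '%s' (value %u) does not fit in 8 bits\n",
                    field, (unsigned)value);
            exit(EXIT_FAILURE);
        }
        return (uint8_t)value;
    }

A field conversion then reads, for instance, dst->light_level = munge_u8(src->light_level, "light_level"), so an out-of-range value stops the build rather than crippling the data on the console.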

This approach has many similarities with an original title's data path, so the methods used and the lessons learned in porting Indy apply to original titles as well. It is also important to take data management and storage requirements into account, since the amount of art data (e.g., geometry, textures, animation data) is only increasing with the release of the next generation of systems.

Methods Used

The methods used fall into three classes. The rest of the article discusses them in more detail and offers some practical insight, which may be of benefit to those of you experiencing similar memory problems.

  • Rewriting & Optimizing Data Structures: decreasing the size of data structures sounds like a tedious and erroneous process; done well this can be an easy way to reduce the size of the data.
  • Resource Caching on the Target: a large amount of data can be cached, and this works surprisingly well, but different kinds of data require different caching approaches. The caching schemes used are described and classified (see the cache sketch after this list).
  • Using a Virtual Memory Kernel to Reduce Code Size: one of the benefits of console programming is that you have total control over the target hardware. Implementing a virtual memory kernel sounds over the top at first, but it not only frees up precious memory, it also helps with development and debugging. In general, a virtual memory kernel provides a means to trap illegal memory accesses.
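
As a small, hypothetical example of the first method (the structure below is invented for illustration, not taken from the Indy engine), narrowing fields to the ranges they actually need can cut a structure's size significantly:

    #include <stdint.h>
    #include <stdio.h>

    /* Original PC-style layout: every field stored as a 32-bit value. */
    typedef struct {
        float   position[3];
        int32_t surface_flags;   /* only a handful of flags are ever used */
        int32_t light_level;     /* always in the range 0..255            */
        int32_t material_index;  /* far fewer than 65,536 materials exist */
    } VertexPC;

    /* Narrowed layout: the same information in two-thirds of the space. */
    typedef struct {
        float    position[3];
        uint16_t material_index;
        uint8_t  surface_flags;
        uint8_t  light_level;
    } VertexN64;

    int main(void)
    {
        printf("PC layout:  %zu bytes\n", sizeof(VertexPC));   /* typically 24 */
        printf("N64 layout: %zu bytes\n", sizeof(VertexN64));  /* typically 16 */
        return 0;
    }

Multiplied over tens of thousands of vertices, changes like this reclaim a meaningful fraction of a 4 MB budget.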

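As an equally hypothetical sketch of the second method, the following shows one common shape such a cache can take: a fixed memory budget with least-recently-used eviction. None of the names (res_cache_get, load_resource_from_rom, CACHE_BUDGET) come from the actual engine, and the real schemes described later differ per data type; this is only meant to make the idea concrete.

    #include <stdint.h>
    #include <stdlib.h>
    #include <string.h>

    #define CACHE_BUDGET (512 * 1024)  /* bytes reserved for cached resources */
    #define CACHE_SLOTS  64

    typedef struct {
        uint32_t id;        /* resource identifier; 0 marks an empty slot */
        void    *data;      /* resource data in RAM                       */
        uint32_t size;      /* size in bytes                              */
        uint32_t last_use;  /* tick of the most recent access             */
    } CacheSlot;

    static CacheSlot slots[CACHE_SLOTS];
    static uint32_t  used_bytes;
    static uint32_t  tick;

    /* Provided elsewhere: reads a resource from cartridge ROM into a
       freshly allocated buffer and reports its size. */
    extern void *load_resource_from_rom(uint32_t id, uint32_t *size);

    /* Free the least recently used entry to make room for a new one. */
    static void evict_lru(void)
    {
        int victim = -1;
        for (int i = 0; i < CACHE_SLOTS; i++) {
            if (slots[i].id != 0 &&
                (victim < 0 || slots[i].last_use < slots[victim].last_use))
                victim = i;
        }
        if (victim >= 0) {
            used_bytes -= slots[victim].size;
            free(slots[victim].data);
            memset(&slots[victim], 0, sizeof slots[victim]);
        }
    }

    void *res_cache_get(uint32_t id)
    {
        tick++;

        /* Already resident? Touch it and hand it back. */
        for (int i = 0; i < CACHE_SLOTS; i++) {
            if (slots[i].id == id) {
                slots[i].last_use = tick;
                return slots[i].data;
            }
        }

        /* Otherwise load it, evicting old entries until it fits the budget. */
        uint32_t size = 0;
        void *data = load_resource_from_rom(id, &size);
        while (used_bytes != 0 && used_bytes + size > CACHE_BUDGET)
            evict_lru();

        for (int i = 0; i < CACHE_SLOTS; i++) {
            if (slots[i].id == 0) {
                slots[i] = (CacheSlot){ id, data, size, tick };
                used_bytes += size;
                return data;
            }
        }

        /* Slot table full: evict one more entry and insert into the freed slot. */
        evict_lru();
        for (int i = 0; i < CACHE_SLOTS; i++) {
            if (slots[i].id == 0) {
                slots[i] = (CacheSlot){ id, data, size, tick };
                used_bytes += size;
                break;
            }
        }
        return data;
    }

The budget, not the slot count, is what matters on a 4 MB machine: resources stay resident as long as there is room and quietly disappear when something newer needs the space.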