  Making the Move to HTML5, Part 2
by David Galeano, Duncan Tebbs [Programming, Social/Online]
February 21, 2013
 

Graphics

Canvas

The canvas element provides access to procedural rendering APIs, as opposed to the declarative support offered by HTML itself.

For many people, canvas means the 2D context API that was originally provided, although nowadays developers can also request a 3D API, which we describe later on. The 2D context API provides similar functionality to the SVG vector graphics standard.



Once a reference is available to a canvas element in the DOM, a 2D rendering context can be requested:

var ctx = canvas.getContext('2d');

ctx.save();
ctx.translate(100, 100);
for (var i = 0; i < 6; i += 1)
{
    var red = Math.floor(255 - (42.5 * i));
    for (var j = 0; j < 6; j += 1)
    {
        var green = Math.floor(255 - (42.5 * j));
        ctx.fillStyle = 'rgb(' + red + ',' + green + ',0)';
        ctx.fillRect(j * 25, i * 25, 25, 25);
    }
}
ctx.restore();

Although the 2D context API is very useful, we have noticed several issues with it:

  • Colors must be specified as strings defining a CSS color. This feels somewhat unnecessary and wasteful when animating colors, or otherwise creating them dynamically.
  • No functionality is available for recording and reusing a path. To move a complex shape dynamically, the path must be rebuilt for every frame. Using SVG images could work much better in this case. This issue is being addressed by the proposal to provide a canvas Path object.
  • Only global blending operations are available, which operate on the whole canvas at once. This could degrade performance or limit what can be achieved compared to local blending.
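To illustrate the first of these issues, here is a minimal sketch (rgbString and drawFade are hypothetical helpers of ours, not part of the API): because fillStyle only accepts CSS color strings, animating a color means building a new string on every frame.

```javascript
// Hypothetical helper: convert numeric channels to the CSS string
// form that the 2D context requires, allocating a new string each time.
function rgbString(r, g, b) {
    return 'rgb(' + Math.floor(r) + ',' + Math.floor(g) + ',' + Math.floor(b) + ')';
}

// Fading a rectangle from blue to red forces a fresh string allocation
// per frame, purely to satisfy the string-based fillStyle property.
function drawFade(ctx, t) {
    ctx.fillStyle = rgbString(255 * t, 0, 255 * (1 - t));
    ctx.fillRect(0, 0, 25, 25);
}
```

An API accepting numeric channels directly would avoid this per-frame allocation and parsing cost.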

If a game only requires sprites then the 2D context API will work fine, but for complex vector graphics much better performance can be achieved with the 3D API which, for example, allows baking shapes into vertex and index buffers for optimal reuse. The lack of complex custom operations per pixel also limits what can be achieved with the 2D context. In contrast the 3D API provides support for pixel programs.

Old browsers use software rendering for the 2D operations, which will not perform well for complex scenes. Also, not all old implementations actually supported the standard. Modern browsers do provide hardware acceleration, and canvas support is now the norm.

The size of a canvas element does not change automatically in the way other HTML elements can. The developer must manually change the width and height of the canvas element to react, for example, to changes in the dimensions of its parent HTML element.
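A minimal sketch of handling this manually (the resizeCanvasToParent helper and the 'game-canvas' id are hypothetical):

```javascript
// Hypothetical helper: copy the parent element's current dimensions into
// the canvas width/height attributes so the backing store matches the layout.
function resizeCanvasToParent(canvas) {
    var parent = canvas.parentNode;
    if (canvas.width !== parent.clientWidth ||
        canvas.height !== parent.clientHeight)
    {
        canvas.width = parent.clientWidth;   // note: resizing also clears the canvas
        canvas.height = parent.clientHeight;
    }
}

// Wire it up to layout changes (guarded so the sketch also loads outside a browser):
if (typeof window !== 'undefined') {
    window.addEventListener('resize', function () {
        resizeCanvasToParent(document.getElementById('game-canvas'));
    }, false);
}
```

Note that assigning to width or height clears the canvas, so the comparison above avoids redundant resets.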

At Turbulenz we have implemented an emulation of the 2D context API on top of our low level 3D rendering API. This emulation cuts some corners for performance reasons so it will fail strict compliance tests, but it is useful for mixing 2D graphics with 3D rendering using a familiar API. This emulation on top of the abstracted 3D layer also helps in situations when either the browser does not provide canvas support (in which case our implementation will fall back to our plugin) or in the case when the browser’s implementation does not use hardware accelerated rendering (because our implementation uses the hardware accelerated WebGL API).

WebGL

The WebGL specification defines an API very similar to OpenGL ES 2.0 that operates on a canvas element.

It is not possible to obtain a 2D context and a 3D context from the same canvas element. Developers must choose beforehand which to use.

WebGL has evolved over a considerable period of time and so code of this form may be required to fully check for support:

function getAvailableContext(canvas, params)
{
    if (canvas.getContext)
    {
        var contexts = ['webgl', 'experimental-webgl'];
        var numContexts = contexts.length;
        for (var i = 0; i < numContexts; i += 1)
        {
            try
            {
                var ctx = canvas.getContext(contexts[i], params);
                if (ctx)
                {
                    return ctx;
                }
            }
            catch (ex)
            {
                // Some browsers throw when asked for a context they do not support
            }
        }
    }
    return null;
}

var canvasParams = {
    alpha: false,
    stencil: true,
    antialias: false
};

var gl = getAvailableContext(canvas, canvasParams);

One of the goals of WebGL was to provide a 3D API supporting a variety of devices, from high-end desktops to mobile phones. OpenGL ES 2.0 matched that requirement, with millions of mobile phones now shipping with hardware support for the standard. This lowest common denominator handicaps modern desktop video cards by imposing an API that is at least a couple of generations behind those available on modern PCs. Extensions can and will improve support for more high-end features, but currently progress in this area is slow.

Some features not available with WebGL but commonly supported by desktop video cards include:

  • Multiple color render targets.
  • 3D textures.
  • Anisotropic filtering.
    • Some browsers have recently added an experimental extension to support it.
  • Texture arrays.
  • Antialiasing when using framebuffer objects.
  • Full support for non-power-of-two textures.
  • Compressed textures.
    • Some browsers have recently added experimental extensions to support them.
  • Floating point textures.
    • Some browsers have recently started adding support.
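Several of the experimental extensions mentioned above are exposed under vendor prefixes, so probing code of roughly this form is needed (getExtensionWithPrefixes is a hypothetical helper, and the prefix list is not exhaustive):

```javascript
// Try the unprefixed extension name first, then the known vendor prefixes.
function getExtensionWithPrefixes(gl, name) {
    var prefixes = ['', 'WEBKIT_', 'MOZ_'];
    for (var i = 0; i < prefixes.length; i += 1) {
        var ext = gl.getExtension(prefixes[i] + name);
        if (ext) {
            return ext;
        }
    }
    return null;
}

// e.g. anisotropic filtering:
// var aniso = getExtensionWithPrefixes(gl, 'EXT_texture_filter_anisotropic');
```

As with context creation, the result must be null-checked: an extension being in the specification does not mean the browser or driver exposes it.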

The WebGL shader model also lags behind what modern video cards support.

Support for the WebGL API needs backing from more browsers; not all of them support it and some only support it on certain platforms. Every platform has its own set of issues. On Windows the quality of some OpenGL drivers has led to the creation of a project called ANGLE to emulate the OpenGL ES API using Direct3D, which usually has more stable drivers. ANGLE has progressed massively since its beginning, but at the time of writing, the translation from an OpenGL-like API to Direct3D affects performance negatively.

Some browsers switch between using OpenGL directly or using ANGLE depending on how much they trust the drivers for the video card, which can create inconsistent behavior between machines, or even on the same machine after updating the drivers (these are inconsistencies on top of those that already exist in the hardware drivers themselves). Some browsers gave up on OpenGL altogether when running on Windows and always use ANGLE. Some browsers support WebGL on some platforms and not on others. We found that, when using ANGLE, shader compilation and linking usually take a long time, often stalling rendering for several frames. We found similar issues when uploading custom texture data to the GPU.

Browsers support a very limited number of image file formats that they can load into textures. The “safe” set includes only PNG and JPG, PNG being the only one with support for an alpha channel. Any other image format will have limited support. For simple file formats it is possible to decode the pixel data using JavaScript after loading the binary data with XMLHttpRequest.
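As a sketch of that approach, assume a trivial custom format of our own invention (an 8-byte width/height header followed by raw RGBA pixels); the format and the helper names here are hypothetical, not a standard:

```javascript
// Decode the hypothetical raw format: two 32-bit integers (width, height)
// followed by width * height * 4 bytes of RGBA pixel data.
function decodeRawRGBA(buffer) {
    var header = new Uint32Array(buffer, 0, 2);
    return {
        width: header[0],
        height: header[1],
        pixels: new Uint8Array(buffer, 8)
    };
}

// Fetch the binary data with XMLHttpRequest and upload the decoded
// pixels to a WebGL texture.
function loadRawTexture(gl, url, onload) {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', url, true);
    xhr.responseType = 'arraybuffer';
    xhr.onload = function () {
        var image = decodeRawRGBA(xhr.response);
        var texture = gl.createTexture();
        gl.bindTexture(gl.TEXTURE_2D, texture);
        gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, image.width, image.height,
                      0, gl.RGBA, gl.UNSIGNED_BYTE, image.pixels);
        onload(texture);
    };
    xhr.send();
}
```

Decoding in JavaScript costs CPU time, so this only makes sense for formats the browser cannot load natively.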

At Turbulenz we designed a low level 3D API at a slightly higher level than WebGL, similar in concept to Direct3D and based on shaders and materials. This API allows us to switch between different backends depending on browser support, and if the browser does not support WebGL we use our own plugin that natively implements our low level 3D API.

The games available to play at turbulenz.com, as well as this tech demo of the Turbulenz Engine, show the high level of fidelity that can be achieved using the APIs described here.

Fullscreen

One traditional limitation of browsers when trying to provide a gaming experience similar to native desktop games is the lack of a fullscreen mode. The Fullscreen API was created to address this, and works by allowing JavaScript code to request that a specific HTML element take over the whole screen. Fullscreen mode can also improve performance for canvas rendering, allowing the browser to skip full page composition and present just the rendering output to the screen.

The Fullscreen API requires that requests to go fullscreen originate from a user action, such as clicking a button or pressing a specific key; code cannot simply request fullscreen at will. This requirement exists so that browsers can prevent malicious code from taking control of the screen without the user's consent.
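A minimal sketch of such a request from inside a click handler (the prefixed method names reflect browsers at the time of writing, and the wiring shown in the comment uses hypothetical element names):

```javascript
// Try the standard requestFullscreen method first, then the
// vendor-prefixed variants shipped by WebKit and Gecko browsers.
function requestFullscreen(element) {
    var request = element.requestFullscreen ||
                  element.webkitRequestFullScreen ||
                  element.mozRequestFullScreen;
    if (request) {
        request.call(element);
        return true;
    }
    return false;
}

// Must be invoked from a user-initiated event handler, e.g.:
// fullscreenButton.onclick = function () { requestFullscreen(gameCanvas); };
```

Calling this outside a user gesture will be rejected by the browser, so games typically tie it to a dedicated fullscreen button.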

This API still needs wider adoption as not all modern browsers support it.

Conclusion

In this second article we have described several features of HTML5 and other standards that developers have at their disposal when building games for the web. In the final article we continue this discussion across the remaining areas of game development (audio, input, threading, etc.), and also discuss issues such as resource loading and security.

 
Comments

Maciej Bacal
profile image
"Browsers will try to call the callback at 60FPS"

This isn't correct. RequestAnimationFrame is using v-synch. It'll try to match the monitor's frequency, not a flat 60 FPS. Try changing your monitor's refresh rate and run a basic RequestAnimationFrame demo in Chrome using the Timeline tab. It'll show the default 60 and 30 FPS range, but the transparent "frame overhead" bars will match the v-synch. For example, for 75Hz the bars should be a bit lower than the 60 FPS line and the frame time should be about 13 ms instead of 16 ms.

David Galeano
profile image
Is that so for all browsers and platforms? You may be right, and it would make total sense, but I don't think there is anything guaranteed about it in the specification, and browser development documentation always mentions 60Hz. In our own tests we have detected at least one browser calling the callback up to 75 times per second on machines with a monitor refresh rate set to 60Hz, a complete waste, but that is a rare exception, and we assume that in general, on most machines and browsers, the rate will effectively be 60 FPS (for one reason or another).

Maciej Bacal
profile image
Well in Chrome, for one, that's the behavior. I think it's a big enough player to give this a mention. I'm fairly sure that Safari does v-synch on Mac and iOS6.

"The expectation is that the user agent will run tasks from the animation task source at at a regular interval matching the display's refresh rate. Running tasks at a lower rate can result in animations not appearing smooth. Running tasks at a higher rate can cause extra computation to occur without a user-visible benefit."

This is from the W3C documentation (https://dvcs.w3.org/hg/webperf/raw-file/a43340fd9097/specs/RequestAnimationFrame/Overview.html).

"This API will take page visibility and the display's refresh rate into account to determine how many frames per second to allocate to the animation."

This is from Microsoft's site (http://ie.microsoft.com/testdrive/Graphics/RequestAnimationFrame/Default.html).

I'm fairly sure that Firefox doesn't synch. I guess that the thing to take away from this is, don't assume that it's going to be 60 FPS.

Christopher Casey
profile image
My experience is that requestanimationframe is still too new to depend on reliable behavior cross-platform, and forget about its behavior on mobile. It's still more reliable to use timers and/or intervals at the moment.

Actually HTML5 in general is still too shaky to even bother with it for mobile. Prepare for the time when it isn't, but it's just not good enough yet. Take for example Apple removing the trick that let you hardware accelerate animations by using the CSS rotation properties -- you just can't get the performance you need for any but the simplest cases.

David Galeano
profile image
I think "expectation" is probably the right word. We have seen Chrome dropping to 30Hz after running at 45Hz for a while, I guess some internal logic halving the rate to avoid doing more work than it is worth (the scene was too complex to render in less than 16.6ms), and we have also seen Firefox varying between 60Hz and 62.5Hz depending on the test.

"The repaint may occur up to 60 times per second for foreground tabs" and "we don't invoke the callbacks at a rate faster than 60Hz" are from other browser docs.

We do assume that by default the target rate would be 60 FPS, but of course we do code defensively, taking care of spikes or drops. The point is that with requestAnimationFrame it is perfectly possible to easily run at smooth 60 FPS and in many cases that will be the end result.

We did try to use setTimeout and setInterval and we never got smooth animations at 60 FPS.

Yes, mobile is lagging behind...

Maciej Bacal
profile image
"I think "expectation" is probably the right word. We have seen Chrome dropping to 30Hz after running at 45Hz for a while, I guess some internal logic halving the rate to avoid doing more work than it is worth (the scene was too complex to render in less than 16.6ms), and we have also seen Firefox varying between 60Hz and 62.5Hz depending of the test."

Is the 45 FPS being measured in real time or averaged? 45 is right between 30 and 60. The drop to 30 FPS would be correct since that's the "next step" in v-synch. To synch with a 60 Hz monitor, the FPS needs to be 60, 30, 15, 12, 10, etc... Though I have noticed myself that for some of my load tests Firefox and Chrome can sometimes stick with ~45 (measured in real time), which doesn't make much sense.

Robert Casey
profile image
Thanks for the eye opening article on the gaming and animation capabilities of HTML 5.

Don Olmstead
profile image
SIMD is coming to the browser but not with JavaScript. The Dart VM is fairly close to adding native support for SIMD operations. This should add another speed boost for game developers targeting the web.


none
 
Comment:
 




UBM Tech