Framerate and Refresh: games and movies are not the same.
by Roger Haagensen on 11/05/12 09:56:00 pm   Featured Blogs

The following blog post, unless otherwise noted, was written by a member of Gamasutra’s community.
The thoughts and opinions expressed are those of the writer and not Gamasutra or its parent company.

 

Framerate and Refresh: games and movies are not the same. Why does a 24 FPS movie look better than a 60 Hz game? Real motion blur vs. fake motion blur. 30 FPS vs. 60 FPS gaming. What is the difference?

I recently read Capcom Explains Why 30 FPS Isn't That Bad, and I'm sorry to say I think it is pure nonsense.


A movie deals with light over time.

Pretty much any game engine for consoles or PC deals with time slices of light, while a movie deals with light over time. What this means is that a movie at the cinema which "only" runs at 24 FPS (frames per second) captures roughly 1/24th of a second of light in every frame, while a game (crudely explained) renders 30 FPS where each frame is a single instant, closer to 1/1000th of a second of light per frame. The same is true at 60 FPS.


You get to see "all" the light during 1/24th of a second despite only having a 24 FPS rate.

How can this possibly be so? Those familiar with photography will know the term "exposure time" quite well. For the rest of you: imagine capturing 1000 tiny time slices per second and dividing them among 24 frames. Each frame would get about 41.66 of those slices, and those slices would be blended into a single image. This way you get to see "all" the light during that 1/24th of a second despite only having a 24 FPS rate. Please note that although I say "blend", that is not literally what happens; check Wikipedia for how film exposure really works, as that article explains it much better than I currently can.
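
To make that "divide a second into slices and blend them per frame" idea concrete, here is a minimal Python sketch. The render_slice() function and the slice count are invented stand-ins for illustration, not how film chemistry (or any real engine) actually works.

    # Rough sketch: fake a film-style 24 FPS "exposure" by averaging many
    # short time slices of a moving scene into one output frame. The scene
    # here is just a single brightness value that changes over time; a real
    # renderer would produce a full image per slice.

    FPS = 24
    SLICES_PER_FRAME = 42                  # ~1000 ms / 24 ~= 41.66 slices of ~1 ms each

    def render_slice(t):
        """Hypothetical instantaneous render at time t (in seconds)."""
        return (t * 10.0) % 1.0            # stand-in for a pixel value in [0, 1)

    def exposed_frame(frame_index):
        """Average SLICES_PER_FRAME instantaneous slices across 1/FPS seconds."""
        start = frame_index / FPS
        step = (1.0 / FPS) / SLICES_PER_FRAME
        slices = [render_slice(start + i * step) for i in range(SLICES_PER_FRAME)]
        return sum(slices) / len(slices)

    print([round(exposed_frame(i), 3) for i in range(5)])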


I've yet to see a 1000FPS capable game.

A computer game would have to replicate this, and I've yet to see a 1000 FPS capable game. In fact, I cannot recall any game that renders at a higher FPS and blends the frames down. Why? The answer is simple: it would waste resources, and you might as well show the frames at a higher rate instead (like 60 FPS) if the hardware can handle it. However, if the hardware can render faster than 60 FPS, the refresh rate is 60 Hz, and the rendering engine supported blending, then you could in theory get very good-looking "film" motion blur by rendering twice as many frames.
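
As a rough sketch of that "render twice, blend once" idea (in Python, with render_at() and present() as hypothetical stand-ins for a real engine), rendering two sub-frames per 60 Hz refresh and averaging them is about the simplest form this could take:

    # Sketch of "render twice, blend once": produce two sub-frames per
    # displayed 60 Hz frame and average them before presenting, giving a
    # crude temporal blur. render_at() and present() are placeholders.

    REFRESH_HZ = 60
    SUBFRAMES = 2                          # effectively render at 120 FPS internally

    def render_at(t):
        return (t * 5.0) % 1.0             # placeholder: a "frame" is one float here

    def present(frame, t):
        print(f"t={t:.4f}s  blended={frame:.3f}")

    def run(seconds=0.05):
        frame_time = 1.0 / REFRESH_HZ
        t = 0.0
        while t < seconds:
            subs = [render_at(t + i * frame_time / SUBFRAMES) for i in range(SUBFRAMES)]
            present(sum(subs) / SUBFRAMES, t)
            t += frame_time

    run()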


Unlike film exposure, a game just creates a morphed blur over frames.

So what about motion blur; can't computers fake the "movie blur" as well? Yes, they can. But unlike film exposure, a game just creates a morphed blur over frames (in other words, 30 FPS rendering with a blur effect applied). I'm sure there are developers trying to use game-world details to help tweak the interframe blur, and if done well, a 30 FPS game with interframe blur displayed at 60 FPS could look good. But again, I've yet to see this in any current game. Computers have enough trouble keeping a steady framerate while playing regular movies, so tackling this is obviously a major challenge.
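
For contrast, the post-process style of blur described above roughly amounts to smearing the finished frame along a motion vector. Here is a simplified Python/NumPy sketch; a single uniform velocity is assumed for brevity, where a real effect would read a per-pixel velocity buffer.

    import numpy as np

    # Simplified post-process motion blur: smear the finished frame along a
    # screen-space motion vector by averaging a few shifted copies of it.
    # This only reuses light the frame already contains, unlike a real
    # film exposure, which integrates light that arrived between frames.

    def post_process_blur(frame, velocity, samples=4):
        """frame: 2D array of pixels; velocity: (dy, dx) in pixels per frame."""
        out = np.zeros_like(frame, dtype=float)
        for i in range(samples):
            frac = i / samples
            shift = (int(round(velocity[0] * frac)), int(round(velocity[1] * frac)))
            out += np.roll(frame, shift=shift, axis=(0, 1))
        return out / samples

    frame = np.zeros((8, 8))
    frame[4, 2] = 1.0                      # one bright pixel
    print(post_process_blur(frame, velocity=(0, 3)).round(2))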


A lot of timing is millisecond based.

Part of the reason for these timing issues is the NTSC not-quite-60 Hz problem. The PAL (and most of the rest of the world) 50 Hz refresh timing is easier to handle: 1000 ms / 50 Hz = 20 ms per frame, while with NTSC you get a fractional number instead. Why is this an issue? Well, 60 Hz (really ~59.94 Hz for NTSC) is very common, and 1000/60 = 16.66+; since a lot of timing is millisecond based, you can imagine the trouble of matching 16.66 ms. Many game engines try to compensate by varying between 15 and 17 ms so that over time it evens out to 60 Hz. At 30 FPS this jerkiness is less noticeable, but animation feels stiffer instead, and visual feedback lags further behind user input.
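
A minimal sketch of that kind of compensation, assuming only a whole-millisecond timer is available: carry the fractional part of the 16.66 ms budget forward so the waits average out to 60 Hz.

    # Sketch: hitting an average of 60 Hz (16.666... ms per frame) with a
    # timer that can only wait whole milliseconds. An error accumulator
    # decides whether each frame waits 16 ms or 17 ms, so the fractional
    # part is never lost.

    TARGET_MS = 1000.0 / 60.0              # 16.666... ms per frame

    def frame_waits(n_frames):
        waits, error = [], 0.0
        for _ in range(n_frames):
            error += TARGET_MS
            wait = int(error)              # whole milliseconds we can actually wait
            error -= wait
            waits.append(wait)
        return waits

    w = frame_waits(60)
    print(w[:9])                           # 16, 17, 17 repeating
    print(sum(w), "ms for 60 frames")      # ~1000 ms, i.e. an average of ~60 Hz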


Do not confuse frame rate with refresh rate.

Also, do not confuse frame rate with refresh rate. Refresh rate refers to how often a display (a TV or monitor, driven by the graphics card) updates what you see, while frame rate refers to how many frames per second are rendered. Rendering more frames than the refresh rate is just wasting resources, as the extra frames get discarded because they can never be shown. Ideally you want the refresh rate set to the native rate of whatever device you use, and the frame rate set to whatever can be smoothly rendered. This can mean 60 FPS at 120 Hz, or 24 FPS at 60 Hz. A higher refresh rate generally makes a game feel more responsive (though there are ways to stay responsive even at low refresh rates). Sadly, many games have the frame rate and refresh rate locked together, which tends to decrease responsiveness.
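
A trivial back-of-the-envelope illustration of that waste:

    # Back-of-the-envelope: frames rendered beyond the refresh rate can
    # never reach the screen.

    def shown_and_wasted(render_fps, refresh_hz):
        shown = min(render_fps, refresh_hz)
        wasted = max(0, render_fps - refresh_hz)
        return shown, wasted

    for fps, hz in [(60, 120), (24, 60), (90, 60)]:
        shown, wasted = shown_and_wasted(fps, hz)
        print(f"{fps} FPS on a {hz} Hz display: {shown} shown, {wasted} discarded per second")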


With film/cinema a frame is light over a period of time, while in a computer/console game a frame is a slice from a particular moment in time.

So please remember that with film/cinema a frame is light over a period of time, while in a computer/console game a frame is a slice from a particular moment in time. Oh, and 1000 is how many milliseconds there are in a second; milliseconds (ms for short) are often used for video timing on computers. A film frame (at 24 FPS) contains 1/24th of a second of all the light the camera sees, and is not tied to 1000 ms; it is instead tied to the speed of light itself (or rather the spectrum of captured light). So instead of dividing 1000 by 24, it would be more correct to take the speed of light per second and divide that by 24. Light moves at 299,792,458 meters per second, so filming something 1 meter away at 24 FPS means the light reaching the camera effectively has a refresh rate of 299,792,458 Hz. Obviously the math here is weird, as it's not easy to translate light into a refresh rate, but it's just a very rough example.


Very costly computationally.

Will real film blur (a.k.a. the film look) ever be possible in games? Yes, one day it will. For many years now anti-aliasing has been used to improve the look of games/rendering; in its most basic form it renders a frame at a resolution higher than the display and then scales it down, blending the pixels as it goes. This is not unlike how rendering at a higher framerate and then blending frames together would work. But doing it like that is very costly computationally. So what is the solution? Considering that computer hardware (and software) already has anti-aliasing and tessellation (which creates more detail using hints), something similar should be possible with frame rates as well.
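
For comparison, here is the spatial version of the same idea in a few lines of Python/NumPy: render at double resolution, then average each 2x2 block down to one display pixel. Temporal blending would do the same thing along the time axis instead of the x/y axes. The hard-coded diagonal edge is just a made-up test image.

    import numpy as np

    # Spatial supersampling in miniature: render at 2x resolution, then
    # average each 2x2 block down to one display pixel. The diagonal edge
    # below comes out with fractional (anti-aliased) values.

    def downsample_2x(hi_res):
        h, w = hi_res.shape
        return hi_res.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

    hi = np.array([[1.0 if x >= y else 0.0 for x in range(8)] for y in range(8)])
    print(downsample_2x(hi))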


The key is to keep the framerate and the refresh rate independent of each other and keep user input timings separate from both of them.

Hardware or software could be made that uses frame (or rather scene) hints to create interframes. There is still the issue of a stable framerate, and computers will probably struggle with that for a long time before framerate timing versus refresh rate is fully smooth; some developers get it right while many don't. The key is to keep the framerate and the refresh rate independent of each other, and to keep user input timing separate from both of them. With anti-aliasing (not needed for really high-DPI displays), tessellation, improved timing, and in the near future (I hope) interframe blending (my term for it), getting that film look should indeed be possible.
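
One common way to keep those three rates independent (a sketch only, with placeholder update/render functions, not any particular engine) is a fixed-timestep loop: logic and input run at a fixed rate, rendering runs at whatever rate the display allows, and the renderer interpolates between the last two logic states.

    import time

    # Sketch of a fixed-timestep loop: game logic and input run at a fixed
    # rate, rendering runs as often as the display allows, and the renderer
    # interpolates between the last two logic states so motion stays smooth
    # at any refresh rate. update() and render() are placeholders.

    LOGIC_DT = 1.0 / 120.0                  # simulation/input step: 120 Hz

    def update(position, dt):
        return position + dt                # placeholder "physics": move at 1 unit/s

    def render(prev, curr, alpha):
        drawn = prev + (curr - prev) * alpha    # interpolate between logic states
        print(f"drawn position: {drawn:.4f}")
        time.sleep(0.004)                   # pretend drawing a frame takes ~4 ms

    def main_loop(run_for=0.05):
        prev_pos = curr_pos = 0.0
        accumulator = 0.0
        start = last = time.perf_counter()
        while time.perf_counter() - start < run_for:
            now = time.perf_counter()
            accumulator += now - last
            last = now
            while accumulator >= LOGIC_DT:  # catch up logic in fixed steps
                prev_pos = curr_pos
                curr_pos = update(curr_pos, LOGIC_DT)
                accumulator -= LOGIC_DT
            render(prev_pos, curr_pos, accumulator / LOGIC_DT)

    main_loop()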

(It's not easy to explain something complicated in a simple manner. If there are any glaring mistakes regarding light, speed, math, or rendering technology, feel free to comment and I'll correct the text!)

 

Roger Hågensen considers himself an Absurdist and a Mentat, hence believing in Absurdism and Logic. He has done volunteer support in Anarchy Online for Funcom, voice work for Caravel Games, and has been an Internet radio DJ with GridStream. He currently works as a freelancer doing Windows application programming, website development, music, writing, and just about anything computer related. He runs the website EmSai, where he writes a journal and publishes his music, ideas, concepts, source code and various other projects. He has released 3 music albums to date.


Comments


Matthew Downey
The lightspeed and fps argument feels like it suffers from a logical fallacy: what if we were talking about lightyears per second (instead of meters per second)? Then we would get 1/31557600 frames a second, which is basically zero. The other end of the spectrum is also true; we could be talking about something like Angstroms instead of meters, and we would get 10 million times the framerate you arrived at under that logic.

But you are right that it is (for all intents and purposes) infinity.

The downtime for the film switching will have an impact on continuity, though, because when the camera is not taking in light, some blur is being lost (I am sure this is not a problem with digital cameras, but I do not think digital cameras are widely used in the film industry yet).

But, then again, 1 trillion frame per second cameras: http://news.yahoo.com/blogs/this-could-be-big-abc-news/world-fastest-camera-153012573.html

This article talks about fps well: http://www.100fps.com/how_many_frames_can_humans_see.htm

Honestly, over 500fps is definitely overkill, unless objects are moving really, really fast.

For programmers:

I cannot wait to see some of the motion blur methods people come up with in the future. I am a high-level programmer (working with Unity), so the most I have tried my hand at was stretching meshes along the direction of motion with dot products, which works for things like spheres, but you would probably have to run a (few?) convex hull calculations just to do the same thing for character meshes (and that's at runtime). Hopefully someone finds a cool & cheap algorithm for motion blur in the near future, because it looks really cool in games like Crysis 2. I do not like screen-based motion blur; I would prefer an organic motion blur that takes rotation into account, but that is a lot to ask.

A S
It's a clumsy example, but you're missing the point. The distance from the object to the camera does not matter. What matters is the stream of photons striking the CCD or film the camera is using. Whether those originate from 1 light year away or 1 picometre is not important, but putting it 1 metre away makes the unit cancellation nice and simple.

Regarding motion blur - if you're able to do mesh deformation you can probably generate some motion blurring algorithms yourself =D You could be the one to solve it...

Matthew Downey
The lightyear-->angstrom argument was based on the meter being an arbitrary distance (even if it was redefined (in 1983) to make the number of meters in a lightyear an exact number (integer)). I did NOT mean the distance from the light source.

I was saying, "hey man, imagine the universal standard measurement is not a meter but instead a kilometer, then imagine that that case became even more extreme and the new standard is a megameter"

If the universal standard were a megameter, then light would travel at 300 megameters a second. If you compound this effect more and more it can become pretty much zero, which is why I was saying that the speed of light was not necessarily the reason the frames per second of real life is basically infinity.

While this seems unreasonable and trivial, there are some wackily defined units, like the farad, which is usually seen in small amounts like microfarads.

The guy says he believes in logic, so I thought I would show an unlikely but nonetheless possible case where the argument would fall apart (i.e. when the standard unit of measurement is the lightyear).

Like you are saying, the high volume of photons that hit the film is a good reason for the frames per second to be so numerous.

[edit:] and, yeah, I really want to solve the geometric blur approach, although it will probably take some time. XD

Chris Hendricks
Thank you for this. I'd wondered for a long time why, if the eye's persistence of vision works just fine at 24 FPS, a game would need any more than that. You've given a fine explanation.

Janne Haffer
Another important aspect that's often not mentioned when people use the "but film looks smooth" argument is that there are a LOT of movement guidelines for shooting film at 24fps.

Things like "Don't rotate the camera faster than X degrees per second", basically if they weren't followed movies would look shitty and stuttery too.

Really, the only way around this in games with free camera movement is to limit your camera to slow movements or run at a higher framerate.

Jeremy Reaban
Meh. This is like one of those things where some people can hear the difference between mp3 and cd, but most people can't.

I think the biggest problem is a stable frame rate. Your eye can tell when it speeds up or slows down, while in movies, the frame rate is always constant. Except when they deliberately speed up the film for a special effect, usually to make car chases look faster.

Roger Haagensen
Indeed, stable framerate is the core of the issue.

But comparing framerates to the mp3 and cd situation is not ideal. You would need surprisingly high refresh and rendered framerates to reach "transparency" like with audio compression.
First of all, 60Hz is not enough to avoid flicker; 3D glasses suffer from this issue, which is why 120Hz glasses are the cutting edge (and require a 240Hz TV, I believe), and on old CRT monitors I had to have 85Hz; much lower than that and I would notice the screen flickering.

A good refresh rate goal would rather be one where most "normal" humans (meaning basically all humans) are unable to see the refresh flicker, then round that number up to an even ten just in case, and then double that for stereoscopic 3D.

But here's the thing: LCD monitors, unlike CRTs, do not refresh the image; they do not need to. Now, I have not dived into the DVI, HDMI or DisplayPort specs, but due to CRT "legacy" reasons the entire frame is delivered at each refresh.
I'm surprised that hasn't been fixed yet, as LCD, LCD+LED, OLED and SED displays could theoretically be given only the pixels/areas that have changed.

Thus a part of the image could be at 100+Hz while another part at 2Hz, and so on. I'm sure Graphics card engineers and driver and OS developers could do many nice optimizing tricks taking advantage of this.
But this would mean a new display connector would be needed.


@Downey
I'm not sure if this is mathematically correct, but it may not be infinity. I wonder if the Nyquist theorem holds for light as well as audio (assuming ADC to DAC, which is true for video as well, and that light and sound are both a type of wave). I'm not sure how rendering farms do this for CGI, but I assume they calculate the light for in-between frames (an internal framerate, or do they use light over time?) and then blend/mix those down to the 24FPS.

But let us assume 240Hz as a minimum for 3D, then assume 4K resolution. The bandwidth requirement for this would be insane, so I assume the "new" display connectors are needed, and the old CRT legacy support tossed away for good, as per-pixel refresh (or area refresh) is much more efficient.

Panning has always been an issue, and watching a movie pan on a PC is really horrible (lots of stuttering). Framerate smoothers do exist for many players and can rectify this, but they're a band-aid rather than a solution.

Also, remember not to mix up framerate and refresh rate.
For example, I created a graphics engine/renderer for an audio stream player that uses alpha blending and masking tricks to animate movement; this lets artists make just 3 static images which turn into really cool animated skins/meters when rendered.
And here's the thing: the framerate used is fixed, but it's also lower than the refresh rate, and user input is independent of both, so the player feels very responsive and multitasks well. In the (rare) case of no volume change, the rendering can fall as low as 2 FPS. Yup, variable framerate. And if it were possible to push only the pixels that actually change (less than 25% of the graphics) to the monitor, I would have done that as well.

Currently games have reached a point where they look crisp (photoreal-ish) while still being artistic; now they just need to ensure they look smooth while moving, and doubly so with 3D.

As a great person once said "Time is an illusion, and lunchtime doubly so!" and right now somewhere in the world somebody is having lunch.

I wish I had a "light formula" solution, if I did I'd gladly have posted it here on Gamasutra so all game developers and other software developers could benefit from it.

Ironically, an LCD monitor does not need a refresh rate at all (the pixels are static and do not fade like with a CRT, so they do not need to be refreshed). The only restriction then would be graphics card/monitor/cable bandwidth, which would limit framerate and/or resolution, 2D vs 3D.

Arnaud Clermonté
The absurdity of LCD refresh rates even exists on portable consoles, if I remember correctly, which makes no sense since the screen is part of the console, so they don't need to support any kind of "CRT legacy".
So if your game runs at 50 fps, well, too bad; it will be capped down to 30 because someone decided that the screen should refresh at a fixed rate instead of when the game needs it to.

Arnaud Clermonté
Games shouldn't try to get "that film look", the low framerate and blur of movies.
Being interactive and usually first-person (or close to it), they need to run much smoother.

I keep playing on PC instead of console to get 60 fps instead of 30.
The difference is obvious, it is much more comfortable, causes less headaches, and the controls are much tighter.

And when a game runs at 60 fps, you don't even need motion blur.

It's clear to me that developers should focus on running games at a good framerate instead of adding some blur to a low framerate.

Too bad most game reviews only show static screenshots and compressed videos which are limited to 30fps.
Now we have a generation of game reviewers who have no experience of PC gaming, never experienced a smooth framerate, and somehow think that 30fps is good. (I did actually read "the game runs at a solid 30fps" in many reviews.)

Paul Shirley
When film makers want to enhance 'realism' they use a faster shutter to defeat motion blur, in the process exposing the reality that 24fps just isn't enough.

The motion blur so many seek to emulate certainly helps disguise poor frame rates, but deliberately trying to copy film's exposure blur as the 'correct' thing to do is deeply misguided.

Do motion blur for artistic effect, do it to disguise poor frame update rates, do it by any method that looks good enough. But don't obsess over how film works; if you can throw out more frames instead, do that, because real life has very little motion blur.

Luis Guimaraes
Whoever thinks 30 FPS is fine should try some Counter-Strike capped at it.

Leon Ni
If you have spent time in Japan, you will find that it is just a cultural thing, not something taken that seriously as a design decision. :)

