What's The Difference?

esmufc07

Between a game running at 1080p @ 30fps and a game running at 720p @ 60fps? And is it possible to get games running at 1080p @ 60fps? And while I'm asking questions (:p), what is the difference between 1080p and 1080i?

Thanks

(I expect this thread to turn into a 3 page debate between Redlambs and Weaste about things the average Caf user knows nothing about)
 
A game running at 60fps clearly outputs 60 images per second. A game that runs at 30 clearly outputs 30 images per second. Cinema movies run at 24 frames per second. The more frames, the more fluid the action. 1080p and 720p are simply the number of pixels shown on the screen. If we are talking native rendering buffers, then 1080p is 1920x1080 pixels on the screen and 720p is 1280x720 pixels on the screen. Many games however scale, because of how many pixels they can push in a particular time scale. GT5 Prologue runs natively at 1280x1080 @ 60fps and then scales horizontally to fill the 1920x1080 (scaling is faster than trying to fill the extra pixels), while Halo 3 runs at 1152x640 and then scales both horizontally and vertically to fit 1280x720 (NB: XB360 has a full hardware scaler).
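If it helps to put numbers on that scaling, here's a rough Python sketch using just the resolutions quoted above (the percentages are nothing more than the ratio of rendered pixels to displayed pixels):

# Native render resolution vs. the output resolution it gets scaled up to.
games = {
    "GT5 Prologue": ((1280, 1080), (1920, 1080)),  # scaled horizontally only
    "Halo 3":       ((1152, 640),  (1280, 720)),   # scaled in both directions
}

for name, ((nw, nh), (ow, oh)) in games.items():
    native = nw * nh
    output = ow * oh
    print(f"{name}: renders {native:,} px, displays {output:,} px "
          f"({native / output:.0%} of the output pixels actually rendered)")

So GT5 Prologue only renders about two thirds of the pixels it finally displays, and Halo 3 about 80%, which is exactly why scaling is cheaper than rendering the full frame.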
 
A game running at 60fps clearly outputs 60 images per second. A game that runs at 30 clearly outputs 30 images per second. Cinema movies run at 24 frames per second. The more frames, the more fluid the action. 1080p and 720p are simply the number of pixels shown on the screen. If we are talking native rendering buffers, then 1080p is 1920x1080 pixels on the screen and 720p is 1280x720 pixels on the screen. Many games however scale, because of how many pixels they can push in a particular time scale. GT5 Prologue runs natively at 1280x1080 @ 60fps and then scales horizontally to fill the 1920x1080 (scaling is faster than trying to fill the extra pixels), while Halo 3 runs at 1152x640 and then scales both horizontally and vertically to fit 1280x720 (NB: XB360 has a full hardware scaler).

Well I understand it a bit better now.

Thanks :D
 
Further on from Weaste, I'll simplify it.

The higher the better. Games at a higher resolution and a higher refresh rate generally look better and run smoother. It frustrates me no end playing games at 30fps or less, but for the most part people generally wouldn't care, so that's considered the minimum.
 
Well I understand it a bit better now.

Thanks :D

One little point though, I wouldn't use a Halo vs GT5 comparison if you were going to compare graphics and resolutions/fps, because there's generally a lot more going on under the hood in Halo. I expect racing games to always be able to push boundaries further than most other game types.
 
See I don't know what that actually means.

And if WipEout can do it, why can't other newly released games?

It depends on how much work you are doing to actually compose the scene: how many polygons you are rendering, how many textures you are compositing, how much AA and AF you are applying, the depth and type of the colour space, and how much other processing you are doing.

1080i vs 1080p is something else. LCDs do not actually interlace, CRTs do. It's a cheat to get double the vertical resolution. What interlacing means is that you output all of the even lines of your image in one frame, then all of the odd lines in the next. These are shown so rapidly that it looks as if you get double the vertical resolution, but depending on how fast you do this, you will get flicker. So, for example, when a 720p screen displays a 1080i image, it is actually getting, per input field, an image that is 1920x540. This then needs to be scaled down to fit on the display device, which takes it to roughly 1280x360. You will then, depending upon the frequency, get an image where each full frame arrives 30 times per second, but each interlaced field arrives 60 times per second. HDTV broadcasts use 1080i because they only need to send half of the information per refresh that a full 1080p image would. A 1080p display will simply buffer the two fields and then put them out as a single 1920x1080 progressive image 30 times per second.
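To make the even/odd line splitting concrete, here's a minimal Python sketch where a frame is nothing but a list of line numbers (not real video data):

# A 1080-line progressive frame, represented only by its line indices.
frame = list(range(1080))

# Interlacing: one field carries the even lines, the next carries the odd lines,
# so each transmitted field is effectively 1920x540 rather than 1920x1080.
even_field = frame[0::2]   # lines 0, 2, 4, ...
odd_field  = frame[1::2]   # lines 1, 3, 5, ...

# A progressive (1080p) display buffers both fields and "weaves" them back
# into one full-resolution frame.
woven = [None] * len(frame)
woven[0::2] = even_field
woven[1::2] = odd_field

assert woven == frame
print(len(even_field), len(odd_field))   # 540 540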
 
Further on from Weaste, I'll simplify it.

The higher the better. Games at a higher resolution and a higher refresh rate generally look better and run smoother. It frustrates me no end playing games at 30fps or less, but for the most part people generally wouldn't care, so that's considered the minimum.

But don't most games that run at 60fps suffer from frame rate drops?
 
Further on from Weaste, I'll simplify it.

The higher the better. Games at a higher resolution and a higher refresh rate generally look better and run smoother. It frustrates me no end playing games at 30fps or less, but for the most part people generally wouldn't care, so that's considered the minimum.

Cheers Red;)
 
But don't most games that run at 60fps suffer from frame rate drops?

Games can always suffer framerate drops, whichever framerate you are after. This is one of the reasons 30fps is more common, because if the game can push more but not at a guaranteed pace, then locking it at 30fps can keep it constant.

It's not the ideal way to do it, though, since a drop from 60 to 50 (for example) is much less noticeable than a drop from 30 to 20.
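The frame-time arithmetic shows why. This is nothing more than the reciprocal of the frame rate:

# Extra time each frame hangs around when a game misses its target frame rate.
def frame_ms(fps):
    return 1000.0 / fps

for target, dropped in [(60, 50), (30, 20)]:
    extra = frame_ms(dropped) - frame_ms(target)
    print(f"{target}fps -> {dropped}fps: {frame_ms(target):.1f}ms vs "
          f"{frame_ms(dropped):.1f}ms per frame ({extra:.1f}ms longer)")

A drop from 60 to 50 only adds about 3.3ms per frame, while 30 to 20 adds about 16.7ms, which is why the second one looks so much worse.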
 
But don't most games that run at 60fps suffer from frame rate drops?

Not if they can get all of their work done 60 times per second in every case. You only get frame drops when there is too much to do in a specific frame, so that you cannot get it displayed in time. If your game is synchronised on the vertical blank, this can be quite jarring. Many developers simply do not synchronise on the vertical blanking period when they know they cannot hit it, but that causes what is known as screen tearing.

Games that target 30fps can also suffer from frame rate drops if they cannot get all of their work done in 1/30th of a second.
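Here's a rough sketch of the vsync case on a 60Hz display, with made-up frame times, assuming the simplified model that a missed vertical blank means waiting a whole extra refresh:

REFRESH_HZ = 60
VBLANK_MS = 1000.0 / REFRESH_HZ   # 16.7ms budget per frame at 60Hz

# Hypothetical render times for five frames, in milliseconds.
frame_times = [15.0, 16.0, 18.0, 15.5, 20.0]

for t in frame_times:
    if t <= VBLANK_MS:
        shown_after = VBLANK_MS       # made it: shown at the next vertical blank
    else:
        shown_after = 2 * VBLANK_MS   # missed it: wait a whole extra refresh (33.3ms)
    print(f"rendered in {t:4.1f}ms -> on screen after {shown_after:.1f}ms")

Not waiting for the vertical blank avoids that sudden 33.3ms hitch, but then the new frame can replace the old one part way through a refresh, which is the tearing mentioned above.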
 
If you are talking Blu-ray, and your TV supports it, the PS3 and other players output 24fps to get perfect cinema frames. If the TV doesn't support it, you can get stuttering artefacts, as the output needs to be scaled in terms of time. This is why you will see some LCD TVs that do "120Hz": because 120 is 5x24, you do not get any time scaling when watching a movie. The set shows each film frame 5 times, then the next, refreshing 120 times per second.
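Here's the arithmetic behind that, and why a plain 60Hz panel can't do the same trick (the 3:2 alternation is the usual way 24fps gets squeezed into 60Hz):

FILM_FPS = 24

for refresh in (120, 60):
    repeats = refresh / FILM_FPS
    if repeats == int(repeats):
        print(f"{refresh}Hz: every film frame shown exactly {int(repeats)} times - smooth")
    else:
        print(f"{refresh}Hz: {repeats} refreshes per film frame - frames have to alternate "
              f"(3 then 2 at 60Hz), which is where the stutter comes from")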
 
One little point though, I wouldn't use a Halo vs GT5 comparison if you were going to compare graphics and resolutions/fps, because there's generally a lot more going on under the hood in Halo. I expect racing games to always be able to push boundaries further than most other game types.

I'm not trying to compare them, I'm just pointing out that those are the resolutions they use; neither is rendering to an HD-specified resolution.

COD4 = 1024x600 (PS3 & XB360) 60fps, Uncharted = 1280x720 30fps, Ratchet & Clank = 1280x704 60fps (no scaling), so it's 1280x720 in aspect.
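One crude way to compare those choices is raw pixels pushed per second; it ignores everything else going on in the scene, but it shows the trade-off each game made:

games = [
    ("COD4",            1024, 600, 60),
    ("Uncharted",       1280, 720, 30),
    ("Ratchet & Clank", 1280, 704, 60),
]

for name, w, h, fps in games:
    pixels_per_sec = w * h * fps
    print(f"{name}: {w}x{h} @ {fps}fps = {pixels_per_sec / 1e6:.1f} million pixels/sec")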
 
Yes, that's the problem, it's not all black and white, but most people do not know this. Halo 3 is rendering its scene twice to get the HDR lighting. If people appreciate the lighting in the game, then they made a good choice. No AA on a 640p image, though, was for me a poor choice.
 
Yes, that's the problem, it's not all black and white, but most people do not know this. Halo 3 is rendering its scene twice to get the HDR lighting. If people appreciate the lighting in the game, then they made a good choice. No AA on a 640p image, though, was for me a poor choice.

I agree completely. HDR lighting is pretty, no doubt, but for the most part the average Joe just won't notice it, but will notice jaggies.
 
We've talked about this before, but if all the games this generation outputted 720x288 or 720x576 for LCDs and used all of that power to give massive IQ @ 50/60fps, then it would be quite glorious.
 
[Animated GIF: Interlacingani2.gif, showing how interlaced fields combine]
 
I'm just in awe at how good it could be. Standard HD is good enough, but Super HD has a screen resolution of 7680x4320 pixels, 16 times greater than current HD. :eek:
 
This is important to realise here.

[Image: 800px-Common_Video_Resolutions_2.svg.png, a chart comparing common video resolutions]


If you want a simple display buffer, then at 1080p it's 1920x1080x4 (4 being bytes, 32 bits RGBA) = 8MB. 720p it's 1280x720x4 = 3.5MB. So, basic front/back buffer at 720p = 7MB or 1080p = 16MB. Games use more than one buffer, because they need extra information and need to keep it. Killzone 2 I believe uses 32MB for its frame buffers alone at 720p.
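The same sums in a few lines of Python, using 4 bytes per pixel (32-bit RGBA) as above; real games stack depth/stencil and other render targets on top of this, which is where figures like Killzone 2's come from:

BYTES_PER_PIXEL = 4   # 32-bit RGBA

def buffer_mb(width, height):
    return width * height * BYTES_PER_PIXEL / (1024 * 1024)

for name, w, h in [("720p", 1280, 720), ("1080p", 1920, 1080)]:
    single = buffer_mb(w, h)
    print(f"{name}: single buffer {single:.1f}MB, front + back {2 * single:.1f}MB")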
 
I'm just in awe at how good it could be. Standard HD is good enough, but Super HD has a screen resolution of 7680x4320 pixels, 16 times greater than current HD. :eek:

There is something in between, and RSX supports it, and that's 4096x2304 or something like that. The problem here is the memory required for a single buffer of that is 38MB. It's also a silly amount of pixels to fill: half a billion per second at 60fps in the simple case.
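And the same sums again for that resolution (the 4096x2304 figure is simply taken from the post above; whether it comes out at about 36MB or 38MB depends on counting in MiB or MB):

W, H, FPS, BYTES_PER_PIXEL = 4096, 2304, 60, 4

buffer_bytes = W * H * BYTES_PER_PIXEL
pixels_per_sec = W * H * FPS

print(f"single buffer: {buffer_bytes / (1024 * 1024):.0f}MB")                         # ~36MB
print(f"pixels to fill per second at {FPS}fps: {pixels_per_sec / 1e9:.2f} billion")   # ~0.57 billion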