
Keep in mind that the RAW video is 14-bit color and there are 8 bits to a byte, so the uncompressed data rate would be height x width x 14 bits per pixel / 8 bits per byte x fps bytes per second. I'm assuming that your images are an even number of pixels across, in which case each pair of pixels packs into exactly 28 bits and the rate simplifies to (width/2) x height x 3.5 x fps bytes per second. That said, I'm not sure if the footage is stored completely uncompressed; RAW images are generally compressed significantly, they are just compressed losslessly, and I'm not sure whether the MagicLantern hack uses compression or not.
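To put numbers on that, here is a quick Python sketch of the same arithmetic; the 14 bits per pixel comes from the discussion above, while the 1080p/720p frame sizes and the 24 fps frame rate are just assumed example values:

```python
def raw_data_rate(width, height, fps, bits_per_pixel=14):
    """Uncompressed RAW video data rate in bytes per second.

    bits_per_pixel defaults to the 14-bit color depth discussed above.
    """
    bytes_per_frame = width * height * bits_per_pixel / 8
    return bytes_per_frame * fps

# Example frame sizes and frame rate (assumed values, for illustration only)
for w, h in [(1920, 1080), (1280, 720)]:
    rate = raw_data_rate(w, h, 24)
    print(f"{w}x{h} @ 24 fps: about {rate / 1e6:.1f} MB/s uncompressed")
```

At 24 fps that works out to roughly 87 MB/s for 1080p and about 39 MB/s for 720p before any compression.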
As far as screen size for noticing the difference between 720p and 1080p goes, it doesn't have to be big. Viewing distance makes a bigger difference than screen size: it's fully possible to tell 1080p from 720p even on screens that are only inches across, as long as they actually support 1080p and are viewed from fairly close. While I wasn't able to find a CC version, Carlton Bale has a nice graph on his site showing the screen sizes and distances at which 20/20 vision can resolve the differences, though it unfortunately only covers 20 inch displays and up. Using a 50 inch screen as a guideline: if the display takes up the same area of your vision that a 50 inch TV does when you're sitting 8 to 10 feet away, then you won't see better than 720p with 20/20 vision. Note that someone with better than 20/20 vision would see the differences from further out.
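For a rough feel for where those distance thresholds come from, the sketch below assumes 20/20 vision resolves detail down to about one arcminute (a common rule of thumb, not a figure taken from Carlton Bale's chart) and computes how far away a single pixel row of a 16:9 screen is still distinguishable:

```python
import math

ONE_ARCMINUTE = math.radians(1 / 60)  # rough limit of 20/20 visual acuity

def max_resolvable_distance_ft(diagonal_in, vertical_pixels, aspect=(16, 9)):
    """Farthest viewing distance (in feet) at which one pixel row of the
    screen still subtends a full arcminute, i.e. the distance beyond which
    extra vertical resolution stops being visible."""
    w, h = aspect
    screen_height_in = diagonal_in * h / math.hypot(w, h)
    pixel_pitch_in = screen_height_in / vertical_pixels
    return pixel_pitch_in / math.tan(ONE_ARCMINUTE) / 12

# Assumed example: the 50-inch screen used as a guideline above
for rows in (720, 1080):
    d = max_resolvable_distance_ft(50, rows)
    print(f"{rows}p detail on a 50\" screen fades out beyond ~{d:.1f} ft")
```

Under those assumptions, full 1080p detail on a 50-inch screen stops being resolvable somewhere around 6 to 7 feet, and by 9 to 10 feet even 720p detail is at the limit, which is broadly consistent with the 8-to-10-foot guideline above.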
In closer, it might be easier to use a pixels-per-inch measure. A 4.7 inch screen at 720p has around 312 ppi, and upping that to 1080p at 5 inches or so gives a PPI of around 415. For comparison, photographs are typically suggested to be printed at 300 dpi or higher when they will be viewed up close. Thinking about it another way, it really all comes down to how big an area of your vision the screen takes up: at arm's length the difference isn't going to be noticeable, but when you get in close, say watching in bed with the screen a few inches from your face, it becomes quite noticeable.
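Those ppi figures are just the diagonal pixel count divided by the diagonal size; a minimal sketch, using the screen sizes mentioned above:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch along the diagonal of a display."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'720p  on a 4.7" screen: {ppi(1280, 720, 4.7):.0f} ppi')   # ~312
print(f'1080p on a 5.0" screen: {ppi(1920, 1080, 5.0):.0f} ppi')  # ~441; about 415 at ~5.3"
```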

