tekjae.blogg.se

1080p or 720p






  1. #1080p or 720p 720p#
  2. #1080p or 720p 1080p#
  3. #1080p or 720p software#

#1080p or 720p software#

It seems that you are considering 14 bit video, as in 14 bits per colour channel. If I assume you are talking about uncompressed video, for RGB and for YCbCr at 4:4:4 that means you need:

width x height x 3 x 2 x fps bytes per second

You need 2 bytes to store 14 bits - it is unlikely (but not impossible) that the software is going to be packing multiple 14-bit values together without gaps in between. Check QuickTime's v216 format for an example of how 14 bits are typically encoded - 2 bytes.

For YCbCr 4:2:2 you will get 2 pixels encoded in 8 bytes (4 channels = Y0, U, V, Y1 at 2 bytes each); in this case the rate is:

(width/2) x height x 8 x fps bytes per second

I'm assuming that your images are an even number of pixels across. A worked example of both rates is sketched below.

If you are recording RAW then you need to know what that really means: is there any compression or not? You also need to know the colour space, RGB or YCbCr, and whether chrominance subsampling is involved. It's only realistic to assume that any compression is lossless, and in a real-world example we would be looking at the worst-case scenario of not being able to compress any better than the original amount of data. That's also assuming that such compression doesn't expand the data.
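To put some numbers on the two formulas above, here is a quick Python sketch. The frame sizes and the 24 fps frame rate are only examples, and it assumes every 14-bit sample really is stored as a whole 2-byte word with no bit packing:

    def rate_444(width, height, fps):
        # RGB or YCbCr 4:4:4 - 3 channels at 2 bytes each per pixel
        return width * height * 3 * 2 * fps

    def rate_422(width, height, fps):
        # YCbCr 4:2:2 - 2 pixels in 8 bytes (Y0, U, V, Y1 at 2 bytes each),
        # assuming the width is an even number of pixels
        return (width // 2) * height * 8 * fps

    for width, height, label in [(1280, 720, "720p"), (1920, 1080, "1080p")]:
        print(label, "4:4:4:", rate_444(width, height, 24) / 1e6, "MB/s")
        print(label, "4:2:2:", rate_422(width, height, 24) / 1e6, "MB/s")

That works out to roughly 133 MB/s (720p) and 299 MB/s (1080p) for 4:4:4, and roughly 88 MB/s and 199 MB/s for 4:2:2, before any compression.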

#1080p or 720p 720p#

Thinking about it another way, it really all comes down to how big an area of your vision the screen takes up. So using a 50 inch screen as a guideline, if the display takes up the area of your vision that a 50 inch TV takes when sitting 8 to 10 feet away, then you won't see better than 720p with 20/20 vision. If it is taking up more than that area, then you will easily be able to see the difference. At arm's length it isn't going to be noticeable, but when you get in close, say watching in bed with the screen a few inches from your face, the difference will become quite noticeable.
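A rough way to scale that 50 inch guideline to other screen sizes: a screen fills the same share of your vision when the viewing distance shrinks in proportion to its diagonal (assuming the same aspect ratio). The 5 inch phone size in this Python sketch is just an example:

    def equivalent_distance_feet(diagonal_inches, reference_diag=50.0,
                                 reference_dist=(8.0, 10.0)):
        # Distance at which this screen fills the same area of your vision
        # as a 50 inch TV viewed from 8 to 10 feet away.
        scale = diagonal_inches / reference_diag
        return tuple(d * scale for d in reference_dist)

    print(equivalent_distance_feet(5))   # about (0.8, 1.0) feet

So a 5 inch screen only fills that much of your vision when it is held roughly a foot away or closer, which is why the difference shows up when the phone is near your face.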

#1080p or 720p 1080p#

Closer in, it might be easier to use a pixels per inch measure. Photographs are typically suggested to be printed at 300 dpi or higher when viewed up close. A 4.7 inch screen at 720p has around 312 ppi; upping that to a 1080p resolution at 5 inches or so gives a PPI of 415.
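Those figures come from the usual pixels-per-inch formula: ppi = diagonal resolution in pixels / diagonal in inches. A small Python check (the 5.3 inch figure is simply the panel size at which the quoted ~415 ppi works out, not something stated in the original):

    import math

    def ppi(width_px, height_px, diagonal_inches):
        # Diagonal resolution in pixels divided by the diagonal in inches
        return math.hypot(width_px, height_px) / diagonal_inches

    print(round(ppi(1280, 720, 4.7)))    # ~312 ppi for a 4.7 inch 720p screen
    print(round(ppi(1920, 1080, 5.3)))   # ~416 ppi for a ~5.3 inch 1080p screen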

1080p or 720p

As far as screen size for noticing the difference between 720p and 1080p, it doesn't have to be big. Viewing distance makes a bigger difference, but it's fully possible to tell the difference between 1080p and 720p even on screens that are only inches in size, as long as they actually support 1080p and are viewed from fairly close. While I wasn't able to find a CC version, Carlton Bale has a nice graph on his site that shows the screen sizes and distances at which 20/20 vision can resolve the differences. It unfortunately only goes from 20 inch displays up, though, and note that someone with better than 20/20 vision would see the differences from further out.

As for the data rate, keep in mind that it is 14 bit color and there are 8 bits to a byte, so the uncompressed data rate would be height * width * 14 bits per pixel / 8 bits per byte * fps. That said, I'm not sure if they store it completely uncompressed: RAW images are generally compressed significantly, they are just compressed losslessly, and I'm not sure whether the MagicLantern hack uses compression or not.
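Plugging some example frame sizes into that formula (the 24 fps frame rate is only illustrative, not something the answer specifies):

    def raw_rate_bytes_per_sec(width, height, fps, bits_per_pixel=14):
        # height * width * 14 bits per pixel / 8 bits per byte * fps
        return width * height * bits_per_pixel / 8 * fps

    for width, height, label in [(1280, 720, "720p"), (1920, 1080, "1080p")]:
        print(label, raw_rate_bytes_per_sec(width, height, 24) / 1e6, "MB/s")

That is roughly 39 MB/s for 720p and 87 MB/s for 1080p as a worst-case uncompressed figure; any lossless compression in the RAW recording would only bring it down from there.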







