
Giuanniello (original poster)
I'm a total novice with regard to video; in years and years of photography and smartphones, along with a couple of GoPros, I've shot barely ten videos total...

Question: I download video clips, and some have bitrates of, say, 3,000 kbps, some 1,200 (mainly old ones), and some carry huge numbers (and sizes) going well above 6,000, if not higher. Sometimes I shrink the ones I want to store, and I understand these apps work on the bitrate, but I wonder what happens quality-wise. Given an HD video, what does shrinking it from, say, 6,000 kbps to 3,000 kbps do to the clip?

Thanks
 

R S K
It's exactly the same as with photography, except that a video bitrate is file size per second.

What happens if you drop a JPEG from 100% to 50% quality? Exactly. It gets worse, because you're removing ever more information for the sake of file size. That's how lossy compression works: artifacts, banding, etc. set in the further you go. Of course, a slew of factors determine when and whether any of that becomes visible with any particular file, such as the original quality and file type, the codec coming in and going out, etc. There's no one-size-fits-all answer.
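To make the analogy concrete, here's a minimal Python sketch, assuming Pillow is installed and using a placeholder file name, that re-saves one photo at decreasing JPEG quality so you can watch the file size drop and the artifacts creep in:

```python
# Minimal sketch: re-save the same image at decreasing JPEG quality
# settings to see how file size (and, visually, quality) falls off.
# Assumes Pillow is installed (pip install Pillow); "photo.jpg" is a placeholder.
import os
from PIL import Image

img = Image.open("photo.jpg")
for quality in (100, 75, 50, 25):
    out = f"photo_q{quality}.jpg"
    img.save(out, "JPEG", quality=quality)
    size_kb = os.path.getsize(out) / 1024
    print(f"quality={quality:3d}  ->  {size_kb:8.1f} KB")
```

Opening the outputs side by side shows the same banding and blocking the post describes, long before the file size hits zero.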
 

Giuanniello (original poster)
OK, let's put it another way: the JPEG compression example fits my brain up to a certain point. If I have a 1920x1080 video at 3,000 kbps and I lower it to 2,000 kbps, am I going to notice this regardless of the device used to show it, or is it device-dependent? Even simpler: if I have to show a photo on a computer or a mobile screen, I won't use 250 dpi; maybe 72 or 96 dpi on a computer screen. Whereas if I have to print, and print big, the higher the resolution the better, depending on final print size. How does this apply to a movie watched on a computer or an HD TV screen?

Thanks
 

R S K
OK, let's put it another way: the JPEG compression example fits my brain up to a certain point. If I have a 1920x1080 video at 3,000 kbps and I lower it to 2,000 kbps, am I going to notice this regardless of the device used to show it, or is it device-dependent?
Again: there is no one answer. It depends on all the various factors mentioned above. No two codecs compress alike! E.g. 2,000 kbps with H.264 is something entirely different from 2,000 kbps with HEVC; the HEVC clip will look significantly better, even at lower bitrates. Every codec has its own quality : size : performance ratio, and none is best at all three. It's like the adage: "Cheap, fast, good. Pick two."

Currently, HEVC, for example, would be best in terms of quality at very small sizes. But depending on hardware/software, playback performance can struggle, and encoding will take an inordinate amount of time compared to, e.g., H.264 or ProRes if you don't have hardware encoding support (say, at least an M1 Pro).

Which bitrate would be best, no one can say but you, by simply testing. There, again, various options come into play that you can use or not: entropy coding, VBR vs. CBR, keyframe interval, single- vs. multi-pass, etc. If none of that means anything to you, simply use whatever app with its default settings and see what works for you. Once again, not a question anyone can answer for you.
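As an illustration of that kind of testing, here's a minimal Python sketch, assuming an ffmpeg build with libx264 and libx265 is on the PATH and using placeholder file names, that encodes the same clip with both codecs at the same average bitrate so you can compare quality per megabyte yourself:

```python
# Sketch: transcode one clip to H.264 and to HEVC at the same average
# bitrate, for a side-by-side quality comparison.
# Assumes ffmpeg (with libx264 and libx265) is installed; "in.mp4" is a placeholder.
import subprocess

BITRATE = "2000k"  # same target for both codecs

for codec, out in (("libx264", "out_h264.mp4"), ("libx265", "out_hevc.mp4")):
    subprocess.run([
        "ffmpeg", "-y", "-i", "in.mp4",
        "-c:v", codec, "-b:v", BITRATE,
        "-c:a", "copy",          # leave the audio track untouched
        out,
    ], check=True)
```

Play both outputs on the screen you actually watch on; that, not any number, tells you which bitrate you can live with.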


Even simpler: if I have to show a photo on a computer or a mobile screen, I won't use 250 dpi; maybe 72 or 96 dpi on a computer screen.
DPI is entirely irrelevant in video; that's exclusively a print thing. Whether you tag a 1920x1080 image (or video, if that were a thing) with 3,000 DPI or even 3 DPI makes zero difference to image quality. If anything, PPI is the relevant measure for the display showing the video, but even that changes nothing about the video's actual underlying quality.
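A quick way to convince yourself of that: the sketch below, assuming Pillow and using placeholder file names, saves identical pixels with wildly different DPI tags and shows the image data doesn't change at all:

```python
# Sketch: same pixels, wildly different DPI metadata. The pixel data is
# byte-for-byte identical; DPI is just a tag that printers read.
# Assumes Pillow; the output file names are placeholders.
from PIL import Image

img = Image.new("RGB", (1920, 1080), "gray")
img.save("tag_3dpi.png", dpi=(3, 3))
img.save("tag_3000dpi.png", dpi=(3000, 3000))

a = Image.open("tag_3dpi.png")
b = Image.open("tag_3000dpi.png")
print(a.tobytes() == b.tobytes())  # True: the image itself is unchanged
```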
 

ColdCase
OK, let's put it another way: the JPEG compression example fits my brain up to a certain point. If I have a 1920x1080 video at 3,000 kbps and I lower it to 2,000 kbps, am I going to notice this regardless of the device used to show it, or is it device-dependent? ..
Bitrate does affect video quality, but it’s not the only thing that affects video quality. Resolution and frame rate are key factors, too.

Low bitrates generally reduce quality; whether you notice is personal and depends on your eye, like reducing the number of pixels in a photo. Typically you notice it most on a 65-inch display or a large print: artifacts you would never see on an iPhone screen or a wallet-sized photo.

There have been rules of thumb developed over the years, but your thumb may vary. You need to either experiment to find the right compromise or trust the rules of thumb.

There are formats that encode every frame in full (all-intra codecs such as ProRes), but most delivery video first sends a complete frame, all the pixels, and then only sends the changes in subsequent frames. So video is typically variable bitrate: more bits when there is lots of motion, very few when nothing changes. It's quite a bit more complicated than that, of course.

Generally, constraining the bitrate below the raw rate requires some form of compression, or a reduction in resolution or frame rate. Whether that's noticeable depends on the content and your eye.
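For a sense of scale, here's that back-of-the-envelope arithmetic as a small Python sketch; 1080p at 30 fps with 8-bit RGB and a ~5 Mbps delivery rate are illustrative assumptions, not recommendations:

```python
# Sketch: how big the "pipe" would need to be with no compression at all,
# vs. a typical delivery bitrate, for 1080p at 30 fps.
width, height, fps = 1920, 1080, 30
bits_per_pixel = 24                      # 8-bit RGB, no chroma subsampling

raw_bps = width * height * bits_per_pixel * fps
print(f"uncompressed: {raw_bps / 1e6:,.0f} Mbps")   # ~1,493 Mbps

delivery_bps = 5_000_000                 # a common ~5 Mbps streaming rate
print(f"compression factor: ~{raw_bps / delivery_bps:,.0f}:1")  # ~300:1
```

A roughly 300:1 reduction is why the codec's choices about what to throw away matter so much.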
 

Giuanniello (original poster)
So what is the point, for a video clip shown on a computer screen or a TV, of having 10,000 kbps vs. 5,000 kbps? Old clips often barely reach 1,200-2,000 kbps, but nowadays they easily get past 5,000. I understand this is also due to the increased capability of capture hardware, but is that high a bitrate necessary? Or better: is it necessary when the clip will be confined to a certain viewing size/distance?

Thanks
 

R S K
So video is typically variable bitrate
Far from it. There are a lot of CBR-only codecs, ProRes being one of them. Many, if not most, others are at least optionally CBR, such as H.264 and HEVC.
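In practice, a near-constant rate with x264 is usually approximated by capping the encoder's rate-control buffer rather than by a strict CBR mode. A hedged sketch, assuming ffmpeg with libx264 and placeholder file names:

```python
# Sketch: near-constant-bitrate H.264 encode by capping maxrate/bufsize.
# Constraining the VBV buffer keeps the rate close to the target without
# a strict CBR mode. Assumes ffmpeg with libx264; file names are placeholders.
import subprocess

subprocess.run([
    "ffmpeg", "-y", "-i", "in.mp4",
    "-c:v", "libx264",
    "-b:v", "5M",          # target average bitrate
    "-maxrate", "5M",      # don't exceed the target...
    "-bufsize", "2M",      # ...within this rate-control buffer window
    "-c:a", "copy",
    "cbr_ish.mp4",
], check=True)
```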


So what is the point, for a video clip shown on a computer screen or a TV, of having 10,000 kbps vs. 5,000 kbps?
Quality, plain and simple.

Again, if you are familiar with even the most basic concepts of photo/image compression, the exact same principles apply, only with an additional temporal component. Higher compression, lower quality. Easy. The actual image content matters too: it determines when heavier compression is OK (talking-head videos, for example) and when it isn't (gradients, lots of frame-to-frame change), as does the color depth needed. And obviously the way it's viewed can be just as relevant.

Coming full circle: there is therefore no one answer. There are, at best, very general guidelines depending on whether quality or size matters more.
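One very general guideline of that sort is to compare clips by bits per pixel per frame rather than by raw bitrate, so different resolutions and frame rates become roughly comparable. A small sketch; the metric and the numbers in it are loose rules of thumb, not authoritative thresholds:

```python
# Sketch: "bits per pixel per frame" as a rough way to compare bitrates
# across resolutions/frame rates. Values around or above ~0.1 bpp are
# often cited as comfortable for H.264 delivery, but that is only a
# loose rule of thumb, not an authoritative number.
def bits_per_pixel(bitrate_bps: float, width: int, height: int, fps: float) -> float:
    return bitrate_bps / (width * height * fps)

for label, rate in (("old clip", 1_200_000), ("typical", 5_000_000), ("high", 10_000_000)):
    bpp = bits_per_pixel(rate, 1920, 1080, 30)
    print(f"{label:8s} {rate / 1e6:4.1f} Mbps -> {bpp:.3f} bpp")
```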
 

ColdCase
Far from it. There are a lot of CBR-only codecs, ProRes being one of them. Many, if not most, others are at least optionally CBR, such as H.264 and HEVC.
True; I was just saying that most video you watch on a screen, whether streaming or off a DVD or a hard drive, is VBR.

But there is no one answer.
 

ColdCase
So what is the point, for a video clip shown on a computer screen or a TV, of having 10,000 kbps vs. 5,000 kbps? Old clips often barely reach 1,200-2,000 kbps, but nowadays they easily get past 5,000. I understand this is also due to the increased capability of capture hardware, but is that high a bitrate necessary? Or better: is it necessary when the clip will be confined to a certain viewing size/distance?

Thanks
You only need as high a bitrate as you want to see. Older video (at, say, 2,000 kbps) is compressed and doesn't carry as much resolution, color depth, dynamic range, or frame rate as newer material. If you can't see the difference, or your viewing equipment can't handle it, then it's wasted bandwidth.
 

R S K
Older video (at, say, 2,000 kbps) is compressed and doesn't carry as much resolution, color depth, dynamic range, or frame rate as newer material.

Once again: irrelevant without knowing even the first thing about the material. Whether 1,000, 2,000, or 150,000 kbps is completely meaningless without the slightest bit of context, i.e., knowing what we're even talking about in terms of dimensions, or what codec/data rate it's coming from, etc.

I can easily blow up a 500 kbps clip to 5 Mbps and gain nothing. Just as I can crunch a 5 Mbps clip down to 500 kbps without any perceptible difference if it's tiny and horribly compressed to begin with. Or if it's coming from 10-bit 4K ProRes and going to 8-bit HD HEVC, or whatever other endlessly possible scenario.

We're talking in circles here.
 

ColdCase
I think we are in violent agreement, but the OP is not as well versed, and my attempt to simplify seems to have rubbed you the wrong way.

Yes, bitrate is not the entire story. In fact, bitrate just tells you how big a pipe you need to view the video in real time; it is not a measure of picture quality. It's just the math of the number of video bits per frame times the number of frames per second.

But if everything else is held constant, what does varying the size of the bit pipe between source and display give you? When the pipe is big enough, it doesn't matter. When it's too small, frames are dropped and the video can stutter. Buffers are used to smooth out the stutter, in which case the display is delayed until enough bits are stored up. You see this a lot with streaming.
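The buffering arithmetic is simple enough to sketch; the clip length and rates below are made-up numbers for illustration:

```python
# Sketch: how long you'd have to pre-buffer when the pipe is smaller than
# the video's bitrate. Pure arithmetic, no real networking.
video_mbps, pipe_mbps, duration_s = 8.0, 5.0, 600   # a 10-minute clip

shortfall_megabits = (video_mbps - pipe_mbps) * duration_s  # megabits short overall
prebuffer_s = shortfall_megabits / pipe_mbps                # time to bank that many bits
print(f"pre-buffer ~{prebuffer_s:.0f} s for uninterrupted playback")  # ~360 s
```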

The original video can have a large number of pixels per frame, and at 60+ frames per second it can produce huge file sizes that require huge pipes, which overwhelm inexpensive hardware. There are several methods that reduce a video's bitrate enough to fit a smaller, cheaper, more practical pipe. That reduction may hurt perceived picture quality, especially if heavy compression is used and you are viewing the video on a 65-inch screen.

In the end, the difference in quality due to pipe-size constraints is in the eye of the beholder. It could be insignificant, but that's a judgment call.
 

Giuanniello (original poster)
Good morning. OK, almost got it: it's a matter of perception, if I understand correctly. Even in photography I always shoot uncompressed raw in case I have to print and need all the possible information. In practice, I've had crops of a JPEG printed at 20x20, and even the guy who prints for me, a pro, was truly satisfied (and astonished at the beauty of the female subject 😁) despite the compression and cropping. In most other cases I send him uncompressed 14-bit files, but unless there are, say, big dynamic-range excursions, it's hard to tell 14-bit from 10-bit, or an uncompressed TIFF from a JPEG; of course this also depends on the size of the output.

At the end of the day, all of this is because I want to understand. I found a hard drive filled with video clips, some in MP4, some in MKV, AVI, or WMV, and wanting to save space I am converting them all to MP4 (not the latest codec, as my computers are quite old and both encoding and playback would suffer). During conversion I come across old files that barely reach 1,200 kbps, which I leave as they are, while more recent ones can reach 10,000 kbps, in which case I reduce the bitrate to the point where quite a bit of space is saved.

Thanks again
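For reference, the batch pass described above could look roughly like this Python sketch; it assumes ffmpeg and ffprobe are installed, and the folder name, bitrate ceiling, and target rate are placeholders, not recommendations:

```python
# Sketch of the batch pass described above: walk a folder, re-encode
# anything over a bitrate ceiling to H.264 MP4, leave the rest alone.
# Assumes ffmpeg/ffprobe on the PATH; paths and rates are placeholders.
import json
import subprocess
from pathlib import Path

CEILING_BPS = 5_000_000          # leave clips under this as they are
SRC = Path("clips")              # placeholder source folder

def video_bitrate(path: Path) -> int:
    """Read the container-level bitrate via ffprobe."""
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_format", str(path)],
        capture_output=True, text=True, check=True,
    ).stdout
    return int(json.loads(out)["format"].get("bit_rate", 0))

for clip in SRC.iterdir():
    if clip.suffix.lower() not in (".mp4", ".mkv", ".avi", ".wmv"):
        continue
    if video_bitrate(clip) <= CEILING_BPS:
        continue  # already small enough, as the post describes
    subprocess.run(
        ["ffmpeg", "-y", "-i", str(clip),
         "-c:v", "libx264", "-b:v", "3M", "-c:a", "aac",
         str(clip.with_suffix(".small.mp4"))],
        check=True,
    )
```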
 