Does the average video bitrate make a difference with Blu-ray discs? Many of those interested in such matters have a strong opinion, and often that opinion is an emphatic 'Yes it does!' Indeed, it is often 'Of course it does!'
But I contend that this is actually a subject upon which very few people can have a truly valid opinion. By 'valid', I mean one with firm evidence to back it.
Oh, of course the more bits used to define the signal within a given lossy system, the more accurate it will be. There can be no disputing that. But compression systems like MPEG2, MPEG4 AVC and VC1 are perceptual encoders. They are designed to throw away information that you cannot see, or are less likely to see, anyway. Note: that is their design. It may not be their outcome.
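To make that 'throwing away' concrete, here is a minimal sketch, in Python with NumPy and SciPy, of the lossy step those codecs share: transform a block of pixels, then quantise the coefficients. It is an illustration only, not any encoder's actual code; the 8x8 block, the quantiser steps and the print-out are all my own stand-ins. The coarser the quantiser (that is, the fewer bits spent), the more of the fine, high-frequency detail gets rounded to zero and discarded for good.

```python
# Simplified illustration of the lossy step in block-transform codecs.
# Not real encoder code: real codecs add prediction, entropy coding,
# perceptual weighting and much more.
import numpy as np
from scipy.fft import dctn, idctn

def encode_block(block, qstep):
    """Forward DCT, then divide-and-round: what is lost here is gone for good."""
    coeffs = dctn(block.astype(float), norm="ortho")
    return np.round(coeffs / qstep)

def decode_block(quantised, qstep):
    """Rescale and inverse DCT to get the block the viewer actually sees."""
    return idctn(quantised * qstep, norm="ortho")

rng = np.random.default_rng(0)
block = rng.integers(0, 256, (8, 8))      # a stand-in 8x8 luma block

for qstep in (2, 8, 32):                   # coarser step = fewer bits spent
    q = encode_block(block, qstep)
    recon = decode_block(q, qstep)
    err = np.abs(recon - block).mean()
    print(f"qstep={qstep:3d}  nonzero coeffs={np.count_nonzero(q):2d}  mean error={err:.1f}")
```

Running it shows the trade-off in miniature: at the coarsest step nearly every coefficient is zeroed and the reconstruction error grows, which is exactly the kind of loss a perceptual encoder gambles you will not notice.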
So, the question remains: can these differences be perceived?
Here I shall be offering several blind trials to test just that. All of these will be limited trials, in the sense of examining only one of the quality markers of movies: whether the differences are noticeable on still frames. It is possible that differences may not be detectable between still frames from two versions of the movie, yet be quite apparent when the moving footage is viewed. Still, we do what we can.
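For anyone curious how such matched stills might be produced, here is a rough sketch assuming you have playable files of the two versions and ffmpeg to hand; the file names and the timestamp are hypothetical. Grabbing the same timestamp from each version means any visible difference comes from the encode, not from the frame chosen.

```python
# Pull one still frame from each of two versions of the same movie at the
# same timestamp, saving lossless PNGs for side-by-side (or blind) comparison.
import subprocess

def grab_frame(source, timestamp, outfile):
    """Ask ffmpeg for a single frame at the given timestamp, written as PNG."""
    subprocess.run(
        ["ffmpeg", "-y", "-ss", timestamp, "-i", source,
         "-frames:v", "1", outfile],
        check=True,
    )

# Hypothetical source files representing the two encodes under test.
for label, source in [("version_a", "disc_a.m2ts"), ("version_b", "disc_b.m2ts")]:
    grab_frame(source, "00:42:17.000", f"{label}.png")
```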
Obviously, to perform the test requires two versions of the same movie. There are limited choices at this point, but I shall be adding more over time. Below are links to the available trials. I'm hoping to offer modest prizes from time to time with some of these, and these are noted in the table below.
So, click on a link and try a trial.
| Movie | Company | Date commenced | Remarks |
|---|---|---|---|
| Natural Born Killers | Warner Bros | 30 October 2009 | Competition closed, prizes claimed. |
| The Book of Eli | Sony | 5 January 2010 | N/A |