The TV Frame Game
Thanks to another of those techie competing-standards stories (the TL;DR being that the NTSC TV standard was considered a bit rubbish on this side of the pond, so in Europe we developed two alternatives, PAL and SECAM), the UK and the USA ended up with two somewhat incompatible TV systems. In the USA they had TV pictures with a vertical resolution of 480 lines playing at 30 frames per second, whilst on this side of the Atlantic we were watching a higher resolution 576 line picture, but at 25 frames per second. The TV companies had ways of converting pictures between the two standards, and eventually we got home video recorders that could play tapes recorded in the other standard, and TVs that could cope with both. Indeed, these days in the UK you'll find most DVD or BluRay players and TVs will quite happily switch between the European 50Hz standards and the North American 60Hz, whatever the standard of the material put into the machine.
When the HD standards came around there seemed to be general agreement across the world, and everybody settled on 720 lines or 1080 lines for high definition pictures and all seemed right with the world… Or maybe not…
That brings us to me watching a video last night which involved a number of shots of trains going left to right or right to left across the screen, with a really annoying judder as the trains went past. I was watching an HD video file playing back on our Apple TV through Plex. Thinking it was a problem with the Apple TV I tried it through Plex on our Xbox One: same problem. Watching the raw file on the desktop: same problem again. Looking at the file, it had come from a UK production company and was encoded in 1080p at 25 frames per second, a perfectly standard UK file. So I took a look at the Apple TV. Digging into the settings, I had the picture standard set to Auto, and further down it said it had automatically set itself to 1080p 60Hz. There was also an option to specify which picture format to use, with a 1080p 50Hz option, so I switched that over, watched the file again, and away went the judder. Switch back to Auto and the Apple TV would decide to go back to 1080p 60Hz.
The basic problem seems to be that, unlike the DVD players, video recorders or BluRay players, the latest generation of devices like the Apple TV or Xbox, even though many are capable of switching resolution, automatically go for 1080p 60Hz and then behave as if the TV they're connected to is a dumb panel that can't cope with any other standard; as a result they try to convert video at any other frame rate in software. The judder I could see on the video is the result of the Apple TV or Xbox trying to show 25 frames per second on an output wanting 30 frames per second: to make up the five extra frames each second, 20% of the source frames have to be shown twice, and on smooth movements those repeated frames show up as judder. Knowing my TV is a European model that can cope with a 50Hz picture I can switch the Apple TV over and it works fine (not so for the Xbox incidentally), but then if I watch a North American video at 30 frames per second the Apple TV is locked at 50Hz and has much the same problem, trying to squeeze 30 frames into what is effectively a 25 frame per second output.
At this point the cinema purists are going to point out that there is another very common frame rate, which is 24 frames per second, the frame rate most movies are made at; many BluRays are now released at that standard because, again, a lot of TV sets these days will cope with it. So what do the Apple TV, Xbox and other TV streamer boxes do? They try to show those 24 frames at whatever frame rate the box is currently set to, and have exactly the same problem.
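The mismatch is really just arithmetic, and it can be sketched in a few lines of Python. To be clear, this is only an illustration of the cadence problem, not how any particular box actually implements its conversion; the function name and the floor-based rounding are my own choices:

```python
import math

def refresh_cadence(source_fps: int, display_hz: int) -> list[int]:
    """Return how many display refreshes each source frame is held for
    over one second of video. A steady cadence (all 2s) plays smoothly;
    a mix of 2s and 3s is what shows up as judder on smooth motion
    like a train passing across the screen."""
    # Cumulative refresh count at the end of each source frame,
    # rounded down to a whole refresh.
    ticks = [math.floor(i * display_hz / source_fps)
             for i in range(source_fps + 1)]
    # The difference between successive ticks is how long each frame is held.
    return [b - a for a, b in zip(ticks, ticks[1:])]

print(refresh_cadence(25, 50))  # all 2s: every frame held twice, smooth
print(refresh_cadence(25, 60))  # 2s and 3s: 10 of the 25 frames get an extra refresh
print(refresh_cadence(24, 60))  # alternating 2s and 3s: the classic 3:2 pulldown
print(refresh_cadence(30, 50))  # uneven 1s and 2s: 30fps content judders on a 50Hz panel
```

Run against a 50Hz display, the 25 frames per second cadence comes out perfectly even, which is exactly why switching the Apple TV to 1080p 50Hz made the judder disappear.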
Going through my digital videos I have a real mixed bag. Most of the UK stuff is 25 frames per second, some, where it has come off film, is 24 frames per second, and the US stuff is mostly 30 frames per second. Looking at my home videos I have the same mixed bag, primarily because, even though they're all UK-bought devices, the cameras and phones I've had over the years don't always produce UK-standard video; for example, iPhones using the standard camera software will consistently record in 60Hz standards, and you have to resort to apps like Filmic to get the phone to record in European 50Hz standards, or even 24 frames per second if you want to work with cinema standards.
So even though the world has agreed on the size of the picture, there is still no agreement over how many of those pictures are shown per second. Most of our digital streaming boxes either only work at the US 60Hz standard (the earliest Sky Now boxes were stuck at 60Hz) or are switchable but, thanks to the software, difficult to switch: on the Apple TV you have to go rummaging in the settings, and on the Xbox you effectively have to con the machine into thinking your TV can only do 50Hz pictures before it will switch. The result is devices doing a second-rate job of conversion when your TV is quite often perfectly capable of playing things back correctly.
Having one standard is never going to work, as we'll still have vast amounts of archive content at the older frame rates, so for the moment it would really help if the digital streamer manufacturers actually started acknowledging that there are a variety of standards; even your average US consumer who doesn't have any 50Hz content is going to notice glitching when they watch a 24 frames per second movie. We've had DVD players and video recorders that could switch for years, so why has the new tech taken such a massive step backwards?
Originally published at Exigency In Specie.