Proof Samsung Doesn’t Understand Its Own Technology

This afternoon, Samsung posted the photo above on Google+, Google’s competitor to Facebook, with the caption “What’s the difference between Full High Definition TV and #UHD TV? See for yourself.” It is one of the worst examples I’ve seen of comparing Full HD TV with the new standard the industry is pushing, Ultra HD, also known as 4K. Allow me to explain.

Under FHD, we see 1080×1920. These numbers are the height and width, in pixels, of the screen — in other words, its resolution. Under UHD, we see 2160×3840, a higher resolution than FHD. All this illustrates is that a UHD television has more pixels, i.e., a higher resolution. With higher resolution comes more detail in the picture: the blades of grass are more defined, wrinkles in people’s faces and clothes are more detailed and visible, and so on.
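To put concrete numbers on that resolution difference, here is a quick back-of-the-envelope calculation (the figures come straight from the resolutions above; nothing else is assumed):

```python
# Resolutions as listed in Samsung's graphic, height x width.
fhd = (1080, 1920)
uhd = (2160, 3840)

fhd_pixels = fhd[0] * fhd[1]  # total pixels on a Full HD panel
uhd_pixels = uhd[0] * uhd[1]  # total pixels on a UHD panel

print(f"FHD: {fhd_pixels:,} pixels")                              # FHD: 2,073,600 pixels
print(f"UHD: {uhd_pixels:,} pixels")                              # UHD: 8,294,400 pixels
print(f"UHD has {uhd_pixels // fhd_pixels}x the pixels of FHD")   # 4x
```

Doubling both dimensions quadruples the pixel count — that, and only that, is what the two labels in the image actually tell you.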

What’s wrong with this illustration? The image shows a football player spanning both halves of the image. On the FHD side he is blurred; on the UHD side he is not. If you look closely, the blur runs only left-to-right — that is motion blur. Motion blur occurs when objects move and is entirely a camera problem, not a TV problem. It is the result of the camera’s shutter staying open too long to capture the movement sharply, which leaves a streak-like image along the direction of motion. Resolution can never fix this, yet Samsung is trying to say that higher resolution will.
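A minimal sketch makes the point that blur is a shutter problem, not a pixel problem. The player speed and shutter times below are hypothetical numbers chosen purely for illustration:

```python
def blur_streak_px(speed_px_per_s: float, shutter_s: float) -> float:
    """Length of the motion-blur streak baked into a single frame.

    While the shutter is open, the subject keeps moving; everywhere it
    passed gets exposed, producing a streak. The streak length depends
    on subject speed and shutter time -- the display's resolution never
    appears in the formula.
    """
    return speed_px_per_s * shutter_s

# A player crossing the frame at 1,000 px/s (hypothetical figure):
print(blur_streak_px(1000, 1 / 50))   # 20.0 px streak at a 1/50 s shutter
print(blur_streak_px(1000, 1 / 500))  # 2.0 px streak at a 1/500 s shutter
```

A faster shutter shrinks the streak; adding pixels to the TV only renders the same streak more sharply.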

The only argument any TV manufacturer can make that UHD delivers a “smoother looking image” (notice I didn’t say less blur) involves a higher frame rate or refresh rate, which this image doesn’t address. Even that argument is weak because, again, the image is limited to what the camera recorded. A higher frame rate or refresh rate means you see more images per second, which can create the illusion of smoother motion, but it still won’t correct blur. And if a TV shows more images per second than were recorded, you aren’t gaining any new images — it is simply repeating (or interpolating) the ones it has. Movie and TV production record to set standards, typically 24 frames per second for film or 29.97 frames per second for TV. Some productions record at 60 frames per second, but only under special circumstances; higher frame rates tend to give video a “home movie” look, so the industry sticks to tried-and-true standards most of the time.
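To see why a fast display can’t conjure images that were never recorded, here is a sketch of the simplest mapping of 24 fps source material onto a 60 Hz display — repeating frames in the classic 3:2 pulldown cadence (the function name and the whole-number rates are simplifications for illustration):

```python
def pulldown_schedule(source_fps: int, display_hz: int) -> list[int]:
    """For one second of playback, return which recorded source frame
    each display refresh shows. Each refresh reuses whichever recorded
    frame is current at that instant -- no new images are created."""
    return [tick * source_fps // display_hz for tick in range(display_hz)]

schedule = pulldown_schedule(24, 60)
print(len(schedule))       # 60 refreshes in one second...
print(len(set(schedule)))  # ...but still only 24 distinct recorded frames
print(schedule[:10])       # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3] -- the 3:2 cadence
```

The display refreshes 60 times a second, but the set of distinct images never exceeds what the camera captured. Motion-interpolating TVs synthesize in-between frames instead of repeating them, which is exactly what produces the “home movie” look mentioned above.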

If you want to see for yourself, Samsung’s original posting can be found here:

The comments show a lot of criticism from those who know how the technology works. The fact that Samsung either doesn’t understand its own technology or is preying on the stupidity of its customers worries me.
