Why Do Pictures of a Digital Screen Look So Different From Real Life?

A good example of how captured images differ from real life is a picture of a computer screen or a mobile phone screen. The colors on it look nothing like what we see in reality. The following top-rated answers from the internet explain why.

1.

Your brain does an enormous amount of image processing on what your eyes take in and presents the results as what you “see” (optical illusions are often designed to expose this processing). The camera, by contrast, takes millions of tiny samples of what’s actually there at one given instant in time.

Most of the time these are close enough, but computer screens use some tricks of their own to display an image, and the camera can’t reproduce those tricks.

The big two are:

  • The screen is made up of very tiny red, green, and blue color spots that end up being similar in size to the red, green, and blue samplers in the camera. That near-match in scale creates moiré (see the sketch after this list).
  • Further, older screens updated line by line, so the camera captured only the parts of the screen lit by that line at the instant of exposure, while your brain remembers the prior image and smooths between the two.
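
Here is a minimal sketch of the first point, assuming Python with NumPy; the pitch values are made-up numbers, not measurements of any real screen or sensor. Sampling one fine periodic pattern (the screen’s pixel grid) with a grid of a slightly different pitch (the camera’s sensor) produces a slow “beat” pattern, which is the moiré you see in the photo.

```python
# A minimal sketch (assuming NumPy) of why two slightly mismatched
# pixel grids produce moire: sampling one periodic pattern with a
# grid of a slightly different pitch creates a low-frequency "beat".
import numpy as np

screen_pitch = 1.0   # distance between screen pixels (arbitrary units)
camera_pitch = 1.05  # camera photosites are 5% wider in this toy example

# Intensity across one row of the screen: bright at each pixel center,
# sampled only at the positions where the camera's photosites sit.
x = np.arange(0, 400, camera_pitch)
screen_row = 0.5 + 0.5 * np.cos(2 * np.pi * x / screen_pitch)

# The sampled signal oscillates slowly even though the screen pattern
# is fine: the beat period is pitch_a * pitch_b / |pitch_a - pitch_b|.
beat_period = screen_pitch * camera_pitch / abs(camera_pitch - screen_pitch)
print(f"expected moire period: {beat_period:.1f} units")

# Crude check: measure the spacing between brightness peaks in the samples.
peaks = np.where((screen_row[1:-1] > screen_row[:-2]) &
                 (screen_row[1:-1] > screen_row[2:]))[0]
print(f"measured peak spacing: {np.diff(peaks).mean() * camera_pitch:.1f} units")
```

Both numbers come out to 21 units here, even though the screen’s own pattern repeats every 1 unit: the mismatch alone manufactures a much coarser visible pattern.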

2.

The Slow Mo Guys on YouTube made a video explaining how screens display images to us, and how they exploit the way our eyes and brain process images to show us movement and color. They use very high-speed camera equipment to slow down what a screen does to display its images. The same applies to still photos: because a photo captures a split second of what the screen is showing at that moment, it almost never looks like what your brain sees, since your eyes and brain are looking at a constantly changing image. https://youtu.be/3BJU2drrtCM

3.

This comes down to dynamic range: the ratio of the brightest to darkest shades. To put it in practical terms, if you are in a park on a sunny day, you could see the bright blue sky and at the same time see fallen leaves in the shadow of a tree. If you took a picture of that same scene, you would have to choose which one would be properly exposed in the photo. If you wanted to get the bright blue sky, the shadow would be totally black and you wouldn’t be able to see the leaves. If you wanted to get the leaves in the shadow, the sky would be totally white.
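
As a rough illustration of that trade-off (a Python sketch with made-up luminance numbers, not real measurements), the “camera” below just multiplies scene luminance by an exposure factor and clips the result to the 0–255 range of an 8-bit image:

```python
# A minimal sketch (NumPy assumed) of the exposure trade-off described
# above. Scene luminances are invented numbers spanning a wide range;
# the "camera" scales them by an exposure factor and clips to 8 bits.
import numpy as np

scene = {"sunny sky": 10000.0, "sunlit grass": 800.0, "leaves in shadow": 5.0}

def expose(luminance, exposure):
    """Scale by exposure, then clip to the 0-255 range of an 8-bit image."""
    return int(np.clip(luminance * exposure, 0, 255))

for exposure in (0.02, 2.0):  # one metered for the sky, one for the shadow
    rendered = {name: expose(lum, exposure) for name, lum in scene.items()}
    print(f"exposure {exposure}: {rendered}")

# exposure 0.02 -> sky = 200 (visible), shadow leaves = 0 (pure black)
# exposure 2.0  -> shadow leaves = 10 (visible), sky = 255 (blown white)
```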

Cameras are actually getting quite good at capturing a wide dynamic range, but screens are still far behind and can only display a fairly narrow one. Even when you compensate for this with HDR (High Dynamic Range) photo processing, the result still doesn’t look like reality, because it is only an approximation: the highlights are darker than they should be and the shadows are lighter.
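
Here is a hedged sketch of why the HDR result is only an approximation. It uses the simple Reinhard curve L/(1+L), just one of many possible tone-mapping operators, on made-up scene values; compared with a straight linear rendering, the shadows come out lighter and the highlights darker relative to their true ratios, exactly as the answer describes.

```python
# A sketch of why tone-mapped HDR "doesn't look like reality": a global
# operator (here the simple Reinhard curve, one of many) must compress
# highlights and lift shadows to fit everything into the screen's range.
import numpy as np

luminance = np.array([0.01, 0.1, 1.0, 10.0, 100.0])  # wide-range scene values

linear = np.clip(luminance / luminance.max(), 0, 1)   # naive: expose for peak
reinhard = luminance / (1.0 + luminance)              # classic tone-map curve

for L, lin, tm in zip(luminance, linear, reinhard):
    print(f"scene {L:7.2f}  linear {lin:.4f}  tone-mapped {tm:.4f}")

# The darkest value jumps from 0.0001 (linear) to ~0.0099 (tone-mapped),
# while the top two values end up nearly equal instead of 10x apart.
```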

4.

It depends on what picture you’re talking about.

If you’re talking about taking a photo of a screen that is on, it’s because computer screens display things by constantly emitting light in pulses fast enough to be undetectable by our brains (60 refresh cycles per second is common), and this doesn’t happen all at once.

Some areas light up at different times than others, depending on what technology is used to drive those lights, so when you take a picture (which has an exposure time that allows just a single frame or two to be captured), it records whatever light is being emitted right at that moment, more or less.
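
A toy model of that in Python: assume a 60 Hz display whose rows are lit one after another from top to bottom (a CRT-style scan; real panel technologies differ) and a camera whose exposure is much shorter than one refresh. All the numbers here are illustrative assumptions. Only the rows lit during the exposure window show up bright:

```python
# Toy model: 60 Hz refresh, rows lit one after another top to bottom,
# a global shutter with an exposure much shorter than one frame.
ROWS = 1080
REFRESH_HZ = 60
frame_time = 1.0 / REFRESH_HZ  # one full top-to-bottom refresh pass
exposure = 0.002               # 1/500 s shutter, far shorter than a frame

def lit_rows(shutter_open, shutter_close):
    """Rows whose moment of illumination falls inside the exposure window."""
    captured = []
    for row in range(ROWS):
        lit_at = (row / ROWS) * frame_time  # when this row gets refreshed
        if shutter_open <= lit_at < shutter_close:
            captured.append(row)
    return captured

rows = lit_rows(0.004, 0.004 + exposure)
print(f"exposure covers rows {rows[0]}-{rows[-1]} "
      f"({len(rows)} of {ROWS}); the rest show the fading previous frame")
```

With these numbers the photo catches only about 130 of the 1080 rows at full brightness, which is why fast-shutter photos of screens show bright bands.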

In most cases where you see screens being used in movies and TV, the actors are just looking at a blank screen and the content is added in post-production, or special camera settings are used to capture the screen as cleanly as possible.
