More than likely, your monitor only displays 8 bits of precision per channel in an sRGB color space. Is that enough color precision to display images without artifacts? No. Not even close.
The most common artifact that you will see due to a lack of color precision is banding. With 8 bits of precision, you have 2^8=256 different values available for each color, so each channel (red, green, and blue) goes from 0 to 255. If the precision were “enough”, then adjacent colors would be so close together that the human eye couldn’t tell the difference between them, and we would see a smooth gradient. Is it? In the image below, the gradient goes from 91 to 96. Also, if you mouse over the numbers you see more gradient lines. The last one is a special treat: I dusted off my animated GIF skillz from 1996 and made an animation of the first 4 images.
If you mouse over, you should be able to clearly see the edges between the bands. At least I can. And you can REALLY see it in the animated GIF.
This next image is a little harder to see. It’s similar to the first two except that the red, green, and blue channels are out of sync. While you probably can’t see as much banding, you should be able to see that the bands shift from purple-ish to green-ish.
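If you want to see the arithmetic behind these gradients, here is a minimal sketch (the 91-to-96 range matches the first image; the sample width is arbitrary) that quantizes an ideal, continuous ramp down to 8-bit codes and counts how few distinct codes a narrow range actually gets:

```python
# Quantize an ideal, continuous gradient to 8-bit codes and count the
# distinct levels. A narrow range like 91..96 only gets a handful of
# codes, and each one shows up as a visible band.
width = 1024
ramp = [i / (width - 1) for i in range(width)]        # ideal 0..1 gradient

full = {round(v * 255) for v in ramp}                 # full black-to-white ramp
narrow = {round(91 + v * (96 - 91)) for v in ramp}    # the 91-to-96 gradient above

print(len(full))    # 256 distinct codes
print(len(narrow))  # only 6: 91, 92, 93, 94, 95, 96
```

No matter how smoothly you compute the gradient, those 1024 pixels collapse to just 6 output codes, so you get 6 flat bands with hard edges between them.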
With 8 bits, in addition to not having enough luminance accuracy, you also don’t have much color accuracy. Here is a hypothetical: suppose that you want a clean blend between red and green. As RGB triples on an 8-bit display, red is (255,0,0) and green is (0,255,0), so you get 254 “steps” between them. But I never said how bright red and green were. Now let’s make them darker, and say that dark red is (16,0,0) and dark green is (0,16,0). Well, now we only have 15 shades between them. And in the worst case there are literally no shades in between our darkest shades of pure red (1,0,0) and pure green (0,1,0).
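To put numbers on this hypothetical, here is a quick sketch (the function name is mine; it uses exact fractions so the rounding has no floating-point surprises) that counts the distinct 8-bit colors along a linear blend between two endpoints:

```python
from fractions import Fraction

def blend_steps(a, b, samples=1024):
    """Count distinct 8-bit RGB triples along a linear blend from a to b."""
    seen = set()
    for i in range(samples):
        t = Fraction(i, samples - 1)  # exact blend factor in [0, 1]
        seen.add(tuple(round(a[c] + t * (b[c] - a[c])) for c in range(3)))
    return len(seen)

print(blend_steps((255, 0, 0), (0, 255, 0)))  # 256 colors: 254 steps between the endpoints
print(blend_steps((16, 0, 0), (0, 16, 0)))    # 17 colors: only 15 steps in between
print(blend_steps((1, 0, 0), (0, 1, 0)))      # 2 colors: no steps in between at all
```

The dark blends really do jump straight from one endpoint to the other: there is simply no 8-bit code in between.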
Current displays can show 256*256*256 ≈ 16.8 million colors. That’s actually not even close to being enough, at least with the way they are distributed. If we had a theoretical monitor with a better distribution of colors (i.e. more in the blacks), then 16.8 million colors might be enough. But monitors today have too much precision in the whites and not enough in the blacks.
So with the real monitors that we have, which in most cases are 8-bit sRGB, we don’t have enough precision to recreate the colors that our eyes can see. That’s why professional color grading systems use 10-bit color instead of 8. For most images 8 bits is fine. But if you have lots of dark, smooth gradients you’ll see lots of banding. This problem is compounded by TV signal compression. Next time you are watching a dark TV show, check out the blacks in the background. You’ll see what I mean, especially in the reds.
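To see what those extra 2 bits buy, here is a small sketch (the gradient range is hypothetical, chosen to land on the same 91-to-96 codes as the image above) comparing how many distinct codes the same smooth gradient gets at 8 and at 10 bits:

```python
# Count how many distinct codes a gradient spanning a small slice of the
# 0..1 signal range receives at a given bit depth.
def levels(lo, hi, bits, samples=4096):
    scale = (1 << bits) - 1                 # 255 for 8-bit, 1023 for 10-bit
    return len({round((lo + (hi - lo) * i / (samples - 1)) * scale)
                for i in range(samples)})

print(levels(0.355, 0.375, 8))   # 6 codes -> six fat, visible bands
print(levels(0.355, 0.375, 10))  # 22 codes -> much finer steps
```

Same gradient, nearly 4x the steps: that is the difference between a staircase you can see and one you mostly can’t.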
This might sound like a crazy hypothetical but it happens all the time. Also, television LCDs are much worse than your normal computer monitor LCDs. When I was working at Naughty Dog, about once every month or so an artist would come to me complaining about banding. And each time I’d take a screenshot, open it in Photoshop, and show them that the two colors are only 1 code apart, so it’s the TV and there’s nothing I can do about it. If you were to ask me how to increase the quality of TV broadcasting, I wouldn’t say more resolution. I’d ask for more color precision.
It would be really cool to someday have HDR displays. But be patient: it’s going to be a while. We don’t even have enough color precision for the displays that we have now, so there’s no point in having an HDR display until we get more range than an 8-bit sRGB signal can carry.