The consensus here seems to always be to leave it in SDR and use match content, but I’ve had no issues with leaving Dolby Vision on all the time. SDR content like YouTube and TV shows that were never mastered in HDR looks 100% fine to me. Part of the Dolby Vision spec is that it forces your TV into its most color-accurate settings, so why wouldn’t I want that as well? Using an LG C2, by the way.
I think it’s a waste of energy, and it’s not great to have your TV at 100% brightness for long periods of time, especially when it’s not needed.
I went back to 4K SDR (I had DV on all the time earlier) because regular TV content, through the DirecTV app on ATV, and non-4K content looked too dark and unreal for my taste. I played with the TV settings but it never quite looked right. I have a Sony OLED.
What I’ve learned from many, many YouTube videos is that you should select 4K SDR as your main video format and then enable the match range options. This is because you don’t need Dolby Vision or HDR for the Apple TV menu elements. You’ll only want those when you’re watching something, and it’ll automatically switch your mode to the best format the show or movie provides. So if it’s available in Dolby Vision it will show it to you in Dolby Vision, then switch back down when you’re browsing, and so on and so forth.
Whatever works for you. Personally, my projector isn’t the best at DV, and I also noticed DV is great for one show and awful for another. Not that it was good TV, but Shadow and Bone on Netflix was for some reason enough for me to finally just go back to fixed SDR and stop thinking about it. Simpler.
SDR content is not accurate in Dolby Vision mode. That’s it. It also increases energy consumption unnecessarily.
Per what some others have said: I think it really depends on your TV. On our bedroom TV (TCL 5 series) leaving it on makes everything look off. On our living room TV (Hisense U8K) leaving Dolby Vision on looks great (quality looks far better than SDR actually).
There is no should. It’s your TV; use the settings you prefer.
The recommendation comes from the idea that, with problem apps, you’re more likely to get content presented the way it’s supposed to be, and that it avoids the display mode switching during ads.
How do you accomplish this?
I have an LG CX set to Cinema mode. I set my ATV to 4K Dolby Vision and Match Content “Off”. My experience with SDR content is like yours: the picture looks great. YouTube videos and old sitcoms on Pluto and Plex look as expected.
I tried the recommended settings in the past and the picture wasn’t as good.
Why the heck would you turn match content off in this scenario?
You better sit down. I don’t match frame rate either.
I’ve tried the generally recommended settings and the video experience is less than optimal with my LG CX OLED TV in Cinema mode. My settings normalize the variations where some content is oversaturated, some crushes brightness, some is harsh, some still has the soap opera effect, etc. I get a consistent and beautiful picture across all of my streaming apps.
Matching ON assumes every content producer puts a lot of thought and money into ensuring their videos look a certain way. Well, they don’t. Or can’t. I would guess many don’t have the budget, equipment, or time to produce superior video. So I let my ATV and LG CX improve them.
Films from filmmakers who want their work seen a certain way (e.g., Christopher Nolan) are often in 4K DV anyway.
Haha. The full range of opinions on the subject. It looks great on my LG OLED. It does briefly show a black screen when switching to commercials in some streaming shows.
I almost consider that a feature. I’ve missed so many commercials on YouTube because the screen was still black while matching content and frame rate.
If you care about accuracy, you should probably leave it on match content. You’ll probably have your TV calibrated to show SDR content at a peak of 100 nits, which is a lot dimmer than most people think SDR should look.
If you want it to look bright, or maybe even “pretty”, you’re probably already watching SDR with a peak of around 300 nits, so who really cares then anyway?
If I were to give you a tip: give accuracy a try. Have your TV calibrated, or at least get close to more accurate settings by copying those of a respected calibrator, and watch SDR as intended: in a dark room, with at most a faint extra light source, on a TV calibrated for a peak brightness of 100 nits. Let your eyes and brain adjust for a few hours and chances are you’ll love it.
This is for critical viewing. Watching the news in a bright room? Who gives a fuck? Just watch it however you like. It’s like how critical listening to music is a completely different thing from having some music on in the background while doing chores. The second doesn’t need to be accurate, but the first really benefits from proper equipment, a proper source, and correct room acoustics.
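For scale, here’s a quick back-of-the-envelope on the nit figures above. The 300-nit number is the commenter’s rough guess, not a measurement, and brightness perception is roughly logarithmic, so the ratio reads better in stops:

```python
import math

reference_sdr = 100   # nits: calibrated SDR reference white, as suggested above
bright_sdr = 300      # nits: the "bright room" SDR peak guessed at above

ratio = bright_sdr / reference_sdr
stops = math.log2(ratio)  # each stop is a doubling of luminance

print(f"{bright_sdr} nits is {ratio:.1f}x the luminance of {reference_sdr} nits,")
print(f"but only about {stops:.1f} stops brighter to the eye")
```

That works out to 3x the light but only about 1.6 stops, which is why a 100-nit calibration looks dim at first and then feels normal once your eyes adapt.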
Two reasons.
A) A surprising number of SDR apps do not flag their video output correctly when the Apple TV UI is set to HDR/DV, and therefore either they do not switch dynamic range properly, or they do not correctly map their video/color levels within an HDR signal/container.
B) Perhaps more importantly, the tvOS interface is not, I repeat, is not, natively rendered in DV. Therefore, when you set the default Apple TV video output to DV, a conversion from SDR video levels happens, which affects both color and accuracy.
In sum, set your Apple TV to 4K SDR 60Hz, RGB High (to avoid the green tint bug with SDR content when set to the default YCbCr), and enable both range and frame rate matching. That’s it.
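To make point B) concrete: when SDR pixels have to live inside an HDR signal, they get re-encoded, and any app that gets the mapping slightly wrong shifts tones visibly. Here’s a minimal sketch of the SMPTE ST 2084 (PQ) inverse EOTF, the transfer function behind HDR10 and, in PQ form, most Dolby Vision profiles. The assumption that SDR reference white sits at 100 nits is the usual convention, not something Apple documents, and Apple’s actual DV pipeline may differ:

```python
# SMPTE ST 2084 (PQ) constants, straight from the spec
m1 = 2610 / 16384        # 0.1593017578125
m2 = 2523 / 4096 * 128   # 78.84375
c1 = 3424 / 4096         # 0.8359375
c2 = 2413 / 4096 * 32    # 18.8515625
c3 = 2392 / 4096 * 32    # 18.6875

def pq_encode(nits: float) -> float:
    """Inverse EOTF: absolute luminance in nits -> PQ signal level in [0, 1]."""
    n = (nits / 10000.0) ** m1
    return ((c1 + c2 * n) / (1 + c3 * n)) ** m2

print(f"SDR reference white, 100 nits -> PQ {pq_encode(100):.3f}")    # ~0.508
print(f"Bright HDR highlight, 1000 nits -> PQ {pq_encode(1000):.3f}") # ~0.752
```

SDR white lands around half signal in the PQ container, so the entire SDR range gets squeezed into the bottom of the curve; a small mapping error there is far more visible than the same error in a native SDR signal.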
Don’t listen to all the naysayers. If you like the look of DV on standard SDR content, you can leave it on all the time. It’s your TV and your preference. I personally leave DV off unless content is in DV, but I also tried DV on all the time and it was an OK experience. Unfortunately I found some content did not render properly in DV, so I stopped that practice.
Part of the Dolby Vision spec is that it forces your TV into its most color-accurate settings, so why wouldn’t I want that as well?
On my LG C1 OLED, most Dolby Vision content looked fine. But for Apple TV+ content in particular, something was up with either the Apple TV or the TV itself that caused the brightness to constantly shift from shot to shot with Dolby Vision content only. Ted Lasso was the worst offender since it’s a generally very bright show. Within a single scene, the brightness would often shift between the correct level and too dim from shot to shot. It became so annoying that I just disabled Dolby Vision content across the board.
I noticed this problem with Ted Lasso, too. I’ve seen it happen when two characters are talking where one is in front of a bright background and the other is in front of a dark background.
I also noticed this problem with Frasier 2023 on Paramount+, but I couldn’t determine any reason. It is streamed in 4K DV.
The consensus
There is no consensus because it depends ENTIRELY on the TV.
On my LG C1 I leave DV on all the time with match content/framerate of course, and the menus and UI look stunning.
On my shittier Vizio LCD TV, I leave it on 4K SDR because DV looks washed out and terrible. But with match content/framerate for content, of course.
There is no single answer. It depends on the TV, and it’s easy enough to test.