Member (Joined: Sep 2008, Posts: 92)
Crosspost from the GitHub Discussions topic, for more eyeballs/viewpoints. Probably one for @cuavas and @MooglyGuy, but I was curious whether HDR could be supported in MAME as a way to, forgive my imprecise language, boost the brightness of the raster/vector image by a good number of nits. One thing we get with the shaders is a somewhat darker screen; could that be overcome by applying HDR? The original CRTs are quite bright compared to our modern LCDs. I recall this was mentioned 5+ years ago, when HDR monitors were not as common as they are today, but I was curious whether the tech has matured since, or whether there are philosophical/technical reasons for disallowing it. Thanks.

RB says:
"We're not opposed, and I think it will be necessary for good vector rendering in the same way that VRR is accepted as the real fix for tearing, but I don't think it's something that can just be bolted on to the existing 2006-era rendering architecture. @MooglyGuy would probably know more though."
Very Senior Member (Joined: May 2009, Posts: 2,225, Likes: 387)
No reason why it can't be done. My TV purportedly supports HDR, so it's something that I can look into.
What I don't necessarily understand, due to not having looked into it until now, is how HDR would actually brighten the top end of the image. Unless the TV or monitor's backlight goes into overdrive when operating in HDR mode, isn't the top-end brightness that you see still going to be identical between HDR and SDR? Doesn't the former just get you more steps along the way to that maximum brightness?
Very Senior Member (Joined: Mar 2001, Posts: 17,258, Likes: 267)
I don't know exactly how it works, but it goes brighter. I recently picked up an LG G3 OLED TV, currently the world record holder for HDR brightness, and when HDR content goes max bright it almost hurts to look at. For movies and PS5/XSX games that support HDR, the improvement is almost like going from NTSC to 1080p.
Very Senior Member (Joined: May 2009, Posts: 2,225, Likes: 387)
Well, as soon as I've worked out the subtle bug that causes the bloom effect to be mis-positioned on vector games, and gotten the D3D11 renderer into better shape overall, I'll have a look.
Senior Member (Joined: Jul 2015, Posts: 116)
Quote:
"No reason why it can't be done. My TV purportedly supports HDR, so it's something that I can look into. [...] Unless the TV or monitor's backlight goes into overdrive when operating in HDR mode, isn't the top-end brightness that you see still going to be identical between HDR and SDR? Doesn't the former just get you more steps along the way to that maximum brightness?"

This depends very much on the hardware itself. HDR usually starts at 300 nits and can go up to 2000 nits, while 150 nits is considered average for a TV/monitor without HDR. So even the lowest-end HDR device has roughly double the brightness in nits.

The problem I see with HDR is that there are so many different standards, and even worse, HDR itself keeps improving/developing, so the newest, hottest thing is HDR10+ (where you have a tone-mapping LUT for EACH frame). Netflix, for example, is setting this as the standard for its new (future) content. As a developer you have the pain of deciding which range you want to cover with your HDR implementation.

R. Belmont is right when he says HDR can be so bright it almost hurts to look at; I would say that starts at around 1000 nits. My TV only does 400, and the difference isn't big enough that I'd speak of a WOW effect, but it is there, just much more subtle. In short, the top end at 2000 nits can be VERY bright (and the dark end very dark), and it differs from an average SDR monitor very much. I would like to see the Asteroids bullets with HDR and 2000 nits. I am sure it will impress.
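For the curious, the way HDR10 actually encodes those nit levels is the SMPTE ST 2084 "PQ" transfer curve, which maps absolute luminance (0 to 10,000 nits) onto a normalized 0..1 signal. A minimal C++ sketch; the function name is my own:

    #include <cmath>

    // SMPTE ST 2084 (PQ) encoding: maps absolute luminance in nits
    // (0..10000 cd/m^2) to a normalized 0..1 signal value.
    double pq_encode(double nits)
    {
        const double m1 = 2610.0 / 16384.0;        // 0.1593017578125
        const double m2 = 2523.0 / 4096.0 * 128.0; // 78.84375
        const double c1 = 3424.0 / 4096.0;         // 0.8359375
        const double c2 = 2413.0 / 4096.0 * 32.0;  // 18.8515625
        const double c3 = 2392.0 / 4096.0 * 32.0;  // 18.6875

        double y = nits / 10000.0;                 // normalize to [0, 1]
        double yp = std::pow(y, m1);
        return std::pow((c1 + c2 * yp) / (1.0 + c3 * yp), m2);
    }

    // e.g. pq_encode(100)  ~= 0.508  (typical SDR white)
    //      pq_encode(1000) ~= 0.752  (bright HDR highlight)

Note how roughly half the signal range sits below 100 nits: the extra code values mostly go to the highlights, which is exactly where bright vector bullets would live.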
I live... I die... I live again.
Member (Joined: Sep 2008, Posts: 92)
A baseline of regular HDR10 would be the most widely available on monitors and TVs. Dolby Vision is/was a bit less common, and HDR10+ considerably so. Thanks JD for looking into it.
Senior Member (Joined: Jul 2015, Posts: 116)
HDR10 just specifies the colour depth, and billions of colours is the least it should use (10-bit being the most common). That is 1024 steps per channel (R, G, B). Dolby Vision uses 12-bit colour depth.
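The arithmetic behind those numbers, as a trivial C++ illustration (nothing here is MAME code):

    #include <cstdio>

    int main()
    {
        // steps per channel = 2^bits
        unsigned sdr  = 1u << 8;   //  8-bit:  256 steps (~16.7 million colours total)
        unsigned hdr  = 1u << 10;  // 10-bit: 1024 steps (~1.07 billion colours)
        unsigned dovi = 1u << 12;  // 12-bit: 4096 steps (~68.7 billion colours)

        std::printf("8-bit: %u, 10-bit: %u, 12-bit: %u steps per channel\n",
                    sdr, hdr, dovi);
        // totals: 256^3 = 16,777,216; 1024^3 = 1,073,741,824; 4096^3 = 68,719,476,736
        return 0;
    }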
I live... I die... I live again.
Very Senior Member (Joined: Mar 2001, Posts: 17,258, Likes: 267)
Also, which HDR format you get (HDR10/HDR10+/Dolby Vision) depends on the video card and driver support; it's not something MAME can influence.
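That said, on the Windows side the application does make one small request of its own: it can ask the swap chain whether the HDR10 colour space is supported and switch to it through DXGI. A rough C++ sketch of those calls, assuming a DXGI 1.4+ swap chain; this is not MAME's actual code, and error handling is omitted:

    #include <dxgi1_4.h>

    // Try to switch an existing swap chain to HDR10 (ST 2084 / BT.2020).
    // The swap chain should use a DXGI_FORMAT_R10G10B10A2_UNORM backbuffer.
    bool enable_hdr10(IDXGISwapChain3 *swapchain)
    {
        const DXGI_COLOR_SPACE_TYPE hdr10 =
            DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020;

        UINT support = 0;
        if (FAILED(swapchain->CheckColorSpaceSupport(hdr10, &support)))
            return false;
        if (!(support & DXGI_SWAP_CHAIN_COLOR_SPACE_SUPPORT_FLAG_PRESENT))
            return false; // display/driver can't present HDR10

        return SUCCEEDED(swapchain->SetColorSpace1(hdr10));
    }

The OS, driver, and display still decide whether the request can actually be honoured, which is RB's point: the app asks, the stack below it answers.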
Senior Member (Joined: Dec 2011, Posts: 194, Likes: 2)
It also depends on the particular monitor you're using, if I'm not mistaken.
Senior Member (Joined: Jul 2015, Posts: 116)
Quote:
"Also, which HDR format you get (HDR10/HDR10+/Dolby Vision) depends on the video card and driver support; it's not something MAME can influence."

True, MAME cannot influence this, but HDR is supported by video cards from the AMD RX 400 series, the Nvidia GTX 900 series, and the integrated graphics in Intel 7th-gen Core chips, or newer. So nearly any DirectX 12 card will do, or any card no older than about seven years. If MAME provides a wide colour gamut (WCG) table for colours, that should be enough to drive HDR on certified HDR monitors, while using an sRGB table for "average" SDR users.
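To make the WCG-vs-sRGB point concrete: in linear light, going from the sRGB/BT.709 primaries to the BT.2020 primaries used by HDR10 is a single 3x3 matrix multiply. A small C++ sketch using the commonly cited conversion coefficients from ITU-R BT.2087; the function name is my own:

    // Convert linear-light BT.709/sRGB values to linear BT.2020.
    // Inputs and outputs are linear (no transfer function applied);
    // each row of the matrix sums to 1.0, so white maps to white.
    void rgb709_to_rgb2020(const float in[3], float out[3])
    {
        out[0] = 0.6274f * in[0] + 0.3293f * in[1] + 0.0433f * in[2];
        out[1] = 0.0691f * in[0] + 0.9195f * in[1] + 0.0114f * in[2];
        out[2] = 0.0164f * in[0] + 0.0880f * in[1] + 0.8956f * in[2];
    }

A renderer would apply this (or pass the sRGB values through unchanged for SDR output) before encoding with the PQ curve shown earlier in the thread.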
I live... I die... I live again.