No reason why it can't be done. My TV purportedly supports HDR, so it's something that I can look into.
Unless the TV or monitor's backlight goes into overdrive when operating in HDR mode, isn't the top-end brightness that you see still going to be identical between HDR and SDR? Doesn't the former just get you more steps along the way to that maximum brightness?
This depends heavily on the hardware itself. HDR displays usually start at around 300 nits and can go up to 2000 nits, while roughly 150 nits is considered average for a TV/monitor without HDR. So even the lowest-end HDR device has double the brightness (in nits). The problem I see with HDR is that there are so many different standards, and even worse, HDR itself keeps evolving, so the newest, hottest thing is HDR10+ (where you have a LUT for EACH frame). Netflix, for example, sets this as the standard for new recordings of their (future) content. As a developer you have the pain of deciding which range you want to cover with your HDR implementation.
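To make that "which range to cover" pain concrete, here is a minimal C++ sketch of the PQ (SMPTE ST 2084) inverse EOTF that HDR10/HDR10+ signals use; the function name pq_encode and the sample peak values are just illustrative, not from any particular codebase. PQ encodes absolute luminance up to 10000 nits, so a 400-nit and a 2000-nit panel only ever occupy part of the signal range, and the developer has to decide how to tone-map into it.

```cpp
#include <cmath>
#include <cstdio>

// PQ (SMPTE ST 2084) inverse EOTF: maps absolute luminance in nits
// to the non-linear signal value in [0,1] that an HDR10 display expects.
// The constants are the ones defined in the ST 2084 spec.
double pq_encode(double nits)
{
    const double m1 = 2610.0 / 16384.0;
    const double m2 = 2523.0 / 4096.0 * 128.0;
    const double c1 = 3424.0 / 4096.0;
    const double c2 = 2413.0 / 4096.0 * 32.0;
    const double c3 = 2392.0 / 4096.0 * 32.0;

    double y  = nits / 10000.0;            // PQ is defined up to 10000 nits
    double yp = std::pow(y, m1);
    return std::pow((c1 + c2 * yp) / (1.0 + c3 * yp), m2);
}

int main()
{
    // How far up the PQ signal range a few typical peak brightnesses land
    const double peaks[] = { 150.0, 400.0, 1000.0, 2000.0 };
    for (double peak : peaks)
        std::printf("%6.0f nits -> PQ signal %.2f\n", peak, pq_encode(peak));
}
```

Running it shows the SDR-average 150 nits already sits around 0.55 on the PQ scale; everything above that is headroom reserved for highlights, which is the part the different standards and tone-mapping choices carve up differently.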
R.Belmont is right when he says that HDR can be so bright it almost hurts to look at. I would say that starts at around 1000 nits. My TV only does 400, and that is not enough of a difference that I would call it a WOW effect, but it is there, just way more subtle.
In short, the top end at 2000 nits can be VERY bright (and very dark at the other end), and it will differ a lot from an average SDR monitor. I would like to see Asteroids bullets in HDR at 2000 nits. I am sure it would impress.
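On the rendering side this would be fairly simple if the OS hands you a linear scRGB surface (the convention Windows uses for FP16 HDR swap chains, where 1.0 equals 80 nits): a bullet at 2000 nits is just a pixel value well above 1.0. A tiny sketch, with made-up brightness numbers for the two elements:

```cpp
#include <cstdio>

// scRGB convention (e.g. Windows FP16 HDR swap chains):
// linear light, 1.0f == 80 nits (SDR reference white).
constexpr float kRefWhiteNits = 80.0f;

constexpr float nits_to_scrgb(float nits)
{
    return nits / kRefWhiteNits;
}

int main()
{
    // A normal-brightness vector vs. a bullet pushed to a 2000-nit panel peak
    std::printf("ship outline at 150 nits -> scRGB %.2f\n", nits_to_scrgb(150.0f));
    std::printf("bullet at 2000 nits      -> scRGB %.2f\n", nits_to_scrgb(2000.0f));
}
```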