#100988 - 08/03/15 02:49 PM Need help, with GLSL  
Joined: Jul 2015
Posts: 83
u-man Offline
Member
Serbo-Croatian MF living in Ge...
Hello Arcademics,

I hope someone can help me here. I have problems with GLSL and with how the interaction between the MAME-Bitmap and the SCREEN-Bitmap works. Sadly, the only info I found is this:

http://git.redump.net/mess/plain/src/osd/sdl/shader/docs/PluggableShader.txt

I want to optimize the CRT-geom shader and add bloom to my conversion. For that, I need separate Gaussian blur passes on the SCREEN-Bitmap side. The bad thing is, I always get an error message when the pass is on the SCREEN-Bitmap side, and I don't know why :(

I attached a picture of how I understand the whole thing, as I think it's easier to describe this kind of thing with a picture.



Strangely, the separate bloom passes work if they are on the MAME-Bitmap side, but then I have the problem that the subsequent CRT-geom shader doesn't handle the result correctly. It just doesn't blend (no additive process) the result with the "normal" game screen, and my final picture just looks blurred, although it has all other effects applied correctly: scanlines, shadow mask, etc.
I am really desperate here :( .

This is my v-bloom on the MAME-Bitmap side, which works:


Code:
uniform sampler2D mpass_texture;        // output of the previous pass
uniform vec2 color_texture_pow2_sz;     // game bitmap size, padded to powers of two
uniform vec2 color_texture_sz;          // game bitmap size

#define CRTgamma 2.5
#define display_gamma 2.2
// sample and convert to linear light, so the blur sums linear intensities
#define TEX2D(c) pow(texture2D(mpass_texture, (c)), vec4(CRTgamma))

void main()
{
        vec2 xy = gl_TexCoord[0].st;
        float oney = 1.0/color_texture_pow2_sz.y;   // one texel step in y

        float wid = 2.0;                            // Gaussian width in texels

        // Gaussian tap weights exp(-d*d/(wid*wid)) for d = 1..4
        float c1 = exp(-1.0/wid/wid);
        float c2 = exp(-4.0/wid/wid);
        float c3 = exp(-9.0/wid/wid);
        float c4 = exp(-16.0/wid/wid);
        // normalize so the nine taps sum to 1
        float norm = 1.0 / (1.0 + 2.0*(c1+c2+c3+c4));

        vec4 sum = vec4(0.0);

        // 9-tap vertical blur: centre texel plus four mirrored pairs
        sum += TEX2D(xy + vec2(0.0, -4.0 * oney)) * vec4(c4);
        sum += TEX2D(xy + vec2(0.0, -3.0 * oney)) * vec4(c3);
        sum += TEX2D(xy + vec2(0.0, -2.0 * oney)) * vec4(c2);
        sum += TEX2D(xy + vec2(0.0, -1.0 * oney)) * vec4(c1);
        sum += TEX2D(xy);
        sum += TEX2D(xy + vec2(0.0, +1.0 * oney)) * vec4(c1);
        sum += TEX2D(xy + vec2(0.0, +2.0 * oney)) * vec4(c2);
        sum += TEX2D(xy + vec2(0.0, +3.0 * oney)) * vec4(c3);
        sum += TEX2D(xy + vec2(0.0, +4.0 * oney)) * vec4(c4);

        // back to display gamma
        gl_FragColor = pow(sum*vec4(norm), vec4(1.0/display_gamma));
}
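As a side note, the weight normalization in the shader above can be double-checked on the CPU. This quick sketch (not part of the shader, just a sanity check) recomputes the taps and confirms that they sum to 1, so the blur neither brightens nor darkens the image:

```python
import math

# Recompute the shader's Gaussian tap weights c1..c4 and norm.
wid = 2.0
c = [math.exp(-(d * d) / (wid * wid)) for d in (1, 2, 3, 4)]
norm = 1.0 / (1.0 + 2.0 * sum(c))

# Nine taps total: the centre texel (weight 1) plus four mirrored pairs.
total = norm * (1.0 + 2.0 * sum(c))
print(norm, total)  # total is ~1.0, so overall brightness is preserved
```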


and here is my h-bloom on the SCREEN-Bitmap side, which doesn't work:

Code:
uniform sampler2D mpass_texture;
uniform vec2 color_texture_pow2_sz;
uniform vec2 color_texture_sz;

uniform vec2 screen_texture_sz;         // size of the target screen bitmap
uniform vec2 screen_texture_pow2_sz;

#define CRTgamma 2.5
#define display_gamma 2.2
#define TEX2D(c) pow(texture2D(mpass_texture, (c)), vec4(CRTgamma))

void main()
{
        vec2 xy = gl_TexCoord[0].st;
        float onex = 1.0/screen_texture_sz.x;       // one texel step in x

        float wid = 2.0;

        // same Gaussian weights as the vertical pass
        float c1 = exp(-1.0/wid/wid);
        float c2 = exp(-4.0/wid/wid);
        float c3 = exp(-9.0/wid/wid);
        float c4 = exp(-16.0/wid/wid);
        float norm = 1.0 / (1.0 + 2.0*(c1+c2+c3+c4));

        vec4 sum = vec4(0.0);

        // 9-tap horizontal blur
        sum += TEX2D(xy + vec2(-4.0 * onex, 0.0)) * vec4(c4);
        sum += TEX2D(xy + vec2(-3.0 * onex, 0.0)) * vec4(c3);
        sum += TEX2D(xy + vec2(-2.0 * onex, 0.0)) * vec4(c2);
        sum += TEX2D(xy + vec2(-1.0 * onex, 0.0)) * vec4(c1);
        sum += TEX2D(xy);
        sum += TEX2D(xy + vec2(+1.0 * onex, 0.0)) * vec4(c1);
        sum += TEX2D(xy + vec2(+2.0 * onex, 0.0)) * vec4(c2);
        sum += TEX2D(xy + vec2(+3.0 * onex, 0.0)) * vec4(c3);
        sum += TEX2D(xy + vec2(+4.0 * onex, 0.0)) * vec4(c4);

        gl_FragColor = pow(sum*vec4(norm), vec4(1.0/display_gamma));
}


Of course I followed the example from the posted link and put the plain shader on the MAME-Bitmap side first, but it seems it doesn't pass the result to the SCREEN-Bitmap side. Something must be wrong in my code, or I didn't understand something :( .

Please, someone, help me out here or point me in the right direction.

Final things: Windows 7, MAME 0.164 64-bit.

Thx u-man

Last edited by u-man; 08/03/15 02:51 PM.

I live... I die... I live again.
#101408 - 09/01/15 11:20 AM Re: Need help, with GLSL [Re: u-man]  
Joined: Jul 2015
Posts: 83
u-man Offline
Member
Serbo-Croatian MF living in Ge...
OK, it seems I have asked too much. I am still wondering that no one here can explain to me how to pass results from MAME to SCREEN and vice versa :( .

I am desperate here and really want to understand how this stuff is done properly. At least I would like to know if my schematic is right.

I appreciate even the smallest advice here, or any links for further investigation.

thx, u-man


I live... I die... I live again.
#101409 - 09/01/15 12:03 PM Re: Need help, with GLSL [Re: u-man]  
Joined: Mar 2001
Posts: 15,576
R. Belmont Online Content
Very Senior Member
USA
Nobody here knows. The GLSL engine was contributed around 2008 by someone who no longer posts here. I've done significant maintenance on it (in particular, flattening the texture format to always be ARGB and adding some uniforms to better facilitate porting other shaders) but I have no idea what the interaction is meant to be between the MAME and SCREEN shaders. The .txt file is pretty much all there is.

Given the state of things it's entirely possible you understand correctly and the code is simply broken.

Last edited by R. Belmont; 09/01/15 12:05 PM.
#101410 - 09/01/15 01:11 PM Re: Need help, with GLSL [Re: u-man]  
Joined: Jul 2015
Posts: 83
u-man Offline
Member
Serbo-Croatian MF living in Ge...
That's sad to hear :( , but I really and truly thank you for the answer, R. Belmont :) . Actually, the interaction between MAME and SCREEN shaders could be essential for optimizing some shader processes, e.g. the bloom. The definition is very simple: SCREEN shaders ALWAYS operate on the user's chosen screen resolution (and need at least one MAME shader present), and MAME shaders always operate on the game resolution, with two exceptions:

a.) If only a single MAME shader is present, then all processing is done at screen resolution, which is not optimal for some effects, because it would be faster (and the resolution would still be good enough for good results) to render them at game resolution. A good example is MooglyGuy's NTSC shader: it needs much more power in standalone mode, without a plain shader applied.

b.) If it is the last MAME shader in the chain and no SCREEN shaders are used, then again all processing is done at screen resolution.

All of this is handled pretty well in HLSL, because it is optimized: all effects that need to be rendered at game resolution (NTSC, color correction, etc.) are rendered first, and all effects that need higher resolution (scanlines, bloom, shadow mask, etc.) are rendered at the chosen "hlsl prescale" factor. All in all, it is an optimized pipeline/chain of effects. That was explained to me by Jezze.

So in my case, I want to optimize the bloom process for the Lottes shader and add bloom to the CRT-geom shader (which has no bloom of its own).

The bloom could be calculated much faster and be far less resource-demanding if it were calculated in separate passes: one pass for horizontal blur and one for vertical blur. If you do a blur (which is needed for bloom) in a single process, you have to calculate e.g. 512x512*n*n pixels, which is much more than the 512x512*n*2 of a two-pass approach ;) .
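u-man's tap-count argument can be made concrete with a little arithmetic. This sketch (illustrative numbers, using the radius-4 kernel from the shaders above) compares texture fetches for a one-pass 2D blur against the separable two-pass version:

```python
def blur_fetches(w, h, radius):
    """Texture fetches to Gaussian-blur a w*h image with a (2*radius+1)-tap kernel."""
    taps = 2 * radius + 1
    single_pass = w * h * taps * taps  # one pass with the full 2D kernel
    two_pass = w * h * taps * 2        # separate horizontal + vertical passes
    return single_pass, two_pass

# radius 4 (9 taps), on u-man's 512x512 example
one, two = blur_fetches(512, 512, 4)
print(one / two)  # 4.5 -> the one-pass 2D blur does 4.5x the fetches here
```

The gap widens quadratically with the blur radius, which is why wide blooms are almost always done as two 1D passes.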

Now the hard part would be to do those separate passes with SCREEN shaders, which is necessary to achieve good, nice-looking results, and then to pass the result of these operations to the final shader. That's where I am failing :( .

This was told to me by Lottes himself and by Jezze; that's why I want to do this with the GLSL shaders that contain bloom effects.

Last edited by u-man; 09/01/15 01:17 PM.

I live... I die... I live again.
#101431 - 09/02/15 09:46 PM Re: Need help, with GLSL [Re: u-man]  
Joined: May 2011
Posts: 41
SoltanGris42 Offline
Member
Originally Posted By u-man
That's sad to hear :( , but I really and truly thank you for the answer, R. Belmont :) . Actually, the interaction between MAME and SCREEN shaders could be essential for optimizing some shader processes, e.g. the bloom. [...]


Hi u-man!

I tried to get this stuff working. I may have gotten a little further than you, but not too much. Here's what I have so far: Split Shaders

Basically, it sort of works...

What I do is:

  • "glsl_shader_mame0" is just "glsl_plain". I couldn't get my Surface Pro 3 to work right without it. No idea why, since my desktop doesn't mind skipping it.
  • "glsl_shader_mame1" is called "hscale" and performs horizontal scaling. It would also do the horizontal bloom if I were adding that to this shader. After this shader runs, the output is a screen-resolution texture. If you imagine a screen resolution of 1440x1080 and a system resolution of 320x240, then the output is a 1440x1080 texture with the top 1440x240 taken up by our horizontally scaled image.
  • "glsl_shader_screen0" is called "vscale" and performs vertical scaling + scanlines. Vertical bloom would go in here too, if I had it. The output is 1440x1080 with the whole texture used.
  • "glsl_shader_screen1" is called "shadowmask" and performs the shadow-mask effect. I'm currently having trouble getting it to work right; I have some kind of texture-coordinate miscalculation that sometimes causes issues... Output is still screen resolution, 1440x1080.
  • "glsl_shader_screen2" is called "YUV" and handles tint/saturation. It might be nice to expand it to include gamma, brightness, and contrast too. I could do this at the beginning, but if you put it at the end, you can compensate when your scanlines/shadow mask darken the picture too much...


Like I said, it doesn't all work perfectly; that's why I never uploaded it anywhere. But it all runs at over 150% speed at 1920x1440 (the full 4:3 resolution of my Surface Pro 3) on Intel graphics, so I don't need it any faster than that.
EDIT: Until it gets hot, apparently, and then it slows down to <70% speed... :(

Anyway, take a look and see if you have any questions. And let me know if you can't get it to work at all.

Here's the exact shader order I use:
Code:
glsl_shader_mame0         C:\emu\sdlmess\split\glsl_plain
glsl_shader_mame1         C:\emu\sdlmess\split\hscale
glsl_shader_screen0       C:\emu\sdlmess\split\vscale
glsl_shader_screen1       C:\emu\sdlmess\split\shadowmask
glsl_shader_screen2       C:\emu\sdlmess\split\YUV



-greg

Last edited by SoltanGris42; 09/02/15 10:03 PM.
#101432 - 09/02/15 11:02 PM Re: Need help, with GLSL [Re: u-man]  
Joined: Jul 2007
Posts: 78
cgwg Offline
Member
Sorry, I was busy when the first post was made, and subsequently forgot about it.

Your understanding is almost consistent with mine, but you need to remember that a shader has separate input and output resolutions. A MAME shader always has input at the game resolution, and a SCREEN shader always has input at the screen resolution. Consequently, all MAME shaders output at the game resolution, except for the last one, which outputs at the screen resolution. At least, that's how it's supposed to work — there have been some changes since the last time I seriously tested it.

The other thing to keep in mind is that each shader's output is the next shader's input, and more complicated patterns of passing buffers between shaders are not possible. The exception is that it's possible to read the original frame from any shader. [In fact, this seems to come upside down when read from a SCREEN shader.] Also, the SCREEN shaders seem to be broken for me at the moment if there are fewer than 2 MAME shaders.

I added a halation or "bloom"-type feature to my shader a long time ago, but never in the MAME version. The feature-set is sufficient, though; it could be done with three MAME shaders: two for x/y blur and the third that does the main scanlines. In fact, I've just done this: you can get it here. Run it with
Code:
mame -glsl_shader_mame0 gaussx -glsl_shader_mame1 gaussy -glsl_shader_mame2 CRT-geom-halation

#101434 - 09/03/15 07:20 AM Re: Need help, with GLSL [Re: u-man]  
Joined: Apr 2006
Posts: 543
Dullaron Offline
Senior Member
Fort Worth, TX.
I love it, cgwg. Thanks. :)


Windows 10 Pro 64-bit / Intel Core i5-4460 3.20 GHz / 8.00 GB RAM / AMD Radeon R9 200 Series
#101436 - 09/04/15 06:43 AM Re: Need help, with GLSL [Re: u-man]  
Joined: Apr 2006
Posts: 543
Dullaron Offline
Senior Member
Fort Worth, TX.
cgwg, I dropped the glow down:

// strength of halation or "bloom" effect - e.g. 0.1 for 10%
halation = 0.02;

I was playing Defender and noticed that the hills had a thick, bright blur. I fixed it on my end to make the blur less bright. It doesn't look like a brown watercolor now.

Find that setting in the CRT-geom-halation.vsh file.

Use what you want.


/Edit

I changed it to 0.02, and in mame.ini:

brightness 1.04

Last edited by Dullaron; 09/04/15 08:05 AM. Reason: Edit

Windows 10 Pro 64-bit / Intel Core i5-4460 3.20 GHz / 8.00 GB RAM / AMD Radeon R9 200 Series
#101445 - 09/04/15 02:29 PM Re: Need help, with GLSL [Re: SoltanGris42]  
Joined: Jul 2015
Posts: 83
u-man Offline
Member
Serbo-Croatian MF living in Ge...
First of all, I am really surprised and excited about the latest posts here. Thank you so much for joining in, cgwg and SoltanGris42. Your advice is always welcome.

I will start with cgwg's post:

Originally Posted By cgwg

The other thing to keep in mind is that each shader's output is the next shader's input, and more complicated patterns of passing buffers between shaders are not possible. The exception is that it's possible to read the original frame from any shader. [In fact, this seems to come upside down when read from a SCREEN shader.] Also, the SCREEN shaders seem to be broken for me at the moment if there are fewer than 2 MAME shaders.

I added a halation or "bloom"-type feature to my shader a long time ago, but never in the MAME version.

If I understand you correctly, it is not possible to pass a result from, let's say, MAME shader 1 to MAME shader 3. Am I right?

On the other hand, I don't see a case where such a thing would be important, but I am only a novice and only starting to understand the whole thing. IMHO it is very important to oversee and carefully plan the shader chain to achieve the desired result. Compared with HLSL and how its shader chain is set up, there is not much that needs to be done at game resolution except color stuff and NTSC. So in our case with GLSL, it is a downside if we stay only on the MAME shader side: as you already wrote, only the last shader operates at screen resolution, but to have a nice-looking bloom, it should be calculated at screen resolution.

Please don't get me wrong here; I am a very big fan of your CRT-geom shader. I always liked your simple and somehow unique approaches, like CRT-geom itself or your nice effect file for a shadow mask. I am happy that you posted a solution for my problem, but I still think the bloom would look better if calculated at SCREEN resolution. Maybe even with a threshold shader calculated before applying the blurs: a threshold that first filters out the bright parts of the game screen, so that the blur is only applied to the very bright sections. Something like this: https://github.com/libretro/common-shaders/blob/master/crt/shaders/glow/threshold.cg

That shader needs to run in linear gamma, though, so if you just try to run it on its own, it won't look correct.
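For illustration, the bright-pass idea behind such a threshold shader can be sketched in plain Python (a loose sketch of the concept, not a translation of threshold.cg; the threshold and gamma values here are made-up example numbers):

```python
def bright_pass(rgb, threshold=0.7, gamma=2.2):
    """Keep only the bright part of a pixel, working in linear light.

    rgb: gamma-encoded values in [0,1]; threshold/gamma are illustrative.
    """
    linear = [c ** gamma for c in rgb]                 # linearize first
    return [max(l - threshold, 0.0) for l in linear]   # subtract threshold, clamp at 0

# A bright red pixel keeps some energy; dim channels fall to zero,
# so a subsequent blur only spreads the genuinely bright areas.
print(bright_pass([1.0, 0.5, 0.1]))
```

This also shows why linear gamma matters: thresholding gamma-encoded values would cut off at the wrong perceived brightness.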

I think this would greatly improve your bloom. I will take the shader you posted here and try to update my conversion of your CRT-geom, so that the conversion also includes your bloom.

I also didn't know that the SCREEN shaders only start to work with 2 MAME shaders present. I always tried it with only one plain MAME shader; that would explain why I had trouble getting the SCREEN shaders to work. I just followed the tutorial/docs, and they claimed that you need at least one MAME shader present. My fault that I never tried it with two ;) .

I will PM you pretty soon, when I am done with my work on your shader. I also have some ideas that I would like to talk about with you :) .

Now some stuff for SoltanGris42 :D :

Originally Posted By SoltanGris42

Hi u-man!

Basically, it sort of works... [...]


First, thank you for the posted shaders; this will help me a lot. You also use two MAME shaders before passing to the SCREEN shaders, so it seems cgwg is right with his statement that the SCREEN shaders are either bugged or simply need two MAME shaders present, instead of the one claimed in the docs.

I don't understand (or don't see a reason) why you put the YUV shader last in your chain. Why would it make a difference to put it earlier? As it is, you calculate YUV at screen resolution, which is slower than calculating it at game resolution. What I am trying to say is that your chain could be faster if YUV were calculated before the scanlines and shadow mask. I think it's better to do it that way and to include some transparency/alpha value for the shadow mask to reduce the darkening. At least that's how it's done in HLSL.
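The cost difference u-man describes here comes down to simple pixel counts. A rough sketch using the resolutions from greg's example (illustrative numbers only):

```python
game_w, game_h = 320, 240        # system resolution from greg's example
screen_w, screen_h = 1440, 1080  # target screen resolution

# A per-pixel color transform like YUV touches every pixel exactly once,
# so moving it from screen resolution to game resolution saves this factor:
factor = (screen_w * screen_h) / (game_w * game_h)
print(factor)  # 20.25 -> the same per-pixel math runs on ~20x fewer pixels
```

The counter-argument (which greg's ordering relies on) is that a late YUV pass sees, and can therefore compensate for, the darkening introduced by the scanline and shadow-mask passes.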

I may have an answer to your Surface Pro 3 problem with the plain shader. I have seen that some shaders require linear gamma to work correctly, like the threshold shader mentioned before. Maybe your problem is something similar.
I will need some time to look into your setup.

I am very happy that you and cgwg are here; maybe we can solve some mysteries together. Remember the conversation we had about deconstructed shaders? I guess together we will find some solutions. You already proved that optimization is possible with your speed stats of 150%. I hope that our future versions will be a benefit for everyone who likes shaders.

I will stop here and come back as soon as possible, hopefully with some more good news :) .


I live... I die... I live again.
#101458 - 09/05/15 09:15 PM Re: Need help, with GLSL [Re: u-man]  
Joined: Jul 2007
Posts: 78
cgwg Offline
Member
Originally Posted By u-man
If I understand you correctly, it is not possible to pass a result from, let's say, MAME shader 1 to MAME shader 3. Am I right?

Yes, that's right, although this would only be needed for rather sophisticated shader chains.
Originally Posted By u-man
I still think the bloom would look better if calculated at SCREEN resolution.

No, I generally don't think so. There isn't any more detail in the picture than there was in the original resolution — the only exception is if you want to blur an effect produced by an earlier shader.
Originally Posted By u-man
Maybe even with a threshold shader calculated before applying the blurs: a threshold that first filters out the bright parts of the game screen, so that the blur is only applied to the very bright sections. Something like this: https://github.com/libretro/common-shaders/blob/master/crt/shaders/glow/threshold.cg

For my own shaders, I try to have some kind of physical basis for whatever I do. What I call "halation" can be caused by two different effects: internal scattering of light in the front glass, or electrons scattering between the phosphors and the shadow mask. Both of these are fairly linear processes, I believe, and there's not much argument for a threshold effect.

The main inaccuracies in my implementation are the following:
  • The latter source of halation would be more-or-less monochrome, which I haven't implemented.
  • A Gaussian probably isn't the correct shape. I used a Gaussian for the usual reason: it's the only rotationally symmetric filter that can be separated into x and y filters.
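cgwg's separability point can be checked numerically: the 2D Gaussian weight exp(-(x*x+y*y)/(wid*wid)) factors exactly into the product of the two 1D weights, which is why an x-pass followed by a y-pass reproduces a full 2D Gaussian pass. A quick sketch, using the same radius-4, wid=2.0 kernel as the shaders earlier in the thread:

```python
import math

wid = 2.0
taps = range(-4, 5)  # radius-4 kernel, nine taps per axis

g1 = {i: math.exp(-(i * i) / (wid * wid)) for i in taps}  # 1D weights

# Every entry of the 2D kernel equals the product of the two 1D weights,
# so two 1D passes compute exactly the same weighted sum as one 2D pass.
for x in taps:
    for y in taps:
        g2 = math.exp(-(x * x + y * y) / (wid * wid))
        assert abs(g2 - g1[x] * g1[y]) < 1e-12
print("2D Gaussian == outer product of 1D Gaussians")
```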

"Bloom" effects with thresholding are often used together with HDR rendering to give the impression that some object is much brighter than the rest of the scene. By contrast, I try to be careful to avoid any clipping in my shaders, so that this would be unnecessary.
Originally Posted By u-man
I also didn't know that the SCREEN shaders only start to work with 2 MAME shaders present. I always tried it with only one plain MAME shader; that would explain why I had trouble getting the SCREEN shaders to work. I just followed the tutorial/docs, and they claimed that you need at least one MAME shader present. My fault that I never tried it with two ;) .

Well, that's clearly a bug, and including 2 MAME shaders is just a convenient workaround.
