I've noticed a couple of issues using GLSL. For reference, I'm running Fedora 10 with a GeForce 8600GT.
1. In certain games -gl_glsl_filter 1 shows artifacts as shown. They depend on the size of the window and are either vertical or horizontal stripes. They also interact strangely with fullscreen. In tgmj, for example, they appear when I start the game in fullscreen (1600x1200) but then disappear when I toggle fullscreen off and back on.
2. GLSL bilinear filtering only works on some games. It works on pacman and sf2, but invaders and sfiii3 are unfiltered.
Both problems occur whether I use -gl_glsl_filter 1 or -glsl_shader_mame0 shader/glsl_bilinear.
As an excuse to bump my thread, I thought I might show my attempt at a scanlines shader. Horizontally I use simple linear interpolation since I don't want to simulate a CRT's video amplifier. Vertically I give the beam a Gaussian profile. I also simulate CRT bloom by increasing the beam width for the brightest colours.
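The beam model can be sketched numerically (illustrative Python, not the shader itself; the width and bloom constants here are made up):

```python
import math

def scanline_weight(dist, brightness, base_width=0.3, bloom=0.3):
    """Gaussian beam profile: dist is the vertical distance (in scanlines)
    from the beam centre. Brighter colours widen the beam, giving the
    CRT bloom effect described above."""
    sigma = base_width * (1.0 + bloom * brightness)
    return math.exp(-dist * dist / (2.0 * sigma * sigma))

# The beam peaks at its centre, and a bright beam is wider than a dim
# one at the same distance from the centre.
assert scanline_weight(0.0, 1.0) == 1.0
assert scanline_weight(0.4, 1.0) > scanline_weight(0.4, 0.0)
```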
The aperture effect used here was designed to work with the RGB subpixel structure of LCDs, so you should see it at full size. I use Lanczos scaling horizontally and give the beam a Gaussian profile (in a linear color space). Scanline width and bloom effects can of course be adjusted.
I took one of the bilinear shaders as a starting point. Here's everything tarred up. Put aperture4x4v2.png in the appropriate place and put the shader files in shader/custom. I produced that screenshot by running this:
Okay, I finally got it working; it looks really cool!
However, the shader effect leads to a few vertical black lines (<10) over the picture. Any ideas?
Edit: Hmm, I wanted to make a screenshot, but sdlmame (Linux) locks the keyboard and the built-in screenshot feature seems to short-circuit the effects. It's not such a bad behaviour, as you don't switch to another virtual desktop while playing.
I'm aware of this issue, but after a bit of testing, I don't think it's unique to GLSL or my shader. Here's a screenshot I took without GLSL or even bilinear filtering (look at the vertical lines along the diagonal):
The glitches appear or disappear depending on the window size.
It looks like -gl_forcepow2texture helps somewhat, in that this problem is replaced by one that only affects the diagonal, and not half the screen.
On a related note, I seem to be experiencing a huge memory leak with -effect in recent versions of MAME (it looks like 0.136 is fine but 0.140 and 0.141 are not).
I think this shader illustrates the problem well, and makes it clear that the texture coordinates are computed slightly differently for the two triangles:
I wanted to play with shaders for improving the look of vector games, but after realizing that currently this isn't implemented, I decided on a different approach: work with raster games and develop shaders that should also work on vector games, if their antialiased vectors were rendered into an offscreen buffer. Then, if I'm lucky, someone more familiar with the MAME codebase will like what they see and implement it.
This work is done using "screen bitmap" shaders, which operate on buffers at the final screen output resolution. First, I use shaders/glsl_plain to render to screen resolution with nearest-neighbor sampling.

The next part was the trickiest: simulating the look of OpenGL antialiased vectors. Eventually I came up with this shader, which seems to mostly work. Without gamma correction, it is the closest approximation to antialiased line drawing; adding gamma correction eliminates the unevenness of brightness.

Next, I do a 9x9 Gaussian blur, first vertically, then horizontally. Fortunately this can be separated into two 9-tap filters instead of one 81-tap filter.

I should mention that at each stage, blending is done in a linear colour space, which requires applying gamma and inverse gamma functions at each stage to avoid the loss of precision that would result from storing linear colour in an 8-bit intermediate texture.
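The separability trick can be checked numerically (a Python sketch rather than GLSL; the radius and sigma values are assumptions):

```python
import math

def gauss1d(radius=4, sigma=2.0):
    """Normalised 9-tap 1-D Gaussian kernel (radius/sigma are assumptions)."""
    w = [math.exp(-i * i / (2 * sigma * sigma)) for i in range(-radius, radius + 1)]
    s = sum(w)
    return [x / s for x in w]

k = gauss1d()
# The 9x9 (81-tap) 2-D kernel is the outer product of two 1-D kernels,
# which is why a vertical pass followed by a horizontal pass (9 + 9 taps
# per pixel) computes exactly the same blur as the single 81-tap pass.
k2d = [[a * b for b in k] for a in k]
assert len(k) == 9 and all(len(row) == 9 for row in k2d)
assert abs(sum(sum(row) for row in k2d) - 1.0) < 1e-9
```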
There are a couple more ways of making this more realistic. First, a CRT's video amplifier has limited bandwidth, so I use a horizontal Lanczos shader as the last "mame bitmap" shader (which takes the image at emulated resolution as its input and outputs at screen resolution) to slightly smooth the pixel transitions. Together with the other shaders, this gives the combined effect. Finally, since this is in colour, I apply a slot mask overlay with -effect aperture2x4rb.
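For reference, here is what the Lanczos-2 kernel weights look like (a Python sketch of the standard kernel; the actual shader's tap count and window size are my assumptions):

```python
import math

def lanczos2(x):
    """Standard Lanczos-2 reconstruction kernel: sinc(x) * sinc(x/2) for
    |x| < 2, zero outside. The mild negative lobe is what keeps pixel
    transitions from being smoothed away entirely."""
    if x == 0.0:
        return 1.0
    if abs(x) >= 2.0:
        return 0.0
    px = math.pi * x
    return 2.0 * math.sin(px) * math.sin(px / 2.0) / (px * px)

assert lanczos2(0.0) == 1.0
assert abs(lanczos2(1.0)) < 1e-12                    # zero at integer offsets
assert lanczos2(0.5) > 0.0 and lanczos2(1.5) < 0.0   # negative lobe
```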
For a vector game, simulating limited bandwidth of the video amplifier probably couldn't be done with just shaders, although this is relatively unimportant.
Where would I submit the shaders? I'm aware of this SDLMAME shader pack from 2007, but currently MAME only has the two built-in shaders: nearest-neighbor and bilinear scaling. Is there an interest in packaging more shaders with MAME?
This set of shaders is available here. To reproduce the above, you'll also need the glsl_plain shader and the aperture2x4rb overlay, and a command line like
Also, it seems that only drivers that output in 16-bit indexed colour work. It looks like the rgb32_lut version of the glsl_plain and glsl_bilinear shaders is broken, since they expect a colortable_texture of height 1, but the LUT is instead packed into a more square texture. Here is a less broken version of glsl_plain_rgb32_lut.fsh. The rgb32_lut shaders can be used instead of the rgb32_dir shaders by setting one of the video attributes on the command line to a value other than 1 (e.g. -gamma 1.01). And it looks like the rgb32_dir shaders are compiled but never get used (this is what I was referring to in the first post of this thread when I wrote that GLSL bilinear filtering only works on some games).
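The coordinate fix amounts to indexing the LUT in two dimensions instead of assuming a single row; the idea, in Python (the texture sizes here are made up):

```python
def lut_coords(index, tex_w, tex_h):
    """Map a linear palette index to (u, v) texel-centre coordinates in a
    LUT packed into a tex_w x tex_h texture, rather than assuming a
    height-1 strip as the broken shaders do."""
    x = index % tex_w
    y = index // tex_w
    return ((x + 0.5) / tex_w, (y + 0.5) / tex_h)

# With a height-1 strip, v is always the centre of the single row...
assert lut_coords(5, 256, 1)[1] == 0.5
# ...but in a 64x4 packed LUT, index 130 lands on the third row.
assert lut_coords(130, 64, 4) == (2.5 / 64, 2.5 / 4)
```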
I hate to sound too newbish, but what's the default dir for the shaders and the filter images? I could only load them using absolute paths.
I also get these kinds of errors on load; does this mean my video card doesn't support the features? I wouldn't be surprised, as it's an onboard Intel card on this machine.
The effect overlays can go in the artwork directory. I don't know of any default directory for shaders.
As for the errors, it might be your card. Try replacing the lanczosx shader with a second glsl_plain, since lanczosx is the only one that required a #version declaration to enable more features. It might also be the shaders, since I only have an nvidia card, and nvidia's shader compiler is very forgiving. I tried to get rid of all the implicit conversions, but I may have missed something and I have no way of testing a stricter compiler.
Not all games will work with this - your best "smoke test" is to start with Gradius, the game he shows working. Some others will show no image with this method for the reasons he discussed.
One change that seems to work (for tgm2, at least) for enabling rgb32_dir shaders is to allow SDL_TEXFORMAT_RGB32 wherever SDL_TEXFORMAT_RGB32_PALETTED is allowed in drawogl.c (specifically, around lines 1770 and 2054).
That is MESS running the NES ROM tvpassfail. A more challenging test would be something that relies on artifact colours.
Y/C separation is tricky. My current code works at 4x the horizontal resolution of the NES (6x the chroma subcarrier frequency).
This is using a simple notch filter:
With a 1/4 pixel horizontal shift applied to adjacent lines, a (1 2 1) comb filter can also be done. Vertical lines are improved at the expense of everything else:
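The cancellation that makes this work can be shown with a toy model in Python (sampling the subcarrier at 4x here for simplicity, not the 6x used above; the point is only that the per-line shift puts adjacent lines 180 degrees out of phase):

```python
import math

def composite_line(luma, shift):
    """Toy composite scanline: constant luma plus a chroma subcarrier
    sampled at 4x its frequency, with a per-line sample shift."""
    return [luma + math.cos(math.pi / 2 * (i + shift)) for i in range(8)]

# A shift of half a subcarrier period inverts the chroma on adjacent
# lines, so a (1 2 1) vertical comb cancels it and recovers the luma.
a = composite_line(0.5, 0)
b = composite_line(0.5, 2)
c = composite_line(0.5, 0)
luma = [(a[i] + 2 * b[i] + c[i]) / 4 for i in range(8)]
assert all(abs(y - 0.5) < 1e-9 for y in luma)
```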
I'm sure there are many inaccuracies in what I have so far. There are also limitations to what can be done with just shaders; currently the pattern is static since I can't implement the phase shift between successive frames.
Oh, by the way, when looking for cgwg's shader filters on the web, I stumbled upon a comparison of shaders that look pretty amazing. It seems they are intended for an SNES emulator, but I wonder if there is a way to get them to work with sdlmame?
For many of the XML format GLSL shaders used by bsnes and some other emulators, converting them to run on MAME is a simple process of splitting the vertex and fragment shaders into separate files, and doing some search-and-replace to change some names of uniforms.
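As a rough illustration, the conversion is mostly mechanical renaming. The mapping below is my recollection of the names involved and should be treated as an assumption to verify against your MAME source, not a definitive table:

```python
# Hypothetical uniform-name mapping from bsnes's "ruby*" names to the
# names SDLMAME passes to external GLSL shaders -- verify these pairs
# against your MAME version before relying on them.
RENAMES = {
    "rubyTextureSize": "color_texture_pow2_sz",
    "rubyInputSize": "color_texture_sz",
    "rubyTexture": "mpass_texture",
}

def convert(source):
    # Replace longer names first so "rubyTexture" doesn't clobber
    # "rubyTextureSize".
    for old in sorted(RENAMES, key=len, reverse=True):
        source = source.replace(old, RENAMES[old])
    return source

frag = "uniform sampler2D rubyTexture; uniform vec2 rubyInputSize;"
assert convert(frag) == "uniform sampler2D mpass_texture; uniform vec2 color_texture_sz;"
```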
In this case, CRT.OpenGL began as a shader that I created for SDLMAME, then ported to run on bsnes. Now I've ported it back, with some improvements and fixes contributed by a couple of the bsnes forum members, and it is available here. I've changed some of the parameters to what I think is better, but you should be able to play with them yourself by editing any of the .fsh files and replacing the other two with the changed file. I've also removed the built-in phosphor overlay, but you can achieve the same using MAME's -effect option.
Note that there currently seems to be a problem with the texture borders, so you might want to disable the curvature feature of the shader until this is resolved.
Edit: It looks like search-and-replace GL_CLAMP with GL_CLAMP_TO_BORDER in drawogl.c does the trick, but that might be overkill.
Galaxy Game doesn't seem to be working with GLSL. From my testing, it looks like somehow the two color indices come out as 2 and 1 instead of 0 and 1. I tried another two-color indexed game (circus) and it has the expected behavior and works fine. This patch seems to fix it, although if get_black_pen is something that should be usable, then the correct fix is elsewhere.
In addition, my personal build with GL_CLAMP replaced by GL_CLAMP_TO_BORDER doesn't work with the tiny 1x2 palette texture and the built-in shaders, since those result in a divide-by-zero. This patch to the shaders should eliminate the clamping of the palette texture coordinates but should have no effect on the output from normal builds.
I applied those patches except the Galaxy Game one (because it's obviously wrong - you can't rely on the indices being in any particular order) and I noticed a few things. gradius3 shows no video at all with CRT.vsh and gokuparo works and looks great except the left and rightmost 8-12 pixels are smeared/repeated from the actual edge of the image.
ETA: Oops, managed not to compile after changing the clamp mode. Anyway, the applied patches will be in u1.
ETA2: I've heard hints of a "standard" for GLSL emulator shaders so they're interchangeable - if that happens, point me to the specs and I'd be happy to make MAME compliant.
comment on the CRT.vsh: *awesome*! it has been some time since my last attempt to use it (and I only managed to get a black screen at the time), so I am really enjoying this filter! Thanks a lot cgwg!
@Arbee: I think the format used by bsnes is becoming sort of standard (in the sense they ported it to snes9x for sure and maybe other emus)... not sure if it's the one you are referring to, though
Sorry, I forgot to mention again that I was lazy in dealing with the three color formats, and you need to use the glsl_plain shader that comes with MAME as the first pass:
My understanding is that the driver asks for a palette with 2 entries (black and white), then the palette code always adds 2 more entries (black and white, again; these are what get_black_pen and get_white_pen return), resulting in a 4-entry palette. The problem is that the last 2 entries don't end up getting passed through to the shaders.
Quote
I've heard hints of a "standard" for GLSL emulator shaders
The standard is here. Version 1.0 is limited to a single pass and was first used by bsnes, which recently switched from using XML to a custom format and no longer supports the standard. I think it's also supported by snes9x and some other emulators, but I don't know offhand which ones. Version 1.1 is listed as a "working draft" format, and it adds multipass support. It's supported by SSNES and maybe some other emulators. Some example shader files can be found here. SSNES also supports many more features, such as giving every pass access to outputs from all previous passes, access to the previous frame, and control by a Python script that also can access the emulated system's memory.
Ok, with the curvature toned down a little and LINEAR_PROCESSING enabled (my GTX570 was insulted without the extra load ;-) I get a look I rather like. Here's Apple IIgs in MESS, in a nice big window. The scanlines are a close match to my real Apple 13" RGB monitor.
Sorry, I forgot to mention again that I was lazy in dealing with the three color formats, and you need to use the glsl_plain shader that comes with MAME as the first pass:
just for confirmation (before I start fiddling with parameters): does this mean that the combination of the two shaders is expected to work fine for all games?
Yes, the combination of the two shaders works for everything I've tried. Make sure to use recent SVN to avoid weird problems at the edges where the curvature is supposed to be.
I have uploaded the scanline shader from 2009 (the one reposted on the previous page) with the fixes needed for compatibility with the latest MAME/MESS (i.e. a couple of "2 * xxx" --> "2.0 * xxx" changes due to invalid "int * float" operations, and a bunch of "pow" calls requiring a vec4 as the second argument). I have also added to the pack a couple of effect .pngs created by Aaron ages ago (scanlines.png and aperture1x3rb.png) that to my eyes are still unrivaled when dealing with arcade games.
@cgwg: if you show up and have some time, would you mind adding the distortion to this shader too, like you did for the CRT one? I might try adding it myself in the next few days, but I fear I might miss some important detail...
This was trivial to do; it's uploaded here. The old shader had the same design as the more recent one, except the scanline profile was purely gaussian, whereas the new one is not. I only had to change 3 lines in the new version of the shader.
Looking at its output, however, I think that it needs some tweaking so that the bloom effect is less harsh. Also, it looks like the maximum scanline width is too large, and the resulting scanline profile gets cut off. This can be remedied by making the scanline less wide, or adding more taps in the vertical direction.
I've uploaded a tweaked version of the shader with gaussian scanlines here.
Originally Posted by R. Belmont
Ok, with the curvature toned down a little and LINEAR_PROCESSING enabled (my GTX570 was insulted without the extra load ;-) I get a look I rather like.
Lacking easy access to a CRT for testing, I can't argue with what looks right, but I should point out that my choice to disable LINEAR_PROCESSING wasn't for performance reasons. As I wrote here:
Quote
The original shader deliberately did the horizontal Lanczos interpolation in the native nonlinear color space, as that was intended to be a crude approximation of the effects of limited bandwidth in a CRT's amplifier. On the other hand, the beam should have a width greater than one pixel in addition to its height, and the beam should be blended in a linear color space. With just one pass, it's not practical to have both effects, and it would be interesting to see a comparison of screenshots.
For your convenience, I've done just that:
The gaussian shader, without and with LINEAR_PROCESSING.
The newer, nongaussian shader, without and with LINEAR_PROCESSING.
Disabling LINEAR_PROCESSING is intended to simulate the effect commonly seen in CRTs of areas with high-frequency detail being darkened.
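The darkening effect is easy to see with just two pixels (plain Python arithmetic; gamma 2.2 is an assumed display gamma):

```python
# Averaging a black pixel and a white pixel in gamma-encoded space gives
# an encoded value of 0.5, which displays darker than the physically
# correct linear-light average of the same two pixels.
gamma = 2.2
nonlinear_avg = (0.0 + 1.0) / 2                                  # blend encoded values
linear_avg = ((0.0 ** gamma + 1.0 ** gamma) / 2) ** (1 / gamma)  # blend actual light
assert nonlinear_avg == 0.5
assert linear_avg > 0.7    # 0.5 ** (1/2.2) is about 0.73
```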
Here's a preview of the new version that should hopefully be ready soon. I've been doing some work to improve the geometry, and this should be more accurate and allow for tilting the simulated CRT.
I've uploaded the new version here. The main changes are the following:
More accurate geometry for simulating a spherically curved CRT. Now controlled by two parameters: radius of curvature and distance of viewer from screen.
Tilting of the simulated screen.
Hard border and rounded corners.
Adjustable overscan.
Moved most of the parameters to the vertex shader file. See the comments at the start of the main function. There are still a few #defines that can be toggled in the fragment shader files.
3x oversampling of the beam profile. This reduces Moiré patterning at small output sizes. It probably has the biggest impact on performance, and should be disabled if things become sluggish.
howdy from someone who only found this thread a few minutes ago. I just got back into mame and have been trying to find good glsl filters. your CRT filter does a pretty good job of making these games look authentic.
I just wanted to pop my head in and say that the new CRT-geom you just posted doesn't work for me, FWIW. I get this:
Quote
src/osd/sdl/gl_shader_tool.c:370: GL Error: object 0x15 compilation failed
src/osd/sdl/gl_shader_tool.c:370 glInfoLog:
ERROR: 0:36: '*' : wrong operand types no operation '*' exists that takes a left-hand operand of type 'const int' and a right operand of type 'varying float' (or there is no acceptable conversion)
ERROR: 0:36: '*' : wrong operand types no operation '*'
followed by a dump of the vsh.
looks like line 36 in the .vsh has an issue. if I edit it into a dummy value like "float C = 0.5", it can get past that, but then dies again with the same "wrong operand" problem on a later line. I'll readily admit though that I don't know glsl from a hole in the wall so this may be a red herring.
I'm on osx 10.6.8 with an nvidia 8600mgt, and the sdlmame 144 stable from sdlmame.parodius.com (his 144u6 link is broken and I don't want to deal with compiling from source).
Sorry, that type of issue seems to always turn up. Nvidia's GLSL compiler tends to be more permissive than others. I managed to figure out how to enable GLSL debug output in MAME, and this version, which compiles without any warnings, should hopefully work for you (I changed 2 to 2.0 in two places).
yep, fixed. which is weird, because you'd think an nvidia compiler would produce something that works on another nvidia card...
I like the new adjustable rounded corners and proper 144 border blackout. I'd faked it previously by abusing the 'effect' image options. the tilt feature is cool too, but is there an easy way to have it read an ini value rather than being hard coded in the vsh? for games with sideways screen hacks (like daioh) you have to swap the x and y values. you can get around this currently by having a separate copy for normal and sideways games, but that seems silly. don't bother if it's too much of a pain to implement though.
one thing I would really like to see is the scanline bloom effect like you put in the 'crt-old/scanline' version at the top of the page. I think that really gives the proper look. (if it's already in crt-geom I can't see it, maybe make it adjustable?)
good work though. this really ought to come bundled in the mame distros or with the source or something.
...or, the top of the previous page at this point...
while I'm thinking of scanlines, how hard would it be to implement an 'aperture1x3rb' style effect as an alternative to the simple-light-and-dark scanlines? using an effect image doesn't play nice with the screen curvature options (and tends to make the picture darker in a way that can't be compensated for easily).
if you could somehow give the choice of different tv/arcade patterns all in one package, that would be awesome.
OS X doesn't use vendor OpenGL stacks or drivers. Apple uses a unified GL stack (including an LLVM-based shader compiler) and their own drivers. So you get very consistent results across cards on OS X (hardware capabilities permitting), including what the shader compiler accepts or doesn't. And Apple's compiler is less permissive than what Nvidia ships in their Windows and binary-blob Linux drivers (as is ATI's on PCs; hence a lot of demoscene stuff only runs on Nvidia).
the tilt feature is cool too, but is there an easy way to have it read an ini value rather than being hard coded in the vsh? for games with sideways screen hacks (like daioh) you have to swap the x and y values. you can get around this currently by having a separate copy for normal and sideways games, but that seems silly. don't bother if it's too much of a pain to implement though.
I'm aware of this issue. I haven't tried it, but I think that it is possible to detect which direction is down by using the derivative functions in GLSL, although that's not ideal. The proper way of doing this (and other things) would involve integrating the shader into MAME like is done with HLSL on Windows.
Quote
one thing I would really like to see is the scanline bloom effect like you put in the 'crt-old/scanline' version at the top of the page. I think that really gives the proper look. (if it's already in crt-geom I can't see it, maybe make it adjustable?)
Ok, I've added the old-style purely gaussian scanline bloom as an option controlled by #defines. It's uploaded here, with the old-style scanlines currently enabled.
Quote
while I'm thinking of scanlines, how hard would it be to implement an 'aperture1x3rb' style effect as an alternative to the simple-light-and-dark scanlines? using an effect image doesn't play nice with the screen curvature options (and tends to make the picture darker in a way that can't be compensated for easily).
Aperture/shadow mask patterns are a difficult thing to do. I've written about this before, both in one of the HLSL threads on these forums and on the byuu.org forums, but the upshot is that most currently available monitors don't have a high enough resolution. And any pattern will always have the effect of darkening the picture. That being said, given that there might be some interest, maybe I'll try to dig up my past experiment in simulating a curved aperture grille (although I suppose that would mean that I should also implement a shader that simulates a cylindrically curved screen to go with it).
The proper way of doing this (and other things) would involve integrating the shader into MAME like is done with HLSL on Windows.
it's not a big deal, like I said you can just have two copies with different names. I don't think this is important enough to waste a heck of a lot of time on.
Quote
Ok, I've added the old-style purely gaussian scanline bloom as an option
hrm, I see that the code is in there, but I'm not seeing a difference between this and the 'fixed' geom you posted last.
upon closer inspection, I'm not sure about what exactly it is I'm seeing in the old 'scanlines' version. I thought it was taking areas of high luminosity and making the dark lines less dark (closer to native unfiltered brightness), but now that I look harder it seems it's only doing that near the center of the screen, and the outlying areas are pretty much unchanged? maybe I'm unclear what the bloom effect was supposed to look like and what I'm really seeing is a weird bug. I can post pics if you have no idea what I'm talking about.
Quote
but the upshot is that most currently available monitors don't have a high enough resolution.
I'm not sure I see the problem here. current monitors are already good enough to handle what is effectively a 2px wide light/dark scanline effect, how is a 2px wide green/magenta effect that much different?
Quote
And any pattern will always have the effect of darkening the picture.
that's not entirely true, as you can compensate for it within the shader by lightening the bright lines; that's not something I can get with a simple effect png, because it's always applied last in the chain and always multiplies down.
Really cool effect, excellent work. Using MAME 0.144 (and 0.144u6), Fedora 16, Radeon HD 5670, runs great!
Tinkered with the parameters in main(), first, can you pass parameters already, or alter main() to allow passing of parameters?
Assuming I get this, the tilt away/towards is determined by vec2(), so I altered it to vec2(-0.15,0.0) and it looks great for pacman for tilt away, but the same settings for mrdo render as a tilt towards, so I have to change it to +0.15. Is it supposed to do that?
maybe I'm unclear what the bloom effect was supposed to look like and what I'm really seeing is a weird bug.
I decided to make a few pictures. Here are some scanlines with the bloom effect turned off. The little bit of clipping in the bright parts is unintentional; I would have to adjust the normalization to eliminate it. The profile of the scanlines is gaussian, with constant width. Now with the bloom effect turned on. It's not a big effect, but the scanlines get wider when the image is brighter. Finally, the newer, nongaussian bloom effect. At the bright parts, the scanlines get a bit wider, but they also change shape and become flatter.
Quote
current monitors are already good enough to handle what is effectively a 2px wide light/dark scanline effect, how is a 2px wide green/magenta effect that much different?
Unlike scanlines, the phosphor triplets on a CRT have no direct relation to the pixels generated by video hardware. Usually, the dot pitch will be somewhat higher than the distance between scanlines.
Originally Posted by The Flying Ape
Tinkered with the parameters in main(), first, can you pass parameters already, or alter main() to allow passing of parameters?
Currently MAME passes a few parameters to external GLSL shaders. These give necessary values like the pixel dimensions of the image. Passing through more parameters would require modifying the code that interfaces with the shaders. At some point, it makes more sense to integrate a CRT shader like is done with HLSL on Windows.
Quote
Assuming I get this, the tilt away/towards is determined by vec2(), so I altered it to vec2(-0.15,0.0) and looks great for pacman for tilt away, but the same settings for mrdo renders as a tilt towards, so I have to change to +0.15 Is that supposed to do that?
Yes, my shader doesn't automatically detect which direction is down. In this case, mrdo and pacman are rotated 90 degrees in opposite directions by MAME, so you would need opposite tilt parameters for them.
ok, that makes sense. however what I was seeing was that towards the center of the screen, the bright lines bleed together into a solid lump with no dark in-between. this appears to be a bug though: it comes and goes depending on window size. mind you I have all the scaling and filtering options turned off, and if I screencap the image and open it in photoshop they're all exactly the same size to the pixel. it just happened that the default size was one where it showed up, and the game I was playing at the time had a lot of bright parts in the center, so I thought it was intentional.
Quote
Unlike scanlines, the phosphor triplets on a CRT have no direct relation to the pixels generated by video hardware.
that's only mostly true. remember that your original 15khz monitors were limited to 250-300 vertical lines because of frequency issues, and most games used pretty much all of those. the range of possible pixel-to-phosphor ratios is a lot smaller than what you'd see for a pc monitor crt running at who knows what res. I think you could fudge it. even if all you did was overlay the aperture1x3rb when playing the game at 2x or larger, that would still look halfway decent IMO. I mean, it's not possible to get a modern lcd to look exactly like an arcade monitor, I think most people would be happy as long as it's a good faith attempt along the same lines. you could always let people turn it off with a #define if they don't like it.
Quote
At some point, it makes more sense to integrate a CRT shader like is done with HLSL on Windows.
how easy would it be to get that started anyway? I don't really know the mame dev community, would they be receptive to this? crt-geom as it stands is already pretty damn good, it would be nice to make it more official.
These give necessary values like the pixel dimensions of the image.
oh, another question about this re: scanlines.
my graphics card can handle the 3x beam oversample smoothly when playing most games at 2x, but when trying to play something at like 3x it crawls. my understanding is that the oversampling was implemented to reduce moire at small sizes, so would it make sense to automatically turn it off at larger ones?
this appears to be a bug though: it comes and goes depending on window size. mind you I have all the scaling and filtering options turned off, and if I screencap the image and open it in photoshop they're all exactly the same size to the pixel. it just happened that the default size was one where it showed up, and the game I was playing at the time had a lot of bright parts in the center, so I thought it was intentional.
If you can post a screenshot of the bug, it might be helpful.
Quote
I think you could fudge it. even if all you did was overlay the aperture1x3rb when playing the game at 2x or larger, that would still look halfway decent IMO. I mean, it's not possible to get a modern lcd to look exactly like an arcade monitor, I think most people would be happy as long as it's a good faith attempt along the same lines. you could always let people turn it off with a #define if they don't like it.
I'm not opposed to trying something more sophisticated when I get around to it, but at the moment my recommendation is to use one of the -effect overlays, and turn up your monitor's brightness if the result is too dark. I've written a bit more about shadow mask simulation here.
Quote
Quote
At some point, it makes more sense to integrate a CRT shader like is done with HLSL on Windows.
how easy would it be to get that started anyway? I don't really know the mame dev community, would they be receptive to this? crt-geom as it stands is already pretty damn good, it would be nice to make it more official.
My understanding is that they are reasonably receptive to external submissions, however some of the more interesting benefits (interlaced video modes, NTSC/PAL simulation) would require modifications to the MAME/MESS core and drivers in order to implement properly, and that would be too daunting a task for me to consider.
Quote
my graphics card can handle the 3x beam oversample smoothly when playing most games at 2x, but when trying to play something at like 3x it crawls. my understanding is that the oversampling was implemented to reduce moire at small sizes, so would it make sense to automatically turn it off at larger ones?
Flow control (if statements, etc) is best avoided in shaders, so my preference is to leave this controlled by a #ifdef.
If you can post a screenshot of the bug, it might be helpful.
note that these are both the same dimensions. in theory when I resize my window all that should change is the amount of black space surrounding the image, but obviously that's not what's actually happening. here's a diff:
the reason I though that this was intentional at first is that Daioh has a number of images with bright centers:
also: these screenshots were taken while using CRT-old-20111120. if I play around I can sorta get the same problem with the latest CRT-geom, but it's nowhere near as pronounced and I doubt I would've noticed had I not been looking for it.
anyway, the take-home on this one is that what I thought was a deliberate bloom effect is really a bug, so you can get rid of that gaussian #define you added.
Your screenshots show images scaled by a factor of about 2, which is lower than the shader is designed for. Scaling by a factor of 2 means that scanlines are at the maximum spatial frequency of the image, and if no special care is taken, my shader can easily produce patches where the scanlines effect is reduced or nonexistent.
Your screenshots show images scaled by a factor of about 2, which is lower than the shader is designed for.
I don't really have a choice there. daioh is 384px tall, you can't run that at 3x+ even at 1080p, and I'm not about to dump ~$300 on a sideways mount or a new monitor.
I totally understand if you're going to get some moire effects or something at certain sizes due to the curvature, that's almost a given. what's at issue here is that the particular pattern changes even when the overall size of the image does not. in theory, the amount of black space around the image area shouldn't have any effect on how the distortion manifests.
this is all sorta off topic though. I'm not dictating you need to fix this, I mean I thought I liked the effect before I realized it was a bug. in any event, it's a lot less pronounced in the latest CRT-geom anyway, so this isn't really a big deal.
my recommendation is to use one of the -effect overlays
like I said though, they don't play nice with screen curvature
also: it occurred to me that I'm not being completely accurate when I say this. what I really mean is that effect overlays don't play nice with screen curvature when there are scanlines. in thinking about it more, maybe a quick and dirty option would be to make the scanlines an option via a #define? i.e., let people pick and choose from all the features equally. the aperture pngs already kinda create a sort of scanline effect anyway.
I don't really have a choice there. daioh is 384px tall, you can't run that at 3x+ even at 1080p, and I'm not about to dump ~$300 on a sideways mount or a new monitor.
If you use the correct aspect ratio, then at 1080p it will scale to 810 pixels wide, which is a scale factor of 3.375, and should produce reasonably good results, particularly with the oversampling enabled.
Originally Posted by quartz
effect overlays don't play nice with screen curvature when there are scanlines.
What do you mean by this? What sort of problem occurs?
If you use the correct aspect ratio, then at 1080p it will scale to 810 pixels wide, which is a scale factor of 3.375, and should produce reasonably good results, particularly with the oversampling enabled.
uhh..... either you're not understanding me, or I'm not understanding you.
daioh is 384px tall. 384 x 3 = 1152. a 1080p res screen is exactly 1080 pixels tall. thus, you can't run a 384px game at 3x at normal rotation. the width doesn't really factor in here.
you can run the game if you rotate it 90 degrees, but that assumes you can also rotate your monitor to compensate.
I'm not really sure what you're talking about. normally rotated on a 1080p screen you can't scale past 2.8125, but rotated you can go up to 4.5, so I dunno where the 810/3.375 is coming from or what that has to do with anything.
I don't even have a 1080p screen, I was just using that as an example. most consumer monitors these days max out at 1080p, so unless you have a high end professional screen or a normal one that can rotate, complaining that people aren't running at 3x is kinda unreasonable.
Originally Posted by cgwg
Originally Posted by quartz
effect overlays don't play nice with screen curvature when there are scanlines.
What do you mean by this? What sort of problem occurs?
when the curved dark scanlines overlay on top of the straight colored lines in the aperture png, you get ugly rainbow moire effects. this isn't your or anyone's fault, it just goes with the territory. if you let people turn off the scanlines but keep the screen curvature, they can still use straight-line effect pngs without it looking that ugly. I dunno, maybe I'm the only one who would use this.
If the image is 1080 pixels tall, then with the correct 3:4 aspect ratio, it will be 810 pixels wide. Since for a vertical game the scanlines are vertical, it is the horizontal scale factor which is important. The daioh image is 240 pixels wide, giving a scale factor of 3.375.
Quote
when the curved dark scanlines overlay on top of the straight colored lines in the aperture png, you get ugly rainbow moire effects. this isn't your or anyone's fault, it just goes with the territory. if you let people turn off the scanlines but keep the screen curvature, they can still use straight-line effect pngs without it looking that ugly. I dunno, maybe I'm the only one who would use this.
I don't have that big of a problem with how the overlay effects look with my shader currently, but if you want to disable the scanlines and replace them with linear interpolation, replace the contents of the scanlineWeights function in the fragment shader files with the following line:
Replace the number 2.0 with whatever you like; it acts similarly to using -prescale, and helps to make the linear interpolation look not terrible (1.0 is ordinary linear interpolation). I don't plan to include this in any version that I distribute, though.
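The referenced replacement line isn't preserved in this copy of the thread. A weight function along these lines (a hypothetical reconstruction, not cgwg's actual code) behaves as described, with the 2.0 factor exposed:

```glsl
// Hypothetical stand-in for the body of scanlineWeights():
// sharpened linear interpolation instead of the Gaussian beam profile.
// 'distance' is the offset from the scanline centre in scanline units.
vec4 scanlineWeights(float distance, vec4 color)
{
    // With a factor of 1.0 this reduces to ordinary linear weighting
    // (1 - distance); larger factors narrow the interpolation window,
    // similar to -prescale. Adjacent-line weights still sum to 1.
    return vec4(clamp(0.5 + 2.0 * (0.5 - abs(distance)), 0.0, 1.0));
}
```

At a factor of 2.0, a sample within a quarter-scanline of a source row takes that row's colour outright, and only the middle half of each gap is blended.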
ok, there's still a communication gap here, and I'm pretty sure it's on your end. for the third time, daioh is 384px tall and 240px wide. 1080 / 384 = 2.8125. you can't run this game at 3.375x on a 1080p res screen unless you rotate or crop. do the freaking math.
additionally, 240x384 is 5:8. 3:4 is most definitely NOT the correct aspect ratio.
Quote
I don't have that big of a problem with how the overlay effects look with my shader currently,
if your screen is large enough, or your games small enough that you're always running at like 4x, then maybe it doesn't show up for you, but it looks pretty bad on my screen.
Quote
but if you want to disable the scanlines and replace them with linear interpolation, replace the contents of the scanlineWeights function in the fragment shader files with the following line:
cool, thanks, I'll play with that and see if the tradeoff is worth it.
Quote
I don't plan to include this in any version that I distribute, though.
ok, there's still a communication gap here, and I'm pretty sure it's on your end. for the third time, daioh is 384px tall and 240px wide. 1080 / 384 = 2.8125. you can't run this game at 3.375x on a 1080p res screen unless you rotate or crop. do the freaking math.
additionally, 240x384 is 5:8. 3:4 is most definitely NOT the correct aspect ratio.
Ugh, not this misconception again.
The display aspect ratio of an arcade game (or console, for that matter) has nothing to do with the number of pixels. If it was meant to display on a standard CRT, it's 4:3, period. Chun Li is not supposed to be fat, etc.
If it was meant to display on a standard CRT, it's 4:3, period.
sorry, but you're wrong. daioh used what is today a standard aspect ratio of 16:10 (5:8) in a portrait 'tallscreen' monitor with 1:1 aspect ratio pixels. don't tell me I don't know what I'm talking about, the pizza place down the street from me had one of these things up until last year and I used to play it all the time.
look, play the game yourself if you don't believe me. if you run it in mame at 3:4 all the graphics look obviously squished- the ship is a pancake, all the planets are ovals, etc.
I don't know what kind of monitor the cabinet in your pizza place had, but I can guarantee that in 1993 no one was making 16:10 arcade monitors, or drawing graphics meant to be displayed on them.
Anyway, you're wrong about the graphics looking "squished" at 3:4. Look at the large stars with haloes in the intro sequence, on the screen with the giant robot rearing up behind the planet. At 1:1 pixel aspect ratio, the haloes are obviously oval. Only when stretched to 3:4 do they become circular:
Sorry, no. There simply is no such thing as a 16:10 CRT, and games made for JAMMA cabinets were required to work with a standard 4:3 or 3:4 CRT. It looks super-tall because you aren't used to seeing TVs sideways, but it really isn't. I played a 27" Strikers 1945 III that I was convinced was 16:9 for a while but then popped the back and the monitor was a dead stock 4:3 Hantarex. If we can track down someone with a Daioh machine I'm sure they'll be able to pop the back and say the same thing.
widescreen CRTs have been around for a while, just because you've never seen one before doesn't mean they don't exist.
Originally Posted by R. Belmont
If we can track down someone with a Daioh machine I'm sure they'll be able to pop the back and say the same thing.
yeah, the problem with that is that they're pretty rare, from what I understand.
look guys, this argument is retarded. how's about you call me nuts, I call you nuts, and we just move on. cgwg has made an awesome plugin and I don't want to derail this thread with some stupid flamewar about aspect ratios.
Yeah, and those both came out after 2000. Find me one from 1993, let alone an arcade monitor chassis from that era that used one. (The only thing close was the special-order super-high-dollar 16:9 version of Virtua Racing, and that was medium-res so it wouldn't work with Daioh anyway).
Look, it's ok to admit you personally think the game looks better at the wrong aspect. There are people out there who swear by 400-pound Chun Li (caused by playing SF2 in pixel-aspect at 16:9 to "fill the screen"). It's just that we'll eventually prove you're wrong because JAMMA specifies 4:3 or 3:4, period.
I dunno, maybe I'm wrong. here are two better quality images:
honestly that looks taller to me than 3:4, but not as tall as 10:16. this cabinet is noticeably different from the one that was at my pizza place though, so I dunno.
as a side note, I really wish google didn't keep removing features from their search. trying to find pictures of this game is hell because I can't do "-azumanga" anymore.
cgwg: have you ever thought about adding vignette or blur/focus filters?
Can you elaborate on how vignetting would be appropriate for a CRT filter?
At one point, I did make a shader that simulates worse beam focus at the corners than at the center of the screen. One of the limitations of the present design, though, is that the simulated beam has height but no width, so the resulting blurring is only vertical.
ok wait, it just occurred to me that 'vignetting' has multiple definitions. what I'm talking about is having the image lose luminance slightly towards the edges/corners
Yes, and that's precisely the effect that makes no sense in a CRT context. It is *not* an effect that happens to older CRTs; CRTs die when the gun itself is no longer capable of shooting, and that makes the entire screen dimmer at once.
I've uploaded the new version here. The main changes are the following:
More accurate geometry for simulating a spherically curved CRT. Now controlled by two parameters: radius of curvature and distance of viewer from screen.
Tilting of the simulated screen.
Hard border and rounded corners.
Adjustable overscan.
Moved most of the parameters to the vertex shader file. See the comments at the start of the main function. There are still a few #defines that can be toggled in the fragment shader files.
3x oversampling of the beam profile. Reduces moiré patterning at small output sizes. This has probably the biggest impact on performance, and should be disabled if things become sluggish.
I'm having trouble using the new shader with the settings I used on the old one. I use a (very) high resolution PC CRT monitor for my setup, so I already have natural curvature and rounded corners; I'm just interested in the scanlines and related effects. On the old shader it was easy to disable the curvature effects, since the value that controlled it could be set to zero and be done with it. On this one, however, it's not clear to me how it's done. Any clues?
Hello to everyone on this thread. This is my first post here and I registered to see if I can please get some help with getting GLSL filters to work with SDLMAME running on Mac OSX 10.7. I launch SDLMAME through the frontend MacMameInfoX. I have downloaded the CRT pack, which gives me 4 files: 3 .fsh files and 1 .vsh file. Do I need to use startup command switches in MacMameInfoX, or do I need to edit a MAME.INI file? Thanks in advance.
Thanks for the reply etaba78. OK, I have edited the MAME.INI file and put the CRT folder in all of my MAME directories, and this seems to have finally worked for the CRT file. Some games do work with this, so I get the curved screen effect now, but I also get this error on startup: "cannot open shader_file: ./src/osd/sdl/shader/glsl_plain.vsh OpenGL: GLSL loading mame bitmap shader 0 failed (./src/osd/sdl/shader/glsl_plain)". Thanks.
I believe MacMAMEinfoX sets the working directory to your "~/Documents/MacMAME User Data/Misc Support Files" - copy the shader files into that directory, then change your ini file to say:
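The original ini snippet isn't preserved here, but the relevant entries look something like the following (shader names are examples; point them at the files from the CRT pack, written without the .fsh/.vsh extension):

```
# mame.ini fragment (hypothetical names/paths -- adjust to your setup)
video              opengl
gl_glsl            1
glsl_shader_mame0  glsl_plain
glsl_shader_mame1  CRT-geom_rgb32_dir
```

The key detail is that glsl_plain must run as pass 0 and the CRT shader as a later pass, which matches the "use glsl_plain as the first pass" advice elsewhere in this thread.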
Thx for your help - really appreciate it. I haven't added the glsl_plain files to SDLMAME, but I assume you get them from http://mamedev.org/source/src/osd/sdl/shader/index.html and then just save them as .vsh and .fsh files. Will look into this later.
Anyone know why the CRT shader does not seem to work with some games? Example: on Kung-Fu Master, in the attract sequence the girl's dress is black when it should be red. This happens when running SDLMAME 0144u6-x86_64 or any of the 0145 builds. It does work correctly with earlier builds (0144u1, say, or any of the earlier ones), BUT those produce a strange 'bleeding' effect around the edge of the screen where the curvature should be. Thanks.
I can reproduce that behavior using the old version of the glsl_plain shader. Newer versions of the MAME source have an updated version of glsl_plain. In the MAME source tree, the built-in shaders are in src/osd/sdl/shader.
I have just realised that mame ships some shaders in src/osd/sdl/shader and that my RPM Fusion package does not install them now. Which files do I need to include?
It's that time of the year again ;-) I'm afraid I lost, again, the pixel shaders scanlines (+ mask) that were provided here. All links are broken now, unfortunately !
Also, I would love to hear from you, about the combination of masks / filters you currently use :-)
Sorry for the extreme delay; I've been at various times busy or lazy. I've reuploaded two files. This is the older version from 2009 that I think you're asking for, and here is the most recent version from a year ago.
I've made a couple shaders for simulating the look of RGB LCD screens (but not panel response time, which can't be done with the current shader plugin setup).
The first one was previously posted on the bsnes forums; it's heavily based on the Pixellate shader, which comes with bsnes and is like a better nearest-neighbor filter. This modified version is the equivalent RGB "nearest-subpixel" filter. This shader only really works at >3x horizontal scaling, and can be pretty sensitive to the output display gamma; it's fine on my IPS display, but artifacts show up if I look at this on my TN laptop panel from the wrong angle. See here for a screenshot. It can also be pretty harsh to look at, which led me to make the next shader.
The second shader is based on the same idea, but blurs (and overlaps) the subpixels horizontally. I find the result easier on the eyes, and it's less sensitive to display gamma. Here's a screenshot.
Both of these should be run in the same way as the CRT shaders, i.e. use glsl_plain as the first pass. I've tested them with both Nvidia and Intel graphics.
On a slightly-related note: I've noticed strange behavior when using Xfce on Fedora 18: when I resize the MAME window, the contents aren't scaled with it (i.e. they get cropped). I don't see this behavior when using Gnome 3. I guess it's probably an SDL or Xfwm bug.
Hey cgwg just wanted to say your work on the CRT emulation shaders is great. What tools do you use to work on these? Do you have an emulator that you prefer? do you work in some other shader builder/design tool?
I just use a text editor. In terms of plug-in shader support, most emulator implementations are equivalent for simple single-pass shaders. For more sophisticated things, RetroArch has probably the most features, but sufficiently-complicated designs will eventually be best served by being more integrated into the emulator, as is done for HLSL in MAME.
For instance, SDLMAME's plug-in GLSL shader support doesn't give access to previous frames, which makes simulating slow LCD response impossible. This could be done using a shader for RetroArch, but it would be best to measure the physical characteristics of the LCDs that came with various handheld consoles, and have the emulator automatically specify the correct parameters.
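In an environment that does expose the previous output frame (uniform names here are hypothetical, RetroArch-style, and this is a sketch rather than a measured model), slow LCD response is essentially a per-pixel exponential mix:

```glsl
// Hypothetical single-pass LCD-response sketch. PrevTexture is the
// previous *output* frame -- exactly what SDLMAME's GLSL hooks don't
// provide, which is the limitation described above.
uniform sampler2D Texture;      // current emulated frame
uniform sampler2D PrevTexture;  // previous shader output

varying vec2 texCoord;

void main()
{
    vec3 cur  = texture2D(Texture, texCoord).rgb;
    vec3 prev = texture2D(PrevTexture, texCoord).rgb;
    // 'response' in (0,1]: 1.0 = instant pixels, smaller = more ghosting.
    // A real implementation would fit this per-channel from measurements
    // of the actual handheld panel.
    float response = 0.35;
    gl_FragColor = vec4(mix(prev, cur, response), 1.0);
}
```

Because each output frame feeds back into the next, the single mix factor produces an exponential decay toward the current image, which is a crude first approximation of panel response.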
Old thread but maybe someone's still kicking around. I'm running SDLMAME and GLSL shader pack (http://www.mameau.com/mame-glsl-shaders-setup/). I've messed around with CRT-geom.vsh over the past year, generally slowly removing curvature.
But now I'd like no curvature at all, and "simulated distance" to also be effectively turned off. Right now I've got:
Code
// gamma of simulated CRT
CRTgamma = 2.0;
// gamma of display monitor (typically 2.2 is correct)
monitorgamma = 2.2;
// overscan (e.g. 1.02 for 2% overscan)
overscan = vec2(1.00,1.00);
// aspect ratio
aspect = vec2(1.0, 0.75);
// lengths are measured in units of (approximately) the width of the monitor
// simulated distance from viewer to monitor
d = 2.0;
// radius of curvature
R = 10.0;
// tilt angle in radians
// (behavior might be a bit wrong if both components are nonzero)
const vec2 angle = vec2(0.0, 0.0);
// size of curved corners
cornersize = 0.001;
// border smoothness parameter
// decrease if borders are too aliased
cornersmooth = 1000.0;
I'm not sure those values are right to turn both of those off -- suggestions? And maybe as an outcome of this, here's a screenshot where you can see some vertical bands warping inward.
So while the curvature of the image in general seems normal and straight, there is something going on, perhaps with the scanlines. Zooming to 100% shows this, and zooming waaaay in shows some scanline oddness.
I notice that if I change cornersmooth = 1000.0; to cornersmooth = 10000.0; it gets rid of the curved scanlines, but you still have the straight scanlines showing.
Code
// START of parameters
// gamma of simulated CRT
CRTgamma = 2.4;
// gamma of display monitor (typically 2.2 is correct)
monitorgamma = 2.2;
// overscan (e.g. 1.02 for 2% overscan)
overscan = vec2(1.0,1.0);
// aspect ratio
aspect = vec2(1.0,0.75);
// lengths are measured in units of (approximately) the width of the monitor
// simulated distance from viewer to monitor
d = 2.0;
// radius of curvature
R = 10.5;
// tilt angle in radians
// (behavior might be a bit wrong if both components are nonzero)
const vec2 angle = vec2(0.0,0.0);
// size of curved corners
cornersize = 0.001;
// border smoothness parameter
// decrease if borders are too aliased
cornersmooth = 10000.0;
// END of parameters
That didn't fix it (and honestly not sure why it would've since that's a corner attribute). If you look closely at my screenshot, at 100%, you'll see grey vertical bands bending inwards. I suppose it could be a moire-type pattern, some byproduct of the resolution.
This only happens when my iMac is in mirror mode (to display on a plasma nearby), which puts them both in 1080p resolution. When back to this iMac's native resolution (2560x1440) this doesn't happen.
I've tried several other resolutions but this is the best case scenario, with many games not having this tripling up or whatever of scanlines.
As I understand it, mirror mode is actually still rendering at the native resolution and then scaling it to 1080P. Or maybe it was the other way around. Anyway, that's probably where the artifacting comes from. You'll need to wait for SDL2 builds to become standard so you can make the external display separate and run it at actual 1080P.
As I understand it, mirror mode is actually still rendering at the native resolution and then scaling it to 1080P. Or maybe it was the other way around. Anyway, that's probably where the artifacting comes from. You'll need to wait for SDL2 builds to become standard so you can make the external display separate and run it at actual 1080P.
That has to be it. I remember trying to send it to a second display and never having luck, didn't know this was a known issue with the current version. How far away is SDL2, anyone know?
There are some additional controls in the fragment shader file, CRT-geom_rgb32_dir.fsh. In particular, commenting out the "#define CURVATURE" line will disable the curvature entirely.
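For illustration, the top of the fragment shader looks roughly like this (the exact set of #defines may differ between versions of the shader):

```glsl
// Near the top of CRT-geom_rgb32_dir.fsh (illustrative sketch):
//#define CURVATURE    // commented out: render a completely flat screen
#define OVERSAMPLE     // 3x beam oversampling; remove if performance drops
```

With CURVATURE undefined, the vertex-shader parameters for radius, distance, and tilt no longer matter, so there is no need to hunt for magic "flat" values.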
The second problem is a kind of moiré interference pattern. The horizontal scale factor is close to, but not exactly, 3 (and because there's still a bit of curvature in the screenshot, the scale factor varies across the image). As a result, sometimes the scanline will sit exactly on a row of pixels, and sometimes it will sit between two rows. Thus, it appears as either a single bright pixel or two side-by-side half-bright pixels. In order to make the overall brightness the same in both cases, the shader needs to know your screen's gamma. If, when you defocus your eyes so that the scanlines are invisible, you can still see the interference pattern, then it could help to try adjusting the 'monitorgamma' setting to reduce visibility of the pattern.
You can also eliminate the interference pattern by turning off curvature and scaling by an integer factor in the horizontal (or vertical) direction for vertical (or horizontal) systems.
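To put a number on the brightness mismatch: a scanline landing exactly on one output row drives a single pixel at framebuffer value 1, while one split across two rows drives two pixels at 1/2 each. On a display with gamma \(\gamma\), the total emitted light differs unless the shader pre-corrects:

```latex
\underbrace{1^{\gamma}}_{\text{beam on one row}} = 1
\qquad\text{vs.}\qquad
\underbrace{2\cdot\left(\tfrac{1}{2}\right)^{\gamma}}_{\text{split across two rows}}
= 2^{\,1-\gamma} \approx 0.44 \quad (\gamma = 2.2)
```

So with an uncorrected gamma the between-row positions render at less than half the brightness of the on-row positions, which is what makes the interference bands visible even when the scanlines themselves are defocused away.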
I found a workaround for the graphics issue. There is a glitch.
Turn up the brightness to fix the graphics. It's weird, but it works. I don't know why I didn't think of this before. I will add this trick to my website.
Could you post your shaders again? I can't find a working link.
I'd like to port my shader to glsl, but I'd like to see a working example first. I'm very confused by this idx16, rgb32_lut, rgb32_dir business. Why should the shader have to deal with that?
Hi! It's that time of the year again. Just checking what's new!
I still use this CRT shader. Apart from the extra customization (and the fact the curvature is incorrect for vertical games), is there any benefit in using CRT-geom instead ?
Do you guys have new improved versions to share ?
I'm very happy to see the shader support was fixed in the latest MAME
BTW, for those interested, I stumbled upon those patches you may not know : http://www.systempixel.fr/extra/ They work on Linux too. (hiscore support, Neo Geo overclocking, "no buffer" version to reduce input latency, unlicensed Neo Geo, no nag to remove warnings)
If auto save state doesn't work in a game, please report it at MAMETesters. We've got an automated framework for testing save states now (0.155 should have the cleanest save states ever seen in an emulator) but it's not infallible.
Oh, thanks for the new shader! I first tried the one specified at your link, which was extremely slow on my laptop (Celeron + old Nvidia proprietary driver). Then I tried the one specified as "updated for Intel drivers" and it's super smooth and looks pretty cool (although I'm not that sure; it looks pretty strange, with vertical bars visible during horizontal scrolling).
The picture is pretty strange though. Looks like cardboard and has strange vertical bars even with noevenstretch disabled. I'll investigate some more
Edit : with MASK_APERTURE_GRILL, the "vertical bars" do not appear anymore...
Edit 2 : except with vertical games apparently which look very bad (Gunbird).
I'll keep checking once in a while. Very interesting!
Edit 3 : great news about savestates, thanks! And thanks again for all your great contributions!
The aperture mask patterns all should be pixel-for-pixel aligned to your window. So you shouldn't see any scaling artifacts from those at all (because no scaling is done). Unless the shader is doing something wrong... Is your resolution at 1080P or above? Below that is dicey. You might turn off curvature to see if that helps any.
As far as vertical games go, you need a lot of vertical resolution to avoid artifacts, because the scanline width gets too close to the width of the pixels on your LCD. Even 1920x1440 isn't quite enough. 4K will fix that right up for you. Or else you can rotate your monitor if it supports portrait mode.
But if you think something is rendering incorrectly, go ahead and post a screenshot, and I'll see if I can figure it out.
My LCD resolution is clearly at fault here. This is a 1366x768 laptop, although I'll eventually use MAME in my cab, which has a 17" 1280x1024 monitor. I'll let you know. Thanks for taking the time to reply to a simple and annoying user!
Just one question : to reduce this kind of artifacts, do you advise to enable or disable the unevenstretch option ? I'm never too sure about this one.
I have one last question: is it possible to make MAME fall back to another shader if it detects a vertical game? The only solution I see is to use AdvanceMenu with 2 separate configurations pointing to 2 drawers for horizontal or vertical games...
Yeah, vertical games look much better with CRT-geom instead of Lottes_CRT.
In addition to horizontal.ini and vertical.ini, you can do the same with entire drivers as well. It looks like MK1 and MK2 look better with CRT-geom and I'm unsure if it applies to all games in the driver at this point, but creating midtunit.ini and pointing it to CRT-geom works
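As a sketch of that setup (paths are examples; the option name is the same glsl_shader_mame0 used in mame.ini):

```
# vertical.ini -- read automatically for vertically oriented games
glsl_shader_mame0  /path/to/shader/CRT-geom

# midtunit.ini -- read for any game on the Midway T-Unit driver
glsl_shader_mame0  /path/to/shader/CRT-geom
```

Since MAME layers these ini files on top of the main mame.ini, only the options you want to override need to appear in them.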
Hmm, getting pretty technical indeed, but that's interesting It's really subjective too, which makes the choices even more complicated.
Oh about MK2/MK3, sorry for this off topic, but is it possible to amplify the sound ? Its level is so much lower than with other drivers... Couldn't do it in the sliders in MAME)
The picture is pretty strange though. Looks like cardboard and has strange vertical bars even with noevenstretch disabled. I'll investigate some more
Edit : with MASK_APERTURE_GRILL, the "vertical bars" do not appear anymore...
Edit 2 : except with vertical games apparently which look very bad (Gunbird).
I'll keep checking once in a while. Very interesting!
Edit 3 : great news about savestates, thanks! And thanks again for all your great contributions!
The aperture mask patterns all should be pixel-for-pixel aligned to your window. So you shouldn't see any scaling artifacts from those at all (because no scaling is done). Unless the shader is doing something wrong... Is your resolution at 1080P or above? Below that is dicey. You might turn off curvature to see if that helps any.
As far as vertical games go, you need a lot of vertical resolution to avoid artifacts, because the scanline width gets too close to the width of the pixels on your LCD. Even 1920x1440 isn't quite enough. 4K will fix that right up for you. Or else you can rotate your monitor if it supports portrait mode.
But if you think something is rendering incorrectly, go ahead and post a screenshot, and I'll see if I can figure it out.
Yeah. What I posted here is complete BS. The shadow mask patterns just look bad when rotated. The solution is to rotate the patterns if the screen is going to be rotated. So either we'll need two versions of the shader plus a switch inside to set horz/vert, or the shader can guess which it should be based on resolution (it might not always guess correctly...)
The new hotness in shaders is this one from Nvidia's Timothy Lottes (it works on AMD cards too ;-)
It's a fine shader, but I've never quite understood the "new hotness" aspect to it; the design is fairly similar to what I first did five years ago, plus some overlays that can be more-or-less replicated using the "-effect" option that's been around for ages.
It's a fine shader, but I've never quite understood the "new hotness" aspect to it; the design is fairly similar to what I first did five years ago, plus some overlays that can be more-or-less replicated using the "-effect" option that's been around for ages.
Have you tried using an overlay with "-effect"? This png is fairly similar to the shadow mask from the first Timothy Lottes shader.
I agree with this. Better effect PNGs are all that's really needed. But I prefer to have the shadow mask effect in the shader so that the resulting loss of brightness can be corrected to preference there. Otherwise you (well, I do..) end up messing with a bunch of different overlay PNGs in photoshop.
Also, "new hotness" is a bit of an exaggeration since only ~20 people downloaded the newer version of it!
Anyway, just for fun, here is an effect png with GLSL disabled. The scanlines and shadow mask come from cropping an all-white area from a rendered screenshot of the CRT shader. It only works for this exact vertical resolution of course, but you can see what can be done with a clever overlay. If you disable GLSL, run at 1080p, and use the PNG, it looks pretty good:
I only tried to get the shader working because I wanted to see it in MAME/MESS. And I posted it because Windows folks don't really know what the GLSL system can do. I like the CRT-geom shader just fine.
Just getting back to MAME fun lately after a long break. (I still have copies of MacMoon and Multi Pac archived for old time's sake, for anyone who remembers.)
Anyhow, for the past week I've been attempting to create an account on MAMEWorld's message board *specifically* to download the "Lottes_CRT" shader. Unfortunately, their email server must be having issues, as I can't get it to send the necessary verification email, nor a "password reminder" for an ancient account. (Either the server times out with a 504, or no email ever arrives.)
So, could someone please either point me to an alternate download source for this shader? Or could someone possibly email a copy to me?
I love CGWG's CRT-Geom, and have been having fun tweaking it, but would also like to experiment with the Lotte's shader.
Thanks in advance, and also THANK YOU to R. Belmont for keeping MAME available to Mac OS... the switch to SDL2 is great, BTW... finally Punch-Out! on TWO screens!
Just getting back to MAME fun lately after a long break. (I still have copies of MacMoon and Multi Pac archived for old time's sake, for anyone who remembers.)
Before my time (and I am still not a Mac user) but I did help compile CAESAR's massive list of archaic emulators back in the day. MacMoon 0.86 has been preserved here: http://caesar.logiqx.com/php/emulator.php?id=macmoon
Of course, CAESAR's been limping along the last half-dozen years or so as it is. I'm sure there's other download sites out there, looks like the author's site for MacMultiPac is still online at the same address even.
Before my time (and I am still not a Mac user) but I did help compile CAESAR's massive list of archaic emulators back in the day...
I've upped Lottes_CRT shader here for now...
Nice, thank you VERY much Stiletto! The Lottes shader actually performs better than I expected; have been having fun further tweaking the settings as well, and it looks great on the projector. Thanks to SoltanGris42 as well for making it work with SDLMAME!
The CAESAR site is really cool too... I had forgotten about Williams/Midway's original retail emulators for Defender/Robotron/Joust. Creating a nostalgia *for* nostalgia feedback-loop... didn't Zappa say something about "Death by Nostalgia"? :P
The CAESAR site is really cool too... I had forgotten about Williams/Midway's original retail emulators for Defender/Robotron/Joust. Creating a nostalgia *for* nostalgia feedback-loop... didn't Zappa say something about "Death by Nostalgia"? :P
Yep, but it's the direction we're headed. I've seen 18-21-year-olds logging into emulation forums recently and asking about "the old days of emulation", it's pretty weird. Anyhow, that's getting off-topic.
Glad you like Lottes shader. Eventually what will happen though is HLSL and GLSL will merge and you'll get all of those bells and whistles from Windows MAME in SDLMAME. Gonna be a little bit before we get there, though.