Hi everyone! :-)

Sorry for the sensationalist topic title :-)

Loooong-term MAME/Linux user here. For some reason I drifted away from regular (sdl)MAME in the last few years and switched to RetroArch and libretro-mame. I guess it can't really be considered infidelity, as it uses the same codebase anyway. I have several questions in my quest to find the best setup. I use Linux (KDE Neon).

- Input lag is extremely important to me. On a Raspberry Pi, I couldn't play because of the horrible lag (among the reasons: OpenGL on the RPi adding 1 frame of lag; triple buffering adding 1 frame; some cores adding 1 frame because they don't process input events at the right time, etc.). I thought the optimal setup was Lakka/RetroArch on a PC with an Intel GPU: KMS mode supposedly reduces lag significantly (no X server) and also allows double buffering. So between the best PC case and the regular Raspberry Pi case there is a difference of 3 or more frames (and on a fast PC you can also specify frame_delay).
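For reference, a sketch of the lag-related retroarch.cfg settings I mean — the values are just illustrative and need tuning per machine:

```ini
# retroarch.cfg -- lag-related settings (illustrative values)
video_driver = "vulkan"           # or "gl"
video_max_swapchain_images = "2"  # 2 = double buffering (Vulkan / KMS contexts)
video_frame_delay = "8"           # milliseconds to wait before polling input and
                                  # running the core; higher = less lag, needs a fast PC
video_hard_sync = "true"          # GL only: hard CPU/GPU sync to cut the driver's frame queue
```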

Nowadays, I'm using a regular distro and an Nvidia GPU. RetroArch can use KMS, but it isn't really supported on Nvidia. RetroArch can also use Vulkan instead of OpenGL, which supposedly improves input lag (and it offers an X-less context too, but that kind of segfaults on my machine; well, not all the time...).

--> Do you people have a clue whether regular (sdl)MAME has more, less, or as much input lag as RetroArch? I always feel there is some, and that it makes games harder than they should be. And they already are!! It should have more, since it supports neither KMS nor Vulkan. But does it use double buffering by default? I clearly remember the RetroArch devs saying triple buffering was always forced on Linux unless KMS was used (or Vulkan in an X-less context). So one more 16.66 ms frame of lag?
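For what it's worth, (sdl)MAME's sync behaviour isn't hard-coded; it's set in mame.ini. A sketch of the relevant lines (defaults may differ between builds):

```ini
# mame.ini -- sync-related options
video                     opengl
waitvsync                 1       # wait for vertical sync (less tearing, may add lag)
syncrefresh               0       # 1 = lock emulation speed to the monitor refresh rate
```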

- As I grow older and my eyes are in a very bad state, I can't stand looking at a raw, unfiltered picture anymore. In RetroArch, you can for instance chain several pixel shaders: xbr (or scalefx), then a CRT one. It gives a stellar result. The first shader increases the resolution and smooths edges; the second mimics an old CRT screen.

--> I used to use GLSL shaders in (sdl)MAME but couldn't find new ones (like xbr, etc.). So I figured I had to switch to the BGFX output (a cross-platform rendering library that MAME uses for its newer shader system), which lets you choose a shader. But it doesn't seem possible to chain several of them, and I'm not sure how many times the picture is scaled. (For instance, in RA I usually use xbr scaled 3 times, then apply crt-hyllian.)
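In case it helps, a sketch of the BGFX options in mame.ini — chain names depend on what ships in the bgfx/chains directory, and as I understand it a single "chain" is a JSON file there that can itself contain several shader passes:

```ini
# mame.ini -- BGFX output (illustrative)
video                     bgfx
bgfx_backend              auto      # or opengl, etc.
bgfx_screen_chains        crt-geom  # one chain per screen, comma-separated for multi-screen games
```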

Sorry for all those questions :-) I'm looking for the best way to get low input lag, double buffering, and cool pixel shaders. I'm really such a spoilt kid ;-)

Thanks to everyone, thanks again Richard !

Last edited by torturedutopian; 10/10/17 07:44 AM.