Redx (OP) - Junior Member - Joined: Mar 2006 - Posts: 17
I sent this email just now, but if I'd known there was an active Nestopia forum, I would have just posted here in the first place :duh:

Just to say it's not my intention to spam.


Anyway, here it is:


When vsync is enabled, it causes a slight but noticeable lag in input response. This only happens when vsync is active. I even did a "blind test" to make sure I wasn't just imagining things :p

Also, the triple buffering option doesn't seem to work (screen tearing still occurs).

Some people have mentioned that vsync tends to cause more input lag than triple buffering. I don't know how accurate that is, but it's what I've heard. So I thought this input lag problem might be solved if triple buffering worked in Nestopia.

Here's a discussion about it on the ZSNES forum:

http://board.zsnes.com/phpBB2/viewtopic.php?t=4926&postdays=0&postorder=asc&start=150




Thanks for your time, and thanks for this amazing emulator.

Senior Member - Joined: Jan 2005 - Posts: 154
What we need is a test ROM or emulator code that somehow measures all three latencies: input, video, and audio. This would put an end to the "it feels laggy" kinds of reports, which are difficult to verify because they are on the threshold of perception.

I'm thinking of something along the lines of a regular visual/aural beat alongside one controlled by a button on the joypad, where the player synchronizes with the regular beat. Subconsciously we measure and adjust to any latency in order to get the beats in sync (a basic feedback loop), which the emulator/test ROM could then measure. This would still only give you input+video and input+audio latencies, so I'm not sure how input could be separated out.
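
For illustration, the measurement side of that feedback loop could look something like this (a minimal Python sketch with invented timestamps; a real test ROM would of course do this in 6502 code):

Code:
PERIOD = 0.5  # metronome period in seconds (120 beats per minute)

def estimate_latency(press_times):
    # Mean signed offset of each press from its nearest beat. The player
    # compensates for whatever delay the chain adds, so the presses as
    # sensed by the program land late by roughly that delay.
    offsets = [t - round(t / PERIOD) * PERIOD for t in press_times]
    return sum(offsets) / len(offsets)

# Invented run: the player is perfectly in time with what they see and
# hear, but the input+video chain delays their presses by about 30 ms.
beats = [i * PERIOD for i in range(20)]
presses = [b + 0.030 for b in beats]
print("estimated input+video latency: %.1f ms"
      % (estimate_latency(presses) * 1000))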

Redx (OP) - Junior Member - Joined: Mar 2006 - Posts: 17
Like I mentioned on the other board, I think it would be awesome to have an objective way (meaning: not done by a person) to measure these things. Like you said, it would certainly put an end to subjective reports.

However, I do think the "blind test" I did was quite solid. (Copy-pasting what I wrote:)

Quote:
Btw Blackmyst, I did a self-administered blind test :lol:

I went into the timing configuration screen where the vsync option is and, without looking at the screen of course, repeatedly clicked the mouse on the vsync option and then exited, so that I didn't know whether vsync was on or off when I was done. I even varied my click "rhythm" so that I wouldn't always make an even or odd number of clicks. And I tested this in parts of a game (Zelda 2) without scrolling, so there was no screen tearing when vsync was off (that would of course have been a dead giveaway).
The actual "test" consisted of simply pressing down and noticing how quickly Link ducked.

Aaaaaanyway... I tested this more than twenty times and guessed right 100% of the time. So I'm convinced this is not some "auto-suggestive placebo" mumbo jumbo...

Again, to anyone saying they don't notice a lag: this lag is no more than a frame (1/60 s) or two at most.
That's a lot of words for "I think that test was pretty objective", but an external means of measurement would of course be even better.

The test ROM is not a bad idea, but for it to be of any use, I think it would need a "control" environment. For example, you do the test with vsync on and then the same test with vsync off.

But if you just run it with nothing to compare against, it wouldn't be very useful. People would just adjust their timing until they got it "right", and that would defeat the whole test. Plus you'd have to control for different hardware setups, and basically the same subjective human bias would be present.

Redx (OP) - Junior Member - Joined: Mar 2006 - Posts: 17
I just got an idea for measuring input lag, but it would require an external program/module working in conjunction with the emu.

The external program/module/DLL would record the exact time at which an input is pressed. It would also record the exact time at which a display change occurs (for example, Mario jumping or Link ducking).

It would then compare the times at which the two events occurred. Of course, the same ROM and the same location in the game would have to be used each time.

If someone could manage to code such a small utility, it would be quite awesome...
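
A rough sketch of the timestamp-and-compare logic (Python; both hooks are hypothetical, since a real version would have to intercept the emulator's input and video paths somehow, e.g. from an injected DLL):

Code:
import time

press_time = None

def on_input_pressed():
    # Hypothetical hook: called the moment the button press reaches
    # the host.
    global press_time
    press_time = time.perf_counter()

def on_display_changed():
    # Hypothetical hook: called when the watched screen region changes
    # (Link ducking).
    if press_time is not None:
        print("input-to-display lag: %.1f ms"
              % ((time.perf_counter() - press_time) * 1000))

# Faked run: press, then the display reacts about two frames (33 ms) later.
on_input_pressed()
time.sleep(0.033)
on_display_changed()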

Junior Member - Joined: Jan 2006 - Posts: 40
Windows, Linux, and OS X are not hard real-time operating systems, so there is certainly going to be latency, and there is no hard bound on it. Any test would only show that such-and-such latency was observed on that particular system. Many aspects of a computer can affect the measured latency.

USB alone has latency due to the polling rate of HID devices, which is typically 125 Hz, i.e. one poll every 8 ms. So up to 8 ms can pass between you pressing a button on the controller and your OS's device driver reading the data. Then there is the latency of the OS getting that data to the emulator. Roll in process scheduling, which again is not hard real-time, and the synchronization between the display device and the current state of the emulator (say, the logical NES cycle it is executing)... and you end up with a very large range of possible end-to-end latencies.
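
A rough worst-case budget, with illustrative numbers (the OS scheduling term especially has no hard bound at all):

Code:
usb_poll_ms   = 8.0    # 125 Hz HID polling: a press can sit up to 8 ms
os_to_app_ms  = 1.0    # driver -> message queue -> emulator (pure guess)
frame_wait_ms = 16.7   # press may just miss the frame's input poll
vsync_ms      = 16.7   # finished frame waits up to one refresh to display

total = usb_poll_ms + os_to_app_ms + frame_wait_ms + vsync_ms
print("worst case: ~%.0f ms, about %.1f frames at 60 Hz"
      % (total, total / 16.7))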

A real NES, of course, has none of these issues. I don't want to discourage anybody, since there are many people here smarter and more creative than me, but I'd say that input, video, and audio latency is going to be the most difficult thing to perfect with regard to emulation accuracy. Much of the problem is outside the emulator's control; hardware and OS limitations may prevent perfect timing accuracy.

It would be great if the entire PC system (CPU, video, audio, and HID) could be forced into lockstep synchronization so as to match up exactly with a NES, but is that even possible with common PC hardware?

Redx (OP) - Junior Member - Joined: Mar 2006 - Posts: 17
All valid points.

But what I was saying is that there is definitely more latency when vsync is enabled, and that's something that's probably within the program's control. I'm still curious whether the problem would persist if triple buffering worked. Like I said, I've read that it doesn't cause as much lag as vsync.


To be honest, I personally can't perceive any lag when vsync is off, even though there has to be some for the reasons you gave above, even if it's only a few ms...

I still think the idea of an external program/module to measure latency (to the best of the PC's ability, at least) would be very useful in many situations, such as comparing the response times of different NES emus, or checking whether there's any difference in response when vsync/triple buffering is on or off, etc.

Such a method could never account for the lag caused by the USB gamepad and similar factors, but it would be a good start.

Junior Member - Joined: Jan 2006 - Posts: 40
Quote:
Originally posted by Redx:
I just got an idea for measuring input lag, but it would require an external program/module working in conjunction with the emu.

The external program/module/DLL would record the exact time at which an input is pressed. It would also record the exact time at which a display change occurs (for example, Mario jumping or Link ducking).

It would then compare the times at which the two events occurred. Of course, the same ROM and the same location in the game would have to be used each time.

If someone could manage to code such a small utility, it would be quite awesome...
I don't think the measuring device can run on the PC itself, because the latency is the time interval from a physical human pressing a real-world button all the way to the audio and video changing in response to that press. Even on a real NES there will be some delay. The goal should be to get the delay as close as possible to the delay of a real NES.

Maybe somebody who is good with electronics could string together something that plugs into the video, audio, and USB ports of a PC? But even that wouldn't account for the latency introduced by the actual devices hooked into those ports. How small is the delay from a video signal entering a CRT to that signal being converted into photons? Similarly, what is the delay between pressing a button on a gamepad and the gamepad turning that press into a USB packet?

It seems as though only blargg's method will measure the entire system for latency.

Junior Member - Joined: Jan 2006 - Posts: 40
What is your display device's refresh rate? What is the polling rate of your USB gamepad? There is a Windows USB driver hack that lets you increase the polling rate from 125 Hz to 1000 Hz. Have you tried that, as well as increasing the refresh rate of your display device? The maximum delay added by vsync is bounded by one refresh period of your display device.
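
Those bounds work out as follows; in the worst case, an event lands just after a poll or refresh and waits one full period:

Code:
for name, hz in [("USB polling, stock", 125),
                 ("USB polling, hacked", 1000),
                 ("vsync at 60 Hz", 60),
                 ("vsync at 120 Hz", 120)]:
    print("%20s: up to %.1f ms" % (name, 1000.0 / hz))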

Redx (OP) - Junior Member - Joined: Mar 2006 - Posts: 17
Quote:
What is your display device's refresh rate?
Right now it's 100 Hz, but when I'm using Nestopia: 60 Hz. Anything else just isn't as smooth... and yes, that includes multiples (i.e. 100 Hz for PAL, 120 Hz for NTSC).

Quote:
as well as increasing the refresh rate of your display device? The maximum delay added by vsync is bounded by one refresh period of your display device.
I do believe using 120 Hz WOULD fix the issue. But like I said, 120 Hz just isn't as smooth as matching the original refresh rate, so it wouldn't be a perfect solution. I'd rather just turn off vsync.

Quote:
There is a Windows USB driver hack that lets you increase the polling rate from 125 Hz to 1000 Hz
Wow, I didn't know that... I'll check it out.

Senior Member - Joined: Jan 2005 - Posts: 154
Quote:
But I'd say that input, video, and audio latency is going to be the most difficult thing to perfect with regard to emulation accuracy. Much of the problem is outside the emulator's control; hardware and OS limitations may prevent perfect timing accuracy.
Agreed, unfortunately. I see no way around the latency issue on a modern PC. It's only getting worse, as each new software layer adds latency. Throughput keeps increasing, but latency is not something that more horsepower can improve much. Maybe the ultimate emulator will eventually be a mini PC on a card, with its own CPU, joypad input, and video output.

Finally, an advantage to running an ancient operating system (Mac OS Classic in my case) that lacks pre-emptive multitasking and allows a process to hog the CPU! Hmmm, maybe real-time or customized versions of Linux would be best for emulation.

Quote:
I don't think the measuring device can run on the PC itself, because the latency is the time interval from a physical human pressing a real-world button all the way to the audio and video changing in response to that press.
Exactly. To measure it really accurately you'd need something like a high-speed camera.

I think my original idea might not work, but here's one that definitely would: a reflex tester. The NES ROM changes the image or makes a sound, and you press the button when you notice it. Run the test multiple times to get an accurate result, which is the sum of your reaction time and the latency. Run the test with various emulator configurations and you can find the relative latencies. Run the test on your NES and you can compare the real thing with the emulator. Hard data. I'll have to get on this tomorrow.
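
The arithmetic on the results side would be simple; with invented numbers:

Code:
from statistics import mean

# Each list is measured reaction time = human reaction + system latency,
# in ms (invented data). The human term cancels when you subtract the
# means, leaving the relative latency between configurations.
real_nes  = [242, 251, 238, 247, 244]
vsync_on  = [271, 280, 268, 277, 274]
vsync_off = [255, 262, 250, 259, 254]

base = mean(real_nes)
print("vsync on : +%.0f ms vs real NES" % (mean(vsync_on) - base))
print("vsync off: +%.0f ms vs real NES" % (mean(vsync_off) - base))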

As for vsync, it would be best for the emulator to wait for vsync before polling input and emulating the frame. If it does those things first, the time between them and the next vsync becomes added latency. If it's up to the emulator to choose when to refresh under Windows, then the emulator can take things one step further and wait until the monitor beam hits the top of the window rather than the usual top of the entire screen. It would of course have to assume the usual top-to-bottom scanning pattern. I've implemented this kind of beam-position calculation on my Mac and it works. It requires an accurate timestamp for when the host monitor's vsync occurs.
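
A rough sketch of that loop ordering and the beam-position estimate (Python with stubbed host calls; getting the vsync timestamp itself is platform-specific and not shown):

Code:
import time

REFRESH_PERIOD = 1 / 60.0   # seconds per refresh
TOTAL_LINES    = 525        # scanlines per refresh (NTSC-style timing)

def beam_line(last_vsync_time):
    # Estimate which scanline the beam is on right now, assuming a
    # constant-rate top-to-bottom scan and an accurate vsync timestamp.
    elapsed = (time.perf_counter() - last_vsync_time) % REFRESH_PERIOD
    return int(elapsed / REFRESH_PERIOD * TOTAL_LINES)

def poll_input():    pass   # stub: read the joypad here
def emulate_frame(): pass   # stub: run one NES frame here
def present_frame(): pass   # stub: blit/flip here

def run_frame(last_vsync_time, window_top_line):
    # Wait until the beam reaches the top of our *window*, not the top
    # of the whole screen, so input is polled as late as possible...
    while beam_line(last_vsync_time) < window_top_line:
        pass
    poll_input()       # ...then read input,
    emulate_frame()    # emulate the frame,
    present_frame()    # and hand it to the display.

run_frame(time.perf_counter(), window_top_line=100)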
