I timed the grandfather clock program running under Bally BASIC and AstroBASIC on real hardware, and I also timed the AstroBASIC version running under MAME. I used a digital stopwatch, so these estimates are probably accurate to within one second. Here are the results:
1) Real Hardware - Bally BASIC: 1 minute on the clock = 1 minute of real time.
2) Real Hardware - AstroBASIC: 1 minute on the clock = 38 seconds of real time.
3) Emulation - AstroBASIC: 1 minute on the clock = 55 seconds of real time.
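To put those stopwatch numbers in perspective, here is a rough Python sketch (just the measured values above hard-coded, nothing Bally-specific) that converts each measurement into a speed factor relative to a true 60-second minute:

# Quick speed-ratio calculation from the stopwatch measurements above.
# Values are "real seconds elapsed per one clock-minute of the program."
measurements = {
    "Real hardware - Bally BASIC": 60,
    "Real hardware - AstroBASIC": 38,
    "MAME 0.207 - AstroBASIC": 55,
}

for setup, real_seconds in measurements.items():
    # How fast the interpreter drives the clock compared to a true 60-second minute.
    print(f"{setup}: {60 / real_seconds:.2f}x nominal speed")

# Emulated AstroBASIC compared to AstroBASIC on real hardware:
print(f"MAME runs this BASIC program at about {38 / 55:.0%} of real-hardware speed")

By that math, AstroBASIC on real hardware runs the clock about 1.58x fast, under MAME it runs about 1.09x fast, and so the emulated interpreter gets through roughly 69% as much BASIC work per real second as the real hardware does.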
For emulation, I used both the regular, unaltered version of MAME 0.207 and the version that was modified for tape support. I also ran a machine language cartridge called "Goldfish Demo." This program has a clock/timer built into it. I set the time and then timed it to see if it was accurate, and it seems to run about right:
Real Hardware - Machine Language Program (Goldfish Demo): 1 minute on the timer = 1 minute of real time.
Emulation - Machine Language Program (Goldfish Demo): 1 minute on the timer = 1 minute of real time.
If anyone is curious and would like to try Goldfish Demo for themselves, it can be downloaded from BallyAlley.com. A clickable link will probably fail because of the brackets in the filename, so here is the URL as plain text:
http://www.ballyalley.com/emulation/cart_images/cart_images/Goldfish%20Demo%20(1982)(The%20Bit%20Fiddlers)[Cart%20Version].zip
This leaves me wondering why "AstroBASIC" running a BASIC program under emulation runs slower than on real hardware. Can anyone help explain why this might happen when another cartridge ("Goldfish Demo") runs at full speed?
Adam