[hatari-devel] STE sound breakage with lower sound frequencies

Nicolas Pomarède npomarede at corp.free.fr
Wed Feb 9 10:46:29 CET 2011


On 09/02/2011 at 00:35, David Savinkoff wrote:

> All of these ntp and time changes are dirty enough to force a rethink.
> Maybe we should use SDL exclusively, as it is cleaner and will 'not' be
> less accurate averaged over time. Furthermore, main.c would not depend
> on #include <time.h>

Hello,

SDL uses either gettimeofday or clock_gettime on Unix-like systems 
(changing the system time while SDL is running will confuse it if 
clock_gettime was not available at compile time), so I don't see the 
difference, except that SDL "truncates" all times to milliseconds 
instead of keeping the micro/nanosecond value.
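
For example, this minimal (untested) sketch shows the difference; it 
assumes a POSIX system with CLOCK_MONOTONIC and links against SDL:

/* Compare SDL's millisecond tick counter with the OS clock:
 * SDL_GetTicks() returns whole milliseconds, while clock_gettime()
 * keeps the nanosecond value. */
#include <stdio.h>
#include <time.h>
#include <SDL.h>

int main(int argc, char *argv[])
{
    struct timespec ts;

    SDL_Init(SDL_INIT_TIMER);

    /* Millisecond-truncated time, as SDL reports it */
    printf("SDL_GetTicks : %u ms\n", (unsigned)SDL_GetTicks());

    /* Full nanosecond value from the OS */
    if (clock_gettime(CLOCK_MONOTONIC, &ts) == 0)
        printf("clock_gettime: %ld s + %ld ns\n",
               (long)ts.tv_sec, (long)ts.tv_nsec);

    SDL_Quit();
    return 0;
}
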
What do you mean by "less accurate averaged over time"?

> SDL at 10 ms is just as accurate as usleep in microseconds; it is the
> precision that is 10 ms. To time-average 60 Hz VBLs, one only needs to
> have a 10 ms delay 1/3 of the time and a 20 ms delay 2/3 of the time.
> e.g.
> VBL(1) = 10 ms
> VBL(2) = 20 ms
> VBL(3) = 20 ms
> Time-average the above over 3 VBLs and you have 16.66... ms.
> This time averaging has been taking place since Thomas added the code.
> In light of this, SDL at 1 ms is a luxury.

10 ms is really a very rough granularity for averaging out to 16.666 ms. 
In that regard, I prefer sleeping 17 ms + 17 ms + 16 ms, which also 
gives 16.66 ms over 3 VBLs.

So yes, 10 ms precision can be used to average out to 16.66 ms, but it 
will do so with much more jitter (= standard deviation) than 1 ms 
precision.
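
To make the jitter concrete, here is a small standalone sketch (nothing 
Hatari-specific) computing the mean and standard deviation of both 
schedules; both average 16.67 ms, but the 10 ms one has ten times the 
jitter:

#include <stdio.h>
#include <math.h>

/* Print mean and (population) standard deviation of n delays */
static void stats(const char *name, const double *d, int n)
{
    double mean = 0.0, var = 0.0;
    int i;

    for (i = 0; i < n; i++)
        mean += d[i];
    mean /= n;

    for (i = 0; i < n; i++)
        var += (d[i] - mean) * (d[i] - mean);
    var /= n;

    printf("%s: mean = %.2f ms, std dev = %.2f ms\n",
           name, mean, sqrt(var));
}

int main(void)
{
    const double coarse[] = { 10.0, 20.0, 20.0 };  /* 10 ms precision */
    const double fine[]   = { 17.0, 17.0, 16.0 };  /*  1 ms precision */

    stats("10 ms schedule", coarse, 3);  /* ~4.71 ms std dev */
    stats(" 1 ms schedule", fine, 3);    /* ~0.47 ms std dev */
    return 0;
}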

My point is: just use as much precision as the OS provides (within a 
reasonable amount of code); it won't hurt anyone and it will benefit 
some cases.
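
Something like this sketch is what I have in mind (HAVE_NANOSLEEP here 
stands for whatever macro the build system's configure check would 
define, and Delay_Micro is a made-up name; I'm not quoting actual 
Hatari code):

#include <SDL.h>
#ifdef HAVE_NANOSLEEP
#include <time.h>
#endif

/* Sleep for 'us' microseconds as precisely as the platform allows */
void Delay_Micro(long us)
{
#ifdef HAVE_NANOSLEEP
    /* Nanosecond-capable path: keep the full precision */
    struct timespec ts;
    ts.tv_sec  = us / 1000000L;
    ts.tv_nsec = (us % 1000000L) * 1000L;
    nanosleep(&ts, NULL);
#else
    /* Fallback: SDL_Delay rounds to milliseconds, less precise
     * but available everywhere SDL runs */
    SDL_Delay((Uint32)((us + 500L) / 1000L));
#endif
}
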


> I don't believe your eyes will be bothered, especially if xorg
> (or equivalent) is doing its job (buffering). Video is the
> only thing that is affected; sound is not.

Yes, of course a lot of things happen behind SDL to copy the buffer to 
the video screen, wait (or not) for a vsync, ...

Today you don't see the difference because most LCD monitors run at 
70/80 Hz, not 50 or 60, so Hatari emulating 50/60 Hz video will not 
look smooth anyway.

But if you connect your PC to an old CRT capable of doing exactly 60 Hz, 
then sleeping 10+20+20 ms instead of 17+17+16 ms will be really 
noticeable. If using nanosleep, you can even get a 16.667 ms sleep, 
which means Hatari's video should be really synchronized with the CRT 
monitor at 60 Hz (this is what hardcore emulation fans do with MAME: 
they prefer CRTs because they can output at the same video frequency 
as the original arcade machine).
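
To keep the rounding from accumulating over many frames, the sleep 
could even target an absolute deadline; again only a sketch with 
made-up function names, assuming POSIX clock_nanosleep is available:

#define _POSIX_C_SOURCE 200112L  /* for clock_nanosleep */
#include <time.h>

#define VBL_NS 16666667L   /* one 60 Hz frame, in nanoseconds */

static struct timespec next_vbl;

/* Record the starting point for the frame deadlines */
void VBL_WaitInit(void)
{
    clock_gettime(CLOCK_MONOTONIC, &next_vbl);
}

/* Sleep until the next absolute frame deadline: oversleeping one
 * frame shortens the next wait instead of accumulating drift */
void VBL_Wait(void)
{
    next_vbl.tv_nsec += VBL_NS;
    if (next_vbl.tv_nsec >= 1000000000L) {
        next_vbl.tv_nsec -= 1000000000L;
        next_vbl.tv_sec++;
    }
    clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next_vbl, NULL);
}
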


Regards

Nicolas


