I have a timer in my program that needs the highest resolution possible. I'm using System.Timers.Timer. However, even though the interval can be set in milliseconds, the effective resolution ends up being tens of milliseconds.
For example, I set it to 5 milliseconds and it's actually 10. I set it to 10 and it's 10. I set it to 14 and it's 20.
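To illustrate, here is a minimal sketch of how I'm measuring this: it just logs the gap between Elapsed callbacks with a Stopwatch (the exact numbers printed will vary by machine, but on mine the gaps snap to ~10 ms multiples):

```
using System;
using System.Diagnostics;
using System.Timers;

class TimerResolutionDemo
{
    static void Main()
    {
        var stopwatch = Stopwatch.StartNew();
        long lastMs = 0;

        // Ask for a 5 ms interval; the callbacks actually arrive
        // at roughly 10 ms granularity.
        var timer = new System.Timers.Timer(5);
        timer.Elapsed += (sender, e) =>
        {
            long nowMs = stopwatch.ElapsedMilliseconds;
            Console.WriteLine($"Gap since last tick: {nowMs - lastMs} ms");
            lastMs = nowMs;
        };
        timer.Start();

        Console.ReadLine(); // run until Enter is pressed
        timer.Stop();
    }
}
```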
Is there any way around this?