Hello,
I am attempting to time some algorithms that I am testing, but I can't seem to measure with enough precision (as you can probably guess, I keep getting run times of 0).
The method that I am using is as follows:
#include <time.h>

clock_t start, finish;
float runningTime;

start = clock();
// call the function to be timed
finish = clock();
runningTime = (float)(finish - start) / CLOCKS_PER_SEC;
How do I get a more precise measurement? I have tried several variations of the method shown above without any real success. Most of the timing functions I have looked up only deal with obtaining the date or the time of day, not with timing small algorithms. Is there a function like clock(), or some variant of it, that reports time in microseconds or nanoseconds?
Any help will be appreciated.
Thanks,
ErnestH