Hi Avishay,
Secondly, the resolution that I get through gettimeofday() after conversion
appears to be microseconds. But I fear that is fake?
Is a resolution of only milliseconds what is actually correct? I read somewhere
that this has something to do with the system clock / timing issues.
What do you mean by "fake"?
This is how I was profiling:
print_time();
_profiling_code_
print_time();
I am using this function to print time-stamps :
#include <stdio.h>
#include <time.h>
#include <sys/time.h>

void print_time (void)
{
    struct timeval tv;
    struct tm *ptm;
    char time_string[40];
    long microseconds;

    /* Obtain the time of day, and convert it to a tm struct. */
    gettimeofday (&tv, NULL);
    ptm = localtime (&tv.tv_sec);

    /* Format the date and time, down to a single second. */
    strftime (time_string, sizeof (time_string), "%H:%M:%S", ptm);

    /* Extract the microseconds within the current second. */
    microseconds = tv.tv_usec;

    /* Print the formatted time, in seconds, followed by a decimal
       point and the microseconds. */
    printf ("%s.%06ld\n", time_string, microseconds);
}
By "fake" I mean that the value printed by:
printf ("%s.%06ld\n", time_string, microseconds);
shows six digits, but those digits do not necessarily reflect a true
microsecond resolution of the underlying clock.
--
Kernelnewbies: Help each other learn about the Linux kernel.
Archive: http://mail.nl.linux.org/kernelnewbies/
FAQ: http://kernelnewbies.org/faq/