
How can I measure time with microsecond precision in Java?

Tags:

java

time

I saw on the Internet that I was supposed to use System.nanoTime(), but that doesn't work for me - it gives me the time with millisecond precision. I just need the microseconds before and after my function executes so that I know how long it takes. I'm using Windows XP.

Basically, I have this code that, for example, does 1 million up to 10 million insertions into a Java linked list. The problem is that I can't measure the time accurately; sometimes it looks as though inserting everything into the larger list takes less time than into the smaller one.

Here's an example:

import java.util.LinkedList;

class test
{
    public static void main(String args[])
    {
        for(int k=1000000; k<=10000000; k+=1000000)
        {
            System.out.println(k);
            LinkedList<Integer> aux = new LinkedList<Integer>();
            //need something here to see the start time
            for(int i=0; i<k; i++)
                aux.addFirst(10000);
            //need something here to see the end time
            //print here the difference between both times
        }
    }
}

I did this many times (there was an outer loop doing it 20 times for each k), but the results aren't good. Sometimes it takes less time to make 10 million insertions than 1 million, because I'm not getting the correct measured time with what I'm using now (System.nanoTime()).

Edit 2: Yes, I'm using the Sun JVM.

Edit 3: I may have done something wrong in the code; I'll see if changing it does what I want.

Edit 4: My mistake, it seems System.nanoTime() works. Phew.
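
For completeness, here is a minimal sketch of how the start/end placeholders in the code above could be filled in with System.nanoTime(); the division by 1000 to get microseconds and the printed label are my additions, not part of the original code:

import java.util.LinkedList;

class test
{
    public static void main(String args[])
    {
        for(int k=1000000; k<=10000000; k+=1000000)
        {
            System.out.println(k);
            LinkedList<Integer> aux = new LinkedList<Integer>();
            long start = System.nanoTime();   // start time, in nanoseconds
            for(int i=0; i<k; i++)
                aux.addFirst(10000);
            long end = System.nanoTime();     // end time, in nanoseconds
            // difference converted from nanoseconds to microseconds
            System.out.println((end - start) / 1000 + " microseconds");
        }
    }
}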


1 Answer

My guess is that since System.nanoTime() uses the "most precise available system timer", which apparently only has millisecond precision on your system, you can't get anything better.
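
If you want to see what resolution System.nanoTime() actually delivers on a particular machine, a rough sketch like the following (the class name NanoTimeResolution is just mine) samples the clock until its value changes and reports the smallest step observed:

class NanoTimeResolution
{
    public static void main(String args[])
    {
        long smallestStep = Long.MAX_VALUE;
        for (int i = 0; i < 1000; i++)
        {
            long t0 = System.nanoTime();
            long t1 = System.nanoTime();
            while (t1 == t0)              // busy-wait until the clock advances
                t1 = System.nanoTime();
            long step = t1 - t0;          // smallest increment seen this round
            if (step < smallestStep)
                smallestStep = step;
        }
        System.out.println("smallest observed step: " + smallestStep + " ns");
    }
}

On a timer that only advances once per millisecond, the smallest observed step comes out around 1,000,000 ns, which would match the behaviour described in the question.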

Zach Scrivena