I have a simple Java program, and I want to know the time difference between some set of operations. The details are not important for this question, but let us take the following scenario.
long beginTime = System.currentTimeMillis();
// Some operations. Let us assume some time-consuming work such as database operations.
long endTime = System.currentTimeMillis();
long difference = endTime - beginTime;
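For reference, here is a minimal, self-contained version of that pattern. The Thread.sleep call is only a hypothetical stand-in for the time-consuming work (e.g. the database operations):

// Minimal runnable sketch of the timing pattern above.
// Thread.sleep is just a placeholder for the real, time-consuming work.
public class TimingExample {
    public static void main(String[] args) throws InterruptedException {
        long beginTime = System.currentTimeMillis();

        // Placeholder for the time-consuming operations (e.g. database calls).
        Thread.sleep(500);

        long endTime = System.currentTimeMillis();
        long difference = endTime - beginTime;
        System.out.println("Elapsed: " + difference + " ms");
    }
}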
When the code is run on a machine, how reliable will the difference be?
Let us say the processor starts executing some instructions from my code, then switches context to another process, which runs for some time, and then comes back to execute the instructions of this Java process.
In that case the measured difference should depend on the current state of my machine, i.e. how many other processes are running and so on. So, when profiling how long some operations take, is this mechanism unreliable?
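To illustrate what I mean, here is a rough sketch (the work being timed is an arbitrary placeholder computation, not my real code): timing the same operation several times typically shows some run-to-run variation, which I assume comes from scheduling and other activity on the machine.

// Rough sketch: repeat the same measurement a few times to see run-to-run variation.
// The loop body is an arbitrary CPU-bound placeholder, not the real workload.
public class TimingVariance {
    public static void main(String[] args) {
        for (int run = 0; run < 5; run++) {
            long beginTime = System.currentTimeMillis();

            // Arbitrary placeholder work.
            long sum = 0;
            for (int i = 0; i < 10_000_000; i++) {
                sum += i;
            }

            long endTime = System.currentTimeMillis();
            System.out.println("Run " + run + ": " + (endTime - beginTime) + " ms (sum=" + sum + ")");
        }
    }
}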