Calculating execution time under different server loads

I want to calculate the execution time of a program (in C) as part of my academic project. Could you please tell me how to measure a program's execution time accurately during periods of different server load?

Maybe the time command from Linux is what you are looking for… e.g. running `time ./a.out` prints the real (wall-clock), user, and sys times for the run.

See, CodeChef displays a time next to each successfully executed program. I know they calculate that time based on 4 or 5 test cases. But a server may be under different load at different times (for example, depending on the number of users, the load may be high, average, or low). My question is: in such a situation, can the same program show different times on different runs on the server? If yes, please tell me how you solve that problem.

Is load really important when you ask for CPU time rather than wall-clock time? See the %U (user CPU seconds) and %S (system CPU seconds) format parameters of GNU time.
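
To illustrate the difference (a minimal sketch, not how any judge actually times submissions): user and system CPU time can be read with getrusage(2), while wall-clock time comes from clock_gettime(2). Under heavy load the wall-clock number inflates, but user + sys stays roughly stable for a CPU-bound program. The busy loop is just a placeholder workload.

```c
#include <stdio.h>
#include <time.h>
#include <sys/resource.h>

int main(void) {
    struct timespec wall_start, wall_end;
    struct rusage usage;

    clock_gettime(CLOCK_MONOTONIC, &wall_start);

    /* Placeholder workload standing in for the program being measured. */
    volatile unsigned long sink = 0;
    for (unsigned long i = 0; i < 100000000UL; i++)
        sink += i;

    clock_gettime(CLOCK_MONOTONIC, &wall_end);
    getrusage(RUSAGE_SELF, &usage);

    double wall = (wall_end.tv_sec - wall_start.tv_sec)
                + (wall_end.tv_nsec - wall_start.tv_nsec) / 1e9;
    double user = usage.ru_utime.tv_sec + usage.ru_utime.tv_usec / 1e6;
    double sys  = usage.ru_stime.tv_sec + usage.ru_stime.tv_usec / 1e6;

    /* Wall time grows with server load; user + sys should not, much. */
    printf("wall: %.3f s  user: %.3f s  sys: %.3f s\n", wall, user, sys);
    return 0;
}
```

Run it a few times while the machine is idle and again while it is busy: the wall figure moves around, the CPU figures barely do. That is essentially why judging by CPU time is less sensitive to load.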

I would personally suggest process accounting (to keep track of the time and memory consumed by a running process):

http://linux.die.net/man/8/accton

http://linux.die.net/man/8/sa
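
For completeness, the syscall behind accton is acct(2). A minimal C sketch of turning accounting on yourself (requires root / CAP_SYS_PACCT; the pacct path below is an assumption and varies by distro):

```c
#include <stdio.h>
#include <unistd.h>   /* acct(2) */

int main(void) {
    /* Enable BSD process accounting: the kernel appends one record per
       exiting process to this file. Path is an assumption; distros differ. */
    if (acct("/var/log/account/pacct") == -1) {
        perror("acct");
        return 1;
    }
    puts("process accounting enabled; summarize later with sa(8)");
    /* acct(NULL) would turn accounting off again. */
    return 0;
}
```

Once some processes have run and exited, sa(8) summarizes the per-command CPU time and memory usage recorded in that file.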