Time Complexity vs Actual Time

Say I have an algorithm which is O(n^2) with n <= 1000, so I know it runs in under 1 second.
But suppose the time limit is 2 seconds; what can I say about the maximum time complexity the algorithm can have?
In other words, what I want to ask is: for a given time limit, how can I estimate the required time complexity of the solution algorithm? Say for t = 3 sec or t = 4 sec.

At CodeChef, you can perform approximately 10^8 simple computations in 1 second, so for a time limit of n seconds you can budget roughly n * 10^8 computations.

In 1 second the number of instructions that can be performed is around 10^8 (there is no fixed value, as it varies with many conditions). The time complexity of an algorithm can give you an idea of whether it will fail or not. For example, if the algorithm is O(n^2) and the upper limit of n is 10^5, you can be sure it won't pass. But if n is around 10^4, an O(n^2) algorithm would perform approximately 10^8 instructions, and we cannot be sure whether it will pass or not.

It may fail if the algorithm's overhead is too large or the instructions are heavy, like floating-point operations. On the other hand, it can pass if most of the operations are quick, like bitwise operations or array element accesses. It also depends on the language you are using: Java and Python are slower, so they can fail in such borderline cases.
Thus I see that good problem setters generally choose constraints that make it clear whether a given algorithm will pass or fail. In the example above, if O(n^2) is meant to be accepted, then n should be around 10^3, and they can add around 10 test cases. And if O(n^2) is not the intended solution, they simply raise n to 10^5.