Max of a[j] - a[i] where j >= i + l

I have to find the maximum of a[j] - a[i] in an array, where j >= i + l, in linear time. How do I do it? My approach is to iterate j from l+1 to n and keep track of the minimum element among the valid candidates for i. Is that a correct way to do it?

When j = l+1 we check only the 1st element, for j = l+2 the first 2 elements, and so on.

Is this approach right?

To maximize a[j] - a[i] (consider first the simpler case j >= i): for each j, the best choice of i is the one with the minimum value at or before position j, so you only need to maintain that running minimum and update the best answer as you scan:

int mmin = a[0];                  // minimum of a[0..j]
int ans = 0;                      // 0 is always achievable since j == i is allowed
for (int j = 0; j < n; ++j) {     // n = number of elements in a
    mmin = min(a[j], mmin);
    ans = max(ans, a[j] - mmin);  // best pair ending at j
}
return ans;
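For the original constraint j >= i + l, the same idea still works: just delay the running minimum so it only ever sees elements that are at least l positions behind j. A minimal self-contained sketch (the function name maxGapDiff and the INT_MIN sentinel for "no valid pair" are my own choices):

```cpp
#include <algorithm>
#include <climits>
#include <vector>

// Maximum of a[j] - a[i] subject to j >= i + l, in O(n) time.
// Returns INT_MIN if no valid pair exists (i.e. a.size() <= l).
int maxGapDiff(const std::vector<int>& a, int l) {
    int n = (int)a.size();
    if (n <= l) return INT_MIN;
    int mmin = a[0];     // minimum of a[0..j-l], the valid candidates for i
    int ans = INT_MIN;
    for (int j = l; j < n; ++j) {
        mmin = std::min(mmin, a[j - l]);  // a[j-l] just became a valid i
        ans = std::max(ans, a[j] - mmin);
    }
    return ans;
}
```

Note that ans starts at INT_MIN rather than 0 here, because with l >= 1 the pair j == i is no longer allowed and the best achievable difference may be negative.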

You can find a few more solutions here.