Longest increasing subsequence

Hey, I have tried to implement a DP solution. There's a little mistake which I can't figure out. Care to help me?

    #include <iostream>
    #include <algorithm>
    using namespace std;

    int main() {
        int n;
        cin >> n;

        // 1-indexed input; sized to match dp[] so the indices stay in bounds
        int a[110];
        for (int p = 1; p <= n; p++)
            cin >> a[p];

        // dp[i] = length of the longest increasing subsequence ending at a[i]
        int dp[110];
        for (int h = 1; h <= n; h++)
            dp[h] = 1;

        for (int i = 1; i <= n; i++) {
            for (int j = 1; j < i; j++) {
                if (a[j] < a[i]) {
                    // a[i] can extend the best subsequence ending at a[j]
                    // (equivalently: dp[i] = max(dp[i], dp[j] + 1);)
                    if (dp[j] + 1 > dp[i])
                        dp[i] = dp[j] + 1;
                }
            }
        }

        // the answer is the best dp value over all ending positions
        int best = 0;
        for (int i = 1; i <= n; i++)
            if (dp[i] > best)
                best = dp[i];
        cout << best;
        return 0;
    }
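
In case it helps, here is a small sample I would expect it to handle (assuming the input format is n on the first line followed by the n values). The longest increasing subsequence of 10 22 9 33 21 50 is 10 22 33 50, so the answer should be 4.

    Input:
    6
    10 22 9 33 21 50

    Expected output:
    4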

Your code has worked on all the cases I have tried. Could you please give your test cases? I think it is fine and I cannot see any error so far.

Could you tell me what sort of problem you are getting with the code? On which cases do you feel there is a problem?

I've tried 2 examples. It's working on none of them. So I guess all cases?
