Disjoint Set Union Codeforces EDU

#include <bits/stdc++.h>

#define ll long long int
#define pb push_back
#define vi vector<ll>
#define vb vector<bool>
#define vd vector<double>
#define vc vector<char>
#define vii vector<vi>
#define mp make_pair
#define vpi vector< pair<ll,ll> >
#define take_input freopen("input.txt", "r", stdin)
#define give_output freopen("output.txt", "w", stdout)
#define fastIO ios_base::sync_with_stdio(false);cin.tie(NULL);cout.tie(NULL)
#define fi first
#define se second
#define mod 1000000007
#define min_pql priority_queue< ll, vector<ll>, greater<ll> >

using namespace std;
using namespace std::chrono;

vi par(200001, 0);
vi rnk(200001, 0);
vi val(200001, 0);
vii teams(200001);

void make_set(int n){
	for(int i=1; i<=n; i++){
		par[i] = i;
		rnk[i] = 1;
		teams[i] = {i};	// every player starts as the only member of their own team
	}
}

int find_set(int a){
	if(a == par[a]) return a;
	return par[a] = find_set(par[a]);	// path compression
}

void union_set(int a, int b){
	a = find_set(a);
	b = find_set(b);
	if(a != b){
		if(rnk[a] > rnk[b]){
			par[b] = a;
			for(ll i:teams[b]){
				teams[a].pb(i);	// move b's members into a's list
			}
			teams[b].clear();	// b is no longer a leader
			rnk[a] += rnk[b];
		}else{
			par[a] = b;
			for(ll i:teams[a]){
				teams[b].pb(i);	// move a's members into b's list
			}
			teams[a].clear();	// a is no longer a leader
			rnk[b] += rnk[a];
		}
	}
}
int main(){
    // take_input;
    // give_output;
	fastIO;
	int n, m; cin >> n >> m;
	make_set(n);
	for(int i=0; i<m; i++){
		string s; cin >> s;
		if(s == "add"){
			int a,b; cin >> a >> b;
			a = find_set(a);
			for(ll i:teams[a]){	// O(team size) per add -- this loop is the bottleneck
				val[i] += b;
			}
		}else if(s == "join"){
			int a,b; cin >> a >> b;
			union_set(a, b);
		}else if(s == "get"){
			int a; cin >> a;
			cout << val[a] << '\n';
		}
	}
	return 0;
}
I used a 2D array named teams which holds all the members of the DSU corresponding to each leader. But I am getting TLE. What would be an optimized approach?

I don’t see anything else that consumes time. Try optimising this part.

Yeah I know but what approach to follow?

I think you’re not clear with the concept.

I just saw the problem statement. You’re supposed to add b points to all team members of the team to which player a belongs. Can we do it more efficiently? :thinking:
Try using another array that keeps track of experience points. I am not sure whether I’d run into a bug with this approach; I should probably try solving the problem myself.

Yeah, that is what I am doing… I am adding b to each member’s points, but iterating over all the members to update their values is what’s giving me TLE.

Can @cubefreak777 help?
Even I am not able to come up with a proper approach to solve this.

For some reason I’m being redirected to the EDU page. Can you share the link to the problem, or tell me how to access it?

Okay, so we can solve this problem by manipulating the usual disjoint set data structure.


  • F(X) returns the root of the component in which X is present.
  • A[X] (X is the root of its component) is the list of all nodes present in the component of X.
  • D[X] (X is the root of its component) stores the total pending value that is yet to be propagated to all elements of the component.
  • R[X] is the total experience value that has already been propagated to node X.

So we’ll look at get and add operations first as they’re comparatively easier to understand.

Get Operation:

  • It’s pretty clear that the experience value of X is the value already propagated to X plus the pending value that is yet to be propagated to all nodes in the component containing X.
  • \therefore get(X)=R[X]+D[F(X)]

Add Operation:

  • To add a given value V to all nodes in the component of X, we’ll be a little clever: instead of adding V to every node in the component, we increase D[F(X)] by V, i.e. we increase the pending value at the root of X by V.
  • add(X,V) \implies D[F(X)]=D[F(X)]+V

Join Operation \implies join(X,Y):

  • Firstly, we transfer all elements of the list A[X] to A[Y] (in the implementation the smaller list is always merged into the larger one, i.e. small-to-large merging).
  • Pass on the pending value of the component of X to every element x in A[X], i.e. R[x] += D[F(X)].
  • Also subtract D[F(Y)] from R[x] for each such x: the get operation adds the pending value at the (new) root, so without this correction D[F(Y)] would be counted twice for the transferred nodes.
  • Do the rest of the standard DSU bookkeeping as it is.

Time Complexity analysis:

  • Add complexity: \mathcal{O}(\log N)
  • Query complexity: \mathcal{O}(\log N)
  • Join complexity: \mathcal{O}(N\log N)

Hence a naive analysis might give the impression that this solution should TLE, since the worst case of a single join is \mathcal{O}(N\log N) and there are Q queries, suggesting \mathcal{O}(Q\times N\log N) overall. But that’s not the case, thanks to the following observation.
There can be at most N-1 effective join operations over all queries, and whenever a node’s entry is moved during a join (small-to-large), the size of the component it belongs to at least doubles; hence each particular node is moved at most \mathcal{O}(\log N) times, and the total work across all join-type queries is \mathcal{O}(N\log^2 N).
Hence the correct complexity of solving the entire problem (counting the contributions of the query types separately) is \mathcal{O}(N\log^2 N + Q\log N), which fits in the bounds pretty well.

#include <bits/stdc++.h>
using namespace std ;
struct dsu{
  int n ;
  vector<int> p, s, del, ans ;
  vector<vector<int>>adj ;
  dsu(int _n):n(_n){
    p.resize(n) ;  s.assign(n,1) ;
    del.assign(n,0) ; ans.assign(n,0) ;
    iota(p.begin(),p.end(),0) ;
    adj = vector<vector<int>>(n) ;
    for(int i=0;i<n;i++)
      adj[i].push_back(i) ;
  }
  int find(int x){
    return (x==p[x]?x:p[x]=find(p[x])) ;
  }
  int qry(int A){
    return ans[A]+del[find(A)] ;
  }
  void upd(int A,int x){
    del[find(A)]+=x ;
  }
  bool join(int a,int b){
    a=find(a),b=find(b) ;
    if(a==b)
      return 0 ;
    if(s[a]<s[b])          // small-to-large: always merge the smaller list into the larger
      swap(a,b) ;
    for(int x:adj[b])
      ans[x]+=del[b]-del[a] ,adj[a].push_back(x) ;
    adj[b].clear() ;
    del[b]=0 ;
    p[b]=a ; s[a]+=s[b] ;
    return 1 ;
  }
} ;
signed main(){
  int n,q ;
  cin >> n >> q ;
  dsu D(n) ;
  while(q--){
    int A,B ;
    string qt ;
    cin >> qt ;
    if(qt=="join"){
      cin >> A >> B ;--A ;--B ;
      D.join(A,B) ;
    }else if(qt=="add"){
      cin >> A >> B ;--A ;
      D.upd(A,B) ;
    }else{
      cin >> A ;--A ;
      cout << D.qry(A) << '\n' ;
    }
  }
}

Do let me know if something isn’t clear.


It redirects to the EDU page because the link given is just Courses - Codeforces.
Think before posting

Thanks for enlightening me.

Not everyone is as smart as you.


Thank you!! Helped me identify my mistakes, nice explanation.