STRANGENIM - Editorial

PROBLEM LINK:

Practice
Contest: Division 1
Contest: Division 2
Contest: Division 3
Contest: Division 4

Author:
Tester: sushil2006
Editorialist: iceknight1093

DIFFICULTY:

Easy

PREREQUISITES:

Nim, Grundy Numbers, segment/fenwick trees

PROBLEM:

Alice and Bob play a modified version of nim. On their turn, a player chooses an index i such that A_i \gt 0, and removes a number of stones from the pile equal to any digit of A_i (when written in decimal).

You’re given an array A. Process Q queries on it:

  1. Given L, R, find the number of subsequences of [A_L, A_{L+1}, \ldots, A_R] such that Alice wins the resulting game when played on this subset.
  2. Given i and x, set A_i := x.

EXPLANATION:

Let’s break this problem into smaller parts - first, we analyze the game itself to see how the winner can be predicted.

While this is rather hard to do by analyzing the game directly, we have a powerful tool at our disposal: the Sprague-Grundy theorem, which states that any position of an impartial game (under normal play) behaves exactly like a Nim pile whose size is the position's Grundy number; in particular, the first player wins a sum of such games if and only if the XOR of the Grundy numbers is non-zero.
So, we instead focus on computing the Grundy number of each pile.
Because A_i \leq 2\cdot 10^5, these can simply be precomputed for every value:
\text{grundy}[x] = \text{MEX}\{\text{grundy}[x-d] : d \text{ is a non-zero digit of } x\} - there are at most 6 digits, so brute force works fine.
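
As a reference, here's a minimal sketch of this precomputation (a sketch only, assuming \max A_i = 2\cdot 10^5; the author's code below does essentially the same thing):

#include <bits/stdc++.h>
using namespace std;

const int MAXA = 200000;                 // assumed upper bound on A_i
vector<int> grundy(MAXA + 1, 0);         // grundy[0] = 0: an empty pile has no moves

void precompute_grundy() {
    for (int x = 1; x <= MAXA; ++x) {
        bool seen[10] = {};
        // mark the grundy values reachable by removing a non-zero digit of x
        for (int y = x; y > 0; y /= 10) {
            int d = y % 10;
            if (d > 0) seen[grundy[x - d]] = true;
        }
        // grundy[x] is the MEX of the marked values (at most 6 of them, so it stays <= 6)
        while (seen[grundy[x]]) ++grundy[x];
    }
}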

Once all the grundy numbers of the piles are known, checking whether the first player wins is simple: the grundy numbers should have non-zero XOR.
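
In code, the check is just (a trivial sketch):

#include <bits/stdc++.h>
using namespace std;

// Alice (the first player) wins the combined game iff the XOR of the
// grundy numbers of the chosen piles is non-zero.
bool alice_wins(const vector<int>& grundies) {
    int x = 0;
    for (int g : grundies) x ^= g;
    return x != 0;
}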


Now, let’s move on to answering range queries - ignore the updates for now.
Going over every subsequence of the range for each query is clearly far too slow.
However, note that the actual sizes of the piles no longer matter: the only thing that matters is the grundy number of each pile (and their bitwise XOR).
Now, observe that the grundy numbers could be precomputed quickly because each MEX is taken over only a handful of values; this also means the MEX itself is quite small.
Indeed, since we're dealing with numbers of at most 6 digits, each MEX is taken over at most 6 values, so every grundy number is at most 6.

Now, let’s look at what this means for their bitwise XOR.
Suppose we’re looking at some subset of elements, with \text{freq}[x] being the number of piles with grundy number x taken.
Then, if \text{freq}[x] is even, x won’t contribute anything at all to the overall XOR; whereas if \text{freq}[x] is odd, it will contribute exactly x to the bitwise XOR.
So, the overall bitwise XOR is \bigoplus_{x=0}^{6} x\cdot(\text{freq}[x]\bmod 2).

In particular, there are only 2^7 distinct cases we even need to consider, depending on the parity of the frequency of the grundy numbers.
Let’s fix one such case, with mask being a bitmask denoting the parities.
Then, for each grundy value x,

  • if \text{freq}[x] = 0, the mask cannot demand an odd count of x: there are no such elements, so its count is forced to be even (namely zero).
  • If \text{freq}[x] \geq 1, there are 2^{\text{freq}[x] - 1} ways to choose a subset of these elements with even size, and the same number with odd size (of the 2^{\text{freq}[x]} subsets, exactly half have each parity).
  • So, the number of ways to choose a subset matching this mask is the product of 2^{\text{freq}[x] - 1} across all x with \text{freq}[x] \geq 1; or 0 if the first condition is violated.

If mask corresponds to a combination with XOR 0, we add this product to a running count; otherwise we do nothing. This counts exactly the subsequences whose grundy XOR is 0 (i.e. the ones the first player loses), so the answer to the query is 2^{R-L+1} minus this count.
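
Putting the counting together, here's a rough sketch of how a single type-1 query could be answered once the seven frequencies of the range are known (a sketch only, assuming freq[0..6] has already been obtained for the range, see the next part for how; for clarity it loops over all 2^7 masks, and the optimization mentioned later trims this to 16):

#include <bits/stdc++.h>
using namespace std;

const long long MOD = 1e9 + 7;

// modular exponentiation, same as in the author's code
long long power(long long b, long long e, long long mod) {
    long long r = 1; b %= mod;
    for (; e > 0; e >>= 1, b = b * b % mod) if (e & 1) r = r * b % mod;
    return r;
}

// Count the subsequences of a range on which Alice wins, given
// freq[g] = number of elements in the range with grundy value g (0 <= g <= 6)
// and len = length of the range.
long long count_alice_wins(const array<long long, 7>& freq, long long len) {
    long long losing = 0;                            // subsequences whose total XOR is 0
    for (int mask = 0; mask < (1 << 7); ++mask) {    // parity pattern of the 7 grundy values
        int x = 0;
        for (int g = 0; g < 7; ++g)
            if (mask >> g & 1) x ^= g;               // XOR contributed by the odd-parity values
        if (x != 0) continue;                        // only the XOR-0 patterns are losing
        long long ways = 1;
        for (int g = 0; g < 7; ++g) {
            if ((mask >> g & 1) && freq[g] == 0) { ways = 0; break; }  // odd count impossible
            if (freq[g] > 0)                          // half of the 2^freq[g] subsets have each parity
                ways = ways * power(2, freq[g] - 1, MOD) % MOD;
        }
        losing = (losing + ways) % MOD;
    }
    long long total = power(2, len, MOD);            // all 2^len subsequences (incl. the empty one)
    return (total - losing + MOD) % MOD;             // Alice wins exactly the non-zero-XOR ones
}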


To be able to handle this quickly, we need a data structure that lets us compute, for each 0 \leq x \leq 6, the number of positions in a segment of A whose grundy value is x.
We also need to be able to handle point updates.
This is simple enough to do with a segment tree, or with one Fenwick tree per grundy value.
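
For instance, a sketch with seven Fenwick trees, one per grundy value, each storing a 0/1 indicator per position (illustrative, 1-indexed):

#include <bits/stdc++.h>
using namespace std;

// One of these per grundy value g: counts how many positions currently hold grundy value g.
struct Fenwick {
    int n; vector<int> bit;
    Fenwick(int n) : n(n), bit(n + 1, 0) {}
    void add(int i, int delta) { for (; i <= n; i += i & -i) bit[i] += delta; }
    int prefix(int i) const { int s = 0; for (; i > 0; i -= i & -i) s += bit[i]; return s; }
    int range(int l, int r) const { return prefix(r) - prefix(l - 1); }
};

// Point update A[i] := x (g[] maps a value to its grundy number):
//   fen[g[A[i]]].add(i, -1);  A[i] = x;  fen[g[x]].add(i, +1);
// Range query (L, R): freq[v] = fen[v].range(L, R) for each v in 0..6.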

As a further optimization, note that there's no need to go through all 2^7 masks for each query.
Instead, precompute only the masks that result in an XOR of 0 (there are only 16 of them, so this is roughly an 8x speedup).
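
A sketch of that precomputation (over parity masks of the grundy values 0 through 6):

#include <bits/stdc++.h>
using namespace std;

// Collect the 7-bit parity masks whose corresponding XOR of grundy values is 0.
// Bit g of a mask means "grundy value g appears an odd number of times".
vector<int> zero_xor_masks() {
    vector<int> masks;
    for (int mask = 0; mask < (1 << 7); ++mask) {
        int x = 0;
        for (int g = 0; g < 7; ++g)
            if (mask >> g & 1) x ^= g;
        if (x == 0) masks.push_back(mask);
    }
    return masks;  // exactly 16 masks
}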


There are alternate solutions relying on linear algebra: maintain an XOR basis of the grundy values in a segment tree, and count directly from the rank of the basis of the queried range.
This is asymptotically faster than the solution presented above, but was not enforced, so that participants without linear algebra knowledge could still get AC.
If you're interested, the tester's code below implements this.
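
Very roughly, the idea is: the grundy values fit in 3 bits, so each segment-tree node stores an XOR (linear) basis of the values in its range; if a range of length len has a basis of rank r, then exactly 2^{len-r} of its subsets have XOR 0, so the answer is 2^{len} - 2^{len-r}. A minimal sketch of such a basis (the tester's code below contains the full solution):

#include <bits/stdc++.h>
using namespace std;

// XOR basis over small values (grundy numbers fit in 3 bits, so the rank is at most 3).
// If a range of len elements has rank r, then 2^(len - r) of its subsets XOR to 0,
// hence 2^len - 2^(len - r) subsets are first-player wins.
struct XorBasis {
    array<int, 3> b{};  // basis vectors
    int rank = 0;
    void insert(int x) {
        for (int i = 0; i < rank; ++i) x = min(x, x ^ b[i]);  // reduce by the existing vectors
        if (x) b[rank++] = x;                                 // independent: extend the basis
    }
    void merge(const XorBasis& other) {                       // used when combining segtree nodes
        for (int i = 0; i < other.rank; ++i) insert(other.b[i]);
    }
};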

TIME COMPLEXITY:

\mathcal{O}(7\cdot \max A) precomputation, and then \mathcal{O}(N\log N + Q\cdot (7\log N + 16\cdot 7)) per testcase.

CODE:

Author's code (C++)
#include <bits/stdc++.h>
using namespace std;

#define ll long long
#define endl "\n"
const int MAXN = 2e5 + 1;
const int MOD = 1e9 + 7;

vector<int> grundy(MAXN, 0);
vector<int> possible_comb;

// Function to compute power(x, y) % MOD
ll power(ll x, ll y, ll mod) {
    ll res = 1;
    while (y > 0) {
        if (y & 1) res = (res * x) % mod;
        x = (x * x) % mod;
        y >>= 1;
    }
    return res;
}

// Precompute Grundy numbers
void precompute_grundy() {
    for (int i = 1; i < MAXN; ++i) {
        vector<int> vis(10, 0);
        string s = to_string(i);
        for (char ch : s) {
            if (ch != '0') vis[grundy[i - (ch - '0')]] = 1;
        }
        for (int j = 0; j < 10; ++j) {
            if (!vis[j]) {
                grundy[i] = j;
                break;
            }
        }
    }
}

// Precompute possible combinations with XOR = 0
void precompute_possible_combinations() {
    for (int mask = 0; mask < (1 << 10); ++mask) {
        int xor_sum = 0;
        for (int j = 0; j < 10; ++j) {
            if (mask & (1 << j)) xor_sum ^= j;
        }
        if (xor_sum == 0) possible_comb.push_back(mask);
    }
}

// Fenwick Tree for each Grundy number
class FenwickTree {
private:
    vector<int> bit;
    int n;

public:
    FenwickTree(int size) : n(size) {
        bit.assign(size + 1, 0);
    }

    void update(int idx, int delta) {
        for (++idx; idx <= n; idx += idx & -idx) {
            bit[idx] += delta;
        }
    }

    int query(int idx) {
        int sum = 0;
        for (++idx; idx > 0; idx -= idx & -idx) {
            sum += bit[idx];
        }
        return sum;
    }

    int range_query(int l, int r) {
        return query(r) - query(l - 1);
    }
};

void solve() {
    int n, q;
    cin >> n;

    vector<int> arr(n);
    vector<FenwickTree> fenwick(10, FenwickTree(n));

    // Read input and initialize Fenwick trees
    for (int i = 0; i < n; ++i) {
        cin >> arr[i];
        arr[i] = grundy[arr[i]];
        fenwick[arr[i]].update(i, 1);
    }

    cin >> q;
    while (q--) {
        int type;
        cin >> type;

        if (type == 1) {
            int l, r;
            cin >> l >> r;
            --l; --r;

            vector<int> freq(10, 0);
            for (int i = 0; i < 10; ++i) {
                freq[i] = fenwick[i].range_query(l, r);
            }

            ll total = power(2, r - l + 1, MOD);
            ll valid = 0;

            for (int mask : possible_comb) {
                bool valid_mask = true;
                ll temp = 1;
                for (int i = 0; i < 10; ++i) {
                    if (mask & (1 << i)) {
                        if (freq[i] == 0) {
                            valid_mask = false;
                            break;
                        }
                    }
                    if (freq[i] > 0) {
                        temp = (temp * power(2, freq[i] - 1, MOD)) % MOD;
                    }
                }
                if (valid_mask) {
                    valid = (valid + temp) % MOD;
                }
            }

            ll result = (total - valid + MOD) % MOD;
            cout << result << endl;
        } else {
            int idx, val;
            cin >> idx >> val;
            --idx;

            fenwick[arr[idx]].update(idx, -1);
            arr[idx] = grundy[val];
            fenwick[arr[idx]].update(idx, 1);
        }
    }
}

int main() {
    ios_base::sync_with_stdio(false);
    cin.tie(NULL);

    precompute_grundy();
    precompute_possible_combinations();

    int t = 1;
    while (t--) {
        solve();
    }

    return 0;
}
Tester's code (C++)
#include <bits/stdc++.h>
#include <ext/pb_ds/assoc_container.hpp>
#include <ext/pb_ds/tree_policy.hpp>

using namespace std;
using namespace __gnu_pbds;

template<typename T> using Tree = tree<T, null_type, less<T>, rb_tree_tag, tree_order_statistics_node_update>;
typedef long long int ll;
typedef long double ld;
typedef pair<int,int> pii;
typedef pair<ll,ll> pll;

#define fastio ios_base::sync_with_stdio(false); cin.tie(NULL)
#define pb push_back
#define endl '\n'
#define sz(a) (int)a.size()
#define setbits(x) __builtin_popcountll(x)
#define ff first
#define ss second
#define conts continue
#define ceil2(x,y) ((x+y-1)/(y))
#define all(a) a.begin(), a.end()
#define rall(a) a.rbegin(), a.rend()
#define yes cout << "Yes" << endl
#define no cout << "No" << endl

#define rep(i,n) for(int i = 0; i < n; ++i)
#define rep1(i,n) for(int i = 1; i <= n; ++i)
#define rev(i,s,e) for(int i = s; i >= e; --i)
#define trav(i,a) for(auto &i : a)

template<typename T>
void amin(T &a, T b) {
    a = min(a,b);
}

template<typename T>
void amax(T &a, T b) {
    a = max(a,b);
}

#ifdef LOCAL
#include "debug.h"
#else
#define debug(...) 42
#endif

/*



*/

const int MOD = 1e9 + 7;
const int N = 1e5 + 5;
const int inf1 = int(1e9) + 5;
const ll inf2 = ll(1e18) + 5;

struct basis{
    array<int,3> a;
    int siz = 0;
    basis(){
        a.fill(-1);
    }
    void insert(int x){
        rep(i,siz) amin(x,x^a[i]);
        if(x) a[siz++] = x;
    }
};

template<typename T>
struct segtree {
    // https://codeforces.com/blog/entry/18051

    /*=======================================================*/

    struct data {
        basis bas;
    };

    data neutral = data();

    data merge(data &left, data &right) {
        data curr = left;
        rep(i,right.bas.siz){
            curr.bas.insert(right.bas.a[i]);
        }
        return curr;
    }

    void create(int i, T v) {

    }

    void modify(int i, T v) {
        tr[i].bas = basis();
        tr[i].bas.insert(v);
    }

    /*=======================================================*/

    int n;
    vector<data> tr;

    segtree() {

    }

    segtree(int siz) {
        init(siz);
    }

    void init(int siz) {
        n = siz;
        tr.assign(2 * n, neutral);
    }

    void build(vector<T> &a, int siz) {
        rep(i, siz) create(i + n, a[i]);
        rev(i, n - 1, 1) tr[i] = merge(tr[i << 1], tr[i << 1 | 1]);
    }

    void pupd(int i, T v) {
        modify(i + n, v);
        for (i = (i + n) >> 1; i; i >>= 1) tr[i] = merge(tr[i << 1], tr[i << 1 | 1]);
    }

    data query(int l, int r) {
        data resl = neutral, resr = neutral;

        for (l += n, r += n; l <= r; l >>= 1, r >>= 1) {
            if (l & 1) resl = merge(resl, tr[l++]);
            if (!(r & 1)) resr = merge(tr[r--], resr);
        }

        return merge(resl, resr);
    }
};

void solve(int test_case){
    int n; cin >> n;
    vector<int> a(n+5);
    rep1(i,n) cin >> a[i];

    int m = 2e5;
    vector<int> dp(m+5); // grundy values

    rep1(i,m){
        vector<int> v;
        int x = i;
        while(x){
            int d = x%10;
            x /= 10;
            if(d){
                v.pb(dp[i-d]);
            }
        }

        sort(all(v));
        v.resize(unique(all(v))-v.begin());
        v.pb(inf1);

        rep(j,sz(v)){
            if(v[j] != j){
                dp[i] = j;
                break;
            }
        }
    }

    vector<int> pow2(n+5);
    pow2[0] = 1;
    rep1(i,n) pow2[i] = pow2[i-1]*2%MOD;
    
    segtree<int> st(n+5);
    rep1(i,n) st.pupd(i,dp[a[i]]);

    int q; cin >> q;
    while(q--){
        int t; cin >> t;
        if(t == 1){
            int l,r; cin >> l >> r;
            auto bas = st.query(l,r).bas;
            int tot = pow2[r-l+1];
            int bad = pow2[r-l+1-bas.siz];
            int ans = (tot-bad+MOD)%MOD;
            cout << ans << endl;
        }
        else{
            int i,x; cin >> i >> x;
            st.pupd(i,dp[x]);
        }
    }
}

int main()
{
    fastio;

    int t = 1;
    // cin >> t;

    rep1(i, t) {
        solve(i);
    }

    return 0;
}

https://www.codechef.com/viewsolution/1127369376

Any idea what might be wrong in this? (I know it should be TLE, since I haven't optimized the summation of the nC(even) part.)

Other than that, I think it should be correct.

I calculated the count of each grundy value from 0 to 6 present in the range, precomputed every possible combination that gives XOR 0, then checked how many of these combinations I can make, and subtracted that from 2^(r-l) - 1.

If anyone could help…

(BTW, really awesome problem - had fun thinking up the approach :) )


There were 95 solves in div2 with 39% accuracy, and only 51 solves in div1 with 23% accuracy. Seems fishy.


Why are we calculating the zero-XOR values? Alice wins if the XOR is non-zero, right?

Why is it rated Easy? It has a rating of 2400+. Can someone provide a link explaining how problems are rated Simple/Easy/etc.?

I'm not going to go through your code extensively, but a few possible mistakes and inefficiencies might be:

  1. Not considering cases like an even number of copies of the same grundy value, since x \oplus x = 0.
  2. Not computing the binomial coefficients properly: for example, for the 1, 2, 3 triplet case you need to compute \binom{x}{1}\binom{y}{1}\binom{z}{1} + \binom{x}{2}\binom{y}{2}\binom{z}{2} + \ldots + \binom{x}{x}\binom{y}{x}\binom{z}{x}, where x is the minimum frequency among 1, 2, 3 and z is the maximum frequency among 1, 2, 3.

I had the same approach as you, and I believe it is possible to solve the problem this way, but it requires quite some effort, so I gave up on it. I queried ChatGPT about recurrence relations for sums of powers of binomial coefficients and found them. Precomputing them (quite complicated for higher powers, by the way), it is theoretically possible to solve this problem, but it is heavily involved and infeasible to implement in a contest situation.