 CHMOD - Editorial

#1

EASY

PREREQUISITES

Simple Math, Repeated Squaring

PROBLEM

You are given a list of N integers.

Each value in the list is between 1 and 100.

You have to respond to T queries of the following type

• Given L and R
• Find the product of all integers in the given list between L and R, inclusive
• Find the above product modulo some M, which is also given

EXPLANATION

For each query, iterating through the list between L and R to compute the modular product is too slow.

Of course, we use the fact that each value is between 1 and 100 to our advantage.

• There are 25 prime numbers between 1 and 100

Each number has a unique prime factorization. The product of a set of numbers can therefore be computed by adding up the frequencies of each prime across all numbers in the set.

For example, suppose we have to multiply 36 and 45.

```
36 = 2^2 * 3^2
45 = 3^2 * 5

36 * 45 = 2^2 * 3^4 * 5
```

Thus, we can maintain a table of cumulative frequencies for each of the 25 primes between 1 to 100 for the given list of numbers.

When processing a query

• consider each of the 25 primes
• find the frequency of the prime between L and R. This can be done in O(1) using the pre-calculated cumulative frequencies
• calculate prime^frequency for each prime by repeated squaring and multiply these values together
• maintain the result modulo M

These ideas are best presented in the pseudo code below.

PSEUDO CODE

```
Given:
N, the number of numbers
L[N], the list of numbers
P[25], the primes in [1, 100]
CF[N,25], cumulative frequency for each prime

for each query
    Given Query: left, right, M
    result = 1 % M
    for i = 1 to 25
        r = CF[right,i] - CF[left-1,i]
        v = P[i]^r % M, using repeated squaring
        result = result * v % M
```

The complexity of answering each query would be O(25 log N).

Cumulative Frequencies can be calculated in O(25 * N).

CODING COMMENTARY

You can either calculate the primes in the program or hard-code the array of primes by calculating it offline.

The repeated squaring should take care of the fact that the exponent can be 0; a^0 should return 1 for any a.

Calculating the cumulative frequencies table should be done carefully. The frequencies of the primes for each number between 1 and 100 can be pre-calculated. Use these frequencies to build the cumulative frequencies table.

SETTER’S SOLUTION

Can be found here.

TESTER’S SOLUTION

Can be found here.

#2

My divide-and-conquer solution gives TLE, even though its complexity should be O(lg n) per query. Does anyone know why that is? Maybe the editorial's approach calls the MOD (%) operator fewer times…

#3

The idea of my solution is to use a segment tree to calculate sums of log10 values over the [L, R] interval, and later convert this to an integer by applying the modulo operation.

Once I had such a sum, I first found how many billions are in the result (log10 sum / 9.0), exponentiated that value, and multiplied the result by 10^dif, where dif is something like the real remainder after dividing by 9.0 (hope it's clear).

#4

I used the same algorithm as you did… but I got TLE…

#5

I used the same algo, but instead of the 25 primes I used all 100 numbers, i.e. an O(100 log n) solution, and it got TLE. Can anyone explain why? Since big-O time is independent of constants, O(100 log n) should be the same as O(25 log n).

#6

This is what I did… a similar approach can be applied to solve the problem WCOUNT.

#7

Took me ages to figure the idea! My solution: http://www.codechef.com/viewplaintext/2511516

First, I had a go at it using Python and Java, thanks to their support for big integers. But they gave me TLE.

Thinking "out of the box", I tried the naive method: actually find all products (keeping the modulo) in the given ranges. That too timed out, obviously.

Then I noticed that the values are only as large as 100. So instead of storing the primes' frequencies, I stored the frequencies of the numbers themselves: a complexity of O(100 log N) per query and O(100 N) for pre-calculation. But this welcomed me with a shower of TLEs.

Come on, how else can I optimize? Multiplications… modulos… I thought about how to stop myself from looping, even the 100 times, just to avoid those computations, but couldn't find any better solution. I thought: I am already doing modular exponentiation; if I could somehow increase the exponents by combining some numbers, I could optimize further. But I couldn't think of how.

One day, thanks to the fundamental theorem of arithmetic, I remembered the prime numbers. Actually, I first thought of storing the counts of powers of 2 as exponents of 2 themselves, and just like that, the thought extended to all primes. I was not much thrilled, as the complexity would now be O(25 log N), which notation-wise is the same as O(100 log N) — both are O(log N). Still, it meant a 75% reduction in the running time.

I coded it up, but tests on my local machine could only bring the running time down (on average) from ~8.5 seconds to ~2 seconds per file, and we needed 1 second. But there is no harm in trying. So I tried, and voila! It was accepted.

That is my story of solving CHMOD.

What I wonder?

There are at least a few who solved it in one go. I mean, how did they even think directly of the idea of cumulative frequencies of prime numbers? Didn't the idea of storing cumulative frequencies of the given numbers cross their minds first? Or is there another idea?

Would love to hear from people who cracked this at their first attempt itself.

#8

Is the setter’s solution not up? I am getting a “Page Not Found”.

#9

I don't think this question qualifies as an easy one. It requires segment trees and modular exponentiation. Please consider moving it to medium. It took me 3 hours to get to the algorithm, and only around 900 people could solve it over the ten days.

#10

```cpp
#include <cstdio>
#include <iostream>
using namespace std;

unsigned long long int pr(int a, int b, unsigned long long int m) {
    if (b == 0) return 1;
    unsigned long long int ans = pr(a, b / 2, m) % m;
    if (b % 2) {
        return (((a * ans) % m) * ans) % m;
    } else {
        return (ans * ans) % m;
    }
}

int ans[100005][25];   // prefix sums of prime exponents

int main()
{
    int p[] = {2,3,5,7,11,13,17,19,23,29,31,37,41,43,47,53,59,61,67,71,73,79,83,89,97};

    unsigned long long mod, aa;
    int l = 25;
    int arr[101][25] = {{0}};   // arr[m][j] = exponent of p[j] in m
    for (int m = 2; m < 101; m++) {
        int awe = m;
        for (int j = 0; j < l && awe > 1; j++) {
            if (awe % p[j] == 0) {
                while (awe % p[j] == 0) {
                    arr[m][j]++;
                    awe /= p[j];
                }
            }
        }
    }
    int n;
    cin >> n;
    int in[n];
    for (int i = 0; i < n; i++) {
        scanf("%d", &in[i]);
    }
    for (int i = 0; i < 25; i++) {
        ans[0][i] = arr[in[0]][i];
    }
    for (int i = 1; i < n; i++) {
        for (int j = 0; j < 25; j++) {
            ans[i][j] = ans[i - 1][j] + arr[in[i]][j];
        }
    }
    int final[25];
    int k;
    cin >> k;
    while (k--) {
        int l, r;
        scanf("%d%d%llu", &l, &r, &mod);
        aa = 1 % mod;
        for (int i = 0; i < 25; i++) {
            if (l > 1) final[i] = ans[r - 1][i] - ans[l - 1][i] + arr[in[l - 1]][i];
            else final[i] = ans[r - 1][i];
        }

        for (int i = 0; i < 25; i++) {
            if (final[i]) {
                aa = (aa * pr(p[i], final[i], mod));
                if (aa >= mod) aa = aa % mod;
            }
        }
        cout << aa << endl;
    }

    return 0;
}
```

Why did I get TLE? Can anyone help me, please?

#11

This is giving TLE.
Can anyone please help?

#12

Can someone please check why I am getting WA for my code…

http://www.codechef.com/viewsolution/2529589

I am getting the right answer for as many test cases as I could think of… am I missing some exceptional case?

#13

Can anybody tell what’s wrong with the code?

http://www.codechef.com/viewsolution/2517179

#14

What’s wrong with the following DP algo? (Not totally correct, just the idea)

```
long long inputArray[n];
long long dp[n];

dp[0] = inputArray[0];
for (int i = 1; i < n; i++) dp[i] = dp[i-1] * inputArray[i];

for (int i = 0; i < queries; i++)
{
    if (l == 0) cout << dp[r] << endl;
    else cout << dp[r] / dp[l-1] << endl;
}
```

#15

Why is the complexity of prime-factorizing each number not considered? In the worst case, we’ll have at least floor(25/2) = 12 division tests for each number. I think the expression for the complexity is of the sort:
O(N * CostFactorization + T * NumUpdates * ExponentiationTime), so it will be O(N + T * lg(MAX)).

#16

Can somebody give me the test case for which I am getting SIGFPE?
My solution is: http://www.codechef.com/viewsolution/2481713

#17

@khitish… an int cannot store such huge values (the product of all the numbers may reach up to 100^100000). Some combinations of numbers may lead to your b array storing a 0. Please go through the editorial to get an idea of what you have to do to solve this problem!

#18

The 25-primes logic was a very good trick in this problem… otherwise, TLE.

#19

Can there be overflow in the cumulative frequency table if each of its elements is an int? When I declared my data types logically I got WA (solution1), and when I changed everything to long long it got accepted (solution2). Can anyone please tell me the reason, or point out where the overflow is in the first program? Thanks in advance.

#20

Again, posting here for help…
I’m stuck with this problem. For some test case no. > 100, it’s giving me a SIGSEGV runtime error… I’ve tried every corner case I can think of… still not getting any leads…