Hello, the following problem was asked in a Codenation test recently. I couldn’t make head or tail of the question. Please help me understand the problem and how to solve it. Thanks in advance.

Find the expected value of the number of segments in a string of length A in a language having alphabet size B.

A segment is defined as a maximal contiguous substring consisting of the same character. E.g., in the string 10011, the segments are 1, 00, and 11, so the number of segments is 3.
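To make the definition concrete, here is a small sketch that counts segments by checking where adjacent characters differ (the function name is mine, not part of the problem statement):

```python
def count_segments(s):
    # A new segment starts at position 0 and at every index
    # where the character differs from the previous one.
    if not s:
        return 0
    segments = 1
    for prev, cur in zip(s, s[1:]):
        if cur != prev:
            segments += 1
    return segments

print(count_segments("10011"))  # 3, matching the example above
```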

Input format: The first argument is A and the second argument is B.

Output format: Return the expected value of the number of segments. This can be represented as a fraction x/y. Return x·y^(-1) (mod 10^9 + 7).
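Since 10^9 + 7 is prime, x·y^(-1) mod p is usually computed with Fermat's little theorem (y^(-1) ≡ y^(p-2) mod p). A minimal sketch, assuming that is what the problem intends:

```python
MOD = 10**9 + 7

def frac_mod(x, y):
    # x * y^(-1) mod p via Fermat's little theorem (p prime, y not divisible by p).
    return x * pow(y, MOD - 2, MOD) % MOD

print(frac_mod(3, 2))  # 3/2 mod (10^9 + 7) -> 500000005
```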

Example : A=1,B=2. Output is 1.

A=2,B=2. Output is 500000005.
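One standard way to attack this (my reading of the problem, not an official solution) is linearity of expectation: the first character always starts a segment, and each of the remaining A-1 positions starts a new segment exactly when it differs from the previous character, which happens with probability (B-1)/B for a uniformly random string. So E[segments] = 1 + (A-1)(B-1)/B = (B + (A-1)(B-1))/B, which reproduces both sample outputs:

```python
MOD = 10**9 + 7

def expected_segments(A, B):
    # E[segments] = 1 + (A-1)*(B-1)/B by linearity of expectation:
    # each position i >= 2 starts a new segment with probability (B-1)/B.
    # As one fraction: (B + (A-1)*(B-1)) / B, returned as x * y^(-1) mod p.
    num = (B + (A - 1) * (B - 1)) % MOD
    return num * pow(B, MOD - 2, MOD) % MOD

print(expected_segments(1, 2))  # 1
print(expected_segments(2, 2))  # 3/2 mod p -> 500000005
```

For A=2, B=2 the expectation is 1 + 1·(1/2) = 3/2, and 3·2^(-1) mod (10^9 + 7) is 500000005, matching the second example.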