### PROBLEM LINK:

**Editorialist:** Jingbo Shang

### DIFFICULTY:

Easy - Medium

### PREREQUISITES:

Greedy, Dynamic Programming

### PROBLEM:

Given **K** pairs ⟨time, cost⟩ organized into several sequences, merge the sequences into a single order (preserving the relative order within each sequence) and minimize the total cost, i.e. the sum of finish time × cost over all pairs.

### EXPLANATION:

There are two methods to solve this problem: greedy or dynamic programming.

First, we will introduce the dynamic programming solution. If all **K** pairs are in one sequence, then we can just follow that sequence and get the answer; this case is trivial. Next, consider the special case of only two sequences, denoted **A[1…n]** and **B[1…m]**. Assume the final order is **C[1…n+m]**. Because we must preserve the relative order, **C[1…k]** must be some merge of a prefix **A[1…i]** and a prefix **B[1…j]**, where **i + j = k**. This observation is the key and gives us the intuition for dynamic programming.

Let’s use **f[i][j]** to denote the minimum total cost of merging **A[1…i]** and **B[1…j]**. Then it is easy to see that:

```
f[i][j] = min{ f[i-1][j] + A[i].cost * totalTime(i, j), f[i][j-1] + B[j].cost * totalTime(i, j) }
```

where **totalTime(i, j)** is the total time of **A[1…i]** and **B[1…j]**; whichever pair is scheduled last finishes exactly at this time. It can be computed in O(1) after preparing prefix sums of the times. Therefore, we can merge two sequences with this dynamic programming in O(**nm**), and by tracing back the transitions we can also recover an optimal merge order.
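The two-sequence DP above can be sketched as follows (a minimal sketch; the function name `merge_two` and the input format, lists of `(time, cost)` tuples, are our assumptions, not part of the problem statement):

```python
# f[i][j]: minimum total cost of scheduling A[1..i] merged with B[1..j].
# The pair placed last among them finishes at time sumA[i] + sumB[j].
from itertools import accumulate

def merge_two(A, B):
    """A, B: lists of (time, cost) tuples; returns the minimum total cost."""
    n, m = len(A), len(B)
    # prefix sums of times, so totalTime(i, j) = sumA[i] + sumB[j]
    sumA = [0] + list(accumulate(t for t, _ in A))
    sumB = [0] + list(accumulate(t for t, _ in B))
    INF = float("inf")
    f = [[INF] * (m + 1) for _ in range(n + 1)]
    f[0][0] = 0
    for i in range(n + 1):
        for j in range(m + 1):
            if i == j == 0:
                continue
            elapsed = sumA[i] + sumB[j]              # totalTime(i, j)
            if i > 0:                                # A[i] is scheduled last
                f[i][j] = min(f[i][j], f[i - 1][j] + A[i - 1][1] * elapsed)
            if j > 0:                                # B[j] is scheduled last
                f[i][j] = min(f[i][j], f[i][j - 1] + B[j - 1][1] * elapsed)
    return f[n][m]
```

Recovering an actual merge order only requires the usual back-pointers over the same table.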

Fortunately, if we merge the sequences one by one (choosing any optimal order when several exist), we always end up with an optimal overall order (this will be explained later in the greedy part). Based on this conclusion, we can solve the problem using dynamic programming in O(**K**^2).

Second, let us try a greedy method. Consider the boundary case where there are no sequences, i.e. all pairs are independent. In this case, the best schedule orders the pairs by **cost/time**, from largest to smallest: swapping two adjacent pairs changes the total cost by c1·t2 − c2·t1, so the pair with the larger ratio should always come first. The intuition is that we want to finish the most expensive pairs as early as possible. If several pairs have the same **cost/time** ratio, their relative order does not matter. This also explains why any of the optimal choices made during merging leads to an optimal solution in the end.
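As a sanity check of the ratio rule in the independent case, the following sketch compares the greedy order against brute force over all permutations (the helper names `schedule_cost`, `greedy_cost`, and `brute_cost` are ours; pairs are assumed to be `(time, cost)` tuples with positive times):

```python
from itertools import permutations

def schedule_cost(order):
    """Total cost of running the pairs in the given order."""
    total, elapsed = 0, 0
    for t, c in order:
        elapsed += t                 # this pair finishes at time `elapsed`
        total += elapsed * c
    return total

def greedy_cost(pairs):
    # largest cost/time ratio first
    return schedule_cost(sorted(pairs, key=lambda p: p[1] / p[0], reverse=True))

def brute_cost(pairs):
    # exhaustive check over all orders (feasible only for tiny inputs)
    return min(schedule_cost(p) for p in permutations(pairs))
```

For example, `greedy_cost([(3, 1), (1, 4), (2, 2)])` matches `brute_cost` on the same input.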

Back to the problem: what if the pair **u** with the largest ratio depends on another pair **v**? If pair **v** has already been scheduled, we can simply do pair **u** next. Otherwise, since we would deal with pair **u** immediately after pair **v** finishes, we can merge pair **u** into pair **v**, i.e. add their times and costs together. This step contributes **u.cost × v.time** to the total cost, because pair **u** has to wait for pair **v**.
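The whole greedy can then be sketched as follows: keep each sequence as a list of blocks, repeatedly take the block with the largest cost/time ratio, schedule it if it is at the head of its sequence, and otherwise merge it into its predecessor. (A minimal O(K^2) sketch; the name `greedy_merge` and the input format, a list of sequences of `(time, cost)` tuples with positive times, are our assumptions.)

```python
def greedy_merge(seqs):
    """seqs: list of sequences of (time, cost) tuples; returns the minimum total cost."""
    # A block is (total_time, total_cost, [original pairs in order]).
    blocks = [[(t, c, [(t, c)]) for t, c in s] for s in seqs]
    order = []
    while any(blocks):
        # pick the block with the largest cost/time ratio
        bi, bj = max(((i, j) for i, s in enumerate(blocks) for j in range(len(s))),
                     key=lambda p: blocks[p[0]][p[1]][1] / blocks[p[0]][p[1]][0])
        if bj == 0:
            order.extend(blocks[bi].pop(0)[2])       # at the head: schedule it
        else:
            t_u, c_u, p_u = blocks[bi].pop(bj)       # merge u into its predecessor v
            t_v, c_v, p_v = blocks[bi][bj - 1]
            blocks[bi][bj - 1] = (t_v + t_u, c_v + c_u, p_v + p_u)
    # simulate the final order to obtain the total cost
    total, elapsed = 0, 0
    for t, c in order:
        elapsed += t
        total += elapsed * c
    return total
```

The linear scan for the maximum gives O(K^2) overall; replacing it with a priority queue yields the O(K log K) bound mentioned below.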

Using this greedy method, we can also solve the problem in O(**K**^2). With a suitable data structure, such as a priority queue, the time complexity can be improved to O(**K** log **K**).

In fact, this greedy method still works even when the dependencies form a tree. See this problem for more details.