# Cover interval with minimum sum intervals – DP recursion depth problem

I have just found the official solutions online (I had been looking for them for a while, but quickly found them after posting this), and I’m currently trying to understand them. As far as I can tell, the official solution uses a much simpler DP, which only needs an O(N) array.
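For reference, here is my best guess at the shape of that simpler DP – a hedged sketch only (names are mine, and I use a straightforward O(N²) transition rather than whatever the official solution does to get it fast). Intervals are `(start, end, weight)` triples on integer days, "no gap" means the next interval starts no later than the previous end + 1, and the interval-count tiebreaker is ignored:

```python
def min_cost_cover_dp(intervals, A, B):
    """Iterative DP over intervals sorted by start.

    best[i] = min total weight of a gap-free chain of intervals that
    starts at A and has interval i as the last one selected.
    Returns the min cost to cover [A, B], or None if impossible.
    """
    iv = sorted(intervals)                # sort by start day
    n = len(iv)
    INF = float("inf")
    best = [INF] * n
    for i, (s, e, w) in enumerate(iv):
        if s <= A:
            best[i] = w                   # interval i alone covers the left edge
        for j in range(i):
            # interval j must reach at least s - 1 so there is no gap
            if iv[j][1] >= s - 1 and best[j] + w < best[i]:
                best[i] = best[j] + w
    ans = min((best[i] for i in range(n) if iv[i][1] >= B), default=INF)
    return None if ans == INF else ans
```

Since the intervals are processed in order of start day, an optimal chain always extends coverage to the right, so tracking only the last selected interval is enough; an O(N) array suffices, just as the official solution suggests.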

Story:
This year I attended a programming competition where we had a few interesting problems. I had trouble with some of them, and since the next round is approaching, I want to clear them up.

The problem in a nutshell
We are given `N` weighted intervals and an interval `[A,B]`. We have to cover `[A,B]` with the given intervals in a way that minimizes the total weight (and, as a tiebreaker, the number of intervals used). We need to print the sum, the number of intervals, and then the intervals themselves. If `[A,B]` cannot be covered, we need to report that with a special value (-1).

First thoughts
If we sort the intervals by begin time, then we can do a simple 0-1 Knapsack-like DP and solve the problem. Also, if the priorities were swapped (minimize count, THEN sum), a simple greedy would do it.
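For the swapped-priority variant, the greedy I have in mind looks roughly like this (a sketch under my assumptions: integer days, and an interval ending on day `d` can be continued by one starting on day `d + 1` or earlier):

```python
def min_count_cover(intervals, A, B):
    """Greedy for the swapped-priority variant: minimize the NUMBER of
    intervals covering [A, B], ignoring weights.

    Classic sweep: among all intervals starting at or before the current
    frontier + 1, pick the one reaching furthest right.
    Returns the chosen intervals, or None if [A, B] cannot be covered.
    """
    intervals = sorted(intervals)         # (start, end) pairs, by start
    covered = A - 1                       # last day already covered
    i, n = 0, len(intervals)
    chosen = []
    while covered < B:
        best_end, best = -1, None
        while i < n and intervals[i][0] <= covered + 1:
            if intervals[i][1] > best_end:
                best_end, best = intervals[i][1], intervals[i]
            i += 1
        if best is None or best_end <= covered:
            return None                   # gap: no interval extends coverage
        chosen.append(best)
        covered = best_end
    return chosen
```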

The limits
Basically, all interval starts and ends are in the range `1 – 1,000,000`, and `N <= 100,000`. All intervals lie within `[A,B]`.

My approach
I wrote a recursive, `0-1 Knapsack`-like algorithm in Python that also stored the last selected interval – thus allowing the selection list to be recovered from the DP array later. It was a `(current_interval, last_covered_day) -> (cost, last_selected_interval, last_covered_day')`-like function. I used a `dict` as the DP array, since a regular array that big would have violated the memory constraints, and filling it fully would also have increased the runtime (at least that’s what I thought – but a 1,000,000 × 100,000 array certainly would have!). I made the function recursive so it would not fill in the entire DP table, keeping it faster and more memory-efficient.
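A stripped-down sketch of the idea, simplified to return only the cost (my real code also tracked the selected intervals; `functools.lru_cache` stands in for the explicit `dict` I used):

```python
import functools

def min_cost_cover(intervals, A, B):
    """Recursive memoized DP: intervals sorted by start; the state is
    (interval index, last covered day). Returns min cost or None."""
    intervals = sorted(intervals)          # (start, end, weight) triples
    n = len(intervals)

    @functools.lru_cache(maxsize=None)     # sparse memo, like a dict
    def solve(i, covered):
        if covered >= B:
            return 0                       # [A, B] fully covered
        if i == n:
            return None                    # ran out of intervals: gap
        s, e, w = intervals[i]
        if s > covered + 1:
            return None                    # gap before interval i
        skip = solve(i + 1, covered)
        take = solve(i + 1, max(covered, e))
        take = None if take is None else take + w
        if skip is None or (take is not None and take < skip):
            return take
        return skip

    return solve(0, A - 1)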

The problem with this
Simply put, I got `RecursionError: maximum recursion depth exceeded` on the larger datasets – 100k-deep recursion was simply too much. I have since read on GeeksforGeeks that the limit can be increased, but I am still not confident that doing so is safe. My recursive function is also not tail-call optimizable (and CPython does not perform tail-call optimization anyway), so that would not work either.
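For completeness, the workaround I read about combines `sys.setrecursionlimit` with running the solver in a thread that has a larger stack – something like this sketch (the 256 MB stack size is an arbitrary guess on my part, and whether a judge permits that much memory is exactly what I am unsure about):

```python
import sys
import threading

def run_with_deep_recursion(fn, *args, depth=1_000_000,
                            stack_bytes=256 * 1024 * 1024):
    """Run fn(*args) in a worker thread with a large stack and a raised
    Python recursion limit, then return its result."""
    result = []

    def wrapper():
        sys.setrecursionlimit(depth)       # interpreter-wide limit
        result.append(fn(*args))

    threading.stack_size(stack_bytes)      # must be set before creating the thread
    t = threading.Thread(target=wrapper)
    t.start()
    t.join()
    return result[0]
```

The recursion limit only guards against overflowing the C stack; the thread's enlarged stack is what actually makes the deep recursion survivable.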

So my questions
Is there a way to solve this problem without DP? If not, is filling in a full table an option with such high limits? Can we come up with a different DP approach that does not need such big tables? Is it safe to simply increase the recursion depth limit for these kinds of problems?
