Assume we have N kids who all like dogs. One day they go to the shelter, where there are M dogs, each of a unique breed. Each kid has a favorite dog breed and a second-favorite dog breed, and the kids are lined up in order. Dogs are selected as follows: the first kid in line checks if their first-choice breed is available; if so, they take it and leave. Otherwise, if the first choice is unavailable but the second is, they take the second and leave. Otherwise, they leave crying with no dog. We want to determine, for each 0 <= i <= N - 1, how many kids would get a dog if the first i kids were removed from the line.

Example: 4 kids, 2 dog breeds (denoted 1 and 2). Say all 4 kids have breed 1 as their first choice and breed 2 as their second. The output should be 2, 2, 2, 1. If we remove the first 0 kids, 2 get dogs (the first kid takes breed 1, the second takes breed 2, the rest leave empty-handed). If we remove the first kid, still 2 get dogs, and likewise if we remove the first 2. If we remove the first 3, only one kid is left, so 1 gets a dog. This is a trivial case.

Obviously, one solution is to keep a list of the kids' preferences and a set of the available dog breeds, and then for each i, refresh the dog set and simulate the line starting from index i. But each simulation costs O(N + M) (walking up to N kids plus rebuilding the set), so that's O(N(N + M)) overall. Is there a way to do this more cleverly and efficiently?
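For reference, here's a sketch of that brute-force simulation (function and variable names are mine, not from any particular library):

```python
def count_adopted(prefs, breeds):
    """Brute force: for each suffix start i, re-simulate the whole line.

    prefs:  list of (first_choice, second_choice) tuples, one per kid in line order.
    breeds: iterable of breeds available at the shelter (all unique).
    Returns a list where entry i is the number of kids who get a dog
    when the first i kids are removed from the line.
    """
    n = len(prefs)
    answers = []
    for i in range(n):
        available = set(breeds)  # refresh the shelter for this simulation
        adopted = 0
        for first, second in prefs[i:]:
            if first in available:
                available.remove(first)
                adopted += 1
            elif second in available:
                available.remove(second)
                adopted += 1
            # else: this kid leaves with no dog
        answers.append(adopted)
    return answers

print(count_adopted([(1, 2)] * 4, [1, 2]))  # -> [2, 2, 2, 1]
```

This reproduces the trivial example above, but does Θ(N) work per starting index, which is exactly the cost the question is trying to avoid.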