Grouping n points into groups of size m with the objective of minimizing the traveling distance within each group

Assumptions:

  • There are “n” jobs which are distributed over the city.
  • The company has “k” available workers.
  • Each worker can do “x” jobs per day.
  • “x” depends on the worker’s skill and the distance he travels each day, so it is not a constant.
  • Workers have no initial traveling distance.
  • “s” is a set that shows how many jobs each worker can do based on the distance he travels.
  • “d” is the number of days it takes the company to complete all the jobs.

Objective: Minimize “d”


I know this problem is probably NP-hard, so I don’t need an exact answer. I think it’s a variation of the traveling salesman problem combined with scheduling and assignment problems.

My algorithm for this problem is to somehow group the jobs (efficiently, though of course not in the most efficient way) into groups of size “m”, where “m” is the mean of the set “s”, based on the traveling distance within each group. Then, after each day, rerun the algorithm to get better results.

My question is: what is the best way to do that grouping? And if you know a better algorithm, I would be more than happy to hear about it.
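To make the idea concrete, here is a minimal sketch of the greedy grouping described above. It assumes each job is a 2D point and uses Euclidean distance; the function name and data layout are just for illustration, not part of the original problem statement.

```python
import math

def greedy_grouping(jobs, m):
    """Greedy sketch: repeatedly seed a group with an unassigned job and
    add its m-1 nearest unassigned neighbours. 'jobs' is a list of (x, y)
    coordinates; 'm' is the target group size (e.g. the mean of set s)."""
    unassigned = set(range(len(jobs)))
    groups = []
    while unassigned:
        seed = unassigned.pop()
        # remaining jobs sorted by distance from the seed
        by_distance = sorted(unassigned,
                             key=lambda j: math.dist(jobs[seed], jobs[j]))
        members = [seed] + by_distance[:m - 1]
        unassigned.difference_update(members[1:])
        groups.append(members)
    return groups

# 7 jobs grouped with m = 3 (the last group may be smaller).
print(greedy_grouping([(0, 0), (1, 0), (0, 1), (5, 5), (6, 5), (5, 6), (9, 9)], 3))
```

A natural refinement would be to pick each seed so as to minimize the resulting group’s total distance instead of taking an arbitrary one, or to rebalance the groups afterwards.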

Is it possible for two Sorceresses to be a Familiar to each other?

On one of my Pathfinder tables, two players came up with the idea that their new characters – two 3rd-level Sorceresses – would not only be in a romantic relationship, but would also be so important to each other that they went through a ritual that made each of them the other’s familiar.

Their plan is to build their characters as Fey-Touched, take Improved Familiar, and then bind each other as familiars.

While Improved Familiar is pretty clear about Fey-Touched creatures being a valid choice of familiar, I’m under the impression that the rules are saying you can pick a regular familiar with the “Fey-touched” template applied to it, not any arbitrary creature with that template, and so they wouldn’t be able to make this work.

I’m not even sure a PC can be built with the Fey-Touched template from the get-go, but assuming it is possible:

  • Is my reading of Improved Familiar correct?
  • If it is and it is impossible for them to proceed with their plan this way, is there any other build that could enable them to follow this concept that would be valid with no or minimal DM-fiat?

Does the damage for Delayed Blast Fireball increase each round?

The damage for the spell delayed blast fireball is listed as:

The spell’s base damage is 12d6. If at the end of your turn the bead has not yet detonated, the damage increases by 1d6.

The spell’s duration is “Concentration, up to 1 minute”.

Does this damage increase repeat on each of your turns throughout the spell’s duration?
Or does it only apply to the turn when you cast the spell?

I tried to draw a comparison to the 3.5e version of the spell, but according to the d20 SRD, there is no damage bonus for delaying the blast; the spell is simply more powerful than fireball.

Minimum number of tree cuts so that consecutive trees alternate between strictly decreasing and strictly increasing

I want to find the minimum number of tree cuts so that each consecutive pair of trees in a sequence of trees alternates between strictly decreasing and strictly increasing. Example: in (2, 3, 5, 7), the minimum number of tree cuts is 2 – a possible final solution is (2, 1, 5, 4).

My search model is a graph where each node is a possible configuration of all tree heights and each edge is a tree cut (= a decrease of the height of a tree). In this model, a possible path from the initial node to the goal node in the above example would be (2,3,5,7) – (2,1,5,7) – (2,1,5,4). I have used a breadth-first search on it to find the goal node. As BFS doesn’t revisit already traversed nodes, the part of the graph that I traverse during the search is in fact a tree data structure.

The only improvement to this algorithm that I was able to think of was using a priority queue that orders the nodes to be explored in increasing order, first by number of cuts (as traditional BFS already does) and second by the number of strictly increasing/decreasing triplets. This increases the probability that a goal node with the minimum number N of cuts will be among the first of all nodes with N cuts to be evaluated, so the search can finish a little faster.
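For reference, here is a minimal sketch of that best-first search, under the assumptions that heights are positive integers and that a single cut can lower a tree to any smaller positive height; the helper names are just for illustration.

```python
import heapq

def alternates(seq):
    """True if consecutive differences strictly alternate in sign."""
    diffs = [b - a for a, b in zip(seq, seq[1:])]
    return all(d != 0 for d in diffs) and \
           all(d1 * d2 < 0 for d1, d2 in zip(diffs, diffs[1:]))

def monotone_triplets(seq):
    """Tie-breaking heuristic: count strictly monotone triplets (fewer is better)."""
    return sum(1 for a, b, c in zip(seq, seq[1:], seq[2:])
               if a < b < c or a > b > c)

def min_cuts(heights):
    """Best-first search over height configurations, ordered by (cuts, heuristic)."""
    start = tuple(heights)
    seen = {start}
    queue = [(0, monotone_triplets(start), start)]
    while queue:
        cuts, _, seq = heapq.heappop(queue)
        if alternates(seq):
            return cuts
        for i, h in enumerate(seq):
            for new_h in range(1, h):        # every possible lower positive height
                nxt = seq[:i] + (new_h,) + seq[i + 1:]
                if nxt not in seen:
                    seen.add(nxt)
                    heapq.heappush(queue, (cuts + 1, monotone_triplets(nxt), nxt))

print(min_cuts([2, 3, 5, 7]))  # expected: 2
```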

The time required to execute this algorithm grows exponentially with the number of trees and the height of the trees. Is there any other algorithm/idea which could be used to speed it up?

Does the Primal Awareness feature allow you to cast each spell once?

In the Unearthed Arcana – Class features article, Primal Awareness, a new variant feature for the ranger was introduced, replacing Primeval Awareness.

This feature lists a number of spells as additional spells known that don’t count against the number of ranger spells you know. That part of the feature is fairly clear. However, it goes on to say:

You can cast each of these spells once without expending a spell slot. Once you cast a spell in this way, you can’t do so again until you finish a long rest.

To me, this section is unclear and has two possible interpretations:

  1. “You can cast each of these spells once…” – You can cast each of the listed spells once per long rest without expending a spell slot. A total of 6 spells cast per day.
  2. You can cast any of these spells once. – Once you have cast one of these spells in this way, you cannot cast any of them until you finish a long rest. A total of 1 spell cast per day.

I can make a logical argument for both cases, one based on the first sentence, the other on the second.

Which of these interpretations is correct?

Should each and every link SER makes be sent to either GSA SEO Indexer or Indexing Service via API?

I am, as usual, trying to refine my efforts. Recently, I’ve been making some headway in this regard. :)
My big question I’ve been wanting to ask: Does every link SER makes require indexing?
I presently use GSA SEO Indexer, but not on EVERY project. Am I wasting effort by not using either this program or an external indexer? Could I be getting more out of my linking with SER if I did index EVERY link?
Thanks, kindly, for any info…!

Does DFS in an unweighted DAG find the shortest path for each vertex from a source?

I have many questions related to this topic. I saw somewhere that topological sorting can be used to find shortest paths, and that in a DAG it can even find the shortest weighted paths to every vertex by checking, for each vertex, whether its current distance is higher than the previous vertex’s distance plus the edge weight. The point is, since we are in a DAG, I don’t understand why DFS alone isn’t enough to find shortest paths, both weighted and unweighted. I also don’t understand why we need a stack in topological sort. If you perform a pre-order DFS traversal on some DAG, it will automatically produce a topological sort.
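For context, here is a minimal sketch of the topological-order relaxation referred to above, assuming a weighted DAG given as (u, v, w) edge triples and a single source; the function and variable names are just placeholders.

```python
from collections import defaultdict

def dag_shortest_paths(edges, source):
    """Shortest paths from 'source' in a weighted DAG: relax each edge once,
    processing vertices in topological order."""
    graph = defaultdict(list)
    indegree = defaultdict(int)
    nodes = set()
    for u, v, w in edges:
        graph[u].append((v, w))
        indegree[v] += 1
        nodes.update((u, v))

    # Kahn's algorithm for a topological order.
    order, frontier = [], [n for n in nodes if indegree[n] == 0]
    while frontier:
        u = frontier.pop()
        order.append(u)
        for v, _ in graph[u]:
            indegree[v] -= 1
            if indegree[v] == 0:
                frontier.append(v)

    dist = {n: float("inf") for n in nodes}
    dist[source] = 0
    for u in order:                       # every predecessor of v is finalized before v
        for v, w in graph[u]:
            if dist[u] + w < dist[v]:     # the "edge plus previous vertex" check
                dist[v] = dist[u] + w
    return dist

print(dag_shortest_paths([("s", "a", 2), ("s", "b", 6), ("a", "b", 3), ("b", "c", 1)], "s"))
# {'s': 0, 'a': 2, 'b': 5, 'c': 6}
```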

Schedule each entree so that all entrees are completed in the shortest amount of time

Let’s say we have plenty of people to dress up entrees, but only one chef to cook them. Each entree E_i takes c_i time to cook and d_i time to “dress up”. The dressing up of entrees can occur while other entrees are being cooked and dressed up, but we can only cook one entree at a time. How would you go about creating an algorithm to schedule each entree E_i so that they all get finished in the shortest amount of time?
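One common greedy approach for this kind of one-machine-plus-parallel-finishing problem (not stated in the question itself) is to cook the entrees in decreasing order of dressing time d_i, so that the entrees that take longest to dress start dressing as early as possible. A minimal sketch, assuming one chef, unlimited dressers, and dressing starting the moment cooking ends:

```python
def schedule(entrees):
    """Greedy sketch: cook in decreasing order of dressing time d_i.
    'entrees' is a list of (name, cook_time, dress_time).
    Returns the cooking order and the overall completion time (makespan)."""
    order = sorted(entrees, key=lambda e: e[2], reverse=True)
    clock, makespan = 0, 0
    for name, cook, dress in order:
        clock += cook                             # the single chef cooks sequentially
        makespan = max(makespan, clock + dress)   # dressing runs in parallel afterwards
    return [name for name, _, _ in order], makespan

print(schedule([("A", 3, 7), ("B", 2, 2), ("C", 4, 5)]))
# (['A', 'C', 'B'], 12)
```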