Interview question: Smallest possible length of stick from an array of stick lengths

I was asked this question in a phone interview recently and I bombed it completely; I had zero clue how to approach it, and I wasn't able to find any similar problems by googling. Thought maybe folks here might be able to help?


Statement: Given m sticks of various lengths, combine them to form longer sticks that all have the same length. What is the smallest possible length of these combined sticks?

Conditions:

  • Must use all sticks
  • m < 50
  • max length of a single stick is less than 20

Examples:

Input: 5 2 1 5 2 1 5 2 1
Output: 6 (process: 1+5, 1+5, 1+5, 2+2+2)

Input: 3 3 3 2 2 5
Output: 9 (process: 3+3+3, 2+2+5)

Input: 1 2 3 4 5
Output: 5 (process: 2+3, 1+4, 5)

Input: 1 3 4 5
Output: 13 (process: 1+3+4+5)
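
One way to approach it, for anyone who finds this later (a minimal backtracking sketch; the function name and structure are my own illustration, not a reference solution): the answer must be some length L that divides the total length, is at least the longest single stick, and admits a partition of all sticks into groups each summing to exactly L. Trying candidates in increasing order and checking each with backtracking is comfortably fast at these input sizes (m < 50, stick length < 20):

    def smallest_stick_length(sticks):
        # Smallest L such that every stick is used and each combined
        # stick sums to exactly L.
        total = sum(sticks)
        sticks = sorted(sticks, reverse=True)    # big sticks first prunes faster

        def can_partition(groups):
            # groups[g] is the remaining capacity of combined stick g
            def place(i):
                if i == len(sticks):
                    return True
                tried = set()
                for g in range(len(groups)):
                    cap = groups[g]
                    if cap >= sticks[i] and cap not in tried:
                        tried.add(cap)           # equal capacities are interchangeable
                        groups[g] -= sticks[i]
                        if place(i + 1):
                            return True
                        groups[g] += sticks[i]
                return False
            return place(0)

        # L = total always works (glue everything into one stick),
        # so the loop is guaranteed to return.
        for length in range(max(sticks), total + 1):
            if total % length == 0 and can_partition([length] * (total // length)):
                return length

    print(smallest_stick_length([5, 2, 1, 5, 2, 1, 5, 2, 1]))   # 6
    print(smallest_stick_length([1, 3, 4, 5]))                  # 13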

Does the ability score modifier go in the large box or the small oval?

When filling in the ability scores section of the standard character sheet as used in the D&D 5e Player's Handbook, I've seen a lot of people put each ability score in the large box, and the corresponding ability score modifier in the small oval below it. This makes sense to some people because that's the order in which you write them, top to bottom.

However, it seems more practical to put the ability score modifier in the large box, since that’s almost always the number you need to look at, and the ability score in the small oval.

Is there any official ruling on this, or other evidence to suggest which approach (if either) is official, standard, or more correct?

Attempting to merge 2 small fresh projects causes GSA SER to freeze

Hello Sven,
I have tried several times: I set all projects to Inactive, GSA SER is stopped, and no threads are shown in the status bar. I have also reset the Submitted records, so the projects keep only their Options and Verified entries (105 and 130), to make them smaller for the merge. No matter what, every time I try it, GSA SER freezes. At the moment of the attempt, the only other things running are GSA SEO Indexer, CapMonster, GSA Proxy Scraper, and the Dropbox application that feeds GSA SER with fresh lists; nothing else is running beyond those apps. I have set the thread count to 20 in all GSA apps, but even that had no positive impact.

Is there anything else I can do to keep GSA SER from freezing all the time? Every freeze forces me to kill it in Task Manager and start it again.

My hardware configuration is as follows: Intel i3-7130U @ 2.70 GHz, 12 GB DDR4 RAM, 1 TB M.2 NVMe, 500 Gbps WAN.
System resource usage is as follows: 12% CPU, 35% RAM, 0% HDD, 0% LAN.
My OS is MS Windows 10 Pro 64-bit. The machine is dedicated solely to link building.
Thanks for your answer.

Regards,
Michal

Is there any way to gain the endless special quality without carrying around a small necromantic magic item?

Dragon Magazine #354 has a fairly well-known special quality called endless, which prevents aging and all its normal effects, but is unfortunately not actually granted by its associated feat, Wedded to History. Within the pages of this magazine, the only way to gain the quality (DM fiat aside) is to have someone cast kissed by the ages on you, and then to forevermore give up a magical item body slot and risk taking a penalty if you ever lose the item, which also radiates enough necromancy to make many NPCs very uneasy.

But what about methods outside the pages of the magazine? Is there any sort of feat, feature, or other special means by which someone can gain this extraordinary special quality, without needing to go around holding a pseudo-phylactery?

Large creature overrun of a Medium and a Small creature

A halfling and a human are blocking a group of winter wolves (Large) on a 10′ wide ledge. Can the wolves just squeeze through the halfling's space, which is two sizes smaller, and if so, at 1/2 speed or 1/4 speed? Do the wolves need (or get) to overrun both with advantage, since the two defenders only take up two squares while a winter wolf takes up 4? We are playing G2 from Tales from the Yawning Portal.

Minimum spanning tree with a small set of possible edge weights

Given an undirected graph that has only two distinct edge weights $x$ and $y$, is it possible to devise a faster algorithm than Prim's algorithm or Kruskal's algorithm?

I saw that analyses of Kruskal's algorithm already assume the edges can be sorted in linear time with counting sort, so can we gain any further benefit from knowing the weights of all of the edges ahead of time?

I don’t think there is any benefit, but I am not sure…
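
Not a full answer on the asymptotics, but here is a sketch of what two weights buy you, assuming $x < y$ (the code and names are my own illustration): take every $x$-edge that joins two different components, then patch the remaining components together with $y$-edges. This is exactly Kruskal's algorithm with the sort collapsed into two buckets, so the cost is dominated by the union-find operations rather than by sorting:

    def two_weight_mst(n, edges, x, y):
        # n vertices labelled 0..n-1; edges is a list of (u, v, w) with
        # w in {x, y} and x < y. Returns (total_weight, tree_edges),
        # or None if the graph is disconnected.
        parent = list(range(n))

        def find(a):
            while parent[a] != a:
                parent[a] = parent[parent[a]]    # path halving
                a = parent[a]
            return a

        def union(a, b):
            ra, rb = find(a), find(b)
            if ra == rb:
                return False
            parent[ra] = rb
            return True

        light = [e for e in edges if e[2] == x]  # two buckets replace the sort
        heavy = [e for e in edges if e[2] == y]

        total, tree = 0, []
        for u, v, w in light + heavy:            # nondecreasing weight order
            if union(u, v):
                total += w
                tree.append((u, v, w))
        return (total, tree) if len(tree) == n - 1 else None

Whether that counts as a real gain is, I think, the heart of the question: the sorting step disappears, but the connectivity bookkeeping (union-find, or an equivalent component computation) remains.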

How do you determine whether the constraint length k is a small or large value?


"For small values of k, this is done with a widely used algorithm developed by Viterbi (Forney, 1973)."

My question is: how do they determine whether a value of k is considered small or large? What is the threshold value for k? For example, the constraint length of this code is 7 and they consider that a small value. What about 10, or 20? Would those be considered small or large? I'm curious about the threshold value of k.

This is an excerpt from Computer Networks by Andrew S. Tanenbaum:

The Data Link Layer (Chapter 3), page 208, fifth edition.
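
Some context on why the cutoff matters (my own illustration, not from the book): Viterbi decoding keeps one trellis state per possible encoder memory pattern, i.e. $2^{k-1}$ states, and updates all of them for every decoded bit, so the work grows exponentially in k:

    # Illustration (mine, not the book's): Viterbi decoding tracks 2^(k-1)
    # trellis states and updates each of them for every decoded bit.
    for k in (3, 7, 10, 20, 30):
        states = 2 ** (k - 1)
        print(f"k = {k:2d}: {states:>13,} trellis states per decoded bit")

On that scale, k = 7 (64 states) is cheap and k = 10 (512 states) is still manageable, while k = 20 (over half a million states) starts to hurt; "small" is less a fixed threshold than whatever keeps $2^{k-1}$ affordable for the decoder at hand.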

Does a small race do extra damage when enlarged by the Enlarge/Reduce spell?

A friend and I were discussing gnome barbarians, and the fact that Small creatures have disadvantage on attack rolls when using a weapon with the heavy property. I brought up that the “Enlarge/Reduce” spell would negate this disadvantage by temporarily making the gnome's size category Medium.

However, the “Enlarge/Reduce” spell also causes creatures to deal an extra 1d4 damage with the weapons they use. If a gnome/kobold/halfling/goblin were to hold a greatsword and have a wizard enlarge them, it appears that not only would the disadvantage imposed by being a Small race be removed, but they would also deal 1d4 more damage than any normally Medium character using the weapon. This makes no sense to me, though; am I reading everything correctly?

Problems for which a small change in the statement causes a big change in time complexity

I know that there are several problems for which a small change in the problem statement would result in a big change in its (time) complexity, or even in its computability.

An example: The Hamiltonian path problem defined as

Given a graph, determine whether a path that visits each vertex exactly once exists or not.

is NP-Complete while the Eulerian path problem defined as

Given a graph, determine whether a trail that visits every edge exactly once exists or not.

is solvable in linear time with respect to the number of edges and nodes of the graph.
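
To make the contrast concrete, here is a minimal sketch of the easy side (my own illustration, assuming an undirected multigraph given as an edge list): an Eulerian path exists iff all edges lie in one connected component and the number of odd-degree vertices is 0 or 2, both checkable in $O(V + E)$:

    from collections import defaultdict

    def has_eulerian_path(edges):
        # Undirected multigraph as a list of (u, v) pairs. An Eulerian path
        # exists iff all edges lie in one connected component and the number
        # of odd-degree vertices is 0 or 2.
        if not edges:
            return True
        adj = defaultdict(list)
        for u, v in edges:
            adj[u].append(v)
            adj[v].append(u)
        seen, stack = {edges[0][0]}, [edges[0][0]]   # DFS from any edge endpoint
        while stack:
            for w in adj[stack.pop()]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        if any(v not in seen for v in adj):          # some edge is unreachable
            return False
        odd = sum(len(adj[v]) % 2 for v in adj)
        return odd in (0, 2)

    print(has_eulerian_path([(0, 1), (1, 2), (2, 0)]))   # True (a cycle)
    print(has_eulerian_path([(0, 1), (2, 3)]))           # False (disconnected)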

Another example is 2-SAT (polynomial complexity) vs. k-SAT (NP-complete), although one could argue that 2-SAT is just a special case of k-SAT.
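
The 2-SAT side is also worth sketching, because the polynomial bound comes from a concrete construction: each clause $(a \lor b)$ contributes the implications $\lnot a \Rightarrow b$ and $\lnot b \Rightarrow a$, and the formula is satisfiable iff no variable lands in the same strongly connected component of the implication graph as its negation. A sketch of my own, using Kosaraju's SCC algorithm (the literal-to-node encoding is an arbitrary choice):

    def two_sat(n, clauses):
        # n boolean variables numbered 1..n; each clause (a, b) means
        # (a OR b), where -v stands for NOT v. True iff satisfiable.
        N = 2 * n
        adj = [[] for _ in range(N)]
        radj = [[] for _ in range(N)]

        def node(lit):                   # literal -> node index
            return 2 * (abs(lit) - 1) + (1 if lit < 0 else 0)

        def implies(a, b):               # edge a -> b, plus its reverse copy
            adj[a].append(b)
            radj[b].append(a)

        for a, b in clauses:             # (a or b) == (!a -> b) == (!b -> a)
            implies(node(-a), node(b))
            implies(node(-b), node(a))

        # Pass 1: record vertices in order of DFS completion (iteratively).
        order, seen = [], [False] * N
        for s in range(N):
            if seen[s]:
                continue
            seen[s] = True
            stack = [(s, iter(adj[s]))]
            while stack:
                v, it = stack[-1]
                for w in it:
                    if not seen[w]:
                        seen[w] = True
                        stack.append((w, iter(adj[w])))
                        break
                else:
                    order.append(v)
                    stack.pop()

        # Pass 2: peel off SCCs of the reversed graph in reverse finish order.
        comp, c = [-1] * N, 0
        for s in reversed(order):
            if comp[s] != -1:
                continue
            comp[s], stack = c, [s]
            while stack:
                for w in radj[stack.pop()]:
                    if comp[w] == -1:
                        comp[w] = c
                        stack.append(w)
            c += 1

        return all(comp[2 * v] != comp[2 * v + 1] for v in range(n))

    print(two_sat(2, [(1, 2), (-1, 2), (1, -2)]))   # True  (x1 = x2 = True)
    print(two_sat(1, [(1, 1), (-1, -1)]))           # False (x1 AND NOT x1)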

What do you call this kind of problem, if it even has a name? Can someone provide a list of other examples or some references?