Is it correct or incorrect to say that an input, say $C$, causes the average run-time of an algorithm?

I was going through the text Introduction to Algorithms by Cormen et al., where I came across an excerpt which I felt required a bit of clarification.

Now, as far as I have learned, the best-case and worst-case time complexities of an algorithm arise for specific concrete inputs (say, an input $A$ causes the worst-case run time of an algorithm, or an input $B$ causes its best-case run time, asymptotically), but there is no such concrete input which causes the average-case run time of an algorithm, because the average-case run time is, by its definition, the run time of the algorithm averaged over all possible inputs. It is something which, I hope, exists only mathematically.
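To make that definition concrete, the usual formulation (in my notation, and assuming some probability distribution over the inputs of size $n$, often taken to be uniform) is

$T_{\text{avg}}(n) = \sum_{x \,:\, |x| = n} \Pr[x] \, T(x),$

i.e. the average-case running time is a weighted sum over all inputs of size $n$, not the running time on any single input.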

But, on the other hand, inputs to an algorithm which are neither the best-case input nor the worst-case input are supposed to lie somewhere between the two extremes, and the performance of our algorithm on them is measured by none other than the average-case time complexity, since the average-case time complexity lies between the worst-case and best-case complexities just as our input lies between the two extremes.

Is it correct or incorrect to say that an input, say $C$, causes the average run-time of an algorithm?

The excerpt from the text which made me ask such a question is as follows:

In the context of the analysis of quicksort:

In the average case, PARTITION produces a mix of “good” and “bad” splits. In a recursion tree for an average-case execution of PARTITION, the good and bad splits are distributed randomly throughout the tree. Suppose, for the sake of intuition, that the good and bad splits alternate levels in the tree, and that the good splits are best-case splits and the bad splits are worst-case splits. Figure (a) shows the splits at two consecutive levels in the recursion tree. At the root of the tree, the cost is $n$ for partitioning, and the subarrays produced have sizes $n-1$ and $0$: the worst case. At the next level, the subarray of size $n-1$ undergoes best-case partitioning into subarrays of size $(n-1)/2 - 1$ and $(n-1)/2$. Let’s assume that the boundary-condition cost is $1$ for the subarray of size $0$.

The combination of the bad split followed by the good split produces three subarrays of sizes $0$, $(n-1)/2 - 1$, and $(n-1)/2$ at a combined partitioning cost of $\Theta(n)+\Theta(n-1)=\Theta(n)$. Certainly, this situation is no worse than that in Figure (b), namely a single level of partitioning that produces two subarrays of size $(n-1)/2$, at a cost of $\Theta(n)$. Yet this latter situation is balanced!
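Writing the intuition of the excerpt as a recurrence (a sketch, under the assumption above that worst-case and best-case splits alternate): the bad split followed by the good split costs $\Theta(n)$ in total and leaves subarrays of sizes $(n-1)/2 - 1$ and $(n-1)/2$, so

$T(n) \le T\!\left(\frac{n-1}{2} - 1\right) + T\!\left(\frac{n-1}{2}\right) + \Theta(n) \le 2\,T\!\left(\frac{n-1}{2}\right) + \Theta(n) = O(n \lg n),$

which is the same asymptotic bound as for perfectly balanced partitioning, only with a larger constant hidden in the notation.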

If the Bestow Curse spell causes me to do extra damage, does the target’s death trigger the Necromancy wizard’s Grim Harvest feature?

In D&D 5e using a wizard of the School of Necromancy:

If I cast bestow curse on a monster, then kill it with a crossbow, would it trigger the Grim Harvest feature due to the extra 1d8 damage? Does it matter if the monster had 1 HP?

What about if I cast bestow curse, then hit it with magic missile (and kill it with just 1 missile) – would Grim Harvest trigger off the missile or the curse?

Does filling up the plan cache cause a decrease in the space allocated for the data cache?

SQL Server uses the allocated server memory for different kinds of purposes. Two of them are the plan cache and the data cache, which are used to store execution plans and actual data, respectively.

My question: do these two caches have separately allocated sections in the buffer pool, or, on the contrary, do they share a single section of the buffer pool between them?

In other words, if the plan cache is filling up, is the space for the data cache reduced as well?
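For reference, a query along these lines should show how much memory each consumer currently holds (a sketch, assuming SQL Server 2012 or later, where sys.dm_os_memory_clerks exposes pages_kb, and assuming that CACHESTORE_SQLCP/CACHESTORE_OBJCP correspond to the plan cache and MEMORYCLERK_SQLBUFFERPOOL to the data cache):

-- Approximate memory held by the plan-cache stores vs. the buffer pool's data pages
SELECT type, SUM(pages_kb) / 1024 AS size_mb
FROM sys.dm_os_memory_clerks
WHERE type IN ('CACHESTORE_SQLCP', 'CACHESTORE_OBJCP', 'MEMORYCLERK_SQLBUFFERPOOL')
GROUP BY type
ORDER BY size_mb DESC;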

Problems for which a small change in the statement causes a big change in time complexity

I know that there are several problems for which a small change in the problem statement would result in a big change in its (time) complexity, or even in its computability.

An example: The Hamiltonian path problem defined as

Given a graph, determine whether a path that visits each vertex exactly once exists or not.

is NP-Complete while the Eulerian path problem defined as

Given a graph, determine whether a trail that visits every edge exactly once exists or not.

is solvable in linear time with respect to the number of edges and nodes of the graph.
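For completeness, the reason the Eulerian case is so much easier is the classical characterization (assuming the graph is connected once isolated vertices are ignored): an Eulerian trail exists if and only if

$\#\{\, v : \deg(v) \text{ is odd} \,\} \in \{0, 2\},$

so the check reduces to counting vertex degrees and testing connectivity, both of which take time linear in the number of vertices and edges.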

Another example is 2-SAT (polynomial complexity) vs. k-SAT (NP-complete for $k \ge 3$), although someone could argue that 2-SAT is just a specific case of k-SAT.

What do you call this kind of problem, if it even has a name? Can someone provide a list of other examples or some references?

Converting latin1 to utf8mb4 causes question marks

  • The original format of the data is unknown
  • The new table is in utf8mb4_general_ci

If I do CONVERT(BINARY CONVERT(column USING latin1) USING UTF8), as mentioned here, it fixes all the text, but it converts something like © in the original column to ? in the new column.
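For reference, written out as a full statement, the same technique would look roughly like this (a sketch; my_table and my_column are placeholder names, utf8mb4 is targeted instead of utf8, and it is worth testing on a copy of the data first):

-- Reinterpret the raw bytes as UTF-8 instead of transcoding them
UPDATE my_table
SET my_column = CONVERT(CAST(CONVERT(my_column USING latin1) AS BINARY) USING utf8mb4);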

If it helps to determine what original encoding it was in, the original text renders as e.g. KotaÄići and converts to Kotačići.

Is there a way to both preserve special characters and restore correct utf8 text format?

VirtualBox LAMP Server to Proxmox causes webroot to disappear

Hey guys! First time here.
Hopefully I'm in the right spot, move this thread if not.

I'm having some problems when trying to transfer my completely functional website that's hosted on my VirtualBox VM LAMP server onto a Proxmox installation.
Let me explain.
I've had my first website up for about two months now. I'm running it on an Ubuntu Server 18.04 LAMP stack in a VirtualBox VM on Windows 10. I have my own GoDaddy-signed certs, so my site has HTTPS. I wanted to transfer this to my Proxmox…


How might the weight of a falling object affect the damage it causes?

Fall damage is 1d6 per 10 feet. What adjustments, if any, should I make for objects falling on a player character (e.g. a bear)?

Assuming the objects are meaningful threats but not instant character death, should the weight of an object change the calculation, e.g. more than 1d6 per 10 feet?

Or is this more in the spirit of the improvising-damage chart, i.e. setback (a cat to the face), dangerous (an orc fell on me), and deadly (the large bear)?

If this is house-rule territory, does anyone have any experience or advice beyond the wiki page relevant to 5e?

Whenever a project is edited, it causes all article files to be rewritten, which causes a huge delay

I have imported many articles. They total 270 MB.

I just noticed that whenever I edit a project, all the article files are rewritten even though I didn’t make a single change.

I suppose they are also read whenever we click Edit on any project, which makes the project edit window take a really long time to load.

This causes a huge delay. Can you make the article files be read and written only when we actually change an article?

What causes autonumbering to stop working after “Evaluating” a notebook in 12.0?

I’m using “Section” autonumbering in a notebook that contains text and calculations. Normally my “Section” style numbering looks like:

1)  2)  3) 

After evaluating the notebook the “Section” style numbering looks like:

0) 0) 0) 

Nothing I have tried corrects the problem. I have tried stopping and restarting the kernel. I have tried unloading and reloading the style sheet. When I execute:

CurrentValue[{"CounterValue", "Section"}] 

the command returns zero, even though prior to executing the command I had created three “Section” cells. I get the same type of failure if I use the default stylesheet and the “ItemNumbered” style.

If I exit Mathematica and reopen it, all the sections are numbered correctly. Also, if I open another notebook, it autonumbers correctly. The problem is localized to any notebook that has been evaluated with the notebook evaluation command.

This problem doesn’t happen with 11.3, only with 12.0. Does anyone know how to correct the problem in the evaluated notebook?