## Graph ordering with smallest max vertex “discrepancy”

Consider an undirected graph $$G=(V,E)$$ and a bijective function $$f:V \rightarrow [|V|]$$ which orders the vertices by mapping them onto the first $$|V|$$ natural numbers.

Define the cost of an ordering of a graph to be:

$$\text{cost}(G,f) = \max_{\{u,v\} \in E} |f(u)-f(v)|$$

(I refer to $$|f(u) - f(v)|$$ in the title as “discrepancy” to avoid confusion with terms such as length or distance)

Intuitively, if we were to construct $$G$$ by adding one vertex at a time and connecting it to the appropriate already-existing vertices, the cost is the farthest back in our list of vertices we would ever have to reach to add an edge.

Question: How difficult is the problem of minimizing this cost? Is there an efficient algorithm for finding an optimal ordering? Also, is there a name for this cost that I’m not aware of?

Ideas:

A BFS ordering can do arbitrarily badly on this problem: consider a complete binary tree. Any ordering places the first and last vertices $$N-1$$ positions apart while their graph distance is at most the diameter $$O(\log N)$$, so every ordering costs $$\Omega(N/\log N)$$; orderings achieving $$O(N/\log N)$$ are known, but a BFS (level-order) ordering costs $$\Theta(N)$$.

A DFS ordering can do arbitrarily badly on this problem: consider a path graph modified so that each vertex is connected to an additional leaf vertex (a caterpillar). The optimal cost is $$2$$ (interleave the path vertices with their leaves), but a DFS ordering can cost up to $$\Theta(N)$$.
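Both examples are easy to check numerically. Here is a minimal Python sketch; the graph constructions, vertex labels, and helper functions are my own, purely to illustrate the cost definition:

```python
from collections import defaultdict

def cost(edges, order):
    """cost(G, f) = max over edges {u, v} of |f(u) - f(v)|."""
    pos = {v: i for i, v in enumerate(order)}
    return max(abs(pos[u] - pos[v]) for u, v in edges)

# --- BFS on a complete binary tree (heap-indexed nodes 1..N) ---
N = 127
tree_edges = [(i, c) for i in range(1, N + 1)
              for c in (2 * i, 2 * i + 1) if c <= N]
bfs_order = list(range(1, N + 1))  # level order is exactly heap-index order
print(cost(tree_edges, bfs_order))  # 64, i.e. about N/2

# --- DFS on a caterpillar: spine s0..s{k-1}, each with a pendant leaf ---
def caterpillar(k):
    spine = [("s", i) for i in range(k)]
    leaves = [("l", i) for i in range(k)]
    edges = [(spine[i], spine[i + 1]) for i in range(k - 1)]
    edges += [(spine[i], leaves[i]) for i in range(k)]
    return edges

def dfs_order(edges, root):
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    order, seen, stack = [], set(), [root]
    while stack:
        v = stack.pop()
        if v in seen:
            continue
        seen.add(v)
        order.append(v)
        stack.extend(reversed(adj[v]))  # visit first-listed neighbour first
    return order

k = 50  # 2k = 100 vertices in total
edges = caterpillar(k)
# interleaving spine and leaves (s0, l0, s1, l1, ...) achieves the optimum
interleaved = [v for i in range(k) for v in (("s", i), ("l", i))]
print(cost(edges, interleaved))                 # 2
print(cost(edges, dfs_order(edges, ("s", 0))))  # 99 = 2k - 1, i.e. Θ(N)
```

The DFS here walks the whole spine before backtracking to the leaves, so the leaf attached to the first spine vertex ends up $$2k-1$$ positions away from it.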

## RAM usage discrepancy? Memory leak? Ubuntu 18.10 with Cinnamon

So I’m fairly new to using Ubuntu, but what I’ve been finding is that I need to reboot regularly: over the course of a day or two I seem to run out of usable RAM, and the games I play lock up and start disk thrashing. If I’m lucky I can wait it out and force-close the game; if I’m not, I need to just hard-reboot. The weirdest thing about it, though, is this: https://i.imgur.com/mD8wVRE.png

This shows that just about half my RAM is in use. But then I tab over to the processes page and: https://i.imgur.com/cZ4sZqm.png

As you can see, this is without even running things like a browser or anything. So where has the RAM gone?

## Transaction discrepancy between Google Analytics and Woocommerce

Hey guys!

I am currently seeing a 25% discrepancy between the transaction numbers reported in Google Analytics and those reported in the Woocommerce backend, and I don't know the cause of it. I have read many articles (e.g. “Google Analytics and Woocommerce transaction mismatch”, etc.) but so far I haven't been able to fix my problem, and Google Analytics is still reporting 25%…


## How should the DM manage the discrepancy between the player’s memory and their PC’s memory?

It may happen that, during a session, players don’t remember the name of a NPC that they met (or, more generally, information about something that happened) during the previous session. Obviously, their PC remembers that information. Conversely, a player may have taken notes about a not so important event that happened several years ago (in game). In this case, it is possible that the PC does not remember it.

How should the DM manage the discrepancy between the player’s memory and their PC’s memory? In the case of 5e, should the DM call for Intelligence checks?

## Lower bound of disjointness by discrepancy?

I need to show that $$Disc_\mu(Disj) \geq \frac{1}{2n+1}$$ for any distribution $$\mu$$. Disjointness is defined as

$$Disj(X,Y)=\begin{cases}1 & \text{if } X \cap Y = \emptyset \\ 0 & \text{otherwise}\end{cases}$$

The discrepancy for a function is defined as

$$Disc_\mu(f) = \max_R \{Disc_\mu(R,f)\}$$, where the maximum is taken over all combinatorial rectangles $$R$$, and further

$$Disc_\mu(R,f) = |\Pr_\mu [(x,y) \in R \wedge f(x,y)=1] - \Pr_\mu[(x,y) \in R \wedge f(x,y) =0]|$$.

I don’t understand where the $$\frac{1}{2n+1}$$ comes from and how to define a proper rectangle $$R$$.
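For tiny $$n$$ the discrepancy can be brute-forced, which at least lets you sanity-check the bound before hunting for a proof. A quick Python sketch under the uniform distribution $$\mu$$ (the function name and setup are mine; the bound itself has to hold for every $$\mu$$, so this only checks one case):

```python
from itertools import combinations, product

def max_discrepancy(n):
    """Brute-force Disc_mu(Disj) under the uniform distribution mu on
    pairs of subsets of [n], maximised over all rectangles A x B."""
    sets = [frozenset(s) for r in range(n + 1)
            for s in combinations(range(n), r)]
    m = len(sets)  # 2^n subsets per side; mu puts weight 1/m^2 on each pair
    best = 0.0
    for ra in range(1, 1 << m):          # choice of row set A
        rows = [sets[i] for i in range(m) if ra >> i & 1]
        for cb in range(1, 1 << m):      # choice of column set B
            cols = [sets[j] for j in range(m) if cb >> j & 1]
            ones = sum(1 for x, y in product(rows, cols) if not (x & y))
            zeros = len(rows) * len(cols) - ones
            best = max(best, abs(ones - zeros) / (m * m))
    return best

for n in (1, 2):
    print(n, max_discrepancy(n), 1 / (2 * n + 1))
```

The enumeration is doubly exponential in $$n$$ (there are $$2^{2^n} \times 2^{2^n}$$ rectangles), so this is only feasible for $$n \le 3$$, but it confirms that the maximising rectangle sits well above $$\frac{1}{2n+1}$$ in the uniform case.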