## Finding the most frequent element, given that it’s Theta(n)-frequent?

We know [Ben-Or 1983] that deciding whether all elements in an array are distinct requires $$\Theta(n \log(n))$$ time (in the algebraic decision tree model); and element distinctness reduces to finding the most frequent element, so finding the most frequent element also takes $$\Theta(n \log(n))$$ time (assuming the domain of the array elements is not small).

But what happens when you know that there’s an element with frequency at least $$\alpha \cdot n$$? Can you then decide the problem, or determine what the element is, in linear time (in $$n$$, not necessarily in $$1/\alpha$$) and deterministically?
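For context, this is exactly the regime where deterministic linear-time algorithms are known: the Misra–Gries summary with $$m = \lceil 1/\alpha \rceil$$ counters retains every element whose frequency exceeds $$n/(m+1)$$, which includes any $$\alpha \cdot n$$-frequent element, and a second verification pass pins down the true winner. A minimal sketch in Python (the counter lookups here are hash-based; a strictly comparison-based variant would keep the counters in a balanced search tree, costing $$O(n \log(1/\alpha))$$):

```python
import math

def misra_gries_candidates(arr, m):
    """One pass with at most m counters; retains every element whose
    frequency exceeds len(arr) / (m + 1)."""
    counters = {}
    for x in arr:
        if x in counters:
            counters[x] += 1
        elif len(counters) < m:
            counters[x] = 1
        else:
            # all m counters occupied: decrement every counter,
            # deleting the ones that hit zero
            for key in list(counters):
                counters[key] -= 1
                if counters[key] == 0:
                    del counters[key]
    return list(counters)

def most_frequent_if_heavy(arr, alpha):
    """Assuming some element has frequency >= alpha * len(arr), return
    the most frequent element; O(n) for fixed alpha, deterministic."""
    m = math.ceil(1 / alpha)
    best, best_count = None, -1
    for c in misra_gries_candidates(arr, m):
        count = sum(1 for x in arr if x == c)  # verification pass
        if count > best_count:
            best, best_count = c, count
    return best
```

Without the frequency promise the candidate set can be empty of the true answer, which is consistent with the $$\Theta(n \log(n))$$ lower bound for the unrestricted problem.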

## What’s most vulnerable: the data or the DBMS?

When data breaches occur, do hackers figure out how the data is stored on disk and read it directly, or do they just figure out how to make queries to the database via the DBMS? Which is more likely?

## What payment method is most convenient for international transfers?

I work from Ukraine, and most of my customers are from the USA. What payment method is most popular for this? What would you choose?
I currently work through TransferWise, providing my IBAN.

## Most efficient method for set intersection

Suppose I have two finite sets, $$A$$ and $$B$$, with arbitrarily large cardinalities, the ordered integral elements of which are determined by unique (and well defined) polynomial generating functions $$f:\mathbb{N}\rightarrow\mathbb{Z}$$ given by, say, $$f_1(x_i)$$ and $$f_2(x_j)$$, respectively. Assume, also, that $$A\cap B$$ is always a singleton set $$\{a\}$$ such that $$a=f_1(x_i)=f_2(x_j)$$ where I’ve proven that $$i\neq j$$.

Assuming you can even avoid the memory-dump problem, it seems the worst way to find $$\{a\}$$ is to generate both sets and then check for the intersection. I wrote a simple program in Sagemath that does this, and, as I suspected, it doesn’t work well for sets with even moderately large cardinalities.

Is there a better way to (program a computer to) find the intersection of two sets, or is it just as hopeless (from a time-complexity perspective) as trying to solve $$f_1(x_i)=f_2(x_j)$$ directly when the cardinalities are prohibitively large? Is there a parallel-computing possibility? If not, perhaps there’s a way to limit the atomistic search based on a range of values—i.e., each loop terminates the search after it finds the first $$i$$ value such that $$f_1(x_i)>f_2(x_j)$$, knowing that $$f_1(x_{i+1}), f_1(x_{i+2}), f_1(x_{i+3}), \cdots, f_1(x_{i+n})>f_1(x_i)>f_2(x_j)$$.
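The early-termination idea in the last paragraph can be pushed all the way: if both polynomials are strictly increasing over the index range being searched, their value sequences come out sorted, and a lazy two-pointer merge finds the common element without materializing either set, advancing only the sequence that is currently behind. A sketch in Python (the two polynomials below are made-up examples, and the loop assumes a common value really exists, as the singleton-intersection premise guarantees):

```python
from itertools import count

def first_common(gen1, gen2):
    """First value produced by both strictly increasing iterators.
    Only the iterator that is currently behind is advanced, so
    neither sequence is ever stored: O(1) memory."""
    a, b = next(gen1), next(gen2)
    while a != b:
        if a < b:
            a = next(gen1)
        else:
            b = next(gen2)
    return a

# hypothetical example: f1(x) = 3x + 1 and f2(y) = 5y + 2,
# which first coincide at 7 (x = 2, y = 1)
f1_values = (3 * x + 1 for x in count(1))
f2_values = (5 * y + 2 for y in count(1))
print(first_common(f1_values, f2_values))  # -> 7
```

The running time is linear in the number of values examined before the match, which is the best one can hope for without exploiting the algebraic structure of $$f_1(x_i)=f_2(x_j)$$ directly.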

## Minimum number of nodes to select such that every node is at most k nodes away

I received this problem on an exam a few months ago, and have kept thinking about how to solve it with no luck.

Given a binary tree where each node in the tree can either be selected or unselected, implement a function k_away() which returns the minimum number of nodes that need to be selected so that every node in the tree is at most k nodes away from a selected node.

So, the nodes simply contain a pointer to the left child, a pointer to the right child, and a boolean marking it as selected or not:

```cpp
struct Node {
    Node *left;
    Node *right;
    bool selected = false; // starts out false
};
```

The problem specifies a constraint of having a time complexity of O(n) and an auxiliary space complexity of O(n).

What I’ve thought of so far:

• It seems like there are 2^n potential solutions (each of the n nodes can independently be selected or unselected), so brute force is a bad idea
• I’ve searched around for similar problems and the closest thing I can find is the Dominating Set problem, which is NP-hard on general graphs, so no polynomial-time algorithm is known there. I doubt this problem was given as an impossible problem, though.
• Running DFS down to the leaf nodes, then counting height as the recursion unwinds and marking a node as selected every k levels. This seems to work on small test cases, but does not always return the minimum number of selected nodes.
• Running BFS from the root to find all nodes k away, using another data structure to mark all visited nodes as ‘covered’, and then recursively running the BFS from each k-away node. This also seems to work on small test cases, but again doesn’t always return the minimum.
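For what it's worth, a single post-order DFS can meet the O(n) bounds. Each node reports two numbers to its parent: the distance to the deepest still-uncovered node in its subtree, and the distance to the nearest selected node in its subtree. A node must be selected exactly when its deepest uncovered descendant is already k away, because the parent (at distance k + 1) could no longer cover that descendant. I believe this is the standard greedy for distance-k domination on trees (a selection is only ever made when it is forced), but treat it as a sketch; shown in Python with a minimal stand-in for the Node struct:

```python
NEG, POS = float('-inf'), float('inf')

class Node:
    def __init__(self, left=None, right=None):
        self.left, self.right = left, right
        self.selected = False

def k_away(root, k):
    """Minimum number of nodes to select so that every node is at most
    k edges away from a selected node. Greedy post-order DFS."""
    selected = 0

    def dfs(node):
        nonlocal selected
        if node is None:
            return NEG, POS  # no uncovered node, no selected node
        ul, sl = dfs(node.left)
        ur, sr = dfs(node.right)
        uncov = max(ul, ur) + 1  # deepest uncovered descendant
        sel = min(sl, sr) + 1    # nearest selected descendant
        # a selected node in one subtree can cover the other through us
        if uncov >= 0 and uncov + sel <= k:
            uncov = NEG
        # this node itself is uncovered unless a selection is within k
        if sel > k:
            uncov = max(uncov, 0)
        # forced choice: our parent could no longer cover that node
        if uncov == k:
            node.selected = True
            selected += 1
            sel, uncov = 0, NEG
        return uncov, sel

    uncov, _ = dfs(root)
    if uncov >= 0:  # something (possibly the root itself) is still uncovered
        root.selected = True
        selected += 1
    return selected
```

Each node is visited once and only two numbers flow upward, so both time and auxiliary space (the recursion stack) are O(n).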

## What is the most damage that can be done in one round on your turn?

I was surprised that this question has not been asked.

I want to find out what the most damage is that can be done in a single round.

Here are the rules.

• Rules from Official Hardcover books only.
• Any official playable race can be used.
• No optional rules, other than those listed here.
• Multiclassing and feats are permitted.
• Characters can be up to level 20.
• Standard point buy for stats.
• No Magic items.
• No infinite loops; we are not trying to break the system. We are trying to find the maximum damage in a reasonable setting.
• Consistent effects only. Random elements like wild magic cannot be used.
• Epic Boons are permitted if you use the Aberrant Dragonmark feat. You can choose the boon you would like.
• No outside help. Permanent summoned creatures are permitted, and are considered to have already been summoned (find steed, familiar, homunculus, etc.). Their damage can be added to yours in the calculation.
• Assume one average target with no unusual defenses.
• All attacks hit.
• All saves are passed.
• Polymorph is allowed, but I suspect there are better methods.
• Any attack or combination of spells and attacks can be used provided you are following the rules.
• Include the maximum rolled damage as well as the average damage, where the average is calculated as (minimum + maximum) / 2.

I understand that a lot of you don’t like optimization questions, but try to leave me a comment before you downvote. I will address the issue and fix it (if possible).

This post is for people who enjoy theorycrafting and a challenge. I think it will be interesting to find out what the damage limit is.

## What is the most efficient way to turn a list of directory path strings into a tree?

I’m trying to find the most efficient way of turning a list of path strings into a hierarchical tree of hash maps using these rules:

• Node labels are delimited/split by ‘/’
• Hash maps have the structure:
```javascript
{
    label: "Node 0",
    children: []
}
```
• Node labels are also keys, so for example all nodes with the same label at the root level will be merged

So the following input:

```javascript
[
    "Node 0/Node 0-0",
    "Node 0/Node 0-1",
    "Node 1/Node 1-0/Node 1-0-0"
]
```

Would turn into:

```javascript
[
    {
        label: "Node 0",
        children: [
            {
                label: "Node 0-0",
                children: []
            },
            {
                label: "Node 0-1",
                children: []
            },
        ]
    },
    {
        label: "Node 1",
        children: [
            {
                label: "Node 1-0",
                children: [
                    {
                        label: "Node 1-0-0",
                        children: []
                    },
                ]
            },
        ]
    },
]
```
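One way to do this in a single pass, in time linear in the total number of path segments: insert each path into a nested-dict trie keyed by label, which merges shared prefixes automatically, then convert the trie into the label/children shape. A sketch in Python for illustration (in JavaScript, a Map per level would play the role of the dict; Python dicts preserve insertion order, matching the output order above):

```python
def build_tree(paths):
    # phase 1: a trie of nested dicts keyed by label;
    # setdefault merges nodes that share a label at the same level
    trie = {}
    for path in paths:
        node = trie
        for label in path.split("/"):
            node = node.setdefault(label, {})

    # phase 2: convert the trie into the {label, children} shape
    def to_nodes(level):
        return [{"label": label, "children": to_nodes(children)}
                for label, children in level.items()]

    return to_nodes(trie)

tree = build_tree([
    "Node 0/Node 0-0",
    "Node 0/Node 0-1",
    "Node 1/Node 1-0/Node 1-0-0",
])
# tree[0]["label"] == "Node 0"; tree[1]["children"][0]["label"] == "Node 1-0"
```

Each path segment is touched once during insertion and once during conversion, so the total work is proportional to the combined length of the input strings.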

## Counting a number of sequences where the item is the most frequent one

Consider an array $$a$$ whose elements are numbers. Consider all subsequences of $$a$$. For each subsequence $$s$$, we find the element $$k$$ with the largest number of occurrences in $$s$$ (if there are several options, choose the smallest such $$k$$).

Problem: for each number $$k$$, find the number of subsequences of $$a$$ for which $$k$$ is the chosen number.

Example:

```
Input: [1, 2, 2, 3]

Output:
1 -> 6 ([1], [1, 2], [1, 2'], [1, 3], [1, 2, 3], [1, 2', 3])
2 -> 8 ([2], [2'], [2, 3], [2', 3], [2, 2'], [1, 2, 2'], [2, 2', 3], [1, 2, 2', 3])
3 -> 1 ([3])
```
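For small inputs, the specification can be checked directly by enumerating all $$2^n - 1$$ non-empty subsequences; this brute force is only a reference for validating a faster algorithm, not a solution:

```python
from collections import Counter
from itertools import combinations

def chosen_element(sub):
    """The most frequent element of sub, ties broken by smallest value."""
    counts = Counter(sub)
    top = max(counts.values())
    return min(v for v, c in counts.items() if c == top)

def count_by_chosen(a):
    """For each k, the number of subsequences of a whose chosen element is k."""
    result = Counter()
    for r in range(1, len(a) + 1):
        for idx in combinations(range(len(a)), r):
            result[chosen_element([a[i] for i in idx])] += 1
    return dict(result)

print(count_by_chosen([1, 2, 2, 3]))  # -> {1: 6, 2: 8, 3: 1}
```

Note that duplicate values at different positions (the $$2$$ and $$2'$$ of the example) are counted as distinct subsequences, which is why indices rather than values are enumerated.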

My idea: build a map $$cnt$$ where $$cnt[x]$$ is the number of times $$x$$ occurs in the array $$a$$. When finding the answer for $$k$$, I create a map $$temp$$ where $$temp[i]=\min(cnt[k], cnt[i])$$. Letting $$m = cnt[k]$$, the element $$k$$ can occur from $$0$$ to $$m$$ times in a subsequence, and for each occurrence count I solve a simple number-of-subsets problem; summing over all occurrence counts gives the answer for $$k$$.

But the complexity of my algorithm is bad, as it is quadratic or worse. Can it be improved?

## Most effective way of improving survivability for an Ancestral Guardian Barbarian?

The Path of the Ancestral Guardian Barbarian (Xanathar’s Guide to Everything, p. 9-10) is an extremely powerful barbarian. It pretty much makes your allies invulnerable against an enemy boss.

But it does nothing for your own health. It will greatly incentivize enemies to take you down first to be rid of your annoying Guardian benefits.

Assuming I’m currently level 3 as an Ancestral Guardian, and leveling soon to 4, my stats are average (point buy), and I have no healing from allies, what is the best way to maximize my survivability for fights to come?

• I’m willing to look into multiclassing, if there’s a valid strategy there.
• I don’t expect anyone else to grab any healing abilities.
• I’m not interested in specific magical items. (Potions and other common/uncommon magic items are fine)
• I expect the campaign to last until about level 10.
• By character level 10, I would like to have at least 6 levels of Barbarian
• Emphasis on surviving against bosses, if possible.
• Expected about 2 combat encounters per day.

My party consists of:

• A Champion Fighter (who is very cowardly and selfish, and doesn’t tank much)
• A Fey Warlock (fairly standard, supportive player)
• An Evocation Wizard (who lives to blow stuff up)
• A Ranger/Rogue (who uses stealth and long range)

## What are the most tolerable options for an ordinary user to avoid being victimized by malware?

I’ve talked with a new friend who is fairly bright and can do some interesting things programming Office applications, but whose technical abilities don’t extend to infosec. And he got bitten by nasty malware.

I’m wondering what options might be most productive to offer to him. I’m not sure it’s realistic to repel all dedicated assault, but cybercriminals often look for someone who would be an easy kill, and (perhaps showing my ignorance here), I think it could be realistic to make a system that’s hardened enough not to be an easy kill.

Possibilities I’ve thought of include:

1. Windows 10 with the security screws turned down (how, if that is possible?).

2. Mint or another Linux host OS for what can be done under Linux, and a VMware or VirtualBox VM that is used for compatibility and may be restorable if the machine is trashed.

3. Migrating to a used or new Mac, possibly with a Windows virtual machine for compatibility; most people using Macs don’t seem to complain that they are missing things.

4. Paired with one of the technical setups above, pointing my friend to user education that says things like: "Don’t download software that you hadn’t set out to get. The \$20 up-front price of Marine Aquarium is dwarfed by the hidden price tags of adware and spyware offering a free aquarium screensaver."

This is not an exhaustive list, although it’s what I can think of now. I’ve had a pretty good track record for not engaging malicious software, and I think it can be learned (and that documentation for online safety would be taken very, very seriously).

What can I suggest to my friend for online safety?