Searching a 2D array of binary data

I’m working on optimizing the structure of an optical metadevice. I have a randomly generated 2D binary matrix, where 1/0 represents the presence or absence of a hole. Each structure manipulates light in a different way, thus giving rise to a unique spectrum.

The problem I wish to solve is to maximize the efficiency of this structure. Given the size of the solution space ($2^{100}$ in this case), it isn’t feasible to simulate every structure. Is there a search method I could use for this optimization?

A general workflow would be:

  1. Generate a random hole structure
  2. Flip one or some bits (based on the optimization algorithm)
  3. Compute the spectrum
  4. Go back to step 2 and make a decision based on the previously computed spectrum.
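The loop above could be sketched with, for example, a simulated-annealing acceptance rule, a common choice for binary design problems of this kind. Everything here is an illustrative assumption: the `efficiency` function is a dummy stand-in for your spectrum simulation, and the temperature schedule is arbitrary.

```python
import math
import random

def anneal(n=10, iters=2000, temp0=0.05, efficiency=None, seed=0):
    """Bit-flip search over an n x n binary hole array (sketch).

    `efficiency` should map a grid to a score; the default is a dummy
    objective (fraction of holes) standing in for a real spectrum solver.
    """
    rng = random.Random(seed)
    if efficiency is None:
        efficiency = lambda g: sum(map(sum, g)) / (n * n)  # placeholder

    grid = [[rng.randint(0, 1) for _ in range(n)] for _ in range(n)]  # step 1
    current = efficiency(grid)                                        # step 3
    for t in range(iters):
        i, j = rng.randrange(n), rng.randrange(n)
        grid[i][j] ^= 1                          # step 2: flip one bit
        cand = efficiency(grid)                  # step 3: recompute the score
        temp = temp0 * (1 - t / iters) + 1e-12
        # step 4: keep improvements; occasionally keep worse moves early on
        if cand >= current or rng.random() < math.exp((cand - current) / temp):
            current = cand
        else:
            grid[i][j] ^= 1                      # revert the flip
    return grid, current
```

With `temp0 = 0` this degenerates to pure hill climbing; raising it lets the search escape local optima at the cost of more simulations.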

Here’s a link to a sample hole array.

Apologies for the vague statement of the problem. Thanks in advance!

Animal Companions hiding, searching, and readying an action

The PHB has this to say about Animal Companions:

The beast obeys your commands as best as it can. It takes its turn on your initiative. On your turn, you can verbally command the beast where to move (no action required by you). You can use your action to verbally command it to take the Attack, Dash, Disengage, or Help action. If you don’t issue a command, the beast takes the Dodge action.

My question is this: although the feature specifies that commanding movement takes no action while the listed actions cost your action, can a Ranger also verbally command the beast to Hide, Search, or Ready an action (none of which appear in that list)? Nothing in the feature specifies that you can’t order the beast to take any of the other available actions.

What are recognized ways to search for a specific string like “video234.mp4” in the DOM of a large indexing site with pre-existing pages?

I’m trying to search a whole site for a specific string that appears in the source code of one of its pages. I’m thinking of using a crawler to do this. Is a crawler intended for this, or are there more efficient ways?

Unlike indexing sites like Google, whose results can vary even when the same request is repeated, this site only has pages created by its users, which makes it easier to search the whole site. The content isn’t served by JavaScript, so that isn’t an obstacle.
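A crawler is indeed the straightforward approach when a site offers no search API. A minimal breadth-first sketch follows; the `fetch` callable is an assumption (plug in `urllib` or `requests` yourself), and in practice you would also respect the site’s robots.txt and rate-limit your requests.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkParser(HTMLParser):
    """Collects href targets of <a> tags from an HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl_for_string(start_url, needle, fetch, max_pages=1000):
    """Breadth-first crawl of one domain; returns the first URL whose
    source contains `needle`, or None. `fetch(url) -> str` is injected
    so the network layer is up to you."""
    domain = urlparse(start_url).netloc
    seen, queue = {start_url}, deque([start_url])
    while queue and len(seen) <= max_pages:
        url = queue.popleft()
        html = fetch(url)
        if needle in html:
            return url
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            nxt = urljoin(url, href)
            if urlparse(nxt).netloc == domain and nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return None
```

Restricting followed links to the starting domain and capping `max_pages` keeps the crawl from wandering off-site or running forever.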

Amount of expected loop iterations when searching an array by random index

Let’s say we have an array A of size n, indexed from 1 to n. It contains a value x, with x occurring k times in A, where $1 \le k \le n$.

If we have a search algorithm like so:

    while true:
        i := random(1, n)
        if A[i] == x:
            break

where random(a, b) picks an integer uniformly from a to b (inclusive).

From this we know that the chance of finding x and terminating the loop is k/n on each iteration. What I would like to know is the expected number of iterations, or more specifically, the expected number of times the array is accessed by this program, given the array A described above.
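Whatever closed form you derive can be sanity-checked empirically. A quick Monte-Carlo sketch (function name and parameters are mine): since each iteration succeeds with probability k/n, the iteration count is geometrically distributed and the empirical mean should approach n/k.

```python
import random

def mean_iterations(n, k, trials=100_000, seed=42):
    """Monte-Carlo estimate of the expected number of loop iterations.

    Each iteration hits one of the k matching slots with probability k/n,
    so the count is geometric; the empirical mean should approach n/k.
    """
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        count = 0
        while True:
            count += 1
            if rng.random() < k / n:  # same event as A[random index] == x
                break
        total += count
    return total / trials
```

For example, with n = 10 and k = 2 the estimate hovers around 5.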

A Monk throws three coloured darts, then spends one minute searching the battlefield; how many darts does the Monk recover?

A Monk PC has three ordinary darts. He’s very creative and painted the darts different colours, the primary colours. One dart is red, one dart is yellow, one dart is blue.

The monk throws the three coloured darts at an opponent. He then spends one minute searching the battlefield. What is the monk able to recover?

Each coloured dart has a different but equal sentimental value to the monk PC.

A crossbowman fires three coloured bolts, then spends one minute searching the battlefield; which bolt(s) does he recover?

At the end of the battle, you can recover half your expended ammunition by taking a minute to search the battlefield. (PHB p. 146)

A ranger PC with a crossbow has three ordinary bolts. He’s very creative and painted the bolts different colours, the primary colours. One bolt is red, one bolt is yellow, one bolt is blue.

The ranger fires the three bolts at an opponent. He then spends one minute searching the battlefield. What is the ranger able to recover?

Each coloured bolt has a different but equal sentimental value to the ranger PC.

“Searching and sorting” algorithm to find the natural logarithm of a number?

Yeah, this is for a homework assignment, but I hope that you’ll humor me anyway. I am asked to design an algorithm that finds the natural logarithm of a number. This would be straightforward, but I’m not allowed to use strategies that involve integration, bit manipulation, approximation formulas, or Taylor Series. Instead, we’re asked to generate an array of values where at least one of them will be the answer, and then to find the correct value in that array. There must be some kind of established algorithm for this, but this and other resources haven’t been helpful. I’m stuck. What should I be thinking about here?
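One hedged reading of the hint, under the assumption that calling the exponential function is permitted: build a sorted array of candidate values and binary-search it, using the monotone exponential as the comparator. For x ≥ 1, ln x lies in [0, x − 1] because $e^t \ge 1 + t$. The function name and grid size below are illustrative.

```python
import math

def ln_by_search(x, n=100_000):
    """Estimate ln(x) for x >= 1 by binary-searching a sorted candidate array.

    Since e^t >= 1 + t, ln(x) lies in [0, x - 1]; grid that interval and
    find the first candidate whose exponential reaches x.
    """
    step = (x - 1.0) / n
    candidates = [i * step for i in range(n + 1)]  # sorted by construction
    lo, hi = 0, n
    while lo < hi:
        mid = (lo + hi) // 2
        if math.exp(candidates[mid]) < x:  # exp is monotone, so this is valid
            lo = mid + 1
        else:
            hi = mid
    return candidates[lo]
```

The answer is accurate to within one grid step, (x − 1)/n, which is the “generate an array, then find the right value in it” shape the assignment seems to ask for.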

Improve SEO by getting web crawlers to read a CSV file when searching for keywords

I am trying to improve the SEO of my website, and I recently used an online SEO tester on my first custom-coded website.

I am trying to increase the number of unique keywords and the amount of textual content crawled, and I’m hoping to use the .csv file I created for the plotly.js sunburst. I followed this example.

Right now I think the best way to allow access to the .csv would be via the robots.txt file, but I have not been able to confirm that this approach will help. I’m new to the web development world, so I apologize if the question is basic. Any help is appreciated.
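For reference, robots.txt only grants or denies crawling; it does not by itself guarantee that a CSV’s contents will be indexed as page keywords. A minimal file that explicitly permits crawling of everything, including a data file, would look like this (the domain and sitemap path are hypothetical):

```
# robots.txt — allow all crawlers to fetch everything, including /data/*.csv
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```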

Database Security – Encryption & Searching

I am completely new to cryptography but have been trying to familiarize myself with the concepts and applications. I have a project where I believe cryptography would be beneficial.

Project Info:

DB = MySQL 5.6+

Engine = InnoDB

My application will reside on an intranet web server behind a network firewall with a very small white-list. Few users of this application will have the ability to add/remove values from the database. A larger number of users would be able to read these values. Values I would hope to encrypt could include:

  • emails
  • account numbers
  • paths
  • dates
  • unique ids

The largest table(s) would have up to 150k entries, and total sessions are likely to remain under 100.

Being an intranet site I assume (with limited security knowledge) that my primary threats will be malicious users, hardware theft, and persistent XSS from an internal or external source. I am doing my best to mitigate all of these.

Doing some research on how to encrypt my data while keeping it searchable leaves me with a few options (please correct any wrong information):

  • CipherSweet Blind Indexing: requires library, may be overkill, false positives possible
  • MySQL AES_ENCRYPT/DECRYPT: if logs are compromised plaintext values will also be compromised
  • Application Side: runtime “nightmare”, heavy load, could cause issues with multiple threads running
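On the blind-indexing option: the core idea (as used by CipherSweet) is to store a truncated keyed hash of the normalized plaintext alongside the ciphertext, so equality searches run against the index column without decrypting anything. A minimal sketch, where the key, normalization, and truncation length are illustrative assumptions:

```python
import hashlib
import hmac

# Assumption: a dedicated index key, distinct from the encryption key,
# loaded from a secrets store in real code.
INDEX_KEY = b"0123456789abcdef0123456789abcdef"

def blind_index(value: str) -> str:
    """Deterministic keyed hash of a normalized value.

    Equal plaintexts produce equal indexes, so a query like
    `SELECT ... WHERE email_idx = ?` works on encrypted rows.
    Truncation (here to 64 bits) is exactly what makes rare false
    positives possible: you decrypt the few candidate rows returned
    and discard any that don't actually match.
    """
    digest = hmac.new(INDEX_KEY, value.strip().lower().encode(), hashlib.sha256)
    return digest.hexdigest()[:16]
```

At 64 bits of index, false-positive collisions among 150k rows are vanishingly rare, which speaks to question 2 below.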


  1. While ugly and poor practice, would application-side encryption/decryption be acceptable for my environment?
  2. Would the likelihood of false positives with CipherSweet be negligible for my datasets?
  3. Given my environment, would letting MySQL handle the encryption/decryption be acceptable (neglecting hardware theft or server compromise)?
  4. Bonus – should I be worrying about external XSS given my environment?

I understand this question may fall into the category of discussion; if that is the case, please direct me to where I can find further information to narrow my questions.

How can haveIbeenPwned be so fast at searching?

That’s a question I have tried to answer many times without success. The service can search billions of compromised accounts at a very impressive speed (just a few milliseconds). By contrast, searching for a compromised account in a single data breach with tools like grep takes minutes if the breach is large (just think of searching in a breach like Collection #1). I don’t think they could use ordinary databases because of the amount of data. So how can the search performed by this service be so fast? What technology is used behind the scenes to reach this result, and how can they host this amount of data on simple servers?
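Whatever Have I Been Pwned actually runs server-side, the speed gap in the question has a general explanation: grep scans the whole file linearly (O(n)), while pre-sorted or hashed data answers membership queries in O(log n) or O(1). A toy sketch of the sorted-hash approach, with an obviously illustrative four-entry “breach corpus”:

```python
import bisect
import hashlib

# Illustrative breach corpus: in reality this would be billions of hashes,
# sorted once at build time so every later lookup is a binary search.
breached = sorted(
    hashlib.sha1(pw.encode()).hexdigest().upper()
    for pw in ["password", "123456", "letmein", "qwerty"]
)

def is_pwned(password: str) -> bool:
    """Membership test in O(log n) via binary search over sorted hashes."""
    h = hashlib.sha1(password.encode()).hexdigest().upper()
    i = bisect.bisect_left(breached, h)
    return i < len(breached) and breached[i] == h
```

Even over billions of entries, a binary search touches only a few dozen records, which is why indexed lookups finish in milliseconds while a linear grep over the same data takes minutes.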