## Hashing sensitive data and checking for duplicates

I have some sensitive client data that needs to be hashed, but I also need to check that the data isn’t duplicated by another client. So the hash function needs to produce the same value for the same data, so that I can search the database for duplicates.

One option is bcrypt with a constant salt, but that isn’t very secure.

Any ideas?
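One commonly suggested direction (my sketch, not from the post) is a keyed HMAC rather than a constant-salt bcrypt: it is deterministic, so equal inputs produce equal tags that can be indexed and searched, but an attacker who steals the database cannot brute-force guesses offline without also stealing the secret key. A minimal sketch using only the JDK (`javax.crypto.Mac`); the class name and key handling are illustrative, and in practice the key would come from a secrets manager:

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.HexFormat;

public class DeterministicHash {
    private final SecretKeySpec key;

    public DeterministicHash(byte[] secretKey) {
        this.key = new SecretKeySpec(secretKey, "HmacSHA256");
    }

    // Same input + same key => same tag, so the result is searchable
    // for duplicates; without the key, guesses cannot be tested offline.
    public String tag(String sensitiveValue) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(key);
        byte[] out = mac.doFinal(sensitiveValue.getBytes(StandardCharsets.UTF_8));
        return HexFormat.of().formatHex(out);
    }
}
```

Storing the HMAC tag in an indexed column then makes the duplicate check a plain equality query.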

## Checking URLs for specific keywords

Hi there,

I have scraped a bunch of URLs, and now I want to check those URLs to see whether they contain a certain line of text. How do I do that? Please let me know.

Thanks,
Syed
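A minimal sketch of one way to do this (class and method names are mine, and the keyword is whatever you are looking for): fetch each URL with the JDK's built-in HTTP client and search the response body, keeping the string check separate from the network call so it can be tested on its own:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class KeywordChecker {
    // Pure check: does the page body contain the keyword (case-insensitive)?
    static boolean containsKeyword(String body, String keyword) {
        return body.toLowerCase().contains(keyword.toLowerCase());
    }

    // Fetches a URL and applies the check to the response body.
    static boolean pageContains(String url, String keyword) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(URI.create(url)).GET().build();
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        return containsKeyword(response.body(), keyword);
    }
}
```

Looping over the scraped URLs and collecting those for which `pageContains(url, keyword)` returns true gives the matching set.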

## Proof by contradiction – Only checking a right-neighbor in a sequence of pairwise distinct integers is sufficient to identify the first local maximum

I’m trying to figure out whether my proof is valid. I think it makes intuitive sense, but I’m worried I’m missing something. Any help would be much appreciated!

# Question

A peak element is an element that is greater than its neighbors. Given an input array `nums`, where `nums[i] ≠ nums[i+1]`, find a peak element and return its index. The array may contain multiple peaks; in that case, returning the index of any one of the peaks is fine. You may imagine that `nums[-1] = nums[n] = -∞`.

Example 1: Input: `nums = [1,2,3,1]`, Output: `2`. Explanation: 3 is a peak element, and your function should return index 2.

Example 2: Input: `nums = [1,2,1,3,5,6,4]`, Output: `1` or `5`. Explanation: Your function can return either index 1, where the peak element is 2, or index 5, where the peak element is 6.

# Solution

```java
public int findPeakElement(int[] nums) {
    for (int i = 0; i < nums.length - 1; i++) {
        if (nums[i] > nums[i + 1]) {
            return i;
        }
    }
    return nums.length - 1;
}
```
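To sanity-check the solution against the two examples from the question (wrapped in a class with a `static` modifier so it runs standalone):

```java
public class FindPeakDemo {
    static int findPeakElement(int[] nums) {
        for (int i = 0; i < nums.length - 1; i++) {
            if (nums[i] > nums[i + 1]) {
                return i;
            }
        }
        return nums.length - 1;
    }

    public static void main(String[] args) {
        System.out.println(findPeakElement(new int[]{1, 2, 3, 1}));          // prints 2
        System.out.println(findPeakElement(new int[]{1, 2, 1, 3, 5, 6, 4})); // prints 1
    }
}
```

On the second input it returns 1, the first of the two valid answers, since the loop stops at the first index whose right neighbor is smaller.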

## Why don’t you have to check the left neighbor at each element?

Assume towards a contradiction that we are iterating through nums, have yet to discover a peak, and come across an element at index i whose right neighbor at index i+1 is strictly smaller. If the element at index i were not a peak, then the element at index i-1 would have to be strictly larger. Then we have that

`nums[i-1] > nums[i] > nums[i+1]`

This implies that nums[i+1] is the last element of a strictly decreasing run of elements (among those we’ve seen), and the start of that run must be a peak: either the run starts at index 0 (where nums[-1] = -∞), or it starts at some index k with 0 < k < i whose left neighbor is strictly smaller. Either way, we would already have discovered a peak before reaching index i, contradicting our assumption. Therefore the first element whose right neighbor is strictly smaller is a peak.

## Checking disjointness between subsets of a poset

Suppose we have a poset $$(P, \le)$$, two sets $$X \subseteq P$$ and $$Y \subseteq P$$, and a way $$f : P^2 \to 2$$ to efficiently compute, for any $$(x, y) \in P^2$$, whether there exists a $$z \in P$$ such that $$(x \le z) \wedge (y \le z)$$. We want to return $$\mathbf{T}$$ if there exists a pair $$(x, y) \in X \times Y$$ such that $$f(x, y) = 1$$, and $$\mathbf{F}$$ otherwise, using the fewest possible calls to $$f$$.
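As a baseline (my sketch, not part of the question): with no structure exploited on X and Y, a short-circuiting scan makes at most |X|·|Y| calls to f, and the interesting part of the question is beating that bound using the order on P. The generic scan, with f passed in as an oracle:

```java
import java.util.List;
import java.util.function.BiPredicate;

public class PairwiseCheck {
    // Returns true iff some pair (x, y) in X x Y has a common upper bound,
    // as reported by the oracle f. Worst case |X| * |Y| oracle calls.
    static <P> boolean anyCompatiblePair(List<P> xs, List<P> ys, BiPredicate<P, P> f) {
        for (P x : xs) {
            for (P y : ys) {
                if (f.test(x, y)) {
                    return true;  // short-circuit on the first witness
                }
            }
        }
        return false;
    }
}
```

One natural optimization direction is to observe that if $$x \le x'$$ and $$f(x', y) = 1$$, then $$f(x, y) = 1$$ as well, so it suffices to test the maximal elements of X against the maximal elements of Y.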

## Password checking resistant to GPU attacks and leaked password files without introducing a DoS attack on the server?

Originally, passwords were stored in plaintext. Later, passwords were hashed once and the hashed value stored. If the attacker had a leaked password file, he could try hashing guesses and, if a hash value matched, use that guess to log in.

Then passwords were salted and hashed thousands of times on the server, and the salt and the resulting hash value were stored. If the attacker had a leaked password file, he could use specialized ASICs to hash guesses and, if a guess matched, use that password to log in.

Can we do better than that?

Can we make an attacker’s password guessing so hard that even with the hashed password file he gains no major advantage over testing passwords against the server directly, even with specialized ASICs?

## Why did browsers choose to implement HSTS with Preload over checking custom DNS information?

Browsers and standards bodies favor HSTS with Preload because it avoids ever sending an http request to a website that supports https. This is good, because cleartext http requests can be intercepted to set up man-in-the-middle attacks.

But a number of websites explain that a centralized Preload list doesn’t scale up well to the mostly https web that has been proposed by W3C, the EFF, and others. Managing one centralized list creates a bottleneck for looking up, adding, and deleting list items.

Yet this technology has been implemented rather than, say, using DNS, which is already nicely distributed and is already used by browsers to look up domain names.

Of course, DNS is not yet secure, and proposals to make it secure are controversial. But why would DNS have to be secure to hold one more bit of information (whether the domain supports https, and only https, or not)?

In the worst case, a malicious man-in-the-middle attacker could make it seem that a website is insecure when it is actually secure. But in that case, the insecure connection would simply fail, and this failure would deny the attacker any advantage.

So naturally I’m wondering why a centralized HSTS with Preload is preferred over adding a new flag to DNS zones for indicating that the domain supports https connections.

## Is checking if the length of a C program that can generate a string is less than a given number decidable?

I was given this question:

Komplexity(S) is the length of the smallest C program that generates the string S as an output. Is the question “Komplexity(S) < K” decidable?

With respect to decidability, I only know about the Halting Problem, and I just learned about Rice’s Theorem while searching online (though I don’t think it can be applied here?). I couldn’t reduce the problem to any undecidable problem I know about. Thanks in advance for any help.

## My website showing 403 error on HTTP status check [on hold]

My website is running fine, but when I check it on https://httpstatus.io/ it shows a 403 error, and Google PageSpeed Insights also fails to show a report.

My website’s DNS currently points to Cloudflare. I previously installed the Cloudflare plugin in my WordPress site, and I have now uninstalled it.

If I remove my website from Cloudflare, will the problem be solved or not? Or will it affect my website?

Any help is much appreciated. Thanks In Advance.

## Checking to see if my relation is in 3NF based on the functional dependencies

I have a relation, called Score (which stores scores of football games), and it has the following functional dependencies for its various attributes G, H, T, S, W, D, and O (representing GameID, HomeOrAway, TeamID, Season, Week, Date, and Outcome):

GH → TSWDO

SD → W

TSW → GH

D → SW

I would like to know whether Score is in 3NF and, if not, find a decomposition that achieves 3NF. Can you guys help me figure this out? I know that for each functional dependency I need to check that the left side is a superkey for Score (or that every attribute on the right side is part of some candidate key), but I’m not sure how to really go about doing that. Any help would be greatly appreciated!
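The standard tool for that check (a general algorithm, not specific to this relation — class and method names here are mine) is the attribute-closure computation: X⁺ is the set of all attributes determined by X under the given functional dependencies, and X is a superkey exactly when X⁺ contains every attribute of the relation. A sketch:

```java
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class Closure {
    record FD(Set<Character> lhs, Set<Character> rhs) {}

    // Computes the closure of `attrs` under the functional dependencies `fds`:
    // repeatedly fire any FD whose left side is already covered, until fixpoint.
    static Set<Character> closure(Set<Character> attrs, List<FD> fds) {
        Set<Character> result = new HashSet<>(attrs);
        boolean changed = true;
        while (changed) {
            changed = false;
            for (FD fd : fds) {
                if (result.containsAll(fd.lhs()) && !result.containsAll(fd.rhs())) {
                    result.addAll(fd.rhs());
                    changed = true;
                }
            }
        }
        return result;
    }
}
```

Running this for each left side in the post (GH, SD, TSW, D) shows which of them are superkeys; if the closure contains all of Score’s attributes, that left side is a superkey, and any FD whose left side fails the test (and whose right side is not prime) is the one that violates 3NF.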