Feeding entropy pool with my own data

Suppose I have a large amount of truly unpredictable random data in the file "random.bin". The random data was created outside of my system and has been securely transferred onto my system.

Questions:

  • How can I feed this file into the OS entropy pool so that a subsequent read from /dev/random by my fancy software will not block?
  • Would the command ‘cat random.bin > /dev/random’ do it? (A sketch of one approach follows the notes below.)

Notes:

  1. The question is not about the use of /dev/urandom.
  2. The answer must not make use of fancy tools like ‘havege’.
  3. Perfect answer would be Unix flavour independent (so working on AIX, Solaris, RHEL, BSD, …).
  4. It is similar to, but different from, the question “Feeding /dev/random entropy pool?”.
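
For reference: on Linux at least, simply writing to /dev/random mixes the data into the pool but does not credit the entropy counter, so it does not by itself unblock readers; crediting requires the privileged RNDADDENTROPY ioctl. Below is a minimal, Linux-only Python sketch of that ioctl. It does not satisfy note 3 (it is not portable), and the ioctl number and chunk size are assumptions to verify against your own kernel headers:

    import fcntl
    import struct

    # Linux-specific: _IOW('R', 0x03, int[2]) == 0x40085203; check
    # <linux/random.h> on your system. Requires root.
    RNDADDENTROPY = 0x40085203

    def credit_entropy(data, device="/dev/random"):
        # struct rand_pool_info { int entropy_count; int buf_size; __u32 buf[]; }
        bits = len(data) * 8  # claim 8 bits/byte; lower this if the source is weaker
        payload = struct.pack("ii%ds" % len(data), bits, len(data), data)
        with open(device, "wb") as dev:
            fcntl.ioctl(dev, RNDADDENTROPY, payload)

    with open("random.bin", "rb") as f:
        chunk = f.read(512)   # feed in modest chunks
        if chunk:
            credit_entropy(chunk)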

Theoretical question with regard to a weak password-based KDF & high-entropy input

I know this question is theoretical; however, I would like some thoughts from a security perspective.

Take this thought:

  1. I generate a high-entropy, cryptographically secure string of bits (256 bits of entropy).
  2. I use this entropy as a password fed into a weak PBKDF, like PBKDF1.
  3. I use the output as a key. Should I expect this key to provide 256 bits of entropy?

Extra question: if I instead use a strong KDF, like PBKDF2, would I then be able to expect no drop in entropy?

PBKDF2 by definition should stretch entropy, right? Not decrease it.
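
For concreteness, here is a minimal sketch of the strong-KDF case using Python’s hashlib (my own illustration; the salt, iteration count, and output length are arbitrary choices, not recommendations). The point it illustrates: a KDF cannot add entropy, it can at best preserve what the 256-bit input already has, capped by its output size:

    import hashlib
    import secrets

    # A 256-bit, cryptographically secure secret used as the "password".
    high_entropy_secret = secrets.token_bytes(32)

    # PBKDF2-HMAC-SHA256; parameters are illustrative only.
    salt = secrets.token_bytes(16)
    key = hashlib.pbkdf2_hmac("sha256", high_entropy_secret, salt,
                              100_000, dklen=32)
    print(key.hex())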

Thanks

Novice question: Limiting number of combo attempts with Fail2ban and 128 bits of entropy

Apps such as Fail2ban and DenyHosts let Unix administrators limit username/password combo attempts, typically to 3. But why 3? Some admins allow more, like 6 or 8, giving honest users a little more slack when making different attempts at a password they may not recall exactly. But why not 18? Or even 30?

If a sophisticated cracker wanted to brute-force a combo with a scheme involving 128 bits of entropy, s/he would need to make trillions of attempts a second. So if an admin limited the total number of attempts to 100 using Fail2ban, wouldn’t the authentication system still be secure and robust, as long as the admin sets up their username/password scheme to require 128 bits of entropy?
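
As a rough sanity check on the “trillions of attempts a second” figure (a back-of-the-envelope calculation, not part of the original question): even at $10^{12}$ guesses per second, exhausting a 128-bit keyspace takes on the order of
$$\frac{2^{128}}{10^{12}\ \text{guesses/s}} \approx \frac{3.4\times 10^{38}}{10^{12}} = 3.4\times 10^{26}\ \text{s} \approx 10^{19}\ \text{years},$$
so an online limit of 100 attempts barely scratches the keyspace.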

Data compression – Entropy

Let’s say I have an alphabet $$\Sigma = \{A, B, C, D, E\}$$

with probabilities $$P(A) = P(B) = P(C) = 0.25 \quad\text{and}\quad P(D) = P(E) = 0.125.$$

I know that the entropy then is $$H(\Sigma) = 3 \cdot 0.25 \cdot \log_2 4 + 2 \cdot 0.125 \cdot \log_2 8 = 2.25.$$

My question now is: What does this mean in relation to the lower limit of compression? How many bits will I need, at a minimum, to compress a text drawn from the above alphabet?
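
A small sketch (my own illustration, not from the question) that recomputes the entropy and shows that, because every probability is a power of 1/2, a prefix code with lengths 2, 2, 2, 3, 3 meets the 2.25 bits/symbol bound exactly:

    import math

    probs = {"A": 0.25, "B": 0.25, "C": 0.25, "D": 0.125, "E": 0.125}

    # Shannon entropy: the lower bound on the average number of bits per
    # symbol for any uniquely decodable code over this alphabet.
    H = -sum(p * math.log2(p) for p in probs.values())
    print(H)  # 2.25

    # Example prefix (Huffman) code lengths achieving the bound exactly,
    # e.g. codewords 00, 01, 10, 110, 111:
    lengths = {"A": 2, "B": 2, "C": 2, "D": 3, "E": 3}
    avg_len = sum(probs[s] * lengths[s] for s in probs)
    print(avg_len)  # 2.25

In other words, on average you cannot do better than about 2.25 bits per symbol, i.e. roughly 2.25·n bits for a text of n symbols.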

How to properly calculate password entropy

I am very confused about calculating password entropy. I know that the formula is E = log2(R^L), where E is the password entropy, R is the range of available characters, and L is the password length. But what if I don’t have the password length? Imagine a company that has 5 million users and decides to use the English alphabet (26 characters) to create a random password for each user (the password length is not decided yet). The password will be hashed using SHA-256: hash = SHA-256(nickname + passwd).

  • How do I calculate or plot password entropy in terms of the password length in this case? (A small sketch follows below.)
  • How long would it take to crack even one password using a GPU that can compute 370200 hashes per second?
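
A minimal sketch of both calculations (my own illustration; the length range is arbitrary, and the time figure is the worst case of exhausting the whole keyspace at the quoted rate):

    import math

    R = 26            # lowercase English alphabet
    rate = 370200     # hashes per second, figure from the question

    for L in range(6, 13):
        entropy_bits = L * math.log2(R)    # E = log2(R^L) = L * log2(R)
        keyspace = R ** L
        worst_case_years = keyspace / rate / (3600 * 24 * 365)
        print(f"L={L:2d}  E={entropy_bits:5.1f} bits  "
              f"exhaustive search <= {worst_case_years:.3g} years")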

using HW random number generator as source of entropy

Currently I am using haveged on my server as a source of entropy.

My server is used as a KVM hypervisor, to run virtual machines.

I did not use haveged at the beginning, and I noticed the VMs were draining the entropy pool on the server. Sometimes, when VMs were started, SSH waited for enough entropy (to generate session keys, I guess).

Now with haveged, I don’t have this problem anymore.

But I would like to try a HW random number generator. I am not saying haveged is bad, but a true HW random number generator can only improve the entropy. I have seen some HW RNGs which work on the basis of a Geiger counter, some which collect noise from a microphone, and so on.

Which are the most reasonable to use? Could somebody perhaps recommend a specific one?

Ideally, I would like it to be connected over a serial port. Second best would be over USB.

information theory, find entropy given Markov chain

There is an information source on the source alphabet $A = \{a, b, c\}$, represented by the state transition diagram below:

[State transition diagram of the Markov chain]

a) The random variable representing the $i$-th output from this information source is denoted $X_i$. It is known that the source is now in state $s_1$. In this state, let $H(X_i \mid s_1)$ denote the entropy when observing the next symbol $X_i$. Find the value of $H(X_i \mid s_1)$ and the entropy of this information source, and calculate $H(X_i \mid X_{i-1})$ and $H(X_i)$ respectively. Assume $i$ is quite large.

How can I find $H(X_i \mid s_1)$? I know that $$H(X_i \mid s_1) = -\sum_{i,s_1} p\left(x_i, s_1\right)\cdot\log_b\!\left(p\left(x_i \mid s_1\right)\right) = -\sum_{i,j} p\left(x_i, s_1\right)\cdot\log_b\!\left(\frac{p\left(x_i, s_1\right)}{p\left(s_1\right)}\right)$$ but I don’t know $p(s_1)$.

$$A=\begin{pmatrix}0.25 & 0.75 & 0\\ 0.5 & 0 & 0.5 \\ 0 & 0.7 & 0.3 \end{pmatrix}.$$

From the matrix I know that $p(s_1 \mid s_1) = 0.25$, etc.

But what is the probability of $s_1$? And how can I calculate $H(X_i \mid X_{i-1})$?
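
Not part of the question, but a small numerical sketch of the standard recipe (assuming, as one common convention for such diagrams, that the emitted symbol corresponds to the state entered): find the stationary distribution $\pi$ with $\pi A = \pi$, which plays the role of $p(s_j)$ once $i$ is large, then average the row entropies:

    import numpy as np

    # Transition matrix from the question: row j is the distribution of the
    # next state given the current state s_j.
    A = np.array([[0.25, 0.75, 0.0],
                  [0.5,  0.0,  0.5],
                  [0.0,  0.7,  0.3]])

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    # Entropy of the next symbol given that we are in state s_1:
    print("H(X_i | s_1) =", entropy(A[0]))

    # Stationary distribution pi solves pi = pi A with sum(pi) = 1;
    # it gives p(s_j) for large i.
    eigvals, eigvecs = np.linalg.eig(A.T)
    pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1))])
    pi = pi / pi.sum()
    print("pi =", pi)

    # Entropy rate of the source: H(X_i | X_{i-1}) = sum_j pi_j * H(row j)
    print("H(X_i | X_{i-1}) =", sum(pi[j] * entropy(A[j]) for j in range(3)))

    # Marginal entropy of one symbol for large i: H(X_i) = H(pi)
    print("H(X_i) =", entropy(pi))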

Calculating spatial “orderedness” at each position in a binary image: entropy? lacunarity?

Let’s say I have an image where all pixel values are either 0 or 1. What I’d like to do is generate a new image with the same dimensions where each pixel represents how “ordered” the area around the corresponding pixel in the original image is. In particular, I’m looking for “spatial” order: whether or not there is some regularity or pattern in that local area. This could then be used to segment an image into regions of relative order and regions of relative disorder.

For example, [image “ordered1”] and [image “ordered2”] are both highly ordered. On the other hand, [image “disordered”] probably has varying levels of order within it but is overall disordered. Finally, an image like [image “mixed”] has areas of order (bottom left and, to some extent, top right) and disorder (the rest of the image).

I’ve considered taking some general measure of entropy (like Shannon’s image entropy) and applying it with a moving window across the image, but my understanding is that most measures of entropy do not capture much about the spatial aspects of the image. I’ve also come across the concept of “lacunarity”, which looks promising (it has been used to segment, e.g., anthropogenic structures from natural landscapes on the basis of homogeneity), but I’m having a hard time understanding how it works and thus whether it’s truly appropriate. Could either of these concepts be made to work for what I’m asking, or is there something else I haven’t considered?
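
For reference, here is a minimal sketch of the moving-window entropy idea mentioned above (my own illustration; the window size is an arbitrary choice). Note that it only measures the local mix of 0s and 1s, not their arrangement, which is exactly the limitation described:

    import numpy as np

    def local_entropy(img, window=9):
        # Moving-window Shannon entropy of a binary (0/1) image. A checkerboard
        # and random noise score the same, since arrangement is ignored.
        h, w = img.shape
        half = window // 2
        padded = np.pad(img, half, mode="edge")
        out = np.zeros((h, w))
        for y in range(h):
            for x in range(w):
                patch = padded[y:y + window, x:x + window]
                p = patch.mean()                 # fraction of 1s in the window
                if 0 < p < 1:
                    out[y, x] = -p * np.log2(p) - (1 - p) * np.log2(1 - p)
        return out

    # Example usage on a random binary image:
    rng = np.random.default_rng(0)
    img = (rng.random((64, 64)) > 0.5).astype(int)
    print(local_entropy(img).mean())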

Is sha256 a good function to derive keys from a secret of sufficient length and entropy?

Assuming I have a secret key of sufficient length and entropy (I get to decide the length and have a good random source).

I would like to generate 256-bit keys by hashing the root key with the name of each key, e.g.:

    key1 = sha256(rootKey + "key1")
    key2 = sha256(rootKey + "key2")
    ...
    keyN = sha256(rootKey + "keyN")

Is the sha256 hash a good choice?

If yes, what length should the root secret be? I’m thinking 256 bits is pretty good, but it wouldn’t cost much to make it bigger…
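
A runnable Python sketch of the scheme described above (my own illustration; the HMAC variant is a commonly suggested alternative with the same shape, not something from the question, and the root key here is freshly generated just for the example):

    import hashlib
    import hmac
    import secrets

    # Stand-in for the root secret; in practice this would be the stored key.
    root_key = secrets.token_bytes(32)   # 256 bits

    # The scheme from the question: hash the concatenation of root key and name.
    def derive_concat(root, name):
        return hashlib.sha256(root + name.encode()).digest()

    # An alternative with the same shape: HMAC-SHA256 keyed with the root
    # secret, using the key name as the message.
    def derive_hmac(root, name):
        return hmac.new(root, name.encode(), hashlib.sha256).digest()

    key1 = derive_concat(root_key, "key1")
    key2 = derive_hmac(root_key, "key2")
    print(key1.hex(), key2.hex())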