Let’s start off with one of my favorite quotes from John von Neumann: “Any one who considers arithmetical methods of producing random digits is, of course, in a state of sin. For, as has been pointed out several times, there is no such thing as a random number - there are only methods to produce random numbers, and a strict arithmetic procedure of course is not such a method.”

What von Neumann is getting at is that the “pseudo” in pseudorandom number generator (PRNG) is really a synonym for “not at all”.
Granted, if you come in the middle of a good PRNG sequence, guessing the next number is nearly impossible. But if you know, or can guess, the seed that started the PRNG off, you know all past and future values nearly instantly - it’s a purely deterministic mathematical function.

This shouldn’t be taken as a rant against PRNGs, but merely as a reminder that when you use one, the un-guessability of the numbers that it spits out is only as un-guessable as the seed. And while “un-guessability” isn’t a well-defined mathematical concept, or even a real word, entropy is. Almost anything that your computer wants to keep secret will require the generation of a secret random number at some point, and any series of “random” numbers that a computer generates will have only as much entropy, and thus un-guessability, as the seed used.
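To see just how much the seed determines, here’s a minimal sketch using Python’s built-in random module (a Mersenne Twister, not a cryptographic generator); the seed value 1234 is an arbitrary choice for illustration:

```python
import random

# Seed the generator twice with the same value and draw three numbers each time.
random.seed(1234)
first_run = [random.random() for _ in range(3)]

random.seed(1234)
second_run = [random.random() for _ in range(3)]

# Identical seeds give identical "random" streams - the output carries
# no more surprise than the seed that produced it.
print(first_run == second_run)  # True
```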
So how does a computer, a deterministic machine, harvest entropy for that seed in the first place? And how can you make sure you’ve got enough? And did you know that your Raspberry Pi can be turned into a heavy-duty source of entropy? Read on!

# Everything You Needed to Know About Entropy…
But first, to hammer von Neumann’s point home, here’s an algorithm for a PRNG that will pass any statistical test you throw at it: start counting up integers from one, hash that counter value, increase the counter by one, and append the new counter value to the previous hash and hash it again. The result will be a hash chain, a “random” looking series of numbers that is entirely predictable - that’s the “pseudo” in PRNG. If I knew you were doing this, and that you’d only used up 100 or so numbers so far, it wouldn’t take me long to do the same thing and figure out where you were in the series.
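In code, that hash chain might look something like the sketch below (SHA-256 is my choice here - the scheme works with any decent hash function):

```python
import hashlib

def hash_chain(n_values, start=1):
    """Yield n_values outputs of the counter-based hash chain described above."""
    counter = start
    digest = hashlib.sha256(str(counter).encode()).hexdigest()  # hash the first counter value
    for _ in range(n_values):
        yield digest
        counter += 1
        # Append the new counter value to the previous hash and hash it again.
        digest = hashlib.sha256((digest + str(counter)).encode()).hexdigest()

# Looks random and passes statistical tests, but anyone who knows (or guesses)
# where the counter started can regenerate the entire series.
for value in hash_chain(3):
    print(value)
```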
Claude Shannon’s concept of entropy is essentially this: count up the minimum number of yes/no questions it would take to figure out whatever your secret is. I’d have a much harder time cracking your series if I only knew that you were using the same algorithm but started somewhere between 1 and 1,000,000 instead of at 1.

Let’s say you had started your counter in the hash chain at 1, and had used no more than 128 values so far. A binary search - “Is the counter greater than 64?”, and so on - would pin down your exact position in at most seven yes/no questions, so your secret is worth seven bits of entropy. That number is the same as the base-2 logarithm of the number of possible secrets that could have been drawn, or equivalently the number of digits it would take to write out the number of possible secrets in binary.
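That count is easy to compute: the entropy of a secret drawn uniformly from N equally likely possibilities is log2(N) bits. A quick sketch (the helper name entropy_bits is just for illustration):

```python
import math

def entropy_bits(num_possible_secrets):
    # Bits of entropy in a secret drawn uniformly at random from
    # num_possible_secrets equally likely values.
    return math.log2(num_possible_secrets)

print(entropy_bits(128))        # 7.0   - counter started at 1, no more than 128 values used
print(entropy_bits(1_000_000))  # ~19.93 - starting point anywhere from 1 to 1,000,000
```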