
Algorithm to print out a shuffled list, in-place and with O(1) memory

After reading this question I started to wonder: is it possible to have a shuffling algorithm which does not modify or copy the original list?

To make it clear:

Imagine you are given a list of objects. The list size can be arbitrary, but assume it's pretty large (say, 10,000,000 items). You need to print out the items of the list in random order, and you need to do it as fast as possible. However, you should not:

  • Copy the original list, because it's very large and copying would waste a LOT of memory (probably hitting the limits of available RAM);
  • Modify the original list, because it's sorted in some way and some other part later on depends on it being sorted.
  • Create an index list, because, again, the list is very large and copying it takes far too much time and memory. (Clarification: this means any other list with the same number of elements as the original list.)

Is this possible?

Added: More clarifications.

  1. I want the list to be shuffled in a truly random way, with all permutations equally likely (of course, assuming we have a proper Rand() function to start with).
  2. Suggestions that I make a list of pointers, or a list of indices, or any other list with the same number of elements as the original, are explicitly ruled out as inefficient by the original question. You can create additional lists if you want, but they should be several orders of magnitude smaller than the original list.
  3. The original list is like an array, and you can retrieve any item from it by its index in O(1). (So no doubly-linked list stuff, where you have to iterate through the list to get to your desired item.)

Added 2: OK, let's put it this way: You have a 1TB HDD filled with data items, each 512 bytes large (a single sector). You want to copy all this data to another 1TB HDD while shuffling all the items. You want to do this as fast as possible (single pass over data, etc). You have 512MB of RAM available, and don't count on swap. (This is a theoretical scenario; I don't have anything like this in practice. I just want to find the perfect algorithm.)

Vilx- asked Dec 08 '09


3 Answers

Well it depends a bit on what kind of randomness you expect from the shuffling, i.e. should all shufflings be equally probable, or can the distribution be skewed.

There are mathematical ways to produce "random-looking" permutations of N integers, so if P is such a permutation from 0..N-1 to 0..N-1, you can just iterate x from 0 to N-1 and output list item L(P(x)) instead of L(x), and you have obtained a shuffling. Such permutations can be obtained e.g. using modular arithmetic. For example, if N is prime, P(x) = (x * k) mod N is a permutation for any 0 < k < N (but it maps 0 to 0). Similarly, for a prime N, P(x) = (x^3) mod N is a permutation whenever gcd(3, N-1) = 1 (but it maps 0 to 0 and 1 to 1). This solution can easily be extended to non-prime N by selecting the least prime above N (call it M), permuting up to M, and discarding the permuted indices that land at N or above (similarly below).
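As a concrete sketch of those modular-arithmetic permutations (the specific N and k here are illustrative choices, not from the answer):

```python
# Modular-arithmetic permutations of 0..N-1 for a prime N.
N = 11  # a prime list size, chosen for illustration
k = 7   # any 1 <= k < N works for the linear map

# P(x) = (x * k) mod N is a bijection on 0..N-1 when N is prime
linear = [(x * k) % N for x in range(N)]
assert sorted(linear) == list(range(N))

# P(x) = x^3 mod N is a bijection exactly when gcd(3, N - 1) == 1
# (here gcd(3, 10) == 1, so the cubes hit every residue exactly once)
cubic = [pow(x, 3, N) for x in range(N)]
assert sorted(cubic) == list(range(N))
```

Either list can then serve as the visiting order: print L(linear[x]) for x from 0 to N-1.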

It should be noted that modular exponentiation is the basis for many cryptographic algorithms (e.g. RSA, Diffie-Hellman) and is considered a strongly pseudorandom operation by the experts in the field.

Another easy way (not requiring prime numbers) is first to expand the domain so that instead of N you consider M where M is the least power of two above N. So e.g. if N=12 you set M=16. Then you use bijective bit operations, e.g.

P(x) = ((x ^ 0xF) ^ ((x << 2) + 3)) & 0xF

Then when you output your list, you iterate x from 0 to M-1 and output L(P(x)) only if P(x) is actually < N.
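Spelled out for N = 12, M = 16 (a sketch; the list lookup L[i] is left as a comment):

```python
# Bit-mixing permutation on 0..M-1; indices >= N are skipped on output.
N, M = 12, 16

def P(x):
    # XOR with a constant, then XOR with a shifted-and-offset copy of x,
    # all truncated to 4 bits; this particular combination is bijective on 0..15
    return ((x ^ 0xF) ^ ((x << 2) + 3)) & 0xF

visit_order = [P(x) for x in range(M) if P(x) < N]
assert sorted(visit_order) == list(range(N))  # each index 0..N-1 exactly once
# for i in visit_order: print(L[i])
```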

A "true, unbiased random" solution can be constructed by fixing a cryptographically strong block cipher (e.g. AES) and a random key (k) and then iterating the sequence

AES(k, 0), AES(k, 1), ...

and outputting the corresponding item from the sequence iff AES(k,i) < N. This can be done in constant space (the internal memory required by the cipher) and is indistinguishable from a random permutation (due to the cryptographic properties of the cipher) but is obviously very slow: in the case of AES, the counter may have to run through all 2^128 block values.
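Since AES itself needs a crypto library, here is the same rejection-sampling idea sketched with a small Feistel network as the keyed bijection (a toy stand-in for the block cipher, not cryptographically serious; all names and parameters here are mine):

```python
import hashlib

HALF_BITS = 4                 # toy block of 2*4 = 8 bits, so the domain M = 256
M = 1 << (2 * HALF_BITS)
MASK = (1 << HALF_BITS) - 1

def cipher(i, key, rounds=4):
    """A keyed bijection on 0..M-1 built as a balanced Feistel network."""
    left, right = (i >> HALF_BITS) & MASK, i & MASK
    for r in range(rounds):
        # round function: hash of (key, round number, right half), truncated
        digest = hashlib.sha256(key + bytes([r, right])).digest()
        left, right = right, left ^ (digest[0] & MASK)
    return (left << HALF_BITS) | right

def shuffled_indices(n, key):
    """Yield 0..n-1 in a key-dependent pseudorandom order, O(1) extra memory."""
    for i in range(M):
        j = cipher(i, key)
        if j < n:             # rejection step: skip out-of-range counter values
            yield j

order = list(shuffled_indices(100, b"some key"))
assert sorted(order) == list(range(100))
```

A Feistel network is invertible by construction, so every index appears exactly once; swapping in AES over a 128-bit block gives the construction the answer describes.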

Antti Huima answered Oct 22 '22


You're not allowed to make a copy, modify it, or keep track of which elements you've visited? I'm gonna say it's not possible. Unless I'm misunderstanding your third criterion.

I take it to mean you're not allowed to say, make an array of 10,000,000 corresponding booleans, set to true when you've printed the corresponding element. And you're not allowed to make a list of the 10,000,000 indices, shuffle the list, and print out the elements in that order.
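For concreteness, the second disallowed scheme looks like this (a sketch; L stands for the big list, and the Theta(N) index list is exactly the problem):

```python
import random

N = 10_000  # stand-in for 10,000,000
indices = list(range(N))   # Theta(N) extra memory -- what the question forbids
random.shuffle(indices)    # Fisher-Yates on the index list, O(N) time
# for i in indices: print(L[i])
assert sorted(indices) == list(range(N))  # still a permutation of 0..N-1
```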

Ross answered Oct 22 '22


Here is a very simple proof that no PRNG scheme can work:

The PRNG idea has two phases: first, select a PRNG and its initial state; second, use the PRNG to shuffle the output. Well, there are N! possible permutations, so you need at least N! distinct possible start states entering phase 2. This means that at the start of phase 2 you must have at least log2(N!) bits of state, which isn't allowed.
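A quick back-of-envelope check of that bound against the 1 TB scenario from the question (my arithmetic, not part of the original answer):

```python
import math

# 1 TB of 512-byte items
N = (1 << 40) // 512                       # about 2.1 billion items
bits = math.lgamma(N + 1) / math.log(2)    # log2(N!) via ln(Gamma(N+1))
gib = bits / 8 / 2**30                     # roughly 7.4 GiB of seed state needed
assert 7 < gib < 8                         # far more than the 512 MB of RAM allowed
```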

However this does not rule out schemes where the algorithm receives new random bits from the environment as it goes. There might be, say, a PRNG that reads its initial state lazily and yet is guaranteed not to repeat. Can we prove there isn't?

Suppose we do have a perfect shuffling algorithm. Imagine we start running it, and when it's halfway done, we put the computer to sleep. Now the full state of the program has been saved somewhere. Let S be the set of all possible states the program could be in at this halfway mark.

Since the algorithm is correct and guaranteed to terminate, there is a function f which, given the saved program state plus any long enough string of bits, produces a valid sequence of disk reads and writes completing the shuffle. The computer itself implements this function. But consider it as a mathematical function:

f : (S × bits) → sequence of reads and writes

Then, trivially, there exists a function g which, given only the saved program state, produces the set of disk locations yet to be read and written. (Simply pass some arbitrary string of bits to f, then look at the results.)

g : S → set of locations to read and write

The remaining bit of the proof is to show that the range of g contains at least C(N, N/2) different sets regardless of the choice of algorithm. If that's true, there must be at least that many elements of S, and so the state of the program must contain at least log2 C(N, N/2) bits at the halfway mark (about N bits, since C(N, N/2) ≈ 2^N / sqrt(pi*N/2)), in violation of the requirements.

I'm not sure how to prove that last bit, though, since either the set-of-locations-to-read or the set-of-locations-to-write can be low-entropy, depending on the algorithm. I suspect there's some obvious principle of information theory that can cut the knot. Marking this community wiki in the hopes someone will supply it.

Mohammad Riyaz answered Oct 22 '22