So a few days ago, I got an amusing idea for an interview question, which I realized was totally pointless as an interview question because it has no practical value whatsoever. So instead, I'm going to post it on my blog, as a way to help waste the time of all my CS friends. There is no prize whatsoever for a correct answer, except for the satisfaction of having avoided work for a while and solved an amusing problem.
Here are two really bad ways to sort an array:
Random sort: Repeatedly select a random permutation and apply it to the array. Stop when the array is sorted.
Brute-force sort: Iterate over the set of all permutations of N elements, applying each in turn. If the array is now sorted, stop.
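In Python, the two sorts can be sketched roughly as follows (the random sort is the classic "bogosort"; the function names are mine, and the brute-force version generates permutations of the original elements rather than literally permuting in place, which comes to the same thing):

```python
import itertools
import random

def is_sorted(a):
    """True if the sequence is in non-decreasing order."""
    return all(a[i] <= a[i + 1] for i in range(len(a) - 1))

def random_sort(a):
    """Random sort (bogosort): shuffle until the array happens to be sorted."""
    a = list(a)
    while not is_sorted(a):
        random.shuffle(a)
    return a

def brute_force_sort(a):
    """Brute-force sort: try every permutation in turn, stop at the sorted one."""
    for perm in itertools.permutations(a):
        if is_sorted(perm):
            return list(perm)
```

For example, both `random_sort([3, 1, 2])` and `brute_force_sort([3, 1, 2])` return `[1, 2, 3]`; the interesting question is how long each takes to get there.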
The question is: which of the two is less efficient, and (the trickier part) by how much?
(Clarification: For the latter, "how much" is in terms of average [mean] time to sort. You can also average over a large number of possible inputs.)
A clarification, just so I know what you're getting at: when you say "Repeatedly apply a random permutation to the set", does that mean "arrange the set's elements in an entirely random order, irrespective of its previous state", or are you applying a single, randomly selected permutation repeatedly to subsequent iterations of the set?
That said, if you meant the former, the first part of the answer is that (2) is more efficient, as it will never repeat. If you meant the latter, (2) is the only one guaranteed to give a solution, as a single, re-used random permutation may have a cycle through a subset of the solution space, whereas brute force covers the entire solution space.
Now, clarifying the question: when you say "by how much", there's no real answer, as (1) has no guarantee of a solution in any finite number of iterations (again, assuming its possible permutations cover the entire solution space). That forces us to pick an expected chance of success in order to come up with a meaningful comparison of the two.
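The cycle problem with a single re-used permutation is easy to demonstrate with a toy example (my own, not from the comment): a fixed permutation that swaps the first two positions can bounce an array between two unsorted states forever.

```python
def apply_perm(perm, a):
    """Apply a fixed permutation of positions: result[i] = a[perm[i]]."""
    return [a[i] for i in perm]

perm = [1, 0, 2]       # always swap the first two positions
a = [3, 1, 2]
states = []
for _ in range(6):
    a = apply_perm(perm, a)
    states.append(a)
# states alternates between [1, 3, 2] and [3, 1, 2];
# the sorted order [1, 2, 3] is never reached.
```

The period of such a loop is the least common multiple of the permutation's cycle lengths, so unless the sorted order lies on that orbit, the sort never terminates.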
For the first thing: Yes, I meant the former. The latter wouldn't work. :)
For the latter: Mean runtime over a large, uniformly selected sample of input arguments. (i.e., input arguments are equally likely to be in any permutation of sorted order)
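One way to make that average concrete is a quick Monte Carlo sketch (function and variable names are mine): count the shuffles the random sort needs versus the position at which brute force finds the sorted permutation, averaged over uniformly random inputs.

```python
import itertools
import random

def is_sorted(a):
    return all(a[i] <= a[i + 1] for i in range(len(a) - 1))

def random_sort_steps(a):
    """Number of shuffles before the random sort succeeds."""
    a = list(a)
    steps = 0
    while not is_sorted(a):
        random.shuffle(a)
        steps += 1
    return steps

def brute_force_steps(a):
    """How many permutations brute force tries before hitting the sorted one."""
    for steps, perm in enumerate(itertools.permutations(a), start=1):
        if is_sorted(perm):
            return steps

def mean_steps(step_fn, n, trials):
    """Average step count over uniformly random inputs of size n."""
    total = 0
    for _ in range(trials):
        a = list(range(n))
        random.shuffle(a)
        total += step_fn(a)
    return total / trials
```

For small n (say n = 4, so 4! = 24 permutations), `mean_steps(random_sort_steps, 4, 2000)` comes out near 24 while `mean_steps(brute_force_steps, 4, 2000)` comes out near half that, which is at least suggestive of where the analysis should land.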
April 30 2010, 18:19:35 UTC 5 years ago
However, I should point out that if you were interviewing Teela Brown, her randomly selected sample would already be sorted.