I am reading the book Computational Complexity: A Modern Approach and I am having problems understanding oblivious Turing machines.
An oblivious Turing machine (TM) is such a TM that the movement of its heads is entirely determined by the length of the input. That is, the TM is oblivious to its input. So far so good.
But one of the exercises is to prove the following theorem:
If a language L is decidable in time T(n), then there exists an oblivious TM that decides L in time O(T(n)^2).
It seems the oblivious TM cannot operate on the original input of L but must work on some encoded version of it. That is, the gist of the theorem would be encoding a bitstring as an integer (the length of the oblivious TM's input). But if one wanted to encode the set of possible inputs of L (bitstrings) as integers, one would run into very large numbers fairly quickly, since there are 2^n bitstrings of length n.
Am I understanding the problem correctly? How do you prove the theorem?
I suggest you read this paper. It is a rather interesting and wonderful paper that gives a proof at a lower time bound than requested. (I think you should be able to finagle it into O(n^2), or you could note that O(n log n) is technically O(n^2), but following this proof directly may upset your professor; I imagine he intends for you to approach it differently.)
Edit:
The original link to the paper no longer works. Here is another publicly posted one.
http://www-dev.ccs.neu.edu/home/viola/classes/papers/PippengerF-Oblivious.pdf
"Relations Among Complexity Measures" by Michael J. Fischer and Nicholas Pippenger, 1979.
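To make the requested O(T(n)^2) bound concrete, here is a sketch of the naive construction the exercise likely expects (this is not the Fischer–Pippenger construction, which achieves O(T log T)): pad the tape to T(n) cells and make one full sweep per simulated step, performing the real transition only when the physical head passes the simulated machine's head position. The head trajectory then depends only on n and T(n). All names here (`oblivious_simulate`, `delta`, the toy machine) are illustrative, and a real oblivious machine would sweep back and forth rather than jump back to cell 0; the sketch abstracts that detail.

```python
def oblivious_simulate(delta, q0, q_accept, q_reject, tape, T):
    """Simulate a one-tape TM for up to T steps so that the physical
    head trajectory depends only on len(tape) and T, not the input bits.
    Each simulated step costs one sweep of T cells, hence O(T^2) total."""
    cells = list(tape) + ['_'] * (T - len(tape))  # tape padded to T cells
    state, vhead = q0, 0                          # simulated state and head
    trajectory = []                               # physical head positions
    for _ in range(T):                            # one sweep per simulated step
        fired = False
        for pos in range(T):                      # sweep: input-independent motion
            trajectory.append(pos)
            if (pos == vhead and not fired
                    and state not in (q_accept, q_reject)):
                state, write, move = delta[(state, cells[pos])]
                cells[pos] = write
                vhead = max(0, vhead + (1 if move == 'R' else -1))
                fired = True                      # at most one real step per sweep
    return state == q_accept, trajectory

# Hypothetical toy machine: accepts iff the input contains a '1'.
delta = {('q0', '0'): ('q0', '0', 'R'),
         ('q0', '1'): ('qa', '1', 'R'),
         ('q0', '_'): ('qr', '_', 'R')}

acc, traj = oblivious_simulate(delta, 'q0', 'qa', 'qr', "010", 5)
_, traj2 = oblivious_simulate(delta, 'q0', 'qa', 'qr', "000", 5)
assert traj == traj2  # same-length inputs yield identical head motion
```

Note that no re-encoding of the input into an integer is involved: the simulated machine reads the input verbatim, and only the *movement schedule* is made a function of n.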