In the early 1980s, physicist Richard Feynman observed that no classical computer could simulate quantum mechanical systems without incurring exponential slowdown [WC98]. At the same time, it seems reasonable that a computer which behaves in a manner consistent with quantum mechanics could, in principle, simulate such systems without exponential slowdown. This suggests a possible violation of the Polynomial Church-Turing thesis, which states that ``any reasonable attempt to model mathematically computer algorithms and their time performance is bound to end up with a model of computation and associated time cost that is equivalent to Turing machines within a polynomial'' [EID00][Papadimitriou94], and this possibility piqued much interest in the field.

Meanwhile, the evolution of classical computers has seen the size of transistors and memory elements shrink exponentially. These components cannot continue this trend indefinitely and still behave in a classical manner. At the current rate, sometime around 2020 the number of atoms used to represent a single bit of information will reach one [WC98]. At this scale, the quantum behavior of the memory element must be dealt with.