... if you would class a UTM as not conscious, but some more complex machine as conscious (such as the human brain, if you accept that it is a machine), then consider what happens between these extremes: is there a gradual emergence of consciousness with increasing complexity, or is there a sudden point at which consciousness becomes possible?
Is an organic brain so different in its functionality that it cannot be mapped to a Turing machine? Two arguments against such a mapping that I've seen are that a brain has states that cannot be finitely measured, and that a brain follows patterns that cannot be captured mathematically.
To the first argument: anything measurable in the real world is essentially analog, a matter of resolution, and resolution becomes meaningless beyond some point. For instance, we could state the width of the USA in inches, but the figure would never be accurate, because waves and tides change that width from one second to the next. Or we could state the weight of a chicken egg in micrograms, but almost no egg we encounter would match that weight. A more sophisticated measurement is needed, perhaps a mean with a standard deviation. The expression "a chicken egg weighs 60g +/- 10g" is finite.
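The idea that an analog quantity can be summarized by a finite expression can be sketched in a few lines of code. The sample weights below are illustrative numbers, not real measurement data; the point is only that a mean plus a spread is a finite string, no matter how "analog" the underlying quantity is.

```python
import statistics

# Hypothetical sample of chicken-egg weights in grams (illustrative values).
egg_weights_g = [52.1, 58.4, 61.0, 63.7, 55.9, 66.2, 59.8, 60.5]

def finite_measurement(samples):
    """Summarize an analog quantity as a finite expression:
    a mean plus a spread, rather than an exact (unbounded) value."""
    mean = statistics.mean(samples)
    spread = statistics.stdev(samples)
    return f"{mean:.0f}g +/- {spread:.0f}g"

print(finite_measurement(egg_weights_g))  # e.g. "60g +/- 4g"
```

However fine the underlying resolution, the summary occupies a fixed, finite number of symbols, which is all a Turing machine's tape requires.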
The second argument, that the brain follows patterns that cannot be captured mathematically, appears to be a reference to a famous proof by Gödel.