ECKERSLEY AND SANDBERG
bound on the speed of first-generation brain emulations would be the number of CPUs available
for the task, or the amount of money and electricity available to purchase and power them. The
bound might be high or low, relative to humans, but it would be proportional to the size of the
installed computer base.
Dedicated hardware particularly suited for brain emulation might also provide speedups over
generic CPUs. Typically such specializations provide between one and two orders of magnitude
speedup (Sandberg and Bostrom 2008).
Some applications may require interacting with the physical world at its own speed; there are
tasks (social interaction, physical action) that cannot be accelerated beyond a certain point. Fast
computation also typically comes at an energy cost: if supercomputing remains energy-limited in
the future the speed of emulations will depend on economic trade-offs. Despite these constraints,
it is probable that many emulations will find reasons and means to run much faster than human speed.
2.4 Emulation autonomy would be fragile
Emulation autonomy can be threatened in all the same ways as human autonomy (threats of
pain, social pressure, imprisonment, brainwashing, etc.), but there are new possibilities that
suggest that emulations' autonomy might be more vulnerable.
Suppose an agent Alice (who might be human, or an emulation) possesses a digital copy of
the full neural state of an emulation, whom we will call Aesop. Suppose further that Alice has
access to enough storage and computational resources to make further copies of the emulation
and run some of these copies.
Alice can instantiate Aesop. She can control the virtual reality environment in which Aesop
finds himself: his senses (or his attempts to use communications systems) can only tell him about
the world to the extent that Alice allows this. Furthermore, she could construct fake stories and
details of reality to misdirect him. If necessary, she could slow or freeze the rate at which he is
emulated, in order to determine off-line the most convincing virtual reality response to one of his actions.
It seems that Alice can persuade Aesop to do almost anything. In particular, she can copy his
state and then attempt to persuade him using method A. If he refuses, she can restore the old
state and attempt persuasive method B. There is no bound on the number of persuasive
techniques she might try. The instant that Alice has persuaded Aesop to perform a single task for
her she can pause and make a copy of his mental state before she tells him the details of the task.
Thereafter, she can reinstantiate that state and hand Aesop a different problem to solve. The best
Aesop could do to defend himself against Alice's predations would be to constantly insist on
interacting with the physical Earth in complicated ways, hoping that Alice could not fake such
interactions. But he would be constantly vulnerable to trickery, constantly in danger of
performing tasks that served Alice's ends rather than his own.
Once Alice has done this, Aesop appears to be virtually enslaved to her. Aesop, or at least
this copy of him, no longer possesses autonomy.
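Alice's restore-and-retry procedure can be sketched as a simple loop over persuasion attempts: snapshot the full neural state, try one method on a fresh copy, and discard the copy on failure. The `Emulation` class, its `try_persuasion` method, and the method names below are purely illustrative stand-ins, not a model of any real emulation software.

```python
# Illustrative sketch of the restore-and-retry attack described above.
# All classes and method names here are hypothetical placeholders.

import copy

class Emulation:
    """Toy stand-in for a brain emulation's full neural state."""
    def __init__(self, name):
        self.name = name
        self.agreed = False

    def try_persuasion(self, method):
        # Placeholder: a real response would depend on the neural
        # state; in this toy example only "method_c" succeeds.
        self.agreed = (method == "method_c")
        return self.agreed

def coerce(emulation, methods):
    """Snapshot the state, then try each method on a fresh copy,
    implicitly 'restoring' by re-instantiating from the snapshot."""
    snapshot = copy.deepcopy(emulation)      # digital copy of the state
    for method in methods:
        candidate = copy.deepcopy(snapshot)  # re-instantiate from snapshot
        if candidate.try_persuasion(method):
            return candidate, method         # keep only the compliant copy
    return None, None                        # in principle, retries are unbounded

aesop = Emulation("Aesop")
compliant, winning = coerce(aesop, ["method_a", "method_b", "method_c"])
```

Note that the original `aesop` object is never modified: each attempt runs on a disposable copy, which is the asymmetry that makes the attack so hard for Aesop to resist.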
An emulation that owns or has effective control over the hardware necessary for her own
existence would normally enjoy autonomy. But any time that the physical or software security of
those systems was compromised, the agent would face the risk that someone might make non-
autonomous, enslaveable copies of their mental states. Presuming that the emulation was able to