BLACK HOLE COMPUTER may sound absurd but is proving to be a useful conceptual tool for researchers studying cosmology and fundamental physics. And if physicists are able to create black holes in particle accelerators—as some predict will be possible within a decade—they may actually observe them perform computation.
BLACK HOLE COMPUTERS

In keeping with the spirit of the age, researchers can think of the laws of physics as computer programs and the universe as a computer

BY SETH LLOYD AND Y. JACK NG

What is the difference between a computer and a black hole? This question sounds like the start of a Microsoft joke, but it is one of the most profound problems in physics today. Most people think of computers as specialized gizmos: streamlined boxes sitting on a desk or fingernail-size chips embedded in high-tech coffeepots. But to a physicist, all physical systems are computers. Rocks, atom bombs and galaxies may not run Linux, but they, too, register and process information. Every electron, photon and other elementary particle stores bits of data, and every time two such particles interact, those bits are transformed. Physical existence and information content are inextricably linked. As physicist John Wheeler of Princeton University says, "It from bit."
Black holes might seem like the exception to the rule that everything computes. Inputting information into them presents no difficulty, but according to Einstein's general theory of relativity, getting information out is impossible. Matter that enters a hole is assimilated, the details of its composition lost irretrievably. In the 1970s Stephen Hawking of the University of Cambridge showed that when quantum mechanics is taken into account, black holes do have an output: they glow like a hot coal. In Hawking's analysis, this radiation is random, however. It carries no information about what went in. If an elephant fell in, an elephant's worth of energy would come out—but the energy would be a hodgepodge that could not be used, even in principle, to re-create the animal.

That apparent loss of information poses a serious conundrum, because the laws of quantum mechanics preserve information. So other scientists, including Leonard Susskind of Stanford University, John Preskill of the California Institute of Technology and Gerard 't Hooft of the University of Utrecht in the Netherlands, have argued that the outgoing radiation is not, in fact, random—that it is a processed form of the matter that falls in [see "Black Holes and the Information Paradox," by Leonard Susskind; Scientific American, April 1997]. This past summer Hawking came around to their point of view. Black holes, too, compute.

Black holes are merely the most exotic example of the general principle that the universe registers and processes information. The principle itself is not new. In the 19th century the founders of statistical mechanics developed what would later be called information theory to explain the laws of thermodynamics. At first glance, thermodynamics and information theory are worlds apart: one was developed to describe steam engines, the other to optimize communications.
Yet the thermodynamic quantity called entropy, which limits the ability of an engine to do useful work, turns out to be proportional to the number of bits registered by the positions and velocities of the molecules in a substance. The invention of quantum mechanics in the 20th century put this discovery on a firm quantitative foundation and introduced scientists to the remarkable concept of quantum information. The bits that make up the universe are quantum bits, or "qubits," with far richer properties than ordinary bits.

Analyzing the universe in terms of bits and bytes does not replace analyzing it in conventional terms such as force and energy, but it does uncover new and surprising facts. In the field of statistical mechanics, for example, it unknotted the paradox of Maxwell's demon, a contraption that seemed to allow for perpetual motion. In recent years, we and other physicists have been applying the same insights to cosmology and fundamental physics: the nature of black holes, the fine-scale structure of spacetime, the behavior of cosmic dark energy, the ultimate laws of nature. The universe is not just a giant computer; it is a giant quantum computer. As physicist Paola Zizzi of the University of Padova says, "It from qubit."

OVERVIEW / COSMIC COMPUTERS

• Merely by existing, all physical systems store information. By evolving dynamically in time, they process that information. The universe computes.
• If information can escape from black holes, as most physicists now suspect, a black hole, too, computes. The size of its memory space is proportional to the square of its computation rate. The quantum-mechanical nature of information is responsible for this computational ability; without quantum effects, a black hole would destroy, rather than process, information.
• The laws of physics that limit the power of computers also determine the precision with which the geometry of spacetime can be measured. The precision is lower than physicists once thought, indicating that discrete "atoms" of space and time may be larger than expected.

When Gigahertz Is Too Slow

The confluence of physics and information theory flows from the central maxim of quantum mechanics: at bottom, nature is discrete. A physical system can be described using a finite number of bits. Each particle in the system acts like the logic gate of a computer. Its spin "axis" can point in one of two directions, thereby encoding a bit, and can flip over, thereby performing a simple computational operation.

The system is also discrete in time. It takes a minimum amount of time to flip a bit. The exact amount is given by a theorem named after two pioneers of the physics of information processing, Norman Margolus of the Massachusetts Institute of Technology and Lev Levitin of Boston University. This theorem is related to the Heisenberg uncertainty principle, which describes the inherent trade-offs in measuring physical quantities, such as position and momentum or time and energy. The theorem says that the time it takes to flip a bit, t, depends on the amount of energy you apply, E. The more energy you apply, the shorter the time can be. Mathematically, the rule is t ≥ h/4E, where h is Planck's constant, the main parameter of quantum theory. For example, one type of experimental quantum computer stores bits on protons and uses magnetic fields to flip them. The operations take place in the minimum time allowed by the Margolus-Levitin theorem.

From this theorem, a huge variety of conclusions can be drawn, from limits on the geometry of spacetime to the computational capacity of the universe as a whole.
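The Margolus-Levitin bound is easy to evaluate numerically. The sketch below (rounded standard values for the constants; all figures are order-of-magnitude estimates) turns t ≥ h/4E into an operation rate, including the roughly 10^51 operations per second that the article goes on to quote for one kilogram of mass-energy:

```python
# Margolus-Levitin bound: flipping a bit takes time t >= h / (4E),
# so a system devoting energy E to computation performs at most
# 4E/h elementary operations per second.
h = 6.626e-34      # Planck's constant, joule-seconds
c = 2.998e8        # speed of light, meters per second

E_one_joule = 1.0
t_min = h / (4 * E_one_joule)        # minimum bit-flip time: ~1.7e-34 s

# One kilogram converted entirely to energy via E = mc^2
E_kilogram = 1.0 * c**2              # ~9e16 joules
ops_per_second = 4 * E_kilogram / h  # ~5e50, i.e. the ~10^51 ops/s in the text

print(f"t_min = {t_min:.2e} s, ops/s = {ops_per_second:.2e}")
```

Note that the bound depends only on total energy, not on how the hardware is built, which is why it applies equally to laptops, plasmas and black holes.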
As a warm-up, consider the limits to the computational power of ordinary matter—in this case, one kilogram occupying the volume of one liter. We call this device the ultimate laptop. Its battery is simply the matter itself, converted directly to energy per Einstein's famous formula E = mc^2. Putting all this energy into flipping bits, the computer can do 10^51 operations per second, slowing down gradually as the energy degrades.
EXTREME COMPUTING

What is a computer? That is a surprisingly complex question, but whatever precise definition one adopts, it is satisfied not just by the objects people commonly call "computers" but also by everything else in the world. Physical objects can solve a broad class of logic and mathematics problems, although they may not accept input or give output in a form that is meaningful to humans. Natural computers are inherently digital: they store data in discrete quantum states, such as the spin of elementary particles. Their instruction set is quantum physics.

ORDINARY LAPTOP (speed: 10^9 hertz; memory: 10^12 bits)
- Input: A keyboard and associated circuitry encode information as voltage pulses in a wire.
- Computation: The pulses interact, guided by devices such as transistors, which perform logical operations such as NOT.
- Output: The pulses, having been processed, are translated into meaningful patterns of light.

ULTIMATE LAPTOP (speed: 10^20 hertz; memory: 10^31 bits)
- Input: Consisting of one kilogram of hot plasma in a one-liter box, this device accepts data encoded as particle positions, velocities and spins.
- Computation: The particles interact. Collisions can be arranged to perform operations such as NOT: a collision can cause particles to flip.
- Output: As particles leave the volume, their properties can be measured and translated. The system slowly winds down as its energy degrades.

BLACK HOLE (speed: 10^35 hertz; memory: 10^16 bits)
- Input: This black hole consists of one kilogram in a volume 10^-27 meter in radius. Data and instructions are encoded in matter and dropped in.
- Computation: On their descent, particles interact much as in the ultimate laptop, except that gravity also plays a role. The governing laws are not yet understood.
- Output: The hole emits radiation, named after physicist Stephen Hawking. New theories suggest that the radiation carries the computational output.
The memory capacity of the machine can be calculated using thermodynamics. When one kilogram of matter is converted to energy in a liter volume, its temperature is one billion kelvins. Its entropy, which is proportional to the energy divided by the temperature, corresponds to 10^31 bits of information. The ultimate laptop stores information in the microscopic motions and positions of the elementary particles zipping around inside it. Every single bit allowed by the laws of thermodynamics is put to use.

Whenever particles interact, they can cause one another to flip. This process can be thought of in terms of a programming language such as C or Java: the particles are the variables, and their interactions are operations such as addition. Each bit can flip 10^20 times per second, equivalent to a clock speed of 100 giga-gigahertz. In fact, the system is too fast to be controlled by a central clock. The time it takes a bit to flip is approximately equal to the time it takes a signal to travel from one bit to its neighbor. Thus, the ultimate laptop is highly parallel: it acts not as a single processor but as a vast array of processors, each working almost independently and communicating its results to the others comparatively slowly.

By comparison, a conventional computer flips bits about 10^9 times per second, stores about 10^12 bits and contains a single processor. If Moore's law could be sustained, your descendants would be able to buy an ultimate laptop midway through the 23rd century. Engineers would have to find a way to exert precise control on the interactions of particles in a plasma hotter than the sun's core, and much of the communications bandwidth would be taken up in controlling the computer and dealing with errors. Engineers would also have to solve some knotty packaging problems.

In a sense, however, you can already purchase such a device, if you know the right people.
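The entropy-to-bits conversion and the Moore's law extrapolation above can be reproduced in a few lines. This is a rough sketch: the 18-month doubling time is an assumption, and comparing a conventional machine's bit-flip rate with the ultimate laptop's total operation rate is deliberately crude.

```python
import math

k_B = 1.381e-23    # Boltzmann's constant, J/K
c = 2.998e8        # speed of light, m/s

# Memory: entropy S ~ E/T for 1 kg of mass-energy at ~1e9 kelvins,
# converted from joules per kelvin to bits by dividing by k_B * ln 2
E = 1.0 * c**2
T = 1e9
bits = (E / T) / (k_B * math.log(2))   # ~1e31 bits, as in the text

# Moore's law extrapolation: from ~1e9 bit flips per second (2004)
# to ~1e51 operations per second, doubling every 1.5 years
doublings = math.log2(1e51 / 1e9)
year = 2004 + 1.5 * doublings          # lands midway through the 23rd century

print(f"bits = {bits:.1e}, year ~ {year:.0f}")
```

The extrapolation lands around the year 2213, consistent with the article's "midway through the 23rd century."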
A one-kilogram chunk of matter converted completely to energy—this is a working definition of a 20-megaton hydrogen bomb. An exploding nuclear weapon is processing a huge amount of information, its input given by its initial configuration and its output given by the radiation it emits.

[FIRST LAW of quantum computation is that computation takes energy. The spin of a proton encodes a single bit, which can be inverted by applying a magnetic field. The stronger the field is—the more energy it applies—the faster the proton will flip.]

From Nanotech to Xennotech

If any chunk of matter is a computer, a black hole is nothing more or less than a computer compressed to its smallest possible size. As a computer shrinks, the gravitational force that its components exert on one another becomes stronger and eventually grows so intense that no material object can escape. The size of a black hole, called the Schwarzschild radius, is directly proportional to its mass.

A one-kilogram hole has a radius of about 10^-27 meter. (For comparison, a proton has a radius of 10^-15 meter.) Shrinking the computer does not change its energy content, so it can perform 10^51 operations per second, just as before. What does change is the memory capacity. When gravity is insignificant, the total storage capacity is proportional to the number of particles and thus to the volume. But when gravity dominates, it interconnects the particles, so collectively they are capable of storing less information. The total storage capacity of a black hole is proportional to its surface area. In the 1970s Hawking and Jacob Bekenstein of the Hebrew University of Jerusalem calculated that a one-kilogram black hole can register about 10^16 bits—much less than the same computer before it was compressed.

In compensation, the black hole is a much faster processor.
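These figures can be checked from first principles. The sketch below (rounded standard constants; order-of-magnitude agreement is all that should be expected) computes the Schwarzschild radius of a one-kilogram hole and its Bekenstein-Hawking memory:

```python
import math

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8         # speed of light, m/s
hbar = 1.055e-34    # reduced Planck's constant, J*s
h = 6.626e-34       # Planck's constant, J*s

M = 1.0                                  # one-kilogram black hole
r_s = 2 * G * M / c**2                   # Schwarzschild radius: ~1.5e-27 m

# Bekenstein-Hawking entropy: S = A c^3 / (4 G hbar) nats,
# where A is the area of the event horizon; divide by ln 2 for bits
A = 4 * math.pi * r_s**2
bits = A * c**3 / (4 * G * hbar) / math.log(2)   # ~4e16, i.e. ~10^16 bits

ops = 4 * M * c**2 / h                   # still ~1e51 ops/s: energy is unchanged
print(f"r_s = {r_s:.1e} m, memory = {bits:.1e} bits, ops/s = {ops:.1e}")
```

Dividing the unchanged operation rate by the drastically reduced memory shows why each of the hole's bits flips so much faster than a bit in the uncompressed laptop.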
In fact, the amount of time it takes to flip a bit, 10^-35 second, is equal to the amount of time it takes light to move from one side of the computer to the other. Thus, in contrast to the ultimate laptop, which is highly parallel, the black hole is a serial computer. It acts as a single unit.

How would a black hole computer work in practice? Input is not problematic: just encode the data in the form of matter or energy and throw it down the hole. By properly preparing the material that falls in, a hacker should be able to program the hole to perform any desired computation. Once the material enters a hole, it is gone for good; the so-called event horizon demarcates the point of no return. The plummeting particles interact with one another, performing computation for a finite time before reaching the center of the hole—the singularity—and ceasing to exist. What happens to matter as it gets squished together at the singularity depends on the details of quantum gravity, which are as yet unknown.

The output takes the form of Hawking radiation. A one-kilogram hole gives off Hawking radiation and, to conserve energy, decreases in mass, disappearing altogether in a mere 10^-21 second. The peak wavelength of the radiation equals the radius of the hole; for a one-kilogram hole, it corresponds to extremely intense gamma rays. A particle detector can capture this radiation and decode it for human consumption.

Hawking's study of the radiation that bears his name is what overturned the conventional wisdom that black holes are objects from which nothing whatsoever can escape [see "The Quantum Mechanics of Black Holes," by Stephen W. Hawking; Scientific American, January 1977]. The rate at which black holes radiate is inversely related to their size, so big black holes, such as those at the center of galaxies, lose energy much more slowly than they gobble up matter. In the future, however, experimenters may be able to create tiny holes in particle accelerators, and these holes should explode almost immediately in a burst of radiation. A black hole can be thought of not as a fixed object but as a transient congregation of matter that performs computation at the maximum rate possible.

CLASSIFYING COMPUTERS

The ultimate laptop and black hole computer embody two different approaches to increasing computing power. The ultimate laptop is the supreme parallel computer: an array of processors working simultaneously. The black hole is the supreme serial computer: a single processor executing instructions one at a time.

Ultimate laptop (0.1 meter across) consists of a collection of particles that encode and process bits. Each can execute an instruction in 10^-20 second. In that time, signals can move a distance of only 3 × 10^-12 meter, which is roughly the spacing between particles. Therefore, communication is much slower than computation. Subregions of the computer work almost independently.

Black hole computer (1.5 × 10^-27 meter across) also consists of a collection of particles. Because of gravity, they encode fewer bits, giving more energy per bit. Each can execute an instruction in 10^-35 second, which is the time it takes for a signal to cross the hole. Therefore, communication is as fast as computation. The computer operates as a single unit.

Escape Plan

The real question is whether Hawking radiation returns the answer of the computation or merely gibberish. The issue remains contentious, but most physicists, including Hawking, now think that the radiation is a highly processed version of the information that went into the hole during its formation. Although matter cannot leave the hole, its information content can. Understanding precisely how is one of the liveliest questions in physics right now.

Last year Gary Horowitz of the University of California at Santa Barbara and Juan Maldacena of the Institute for Advanced Study in Princeton, N.J., outlined one possible mechanism. The escape hatch is entanglement, a quantum phenomenon in which the properties of two or more systems remain correlated across the reaches of space and time. Entanglement enables teleportation, in which information is transferred from one particle to another with such fidelity that the particle has effectively been beamed from one location to another at up to the speed of light.

The teleportation procedure, which has been demonstrated in the laboratory, first requires that two particles be entangled. Then a measurement is performed on one of the particles jointly with some matter that contains information to be teleported.
The measurement erases the information from its original location, but because of entanglement, that information resides in an encoded form on the second particle, no matter how distant it may be. The information can be decoded using the results of the measurement as the key [see "Quantum Teleportation," by Anton Zeilinger; Scientific American, April 2000].

Evolution of Black Hole Theory

"Objects so dense that nothing, not even light, can escape"—this definition of black holes has become a cliché of newspaper articles and freshman astronomy lectures. But it is probably wrong. Physicists have argued since the mid-1970s that energy can leak out of a black hole, and most now think that information (which describes the form that the energy takes) can, too. These diagrams show a black hole from a hypothetical viewpoint outside spacetime.

CLASSICAL VIEW, based on prequantum physics, holds that a blob of matter falling through the hole's outer rim—the event horizon—can neither escape nor send out its information. It hits the center of the hole—the singularity—where its mass is assimilated and its information lost.

HAWKING MODEL is a first stab at considering quantum effects. Pairs of virtual particles materialize at the event horizon. One member of each pair, like other matter, falls to the singularity. Its partner flies outward. The particle spins are random and do not carry any information about the infalling blob.

HOROWITZ-MALDACENA MODEL suggests that the outgoing particle carries away not just raw mass but also information. The particle is quantum mechanically entangled with its infalling partner, which in turn gets entangled with the blob. The entanglement beams the blob's information out.

A similar procedure might work for black holes. Pairs of entangled photons materialize at the event horizon. One of the photons flies outward to become the Hawking radiation that an observer sees. The other falls in and hits the singularity together with the matter that formed the hole in the first place. The annihilation of the infalling photon acts as a measurement, transferring the information contained in the matter to the outgoing Hawking radiation.

The difference from laboratory teleportation is that the results of this "measurement" are not needed to decode the information that was teleported. Horowitz and Maldacena argued that the annihilation does not have a variety of possible outcomes—only one. An observer on the outside can calculate this unique outcome using basic physics and thereby unlock the information. It is this conjecture that falls outside the usual formulation of quantum mechanics. Though controversial, it is plausible. Just as the initial singularity at the start of the universe may have had only one possible state, so it is possible that the final singularities inside black holes have a unique state. This past June one of us (Lloyd) showed that the Horowitz-Maldacena mechanism is robust; it does not depend on what exactly the final state is, as long as there is one.
It still seems to lead to a small loss of information, however.

Other researchers have proposed escape mechanisms that also rely on weird quantum phenomena. In 1996 Andrew Strominger and Cumrun Vafa of Harvard University suggested that black holes are composite bodies made up of multidimensional structures called branes, which arise in string theory. Information falling into the black hole is stored in waves in the branes and can eventually leak out. Earlier this year Samir Mathur of Ohio State University and his collaborators modeled a black hole as a giant tangle of strings. This "fuzzball" acts as a repository of the information carried by things that fall into the black hole. It emits radiation that reflects this information. Hawking, in his recent approach, has argued that quantum fluctuations prevent a well-defined event horizon from ever forming [see "Hawking a Theory," by Graham P. Collins; News Scan, October]. The jury is still out on all these ideas.
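The teleportation protocol that the Horowitz-Maldacena mechanism builds on can be simulated exactly for a single qubit. The sketch below is a plain NumPy state-vector model (the amplitudes alpha and beta are arbitrary illustrative values): a Bell measurement on the sender's two qubits, followed by the classically communicated correction, leaves the receiver's qubit in the original state.

```python
import numpy as np

alpha, beta = 0.6, 0.8j                      # arbitrary state to teleport
psi = np.array([alpha, beta])

bell_phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)   # shared entangled pair
# Joint state of qubits A (sender's data), B, C (receiver), as a 2x2x2 tensor
state = np.kron(psi, bell_phi_plus).reshape(2, 2, 2)

# Bell basis for the joint measurement on A and B; bits (z, x) label the outcome
bell_basis = {
    (0, 0): np.array([1, 0, 0, 1]) / np.sqrt(2),      # Phi+
    (0, 1): np.array([0, 1, 1, 0]) / np.sqrt(2),      # Psi+
    (1, 0): np.array([1, 0, 0, -1]) / np.sqrt(2),     # Phi-
    (1, 1): np.array([0, 1, -1, 0]) / np.sqrt(2),     # Psi-
}
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

rng = np.random.default_rng(0)
outcomes = list(bell_basis)
# Project qubits A,B onto each Bell state; the residue is C's unnormalized state
residues = [np.einsum('ab,abc->c', bell_basis[k].conj().reshape(2, 2), state)
            for k in outcomes]
probs = np.array([np.vdot(r, r).real for r in residues])   # each equals 1/4

k = rng.choice(4, p=probs / probs.sum())                   # random measurement outcome
z, x = outcomes[k]
received = residues[k] / np.sqrt(probs[k])
# Classical correction: apply X if x == 1, then Z if z == 1
corrected = np.linalg.matrix_power(Z, z) @ np.linalg.matrix_power(X, x) @ received

fidelity = abs(np.vdot(corrected, psi))      # 1.0: the state is recovered exactly
```

In the black hole version, no classical message can cross the horizon to tell the outside observer which correction to apply, which is why the Horowitz-Maldacena proposal needs the annihilation at the singularity to have a single, calculable outcome.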
Cyberspacetime

The properties of black holes are inextricably intertwined with those of spacetime. Thus, if holes can be thought of as computers, so can spacetime itself. Quantum mechanics predicts that spacetime, like other physical systems, is discrete. Distances and time intervals cannot be measured to infinite precision; on small scales, spacetime is bubbly and foamy. The maximum amount of information that can be put into a region of space depends on how small the bits are, and they cannot be smaller than the foamy cells.

Physicists have long assumed that the size of these cells is the Planck length (l_P) of 10^-35 meter, which is the distance at which both quantum fluctuations and gravitational effects are important. If so, the foamy nature of spacetime will always be too minuscule to observe. But as one of us (Ng) and Hendrik van Dam of the University of North Carolina at Chapel Hill and Frigyes Károlyházy of Eötvös Loránd University in Hungary have shown, the cells are actually much larger and, indeed, have no fixed size: the larger a region of spacetime, the larger its constituent cells. At first, this assertion may seem paradoxical—as though the atoms in an elephant were bigger than those in a mouse. In fact, Lloyd has derived it from the same laws that limit the power of computers.

The process of mapping the geometry of spacetime is a kind of computation, in which distances are gauged by transmitting and processing information. One way to do this is to fill a region of space with a swarm of Global Positioning System satellites, each containing a clock and a radio transmitter [see illustration on next page]. To measure a distance, a satellite sends a signal and times how long it takes to arrive. The precision of the measurement depends on how fast the clocks tick.
Ticking is a computational operation, so its maximum rate is given by the Margolus-Levitin theorem: the time between ticks is inversely proportional to the energy.

The energy, in turn, is also limited. If you give the satellites too much energy or pack them too closely together, they will form a black hole and will no longer be able to participate in mapping. (The hole will still emit Hawking radiation, but that radiation has a wavelength the size of the hole itself and so is not useful for mapping features on a finer scale.) The maximum total energy of the constellation of satellites is proportional to the radius of the region being mapped.

Thus, the energy increases more slowly than the volume of the region does. As the region gets bigger, the cartographer faces an unavoidable trade-off: reduce the density of satellites (so they are spaced farther apart) or reduce the energy available to each satellite (so that their clocks tick more slowly). Either way, the measurement becomes less precise.

THE AUTHORS

SETH LLOYD and Y. JACK NG bridge the two most exciting fields of theoretical physics: quantum information theory and the quantum theory of gravity. Lloyd, professor of quantum-mechanical engineering at the Massachusetts Institute of Technology, designed the first feasible quantum computer. He works with various teams to construct and operate quantum computers and communications systems. Ng, professor of physics at the University of North Carolina at Chapel Hill, studies the fundamental nature of spacetime. He has proposed various ways to look for the quantum structure of spacetime experimentally. Both researchers say their most skeptical audience is their family. When Lloyd told his daughters that everything is made of bits, one responded bluntly: "You're wrong, Daddy. Everything is made of atoms, except light." Ng has lost credibility on the subject because he is always having to turn to his sons for help with his computer.
Mathematically, in the time it takes to map a region of radius R, the total number of ticks by all the satellites is R^2/l_P^2. If each satellite ticks precisely once during the mapping process, the satellites are spaced out by an average distance of R^(1/3) l_P^(2/3). Shorter distances can be measured in one subregion but only at the expense of reduced precision in some other subregion. The argument applies even if space is expanding.

This formula gives the precision to which distances can be determined; it is applicable when the measurement apparatus is just on the verge of becoming a black hole. Below the minimum scale, spacetime geometry ceases to exist. That level of precision is much, much bigger than the Planck length. To be sure, it is still very small. The average imprecision in measuring the size of the observable universe is about 10^-15 meter. Nevertheless, such an imprecision might be detectable by precise distance-measuring equipment, such as future gravitational-wave observatories.

From a theorist's point of view, the broader significance of this result is that it provides a new way to look at black holes. Ng has shown that the strange scaling of spacetime fluctuations with the cube root of distances provides a back-door way to derive the Bekenstein-Hawking formula for black hole memory. It also implies a universal bound for all black hole computers: the number of bits in the memory is proportional to the square of the computation rate. The proportionality constant is Gh/c^5—mathematically demonstrating the linkage between information and the theories of special relativity (whose defining parameter is the speed of light, c), general relativity (the gravitational constant, G) and quantum mechanics (h).
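These scaling laws are simple enough to check numerically. In the sketch below, the radius of the observable universe and the Planck length are rounded, illustrative values; the spacing formula reproduces the roughly 10^-15 meter imprecision quoted in the text, and the Gh/c^5 bound recovers the one-kilogram hole's roughly 10^16-bit memory from its roughly 10^51 operations per second.

```python
l_P = 1.6e-35        # Planck length, m (rounded)
R = 4.4e26           # radius of the observable universe, m (illustrative)
G = 6.674e-11        # gravitational constant
h = 6.626e-34        # Planck's constant
c = 2.998e8          # speed of light

ticks = (R / l_P)**2                 # total ticks available: R^2 / l_P^2
spacing = R**(1/3) * l_P**(2/3)      # average satellite spacing: ~5e-15 m

# Universal bound for black hole computers: bits ~ (ops per second)^2 * Gh/c^5
ops = 4 * c**2 / h                   # ~1e51 ops/s for a one-kilogram hole
bits = ops**2 * G * h / c**5         # ~5e15, consistent with ~10^16 bits

print(f"spacing = {spacing:.1e} m, bits = {bits:.1e}")
```

Both answers agree with the article's figures to within an order of magnitude, which is the expected accuracy for estimates that drop numerical factors.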
Perhaps most significantly, the result leads directly to the holographic principle, which suggests that our three-dimensional universe is, in some deep but unfathomable way, two-dimensional. The maximum amount of information that any region of space can store seems to be proportional not to its volume but to its surface area [see "Information in the Holographic Universe," by Jacob D. Bekenstein; Scientific American, August 2003]. The holographic principle is normally thought to arise from the unknown details of quantum gravity, yet it also follows directly from the fundamental quantum limits to the precision of measurement.

Computing Spacetime

Measuring distances and time intervals is a type of computation and falls under the same constraints that computers do. It turns out that measurement is a much more slippery process than physicists had thought.

TO MAP A VOLUME of space, you might use a constellation of Global Positioning System satellites. They make measurements by sending signals and timing their arrival. For maximum precision, you need lots of satellites. But the number of satellites is limited: too many, and the entire system will collapse to a black hole.

- Radius: 100 km; satellites: 4; spacing: 90 km; error: ~2 × 10^-22 cm
- Radius: 200 km; satellites: 8; spacing: 150 km; error: ~3 × 10^-22 cm (a 26 percent increase)

MEASUREMENT UNCERTAINTY is thus not fixed but can vary with the size of the object being measured. The larger the object is, the fuzzier its detailed structure. That differs from everyday life, in which the measurement imprecision is independent of the object and depends only on how finely subdivided your ruler is. It is as though your choice of what to measure affects the fine-scale structure of spacetime.

The Answer Is... 42

The principles of computation can be applied not just to the most compact computers (black holes) and tiniest possible computers (spacetime foam) but also to the largest: the universe.
The universe may well be infinite in extent, but it has existed a finite length of time, at least in its present form. The observable part is currently some tens of billions of light-years across. For us to know