Showing posts with label Simulation Hypothesis. Show all posts

Wednesday, June 15, 2016

Further on Base Reality

Here I am just trying to map out the context of possibilities when Space and Time are viewed, paradigmatically, as a simulation.

3. Wheeler. His phrase “It from Bit” implies that at a deep level, everything is information.

“Physicists have now returned to the idea that the three-dimensional world that surrounds us could be a three-dimensional slice of a higher dimensional world.” (L. Randall, 2005) p52

If you probe the black hole with more energy, it just expands its horizon and reveals no more, so what occurs below the Planck length is unknown.
There is a Big Rip versus Big Bang question as to the beginning, and at a fundamental level there is the question of how such a base reality had to emerge from a quantum reality.

Quantum Realism: every virtual reality boots up with a first event that also begins its space and time. In this view, the Big Bang was when our physical universe booted up, including its space-time operating system. Quantum realism suggests that the Big Bang was really the Big Rip.
See number 10

It's not really a God of the gaps but a limitation on what we can know, so quantum realism sets in as to what the base reality arises from. Yet information is being released from the black hole.

Such visualization of what is happening between a Q <-> Q measure is a visualization of a change of energy (pulling apart), with the idea that distance requires greater energies. Reductionism has run its limits at the black hole, so what happens inside it must be looked at from the boundary.

Theoretical excursions from standard views do represent paradigmatic changes in cognitive function, like the slip between two readings of the same figure (the vase), and Kuhn is worth repeating.

But more than that, regardless of these new views, things were already in place for me: governing sources attained in my own explorations of consciousness. In the light of this paradigmatic change, there are now new words covering what existed as experience before the simulation hypothesis offered access to what is believed to be quantum realism as an informational source. You show two mediums using some of these changes in views; this is a method I am using as well.

So that "one thing" is quite wide and pervasive, in my opinion, as to all information existing... much like Jung's collective unconscious... which settles through the tip of the pyramid as an inverse action of emergence and, as such, becomes the base reality. A schematic transference of the idea settling into nature as the base of the pyramid. It would seem correlative with Plato pointing up to that one thing. Quantum realism? How ideas filter down into the base reality.

So it really is an interactive exchange, an inductive/deductive approach toward reality. This is part of my discovery regarding the geometrical underpinnings I mentioned, and this development of thought is becoming visible as I learn what is going on in science today.

So this has become part of my work: to understand how the mathematics developed alongside quantum theory is being geometrically realized, an excursion as deep into consciousness as the one I explored through mandala interpretations of experiences attained in lucid dreaming. Layers of interpretation.

Long before understanding and learning of Euclid, I was drawing images of point, line, and plane, and images developed further from this.

It was interesting to see the idea of illusion set in, between what was once seen and what is true, in a picture presented as part of the video demonstration in a follow-up. This somehow fits with the idea of quantum cognition, as seen at these subtle shifts of paradigmatic change, in my view.

Such acceptance proliferates into a new perspective on, and acceptance of, the world seen as the simulation hypothesis. This, to me, would explain part of one's acceptance of the hypothesis since 2009.


See Also:

  • Base Reality
  • Wednesday, June 08, 2016

    Base Reality?

    Elon Musk: "There's a one in billions chance we're in base reality." Written by Jason Koebler, Motherboard.
    "The strongest argument for us being in a simulation, probably being in a simulation, is the following: 40 years ago, we had Pong, two rectangles and a dot," Musk said. "That is what games were. Now, 40 years later, we have photorealistic 3D simulations with millions of people playing simultaneously, and it's getting better every year. And soon we'll have virtual reality, augmented reality. If you assume any rate of improvement at all, the games will become indistinguishable from reality."

    Progression can stop in this base reality... and science goes no further, due to a calamity that wipes out the human race?

    A thought that stuck out in my mind.

    As I went through comparative labels, some things that came up were regarding quantum gravity, or the physics of organic chemistry. How would a simulation hypothesis explain these things?

    This symbol was used to demonstrate in a global sense that everything is derived from bits. Taken from a speech given by John Archibald Wheeler in 1999.  Also from, J. A. Wheeler: Journey into Gravity and Spacetime (Scientific American Library, Freeman, New York, 1990),  pg. 220

     Abstraction lives in the land of simulations as information for consciousness? It only becomes real, physically, as a matter-oriented state of expression?

     But in the same breath,

        To my mind there must be, at the bottom of it all,
        not an equation, but an utterly simple idea.
        And to me that idea, when we finally discover it,
        will be so compelling, so inevitable,
        that we will say to one another,
        “Oh, how beautiful !
        How could it have been otherwise?” From a personal notebook of Wheeler circa 1991

    An idea then.

    The iCub was designed by the RobotCub Consortium, of several European universities, and is now supported by other projects such as ITALK.[1] The robot is open-source, with the hardware design, software and documentation all released under the GPL license. The name is a partial acronym, cub standing for Cognitive Universal Body.[2] Initial funding for the project was €8.5 million from Unit E5 – Cognitive Systems and Robotics – of the European Commission's Seventh Framework Programme, and this ran for six years from 1 September 2004 until 1 September 2010.[2]

    The motivation behind the strongly humanoid design is the embodied cognition hypothesis, that human-like manipulation plays a vital role in the development of human cognition. A baby learns many cognitive skills by interacting with its environment and other humans using its limbs and senses, and consequently its internal model of the world is largely determined by the form of the human body. The robot was designed to test this hypothesis by allowing cognitive learning scenarios to be acted out by an accurate reproduction of the perceptual system and articulation of a small child so that it could interact with the world in the same way that such a child does.[3]




    Tuesday, January 07, 2014

    Is Reality a Virtual Simulation?

    To my mind there must be, at the bottom of it all,
    not an equation, but an utterly simple idea.
    And to me that idea, when we finally discover it,
    will be so compelling, so inevitable,
    that we will say to one another,
    “Oh, how beautiful !
    How could it have been otherwise?”
    From a personal notebook of Wheeler circa 1991
    This symbol was used to demonstrate in a global sense that everything is derived from bits. Taken from a speech given by John Archibald Wheeler in 1999. Also from, J. A. Wheeler: Journey into Gravity and Spacetime (Scientific American Library, Freeman, New York, 1990), pg. 220
    The Last Question. Of course in science fiction we like to popularize things, as if the model itself has yet to become the real thing. Can one say that their model is better, while saying all other models are insufficient? They have to be speaking from a framework, right? In that sense Asimov was a visionary who brought the dream toward becoming something real.

    As I was reading I got this impression of the "they (as some grand designer)", as if the designer of the monolith, and those without free will, apes. I know it's just a story, but it was the first story to me in which, as storytellers, we'd given the impression that HAL was imbued with something more than we had come to know in the beginning days of computer intelligence. In another sense still, moon dwellers, with no free will.

    So in that sense there was this drive to apply human capabilities to a machine, and thus all human expression in terms of this machinist attribute of being? So at some point the Frankenstein (a biologically designed robot) becomes alive through our efforts to construct this live, emotive thing we call a robot.

    So it is as if the simulation had taken on this elevation of sorts, as to say, that the machine had graduated once having realized that such a robot could dream, and thus a culmination of all things possible for a human being, had somehow now become that simulation of reality? A Second Life?

    So we have this godlike power now. And in place of sending machines to distant planets to gather information and to do our bidding in an effort to gather information, some "they" beyond the parameters of our seemingly capable world of science found out, that the biological robot had already been designed? Say what?


    Monday, November 25, 2013

    A Universe on the Other Side

     "I think people thought that the universe was smaller, yet discoveries in the last century have found there are black holes everywhere, billions of black holes in our universe, and each may produce a universe on the other side, like an infinite tree," he said. See: New Hit Film ‘Gravity’ Speaks to Our Endless Fascination with Deep Space

    Just to help here, given a platform with which to consider the question: "Dr. Poplawski from the University of New Haven, Connecticut, concluded that each time a black hole forms, a new universe could form within it."

    One is always looking for evidence of such things. The very notion of the black hole itself has to have had a basis with which to consider it, so we may say these black holes are real.

    While the subject provides many things to consider, how does Dr. Poplawski provide evidence for such a statement of universe within universe?

    If you look closely at, and align, the perspectives with which to examine this, it may help to consider how one is perceiving the idea of the universe. For cosmology, they may say that in a coordinated system there is no before or after, just what exists as is. So any notion of what came before this universe, or what is to come after, is a hard pill to swallow.

    For some of us it is not a problem. For me, the idea is that in local regions of the universe, information is crunched in order to be dissipated into the larger universe. This supplies the motivation for expansion, and at the same time the evolving nature of the universe has to have more black holes in order to account for the existence of any cosmological constant that is to be considered positive?

    The question of entropy, as it existed in the early universe: how does this figure into the ability for any new universe to form? You are confronted with the notion of a symmetry existing in the early universe in order for entropy to follow the path it does today. How could any universe have existed as a fundamental reality of the current universe?


    For example.

    In 2010, Penrose and Vahe Gurzadyan published a preprint of a paper claiming that observations of the cosmic microwave background made by the Wilkinson Microwave Anisotropy Probe and the BOOMERanG experiment showed concentric anomalies which were consistent with the CCC hypothesis, with a low probability of the null hypothesis that the observations in question were caused by chance.[5] However, the statistical significance of the claimed detection has since been questioned. Three groups have independently attempted to reproduce these results, but found that the detection of the concentric anomalies was not statistically significant, in the sense that such circles would appear in a proper Gaussian simulation of the anisotropy in the CMB data.[6][7][8]

    The reason for the disagreement was tracked down to an issue of how to construct the simulations that are used to determine the significance: The three independent attempts to repeat the analysis all used simulations based on the standard Lambda-CDM model, while Penrose and Gurzadyan used an undocumented non-standard approach.[9]
    See: Conformal cyclic cosmology

    So, for some who hold entropy as a subjective examination of the reality in which we live now, how can one exist as a viewer of the larger universe that contains all these other universes, respective of the arrow of time?
    Can disorder precede order, as a question of what came first as to the existence of the universe? The chicken-or-egg question. If the overarching principle here is an arrow of time, then such a hypothesis needs some factor to support such expansion, given what can exist as a fundamental reality in those local regions of the universe.

    Friday, May 03, 2013

    Generalizations on, It from Bit

    I know there is an essay contest going on, and I thought it might be quite the challenge indeed to wonder about, and construct from a layman's standpoint, some of the ideas that are emerging from the challenge.

    The past century in fundamental physics has shown a steady progression away from thinking about physics, at its deepest level, as a description of material objects and their interactions, and towards physics as a description of the evolution of information about and in the physical world. Moreover, recent years have shown an explosion of interest at the nexus of physics and information, driven by the "information age" in which we live, and more importantly by developments in quantum information theory and computer science.

    We must ask the question, though, is information truly fundamental or not? Can we realize John Wheeler's dream, or is it unattainable? We ask: "It From Bit or Bit From It?"

    Possible topics or sub-questions include, but are not limited to:

    What IS information? What is its relation to "Reality"?

    How does nature (the universe and the things therein) "store" and "process" information?

    How does understanding information help us understand physics, and vice-versa?
     See: It From Bit or Bit From It?

    So maybe as I go along it will contrive into something tangible, worth considering. If I show its production here and it eliminates any possibility of an entrance, then it's nice that I will have learned something along the way.

    Part 1

    To my mind there must be, at the bottom of it all,
    not an equation, but an utterly simple idea.
    And to me that idea, when we finally discover it,
    will be so compelling, so inevitable,
    that we will say to one another,
    “Oh, how beautiful !
    How could it have been otherwise?”
    From a personal notebook of Wheeler circa 1991

    This symbol was used to demonstrate in a global sense that everything is derived from bits. Taken from a speech given by John Archibald Wheeler in 1999.  Also from, J. A. Wheeler: Journey into Gravity and Spacetime (Scientific American Library, Freeman, New York, 1990),  pg. 220

    So the idea here, while starting from a vague representation of something that could exist at a deeper region of reality, is ever the question of where our perspective can go. As if telling a story and reaching for some climax, I ponder an approach. If one goes through the story, it is to arrive fastidiously at a place in the future, so as to see where John Archibald Wheeler took us, in the form of his perspective on Information, Physics and Quantum.

    So indeed I have opened on a simplistic level, so as to move this story into the forum of how one might approach a description of the world, so as to say its foundation has been purposeful and leading. So I will move forward, present a phenomenological approach to consider, and go backward in time.

    Quantum gravity theory is untested experimentally. Could it be tested with tabletop experiments? While the common feeling is pessimistic, a detailed inquiry shows it possible to sidestep the onerous requirement of localization of a probe on Planck length scale. I suggest a tabletop experiment which, given state of the art ultrahigh vacuum and cryogenic technology, could already be sensitive enough to detect Planck scale signals. The experiment combines a single photon's degree of freedom with one of a macroscopic probe to test Wheeler's conception of "quantum foam", the assertion that on length scales of the order Planck's, spacetime is no longer a smooth manifold. The scheme makes few assumptions beyond energy and momentum conservation, and is not based on a specific quantum gravity scheme. See: Is a tabletop search for Planck scale signals feasible? by Jacob D. Bekenstein (Submitted on 16 Nov 2012 (v1), last revised 13 Dec 2012 (this version, v2))

    The question of any such emergence is then to consider such examples held in the context of the digital world of physics. This is to say it can be used to grossly examine levels that take us to the quest of examining what exists as a basis of reality. So too, then, what can be described as a purposeful examination of the interior of the black hole.

    Jacob D. Bekenstein

    Such examinations then ask whether such approaches will divide perspectives in science into relations that adopt either a discrete or a continuum view of the basis of reality. This then forces a division into the categories we devise with respect to the sciences as theoretical examinations and their approach. So from this perspective we see where John Wheeler sought such a demonstration, in order to raise the question of a basis of reality on such a discrete approach.

    So such foundational methods are then demonstrated as forming from such developments in perspective. John Wheeler sought a demonstration of the idea of foundation as an end result by his students. To seek an explanation of the interior of the black hole, so as to demonstrate such a progeny in his students, is to force this subject further along in history. So, from this standpoint, we go back in time.


    Generalizations on: "It From Bit" Part 2

    John Archibald Wheeler (July 9, 1911 – April 13, 2008) was an eminent American theoretical physicist. One of the later collaborators of Albert Einstein, he tried to achieve Einstein's vision of a unified field theory. He is also known as the coiner of the popular name of the well-known space phenomenon, the black hole.

    There is always somebody who is the teacher, and from them there is a progeny. It would not be right not to mention John Archibald Wheeler, or some of his students.

    Notable students

    Demetrios Christodoulou
    Richard Feynman
    Jacob Bekenstein
    Robert Geroch
    Bei-Lok Hu
    John R. Klauder
    Charles Misner
    Milton Plesset
    Kip Thorne
    Arthur Wightman
    Hugh Everett
    Bill Unruh

    So it is with some respect that, as we move back in time, we see the names of those who have brought us ever closer to the understanding and ideal of some phenomenological approach, so as to say such a course of events has indeed been fruitful. The branches that extend from John Archibald Wheeler's work also serve to remind us of the wide diversity of approaches to understanding and developing gravitational physics.

    COSMIC SEARCH: How did you come up with the name "black hole"?

    John Archibald Wheeler: It was an act of desperation, to force people to believe in it. It was in 1968, at the time of the discussion of whether pulsars were related to neutron stars or to these completely collapsed objects. I wanted a way of emphasizing that these objects were real. Thus, the name "black hole".

    The Russians used the term frozen star—their point of attention was how it looked from the outside, where the material moves much more slowly until it comes to a horizon.* (*Or critical distance. From inside this distance there is no escape.) But, from the point of view of someone who's on the material itself, falling in, there's nothing special about the horizon. He keeps on going in. There's nothing frozen about what happens to him. So, I felt that that aspect of it needed more emphasis.

    So as we go back in time, we see where certain functions, as descriptions and features of a reality, have to suggest there was some beginning. It is also the realization that such a beginning asks us to consider the function and reality of such new concepts, so as to force us to deal with the fundamentals of that reality.

    Dr. Kip Thorne, Caltech

    So again, as we go back in time, we see where such beginnings in science's approach have their origin not only in the recognition of the black hole, but in what has led toward today's approach to gravity in terms of what is discrete and what is considered a continuum. These functions, as gravity, show a certain distinction in today's science as they extend from John Archibald Wheeler's approach, from which his search for links among Information, Physics and the Quantum began.

    Thursday, June 21, 2012

    Reality is Information?

    This paper argues that at least one of the following propositions is true: (1) the human species is very likely to go extinct before reaching a “posthuman” stage; (2) any posthuman civilization is extremely unlikely to run a significant number of simulations of their evolutionary history (or variations thereof); (3) we are almost certainly living in a computer simulation. It follows that the belief that there is a significant chance that we will one day become posthumans who run ancestor-simulations is false, unless we are currently living in a simulation. A number of other consequences of this result are also discussed.
     Nick Bostrom interviewed about the Simulation Argument.

     In this informal interview in Atlanta June 8, 2012, Tom Campbell, author of My Big TOE, expands on the significance of the scientific experiment called the Double Slit in terms everyone can understand.

    "If you understand the Double Slit experiment, you understand how our reality works."
    He continues, "Everything we do is not different from the Double Slit experiment."

    This explanation is valuable to scientists as well as the general public. Tom takes a difficult subject and applies helpful analogies to clarify the implications of this scientific experiment.
    See: Tom Campbell: Our Reality Is Information
    Update: there is further information later on that Campbell recognizes is needed to further solidify his understanding of the Double Slit, so he is open to that, which is good.


    Monday, May 21, 2012

    Digital Physics

    In physics and cosmology, digital physics is a collection of theoretical perspectives based on the premise that the universe is, at heart, describable by information, and is therefore computable. Therefore, the universe can be conceived as either the output of a computer program or as a vast, digital computation device (or, at least, mathematically isomorphic to such a device).

    Digital physics is grounded in one or more of the following hypotheses, listed in order of increasing strength. The universe, or reality:

      • is essentially informational (although not every informational ontology needs to be digital)
      • is essentially computable
      • can be described digitally
      • is in essence digital
      • is itself a computer
      • is the output of a simulated reality exercise

    Every computer must be compatible with the principles of information theory, statistical thermodynamics, and quantum mechanics. A fundamental link among these fields was proposed by Edwin Jaynes in two seminal 1957 papers.[1] Moreover, Jaynes elaborated an interpretation of probability theory as generalized Aristotelian logic, a view very convenient for linking fundamental physics with digital computers, because these are designed to implement the operations of classical logic and, equivalently, of Boolean algebra.[2]
    The hypothesis that the universe is a digital computer was pioneered by Konrad Zuse in his book Rechnender Raum (translated into English as Calculating Space). The term digital physics was first employed by Edward Fredkin, who later came to prefer the term digital philosophy.[3] Others who have modeled the universe as a giant computer include Stephen Wolfram,[4] Juergen Schmidhuber,[5] and Nobel laureate Gerard 't Hooft.[6] These authors hold that the apparently probabilistic nature of quantum physics is not necessarily incompatible with the notion of computability. Quantum versions of digital physics have recently been proposed by Seth Lloyd,[7] David Deutsch, and Paola Zizzi.[8]

    Related ideas include Carl Friedrich von Weizsäcker's binary theory of ur-alternatives, pancomputationalism, computational universe theory, John Archibald Wheeler's "It from bit", and Max Tegmark's ultimate ensemble.

    Digital physics




    Digital physics suggests that there exists, at least in principle, a program for a universal computer which computes the evolution of the universe. The computer could be, for example, a huge cellular automaton (Zuse 1967[9]), or a universal Turing machine, as suggested by Schmidhuber (1997), who pointed out that there exists a very short program that can compute all possible computable universes in an asymptotically optimal way.
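    The cellular-automaton picture above can be made concrete with a toy example. The sketch below is only an illustration of the general idea (a tiny 1-D automaton whose update rule is a lookup table); the choice of Rule 110 and the grid size are my own assumptions, not anything prescribed by Zuse or Schmidhuber.

```python
# Illustrative sketch: a 1-D cellular automaton (Rule 110) as a toy
# "computing universe" in the spirit of Zuse's Rechnender Raum.
# Each generation is computed purely from local neighbourhoods.

def step(cells, rule=110):
    """Advance one generation; each cell looks at itself and its two neighbours."""
    n = len(cells)
    out = []
    for i in range(n):
        # Read the 3-cell neighbourhood as a number 0..7 (wrapping at the edges).
        idx = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        # The rule number's binary expansion serves as the lookup table.
        out.append((rule >> idx) & 1)
    return out

# Start from a single "on" cell and evolve a few generations.
cells = [0] * 15
cells[7] = 1
for _ in range(5):
    print("".join(".#"[c] for c in cells))
    cells = step(cells)
```

    Rule 110 is known to be Turing-complete, which is why it is a popular stand-in for "a very simple rule that can nonetheless compute anything computable."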

    Some try to identify single physical particles with simple bits. For example, if one particle, such as an electron, is switching from one quantum state to another, it may be the same as if a bit is changed from one value (0, say) to the other (1). A single bit suffices to describe a single quantum switch of a given particle. As the universe appears to be composed of elementary particles whose behavior can be completely described by the quantum switches they undergo, that implies that the universe as a whole can be described by bits. Every state is information, and every change of state is a change in information (requiring the manipulation of one or more bits). Setting aside dark matter and dark energy, which are poorly understood at present, the known universe consists of about 10^80 protons and the same number of electrons. Hence, the universe could be simulated by a computer capable of storing and manipulating about 10^90 bits. If such a simulation is indeed the case, then hypercomputation would be impossible.
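    The paragraph's numbers can be checked with back-of-envelope arithmetic. Note the per-particle bit budget below is an illustrative assumption I introduce to bridge 10^80 particles to the quoted 10^90 bits; the excerpt itself does not specify it.

```python
# Order-of-magnitude arithmetic behind the ~10^90-bit estimate.
particles = 10**80          # rough count of protons (and as many electrons)
bits_per_particle = 10**10  # assumed head-room per particle's state history

total_bits = particles * bits_per_particle
magnitude = len(str(total_bits)) - 1   # exponent, since total is a power of ten
print(f"total bits ~ 10^{magnitude}")  # → total bits ~ 10^90
```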

    Loop quantum gravity could lend support to digital physics, in that it assumes space-time is quantized. Paola Zizzi has formulated a realization of this concept in what has come to be called "computational loop quantum gravity", or CLQG.[10][11] Other theories that combine aspects of digital physics with loop quantum gravity are those of Marzuoli and Rasetti[12][13] and Girelli and Livine.[14]

    Weizsäcker's ur-alternatives


    Physicist Carl Friedrich von Weizsäcker's theory of ur-alternatives (archetypal objects), first publicized in his book The Unity of Nature (1980),[15] further developed through the 1990s,[16][17] is a kind of digital physics as it axiomatically constructs quantum physics from the distinction between empirically observable, binary alternatives. Weizsäcker used his theory to derive the 3-dimensionality of space and to estimate the entropy of a proton falling into a black hole.

    Pancomputationalism or the computational universe theory


    Pancomputationalism (also known as pan-computationalism, naturalist computationalism) is a view that the universe is a huge computational machine, or rather a network of computational processes which, following fundamental physical laws, computes (dynamically develops) its own next state from the current one.[18]
    A computational universe is proposed by Jürgen Schmidhuber in a paper based on Konrad Zuse's assumption (1967) that the history of the universe is computable. He pointed out that the simplest explanation of the universe would be a very simple Turing machine programmed to systematically execute all possible programs computing all possible histories for all types of computable physical laws. He also pointed out that there is an optimally efficient way of computing all computable universes based on Leonid Levin's universal search algorithm (1973). In 2000 he expanded this work by combining Ray Solomonoff's theory of inductive inference with the assumption that quickly computable universes are more likely than others. This work on digital physics also led to limit-computable generalizations of algorithmic information or Kolmogorov complexity and the concept of Super Omegas, which are limit-computable numbers that are even more random (in a certain sense) than Gregory Chaitin's number of wisdom Omega.
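    The "systematically execute all possible programs" idea above relies on dovetailing: interleaving the steps of infinitely many computations so that none is starved. A minimal sketch, using Python generators as stand-ins for enumerated Turing-machine programs (the toy multiples-of-n "programs" are my own assumption, chosen only so the trace is easy to check):

```python
# Dovetailing sketch: run many never-halting "programs" fairly,
# one step each per round, as in Schmidhuber's computable-universes picture.
from itertools import count

def program(n):
    """Toy program n: emits successive multiples of n, forever."""
    for step in count(1):
        yield n * step

def dovetail(num_programs, rounds):
    """Round-robin scheduler: each program advances one step per round."""
    gens = [program(n) for n in range(1, num_programs + 1)]
    trace = []
    for _ in range(rounds):
        for i, g in enumerate(gens):
            trace.append((i + 1, next(g)))  # (program id, its latest output)
    return trace

print(dovetail(3, 2))
# → [(1, 1), (2, 2), (3, 3), (1, 2), (2, 4), (3, 6)]
```

    A real construction would enumerate program codes and also keep adding new programs each round; the essential point survives in the sketch: every program gets infinitely many steps even though there are infinitely many programs.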

    Wheeler's "it from bit"


    Following Jaynes and Weizsäcker, the physicist John Archibald Wheeler wrote the following:

    [...] it is not unreasonable to imagine that information sits at the core of physics, just as it sits at the core of a computer. (John Archibald Wheeler 1998: 340)

    It from bit. Otherwise put, every 'it'—every particle, every field of force, even the space-time continuum itself—derives its function, its meaning, its very existence entirely—even if in some contexts indirectly—from the apparatus-elicited answers to yes-or-no questions, binary choices, bits. 'It from bit' symbolizes the idea that every item of the physical world has at bottom—a very deep bottom, in most instances—an immaterial source and explanation; that which we call reality arises in the last analysis from the posing of yes–no questions and the registering of equipment-evoked responses; in short, that all things physical are information-theoretic in origin and that this is a participatory universe. (John Archibald Wheeler 1990: 5)
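    Wheeler's "apparatus-elicited answers to yes-or-no questions" can be loosely illustrated by recovering an "it" from nothing but binary answers. This is only a metaphorical sketch (a binary search over a number range, entirely my own framing), not a derivation of anything physical:

```python
# "It from bit", loosely: pin down an unknown value purely through
# yes/no questions, each of which halves the remaining possibilities.

def locate(it, lo=0, hi=255):
    """Recover `it` from binary answers; returns the value and the bit string."""
    bits = []
    while lo < hi:
        mid = (lo + hi) // 2
        answer = it > mid          # one yes/no question to the "apparatus"
        bits.append(int(answer))
        if answer:
            lo = mid + 1
        else:
            hi = mid
    return lo, bits

value, questions = locate(200)
print(value, len(questions))  # 8 questions suffice for 256 possibilities
```

    The point of the metaphor: the "it" (the value) is fully specified by the registered bits, which is the sense in which Wheeler's slogan says every item of the physical world is information-theoretic in origin.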

    David Chalmers of the Australian National University summarised Wheeler's views as follows:

    Wheeler (1990) has suggested that information is fundamental to the physics of the universe. According to this 'it from bit' doctrine, the laws of physics can be cast in terms of information, postulating different states that give rise to different effects without actually saying what those states are. It is only their position in an information space that counts. If so, then information is a natural candidate to also play a role in a fundamental theory of consciousness. We are led to a conception of the world on which information is truly fundamental, and on which it has two basic aspects, corresponding to the physical and the phenomenal features of the world.[19]

    Chris Langan also builds upon Wheeler's views in his epistemological metatheory:

    The Future of Reality Theory According to John Wheeler: In 1979, the celebrated physicist John Wheeler, having coined the phrase “black hole”, put it to good philosophical use in the title of an exploratory paper, Beyond the Black Hole, in which he describes the universe as a self-excited circuit. The paper includes an illustration in which one side of an uppercase U, ostensibly standing for Universe, is endowed with a large and rather intelligent-looking eye intently regarding the other side, which it ostensibly acquires through observation as sensory information. By dint of placement, the eye stands for the sensory or cognitive aspect of reality, perhaps even a human spectator within the universe, while the eye’s perceptual target represents the informational aspect of reality. By virtue of these complementary aspects, it seems that the universe can in some sense, but not necessarily that of common usage, be described as “conscious” and “introspective”…perhaps even “infocognitive”.[20]

    The first formal presentation of the idea that information might be the fundamental quantity at the core of physics seems to be due to Frederick W. Kantor (a physicist from Columbia University). Kantor's book Information Mechanics (Wiley-Interscience, 1977) developed this idea in detail, but without mathematical rigor.

    The toughest nut to crack in Wheeler's research program of a digital dissolution of physical being in a unified physics, Wheeler himself says, is time. In a 1986 eulogy to the mathematician Hermann Weyl, he proclaimed: "Time, among all concepts in the world of physics, puts up the greatest resistance to being dethroned from ideal continuum to the world of the discrete, of information, of bits. ... Of all obstacles to a thoroughly penetrating account of existence, none looms up more dismayingly than 'time.' Explain time? Not without explaining existence. Explain existence? Not without explaining time. To uncover the deep and hidden connection between time and existence ... is a task for the future."[21] The Australian phenomenologist Michael Eldred comments:

    The antinomy of the continuum, time, in connection with the question of being ... is said by Wheeler to be a cause for dismay which challenges future quantum physics, fired as it is by a will to power over moving reality, to "achieve four victories" (ibid.)... And so we return to the challenge to "[u]nderstand the quantum as based on an utterly simple and—when we see it—completely obvious idea" (ibid.) from which the continuum of time could be derived. Only thus could the will to mathematically calculable power over the dynamics, i.e. the movement in time, of beings as a whole be satisfied.[22][23]

    Digital vs. informational physics


    Not every informational approach to physics (or ontology) is necessarily digital. According to Luciano Floridi,[24] "informational structural realism" is a variant of structural realism that supports an ontological commitment to a world consisting of the totality of informational objects dynamically interacting with each other. Such informational objects are to be understood as constraining affordances.

    Digital ontology and pancomputationalism are also independent positions. In particular, John Wheeler advocated the former but was silent about the latter; see the quote in the preceding section.
    On the other hand, pancomputationalists like Lloyd (2006), who models the universe as a quantum computer, can still maintain an analogue or hybrid ontology; and informational ontologists like Sayre and Floridi embrace neither a digital ontology nor a pancomputationalist position.[25]

    Computational foundations


    Turing machines


    Theoretical computer science is founded on the Turing machine, an imaginary computing machine first described by Alan Turing in 1936. While mechanically simple, the Church-Turing thesis implies that a Turing machine can solve any "reasonable" problem. (In theoretical computer science, a problem is considered "solvable" if it can be solved in principle, namely in finite time, which is not necessarily a finite time that is of any value to humans.) A Turing machine therefore sets the practical "upper bound" on computational power, apart from the possibilities afforded by hypothetical hypercomputers.
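    The Turing machine described above is simple enough to simulate in a few lines. The following sketch is illustrative only (the simulator and its toy transition table are assumptions of this example, not part of the article): a machine that walks right along the tape, flipping every bit, and halts at the first blank.

    ```python
    # Minimal Turing machine simulator: a sparse tape, a head, and a
    # transition table mapping (state, symbol) -> (new_state, write, move).
    def run_turing_machine(transitions, tape, state="start", blank="_", max_steps=10_000):
        cells = dict(enumerate(tape))  # sparse tape: position -> symbol
        head = 0
        for _ in range(max_steps):
            if state == "halt":
                break
            symbol = cells.get(head, blank)
            state, cells[head], move = transitions[(state, symbol)]
            head += move
        return "".join(cells[i] for i in sorted(cells)).strip(blank)

    # Toy machine: flip every bit, halt on the blank.
    flip = {
        ("start", "0"): ("start", "1", +1),
        ("start", "1"): ("start", "0", +1),
        ("start", "_"): ("halt", "_", 0),
    }

    print(run_turing_machine(flip, "1011"))  # -> 0100
    ```

    Despite its mechanical simplicity, nothing essential changes as the transition table grows: the same loop suffices for any Turing machine.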

    Wolfram's principle of computational equivalence powerfully motivates the digital approach. This principle, if correct, means that everything can be computed by one essentially simple machine, the realization of a cellular automaton. This is one way of fulfilling a traditional goal of physics: finding simple laws and mechanisms for all of nature.
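    The "essentially simple machine" Wolfram has in mind is an elementary cellular automaton. As an illustration (a sketch, not the article's own material), the update rule for Rule 110, which is known to be Turing-complete, fits in one expression: each cell's next value is a bit of the rule number, indexed by its three-cell neighborhood.

    ```python
    # One synchronous update of a 1-D binary cellular automaton with
    # wraparound; the neighborhood (left, self, right) selects a bit of
    # the 8-bit rule number.
    def step(cells, rule=110):
        n = len(cells)
        return [
            (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
            for i in range(n)
        ]

    # Evolve a single live cell to see the familiar triangular pattern.
    row = [0] * 31
    row[15] = 1
    for _ in range(15):
        print("".join(".#"[c] for c in row))
        row = step(row)
    ```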

    Digital physics is falsifiable in that a less powerful class of computers cannot simulate a more powerful class. Therefore, if our universe is a gigantic simulation, that simulation is being run on a computer at least as powerful as a Turing machine. If humans succeed in building a hypercomputer, then a Turing machine cannot have the power required to simulate the universe.

    The Church–Turing (Deutsch) thesis


    The classic Church–Turing thesis claims that any computer as powerful as a Turing machine can, in principle, calculate anything that a human can calculate, given enough time. A stronger version, not attributable to Church or Turing,[26] claims that a universal Turing machine can compute anything any other Turing machine can compute - that it is a generalizable Turing machine. But the limits of practical computation are set by physics, not by theoretical computer science:

    "Turing did not show that his machines can solve any problem that can be solved 'by instructions, explicitly stated rules, or procedures', nor did he prove that the universal Turing machine 'can compute any function that any computer, with any architecture, can compute'. He proved that his universal machine can compute any function that any Turing machine can compute; and he put forward, and advanced philosophical arguments in support of, the thesis here called Turing's thesis. But a thesis concerning the extent of effective methods—which is to say, concerning the extent of procedures of a certain sort that a human being unaided by machinery is capable of carrying out—carries no implication concerning the extent of the procedures that machines are capable of carrying out, even machines acting in accordance with 'explicitly stated rules.' For among a machine's repertoire of atomic operations there may be those that no human being unaided by machinery can perform." [27]

    On the other hand, if two further conjectures are made, along the lines that:

    • hypercomputation always involves actual infinities;
    • there are no actual infinities in physics,

    the resulting compound principle does bring practical computation within Turing's limits.
    As David Deutsch puts it:

    "I can now state the physical version of the Church-Turing principle: 'Every finitely realizable physical system can be perfectly simulated by a universal model computing machine operating by finite means.' This formulation is both better defined and more physical than Turing's own way of expressing it."[28] (Emphasis added)

    This compound conjecture is sometimes called the "strong Church-Turing thesis" or the Church–Turing–Deutsch principle.



    Criticism


    The critics of digital physics—including physicists[citation needed] who work in quantum mechanics—object to it on several grounds.

    Physical symmetries are continuous


    One objection is that extant models of digital physics are incompatible[citation needed] with the existence of several continuous characters of physical symmetries, e.g., rotational symmetry, translational symmetry, Lorentz symmetry, and electroweak symmetry, all central to current physical theory.

    Proponents of digital physics claim that such continuous symmetries are only convenient (and very good) approximations of a discrete reality. For example, the reasoning leading to systems of natural units and the conclusion that the Planck length is a minimum meaningful unit of distance suggests that at some level space itself is quantized.[29]



    Locality


    Some argue[citation needed] that extant models of digital physics violate various postulates of quantum physics. For example, if these models are not grounded in Hilbert spaces and probabilities, they belong to the class of theories with local hidden variables that some deem ruled out experimentally using Bell's theorem. This criticism has two possible answers. First, any notion of locality in the digital model does not necessarily have to correspond to locality formulated in the usual way in the emergent spacetime. A concrete example of this case was recently given by Lee Smolin.[30] Another possibility is a well-known loophole in Bell's theorem known as superdeterminism (sometimes referred to as predeterminism).[31] In a completely deterministic model, the experimenter's decision to measure certain components of the spins is predetermined. Thus, the assumption that the experimenter could have decided to measure different components of the spins than he actually did is, strictly speaking, not true.

    Physical theory requires the continuum


    It has been argued[weasel words] that digital physics, grounded in the theory of finite state machines and hence discrete mathematics, cannot do justice to a physical theory whose mathematics requires the real numbers, which is the case for all physical theories having any credibility.

    But computers can manipulate and solve formulas describing real numbers using symbolic computation, thus avoiding the need to approximate real numbers by using an infinite number of digits.
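    A small sketch of the point, using only the Python standard library (the surd representation is an illustrative assumption of this example, not something the article specifies): quantities are manipulated exactly, with no digit-by-digit approximation anywhere.

    ```python
    # Exact arithmetic without floating point: rationals via fractions,
    # plus a tiny "symbolic" square root represented as (a, n) for a*sqrt(n).
    from fractions import Fraction

    # One third times three is exactly one; no rounding error occurs.
    assert Fraction(1, 3) * 3 == 1

    def square_surd(a, n):
        """(a*sqrt(n))**2 == a*a*n, computed purely with integers/rationals."""
        return Fraction(a) * Fraction(a) * n

    print(square_surd(Fraction(1, 2), 2))  # (sqrt(2)/2)**2 -> 1/2, exactly
    ```

    Full computer algebra systems carry the same idea much further, rewriting formulas over the reals symbolically rather than numerically.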

    Before symbolic computation, a number—in particular a real number, one with an infinite number of digits—was said to be computable if a Turing machine could be made to spit out its digits endlessly. In other words, there is no "last digit". But this sits uncomfortably with any proposal that the universe is the output of a virtual-reality exercise carried out in real time (or any plausible kind of time). Known physical laws (including quantum mechanics and its continuous spectra) are very much infused with real numbers and the mathematics of the continuum.
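    To make the classical notion concrete, here is a sketch (an illustration, not drawn from the article) of such a digit-spitting procedure: the digit-by-digit square-root algorithm emits the decimal digits of sqrt(2) forever, using only integer arithmetic. There is no last digit; the generator is simply stopped after finitely many.

    ```python
    from itertools import islice

    def sqrt_digits(n):
        """Yield the decimal digits of sqrt(n) forever, for a positive integer n."""
        s = str(n)
        if len(s) % 2:
            s = "0" + s
        pairs = [int(s[i:i + 2]) for i in range(0, len(s), 2)]
        remainder, result, i = 0, 0, 0
        while True:
            # Bring down the next pair of digits (zeros once n is exhausted).
            remainder = remainder * 100 + (pairs[i] if i < len(pairs) else 0)
            i += 1
            # Largest digit d with (20*result + d) * d <= remainder.
            digit = 0
            while (20 * result + digit + 1) * (digit + 1) <= remainder:
                digit += 1
            remainder -= (20 * result + digit) * digit
            result = result * 10 + digit
            yield digit

    print("".join(str(d) for d in islice(sqrt_digits(2), 10)))  # -> 1414213562
    ```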

    "So ordinary computational descriptions do not have a cardinality of states and state space trajectories that is sufficient for them to map onto ordinary mathematical descriptions of natural systems. Thus, from the point of view of strict mathematical description, the thesis that everything is a computing system in this second sense cannot be supported".[32]

    For his part, David Deutsch generally takes a "multiverse" view of the question of continuous vs. discrete. In short, he thinks that “within each universe all observable quantities are discrete, but the multiverse as a whole is a continuum. When the equations of quantum theory describe a continuous but not-directly-observable transition between two values of a discrete quantity, what they are telling us is that the transition does not take place entirely within one universe. So perhaps the price of continuous motion is not an infinity of consecutive actions, but an infinity of concurrent actions taking place across the multiverse.” ("The Discrete and the Continuous", January 2001; an abridged version appeared in The Times Higher Education Supplement.)

    References


    1. ^ Jaynes, E. T., 1957, "Information Theory and Statistical Mechanics," Phys. Rev 106: 620.
      Jaynes, E. T., 1957, "Information Theory and Statistical Mechanics II," Phys. Rev. 108: 171.
    2. ^ Jaynes, E. T., 1990, "Probability Theory as Logic," in Fougere, P.F., ed., Maximum-Entropy and Bayesian Methods. Boston: Kluwer.
    3. ^ See Fredkin's Digital Philosophy web site.
    4. ^ A New Kind of Science website. Reviews of ANKS.
    5. ^ Schmidhuber, J., "Computer Universes and an Algorithmic Theory of Everything."
    6. ^ G. 't Hooft, 1999, "Quantum Gravity as a Dissipative Deterministic System," Class. Quant. Grav. 16: 3263-79.
    7. ^ Lloyd, S., "The Computational Universe: Quantum gravity from quantum computation."
    8. ^ Zizzi, Paola, "Spacetime at the Planck Scale: The Quantum Computer View."
    9. ^ Zuse, Konrad, 1967, Elektronische Datenverarbeitung vol 8., pages 336-344
    10. ^ Zizzi, Paola, "A Minimal Model for Quantum Gravity."
    11. ^ Zizzi, Paola, "Computability at the Planck Scale."
    12. ^ Marzuoli, A. and Rasetti, M., 2002, "Spin Network Quantum Simulator," Phys. Lett. A306, 79-87.
    13. ^ Marzuoli, A. and Rasetti, M., 2005, "Computing Spin Networks," Annals of Physics 318: 345-407.
    14. ^ Girelli, F.; Livine, E. R., 2005, "[1]" Class. Quant. Grav. 22: 3295-3314.
    15. ^ von Weizsäcker, Carl Friedrich (1980). The Unity of Nature. New York: Farrar, Straus, and Giroux.
    16. ^ von Weizsäcker, Carl Friedrich (1985) (in German). Aufbau der Physik [The Structure of Physics]. Munich. ISBN 3-446-14142-1.
    17. ^ von Weizsäcker, Carl Friedrich (1992) (in German). Zeit und Wissen.
    18. ^ Papers on pancomputationalism
    19. ^ Chalmers, David. J., 1995, "Facing up to the Hard Problem of Consciousness," Journal of Consciousness Studies 2(3): 200-19. This paper cites John A. Wheeler, 1990, "Information, physics, quantum: The search for links" in W. Zurek (ed.) Complexity, Entropy, and the Physics of Information. Redwood City, CA: Addison-Wesley. Also see Chalmers, D., 1996. The Conscious Mind. Oxford Univ. Press.
    20. ^ Langan, Christopher M., 2002, "The Cognitive-Theoretic Model of the Universe: A New Kind of Reality Theory, pg. 7" Progress in Complexity, Information and Design
    21. ^ Wheeler, John Archibald, 1986, "Hermann Weyl and the Unity of Knowledge"
    22. ^ Eldred, Michael, 2009, 'Postscript 2: On quantum physics' assault on time'
    23. ^ Eldred, Michael, 2009, The Digital Cast of Being: Metaphysics, Mathematics, Cartesianism, Cybernetics, Capitalism, Communication ontos, Frankfurt 2009 137 pp. ISBN 978-3-86838-045-3
    24. ^ Floridi, L., 2004, "Informational Realism," in Weckert, J., and Al-Saggaf, Y, eds., Computing and Philosophy Conference, vol. 37."
    25. ^ See Floridi talk on Informational Nature of Reality, abstract at the E-CAP conference 2006.
    26. ^ B. Jack Copeland, Computation in Luciano Floridi (ed.), The Blackwell guide to the philosophy of computing and information, Wiley-Blackwell, 2004, ISBN 0-631-22919-1, pp. 10-15
    27. ^ Stanford Encyclopedia of Philosophy: "The Church-Turing thesis" -- by B. Jack Copeland.
    28. ^ David Deutsch, "Quantum Theory, the Church-Turing Principle and the Universal Quantum Computer."
    29. ^ John A. Wheeler, 1990, "Information, physics, quantum: The search for links" in W. Zurek (ed.) Complexity, Entropy, and the Physics of Information. Redwood City, CA: Addison-Wesley.
    30. ^ L. Smolin, "Matrix models as non-local hidden variables theories."
    31. ^ J. S. Bell, 1981, "Bertlmann's socks and the nature of reality," Journal de Physique 42 C2: 41-61.
    32. ^ Piccinini, Gualtiero, 2007, "Computational Modelling vs. Computational Explanation: Is Everything a Turing Machine, and Does It Matter to the Philosophy of Mind?" Australasian Journal of Philosophy 85(1): 93-115.

