The Mathematical Structure of the Loop Logic – Is that How God Thinks?
By Dr. Yeshayahu Eisenberg
I want to know God's thoughts; the rest are details.
Einstein
To "understand the superorder of the nonlinear dynamic that connects everything to everything else," to "know the lawfulness of recreating specific instances by targeting specific meaningful constellations from the meaningless everything-with-everything connection," to "know how to build the lawfulness and the mechanism that can make choices, and then you will have real artificial intelligence," as was conveyed in "SHET"'s Millennium session, is a big leap with respect to our present scientific understanding of reality. Such a leap requires a new mathematics to describe this nonlinear logic and to define the terms within which it is realized. Conventional logic is the fundament of computer science and the mechanism by which we describe the world of cyberspace and virtual reality, but the new logic is the fundament of all processes in nature, including the process of perceiving nature itself. If creation is God's thoughts, then this logic is how God thinks.
The result of the research – a new mathematical logic, developed hand in hand with the general philosophical and logical frameworks – was a big leap toward achieving these ambitious targets. This new mathematical logic emerges as a language that could unify all aspects of human endeavor, yielding a new approach to science and technology. Another important result is the demystification of the act of heralding and of the "ADAM KADMON" principle; both thus become plausible derivations of the theory.
1. The Uncertainty of Certainty
As far as the laws of mathematics refer to reality, they are not certain; as far as they are certain, they do not refer to reality.
Einstein
Unlike conventional logic, which strives to achieve "Consistency" to the absolute exclusion of any indefiniteness, one of the basic notions of the new logic is the aspiration toward consistency – a consistency that includes the "Indefinite" – rather than the assumption or demand of consistency. Unlike conventional mathematics, which strives for exact values, the new mathematics is a mathematics of almost values. So, on the one hand, there is always some indefiniteness in what a value is, but on the other, this mathematics includes the process that defines that value. These are very important points to note. When consistency includes the process that defines that consistency, and when a mathematical value includes the process that defines that value, then the most we can hope for is the aspiration to achieve consistency, the aspiration to achieve an exact value. Consistency, exact values, exact periodicity, etc., are meaningful notions only if posited as external to our process of querying them, which is a truncated view; whereas if we want a system that can attribute meaning to itself, and through that to anything external to it, we cannot adopt a truncated view.
"Perception" is a "Stabilization" of the process of definition. "Experience" is the process of definition on its way toward stabilization. When we say "stabilization of a process," we mean the creation of repetitive patterns (a periodic process is an example of such a pattern: the repetition of the same numerical value with a certain periodicity). However, in the world of almost values, there can only be almost stabilized processes, an almost repetition of the same numerical value. The discrepancy between the almost values of a quasi-stabilized process is infinitesimal (or at least very small with respect to the main course of the process; infinitesimal means going toward the limits of the continuum, and since – as will be shown – the processes we are dealing with are discrete, we have coined the name infinitesimal tail for these small discrepancies). Nevertheless, if the resolution of our measuring tools is too low to detect the infinitesimal tails, then practically speaking we can regard the process as having stabilized. However, zooming in on the process – taking the viewpoint of the parameter that is the difference between two almost equal values (the viewpoint of the infinitesimal tails) – we see chaotic or even random behavior. From that point of view, the process has not yet stabilized, but rather is still on its way toward stabilization. Even if we succeed in stabilizing (or rather, almost stabilizing) the above-mentioned parameter, we will still find erratic behavior on the next zoom-in level, and so on. This is like eating one's cake and having it too: we have both the viewpoint from which the process became stabilized and the one from which the process is still on its way toward stabilization. This hybrid behavior is an indication of the relationship between perception and experience, which is the loop that can attribute meaning.
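This hybrid of almost-stabilization and erratic infinitesimal tails can be illustrated numerically. The sketch below is only a stand-in: it uses the familiar logistic map, not the mapping of this work (whose construction is deferred), and the parameter values r = 3.5 and r = 3.9 are illustrative choices.

```python
# A toy illustration only: the logistic map stands in for the (unspecified)
# mapping of this work; r = 3.5 and r = 3.9 are illustrative choices.
def iterate(r, x0=0.4, n=2000):
    """Iterate the logistic map x -> r*x*(1-x) and return the orbit."""
    x, orbit = x0, []
    for _ in range(n):
        x = r * x * (1 - x)
        orbit.append(x)
    return orbit

def tails(orbit, period):
    """The 'infinitesimal tails': discrepancies between almost equal
    values one period apart."""
    return [abs(orbit[i + period] - orbit[i])
            for i in range(len(orbit) - period)]

almost_stable = iterate(r=3.5)   # settles on an (almost) period-4 pattern
chaotic = iterate(r=3.9)         # never settles

# Coarse resolution: the r = 3.5 process looks stabilized (tiny tails);
# zooming in on the r = 3.9 process, the tails stay large and erratic.
print(max(tails(almost_stable, 4)[-100:]))
print(max(tails(chaotic, 4)[-100:]))
```

At r = 3.5 the discrepancies one period apart shrink below any practical resolution of measurement, so the process counts as (almost) stabilized; at r = 3.9 the same tails stay large, and the process is still "on its way toward stabilization."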
Since meaning is also consistency, consistency should be connected to stabilization. This connection will be elucidated in the following pages.
One might look at a surface and see a smooth aspect. However, looking at the same surface through a magnifying glass might reveal a rough aspect. So, is the surface smooth or rough? It depends on our point of view; it depends on how we measure the surface.
The small discrepancy between equivalent values is the realization of the notion of "Uncertainty of sameness". This in turn encodes the probable evolutions of the processes. Indeed, as we will see, interacting with the parameters of the difference between two almost same values (zooming in) can destabilize the system, and then stabilize it in a new pattern, a new perception. Zooming in reveals more and more parameters that become control parameters of the system when dynamically activated. The dynamic zooming-in process reveals a fractal kind of evolution. One should notice, though, that when activated, those parameters destabilize the system and restabilize it in a new process. This in turn can cause the infinitesimal chaotic parameters to grow and become stabilized patterns (well, almost) – to become one of the main processes in the system, which then creates new infinitesimal chaotic parameters when one zooms in on this newborn process.
The question that arises is: What are the generators of such processes that unfold an extremely rich mold, which can be looked upon from different points of view, and consequently, different aspects come to the foreground? Such metamorphoses of processes and patterns that take place when a change of viewpoint occurs are a reflection of the tremendous richness of that mold and its ability to resonate with any phenomenon.
The generating principles that create these processes together with their control "Parameters" (which, by the way, are part of the generating principles themselves) are the heart of this work in its many aspects. I will refrain from boring you with the tedious formal mathematical details of the construction of these generating principles. Rather, I will share the general idea of how the generating principles were arrived at, and I'll use their attributes to present their implications (which were partially presented previously).
Another important point to note: not all processes in nature are periodic. However, perception is always a recurrent pattern (a periodic process). Perception means recreating the same event time after time, the same segment of a phenomenological process, etc. This implies that the perception of a nonperiodic process is periodic – a recurrent pattern. Therefore, the generators I'm speaking about always generate processes that – when stabilized – become almost periodic patterns, even if these stabilizations are repetitions of almost the same segments of nonperiodic phenomena.
If the perception of an event is the event, if the perception of a process is the process, if, for something to gain meaning, the logical structure – which gives it "Meaning" – should be brought forth, then we could say that, in such a framework, the processes unfold and enfold the logical structure that gives them meaning.
2. Creativity – the Hallmark of Creation
Once upon a time, there was a brilliant student doing his PhD in physics who was given a problem for his thesis by his supervisor. The student looked at the problem for a while, considering it very deeply and from different angles. Eventually, he mumbled under his breath, "Dear professor, how do you want me to solve this problem if I don't know the answer to it?" How indeed?
There are problems we can solve by retrieving the answers from our memory. There are problems that can be solved by deduction, by utilizing known methods – solving simple mathematical equations, say, or using a map to find our way over unfamiliar ground. There are problems for which some trial and error might work. And then there are problems that offer not the slightest clue as to the method that could be used to obtain a solution, problems that no one has ever solved previously – nor anything even similar to them. Our poor student got an extremely difficult problem of the latter kind: a problem at the frontiers of scientific research, a problem to which the most brilliant scientists would like to find the answer, a problem no one has the slightest idea how to approach. So indeed, facing such a problem, you either know the solution or you are in deep shit.
Yes, I know, you will say: the solution to his problem is creativity. But if it were so simple to be creative to such an extent, if creativity could be achieved in three easy steps, then everyone would become an Einstein. Yet Einstein-like geniuses do not pass frequently even through the corridors of the leading scientific institutes. Since creative solutions are not the result of deductive processes, but something that descends upon us within a certain framework, it seems that an essential part of the art of creativity is how to define that framework.
This framework is a closed loop, and the core of that loop is the cognitive processes taking place in our brain. Input parameters might trigger the creation of the framework of our cognitive processes to eventually collapse into the desired solution of the problem, in which case we could say that the problem fixed the boundary conditions for the cognitive process.
What are these input parameters? Initially, the problem under consideration. The problem triggers the creation of the framework of the cognitive processes to produce (collapse into) output parameters: associations, wonderings, ponderings, partial understandings, further questions, etc. The loop is closed when the output parameters (all or some) become the input parameters. The new input parameters retrigger the cognitive processes to create new output parameters, which become the new input parameters, and so on. The aim of this operation is to find a creative solution to a given problem, and the operation either succeeds or fails to accomplish that goal. Succeeding means that the output parameters become the solution to the problem. Until the output parameters become the solution, they keep changing. However, once the solution to the problem is accomplished, the overall loop process recreates the same output parameters time after time. In other words, the process becomes stabilized. As we shall see, such a process – in which the system regulates itself until it stabilizes in whatever form its boundary conditions constrain it to – is the famous process of definition, the fundament of the Holophanic loop logic. Furthermore, in such systems, stabilization will become synonymous with solution, definition, or even perception.
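The loop just described – output parameters fed back as input parameters until the process recreates the same outputs time after time – can be sketched in a few lines. This is a minimal toy, not the Holophanic mapping itself; `math.cos` is an arbitrary stand-in for the black box of cognitive processes.

```python
import math

def feedback_loop(black_box, seed, tol=1e-12, max_steps=10_000):
    """Reinsert each output as the next input until the loop recreates
    (almost) the same output -- stabilization as 'solution'."""
    x = seed
    for _ in range(max_steps):
        y = black_box(x)          # the box collapses input into output
        if abs(y - x) < tol:      # same output time after time: stabilized
            return y
        x = y                     # output parameters become input parameters
    return None                   # the loop failed to stabilize

# math.cos is an arbitrary stand-in for the cognitive processes: the loop
# stabilizes on the solution of x = cos(x), about 0.739.
solution = feedback_loop(math.cos, seed=1.0)
print(solution)
```

The stabilized output is, at once, the solution of the equation x = cos(x): in this toy, as in the text, solution and stabilization coincide.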
The cognitive processes do not collapse into the desired solution accidentally; rather, they do so according to the underlying generating principles and lawfulness. This lawfulness of creativity is universal in the sense that it underlies the entire phenomenological world and can – in principle – be used to solve almost any problem. This lawfulness is a realization of the loop logic.
Generating principles are those rules that allow a process to be that process. The lawfulness is the control mechanism of the generating principles, activated by the parameters which are extracted from the processes generated by those generating principles (i.e., the difference of two almost equal values). Yes, the loop again. Although the distinction between the generating of the process and the control mechanism is somewhat artificial, these designations will serve us well, both as a means to think with and also as a pedagogical approach.
Thus, we could divide the problem of understanding the lawfulness into two parts. First, we can define the generating mechanism, which can create a process that has the potential to become any process. (It can be demonstrated that any point of the mapping can acquire any numerical value we wish, depending on the boundary conditions imposed on it. Since any process is a collection of consecutive points created by this mapping, the mapping can generate processes of any kind or shape; see Figure A5.) And second, we can formulate the mechanism through which we can control the process, to make of it a specific process that describes certain phenomena or that is the solution to a specific problem.
This lawfulness is a nonlinear mapping, mapping parameters into parameters; "Significance" into significance. How the mapping takes place is the lawfulness ("Structure"); what is mapped are parameters (significances). The "structure" is not a phenomenological lawfulness, but rather, a logical structure, the famous "Isomorphism". Nonetheless, structure and significance are interwoven: structure creates significance, while "parameter"s are the means by which the logical structure is being realized. The loop between structure and significance is actually the Loop of Creation. Or as SHET put it, "The structure is the dynamic between structure and significance (parameters) that preserves the structure." What is preserved is the isomorphism and not the specific realization. Nevertheless, the loop logic implies that any phenomenology or problem can be modeled in terms of the different realizations of the logical structure.
Read more about The Mapping...
The flexibility of the structure of the generating principles of the mapping means that the dynamic generated by it allows the system to synchronize with any problem. The art of solving problems becomes the art of finding the boundary conditions that project the problem onto the mapping. There is a truism to the effect that the most important thing is to ask the right questions, which probably means that reformulating a problem in an appropriate way will straightforwardly imply the answer. The tendency of most of us to adopt this idea is based precisely on our fundamental experience of being such a structure ourselves. The appropriate reformulation is to become the boundary conditions of our cognitive processes, which can collapse into the desired solution. This is not unlike creating our perception. What we detect through our senses sets the boundary conditions for our internal processes to stabilize accordingly. The stabilization on a recurrent pattern then is perception. Sensing something differently changes the boundary conditions and destabilizes the internal processes, which consequently stabilize in a new recurrent pattern, which is the new perception. This entire procedure generates processes perpetuated by the specific boundary conditions, which define a specific viewpoint.
Any question, problem, or data collected from phenomenological experiments points in the direction of the unknown. We don't know the answer when the question is posed. We don't know, just from having the data, the phenomenological model that will give a unified explanation of the collected data. The unknown is our target space. The problem under consideration sets the boundary conditions, which are in turn part of the definition of the specific process aimed at stabilizing into the solution of the problem. Will it necessarily stabilize into the desired solution? It depends. Not every equation has a solution, and not every problem can be solved. This is quite obvious once it is clear that the process of definition will never stabilize into a solution that does not exist. The general hypothesis is that only those phenomena for which there are boundary conditions consistent with the processes that define the phenomena (that can stabilize into these phenomena) can exist in nature.
Read more about the universality of the mapping, aspiration towards consistency, order and stabilization – The Logical Conservation Laws...
3. The Black Box as a Logical Gate
Since the mapping can generate any number of parameters (zooming in), and since any segment of the mapping can create the input parameters for the generating of the next segment (which in turn generates as output the parameters that become the input parameters for the generating of the segment after that), we can consider the mapping to be the realization of the logical structure as a black box. It gets input parameters and creates output parameters. The term black box is quite suitable, since what matters is not so much what is going on inside the black box, but rather the structure of its interactions with itself and with anything external to it – in other words, its function. "The control mechanism is the structure. You could look upon it as a black box: whatever goes in and whatever comes out may change, but the black box remains – the black box is the structure. Or to be more precise, the structure is the dynamic between significance (parameters) and structure that preserves the structure. Lawfulness is lawfulness of structure."
The black box consists of processes that are themselves black boxes (self-similarity). Using the zoom-in technique, any segment can be divided into subsegments, sub-subsegments, and so on. Once so divided, the dynamic of a segment is described in terms of the nonlinear dynamics of its subsegments. How a given segment is constructed from its constituent subsegments is not unique: the same pattern can be the result of different generating principles. Differently constituted segments that nevertheless embody the same pattern will exhibit different evolutionary possibilities (meaning that the patterns are identical, but the structure of their infinitesimal tails is different).
To summarize: there are endless ways to generate a given segment (pattern). First, by defining a generating principle that creates the pattern as a whole, without assuming it consists of substructures. Second, by generating it through its constituent elements, which are not unique. The multitude of generating principles is unified within one worldview when we formulate the segment as consisting of subsegments together with the inherent dynamic transformations of the system, from within the system, between the different possibilities of how the overall segment is constructed (meaning, transforming the system from being generated by one generating principle into being generated by another). This in turn introduces a system wherein the zoom-in and the fractal means of evolution are attributes not merely of the process (significance) but also of the lawfulness itself. This enables the system to change itself from within the system. And since this same consideration holds for any subsegment or sub-subsegment, it also explains why even the simplest object is a "Complexity" that can be divided infinitely into its constituents, while any such part of the segment is the whole when looked at from the point of view of its function (its lawfulness).
Another SHET session unfolds this very same idea in slightly different terms. It is of interest to mention, however, that this session was given long before we had any realization of the mapping. It might seem that the session merely enhances and summarizes what we already know. In reality, however, it was the other way around. Like so many other sessions, this one created the framework for thinking differently and for trying to interpret the written words, which eventually collapsed into their formal description through the mathematical formulation of the mapping. My experience with SHET has taught me, however, that in these sessions there are always additional layers to be found.
"Indeed, different generating principles can create the same pattern with different evolutionary possibilities. The transitions between the different generating principles can be achieved by infinitesimal gauge transformations. That is, you can travel between different frameworks of evolution patterns in closed circuits: periodic processes, strange attractor-like processes, chaotic or infinite evolution processes, which describe different loops or stabilizations. Consider this a very ordered chaotic labyrinth in which you can comfortably control your direction by controlling which generating principle stabilized a given pattern, by utilizing the right infinitesimal gauge transformation.
"The right infinitesimal gauge transformation is like a rotation around an axis. This axis is usually considered to be a given, unchanging magnitude that conserves the invariance of the theory. In this case, however, this unchanging magnitude is indefinite. How can a magnitude be indefinite? By not being a real physical magnitude, but a logical principle, which is isomorphous to all the generating principles. This logical principle is the 'if' so and so, 'then' something that is almost its opposite, like a second-degree paradox, a dynamic harmony.
"This preserves the conformal invariance of the whole, whereas each generating principle is different. That is, the viewpoint that defines, which collapses something to be a physical magnitude, is that looking that establishes different physical magnitudes. As the isomorphous logical structure is invariant, and the whole is the generalized isomorphous inner logic that remains invariant and indefinite, it does not matter how various the collapsed expressions are. Thus, each viewpoint in itself is the whole. However, while interacting with other viewpoints, it is different.
"The infinitesimal gauge transformations can be performed through and by the infinitesimal tails, which render the pattern that generated them meaningful. They always act backward so there should be something forward.
"When you assert that the invariant magnitude is a logical principle (which is an indefinite physical magnitude), then you don't confine yourself to creating a class that includes all classes, you don't create an absolute magnitude that your theory has to obey. Instead, any physical magnitude becomes like any other, which you can control and manipulate. Your question should be then: Why are certain physical magnitudes that have been measured phenomenologically more meaningful than others? Add to this question another one: Could there be other consistent worlds with other such physical magnitudes? Have fun."
The black box can interact with itself and with systems external to it. It interacts with itself when some of its output parameters are reinserted as input parameters, or more precisely, when the input parameters are functions of its output parameters. When an input parameter becomes a function of one or more output parameters, a loop is closed. There might be any number of closed loops. The way the loops are closed defines the boundary conditions of the dynamic (again, significance) generated by the black box. In principle, any kind of problem or phenomenon can be translated into such boundary conditions. Once such a translation is performed, the dynamic generated by the black box stabilizes some of the output parameters as the solution of a given problem or as the fields describing the dynamics of certain phenomena. In more prosaic terms, all that is needed in such a system is to rephrase the question (problem) in such a way that it can be "understood" by the system, which in turn prompts the system to go through a process that eventually leads to the desired answer (solution).
The main universal attribute of the black box is its aspiration to stabilize whatever it interacts with, under whatever boundary conditions are imposed on it. As we shall see, when the "tendency to achieve stabilization" is replaced by the "tendency to reestablish consistency" (in our framework, the first assertion is a realization of the second), it will become even more apparent why we are speaking about logical conservation laws.
The black box is the universal lawfulness, THE logical conservation law. Different profiles or cuts are the different boundary conditions, different realizations of THE logical conservation law expressed as a specific logical conservation law.
"There you can solve the problem or fulfill the desire by using the cut (boundary conditions) that precisely controls the system in such fashion that satisfies the requirements. That's why the different profiles are different worlds, wherein each world is governed by a different conservation law (meaning, different realizations of THE conservation law). Think: What is conserved in relativity? What is conserved in quantum theory? (Once you consider several conserved elements, try to generalize them into one idea to get the picture.) The power of this worldview is tremendous: that's why different profiles might be inconsistent when you try to conserve all parameters of the cuts (which is trying to conserve significance). Nevertheless, physical world action is a choice, and the choice actually is the profile: which conservation law you choose to abide by. Different cuts might be incompatible, one with the other; nevertheless, all of them together are that meaningless wholeness, which is the potential of any such cut "retromorphous"ly."
Read more about The Mapping in Action...
Read more about how the structure replicates itself and its applications  Blessed Noise...
4. Creation and the ADAM KADMON Principle
The second law of thermodynamics, also referred to as the law of entropy, states that a system will almost always be found either in the state of maximum disorder or moving towards it. Entropy is a measure of disorder, and according to the second law, in any macroscopic process the total amount of entropy is always increasing. Since existence is order (or rather, the aspiration to achieve order), this tendency in nature toward macroscopic uniformity and microscopic disorder should have a counter-tendency to order if one expects existence and life, which give creation meaning, not to be a momentary fluke. At present, there is no known universal counterpart to entropy. Scientists in the field of complexity feel the need for such a universal counterpart; however, their wish has not been fulfilled by any discoveries in their research.
"'I'm of the school of thought that life and organization are inexorable,' he (Los Alamos physicist Doyne Farmer) says, 'just as inexorable as the increase in entropy. They just seem more fluky because they proceed in fits and starts, and they build on themselves. Life is a reflection of a much more general phenomenon that I'd like to believe is described by some counterpart of the second law of thermodynamics – some law that would describe the tendency of matter to organize itself, and that would predict the general properties of organization we'd expect to see in the universe.'
"Farmer has no clear idea of what this new second law would look like. ‘If we knew that,' he says, ‘we'd have a big clue how to get there. At this point it's purely speculative, something that intuition suggests when you stand back and stroke your beard and contemplate.' In fact, he has no idea whether it would be one law, or several. What he does know, however, is that people have recently been finding so many hints about things like emergence, adaptation, and the edge of chaos that they can begin to sketch at least a broad outline of what this hypothetical new second law might be like."[1]
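The tendency toward disorder that the second law describes can be seen in a toy computation. Shannon entropy of a discrete distribution stands in here for thermodynamic entropy, and the ring of four cells, the mixing fraction, and the step count are all illustrative choices, not part of the theory.

```python
import math

def shannon_entropy(p):
    """Shannon entropy (bits) of a discrete distribution -- a simple
    stand-in for thermodynamic entropy in this toy."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def mix(p, a=0.1):
    """One diffusion step on a ring: each cell shares a fraction a of its
    content with its two neighbours (a toy 'macroscopic process')."""
    n = len(p)
    return [(1 - a) * p[i] + a * (p[i - 1] + p[(i + 1) % n]) / 2
            for i in range(n)]

p = [1.0, 0.0, 0.0, 0.0]      # perfectly ordered: everything in one cell
entropies = [shannon_entropy(p)]
for _ in range(50):
    p = mix(p)
    entropies.append(shannon_entropy(p))

# Entropy never decreases under mixing and approaches log2(4) = 2 bits,
# the maximally disordered (uniform) state.
print(entropies[0], entropies[-1])
```

What this toy lacks, of course, is precisely the counter-tendency Farmer is asking for: nothing in the mixing dynamic ever rebuilds the ordered state.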
The law at the core of the black box – the aspiration to achieve consistency, stabilization, organization and order – is, in our work, a serious candidate to be the universal counterpart of the law of entropy.
Any existence is the result of the competition between these two tendencies, toward order and toward disorder (not unlike the basic drives in Freud's worldview, Eros and Thanatos: love and death). Total disorder is randomness, while total order is exact symmetry. Both extremes are nonexistence. Existence is in between. One can find exact periodicity, exact symmetry, etc. only in mathematical structures. In nature, we will merely find almost symmetry, almost periodicity, etc. This uncertainty in what certain phenomenological values are is not merely an unavoidable occurrence that we have to live with, an inconvenience, but the very means whereby we can control and manipulate any system or object. The probable evolutions of a system are encoded in this uncertainty, and the output parameters of the black box can be extracted from it. Remember that anything could be a black box, if only we knew how to look at it, and any simple process could be the result of the stabilization of a complex mapping. Such a structure, wherein the part, the object, includes the whole structure and the internal space that enables us to activate the lawfulness of recreating specific instances by targeting specific meaningful constellations from the meaningless everything-with-everything connection, is indeed a new paradigm.
That the stabilization of a certain profile from the overall amorphous structure is due to boundary conditions, which are the result of the stabilization of another profile of that same structure, and so on, is the ADAM KADMON principle in action. All creation is the result of such processes. The act of "Heralding" is but the triggering of this indefinite process by means of the boundary conditions (questions) that stabilize one of its profiles and thus yield the desired answer. This explains how the act of heralding and its result, the logical structure, are one and the same structure. Hence ADAM KADMON, the infrastructure that teaches itself, gains extraordinary meaning.
[1] M. Mitchell Waldrop, Complexity, p. 288.
