

Langan CTMU

Name: Anonymous 2018-07-12 10:16

http://www.megafoundation.org/CTMU/Articles/Langan_CTMU_092902.pdf
Abstract: Inasmuch as science is observational or perceptual in nature, the goal of providing a scientific
model and mechanism for the evolution of complex systems ultimately requires a supporting theory of reality
of which perception itself is the model (or theory-to-universe mapping). Where information is the abstract
currency of perception, such a theory must incorporate the theory of information while extending the
information concept to incorporate reflexive self-processing in order to achieve an intrinsic (self-contained)
description of reality. This extension is associated with a limiting formulation of model theory identifying
mental and physical reality, resulting in a reflexively self-generating, self-modeling theory of reality identical
to its universe on the syntactic level. By the nature of its derivation, this theory, the Cognitive-Theoretic
Model of the Universe or CTMU, can be regarded as a supertautological reality-theoretic extension of logic.
Uniting the theory of reality with an advanced form of computational language theory, the CTMU describes
reality as a Self-Configuring Self-Processing Language or SCSPL, a reflexive intrinsic language
characterized not only by self-reference and recursive self-definition, but full self-configuration and self-
execution (reflexive read-write functionality). SCSPL reality embodies a dual-aspect monism consisting of
infocognition, self-transducing information residing in self-recognizing SCSPL elements called syntactic
operators. The CTMU identifies itself with the structure of these operators and thus with the distributive
syntax of its self-modeling SCSPL universe, including the reflexive grammar by which the universe refines
itself from unbound telesis or UBT, a primordial realm of infocognitive potential free of informational
constraint. Under the guidance of a limiting (intrinsic) form of anthropic principle called the Telic Principle,
SCSPL evolves by telic recursion, jointly configuring syntax and state while maximizing a generalized self-
selection parameter and adjusting on the fly to freely-changing internal conditions. SCSPL relates space,
time and object by means of conspansive duality and conspansion, an SCSPL-grammatical process
featuring an alternation between dual phases of existence associated with design and actualization and
related to the familiar wave-particle duality of quantum mechanics. By distributing the design phase of
reality over the actualization phase, conspansive spacetime also provides a distributed mechanism for
Intelligent Design, adjoining to the restrictive principle of natural selection a basic means of generating
information and complexity. Addressing physical evolution on not only the biological but cosmic level, the
CTMU addresses the most evident deficiencies and paradoxes associated with conventional discrete and
continuum models of reality, including temporal directionality and accelerating cosmic expansion, while
preserving virtually all of the major benefits of current scientific and mathematical paradigms.

Name: Anonymous 2018-07-12 10:20

Introduction
Among the most exciting recent developments in science are Complexity Theory, the theory of
self-organizing systems, and the modern incarnation of Intelligent Design Theory, which
investigates the deep relationship between self-organization and evolutionary biology in a
scientific context not preemptively closed to teleological causation. Bucking the traditional
physical reductionism of the hard sciences, complexity theory has given rise to a new trend,
informational reductionism, which holds that the basis of reality is not matter and energy, but
information. Unfortunately, this new form of reductionism is as problematic as the old one. As
mathematician David Berlinski writes regarding the material and informational aspects of DNA:
“We quite know what DNA is: it is a macromolecule and so a material object. We quite know
what it achieves: apparently everything. Are the two sides of this equation in balance?” More
generally, Berlinski observes that since the information embodied in a string of DNA or protein
cannot affect the material dynamic of reality without being read by a material transducer,
information is meaningless without matter. 1
The relationship between physical and informational reductionism is a telling one, for it directly
mirrors Cartesian mind-matter dualism, the source of several centuries of philosophical and
scientific controversy regarding the nature of deep reality. 2 As long as matter and information
remain separate, with specialists treating one as primary while tacitly relegating the other to
secondary status, dualism remains in effect. To this extent, history is merely repeating itself;
where mind and matter once vied with each other for primary status, concrete matter now vies
with abstract information abstractly representing matter and its extended relationships. But while
the formal abstractness and concrete descriptiveness of information seem to make it a worthy
compromise between mind and matter, Berlinski’s comment demonstrates its inadequacy as a
conceptual substitute. What is now required is thus what has been required all along: a
conceptual framework in which the relationship between mind and matter, cognition and
information, is made explicit. This framework must not only permit the completion of the gradual
ongoing dissolution of the Cartesian mind-matter divider, but the construction of a footworthy
logical bridge across the resulting explanatory gap.
Mathematically, the theoretical framework of Intelligent Design consists of certain definitive
principles governing the application of complexity and probability to the analysis of two key
attributes of evolutionary phenomena, irreducible complexity 3 and specified complexity. 4 On one
hand, because the mathematics of probability must be causally interpreted to be scientifically
meaningful, and because probabilities are therefore expressly relativized to specific causal
scenarios, it is difficult to assign definite probabilities to evolutionary states in any model not
supporting the detailed reconstruction and analysis of specific causal pathways. On the other
hand, positing the “absolute improbability” of an evolutionary state ultimately entails the
specification of an absolute (intrinsic global) model with respect to which absolute probabilistic
deviations can be determined. A little reflection suffices to inform us of some of its properties: it
must be rationally derivable from a priori principles and essentially tautological in nature, it must
on some level identify matter and information, and it must eliminate the explanatory gap between
the mental and physical aspects of reality. Furthermore, in keeping with the name of that to be
modeled, it must meaningfully incorporate the intelligence and design concepts, describing the
universe as an intelligently self-designed, self-organizing system.
How is this to be done? In a word, with language. This does not mean merely that language
should be used as a tool to analyze reality, for this has already been done countless times with
varying degrees of success. Nor does it mean that reality should be regarded as a machine
language running in some kind of vast computer. It means using language as a mathematical
paradigm unto itself. Of all mathematical structures, language is the most general, powerful and
necessary. Not only is every formal or working theory of science and mathematics by definition
a language, but science and mathematics in whole and in sum are languages. Everything that
can be described or conceived, including every structure or process or law, is isomorphic to a
description or definition and therefore qualifies as a language, and every sentient creature
constantly affirms the linguistic structure of nature by exploiting syntactic isomorphism to
perceive, conceptualize and refer to it. Even cognition and perception are languages based on
what Kant might have called “phenomenal syntax”. With logic and mathematics counted among
its most fundamental syntactic ingredients, language defines the very structure of information.
This is more than an empirical truth; it is a rational and scientific necessity.
Of particular interest to natural scientists is the fact that the laws of nature are a language. To
some extent, nature is regular; the basic patterns or general aspects of structure in terms of
which it is apprehended, whether or not they have been categorically identified, are its “laws”.
The existence of these laws is given by the stability of perception. Because these repetitive
patterns or universal laws simultaneously describe multiple instances or states of nature, they can
be regarded as distributed “instructions” from which self-instantiations of nature cannot deviate;
thus, they form a “control language” through which nature regulates its self-instantiations. This
control language is not of the usual kind, for it is somehow built into the very fabric of reality and
seems to override the known limitations of formal systems. Moreover, it is profoundly reflexive
and self-contained with respect to configuration, execution and read-write operations. Only the
few and the daring have been willing to consider how this might work...to ask where in reality the
laws might reside, how they might be expressed and implemented, why and how they came to
be, and how their consistency and universality are maintained. Although these questions are
clearly of great scientific interest, science alone is logically inadequate to answer them; a new
explanatory framework is required. This paper describes what the author considers to be the
most promising framework in the simplest and most direct terms possible.
On a note of forbearance, there has always been comfort in the belief that the standard hybrid
empirical-mathematical methods of physics and cosmology will ultimately suffice to reveal the
true heart of nature. However, there have been numerous signals that it may be time to try a new
approach. With true believers undaunted by the (mathematically factual) explanatory limitations
of the old methods, we must of course empathize; it is hard to question one’s prior investments
when one has already invested all the faith that one has. But science and philosophy do not
progress by regarding their past investments as ends in themselves; the object is always to
preserve that which is valuable in the old methods while adjoining new methods that refine their
meaning and extend their horizons. The new approach that we will be exploring in this paper,
which might be colorfully rendered as “reality theory is wedded to language theory and they beget
a synthesis”, has the advantage that it leaves the current picture of reality virtually intact. It
merely creates a logical mirror image of the current picture (its conspansive dual), merges the
symmetric halves of the resulting picture, and attempts to extract meaningful implications.
Science as we now know it is thereby changed but little in return for what may, if fate smiles upon
us, turn out to be vast gains in depth, significance and explanatory power.
And on that note, I thank you for your kind attention and wish you a fruitful journey.
On Theories, Models and False Dichotomies
It has almost become embarrassing to point out that science is in a state of crisis...not because it
is untrue, but because it has become a cliché too often accompanied by little or no remedial
insight. For all of the magnificent achievements of science, its grander ambitions long ago
succeeded in taxing its traditional models and organizational principles beyond their explanatory
limits. In the search for ever deeper and broader explanations, science has reached the point at
which it can no longer deny the existence of intractable conceptual difficulties devolving to the
explanatory inadequacies of its fundamental conceptual models of reality. This has spawned a
new discipline known as reality theory, the study of the nature of reality in its broadest sense.
The overall goal of reality theory is to provide new models and new paradigms in terms of which
reality can be understood, and the consistency of science restored as it deepens and expands in
scope.
Mainstream reality theory counts among its hotter foci the interpretation of quantum theory and its
reconciliation with classical physics, the study of subjective consciousness and its relationship to
objective material reality, the reconciliation of science and mathematics, complexity theory,
cosmology, and related branches of science, mathematics, philosophy and theology. But in an
integrated sense, it is currently in an exploratory mode, being occupied with the search for a
general conceptual framework in which to develop a more specific theory and model of reality
capable of resolving the paradoxes and conceptual inconsistencies plaguing its various fields of
interest (where a model is technically defined as a valid interpretation of a theory in its universe of
reference). Because of the universal scope of reality theory, it is subject to unique if seldom-
recognized demands; for example, since it is by definition a universal theory of everything that is
real, it must by definition contain its rules of real-world interpretation. That is, reality theory must
contain its own model and effect its own self-interpretative mapping thereto, and it must conform
to the implications of this requirement. This “self-modeling” capacity is a primary criterion of the
required framework.
The ranks of reality theorists include researchers from almost every scientific discipline. As the
physical sciences have become more invested in a quantum mechanical view of reality, and as
science in general has become more enamored of and dependent on computer simulation as an
experimental tool, the traditional continuum model of classical physics has gradually lost ground
to a new class of models to which the concepts of information and computation are essential.
Called “discrete models”, they depict reality in terms of bits, quanta, quantum events,
computational operations and other discrete, recursively-related units. Whereas continuum
models are based on the notion of a continuum, a unified extensible whole with one or more
distance parameters that can be infinitely subdivided in such a way that any two distinct points
are separated by an infinite number of intermediate points, discrete models are distinguished by
realistic acknowledgement of the fact that it is impossible to describe or define a change or
separation in any way that does not involve a sudden finite jump in some parameter.
Unfortunately, the advantages of discrete models, which are receiving increasingly serious
consideration from the scientific and philosophical communities, are outweighed by certain basic
deficiencies. Not only do they exhibit scaling and nonlocality problems associated with their
“display hardware”, but they are inadequate by themselves to generate the conceptual
infrastructure required to explain the medium, device or array in which they evolve, or their initial
states and state-transition programming. Moreover, they remain anchored in materialism,
objectivism and Cartesian dualism, each of which has proven obstructive to the development of a
comprehensive explanation of reality. Materialism arbitrarily excludes the possibility that reality
has a meaningful nonmaterial aspect, objectivism arbitrarily excludes the possibility that reality
has a meaningful subjective aspect, and although Cartesian dualism technically excludes neither,
it arbitrarily denies that the mental and material, or subjective and objective, sides of reality share
common substance. 5
One might almost get the impression that the only two available choices are the classical model,
to which quantum theory has been fastened with approximately the same degree of cogency as
antlers on a jackrabbit, and the newer discrete models, which purport to be more in line with
quantum theory but fall by the wayside en route to the new kind of quantum cosmology they
portentously seem to promise. For such claims exhibit an unmistakable irony: classical reality is
precisely that on which information and computation are defined! Like classical reality itself, a
well-defined entity unable to account for its own genesis, information and computation are well-
defined and non-self-generative aspects of reality as it is observationally presented to us at an
advanced stage of its existence. So they invite the same questions as does classical reality: how,
and by what, were they originally defined and generated? Without an answer to this question,
little can be gained by replacing one kind of reality with the other.
Some may have felt, as they watched the history of Big Theories and New Paradigms unfold over
the last few years, as though they were being forced to watch the same show, or read the same
novel, a thousand times in tedious succession with no more than an occasional minor revision of

Name: Anonymous 2018-07-12 10:24

this is drugs

Name: Anonymous 2018-07-12 10:29

>>3
It's like seeing a highly effective neural network fed a bullshit dataset and producing hallucinations.

Name: Anonymous 2018-07-12 10:33

>>4
or maybe just a markov chain trained on 50% deconstructionist lit-crit, 30% linguistics, 10% physics and 10% religion
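For what it's worth, that joke is implementable. A toy word-level Markov chain is just a table mapping each word to the words observed after it; sampling from the table generates prose with the local flavor of the corpus and none of its meaning. Sketch below (the corpus mix and its proportions are obviously invented for the gag):

```python
import random
from collections import defaultdict

def build_chain(text, order=1):
    """Map each order-word prefix to the list of words observed after it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def generate(chain, length=20, seed=0):
    """Random-walk the chain from a random starting key."""
    rng = random.Random(seed)
    key = rng.choice(list(chain))
    out = list(key)
    for _ in range(length):
        followers = chain.get(tuple(out[-len(key):]))
        if not followers:
            break  # dead end: prefix never seen mid-corpus
        out.append(rng.choice(followers))
    return " ".join(out)

# Hypothetical corpus blend per >>5; the weights just repeat each source.
corpus = " ".join(
    ["the text deconstructs the reflexive syntax of its own telic grammar"] * 5
    + ["a language is a set of strings over a finite alphabet"] * 3
    + ["the wave function collapses upon measurement"] * 1
    + ["the spirit moves through all created things"] * 1
)
print(generate(build_chain(corpus), length=15, seed=42))
```

Feed it enough lit-crit and it will happily emit "reflexive syntax of its own telic grammar" forever, which is more or less the point.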

Name: Anonymous 2018-07-12 11:17

What if we're just sub-150 IQ brainlets unable to comprehend his 200IQ genius?
What if there is something there..?

Name: Anonymous 2018-07-12 11:53

>>6
and what if his is just an anus?
