[time 404] On the Problem of Information Flow between LSs


Matti Pitkanen (matpitka@pcu.helsinki.fi)
Sun, 13 Jun 1999 11:36:11 +0300 (EET DST)


Dear Stephen,

this discussion is very inspiring and I have been busily working on
TGD-style quantum information theory the whole time. This email has
grown very long; I think we must cut it into pieces in the sequel.

Dear Matti,

Matti Pitkanen wrote:
>
> On Wed, 9 Jun 1999, Stephen P. King wrote:
>
> > Dear Matti and Friends,
> >
> > In [time 395] Constructing spacetimes, Matti wrote:
> >
> > "There is also problem about information flow between different
> > LS:s. How can one define information current between LS:s if
> > these systems correspond to 'different spacetimes'?"
> >
> > There is much to be discussed here!
> >
> > If I am correct, "current" is defined as some quantity of change
> > occurring through a boundary of some sort.
> > (http://www.whatis.com/current.htm) It is usually assumed that some
> > particle or fluid is being transferred from one location to another
> > and a term "density" is associated with "current per unit
> > cross-sectional area". So we are thinking of the concepts: "flow",
> > "boundary", "information", "different space-times", and "particle".
>
> Yes. This definition also works in the infinite-dimensional case.

        Ok, we should try to use this definition, but we should consider
the tacit assumptions that it brings.
 
> > We need definitions that are mutually consistent. I am proposing to
> > use graph-theoretic concepts since we can easily generalize them to
> > continua:
> >
> > http://hissa.nist.gov/~black/CRCDict/termsArea.html#search
> >
> > Flow: "A measure of the maximum weight along paths in a weighted,
> > directed graph" We could consider the "weight" as the degree to which
a
> > given edge connects a pair of vertices, e.g. if a pair of vertices are
> > identical relative to their possible labelings the weight is 1, the
> > weight is 0 if their respective sets of labels are disjoint. (When
> > considering spinors as labels of the vertices we use alternative
> > notions.)
> >
> > http://hissa.nist.gov/~black/CRCDict/HTML/flow.html
> >
> > Boundary: I cannot find a concise definition so I will propose a
> > tentative one: the boundary of a graph B{G} is the minimum set of
> > vertices |V_G| that have incident edges connecting a pair of points,
> > one of which is an element of ~{G} and the other of which is an
> > element of {G}; where {G} and ~{G} are a graph and its complement.
> > I am not sure that this notion is appropriate. :( I am thinking of
> > the way traditional set theory defines a boundary of a set: "a point
> > is in the boundary of a set iff every neighborhood of the point
> > intersects both the set and its complement". So the boundary of a
> > set is the set of these points. It looks like the only element
> > involved would be the empty set {0} in the usual way of thinking of
> > sets in the binary logical sense; this relates to my discussion of
> > the Hausdorff property...
> >
>
> Does this approach generalize simplicial cohomology? A simplicial
> complex defines homology groups. It has simplices up to dimension D if
> the simplicial complex is D-dimensional. One can consider functions on
> the set of simplices of a given dimension. One can define co-exactness
> and co-closedness and cohomology groups. The info current would be a
> function on the set of (D-1)-dimensional simplices. The info current
> would in general correspond to an element of cohomology which is not
> co-closed. This makes sense only in the ordinary topology defined by a
> norm, but you are talking about the non-Hausdorff property.

        I think of it as a crude method for thinking about simplicial
cohomology! :) I do believe that cohomology theory is the best place to
look for the tools we need to build a model of interactions! :)
        The problem is that simplexes (or more generally, complexes) are
"static" objects, e.g. "relational structures", so for the modeling of
dynamics or any "updating" of the local structures I think that there
are several options. I have been looking into "periodic gossiping" as a
basic notion to build upon, but I need assistance with the mathematics
involved. :)
        One aspect that I particularly like about what you are saying is
that the fundamental dualities that were needed to construct Chu
transforms among posets of observations ("observers") are already built
into simplicial cohomology, as manifested by the "co-exactness" and
"co-completeness" properties. I highly recommend that you read over the
papers that Pratt has on his site, starting with the ones linked from
the heading! :)

        We need to start putting some of the jigsaw puzzle pieces
together before we lose track of the big picture. This will help
illustrate how the non-Hausdorffness of posets of observations occurs.
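
A rough sketch of one possible reading of the graph-theoretic "weight"
quoted above (the overlap measure and the names are illustrative
choices, not anything fixed by the thread): weight 1 when two vertices
carry identical label sets, 0 when the sets are disjoint, and a
fractional overlap in between.

# Illustrative sketch only: edge "weight" as the degree of overlap between
# the label sets of two vertices (1 if identical, 0 if disjoint), using a
# Jaccard-style ratio for the middle ground.

def edge_weight(labels_a, labels_b):
    """Overlap of two vertices' label sets: 1.0 identical, 0.0 disjoint."""
    a, b = set(labels_a), set(labels_b)
    if not a and not b:
        return 1.0                      # two unlabelled vertices count as identical
    return len(a & b) / len(a | b)

print(edge_weight({"red", "up"}, {"red", "up"}))    # 1.0: identical labelings
print(edge_weight({"red", "up"}, {"down"}))         # 0.0: disjoint label sets
print(edge_weight({"red", "up"}, {"red", "down"}))  # ~0.33: partial overlap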

[MP]
I have some familiarity with Pratt's paper about quantum mechanics
and Chu spaces. At least I can tell what one should do to generalize
computationalism to the TGD context: a conscious computer should make
quantum jumps between entire computations, and the contents of
consciousness should give information about some part of the
computation.

[SPK]
        The basic idea there is that given n >/= 2 observers having an
observable that they can communicate effectively about (the idea of
information flow), there will be at least one singleton subset of the
posets of the n observers that is not disjoint. Thus the definition of
Hausdorffness is weakened... We do need to discuss this further as I am
not sure that my words are proper. :)
        
[SPK]
> > Information: Now here is the key problem: How to define "information"!
> > What is Information? Is it "meaning", as in "the semantic content of a
> > pattern of matter/energy"? Is it the bits that are recovered when a
> > string of bits is encoded or compressed by some scheme and then
> > decoded or decompressed by the scheme's inverse? Is it the value of a
> > quantity present at some arbitrary point?

[MP]
> Very stimulating questions! While visiting your homepage I
> realized for the first time how many times 'information' appeared
> there. For some mysterious reason I have managed to circumvent the
> challenge of defining this concept until now. Perhaps my
> strong opinions about computationalism explain this(;-).

        I understand how that can happen. :) I would like to better
understand what your ideas about computationalism are so that I can
explain my thinking better. :)
 
[MP]
AI presents the human brain as a mere machine and I have never been
(and never will be(;-)) able to take this seriously. The concept of
information is easy if one speaks of binary sequences, and it is a very
AI-ish concept. In my own framework it seemed almost hopeless to
associate an information measure with conscious experience, for obvious
reasons. How does one associate a bit sequence with the experience of
the color red?! Therefore I was also sceptical about the concept of
information and regarded it as too AI-ish a concept. In fact I am still
sceptical about assigning information content to, say, the experience
of the color red, but I will not go into this here.

As I have already explained below, I was too pessimistic. A 'clever
trick' is possible. Defining the information content of a conscious
experience as the difference of the information contents of the initial
and final quantum histories reduces the problem to defining the info
content of quantum histories, which are completely classical things.
This means that the huge arsenal of existing wisdom (I wish I had
it(;-)) about classical information becomes available.

I have of course used the 'clever trick' earlier but had not realized
that what I am doing can be generalized. I defined the entanglement
negentropy gain, the difference between the entanglement entropies of
the initial and final states of the quantum jump, as the information
content of the conscious experience. Later I became sceptical: is an
information gain indeed in question, or does entanglement entropy
describe only the catchiness of the conscious experience?

I am now working through this and have a rough formulation of quantum
information theory a la TGD. There is not only one information measure
but an infinite number of them, each measuring a particular kind of
information. Entanglement negentropy gain is also a particular kind of
information gain in conscious experience. This is a wonderful result:
although one cannot write a formula for the contents of conscious
experience, one can calculate the amount of a particular kind of
information in a conscious experience. This is how far one gets: what
remains is the sacred mystery of free will.

This is the basic point at which the computationalistic approach
differs from TGD. In computationalism one must assume that the contents
of conscious experience are expressible by a formula. In TGD only the
initial and final states of the quantum jump are expressible by a
formula, but not the contents of consciousness itself: this is enough!
TGD replaces the computer with the Universe as a computer making
quantum jumps between computational histories.

[SPK]
> a) I think that it is meaningful to talk about 'meaning'
> only if one talks about *conscious* information. OK?

        Yes. :) But I see consciousness from a generic point of view,
not restricted to people.

[MP] I agree here completely. I always talk about subsystems(;-), rarely
about the brain.

[SPK]
This sounds a bit like panpsychism, I know, but if
there is to be a scientific instead of a mystical explanation of
measurement (see discussions of "Wigner's Friend" and Schroedinger's Cat
http://www.npl.washington.edu/tiqm/TI_40.html#4.3), we need to think of
measurement as "objective". I find Penrose's work to be very inspiring
toward this end! :)

[MP]
Here is a basic difference in our deepest beliefs. I do not believe
that state function collapse is modellable. I regard it as the ultimate
mystery. But notice how much one can say: one can even calculate the
information contents of this mystery for all kinds of information!
Replacing the computer with computations and quantum jumps between them
makes it possible to achieve this(;-)!

 
> b) It is certainly impossible to characterize conscious experience
> by a bit sequence. This was one of the reasons I have been very
> sceptical about the 'information content of cs experience'.

[SPK]
        Yes! This impossibility is part of Penrose's program and is
wonderfully fixed by Peter Wegner and Vaughan Pratt's work! Thus I urge
all to study them.

 
> One could however circumvent this problem! Conscious information could
> be defined as the *difference* of the informations associated with the
> initial and final states of the quantum jump. This would reduce the
> problem to that of associating an information measure to quantum
> states! Since quantum states correspond to well defined geometric
> objects there are hopes of associating information measures with them!
> This looks like a clever trick to me at least!(;-)

        Yes! "Conscious information could be defined as *difference* of
informations associated with initial and final states of quantum jump."
But notive that involve some very subtle situations. There is the matter
of the compressability of the information, such that there is a
relationship between the "reducibility" of the number of bits needed
(assuming a binary message for simplicity) to communicate a given
message and ability to predict what the message says before it is
"read". This implies a "strong duality" (sic) between data compresion
and "gambling"!

[MP]
These are problems, but problems at the classical level. The task of
defining the information measures for configuration space spinor fields
reduces to a purely classical problem. Modulo effects relating to
p-adic numbers of course, and these effects seem to be overall
important!

The approach is based on the map from reals to p-adics performed
by the phase-preserving canonical identification with minimal pinary
cutoff. The necessary presence of a *uniquely determined* pinary cutoff
makes it possible to assign a *unique* information content to the
configuration space spinor field. The pinary cutoff is no longer just an
ugly but necessary feature of the fundamental theory; it makes quantum
information theory possible. What is nice is that the pinary cutoff
provides a fundamental model for the coarse graining characteristic of
conscious experience, and this coarse graining makes it possible to
assign a p-adically finite information measure to a quantum state.

The lengths of pinary strings must be defined as p-adic numbers and
this leads to finite results. For instance, the real counterpart of the
p-adic information is always bounded by p*log(p): this has a direct
interpretation. A p-adic system with typical size of the order of the
p-adic length scale is not able to have conscious experiences with too
large an information content! These kinds of restrictions are not easy
to formulate in ordinary information theory!
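
Since the canonical identification with pinary cutoff recurs throughout
what follows, here is a bare-bones sketch of the underlying map
x = SUM x_n p^n --> SUM x_n p^(-n), restricted to ordinary non-negative
integers (the real construction acts on p-adic expansions and preserves
phases; none of that is attempted here). It only shows the two features
used below: the real counterpart stays bounded by roughly p, and a
pinary cutoff keeps the lowest, most significant digits.

# Toy sketch of canonical identification sum x_n p^n -> sum x_n p^(-n)
# on non-negative integers, with an optional pinary cutoff. Illustrative
# only; the phase-preserving version used in TGD is not reproduced here.

def pinary_digits(n, p):
    """Digits x_0, x_1, ... of n in base p, lowest first."""
    digits = []
    while n > 0:
        n, r = divmod(n, p)
        digits.append(r)
    return digits or [0]

def canonical_image(n, p, cutoff=None):
    """Real counterpart sum x_k * p**(-k); optionally keep only `cutoff` digits."""
    digits = pinary_digits(n, p)
    if cutoff is not None:
        digits = digits[:cutoff]           # pinary cutoff: drop the higher digits
    return sum(x * p ** (-k) for k, x in enumerate(digits))

p = 7
print(canonical_image(5, p))               # a small n < p maps to itself
print(canonical_image(12345, p))           # a large n maps to a real number < p
print(canonical_image(12345, p, cutoff=3)) # coarse-grained image: the lowest
                                           # digits carry almost all of the value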

I am now working out the details. A good starting point is
strong NMP: entanglement entropy is a particular example
of information and I have worked for years to polish the definitions.
I can directly generalize the details to the more general case!
For instance, conscious experience decomposes into
separate sub-experiences. Information gain should likewise
naturally reduce to a sum of informations associated with the
sub-experiences. This indeed happens.

        Please read "Elements of Information Theory" by Thomas M. Cover
and Joy A. Thomas, Wiley-Interscience Pub. 1991, pg.136-143. The entire
book is very good!
 
        I also recommend the paper by Brody and Hughston, "Geometric
models for Quantum Statistical Inference", pg. 265-276, in The Geometric
Universe, edited by Huggett, Mason, Tod, Tsou and Woodhouse. It arrives
at conclusions very similar to those of Frieden and, apparently, is
independent! The authors derive geometry instead of Lagrangians... But
it is easy to see that they are dual!
        These should help us realize this "clever trick"! :)

[MP]
The problem is that it is probably difficult for me to find these books
(I have promised myself not to talk about my miserable economic
situation!). I believe that they might be useful. On the other hand,
the p-adic approach provides a rather unique and physically intuitive
manner of approaching the problem and is something new.
        
> c) One can probably define several types of informations associated
> with configuration space spinor field and assign
> information measures to them. Perhaps one must give up the idea
> of a single information measure. Perhaps the essential question
> is 'What is the information about?', and each question gives a
> different measure of information.

[SPK]
        Silly question: Do you mean "spinor field configuration space"
when you say "configuration space spinor field"?

[MP]
No. A configuration space spinor field is a spinor field in configuration
space. Physical states are classical spinor fields in the configuration
space of 3-surfaces. Configuration space spinors are essentially
many-fermion states.

[What is beautiful is that second quantization for ordinary
spinor fields at the level of spacetime has purely geometric
interpretation: the anticommuting gamma matrices for infinite-dimensional
configuration space are linearly related to the fermionic oscillator
operators. I would emphasize and underline this connection: it is really
something highly nontrivial and deep. Second quantization <-->
construction of infinite-dimensional spinor structure.]

One assigns to each 3-surface a Fock state. In quantum field theory
there would be only a single surface, since the geometry of 3-space
would not be dynamical. Now I have a configuration space spinor
field defined as a map from the space of 3-surfaces to
many-fermion states.
 

[SPK]
There is a difference... The former is the space of configurations of a
spinor field and the latter is something I do not understand. I know
that you are using complex projective (hyper)planes as part of the
geometry of p-adic TGD, so maybe the latter involves mapping or
identifying the configurations of particles to a spinor field?????
 
> d) I realized that the information associated with the configuration
> space spinor field, about which I talked in previous postings,
> is essentially *information about position in configuration space*
> plus information about spin degrees of freedom relative to the
> ground state, which corresponds to the Fock vacuum and contains no
> information.

[MP] Position in configuration space means the position of the classical
universe (3-surface) in configuration space and is a generalization of
the position of an electron in 3-space. A huge conceptual leap!
The position of the universe relates directly to the 'classicality'
of the quantum state. The larger the width of the distribution of
parallel universes, the less classical the quantum state is. The larger
the potential positional information, the less classical the universe
(hence biosystems, which are very information rich, are macroscopic
quantum systems!). Each quantum jump localizes the configuration space
spinor field in some sector D_p of configuration space and increases
the classicality of the quantum state dramatically.

[SPK] No "information in it-self", yes. :) "There can be no observations
of
self without mirrors." But, this "Fock vacuum", what is it?

[MP] Fock vacuum: no fermions.

> Information is defined as the information gain involved in the total
> localization of the configuration space spinor field to a single point
> in the Fock vacuum state. A single 3-surface in configuration space in
> the Fock vacuum state is selected and the Shannon formula defines the
> information gain. The same works for the Schrodinger amplitude in the
> nonrelativistic situation.

[SPK]
        This is a maximization of negentropy or minimization of entropy,
but what kind of entropy? Mutual or "cross" entropy? Entropy, most
basically, is a measure of the equilibrium or "equivalence" between the
"parts" of a system, so there are subtleties... If all subsets of a
system's configuration (or phase space?) are identical, the system is at
equilibrium. I think that Hitoshi's bound states are quantum mechanical
versions of this. One idea that I have is that the notion of space-time,
in the sense of distance or duration, is undefinable in such conditions!
Thus the Totality-level Universe U^T has no associated "space" or
"time"!
        "Improper" or finite (and thus distinguishable) subsets of U^T
can have space-time associated because they are in a state of
"disequilibrium" with at least one other improper subset of U^T.
 
[MP] There is no division into parts in the case of positional
entropy (unlike in the case of entanglement entropy, which is
also an information measure, as I finally learned). You have a
probability distribution, and the Shannon entropy for it is given by
I = -SUM_n p_n log(p_n); here p_n gives the probability for the position
of the universe in configuration space, n is a continuous 'index'
parametrizing 3-surfaces, and the sum becomes an integral. The larger
the width of the distribution defined by p_n, the larger the
information gain.
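
A tiny numeric illustration of that statement, with the configuration
space discretized to a handful of "positions" (both distributions are
invented): the information gain upon complete localization is the
Shannon entropy of the position distribution, and a wider distribution
gives a larger gain.

# Sketch: positional information gain as the Shannon entropy of the
# discretized position distribution that gets localized in the quantum jump.
import math

def positional_information_gain(p):
    """Shannon entropy -sum p_n log2 p_n of a discrete position distribution."""
    return -sum(pn * math.log2(pn) for pn in p if pn > 0)

narrow = [0.97, 0.01, 0.01, 0.01]           # almost localized already
wide   = [0.25, 0.25, 0.25, 0.25]           # spread evenly over 4 "positions"

print(positional_information_gain(narrow))  # ~0.24 bits
print(positional_information_gain(wide))    # 2.0 bits: wider -> larger gain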

As I mentioned, the information is about the position of
the universe in configuration space. One can invent an infinite number
of different types of *local* information and construct information
measures for them and calculate the corresponding
information content of conscious experience. A week ago I still
talked about a single information measure: now I can distinguish between
different types of information: information about the geometry
of the spacetime surface, information about the induced gauge fields in
it, etc...

> Critical Question: Does the information that the configuration space
> spinor field provides about the position in the configuration space
> of 3-surfaces provide information about configuration space
> and spacetime geometry? There are hopes, since spin degrees of freedom
> (which correspond to fermionic degrees of freedom in the
> infinite-dimensional context) are involved and entanglement is
> associated with these degrees of freedom. Recall that fermions describe
> the 'reflective level of cs' in the TGD approach to cs (the Fock state
> basis has an interpretation as a Boolean algebra).

[SPK] Look at the role that Boolean algebras play in Chu spaces! :)
 
[MP] I noticed the comments about Boolean algebras in the paper relating
Chu spaces to QM. In fact, I had in mind to ask about the following.
The Boolean algebra formed by the Fock state basis can be extended
to a complex linear algebra of fermionic quantum states. One has
quantum superpositions of Boolean statements represented as many-fermion
states. Fermion number conservation restricts the allowed superpositions.
Could it be that Chu spaces could accommodate this structure,
which is not quantum logic but a kind of complexification of a
Boolean algebra?

> e) You mentioned bit counting as a possible manner to define
> information. An interesting possibility is that the real to p-adic
> correspondence could provide a measure for the information content of
> the configuration space spinor field based on counting of bits, or
> actually pinary digits.

        Very interesting! :) Think about this:

"Let p(x) ... be a probability density function on the real number line
[R^1], thus satisfying 0 </= p(x) </= 1 and \SUM p(x)dx = 1. If we take
the square-root density \Eta(x) \equivalent sqrt(p(x)), then \SUM \Eta^2
dx = 1 and we can regard \Eta as a point on the unit sphere S in a real
Hilbert space \H. If \rho(x) is another such square-root density
function, then we can define a 'distance' function D(\Eta, \rho) in \H
for the two distributions corresponding to \Eta(x) and \rho(x) by
writting:

        D^2(\Eta, \rho) = 1/2 \SUM [\Eta(x) - \rho(x)]^2dx . (2.1)

In this case the function D(\Eta, \rho), known as the 'Hellinger
distance', is evidently just the sine of the 'angle' made between two
Hilbert space vectors \Eta and \rho." pg. 266 ibid. in Geometric Models
for Quantum Statistical Inference ...
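
A small sketch of the quoted formula (2.1), with the integral
discretized on a grid; the two densities are arbitrary choices made only
to exercise the definition.

# Sketch of the Hellinger distance, eq. (2.1) above, with the integral over x
# replaced by a sum on a uniform grid of spacing dx.
import math

def hellinger(p, q, dx):
    """D with D^2 = 1/2 * INT (sqrt(p) - sqrt(q))^2 dx, on a discretized grid."""
    d2 = 0.5 * dx * sum((math.sqrt(pi) - math.sqrt(qi)) ** 2 for pi, qi in zip(p, q))
    return math.sqrt(d2)

dx = 0.01
xs = [i * dx for i in range(100)]           # grid on [0, 1)
p = [2 * x for x in xs]                     # density 2x
q = [2 * (1 - x) for x in xs]               # density 2(1 - x)

print(hellinger(p, q, dx))                  # ~0.46, between 0 (identical
                                            # densities) and 1 (disjoint supports)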
 
[SPK] What would be the p-adic version of this?! :)

[MP]
The square root might cause difficulties in the p-adic context. The
point is that the p-adic square root does not always exist as a p-adic
number. One could use an algebraic extension: in this case the square
root could however be imaginary. One should replace the square with the
modulus squared in the formula to get its p-adic counterpart.

Of course, the integral is also nontrivial p-adically. There are
several approaches to the p-adic integral, and the simplest one
generalizes the ordinary definite integral using canonical
identification. It is however essential that the p-adic integration
region is the image of a real integration region: this makes it possible
to define the boundaries of integration regions as images of real
boundaries. In a purely p-adic world there are no boundaries
(compact-open topology).

> i) Pinary cutoffs of configuration space spinor field provide a
> sequence of more and more accurate discretization of configuration
> space spinor field.

[SPK] Umm, so this would be like saying that the 'mesh' size (graining)
of the resolution of observations is given by the pinary cutoffs. Look
at how Hitoshi defines the uncertainty principle in his papers!
 
[MP]
Precisely. This resolution characterizes the resolution of conscious
experience.

> ii) The mapping of the real configuration space spinor field to its
> p-adic counterpart involves the *minimal* pinary cutoff for which
> continuation to a smooth p-adic configuration space spinor
> field is possible. The minimal pinary cutoff comes from
> the requirement that the canonical image of the pinary cutoff allows
> continuation to a *smooth* p-adic configuration space spinor field.
> If the pinary cutoff of the canonical image is too
> detailed, completion is not possible.
>
> iii) There would thus be some number N of pinary digits, and
>
> I(X^3) = N(X^3)
>
> would serve as a measure for the information contained by
> the value of the configuration space spinor field at a given point of
> configuration space.

[SPK] So are you saying that the measure of information (entropy) would
depend on the value of the pinary cutoff?
 
[MP]
The point is that the pinary cutoff *is unique*! It is the minimal
pinary cutoff allowed by the requirement that the pinary cutoff
of the canonical image of the spacetime surface can be continued
to a smooth surface satisfying the p-adic counterparts of the field
equations. The smaller the pinary cutoff, the better
the accuracy of the real-p-adic correspondence and the more
intelligent the geometric object!!

> iv) One could define the total information contained by the
> configuration space spinor field as the sum of the informations
> associated with the discretized configuration space,
>
> N = SUM_i N(X^3_i).
>
> This number is infinite as a real integer but *finite as a p-adic
> number*! Real information is obtained as the canonical image of I
> and would be finite. Higher pinary digits would not be given as much
> importance as low pinary digits in this information measure. This is
> indeed very reasonable: the lowest pinary digits contain the essential
> and the rest is just details.

        The Hausdorff dimension used to define the non-integer
dimensionality of a fractal looks like this! :)

[MP]
The real counterpart of the integer N is indeed a non-integer when N is
larger than p.

 
> Note: the value of the p-adic prime associated with the entire universe
> is very probably infinite, so that N is probably still infinite as an
> ordinary integer. Note that infinities can cancel
> in the info content of a cs experience defined as a difference.

[SPK] The value of the p-adic prime associated with U^T is infinite with
a probability of 1! But the cardinality of this infinity is, I believe,
related to Chaitin's \Omega!
http://www.cs.auckland.ac.nz/research/CDMTCS/docs/about.html
http://algebra.rotol.ramk.fi/~keranen/IMS97/abstracts/GREG.html

[MP]
I cannot say much about Omega. In any case the following gives an
idea of how the decomposition of infinite primes into finite primes
relates to the decomposition of conscious experience into
sub-experiences.

a) Entanglement entropy is indeed a genuine information measure and
also gives information about spacetime geometry (contrary to my earlier
beliefs!!). I took the definition of total entanglement entropy as a
model for information measures in general.

b) Although the P of the universe is very probably infinite, the quantum
jump decomposes into sub-quantum jumps inside
unentangled sub-Universes containing no further unentangled
sub-Universes. These sub-universes are characterized by finite p-adic
primes. A similar decomposition occurs for conscious experience, and
must also occur for the information gains of conscious experiences.

c) The infinite P has a well defined decomposition into finite primes,
and this decomposition corresponds to the decomposition of the
spacetime surface into p-adic regions characterized by finite
p-adic primes p_i. The definition of the infinite prime P also defines
a decomposition of the spacetime surface into cognitive and material
spacetime sheets, and quantum jumps reduce the entanglement
between cognitive and material spacetime sheets. This generalizes
von Neumann's intuition about the brain as the fundamental state
function reducer.

d) One must also require that the *information gain decomposes into
sub-information gains* associated with these experiences labelled
by the primes p_i. Hence one must define the information gain
of a p_i-adic sub-experience as a p_i-adic number and map it to the
reals by canonical identification. The resulting sub-information gain
is finite and smaller than p_i*log(p_i)/log(2) bits, reflecting the
finite mental abilities of a finite subsystem! By the way, this gives a
limit to the maximum information content of our conscious experience if
it is determined by brain size. A huge number of bits: 167*2^167 bits
for a single neuron already (p = about 2^167 for a neuron)!
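
A quick check of the arithmetic in (d), taking p of about 2^167 for a
neuron at face value (that value is an assumption of the thread, not
something verified here):

# Arithmetic check of the bound p*log(p)/log(2) for p =~ 2^167.
import math

k = 167
p = 2 ** k                             # the neuron's p-adic prime, approximated
bound_bits = p * k                     # p * log(p)/log(2) = 167 * 2^167
print(bound_bits == 167 * 2 ** 167)    # True: matches the figure quoted above
print(math.log10(bound_bits))          # ~52.5, i.e. roughly 10^52 bits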

e) The decomposition into sub-information gains is possible
only if the information measure is *local* at the level of configuration
space. This suggests that all allowed information measures are local.
Locality means that the information is expressible as an integral of
an information density R*X, where R is the Fock space norm squared
of the configuration space spinor field and X characterizes the type
of information in question. Locality also makes it possible to define
an information current.

f) An interesting possibility is that the real information measure
obtained as the sum of the real counterparts of the p_i-adic information
measures and the one obtained directly as the real counterpart of the
P-adic information measure are identical.
This is difficult to decide since both are infinite (already the
log(P)/log(2) conversion factor is infinite).

 

> v) This information is obviously information about the construction of
> the p-adic counterpart of the configuration space spinor field from
> its real counterpart by the canonical identification mapping. Is this
> information given by conscious experience? Perhaps! Conscious
> experience always involves coarse roughening: higher pinary digits
> do not have the same importance as lower pinary digits. Conscious
> experience forms abstractions. So, perhaps the contents of conscious
> experience essentially involve the coarse roughening involved in the
> reals to p-adics map?

[SPK]
        I do think so! The finiteness of conscious experiences seems to
indicate this! :)
 
> f) All geometric structures of real quantum TGD
> are mapped to their p-adic counterparts using phase preserving canonical
> identification map with minimal pinary cutoff.

[SPK]
        But is this "phase" restricted to being on a S^2 disk? Could it be
the "phase" on a higher dimensional sphere S^n? We might get a situation
in which there are more than one unitary transformation of phases, much
like there are more than one S^2 slice of S^n?
 
[MP]
The phase is related to the complex plane. Configuration space allows
complex coordinates and configuration space spinor fields are complex.
Time and one spatial coordinate form a hypercomplex coordinate and one
can define a hyperbolic phase in this case.

> i) This approach might work also at the spacetime level
> for spinor fields defined on the spacetime surface. To each
> time=constant section of the spacetime surface one could associate
> information I in a similar manner and the pinary cutoff
> would provide the discretization of the 3-surface making it possible
> to define the total information as a sum over the informations
> associated with the points of X^3. The canonical image would define
> the real information, which would be finite.

[SPK] This idea makes sense. We normally associate a space-like
hypersurface with the set of entangled states of a quantum mechanical
system [I think :) ] and can think of it as a moment of consciousness...
 
[MP] The finiteness of the information measure could have an
interpretation as the finiteness coming from the restrictions of the
conscious experiencer. A p-adic subsystem would never be able to have an
experience with information content larger than p*log(p).

The information measure based on the number of pinary digits in the
cutoff relies critically on p-adic nondeterminism. And p-adic
nondeterminism in turn is what realizes the 'conduction' philosophy. For
instance, one cannot deduce the future or past time development of the
spacetime surface uniquely from initial values at a t=constant snapshot,
but only from a discrete net of values given over the entire spacetime
surface. The information contained by the spacetime surface is defined
by the minimum number of these initial values making unique 'conduction'
possible. This looks rather reasonable since it generalizes the idea of
information as 3-dimensional initial data to information as a
4-dimensional pinary cutoff.

> ii) The mapping of the real spacetime surface to its p-adic
> counterpart involves this map, and one can assign the real counterpart
> of the p-adic integer N of pinary digits to each point of the real
> spacetime surface as its information content. Again the total
> information content could also be defined as a sum for the
> minimal pinary cutoff of the spacetime surface.
>
> I think I must stop here.
>
> Best,
>
> Matti Pitkanen

[SPK]
        Do you have comments on the rest below?
 
> > Different space-times: This statement implies a plurality, a multitude
> > of configurations of distinguishable particles such that a basis of
> > three orthogonal directions is definable in conjunction with a dynamic
> > that alters the configurations in a uniform way.
> >
> > Particle: An entity that in a given reference frame or framing is
> > indivisible. It should not be assumed that an entity that is
> > indivisible in one framing need be so in another framing. I am
> > thinking of a framing as a finite context or environment that acts as
> > a "contrast" for the entity in question.
> >

[MP]
I do not know what the precise meaning of indivisibility is...

> > The problem I see right away is that information is not a substance
> > in the normal sense, since it has the properties of compressibility
> > and, according to Bart Kosko, irrotability, which are in contrast
> > with those properties of matter, which is usually incompressible and
> > rotatable....
> >

[MP]
I can comment only from my restricted point of view.
It seems that in TGD all kinds of information measures have an
information flow in the configuration space of 3-surfaces
associated with them: the 'time development' operator
U_a makes it possible to define this flow; one could say that this kind
of flow occurs in each quantum jump. The information current is obtained
by simply multiplying the conserved probability current with the measure
for the particular kind of information. Hence the analogy with
hydrodynamical flow. I do not know whether one could define information
at the spacetime level as an approximate concept: perhaps in the
nonrelativistic approximation. It might be that configuration space and
quantum superpositions of parallel classical universes are a necessary
conceptual tool for understanding conscious information.
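
A toy rendering of that recipe on a 1-D grid standing in for
configuration space (all profiles below are invented): the information
current is just a local information density multiplied by the conserved
probability current, in direct analogy with a hydrodynamical flow.

# Toy sketch: information current j_info = X * j, with j = rho * v the
# probability current on a 1-D grid. Purely illustrative profiles.

n, dx = 100, 0.1
rho = [1.0 / (n * dx)] * n                 # uniform probability density
v = [0.5] * n                              # toy flow velocity
X = [1.0 + 0.01 * i for i in range(n)]     # invented local information density

prob_current = [r * vi for r, vi in zip(rho, v)]          # j = rho * v
info_current = [x * j for x, j in zip(X, prob_current)]   # j_info = X * j

print(sum(r * dx for r in rho))            # ~1.0: the density is normalized
print(info_current[:3])                    # information flows where probability flows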

By the way, I just realized that there are 3 time developments: the
classical time development of the spacetime surface (Kaehler action),
the time development by quantum jumps, and the 'informational' time
development by the time development operator U_a, a --> infty, acting on
the space of quantum histories. A holy trinity of matter (as geometry),
consciousness and information!(;)

> > But, I think that Peter's notions are the most relevant to this
> > conversation of "information flows" between LS, so we need a way of
> > bridging between the formalism of graph theory and the formalisms used
> > in Peter's papers.
> >
> > We'll take that up after some discussion. :)
> >
> > Later,
> >
> > Stephen

Later,

Stephen

Best,

MP


