Peerless Reviewings and Lesser Diatribes


It seems to me that the only way they could cull seed stocks of genetically-modified organisms (GMOs) is by learning a great deal about the radiating coherent quantum wave properties of DNA.

You want my reaction to Ian Blair's study suggesting that Vitamin C may help cause cancer (Science, 06/15/01)? From me? That me who has been saying ever since the late-70's, when the superconductant DNA paper was written, that the terms “antioxidant” (i.e., electron donor) and “free radical” (i.e., electron acceptor) are politically-motivated black-propaganda terms similar to “catastrophe” (i.e., singularity in an equilibrium surface) and “chaos” (i.e., the logically m-valued, as opposed to the logically binary)? Well… only when chemists, immunologists, geneticists, biologists, and so on stop cultivating psychological need for war and mayhem will terminology in their respective fields cease being derived from military science and thereby start mirroring the identity-transparent superintegrative properties of organic function undermined in pathogenesis of degenerative disease. And only then will we actually be able to evaluate experiments like that performed by Ian Blair. For those simpering scientific reactionaries who kneejerk that no such terminological trend persists in the “life” sciences, I would recommend the article by Natalie Angier: “AIDS: Inability to Absorb Excess Oxygen Blamed for Decay of Immune System” (NY Times, 06/26/94) -- which, of course, argues a theme suggesting that the electron donor, Vitamin C, is helping to sustain immune competency, thus in some degree mitigating onset of AIDS at the same time that, per Blair, it is helping to cause cancer: take your choice, the big A or the big C. This journalist, whom one would prefer to have been angrier, naively quotes numerous “life” science researchers applying the following terms to processes studied in their fields: “turmoil”, “battle plan”, “defense network”, “gladiators”, mass “suicide program”. This is the short list, as anyone working in the “life” sciences knows. The degree to which this terminological transgression upon life, as supposedly studied in the “life” sciences, represents collective psychological projection of normotically-ill scientists cannot be overstated. The whole institutional discourse on oxidation-reduction has gone into ever-increasing levels of mass hysteria since the late-50s when Albert Szent-Györgyi and Linus Pauling began mounting their (here is a word from military science) “attack” on foundations of “stereochemistry” (an old-fashioned term I have even had a biologist assure me is not a term and never has been a term). It is in order to maintain pretense that “charge transfer complexes”, “electron transport chains”, “π-electron parcels”, “bands of free electrons”, and other such non-“stereochemistry”-like submolecular quantum properties of organisms have no significant biological role to play that this post-modernist terminological black-propaganda polyglot has been so assiduously cultivated over the past 40 years. Because, if such quantum properties did have significant biological roles, then the collection of psychological identifications characteristic of the normotic illness to which the institutional scientist is so committed would not have binary-logic consistency, and a tremendous intrapsychic and inter-psychological crisis would be full upon the halls of academy (not to mention institutes engaged in [classified military] “service funded” research on “life” science).

As early as 1974, in letters to Wolfgang Luthe, M.D., the then leading figure in Autogenic Therapy, and at a medical congress in 1977 (described in MOON) -- before AIDS was identified as a syndrome, that is -- Derek argued that the frequency response characteristics of DNA could be altered by low levels of non-ionizing ambient electromagnetic radiation. This was argued simultaneously in relation to several points: (1) the frequency response characteristics of cellular DNA are histologically specific and constitute an immunological tag, alteration of which, therefore, can lead to formation of autoantibodies and, ultimately, anti-DNA antibodies; (2) these processes in neurons and perineural tissue constitute the primary link between the immune system and the nervous system; (3) the fundamental role of spontaneous clinical and subclinical autogenic brain discharges -- which are part of the electromagnetic processes transpiring inside the body -- is to return altered frequency response characteristics of cellular DNA to normative values, thus mitigating autoimmune reactivity; (4) Albert Szent-Györgyi's thesis regarding integrity of intra- and inter-molecular electron-transport processes and structured intracellular water is essential to how normative values of cellular DNA frequency response are maintained; (5) these processes occur not only with regard to human DNA, but to nucleic acids of all organisms, including bacteria and viruses; (6) the pathogenicity of a virus can be enhanced sufficiently to get through human anti-viral immunity by having its nucleic acid (DNA or RNA) frequency response characteristics altered by low levels of non-ionizing ambient electromagnetic radiation -- because the virus with altered electromagnetic properties would be relatively immunologically invisible.

If these propositions are correct, then Jacqueline Barton's low-beta-value experiments on electron-transport rates between metallointercalators of DNA -- wherein she likely created tuned circuits with the embedding DNA frequency cones (See: “Collapse of Spatial Dimensions in DNA Helix-Coil Transition”, given below) -- represent something of a re-enactment of the processes by which HIV (an RNA-type virus) got through human anti-viral immunity. Of no small consequence is the circumstance that nucleic acid frequency response windows apparently fall in the valley of minimum sky noise where our common-carrier frequencies are located, i.e., in the microwave spectrum. For my part, the NIH-CDC account of the origins and transmission of AIDS would become more credible if someone would write a detailed account of why mosquitoes cannot disable West Nile virus, but can disable HIV -- an account which further explains why there is not a huge HIV vaccine research effort afoot based on how mosquitoes disable HIV.

The race is on! Before we know it, the CloNes will have their SynThroids (completely artificial HuMans rechargeable by microwave link). Having to have an immune system is really a bummer, you know. SCIENCE ought to be able to overcome such systemic restrictions in organic processes God engineered with such simple-mindedness -- which unnecessarily set limits, for instance, on ambient radio frequency, microwaves, infrared, and magnetic fields, on how widely genetic recombination can be employed, on how much the ozone shield can be disturbed by natural gas flaring, hot auroras, and high energy lasers, on uses to which EMP devices can be put. The electromagnetic properties of human blood are, for instance… well, if not, like cultures, an obstruction to trade, certainly an obstruction to evolution of technology (see, say, H. Cook, “Dielectric Behavior of Human Blood at Microwave Frequencies,” Nature, 168:4267, pp. 247-8, 1951 or H. P. Schwan, “Electrical Properties of Blood at Ultrahigh Frequencies,” American Journal of Physical Medicine, 33:3, pp. 144-152, 1953). With SynThroids, however, properties of blood and other artificial body tissues (blood is a tissue, at least that is what they used to teach!) won't have such a stultifying effect on technological progress. If GMO-diaspora makes genetically-engineered food stocks a fait accompli and recombinant-induced food allergies make human clones a fait accompli, then the changing radiative environment of the planet (microwave cooking and communications, ABM-HAARP, irradiation at entry and exit to the library, drugstore, grocery store, video rental, airport) will make SynThroids a fait accompli.

How utterly embarrassing! is this children's garden discourse on the nature of time by leading scientists and philosophers (“Physics' Big Puzzle Has Big Question: What Is Time?” by James Glanz, NY Times, 06/20/01, an account of the Seven Pines Symposium meeting, the accuracy of which we have no reason to doubt). The “what” and the “why” of the “big puzzle”, so intimately involved with the reason the relativity- and quantum-theory perspectives on the nature of time profoundly differ, is quite simply that quantum theory was falsified early in the 20th century when Max Born's probability amplitude interpretation of m-valuation in Schrödinger's wave equation became holy writ. By the end of World War II, virtually every discipline in the natural sciences was infected by consequences of this falsification. By now, three-quarters of a century after Born's 1926 suggestion, even the social sciences, music composition, literary criticism, and architectural design may have irretrievably succumbed to the disease. Indeed, immunology itself is not even immune! By mandating that submolecular properties of organisms, described by a falsified quantum theory, have no biological role, cybernetically-oriented biologists have arrived at a theory of organic self-production and self-organization (autopoiesis) which only grudgingly acknowledges subcellular molecular stereochemistry -- thus, altogether protecting the fields of biology and medicine (most critically, genetics) from any violation of the classical limit by encroachment from below. Biological clocks, therefore, cannot be highly correlated with (immunological) signifiers of self-identity via TEMPORAL OPERATORS incorporated into quantum wave properties of DNA. Since this cannot be the case, a biotechnology industry based on hacking the genomes of myriad organisms cannot have mistakenly established its foundational premises on the quicksand assertion that genetic processes have no significant quantum chemistry dimensions.

Yes, I will have to find that paper by Nima Arkani-Hamed, Andrew G. Cohen, and Howard Georgi on the origin of spatial dimensions as viewed through the lens of superstring theory. Thank you for drawing the NY Times article concerning it to my attention (George Johnson, “Back to Basics: How Did Space Get its Dimensions”, NY Times, June 26, 2001). You are correct in noting similarities with our ideas. In fact, the newspaper article uses some of the exact words and images written into the “Brownian Wave Statement” appendix composed in 1980. The basic idea of spatial-dimension-generating particles they are hypothesizing is far from new. Thirty years ago, “Chew's Monadology/bootstrap” interpretation of the elementary particle zoo treated particle interactions as part of “the process by which spacetime is” -- not as occurring in a passive spacetime reference framework. Particle interactions were considered part of the ontological steps by which ponderable space and time come into being. The presently proposed superstring-theory notion of the origin of spatial dimensions is, essentially, a resurrection of Chew's idea, which was roundly rejected three decades ago by the physics establishment and the Science News arbiters of scientific worth who classified such Asian ideas as being “Off the Beat”. Yes, Asian. A quick glance at Kumar Kishore Mandal's A Comparative Study of the Concepts of Space and Time in Indian Thought (Varanasi, Chowkhamba Sanskrit Studies, Vol. 65, 1968) is enough to realize that space as perceived in ordinary experience is treated as derivative and coming into and out of being under certain conditions -- one of these conditions involving what is presently called energy. Of course, this energy has to do with properties of time which are not presently conceived in consensus physics. Moreover, the idea of “planar worlds” connected to form three-dimensional space, as discussed in George Johnson's account of the paper, is not only a fundament of Sakharov's notion of the elasticity of space in a multi-sheet universe, it is the basic idea underlying the traditional Shinto concept of sacred space or Ma (see: Ma, Space-Time in Japan, exhibition catalog, NY: Cooper-Hewitt Museum, 1979) which is the root notion informing the complete spectrum of Japanese aesthetic sensibilities, as applied across the full realm of traditional arts.

Why has the NY Times decided to focus attention on this particular incarnation of a basic idea which has appeared periodically in the scientific literature over the last several decades? Because this time it is presented mathematically and thus may be experimentally testable? No. I do not think so. Chew's ideas were mathematical. Sakharov's ideas were mathematical. Both may have been experimentally testable. And the ideas written into the “Brownian Wave Statement” appendix were mathematical and eminently experimentally testable. This appendix, which describes space coming into and out of being under operator-time, was developed in the course of describing the quantum wave properties of the DNA molecule. The involved mathematical model clearly implied several specific experimental tests which were outlined in the published paper. The description of these quantum wave properties involved a likelihood of “dimensionality collapse” in “resonant soliton coupling”, which collapse now appears testable through the use of “fractal drums”. So, I do not think it was mathematical presentation and experimental testability which led the NY Times to select this material as the basis for an article.

I have now had a chance to study the article on origins of spatial dimensions [Nima Arkani-Hamed, Andrew G. Cohen, and Howard Georgi, “(De)Constructing Dimensions”, Physical Review Letters, 86:21, May 21, 2001]. There certainly is a lot of revealing material here. I would draw your attention first to the following quotation: “The fifth dimension has appeared in the condensed moose [a 'mnemonic for the particle content of a four-dimensional gauge theory'] because the nonlinear sigma model fields allow the gauge field to 'hop' from one site to the next.” I think this hopping of elementary particle fields is related to hopping in the theory of DNA conductivity by more than mere choice of words (see Paul T. Henderson, Denise Jones, Gregory Hampikian, Yongzhi Kan, and Gary B. Schuster, “Long-distance charge transport in duplex DNA: The phonon-assisted polaron-like hopping mechanism”, Proceedings of the National Academy of Sciences, USA, 96, pp. 8353-8358, July 1999).

COLLAPSE OF SPATIAL DIMENSIONS
IN
DNA HELIX-COIL TRANSITION

William L. Pensinger
(circa September, 1999)

a range of beta values

From the perspective of the 1979 mathematical model of DNA in its superconductant state [see: Paine and Pensinger, 1979], it seems obvious why researchers are getting a range of beta values (decay of electronic coupling with distance) with different DNA electron transfer experiments. They think they have a substance with an invariant index of conductivity at a given temperature, which index can be established via measurement. This is not likely to be the case in vitro, any more than in vivo. In the living cell, DNA molecules in different physical locations (nucleus, mitochondria), and therefore in different functional relationships, will have different levels of conductivity. These levels of conductivity will differ with histological type: muscle cells, collagen cells, et cetera. Moreover, each of these reference levels of conductivity will vary according to the changing radiative environment of the given DNA molecule. Why? Because the level of conductivity is a function of: [1] the EXACT mass properties of the sample (meaning experimental duplication requires the same base sequences, the same sugar-phosphate backbone “string length”, the same intercalated molecular attachments, and so on); [2] the frequency cone (frequency, wavelength, waveform, intensity) the sample is subjected to (meaning that the experiments need to be done in a Faraday cage using identical laser devices to deliver the photon); [3] establishment or non-establishment of resonance between the mass properties of the sample (a determinant of well-stacking versus limited-stacking) and the properties of the frequency cone to which the sample is subjected. Already, by 1979, Shteyer et al. [see: Shteyer, Norton, Pilla, and Rodan, 1979] had demonstrated in the laboratory that DNA could be switched on only by the right pulsed frequency cone (very low intensity EMP, that is).
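
To keep the bookkeeping of determinants [1]-[3] straight, here is a minimal sketch in Python; the field names, units, and the resonance test are illustrative placeholders of my own, not quantities taken from the 1979 model.

    from dataclasses import dataclass

    @dataclass
    class SampleMassProperties:              # determinant [1]
        base_sequence: str                   # exact base sequence
        backbone_length_nm: float            # sugar-phosphate "string length"
        intercalated_attachments: tuple      # intercalated molecular attachments

    @dataclass
    class FrequencyCone:                     # determinant [2]
        frequency_hz: float
        wavelength_m: float
        waveform: str
        intensity_w_per_m2: float

    def stacking_regime(sample: SampleMassProperties, cone: FrequencyCone,
                        natural_frequency_hz: float, bandwidth_hz: float) -> str:
        """Determinant [3]: resonance or non-resonance between the sample's mass
        properties and the embedding frequency cone. natural_frequency_hz stands
        in for whatever the mass properties would actually dictate; it is not
        derived here."""
        detuning = abs(cone.frequency_hz - natural_frequency_hz)
        if detuning < bandwidth_hz:
            return "resonant: well-stacking, lower beta expected"
        return "non-resonant: limited stacking, higher beta expected"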

It is likely that in Jacqueline Barton’s low beta value experiments [See: Barton, Kumar, and Turro, 1986, and other papers cited below related to Barton’s research], she created tuned circuits with the embedding frequency cones. Basically, what has been demonstrated by the corpus of experiments on electron-transport rates between metallointercalators of DNA is that the undisturbed molecule has the lowest beta value (decay of electronic coupling with distance). The more well-stacked the probes, the more the excited states are well-populated in the natural DNA bases -- the less, that is, the π-stacks of the molecule’s nucleotide staircase are disturbed -- then, the lower the beta value. Now, given that knowledge, stack resonance effects on top of well-π-stacking, natural orientation, and donor energetics! For any given sample, how could the properties of the frequency cone required to create a tuned circuit be determined? We are not talking only donor energies here, but the whole embedding cone: which means in vitro is not in vivo. Distance dependencies are sensitive to donor energies and to the other properties of the embedding frequency cone. Consider the superheterodyne (and screen grid in a radio tube) analogy: “We can set up our own transmitter very near the guy and blast his receiver with our own transmission. We keep changing the frequency until we happen to hit the one he is receiving on. At the very instant we hit it, the local oscillator circuit in his receiver will howl. It’ll put out a sound wave.” In the present case, the VERY RAPID quench of a glow is the analogue of the local oscillator circuit howl. There will be a very narrow range of high beta values with variations in the properties of the laser light used to blast the given DNA sample -- this is Anthony Harriman’s 40,000 measurements [see: Wu, 1999]; but alas, one can hardly be surprised that sexual bias should subliminally vector preferences relative to intercalated versus pendant donors -- and a sharp low-value down spike (the “howl”) if a given blast happens to hit the resonant “frequency” of the given sample.
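
A toy rendering of the superheterodyne analogy, assuming (purely for illustration) a Lorentzian dip for the tuned-circuit “hit”; nothing here is calibrated to any actual Barton or Harriman data set.

    import numpy as np

    def beta_response(freq_hz, resonance_hz, beta_high=1.0, beta_low=0.1,
                      linewidth_hz=1.0e12):
        """Toy beta-versus-blast-frequency curve: a band of high beta values with a
        sharp down spike (the 'howl') where the blast hits the resonance."""
        dip = 1.0 / (1.0 + ((freq_hz - resonance_hz) / linewidth_hz) ** 2)
        return beta_high - (beta_high - beta_low) * dip

    freqs = np.linspace(3.0e14, 8.0e14, 40_000)      # 40,000 'measurements'
    betas = beta_response(freqs, resonance_hz=5.2e14)
    hit = freqs[np.argmin(betas)]                    # where the down spike falls
    print(f"down spike near {hit:.3e} Hz, beta = {betas.min():.2f}")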

Is there any way to predict ahead of time where the low-value down spike will come? According to the 1979 superconductant DNA model, when the tuned circuit oscillator resonance “hits”, the molecule starts to replicate. That is, the hit will correspond to the transition temperature of the given sample. My guess is that the negentropic resonance input maximizes phonon production, and that this maximization is necessary for the molecule to transit to its critical state. The superconductant DNA model establishes a mathematical relation between the critical temperature and the minimum time for spontaneous localization. In order to predict, given a known critical temperature, where the low-value down spike will come, it would be necessary to: [1] establish a mathematical relation between the minimum time and the beta value; [2] establish a mathematical relation between the minimum time and the frequency-wavelength of the incoming photon.
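
For illustration only, and not as the relation actually derived in the 1979 paper, the prediction problem can be written as two unknown functions around a conventional energy-time scale; the specific form for the minimum time is my assumption, chosen only because it is the standard indeterminacy estimate that associates a time with a temperature:

    \tau_{\min} \sim \frac{\hbar}{k_B T_c}, \qquad
    \beta = f(\tau_{\min}) \quad \text{(relation [1], to be established)}, \qquad
    \tau_{\min} = g(\nu_{\mathrm{photon}}) \quad \text{(relation [2], to be established)}

Given a known critical temperature, inverting g would then locate the blast frequency at which the low-value down spike should appear.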

photoacoustic spectroscopy and DNA’s fractal drum

Explaining why DNA is not a substance with “an invariant index of conductivity at a given temperature” is a much more fundamental task involving issues behind Heisenberg’s indeterminacy relations and how virtual phonon exchange is permitted (as BCS describe) by indeterminacy. This would involve further insight into Dirac spin and the above mentioned minimum time. Indeterminacy is the single-valued-logic Hilbert space shadow of pencils of skew-parallels in m-valued-logic Hilbert space. In the mid-30s, neither Turing nor Gödel would have blanched at this idea. But maybe the notion of skew-parallels requires too large an intuitive leap. Think, then, of skew-perpendicularity in relation to space-charge gradients.

A miniaturized microbarograph would be required for the immediate experimental task, and this device would have to mimic how the photoacoustic spectroscope is an analogue of the superconductant DNA model. Essentially, in listening to photosynthesis [see: Cahen, Malkin, and Lerner, 1978], a chloroplast (which contains DNA, of course) is put in a sealed plastic bag, bombarded with light, causing the chloroplast to change temperature, expand and contract, and send out a pressure wave that causes the plastic bag to vibrate, which the instrument translates into audible sound. This is exactly analogous to the quantum properties, not only of superconductant DNA molecules, but also of π-electron parcels in the π-electron gas environment of the molecule’s π-stacks. Both the parcel and the molecule itself are pulse-code receivers and transmitters. One structure to receive the coherent waves transmitted by the DNA molecule’s π-electron parcel ensemble is certainly the cell membrane; another is very likely the cytoskeleton. If you want to hear the music of the nucleotide pairs, you put Jacqueline Barton's DNA sample inside a fractal drum (experimental analogue of a cell) inside a Faraday cage, bombard it with a photon, use Catherine Even's instrumentation [see: Even et al., 1999; and Weiss, 1999] to read the vibrations on the drum's liquid crystal tympanum, and translate that into audible sound using the apparatus from the photoacoustic spectroscope (a signal-chain sketch follows this paragraph). Do this first with known sequences of nucleotide pairs to map the wave dynamics and you have a fast track way to read the genome. No viral clipping and pasting, and so on. I am sure creating such a device would be no mere afternoon's work, but it seems to me likely doable. You put this together with Lipton's idea about the cell membrane being a liquid crystal semiconductor for reading frequencies [see: Lipton, 1986] and you have a good beginning on a wave-effect computer processor as a light/sound interaction device. Optical logic. Superconducting bio-junctions. All the buzz words. Moreover, in context of discussion of Isaacs' ideas on molecular indeterminacy in the 1997 homeopathy paper [see: Isaacs and Lamb, 1969; and Pensinger, Paine, and Jus, 1997], there is a direct route seen on how to transform this liquid crystal semiconductor into a quantum processor utilizing Post’s m-valued logics. That is where microtubules of cytoskeleton come in, and ideas on Musculpt holography. Quantum tunneling (at synaptic and ephaptic junctions) is modeled by a multiplexed branching of a fiber optic microtubule into a pencil of skew-parallels defined in Hilbert space with Post's m-valued logics. When the optical fiber branches into a pencil of skew-parallels, it, by definition, leaves 3-space (tunnels), enters m-logically-valued n-dimensional Hilbert space, wave-effect processes, then re-enters 3-space at the other side of the neural or perineural junction in parallel with vesicle diffusion: this being an example of partial redundancy of mechanisms responsible for functional specificity and functional integration. Viewed in this fashion, the brain is a device in 3-space for receiving messages from m-logically-valued n-dimensional Hilbert space. The most cost effective laboratory for studying all this is a flotation tank, with attached Musculpt laser projection dome as an experimental model of the dolphin's sonic-visioning system. Dolphins and whales don’t live on this planet -- although they do occasionally visit.
The whole Carl Sagan Hollywood popular science orientation to extraterrestrial life is laughably simple-minded. Space travel. Ponderable space even! Time travel. Passing-time even!
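
A minimal sketch, in Python, of the listening chain just described: drum vibration trace, spectrum, audible rendering. The sampling rate, the synthetic trace, and the audio scaling are placeholders of my own; this is a signal-processing outline, not the instrumentation of Even et al. or an actual photoacoustic spectroscope.

    import numpy as np

    def drum_spectrum(vibration_trace, sample_rate_hz):
        """Spectrum of the tympanum vibration trace recorded inside the Faraday cage."""
        spectrum = np.abs(np.fft.rfft(vibration_trace))
        freqs = np.fft.rfftfreq(len(vibration_trace), d=1.0 / sample_rate_hz)
        return freqs, spectrum

    def render_audible(freqs, spectrum, shift_factor=0.1, duration_s=2.0,
                       audio_rate_hz=44_100):
        """Shift the strongest drum modes down into the audible band and sum them,
        the 'translate into audible sound' step of the photoacoustic analogy."""
        t = np.linspace(0.0, duration_s, int(audio_rate_hz * duration_s), endpoint=False)
        strongest = np.argsort(spectrum)[-8:]            # eight strongest modes
        audio = sum(spectrum[i] * np.sin(2.0 * np.pi * freqs[i] * shift_factor * t)
                    for i in strongest)
        return audio / np.max(np.abs(audio))             # normalized waveform

    # Usage with a synthetic trace standing in for a real recording:
    rate_hz = 1_000_000
    trace = np.sin(2.0 * np.pi * 120_000 * np.arange(rate_hz) / rate_hz)  # placeholder mode
    f, s = drum_spectrum(trace, rate_hz)
    waveform = render_audible(f, s)    # the 120 kHz mode is rendered near 12 kHz, audible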

dimensionality collapse, fractal entrapment, solitons, and molecular diffusion

Resonant soliton coupling between the ambient frequency cone and the natural frequency of the mass properties of the DNA molecule collapses the space surrounding the double helix into a fractal-dimension array which corresponds to the prevailing π-frame sequence of the nucleotide staircase. This spatial, qua “biogravitational”, collapse is an integral part of binding recognition which constrains protein diffusion into bands via processes of fractal entrapment. But how does resonant soliton coupling achieve this collapse in dimensionality of space in the neighborhood of DNA? In the 1979 model of DNA in its superconductant state, the term relating to this collapse was b, the counterforce to the pressure gradient between levels of the π-electron gas environment of the molecule’s π-stacks. What is this b? Generally, it is gravitational acceleration. Could this be the case here also? Could some analogue of gravitational collapse transpire in a biological system? If so, the reduced protein diffusion dimensionality would be induced by acoustically-modified gravity wave modes propagating in the π-electron gas environment of the molecule in just the fashion that hail is organized into bands by acoustically-modified gravity wave modes in the Earth’s atmosphere. Basically, fractal entrapment would be a direct expression of General Relativity and initiated by a complex angular momentum signal communicated during resonant soliton coupling. The importance of this is that, just as it has been insupportable since the Aspect experiments to imagine electron transport as transpiring in simply-connected three-dimensional Newtonian space (i.e., alternative pathways), so now it is becoming increasingly insupportable to imagine molecular motions as transpiring in three-dimensional Newtonian space. Indeed, as early as 1969, Isaacs and Lamb argued that the variables describing molecular motion in living systems at normal pressures and temperatures are subject to Heisenberg’s indeterminacy relation.

Molecular docking viewed through the lens of resonant soliton coupling suggests that the “memory” associated with a Poincaré recurrence has a lot to do with what the notion of distance is in electronic coupling. Self-induced transparency is a function of correlation length, which is essentially what the inverse of beta (decay of electronic coupling with distance) measures. Fermi-Pasta-Ulam vibrational modes are precisely the coherent waves generated by DNA in its superconductant state. Is this an on-or-off process, or are there graded levels of activity in reduction of dimensionality of space in the neighborhood of DNA helices? And what does it mean to measure a distance when the dimensionality of the space the distance exists in is not constant? Similar considerations, of course, apply to time reference. All of this is Relativity stuff! That’s why the wave-equation matrices in the appendix to the 1979 superconductant DNA model involve hypernumber arithmetics beyond Hamilton’s quaternions. That’s why operator-time is invoked. You can’t measure a distance when there is no dimensionality of the reference space at the time of measurement. At any time of measurement. But, you see, this thirty-year-old idea inevitably would synchronistically yield yet one more Copernican epicycle: not only do proteins have street addresses, they have American-style street addresses -- certainly not Japanese style! Of course, proteins still find their way to the prescribed destination by random, thermodynamic, heat-death-type forcing: postmen are not required to keep the classical limit inviolate. What time does to space at measurement: That’s all, folks! Operator-time. You see, when Einstein removed the notion of force as a fundamental in physics, he meant it. Which means everywhere, every time. An address tells you you are at the right place when you get there; it doesn’t get you there. And random motion, even in the aggregate, won’t get you there at the right time most of the time -- only rarely.
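
For reference, the conventional phenomenology behind the “beta value” language, with the correlation-length reading used above made explicit (this is standard electron-transfer notation, not a result of the 1979 model):

    k_{ET}(r) \;\propto\; e^{-\beta r}, \qquad \xi \;\sim\; \frac{1}{\beta}

where k_ET is the electron-transfer rate, r the donor-acceptor separation, and ξ the length over which electronic coupling persists.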

skew-perpendicularity and zero-point donor energy

So, how do you get from force to field to operator-time modifying the properties of space? When the fractal-dimension nest is deep: Heisenberg indeterminacy. Simply stated: Because the work is independent of the path followed between two points, the force and field are conservative. The potential then has a definite value, is single-valued, that is. But when the dimensionality of the space collapses, the two points are not the two points, the path followed is not the path followed, the force and field are not conservative, the potential is not single-valued. In other words, the potential is multi-valued, what is conserved are invariants of classes of forces and fields, the path followed is a pencil of skew parallels, the two points are two sets of points, the space is laminated. High school honors physics: If change of potential equals zero for a certain small path, then the field cannot have a component along this path and must be perpendicular to the direction of such a path. Surfaces over which change of potential equals zero and potential is a constant are called equipotential surfaces. Thus, equipotential surfaces and lines of force are mutually perpendicular. But this is not the case in space subject to topological transformation and fractal collapse under operator-time. Change of potential becomes multivalued, and fiber bundles are required to describe the relationship between equipotential surfaces and lines of force. Skew-perpendicularity! The iceberg Heisenberg and Gödel saw the tip of. The virtual phonon exchange, responsible for the positive screening charge binding Cooper pairs in superconductivity, is permitted by the non-conservation of energy allowed by the uncertainty principle. But uncertainty is just a shadow of skew-perpendicularity! Skew-perpendicularity is charge creation, as Riemann’s “lines of force trapped in the topology of space”: zero-point donor energy. This is a route from spacetime physics back to pregeometry as a “bucket of dust” -- m-valued-logic propositional dust in quantum biochemical processes, that is.
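
Spelled out, the textbook implication chain being leaned on in the single-valued case is:

    W_{A\to B} = \int_A^B \mathbf{F}\cdot d\boldsymbol{\ell}\ \text{path-independent}
    \;\Longrightarrow\; \mathbf{F} = -\nabla V; \qquad
    dV = -\,\mathbf{F}\cdot d\boldsymbol{\ell} = 0\ \text{on an equipotential}
    \;\Longrightarrow\; \mathbf{F} \perp d\boldsymbol{\ell}.

It is precisely this chain that the paragraph above claims fails once the potential goes multivalued and the path becomes a pencil of skew-parallels.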

SUGGESTED READING

In “(De)Constructing Dimensions”, the moose diagram considered “is the N-sided polygon representing a field theory with a gauge group and fermions transforming bilinearly under 'nearest-neighbor' pairs” -- nearest-neighbor pairing being the preferred account of intermolecular electron transfer processes and intra-DNA electron transport. Regardless of the content of the note given above, written in 1999 from the perspective of the 1979 superconductant DNA model, why are nearest-neighbor models the preferred models, in spite of the Aspect experiments on nonlocality? To quote Arkani-Hamed et al.: “Locality is a consequence of the nearest-neighbor coupling structure of our moose, enforced by our choice [emphasis added] of fermion content, gauge invariance, and renormalizability.” And later:

Since we could have obtained this [the gauge field hopping mechanism] directly as a latticization of the five-dimensional gauge theory, we might ask why we need the original moose model at all. The reason is that latticization in the fifth dimension does not cut off divergences from large four momenta: the four-dimensional nonlinear sigma model of [equation] (5) is nonrenormalizable, becoming strongly coupled at a scale ~ 4π fs.

As an aside, I would note that “divergences from large four momenta” are treated in our energy-momentum cascade model as indication of topological operation on space by operator-time, the sort of temporal operation on space that may have an effect on the dimensional structure of space. Renormalization is, therefore, not employed, because it conceals the information required for initialization of computer model nesting modules related to complex angular momentum cascade.

Why is strong coupling an anathema? By way of an answer, let me give you an example of just how objective objective science can be. Quoting from Arkani-Hamed et al.:

We know that at some point as Λ → Λs [Λ and Λs being “dimensionful” parameters complexly related to the simple notion of distance, as is the “beta value” in theory of DNA conductivity], an ecological disaster will occur, dramatically changing the nature of the long distance physics. But it is reasonable to suppose that the cataclysm will happen abruptly at some point Λ ≈ Λs, where both gauge couplings are strong. The only signal at long distances of impending doom is that as Λ → Λs, gets large compared to a. This signals the imminent breakdown of the effective theory because dimensional coupling in an effective theory must not be large compared to the appropriate power of the cutoff. Even though the tree level interactions are still weak at long distances, the theory is losing control of its quantum corrections, a warning that anarchy is about to be loosed upon the world.

This use of the words “disaster”, “cataclysm”, “doom”, “breakdown”, “losing control”, “anarchy” in context of an account of the origins of spatial dimensions seems to me quite significant -- even more significant, perhaps, than the meteorologist's use of the term “pathological phenomenon” in reference to a tornado. The implications are not so much in regards to individual psychology, but relative to collective processes of projective identification and resultant regression of the group mind. A sense of underlying psychological dread and political fear envelops the reader, casting considerable doubt on the disinterestedness with which critical choices were made in elaborating the theory. Was the need for “cutoff”, for instance, a psychological need? Is the insistence on renormalization, so as to avoid divergence, more than mere mathematical aversion to the unterminated infinitude? Is fear of “strong coupling” deeper than the obvious Freudian interpretation and related to the animistic associations of identity transparency? Indeed, in the distinction between “weak coupling” and “strong coupling” (i.e., “strong coupling” being coupling to the point of utter “identity transparency”: experiencing the experience of the “other”, that is) is to be found the essence of the account of the origins of warfare presented in MOON. Strangely, very strangely, “strong coupling” is here made tantamount to anarchy -- when “strong coupling”, another term for “long-range phase correlation”, is the responsible agent of “spontaneous order” in critical and cooperative quantum behaviors like superconductivity. One would think that, if a value-imputation was to be made regarding “strong coupling”, it would be that Nazi goose-stepping is “about to be loosed upon the world” -- not anarchy. What does this “calling the thing by the name of its opposite” -- a characteristic verbal ploy of deconstructionist texts -- tell us about the collective psychological processes involved?

The no-place/space to look for insight is in sensory isolation and sensory overload, for clearly the scientist's commitment to spatial isotropy, simple-connection, and localizability is rooted in his sensory experience. Commitment to sequential time reference, self-identical instants, asymmetry, and homogeneous time-rate has the same experiential origin. Because of obvious military concerns, the most elaborate body of research on sensory overload is in relation to “pilot fixation syndrome”, where altered perceptions of space and time at overload can be responsible for fighter plane crashes. In the critical point of syndrome onset, the fundamental properties of space and time, given to awareness under normal sensory load, begin to collapse at overload -- possibly including collapse of the dimensionality of space. This breakdown in fixation syndrome is regarded as merely a matter of perception, as having nothing to do, that is, with actual distance functions or dimensionality of space itself. Origins of actual dimensions of space itself must be studied back at the beginning of time, because any apparent change in dimensional properties of space subsequent to the beginning of time can only be a matter of altered perceptions of the dimensions of space, as these dimensions have always been invariant since the beginning. Setting aside the fact that the properties of time the scientist is psychologically committed to have their origins in his sensory experience -- to include the act of reading the dial of an instrument used in an experiment (a problematic issue in itself, as the huge body of literature on the quantum measurement problem testifies) -- it is a plain fact that the whole of modern cosmology is predicated upon the notion that there was an objective world before there was any possibility of there being observers to observe it. In the beginning, there was not yet enough linear-time within which observers could have evolved (according to any possible account of planetary and biological evolution -- Darwinian, neo-Darwinian, or any form of catastrophism). Inherent in the very notion of linear-time evolution of the universe is the idea that there necessarily is an “objective world” independent of the observation of it. This “independence” is an enormous piece of scientific mysticism, mitigated only in so far as the “doctrine of final causes” -- long ago rejected by science as mere medieval “mysticism” -- can be invoked and elaborated by having observers far removed, temporally and spatially, from origins of the universe peer back into time over great distances to observe the universe that had no observers to observe it. This scientific credo has a sense similar to Kazantzakis' proclamation: “Man must save God!”

But the only man who can completely save the scientist's God, the “objective world”, that is, is the last man, because only that man poised at the end of time can see all the way back to the absolute beginning -- for only he is far enough away for light from the beginning still to be traveling to him. And if dimensions of space had an origin, then there was a time before space was. To observe this time after the beginning but before spatial dimensions, the observer would have to see in time, not in space -- or in a time-not-in-space. This necessity implies something about the relation of time, light, and consciousness which cannot easily be denigrated.

So, you believe reference to the teleology-eschatology fallacy of medieval Scholasticism is misplaced, irrelevant to contemporary physics. Unfortunately, I cannot agree. Just as mathematicians, post-Gödel, cannot do mathematics, only make a pretense of doing mathematics, so physicists, post-, say, Maxwell, cannot do calculus, only make a pretense of doing calculus. The crisis in physics did not begin with Planck's paper in 1900 on the quantum of action, but in the debate between Newton and Leibnitz on Fluxions and Monads. But this is a subject impossible to talk about, for knowledgeable people think they know all there is to know about the notion of a limit and the roles of initial conditions and linear-time in the differential and integral calculus. “Heisenberg's indeterminacy inequalities were inevitable once Fluxions were employed.” Make this statement, and nothing but snickers will be elicited. Rather than go down that path, let me juxtapose two quotations.

…Hindu philosophy seems to be closer to Plato's philosophy than Buddhism, because for Plato the “tree in itself” exists in a suprasensory world, and all the trees that exist in the world of the senses, the phenomenal world, are, so to speak, copies of that tree-in-itself, “perceptible” copies of that “intelligible” tree. None of the copies, therefore, can fully reflect the perfection of the tree-in-itself. [Philosopher, Jean-François Revel, The Monk and the Philosopher, HarperCollins, 1998, pp. 104-105.]

The total solution of the N-body problem, for all possible initial conditions, can be comprehended rigorously in a rather Godlike or omniscient overview, by formulation as the flow of an incompressible fluid in a (6N - 10)-dimensional “phase space” or state space. Here each stationary streamline represents the entire motion of an N-body planetary configuration. (Each planet, idealized as a Newtonian point-particle, requires 3 coordinates of position and 3 coordinates of velocity to specify its current state, so N particles require 6N coordinates to specify their complete state; but the 10 integrals of Conservation of Energy and of Conservation of Angular Momentum reduce the 6N dimensional state-space by 10 dimensions.) For N = 10, (6N - 10) = 50. Hence we must consider a stationary incompressible flow in a 50-dimensional state-space. [Astronomer, Robert Bass, in Kronos, 1:3, 1975, arguing as to why Velikovsky's thesis on planetary volcanism must be taken seriously: http://www.kronia.com/library/planetary.html ]

I draw your attention to the statement of Bass: “Here each stationary streamline represents the entire motion of an N-body planetary configuration”. (Emphasis added.) This statement captures much more accurately the sense of a Platonic Idea, an archetype, than does the notion of a “perfect tree” offered by Revel. The “perfect tree” interpretation is the academic philosophy version taught in every Intro Course since Abélard took Héloise over his knee -- taught, that is, by academics who have not seriously studied neo-platonic thought, the medieval alchemy and Masonry it gave rise to, or the corpus of relevant esoteric theosophical literature derivative of the Hindu take on “the general case of general cases”, which is an early version of the “set of all sets, including itself”. (A good place to look concerning this is Sri Aurobindo's Plato and Vedic Idealism, written early in the 20th century by this Raj-born Cambridge-trained scholar of classical Greek. The larger issue is whether or not the texts attributed to Plato could have been translated properly by someone still staring at shadows on the wall of the cave, i.e., particular entities in the “objective world”.) The Bass quotation starts off noting “all possible initial conditions”, which does not refer to any particular case, but to the general case. The “entire motion” is the complete 4-dimensional spacetime path (past, present, and future) of the N-body configuration in question. An “N-body configuration” is not a particular entity; it is not an individual body. It is an N-body; it is n-identity-transparent-bodies in configuration space (depicted in Timæus: this “mothering space” which is hardly real, Hylê, purely indeterminate matter, the recipient matrix of nature); it is a “general entity”, an archetype. An N-body does not exist at a particular occasion, but at a collective occasion: it is a collective occasion of experience. This “entire motion” in ponderable spacetime of a collective occasion of experience (“the music of the spheres”) is a “stationary streamline” in the Godlike 50-dimensional state-space. The stationary streamline is one of the lines/edges constituting Plato's regular polyhedra, which metaphorically represent “Divine Ideas” (see Proclus, Introduction to Books II and III of Plato’s Republic, as well as the great works of the English Platonist Thomas Taylor). A particular “maze” of interfused regular polyhedra metaphorically represents a cosmos of collective occasions corresponding to a given set of the set of all sets of possible initial conditions, including itself. Thus is motion of the music of the spheres frozen in a stationary architecture -- to paraphrase Goethe (oops! Friedrich Schlegel, “Architecture is frozen music,” as quoted by Hartley Burr Alexander, Nature and Human Nature) and reveal the Masonic core of metareference designed into medieval cathedrals, leaded glass windows, and plainsong chant. The angels sitting on the head of a pin were not particular entities; they were general entities: therefore, angels.

The obvious question to ask is: Is an N-body boson configuration a collection of discrete entities or is it a general entity? Is there such a thing actually possible as an N-body boson configuration not in Bose-Einstein condensation? The authors of “(De)Constructing Dimensions” describe a “…fifth dimension, dynamically generated by a different set of interactions and with a different set of gauge bosons.” They ask the question:

What is a fifth dimension? Mathematically, any set of ordered points can be called a “dimension”, but physically we need more. Particles should be able to move in the extra dimension; that is, they should carry labels, their coordinates in the fifth dimension, that change as they move in the fifth dimension. Furthermore, there should be a physical notion of locality in the extra dimension. This translates into the requirement of locality for the interactions in the theory. Particles with the same labels have the largest interaction, while particles with very different labels should interact only weakly.

How, why, and wherefore is it so that the activities of the elementary particles that create the dimension of the space transpire within the dimension thus created by those very activities? Is this element of self-reference an attribute of space itself; does it reflect on the subjective experience/needs of the physicist; does it tell us something fundamental about the nature of time? And where does the “Furthermore…should…” come from? No reason is given for the furthermore/should properties of the extra dimension: they are like self-evident truths of what “physically we need”. Judgement as to what “physically we need” is based upon the sensory experience of the scientist, his sensory experience under normal sensory load -- not at sensory overload or in sensory isolation. Are the very properties of space itself to be judged exclusively on the basis of normal-load percepts? In fact, there appears to be no “physical notion of locality” under sensory overload, as the Los Alamos studies of pilot fixation syndrome clearly suggest. There also appears to be no “physical notion of locality” in sensory isolation, as the research of John C. Lilly, M.D., has clearly suggested. Deconstructionist physics supposedly studies origins of spatial dimensions back at a supposed beginning of the universe in conformance to a post-modernist experimentalist orientation designed to keep the classical limit intact and quantum-relativity theory as far away from everyday life as possible. What would happen to what psychologically we need if Bose-Einstein condensation was allowed to affect the cognitive sciences, for instance? In order not to find out, physicists long ago learned to carry a mantra: “There is no such thing as a collective occasion of experience. There is no such thing as a collective occasion of experience. There is no such thing as a collective occasion of experience…” Please note: “Particles with the same labels have the largest interaction…” Am I crazy to think this bizarre? Is it not bizarre to call identity transparency “largest interaction” so as to maintain pretension of distinction even in the unary?

Here it is necessary to tunnel through a potential barrier. Columbia University optical physicist, Rudolf K. Luneburg, summarized his experimental findings in the 1950 paper entitled, “The Metric of Binocular Visual Space” (Journal of the Optical Society of America, 40:10, p. 631) with the statement that “… there is no absolute localization even in binocular vision”. This paper has been largely ignored for 50 years because of what psychologically we need. In 1980, we made the following statement in “Some Preliminary Considerations toward Development of a Mathematical Model of the Autogenic Brain Discharge as Spontaneous Localization in Quantum Measurement”:

Luneburg's experimental findings concerning localization in visual space as learned behavior very strongly support this notion of spontaneous localization in quantum measurement as autogenic discharge. If ability to localize (in visual space) the needle's position on the gauge of a measuring instrument depends on consensuated (a collective occasion of experience) psychological factors, as Luneburg's findings clearly imply, then this (cooperative) localization of the needle's position on the gauge must take place in the brains of the participants or in association with some other substates of conscious awareness.

By presenting, in the same 1980 paper, a theory of the “Relative-State Activation of the Brain” through “DNA-Mediated Quantal Localization and Fusion in Cortical Function”, we explicitly began developing an account of the origins of spatial dimensions in the act of quantum measurement (as detailed in the appendix to the paper). In more contemporary parlance: a fractal boundary exists by virtue of the attempt to measure it; but this does not mean that the fractal dimensions involved are not spread out over the whole universe, the whole of space. By 1999, as stated in the above-given note entitled, “Collapse of Spatial Dimensions in DNA Helix-Coil Transition”, we could speculate that:

Quantum tunneling (at synaptic and ephaptic junctions) is modeled by a multiplexed branching of a fiber optic microtubule into a pencil of skew-parallels defined in Hilbert space with Post's m-valued logics. When the optical fiber branches into a pencil of skew-parallels, it, by definition, leaves 3-space (tunnels), enters m-logically-valued n-dimensional Hilbert space, wave-effect processes, then re-enters 3-space at the other side of the neural or perineural junction in parallel with vesicle diffusion: this being an example of partial redundancy of mechanisms responsible for functional specificity and functional integration. Viewed in this fashion, the brain is a device in 3-space for receiving messages from m-logically-valued n-dimensional Hilbert space. The most cost effective laboratory for studying all this is a flotation tank, with attached Musculpt laser projection dome as an experimental model of the dolphin's sonic-visioning system.

Thirty-five years ago, John C. Lilly, M.D., was scientifically crucified for exploring the origins of spatial dimensions in sensory isolation. This exploratory objective was an explicit aspect of the involved research program. The attack by institutional scientists on one of their own who had decisively moved in a new direction was an act of collective hysteria with many psychological, social, and political dimensions -- and with no scientific justification. One of those involved, a scientific “personality” and popular popular-science writer, even attempted to create a “science court” to try scientists who endorsed pseudoscientific ideas, ideas like those of Emanuel Velikovsky. By the time this personality died, however, one could hardly find a reputable scientist who believed in “uniformitarian” evolution or solar system dynamics. This same personality played an important role in blocking application of quantum and relativity perspectives within the atmospheric sciences, thus facilitating chaos theory applications and setting American meteorology back perhaps by decades. Admittedly, the idea that spatial dimensions have origins, and that quantum properties of the act of measurement and cognition thereof are involved in those origins, can be disorienting to those psychologically committed to the usual Newtonian perspective. Traditional notions of boundaries, separateness, simple-identity, locality, logical-accommodation schemata, and linear-time sequencing are not easily outgrown. Indeed, when cell biologist, Gilbert Ling, developed the polarized multilayer theory of cell water, brilliantly dissolving the traditional notion of a cell “wall” into little more than a phase boundary between normal water and structured water, he retained the notion of “nearest neighbor coupling” in the involved electron-transport processes -- just like in the “moose” described in “(De)Constructing Dimensions”. And, astoundingly, he used as his chief explanatory simile for this nearest-neighbor coupling process how, traditionally, on the Great Wall of China, messages were passed from one watchtower to the next. Notions, once banished, continue to cling to the mind! Ling's work, clearly, was rejected by his field in an act of collective hysteria on a par with that directed against John Lilly. More recently, it appears that biochemist, geneticist, and radiation biologist Mae-Wan Ho is being subjected to a similar act for similar reasons: her explorations of entanglement, coherency, and the properties of liquid crystals in relation to genes can only be disorienting to those psychologically committed to the usual Newtonian perspective. That the issues involved in giving rise to each of these instances of collective hysteria are all interrelated is most graphically illustrated by the fact of fractal image generation in the sensory isolation/flotation tank.

During the early 70s, Joseph Bridger pioneered practice of Autogenic Training in the isolation/flotation tank. Using the terminology of Autogenic Therapy (AT) to describe the work of both Lilly and Bridger, one could say the following. In the condition of deep relaxation which the body eventually obtains in the gravity-free environment of an isolation/flotation tank, the brain will shift into the trophotropic mode of function. Under normal sensory load, the brain remains in the ergotropic mode. Autogenic shift between these modalities of brain function can be facilitated not only by isolation/flotation tank use, but also by Autogenic Training standard exercises, by other adequate meditative practice, or, as Lilly came to discover and which has become well known, by use of psychotropic drugs. In the ergotropic state of active innervation and normal sensory load, the brain is unable to engage in adequate clinically manifest levels of spontaneous electrochemical discharge of “records of stress”. “Records of stress” is AT terminology for what we now regard as “altered quantum frequency parameters of neuronal and perineural DNA” accumulated in course of everyday life in the modern world -- a world filled with stressors, car accidents, microwave transmitters, biological clock shifters, chemical pollutants, and so on. Discharge of these records of stress, called autogenic discharge, involves electrochemical activity on the cellular, molecular, and submolecular levels within the central nervous system, which, in turn, affects other bodily systems. The brain can engage in clinical-intensity levels of autogenic discharge only if it is allowed to transit deeply into the trophotropic state. This transit is greatly facilitated by practice in the isolation/flotation tank environment. It is also greatly facilitated by practice at the opposite pole of sensory load: sensory overload in a flight simulator.

As a personal note, I would observe that John Lilly's earliest scientific work was conducted at Wright Field during WWII, where he developed electrophysiological measurement instrumentation for studying the physiological effects of what was then high-altitude flight without pressurization. At this same time, my father was flying B-17s over Germany -- mission after mission. Later, following the war, my father suffered two collapsed lungs because of so much of this high-altitude flying, and was taken off flight status. This put him on a revised career path which led to his becoming one of the honchos on the flight simulator project at Wright-Patterson AFB through which autopilot systems were developed and tested. I, therefore, as a high school student, had Sunday dinner with the scientists who created the flight simulators and was given many opportunities to crawl in and out of them, thus, early in life, receiving a taste of sensory overload and its effects on perception of space and time.

Electrochemical autogenic discharge processes, which can be recorded by the various forms of electrophysiological measurement, are subjectively experienced, because these discharges involve neurons firing in the brain. The neurons that fire are otherwise employed in controlling (according to prevailing models) varying parts and functions of the body and associated cognitive processes, so that when these neurons are involved in abreactive autogenic discharge processes there are effects subjectively felt in the involved parts and functions. Any part of the body, any bodily function, any psychological function can be affected. Visual processes are at greatest issue in the present discussion. The best introduction to this subject is Wolfgang Luthe's article entitled “The Clinical Significance of Visual Phenomena During the Autogenic State” (Autogenic Training: Correlationes Psychosomaticae, N.Y.: Grune and Stratton, 1965). Many variables are involved in determining the nature of the visual phenomena encountered, as subjective content of the discharges is thematically organized by a “centrencephalic safety discharge mechanism” associated with Wilder Penfield's centrencephalic system and the RAS, the reticular activating system. There are stages of elaboration as the practitioner experiences more and more complex abreactive discharge processes. What is not described in detail in this article -- which was based on an enormous body of clinical case histories brought together over decades of neuropsychiatric practice, going back to the 1920s -- are what Luthe called “bright light states” and the afterimages of photisms experienced by very accomplished practitioners, once the bulk of abreactive electrochemical unloading (“neutralization” in the parlance of AT) has transpired (involving, according to our present understanding, the shift of altered intra- and interneuronal DNA frequency windows back to their normative values). A long personal discussion of this topic was had with Dr. Luthe at the Kyoto Psychosomatic Medicine Congress in 1977, followed by an hour-long symposium multilogue also involving Yuji Sasaki (then Japan's foremost authority on the electrophysiology of Zazen) and several prominent long-practiced elderly monks from Daitokuji Temple (mediated by a very accomplished simultaneous interpreter provided by the Congress Secretariat). Photismic afterimage is fractal image generation in autogenic discharge, the experience of which was surely the origin of Goethe's theory of color. After the photism (generally golden light specks in one of myriad array patterns) comes the instantaneous flash of a fractal image in raging color. In time-slow-down states (increased baud rate of consciousness) the fractal image is held in awareness longer (densification of the time; expansion of the “present moment”).

The photism and the fractal afterimage are the subjective blowback of autogenic discharge. As “neutralization” proceeds further and further, the discharge activity becomes thematically less and less involved with personal factors and more and more related to general properties of organisms -- later yet, more and more related to general properties of existence, to properties of “general entities”. A stage comes when one abreacts on a photism-pattern display. This occurs in a deep time-slow-down state. The fractal image cascades through itself: ump! ump! ump! ump! ump!… Later yet, one gets trapped deep in the origins of the cascade nesting dynamic and is faced with the problem of finding a way out. Will my “I” ever be again? The Brownian wave-statement appendix is a general essay on “ways out” and “ways back”. All ways out involve creation of dimensions of space, a logical walk back from spontaneous quantal fusion through stages of autogenic brain discharge as spontaneous localization in quantum measurement. Each ump! is a “divergence in 4-momenta”, an entrance upon another scale level in the fractal nesting matrix. One quickly learns that localization of modes (i.e., non-holographic) in fractal entrapment (remember one is trying to find a way to get back from fusion, a way to get fractally entrapped!) is a function of single-valued measurement (i.e., the spacetime scale of measurement is not being changed relative to the scales of that being measured; that is, the time-step and the grid-length to which the measuring instrument is calibrated remain fixed). Fractal entrapment is an m-logically-valued process being measured as if it were single-logically-valued. In order to get back from fusion, one must discover how to thus get fractally entrapped. One learns to do this by becoming a “time giver”, by discovering that consciousness gives the time. Any “time giver” sets the space -- is a “space giver” by virtue of being a temporal operator. One learns to “turn the knob” on the temporal operator rheostat.
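To put the fixed-grid point in conventional terms, here is a minimal box-counting sketch (assuming the middle-thirds Cantor set as a stand-in fractal; the code and the example are purely illustrative and are not anything specific to the autogenic material): measured at one fixed grid-length, a fractal yields a single number; only by varying the scale of measurement relative to the thing measured does its multi-scale structure show itself.

    import numpy as np

    def cantor_points(depth=10):
        # Endpoints of the middle-thirds Cantor set, resolved to 'depth' levels.
        pts = np.array([0.0])
        for _ in range(depth):
            pts = np.concatenate([pts / 3.0, pts / 3.0 + 2.0 / 3.0])
        return pts

    def box_count(points, eps):
        # Number of grid cells of length eps occupied by the point set.
        return len(np.unique(np.floor(points / eps).astype(np.int64)))

    pts = cantor_points()
    for eps in [3.0 ** -k for k in range(1, 8)]:
        print(f"grid-length {eps:.5f}: {box_count(pts, eps)} occupied boxes")
    # A single fixed eps gives one number; the scaling law N(eps) ~ eps**(-D),
    # with D = log 2 / log 3 for this set, appears only when eps is varied.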

There can be no incontrovertible experimental verification, reproducibility, or consensual validation in the artifice of a short-term clinical setting. The physical changes associated with autogenic shift of the base state of consciousness transpire almost imperceptibly over a period of years of sustained effort -- even if there are rapid phase transitions at thresholds. Moreover, shifts made are not necessarily persistent or permanent. This is why realization of the isolation/flotation tank proposal is required: it would provide an environment for the practice that is at the same time a device able to record and store complex electrophysiological measurements on an ongoing basis over a period of years. The same is, of course, true of the “smart dress” in its dancewear and athletic-garb incarnations.

It has been known since the early 70s that the bottom line on mechanisms of action of psychotropic drugs involves the same sorts of electron-transport processes which underlie autogenic brain discharges -- through which spontaneous localization at quantum measurement is clearly involved in origins of spatial dimensions. So, what did the collective hysteria which scientifically crucified John Lilly achieve? It prevented experimental study of origins of spatial dimensions -- and thereby blocked insight into the nature of operator-time, the relation of operator-time to logical operators, the manner in which m-valued logics are to be crossed into Hilbert space in construction of multi-sheeted laminated spacetime (without which superstring theory has no real basis). It pushed sensory isolation research to the fringe and left studies of sensory overload primarily to military application -- thus making it impossible to scientifically study whether or not what “physically we need” in the way of spatial dimensions is actually a psychological need specific to the sensory load characteristics of enculturated states of awareness at the baud rates of consciousness peculiar to recent human life on planet Earth. It ensured that the ontology-genesis-simulation properties of substances like ketamine (through effects on electron-transport processes transpiring at synaptic and ephaptic junctions) could not be systematically studied in the proper experimental settings by highly trained scientists, thus handing that study instead to the haphazardness of the virtual reality development community and the Hollywood public fantasy molders who make films like “Lawnmower Man” and “The Cell”. There can be little doubt that there is an enormous difference between psychotropic drug-induced experience of the creation of spatial dimensions and non-drug-induced, autogenic-discharge-associated experience of the creation of spatial dimensions. Why? Because, in all likelihood, psychotropic drugs shift intra- and interneuronal quantum wave properties away from the normative, while the primary role of autogenic discharges is to shift those same properties back to the normative (the similarity in immediate subjective experience resulting from deautomatization of the spatiotemporal-dimensional construct maintained on a submolecular level by the central nervous system). This likely enormous difference is a situation readymade for controlled experimental studies, studies which have been prevented by the collective hysteria that scientifically crucified John C. Lilly. This hysteria is much older than John Lilly, however. Hysterical processes related to these exact same issues transpiring in collective attention cathexes and their unconscious substrates predated both world wars. They have a long history; they have major sociopolitical consequences; and they surely have a bright future.

The speed of light changed? I love it! More evidence supporting our 30-year-old notion that the speed of light is m-valued. Wait until they discover m-valued limiting accelerations and time rates of change of acceleration!

Joseph Chilton Pearce's book, The Biology of Transcendence (Rochester: Park Street Press, 2002), is most interesting to me in that he has somewhat returned to the themes of The Crack in the Cosmic Egg: essentially, the attack on efficacy of culture qua culture and its enculturation process (this attack being an important departure from Freud's false notion of “non-repressive sublimation”, an apologia for the nation-state, which Jung embraced in his writings on politics, and which was the central assumption of Brown's Love's Body and Marcuse's Eros and Civilization). Pearce has, however, softened the attack on enculturation due to his focus on early childhood development in the years since writing Crack. I fully agree with him that our higher functions are derailed by psychological projection and that these projections become cultural counterfeits of the unavailable higher functions -- and that our myths, religions, and cultures, thereby, are sustained in the violence engendered by the projections they are an expression of. But he says higher “potentialities”, which puts the matter in a linear-time-bound framework, an evolutionary context. This, I have trouble with, as it conceals the hows and whys of the collective projection (i.e., projective identification on the level of group mind) which he does not actually explicate. Let me see if I can say something thought-provoking on this without writing a long essay.

Pearce says: “…a negative judgment in any form ruptures relationship”. This, to me, contradicts his first imperative of nature, i.e., “no model, no development”. There can be no model without the possibility of negative judgment, because negative judgment is necessary to the logical law of non-contradiction (i.e., no A is not-A), without which there can be no logical structures defined on the basis of relationships (i.e., there could be no distinguishing between a given relation-structure and another relation-structure, that is, there could be no model, and no development absent a model). Negative judgment is required to register an instance of the fallacy of contradiction. Pearce assumes linear-time and hence evolutionary development. This assumption necessitates the further assumption of the inviolability of the logical law of non-contradiction, which demands the validity of negative judgments, without which there can be no models as relation-structures, no linear-time line with instants absolutely distinguishable one from another, and hence no development. If there is to be evolutionary development, a negative judgment must not necessarily rupture relationship, must not necessarily disturb relation-structure.

This is true only under binary logic (not under m-valued logics of order greater than 2), which is tacitly assumed in linear-time-bound modes of comprehension. The notion of evolution necessarily involves linear-time. Linear-time is not definable absent the logical law of non-contradiction. I am sufficiently a believer in the “non-repressed unconscious” as transfinite sets organized under m-valued logics that I disparage (which is a negative judgment required for evolutionary development) all efficacy of enculturation and do not attribute a positive role to models, imprinting, or development. With Plato, I believe that all learning is remembering (anamnesis) from the multivalued reference space (frequency domain interpreted under m-valued logics, not probability amplitudes). I, therefore, am skeptical of the whole evolutionary treatment of brain structures and functions, which in large measure regards quantum-level brain properties as non-essential, ante-emergent, pre-secondary, prior-derivative, un-decoherent or epiphenomenal in one sense or another.
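The formal point can be made in a few lines (a minimal sketch using Kleene's strong three-valued tables purely as a stand-in for “order greater than 2”; nothing hinges on that particular choice): restricted to two values, A AND NOT-A comes out false in every case, but admit a third value and the conjunction can take an intermediate value, so non-contradiction no longer holds as a tautology.

    # Kleene strong three-valued connectives: 0 = false, 0.5 = indeterminate, 1 = true.
    def neg(a):
        return 1 - a

    def conj(a, b):
        return min(a, b)

    # Binary case: A AND NOT-A is false for every assignment.
    print([conj(a, neg(a)) for a in (0, 1)])        # [0, 0]

    # Three-valued case: the middle value breaks the tautology.
    print([conj(a, neg(a)) for a in (0, 0.5, 1)])   # [0, 0.5, 0]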

It is unfortunate that Pearce gives so few technical references to his commentary on “neural pruning” and the myelination process “at about age 15” -- as his interpretation is couched solely in relation to the developmental evolutionary model of brain structure and function. From my first studies of Piaget (at urging of an old buddy from SRA at MACV-HQ, who later became a specialist in early childhood development and “master teachers” at the elementary school level) I have always felt that his stages of cognitive development roughly correspond to an inversion of what is required to enter and sustain states of identity transparency (which are logically m-valued). The neural prunings that Joe describes may be the organism self-immolating its capacity for eidetic m-valued cognition under impress of enculturation, beginning in the womb and ending in myelination. M-valued cognition would require an extreme plasticity in the reading capacities of neural populations and networks (neurons reading the m-logically-valued frequency domain). I believe that a similar effect of enculturation underlies much idiopathic epilepsy, with origins in suppression of non-linear higher-order time awareness. But I have also seen the perineural tissue treated as the seat of an analog data system in the brain, which would contradict both this and Pearce's interpretation. In which case there may be room to argue that the p-electron parcels associated with intraneuronal superconductant DNA are the ultimate interface with the frequency domain, not the perineural tissue. Presumably there are many technical references in the two Allan Schore citations made by Pearce which I could use to begin evaluating this area. (Affect Regulation and the Origin of the Self: The Neurobiology of Emotional Development. Hillsdale: Lawrence Erlbaum, 1994; and “The Experience-Dependent Maturation of a Regulatory System in the Orbital Prefrontal Cortex and the Origin of Developmental Psychopathology,” Development and Psychopathology, 8, pp. 55-87.) Sub-clinical autogenic brain discharge phenomena may have as one of their functions to undermine the effects of enculturation. Massive electrochemical unloading in autogenic abreaction may be a near-convulsive attempt of the organism to demyelinate so as to re-establish capacity for m-logically-valued cognition. Multiple sclerosis may result in highly empathic individuals (strong tendency to identity transparency) impelled to recapture m-logically-valued states who, nonetheless, are unable to voluntarily throw away identification with the single-valued egoic self.

I believe that the collective projection leading to cultural counterfeits of higher human functions has its origins not in repressed potentialities, but in imposing cognitive “development” of binary-logic-defined models through emotional and mental imprinting. This problem became acute with rise of the nuclear family system and monogamy, because the involved monotonic identity construct (lack of a multiplicity of role models) militates strongly against the inherent (quantal brain mechanisms) capacity for identity transparency and that state's associated specific m-logically-valued modes of apprehension, feeling, and cogitation -- which are hypertemporal in nature. The quantal properties of the brain, which do all the information processing, are denied through consensual impositions of the group and exercise of force or threat thereof in order, largely, to institute and maintain social role stratification. This sociopsychobiological denial generates the projection. Identification with the quantal contents thus projected establishes “objectivity” as cultural counterfeit maintained through collective attention cathexes (i.e., the social structure of attention), serviced these days primarily by the media and modulated relative to processes of group hysteria which culminate in warfare (thereby giving access to the most intense forms of REGRESSED identity transparency).

Absolutely I am biased. Absolutely I cannot tell the difference between science and politics. Anyone who believes that science is more “objective” than the news media has got a real problem with his perceptual functions. This gives me a hint as to how to understand the import of your accusation, for to me the deaths of tens of thousands of people do not fall under the rubric of “bias”. But then, since so many scientists have adopted the conventions of Chinese logic without acknowledging that fact, anything is possible, I suppose.

For instance, consider A. I. Miller giving an account of Schrödinger’s wave equation (in “Erotica, Aesthetics and Schrödinger’s Wave Equation”, in It Must Be Beautiful, Graham Farmelo, editor, London: Granta, 2002, p. 98 and note 23, p. 265):

But what if the marble is an electron? According to wave mechanics, the falling electron can be anywhere because its wave function is spread out over all of space. The marble, on the other hand, is localized right from the start.

Roughly, this is because an object’s wave properties become less important the heavier it is. Consequently, and luckily, we are localized.

If you want to know what a scientist really thinks, read his popular science expostulations, as these are not so well worked over as to remove all the Freudian slips. These coupled statements of Miller exemplify the rotational logic of the tai chi symbol as well as any other example I can think of. The words “from the start” are a dead giveaway, and “luckily” surely reveals how unbiased science is. But, to show this is more important than just science writer stuff, let me quote from Hugh Everett before going into Miller’s logic: “…since physical objects always appear to us to have definite positions” (Everett, Hugh, III. “Relative State Formulation of Quantum Mechanics,” Reviews of Modern Physics, 29:3, July 1957). This was a most pivotal statement in one of the most important theoretical papers in postwar quantum theory (see: For the Mirror is Not the Glass) -- and it fully agrees with Miller’s perspective on the marble.

Neither Miller nor Everett makes any attempt to justify these statements about “localized” objects and “definite positions”; the statements are made with the same matter-of-factness as statements once were made about the Earth being flat or at the center of the solar system. But Max Born’s interpretation of the wave-function of Schrödinger’s equation as a probability wave, which is assumed valid throughout Miller’s account, was arrived at by Born because, if the wave-function were not treated as indicating probabilities, then it could not be so that “objects always appear to us to have definite positions”. The tacit assumption, “from the start”, after all the convoluted arguments, “luckily”, is the reason given for the conclusion arrived at -- and so the “tai chi” wheel turns.
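For reference, the standard textbook forms at issue: Schrödinger's equation governing the wave-function, and the Born rule -- the interpretive move under discussion -- which reads the squared modulus of that wave-function as a probability density:

    i\hbar \, \frac{\partial \psi}{\partial t} = \hat{H}\,\psi , \qquad P(x,t) = |\psi(x,t)|^{2} .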

Moreover, if you point out contrary evidence, the physicist’s eyes glaze over. Luneburg, for instance, has been ignored since before Everett wrote his paper: “…there is no absolute localization even in binocular vision.” (Luneburg, Rudolf K. “The Metric of Binocular Visual Space,” Journal of the Optical Society of America, 40:10, October 1950.) Optical physicist Luneburg demonstrated in his laboratory at Columbia University that visual space is a non-Euclidean metrical space, with a limiting velocity. He demonstrated that visual processes subscribe to the Lorentz-Fitzgerald contraction, and that the “psychometric” distance function of visual space varies with learned psychological factors. These experimental findings directly contradict Miller and Everett, and Born before them -- and have been ignored for 52 years, presumably because the Earth is still flat.
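For orientation: Luneburg's claim was that binocular visual space is a Riemannian space of constant negative curvature. A constant-curvature line element of the general kind involved can be written (this particular coordinate form is merely representative, not a quotation of his paper) as

    ds^2 = \frac{dx^2 + dy^2 + dz^2}{\left(1 + \tfrac{K}{4}\,(x^2 + y^2 + z^2)\right)^{2}} , \qquad K < 0 ,

and the Lorentz-Fitzgerald contraction referred to is the familiar L = L_0 \sqrt{1 - v^2/c^2}.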

In discussing The General Theory of Classes explicated in Russell and Whitehead’s Principia Mathematica, Ignacio Matte Blanco says, “Reflection, therefore, shows in an indisputable manner that what in ordinary life is, for everybody, an individual, is, in logical terms, only a zone or point of intersection of propositional functions” (The Unconscious as Infinite Sets, London: Duckworth, 1975, p. 30). This position clearly represents a “bias”. The bias is that of Advaita-Vedanta: the chief function of Kala (a logical operator) “is to establish intimate contact between Prakrti and Purusa at the time of dissolution” (Kumar Kishore Mandal, A Comparative Study of the Concepts of Space and Time in Indian Thought, Varanasi: Chowkhamba Sanskrit Series, 1968, p. 21; see also Sir John Woodroffe, The World as Power, Madras: Ganesh, 1974, for detailed explication of Kala as a logical operator: i.e., operator-time). In other words, on a psychological level, then, my ‘I’ is nothing but the intersection of Prakrti and Purusa, two propositional functions. The logic we choose, apparently, has something to do with whether or not objects have “definite positions” or “localized” individualities (simple identity as opposed to n-bodied non-simple identity) -- for type of logic (binary or m-valued) is most certainly part of the learned behavior determining the “psychometric distance function” of the visual space we choose to apprehend objects in. This is part of the reason why I say that Born’s choice of “probability amplitude” over “m-valued logics” (both on the scene in July of 1926 when the choice was made) for interpretation of Schrödinger’s wave-function was a political decision, a political decision with disastrous consequences: World War Two (see: Echo of the Mockingbird). The real problem is that physicists, with few exceptions, suppress the feeling function, thereby have access only to a regressed form thereof, and thus have very little in the way of subjective experience: they experience only enculturated percepts and as a psychological defense mechanism deny the existence or validity of any other. This collective behavior, this denial, in my judgment, is a political act which has been directly responsible for the deaths of hundreds of millions of people.

Everett lived about a mile from a nursery I dug trees in off and on for a decade. He had landscaping about his house like every other Northern Virginia suburbanite. At one point, I approached him concerning my perspectives on mathematically modeling autogenic brain discharge phenomena. I proposed that the so-called “collapse of the wave-function” and the autogenic brain discharge were actually identical, that spontaneous localization and fusion take place relative to the quantum chemistry of brain function. Everett was not a happy man. We both were military brats and I had immediate insight into why he had “dropped out” into the Pentagon. The father figure is a great problem for the intellectually adept military son: Everett had three of them, including his Ph.D. preceptor and Niels Bohr. Public evidence of this is provided in the end notes to Everett’s 1957 paper. The full brunt of Bohr’s obsessive authority was clearly brought to bear on young Everett to get him to toe the line -- as his dissertation, had Everett been allowed to follow his full intuition unmolested, could have been disastrous to the consensus view imposed by Bohr. I do not believe that “…since physical objects always appear to us to have definite positions” was a statement Everett wrote free of duress -- if he wrote it at all. This was the transitional statement between the section of the paper explicating Everett’s original intuitive breakthrough and that in which John von Neumann’s methodology was applied to demonstrate that the “multi-worlds interpretation” was equivalent to the interpretation embraced by Born and Bohr.

By contrast, consider extracted comments by someone who assuredly did not suppress the feeling function, the painter Rice Pereira (The Nature of Space, Washington, D.C.: Corcoran Gallery, 1956, one year before publication of Everett’s paper; also greatly recommended is her characterization, on the basis of direct perception, of the multi-sheeted structure of hyperspace, which she called the “layered transparent”, available in her “The Transcendental Formal Logic of the Infinite”):

As the picture of this world view of nature, analyzed and reduced to mathematical laws, moves on in perspective, man got closer to the horizon with each succeeding century… Through the centuries of progress, as man grew more familiar with his surroundings, a contradiction becomes imminent between the optical diminishing picture of the world view, and man’s expanded knowledge… The eye, or the way the eye “sees” the world, has a vanishing point similar to perspective. The logic of perspective cannot be reconciled with the substance of experience. The discrepancy between the optical impression and the inner knowledge about the thing perceived detracts energy from the center of the mind… When the center of gravity is disturbed or shifted in the mind, man loses his balance… If the thing perceived cannot be united with its substance, it remains external, or exterior to reality and to inner contemplation… The world of sensible things exists only in an infinitesimal “present”. The real picture is formulated in the mind through the image which cognition makes possible. Unless the two worlds -- inner and outer -- can be superimposed and sustained in the mind, thereby giving depth and solidity to the substance of experience, man will lose his balance when confronted with the irrational, incomprehensible space of the infinitely large. The development of a society depends on its ability to participate in space… If a society possesses the seeds of a new orientation in space, but is not prepared to participate by establishing frames of reference and a center of gravity to hold the balance in tension, that society will be caught up in a dynamic of uncontrollable forces. The portion of space which goes off balance will act on the society unconsciously, and man will become victim of its action.

With the political decision of Max Born and his cohorts to interpret Schrödinger’s wave-function in terms of probability amplitudes, rather than m-valued logics, the human species became just such a victim.

I’ve now read a lot of Wolfram’s notes and have gotten well into his text, A New Kind of Science (Champaign: Wolfram Media, 2002), but I don’t like to read a book like that without contextualizing it, so, beyond reading about John Walker Lindh, I am at the same time reading It Must Be Beautiful: Great Equations of Modern Science (Graham Farmelo, editor, London: Granta, 2002) -- so far, the essay on Schrödinger’s wave equation, and the one on Dirac’s equation -- and also Giorgio Agamben’s book, Remnants of Auschwitz (Daniel Heller-Roazen, translator, N.Y.: Zone Books, 1999), a postmodernist essay on archives, testimony, shame and witness, and the walking dead, those in the last stages of dying of starvation whom other prisoners at Auschwitz called “Muselmann”: Muslims. There was apparently a certain randomness about who became a Muselmann and who did not. There was no simple rule identified determining who would fall and who would remain standing, whose block on the SS jailer’s chart would remain white and whose become black. Maybe there was such a rule, a simple generative rule, but it was never discovered. Reading Wolfram, I feel like a potential Muselmann myself, sure to starve for the lack of something essential. I find it quite without nourishment how he equates randomness with complexity and pattern with relative simplicity. To me, any given random is random only relative to some order of logical value, and any given pattern is patterned only with respect to its correspondent order of logical value. To talk about an absolute random or a completely obvious pattern seems to me to arrogate the prerogatives of one or another one-godhead, or at least to retain the absolutist notions of Newtonian space and time, even if these notions, implicitly carried in Wolfram’s interpretation of the import of cellular automata, are later in the book applied to General Relativity. He argues that what he identifies as random is really random, and not simply subjectively random, because the most sophisticated statistical tools and number generators concur with his randomness judgment -- as if those statistical tools and involved notions of number were not epiphenomenal of the order of logical value employed by his subjectivity to reach the judgment of randomness, and that, therefore, they are non-derivative corroborative “testimony”. Of course, he dismisses as increasingly meaningless orders of logical value greater than two; thus, at the outset, are contending superordinate perspectives eliminated, even though the principle of nesting is central to the complexity his cellular automata generate, to chaos theory, to fractals, to Cantorian spacetime, and so on. Schrödinger and Dirac never concurred with the interpretations given their equations by lesser minds, Schrödinger rejecting the probability interpretation of his wave-function, Dirac rejecting application of procedures of renormalization to remove infinities from solution sets to his equation. 
It is my contention that the m-valued function in Schrödinger’s equation is correctly interpreted relative to m-valued logics -- understood as logics of holographic Cantorian identity transparency, not of truth-value, not of modal possibilities, not of fuzzy probable states -- and that the appearance of an infinite quantity in solutions to the Dirac equation, which equation was an attempt to reconcile Special Relativity with Schrödinger’s equation, indicates a shift from one order of logical value to another order of logical value at absolute limiting values (i.e., limiting velocities, limiting accelerations, and limiting time rates of change of acceleration) of dynamical variables in the physical process under consideration, indicates, that is, the presence of nesting structure in orders of logical value as manifest in the ground state or reference space of the given process. QED and QCD (quantum electro- and quantum chromo-dynamics: the former relative to the surface structure of the atom; the latter, to the deep structure, the interior) are said to be experimentally verified more accurately than any other theories in the history of science, and the bulk of the contemporary technology of globalization is said to be derivative therefrom. I find it amusing to observe grown men prancing about preening themselves with the notion that experimental substantiation and the appearance of derivative technologies demonstrate the truth or validity of a theory. Just as some aspects of human functioning clearly can be explained by operant conditioning, and just as clearly it is untruthful and invalid to reduce all of human functioning to jets of operant-conditioning cascades, so, just as clearly, do some aspects of nature conform to calculable rules defined within the realm of binary logic -- be those rules rules of cellular automata, rules of QED, rules of QCD, rules of General Relativity (but not rules of Special Relativity, as the logic implicit in the notion of an absolute limiting velocity is m-valued, not two-valued). I do not argue that binary logic does not exist, only that it is merely one order of some infinite number of orders of logical value (each order of which has an associated universe of discourse at least as complex as that associated with two-valued logic). When Schrödinger’s m-valued function was interpreted as invoking probability amplitudes, which is essentially wrong, that did not mean that judicious separation of data from noise would, nonetheless, not be associated with accurate experimental verification and development of derivative technologies. It meant only that an aspect of reality can be accurately characterized and represented to a mind employing binary logic and the notion of probability amplitudes, that the hole-and-peg (positron-electron) structure of the quantum vacuum, the quantum field framework, the furious burgeoning zero-point energy configuration of the vacuum, functions as a superconductant mirror reflecting with verisimilitude anything any given order of logical value systematically projects upon it, that the technologies developed on the basis of two-valued-logic-type probability amplitudes would not be integrative, would not be of a piece with nature, that they would be fragmenting, fragmenting to nature, to societies, to cultures, to psychological experience, fragmenting in the same way as Wolfram’s equivalency between randomness and complexity is and will continue to be fragmenting.
He holds out to us in the face of this great good, complexity, what he clearly regards as a greater good, the power to control the random, the complex, and all of complex random nature with knowledge of its simple binary rules of creation, its covenant of genesis through automatism. Fine. No one said nature or humans do not have automatisms. But is all else in nature, in the human, simply an emergent property of automatism? Maybe so… Certainly so, if the genetic recombiners have their way and pool their resources with postmodernist devotees of chaos theory, autopoiesis, non-linear Newtonian dynamics, complexity theory, cellular automata, nanotechnology, and probabilistic quantum computing. Which, of course, they are and will continue to do -- and we will, all of us, increasingly, as a consequence, be Muselmann, Muselmann this time who understand the simple rule that made us the walking dead.
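For reference, the Dirac equation mentioned above, in its standard covariant form -- the equation whose quantized field theory produced the infinities that renormalization, which Dirac himself distrusted, was introduced to remove:

    \left( i\gamma^{\mu}\partial_{\mu} - \frac{mc}{\hbar} \right) \psi = 0 .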

The earlier comment comparing the “American Taleban” John Walker Lindh to Stephen Wolfram came about as a result of reading the Harper’s article on Lindh (Edwin Dobb, “Should John Walker Lindh Go Free?”, Harper’s, May, 2002) at the same time as reading the notes to Wolfram’s A New Kind of Science.

Beginning in the late 19th century and culminating in the death of the Unified Science Movement at the outset of World War Two at the hands of Linguistic Philosophy, there was a very large body of literature analyzing form in natural processes -- a body of literature I spent a great deal of time with at Cornell as a result of being introduced to it there via the book inspired by R. Steiner entitled Sensitive Chaos. One of the leading late figures in this body of literature was L. L. Whyte, who produced a brilliant extrapolation from the involved research in 1948 entitled The Next Development in Man, a book that had a tremendous influence on my thought.

Wolfram barely mentions this body of literature in text and in his notes rarely cites the involved authors -- even Hermann Weyl and D’Arcy Thompson, Weyl only in relation to gauge invariance, Thompson rating some four sentences -- though the forms his cellular automata computer programs were created to reproduce were extensively analyzed and lavishly depicted in this literature. Wolfram justifies this lack of discussion of precursors by saying he is writing a popular exposition. I do not believe, however, that this is the real reason. His book contains one sentence and one paragraph on multivalued logics, a sentence and paragraph which do no more than recapitulate the thesis of Bertrand Russell in his book of 1900 (if I recall the date correctly), On Logical Atomism: the more logical values added the less meaning the involved proposition can encompass. (Fifteen of the more philosophical books in this body of literature are listed on page 615, Volume II, of MOON, along with speculation about the fate of Unified Science; Wolfram mentions some of these in passing in his notes). The orientation taken to natural form in the Unified Science Movement (of which Rudolf Steiner was an early figure) is regarded -- by postmodernists and others who disparage the identity transparency associated with unvarnished quantum perspectives -- as mystical and cabalistic in nature. So, it is very easy, in turn, to regard Wolfram’s whole program of research as a rabbinical attack on cabalistic modes of thought, an attack akin to those mounted in the recent films “Pi” and “Cube”. The requirements of a rabbinical attack are, I believe, the real reason why the early works in the field are rarely mentioned in text or notes.

Wolfram describes his fundamental insight as coming at age twelve: i.e., that repetitive application of very simple rules can give rise to great complexity. He states that this realization was a fundamental breakaway in the history of all of human thought -- even though this recognition on his part is essentially a restatement of Church’s Hypothesis (“calculable if and only if recursive”) and is the foundational idea of Chomsky’s generative linguistics, the former of which was long on its way and the latter well into the birth canal before Wolfram was born. Wolfram makes passing mention of these, in his notes primarily, but this in no way connects to or modifies the claim he makes to a breakaway notion.
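The “simple rules, great complexity” claim is easy to exhibit in a few lines of code (a minimal sketch of an elementary one-dimensional cellular automaton, using the rule Wolfram numbers 30; the implementation details here are illustrative and are not drawn from his book):

    # Elementary cellular automaton: each cell's next state depends only on itself
    # and its two neighbours, via an 8-entry lookup table (here, rule number 30).
    def step(cells, rule=30):
        n = len(cells)
        table = [(rule >> i) & 1 for i in range(8)]
        return [table[(cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]]
                for i in range(n)]

    cells = [0] * 31
    cells[15] = 1                          # a single live cell in the centre
    for _ in range(16):
        print("".join(".#"[c] for c in cells))
        cells = step(cells)

Repeated application of this one eight-entry rule to a single seed already produces an irregular, non-repeating weave of the kind that fills so many of his plates.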

It is also not a big stretch to regard generative linguistics, itself, as a rabbinical attack on cabalistic modes of thought -- especially so, given that cabala is an area of study focused on the meaning, means, and structures of Genesis as given in Hebrew. Generative linguistic procedures are “recursive”, i.e., repeating a simple operation (grammatical rule) over and over to generate complex syntactic structures. This is an explicit application of Church’s Hypothesis, one of the basic assumptions of modern logic -- embraced even by a Roger Penrose while attacking conclusions held dear in the field of Artificial Intelligence. (Church’s Hypothesis is ridiculed at the very end of MOON relative to m-valued-logic perspectives on Gödel’s theorem.) Unfortunately for Church and Chomsky, generative linguistics, very successful in the area of grammar, foundered in application to semantics, the study of meaning: while grammatical structures in some languages can be pretty well accounted for on the basis of recursive algorithms, meaning cannot. The Unified Science Movement was strongly focused on studying the implications of form in natural process relative to the “Meaning of Meaning”, to state the title of one of the major books (Ogden and Richards, 1923) associated with that movement before it died in being transmuted into Linguistic Philosophy.
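The recursion at issue can likewise be shown in a few lines (a toy context-free grammar invented purely for illustration, in the spirit of “calculable if and only if recursive”): a single self-referring rule, applied over and over, generates an unbounded family of well-formed strings while saying nothing whatever about what any of them mean.

    import random

    # A toy context-free grammar; NP can contain a VP which can contain an NP,
    # so one simple rule applied recursively yields arbitrarily deep structures.
    grammar = {
        "S":  [["NP", "VP"]],
        "NP": [["the", "N"], ["the", "N", "that", "VP"]],
        "VP": [["V", "NP"], ["V"]],
        "N":  [["cat"], ["physicist"], ["equation"]],
        "V":  [["chased"], ["wrote"], ["measured"]],
    }

    def generate(symbol="S"):
        if symbol not in grammar:              # terminal word
            return [symbol]
        expansion = random.choice(grammar[symbol])
        return [word for part in expansion for word in generate(part)]

    random.seed(3)
    for _ in range(3):
        print(" ".join(generate()))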

We can tell by extrapolation from the Harper’s article that John Walker Lindh was also having fundamental insights by age twelve -- if by age sixteen he had already disparaged the emptiness of American materialism, withdrawn from his peers, deeply studied the world’s religions, converted to Islam, and set out upon a self-designed “Outward Bound” program of inner spiritual development leading him to travel alone to Central Asia. By any reasonable standard of judgment, this is a genius-level accomplishment for someone so very young -- and, had the outcome been different, certainly worthy of a MacArthur Foundation award. Having gone through all the processes of psychological induction in U.S. Army Special Forces training, I have no trouble imagining what happened to him in Central Asia at such a young age, nor in having great respect for his survival skills, given what he went through. I find it hard to credit the assessment that he significantly compromised the American national interest. It is clear, moreover, that he, by virtue of his experiences, has gathered the possibility of generating more insight into the dynamics of contemporary global conflict than that available to most of the rest of us -- a greater accomplishment, in my estimation, than that of Wolfram, which (from the comments on page 615, volume II, of MOON) I regard as thematically related to the deeper causes of the world wars.

Yes, that’s what I said: “The fact that Born’s probability interpretation of Schrödinger’s wave function came in 1926, the founding year of the Showa Era, is a synchronicity.” The notion of probability amplitude removes from quantum mechanics what Everett later called “relative-state” -- thus preserving the Western notion of separate self-identity. Relative-state has a meaning essentially the same as jijimuge, the concept of the interfused state of part and whole especially espoused by the Japanese Kegon sect of Buddhism. Jijimuge is a refinement of the general Buddhist notion of “dependent origination”. That the Taisho Era democracy movement ended the very year quantum theory was dissimulated suggests to me that, at least on the level of unconscious forcing functions, origins of the Great Pacific War were the same as those of World War Two in Europe.

This is all the more to be suspected when one considers that the traditional Japanese notion of sacred space -- ma -- was conceived as a vertical stack of two-dimensional sheets whereby “time” was understood only as the connection between such sheets. (One might compare this with Julian Barbour's notion of timeless “Platonia”.) This is the origin of the vertical sense of form in Noh drama -- the dan mosaic -- and in other traditional arts and music, the sense of form suffusing Japanese aesthetic sensibility. The sheets composing ma were considered essentially a dusting of kami, or dimensionless mathematical points, decomposed into a stack from an all-embracing maternal void or vacuum. “Time” was not experienced in the linear horizontal fashion of the West, but as ontological precedence stacked upon the present moment within a mathematical involute. Three-dimensional space was regarded as an illusion constructed by mind in a state of attachment.

Jijimuge is in the structure of ma. Mind-in-attachment decomposes ma into separate sheets out of which three-dimensional maya, illusion, is constructed. In the primal state, the maternal void, there is uniform distribution of kami dust, an interfused state of the whole. Motion of attached mind decomposes this interfused state into an involute of nested sheets of kami dust. At this ontological stage of involutory decomposition, the nested part-sheets of dimensionless mathematical points remain interfused with the whole. Loss of part-whole interfusion transpires as mind-in-attachment constructs three-dimensional space and horizontal linear time out of the stack of two-dimensional sheets.

The primal state of ma was considered the essence of beauty, order, simplicity, meaning, because in its repose were held all things. This is the absolute opposite of the Western concept. At onset of the Meiji Era, Ludwig Boltzmann began investigations leading to the definition of entropy as the most probable state. The uniform distribution of kami in the primal state of ma, space of the sacred, is, according to Boltzmann, the most probable state, the state of highest disorder, a state of furious motion, not repose, of heat death in the final state, not Nirvana at return to the state of origin. Of course, Boltzmann’s final state is that at the end of horizontal linear time, not the traditional Japanese concept of precedence as stacked on the present moment in a mathematical involute. Aesthetically speaking, Boltzmann could not have come up with an idea of order-disorder more repugnant to the traditional Japanese sense of beauty.
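For the record, the Boltzmann relation at issue, in which the entropy of a macrostate grows with the number W of microscopic arrangements realizing it -- so that the uniform distribution, having the largest W, is at once the most probable state and the state of maximal disorder:

    S = k_B \ln W .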

But the new physics of Planck and Einstein brought a breath of fresh air, as it was intuitively recognizable, if only by affective osmosis, as a step toward a reconstitution of ma. By 1912, the beginning of the Taisho Era, there was a gathering sense of optimism in Japan: General Relativity was on the very threshold. Young Japanese intellectuals were consuming huge quantities of text concerning contemporary avant-garde European thought. The democracy movement was less political than simply a willingness to uncritically embrace the new. I think all this optimism, this openness, died with the probability amplitude interpretation of the Schrödinger wave function. Such is the nature of synchronicities mediated by the collective unconscious.

Okay, I will try to give a concise statement about what I think regarding the “probability issue”. Statistical mechanics and stochastic thermodynamics emerged based upon the knowledge that in face of the large number of atoms in motion in natural systems it is not humanly possible to determine the state of such a system utilizing Newton’s laws of motion, given known initial and boundary conditions. Newton worked out these laws of motion relative to planets of the solar system, not atoms in a gas. Calculating probabilities was found to be the only practical means to learn something about the state of a system involving a large number of atoms. This did not mean that such a system had no instantaneous determinate state, only that this state could not be determined by application of deterministic laws of motion. With Heisenberg’s indeterminacy inequalities, however, it became apparent that there is an inherent indeterminacy in Nature: subatomic and atomic systems are disturbed by the act of measurement such that some property of motion of such a system will always be relatively indeterminate at measurement. If such a system is not measured, there is no issue of inherent indeterminacy; the universe is in a determinate state until measured. Indeterminacy in all three cases -- statistical mechanics, stochastic thermodynamics, quantum mechanics -- is an issue related to knowledge and the activities involved in efforts to advance knowledge.
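The indeterminacy inequality referred to, in its standard position-momentum form, where the product of the spreads in position and momentum at measurement can never fall below a fixed quantum of action:

    \Delta x \, \Delta p \; \geq \; \frac{\hbar}{2} .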

If the universe had a beginning and an evolution, then there must have been a period before there appeared entities capable of making measurements, a period when inherent indeterminacy at measurement was not a component in evolution of natural processes. Let’s start again. According, therefore, to stochastic thermodynamics, the time at which inherent indeterminacy at measurement became a state of the universe was a time of a highly improbable state, a state of low entropy. Uh, let’s start again. Since the arising of entities capable of making measurements is a low-entropy event, pockets of low entropy had to have formed in a non-equilibrium statistical thermodynamic fashion. Or, uh, let’s start again. Intelligent entities have been in the universe from the beginning, which beginning was a low-entropy state when intelligent entities were most probable, as the universe has been running down ever since into higher and higher entropy states.

Or we must accept a hard version of the anthropic principle: the universe evolved the way it did, the initial and boundary conditions and values of universal constants were the way they were, because intelligent entities were to evolve in low-entropy pockets so as to introduce an element of inherent indeterminacy. This post-medieval teleology would not be hard to accept if the future is always the present in the past and time-reversal never proceeds from the present to the past, but always from the future to the present. What I actually believe is most like this last statement.

The categories of time-ordering are a function of the properties of the logic employed to distinguish order from disorder, probable from improbable. The notion of probability cannot legitimately be used to define linear time, as the notion of linear time is implied by and contained in the notion of the probable. There must be some lineally-sequential future if there is to be a probable state. In the case of an m-logically-valued superposition in the macroscopic universal wave function, however, the notion of probable state is only associated with one order of the m-logically-valued superposition. Each such order, I believe, is associated with different initial and boundary conditions, and with scaled values of universal constants, such that all such conditions and values are simultaneously superposed. Since Planck’s constant and the other fundamental constants have scaled values, a relative inherent indeterminacy at measurement exists at every scale level of the universe, and this is why causality and free will co-exist. My I is free to be determined and determined to be free, as Derek Dillon likes to say. Each order of the m-logically-valued superposition is its own time-ordering function, its own time-logic. This is equivalent to temporal nonlocality, to there being no essential time, no preferred time-ordering construct: all time-ordering aspects of reference frames are relative, such that the case can always be found wherein the future is always the present in the past and time-reversal never proceeds from the present to the past, but always from the future to the present. This hard version of the anthropic principle, therefore, maintains that teleology is indistinguishable from causality because the future is the present in the past.

I've recently spent much effort upon studying Julian Barbour's THE END OF TIME: The Next Revolution in Physics (N.Y.: Oxford U. Press, 2000). This has in some ways been depressing. I agree with about ninety-five percent of what he has to say, and it is truly a brilliant body of work he has produced, but the disturbing thing is how he, like Hugh Everett (see: For the Mirror is Not the Glass), has leveraged his insights under the assumption that perceptual experience in the “normal state of consciousness” for an absolutely-in-so-far-as-distinct post-Renaissance Westerner is the only perceptual experience to be given credibility as in some sense authentic “objective” perception, all other perception, including that by those experiencing one or another degree of identity transparency, being merely subjective. He skirts investigation of the “measurement problem” on a level of creative consideration commensurate to his other investigations by stating that he does not have the requisite knowledge of neuropsychology. At every critical juncture in his argument he makes recourse to this assumption about perception in determining the course he will follow in proceeding to the next level of consideration. This is a recurrent drone throughout the book and by the end of it I found myself depressed contemplating the probability mists he has floating over his very elegant model, the presence of the mists being necessitated only by his repetitive recourse to this false assumption.

The most brilliant exposition in the book is his Tristan-Isolde diagram on page 179. He says: “We can see how the difference that keeps Tristan apart from Isolde is actually part of the body of Romeo (and Juliet).” Not only does he here exhibit some insight into identity transparency -- thus belying the false assumption given above -- but earlier, on page 151, he displays some fleeting insight into how this “difference” between two identities can be “part of the body” of a third identity: “Almost everything mysterious and exciting about special relativity arises from the enigmatic minus sign in the space-time 'distance'. It causes the 'skewing' of both axes of the starred frame of the starred twins…” Here, he almost gets how “pencils of skew-parallels” underlie identity transparency. He is referring to his diagram on page 132, which can be viewed as a presentation of one account of the origins of Bennett's “skew-parallelism”. Each subset (a structuring of the whole transfinite set) of the universal set of points (objects) in the configuration space is a pencil of skew-parallels. The origin of charge is in this skewing, which is an expression of the relativistic lack of simultaneity. Identity transparency is not clearly seen because transfinite sets in configuration space are not explicitly considered: identity transparency is resident in Cantor's very definition of a denumerable transfinite set, i.e., any set that can be put into one-to-one correspondence with one of its proper subsets.
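The “enigmatic minus sign” Barbour refers to is the one in the Minkowski space-time interval,

    (\Delta s)^2 = -\,c^2 (\Delta t)^2 + (\Delta x)^2 + (\Delta y)^2 + (\Delta z)^2 ,

and it is this sign that skews both axes of the boosted (“starred”) frame relative to the unboosted one.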

In focusing on the notion of “intrinsic time” Barbour truly gets right near the core issue (page 245): “…that 'time' was somehow carried within space.” But not as one of the degrees of freedom of a 3-space! as those around John A. Wheeler believed. (Below, I will describe my interactions with Wheeler in 1975 and '77 on this specific issue). “Intrinsic time” is not one of the degrees of freedom of space, but the logical operations implicit in the relative-state of that space, i.e., operator-time or time as a topological operator on space. Barbour illustrates this with his Tristan-Isolde diagram of the criss-cross best-matching construction technique -- but does not fully grasp that large-scale best matching is determined by quantum relative-state (i.e., all incarnations are simultaneous AND TRANSPARENT in the same superspace as relative-states each of one another: an absolutely crowded antithesis to solipsism, which Rice Pereira calls on the basis of direct perception “the layered transparent”). He does not fully grasp this because he mistakenly accepts DeWitt's designation of Everett's “relative-state” interpretation as the “many-worlds” interpretation. Everett used the term “relative-state” because he meant relative-state, which is an evocation of identity transparency -- whereas “many-worlds” is what you get in quantum theory by denying the possibility of identity transparency. Everett's original insight was falsified in being subjected to Bohr's reservations and the von Neumann methodology. Everett never recovered from the trauma involved.

The next step in the path of insight is to realize how m-valued logics underlie the involved identity transparency and replace the notion of probability amplitude in interpreting Schrödinger's wave function. Barbour skips over this step, never makes it, and goes on to the next: on page 137 he says, “What we call time is -- in classical physics at least -- simply a complex of rules that govern the change.” Those rules are what I call “time logics” -- and they are m-valued. The time logics of operator-time. Wheeler once looked to a “pregeometry as the calculus of propositions” but ultimately set it aside. This setting aside was because Wheeler never connected the propositional calculus (in his view only of a 2-valued logic) to operator-time and also because he never considered an m-valued calculus of propositions as being a codification of the quantal relative-state of a 3-geometry in superspace. He could not see that the Machian history summing to zero (e.g., fixed SINGLE VALUE of the energy of the universe set to zero in the Wheeler-DeWitt equation) is equivalent to an m-logically-valued point on the MULTIVALUED reference space, each m-value stacked on that point being “relative” (as in relative-state) to an involuted subsystem, subspace, or subgeometry. (Barbour approaches but does not quite reach this notion with his diagram on page 305, entitled “The division of Platonia”, where his points in the vertical are equivalent to my m-logical-values stacked on a point in Hilbert space.) That the DIMENSIONALLY zero-point is LOGICALLY m-valued was not seen because 3-geometries were not built on a Cantorian-fractal base with its intrinsic identity transparent relative-state, meaning that though reference frames were seen as generally covariant, subsystems, subspaces, subgeometries were not seen as holographically coextensive with superspace, because transfinite point sets were not considered or related to the structure of Hilbert space.
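Schematically, the constraint being alluded to is the Wheeler-DeWitt equation, in which the Hamiltonian constraint operator annihilates the wave-function of the universe, a functional of the 3-geometry, so that the single value of the total energy is indeed fixed at zero:

    \hat{\mathcal{H}} \, \Psi[\, {}^{3}\mathcal{G} \,] = 0 .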

The reason why they never took this leap, and why the Wheeler-DeWitt equation remains enigmatic, is that they kept looking for well-behaved solutions only -- on analogy with Schrödinger's treatment of the hydrogen atom. The hydrogen atom exists on one scale level. Therefore, the required well-behaved solutions of its wave-function are to be found. But the universe, being multiscaled, is not equivalent to a hugely complex molecule, as Barbour suggests. Infinite solutions and discontinuities in timeless LOGICAL MARCH (not time-series evolution) of the universal wave-function relate to the nested Cantorian fractality of 3-geometrical domain structures in superspace and orders of topological operations thereupon, as an expression of the relative-states codified as given pencils of skew parallels (those operations being what we denote as time). They could not see what the infinite solutions and discontinuities in timeless LOGICAL MARCH (not time-series evolution) of the universal wave-function relate to, because they disallow orders of operator-time by insisting that the universal macroscopic wave-function remain linear like the microscopic wave-function of the hydrogen atom. I believe that the origin of charge, magnetic monopole, and graviton is to be found in macroscopic properties of the skewing which is an expression of the relativistic lack of simultaneity under three orders of temporal operation reflected in quantum relative-state as nonlinearity of the universal wave-function (one linear order and two non-linear orders expressive of topological operations occurring at limiting velocities, limiting accelerations, and limiting time rates of change of acceleration). This is described in our 1980 paper on autogenic brain discharge in quantum measurement. In due course, these orders of “time differentiation” will be explicated as classes of atemporal m-valued LOGICAL march and associated topological invariants (this will probably be done through the use of Gödel numbers).

When the dynamic meteorologist Doug Paine and I met at the last Velikovsky symposium in 1974, there was immediate resonance between us in large measure because he had already arrived at the notion of “temporal curl” and I had already arrived at the notion that “time is topological operation on space”. I moved up to Cornell to live because we found we just had to explore the implications of the fact that we had each independently arrived at the same basic concept of time from two so very different directions: he from a Maxwellian view of acoustically-modified gravity waves in tornado genesis; I from quantum perspectives on autogenic brain discharge phenomena. But my perspectives on brain discharges did not, in the first instance, come from study of neurology; they came from perceptual experience associated with what I call “the man-woman level of the transference”.

Barbour's choice to label his diagram on page 179 with the names Tristan-Isolde, Romeo-Juliet, and Heloise-Abelard, and to title it “Space-time as a tapestry of interwoven lovers” is itself a breakaway in physics more pregnant, perhaps, than Einstein's Special Relativity -- were it to be taken literally. The fundamental reason why physicists made the wrong choices and missed the right associations, as discussed above, is that they have never penetrated the man-woman level of the psychological transference and, therefore, have been denied the direct perceptual experience of identity transparent relative-state. They cannot, for instance, understand Proust's statement in Remembrance of Things Past: “…perhaps the immobility of things that surround us is forced upon them by our conviction that they are themselves, and not something else.” Or the implied inverse, as would be given in Tang-style Chinese poetry: Certainly, the mobility of subjective emergent properties and the appearance that they are internal to us is a mere serendipitous characteristic due to our tacit assumption that they are somehow not ourselves and yet truly ourselves. Identity, non-identity, and non-simple identity transparency. The protagonist in THE MOON OF HOA BINH, upon being presented with the decomposed nonlinear Brownian (i.e., G. Spencer Brown) form (see Table 1) of the universal wave-function (page 340, Vol. 2) says: “The physicist's 'self-duality' slash 'anti-self-duality' is, I suppose, a mere shadow of this mirrored hyper-complex counter-imaging… what I call the cosmological level of the transference. Jung's diagrams of the alchemical projection-identification exchange illustrates exactly this four-fold mirror, y'know.” The Japanese atmospheric scientist replies: “And the physicists see this phenomenon only in relation to linear time; that's the subliminal origin of their term 'instantons'.”

Transference between the intrapsychic contrasexual (anima-animus) in Tristan's unconscious-conscious itself is a 4-fold mirror. Crossed into Isolde, we have minimally the 8-Fold Way of the dyadic monad. The permutations are enormous, taking into consideration “best matching” under psychological, temperamental, and physical types. And if you want to get beyond “Cooper pairs” into critical collective behaviors of identity-transparent monadic subsets of other cardinalities -- crossing in, say, RomeoJuliet and HeloiseAbelard -- well, the long-range correlations between the discharging neurons will certainly be coherent -- if there is, of course, a process of “making conscious”. And associated with the consequent autogenic abreactions will be -- persisting into the midst of everyday life -- major alterations in the normative perceptions of time and space. Change the tacitly given concept of monadic self-identity and all else changes with it. Focus attention upon the attendant time-slow-down, cataloging associated changes in feeling-space, depth perception, afterimage, olfactory phenomena, taste sensation, color perception, aural tone-color dynamics, and so on as described in MOON, and one may eventually find oneself postulating that “time is topological operation on space.”

The way this notion of time specifically came into my awareness was -- progressively, culminating in 1972 -- by noting many, many instances of altered spatial perception associated with the experience of time-slow-down in brief fugue-like “derealization” states. I started studying linear perspective as a way to focus my attention, so as to increase my ability to identify commonalities from one such experience to another. As described in MOON, as I ambled down the street while practicing walking meditation, I visualized the ambient space through an imagined screen-grid and practiced seeing the space as a line drawing. Doing this in time-slow-down states, I saw the space as sets of superimposed interstices which changed according to the objects in the visual field. The critical commonality I picked up on in this manner was that the superimposed interstices composed and decomposed differently at different degrees of time-slow-down -- and that my FELT relation to the object changed in concert. I postulated a feeling-space superposed upon the ponderable physical space. I experimented with many hypotheses in this manner. A description of some results of this was written into a letter mailed to J. A. Wheeler in 1975. It included a drawing of the decomposable nested triangular meshwork I came to call the multivalued reference space. It also described the fact that I saw this space as a model explaining how the following two statements, though appearing contradictory, are equivalent: “Nothing ever changes” and “Everything is nothing but changes”. Time as a topological operator on space, I said, decomposes and recomposes the reference space. Wheeler answered politely, briefly, indulgently, and mildly dismissively, but did suggest that I look at the work of Roger Penrose.

Instead, we worked on Paine's Maxwellian view of acoustically-modified gravity waves in tornado genesis. In 1977, we sent Wheeler a copy of the General Process paper and asked him what he thought about it. This time he was very dismissive and did not think that any of these relativity-theory ideas could have anything to do with tornadoes, though he admitted to not knowing the equations of atmospheric science. We tried Hawking and had a worse time. It took me a long time to gain even a rudimentary understanding of Penrose's twistor theory, as only highly technical treatments were then available. There was no way to proceed with this, as their commitment to the notion of a “classical limit” (and so very much more) ruled out the possibility that severe local storms could manifest any of the involved phenomena, and we certainly were not in a position to undertake studies in their fields.

Yes, I was very intrigued by George Musser’s account (“A Philosopher’s Stone”, Scientific American, June 2002) of Raymond Y. Chiao’s childhood engagement with a crystal radio, because my ideas about DNA superconductivity originally drew on a radio analogy, as described in MOON.

I’ve looked at Chiao’s paper, “Superconductors as quantum transducers and antennas for gravitational and electromagnetic radiation” (posted at http://arxiv.org/abs/gr-qc/0204012 on 29 July 2002) and notice a lot of parallels with certain aspects of the paper I co-authored with Douglas Paine in 1979 (“A Dynamical Theory Describing Superconductant DNA”, International Journal of Quantum Chemistry, XV, pp. 333-341). This paper considers the effect of radiation impinging on DNA, treated as a superconductor, and argues that the molecule produces coherent radiation in response to impinging electromagnetic radiation.

Two factors in particular parallel the perspectives Chiao argues. [1] When writing the DNA paper, there was much discussion as to what sort of coherent radiation the molecule produced in response to impinging electromagnetic radiation. In the end, we did not make any specification. We were confused about this issue because of one of the variables, (b), in the canonical equation (eq. 20 for those who look over the paper). We designated (b) as the “counterforce to the pressure gradient term”. We were aware that this was actually the gravitational acceleration of the parcel, but were unable to fully evaluate the implications of its appearance in the canonical equation. We entertained the possibility that a coherent gravity-wave phenomenon was being suggested, but were certain that if we drew attention to this notion the paper would not be published. [2] Time, as invoked in the DNA wave equations, is defined in an unusual way in relation to a limiting velocity and a limiting acceleration. In this paper, we did not elaborate on this relativistic notion. We were working with the idea that the speed of light is m-valued, not single-valued, but did not go into this as, again, we felt that if we did the paper would not be published. Chiao has been involved in experiments demonstrating that light can, in a certain sense, travel faster than the speed of light. I personally feel that Chiao’s thesis on quantum transduction has already been demonstrated relative to DNA.
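
Without reproducing eq. 20 here, the flavor of what was at issue with the variable (b) can be seen in the standard vertical momentum equation for an air parcel, where the gravitational acceleration is precisely the counterforce to the vertical pressure-gradient term:

$$ \frac{dw}{dt} = -\frac{1}{\rho}\,\frac{\partial p}{\partial z} - g $$

(w the vertical velocity, ρ the density, p the pressure, g the gravitational acceleration). An acceleration term occupying that slot, transposed into the DNA wave equations, at least hints at why we entertained the possibility of a coherent gravity-wave component in the molecule's response.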

Particularly fascinating is Chiao’s description of “Maxwell-Like Equations for Gravity Waves” (Section V). The derivation of our canonical equation for superconductant DNA radiation exchange was initially made relative to double-helical feeder bands in tornado genesis. (Such a juxtaposition of ideas was to most people utterly bizarre in the mid-70s; today, in context of complexity theory, it is not so bizarre to people familiar with that area of study.) The involved numerical computer forecast model of tornado genesis (see: "The Discovery of a Superconductant Exchange of Hydrothermodynamic Properties Associated with a Limited Domain of the Atmosphere") utilized an atmospheric analog to Maxwell’s equations to predict emission of acoustically-modified gravity-wave modes from these intense electrical storms. Chiao’s mode of thought is very similar to that which gave rise to this model, which, using historical data sets, regularly predicted tornado outbreaks 12 hours in advance on a 1 km grid. Why this model was never put online with real-time data is itself a fascinating story. I personally feel that Chiao’s thesis on quantum transduction has already been demonstrated relative to acoustic gravity waves in severe local storm genesis.
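
For orientation, the “Maxwell-like” structure Chiao exploits is the conventional weak-field, gravito-electromagnetic rewriting of linearized general relativity; in one common convention (the numerical factors differ from author to author, and this is emphatically not a reproduction of the atmospheric analog used in the tornado model) it reads:

$$ \nabla\cdot\mathbf{E}_g = -4\pi G\rho \qquad\qquad \nabla\cdot\mathbf{B}_g = 0 $$

$$ \nabla\times\mathbf{E}_g = -\frac{\partial \mathbf{B}_g}{\partial t} \qquad\qquad \nabla\times\mathbf{B}_g = -\frac{4\pi G}{c^2}\,\mathbf{j} + \frac{1}{c^2}\frac{\partial \mathbf{E}_g}{\partial t} $$

Here E_g is the gravito-electric (Newtonian) field, B_g the gravito-magnetic field, and ρ and j the mass density and mass-current density; the minus signs register the fact that gravitation attracts where like electric charges repel. It is this formal kinship that licenses talk of antennas and transducers for gravitational radiation in the first place.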

Thanks for your last mailing of materials. The paper by Lynn Surgalla, “Electromagnetic Field Interactions with Biological Macromolecules”, needless to say, was quite something, but to me somewhat depressing, as there is not a principle discussed in this paper that is not explicit or directly implied in our 1979 DNA paper, AND he makes no mention of biological superconductivity. In spite of all he says about the quadrupolar acoustic waves generated by the molecule in response to ambient electromagnetic radiation (which is the exact thesis our mathematical model of the quantum wave properties of DNA describes), somehow he did not run across our article, and the issue of superconductivity is not raised. The absence of superconductivity in his discussion is what is depressing. Surgalla was writing a survey article in 1986 for the Journal of the Psychotronics Association (complete bibliographic information is not printed on this copy) after the issue of biological superconductivity had dropped out of the literature. One can hardly imagine the Psychotronics Association rejecting the paper had it contained a discussion of biological superconductivity, which was the subject of quite a few related papers published during the time frame of those papers prominently cited in his bibliography. Many of the people he cites and had personal communication with were in one way or another associated with nonlinear studies at Los Alamos or the University of Arizona, which exchange personnel. Surgalla cites a 1984 Journal of Biopolymers paper from the University of Arizona that apparently described some level of experimental evidence of quadrupolar acoustic waves radiating from the DNA of calf thymus. There is little doubt in my mind that the status of superconductivity is an essential aspect of the processes involved, and that the acoustic “Cherenkov radiation” (phonons generated by the soliton wave induced in the molecule by pulsed electromagnetic radiation at specific frequencies and in the requisite waveforms) is in some sense gravitational in nature. One can now have greater confidence in this idea in view of the recent paper by Raymond Chiao, entitled “Superconductors as quantum transducers and antennas for gravitational and electromagnetic radiation”, which I earlier sent you a reference to.

Absence of mention of superconductivity is doubly depressing in light of the recent (August 2002) press accounts of the structure of electromagnetic-pulse (EMP) bombs projected to be the first munitions dropped on Baghdad in a future strike on Iraq -- munitions delivered by UAVs that will generate electromagnetic fields so intense as to destroy computer and communications hardware. If they have developed a version involving superconductors, the generated “Cherenkov radiation” is likely to be gravitational in nature and therefore will penetrate deep underground structures in a way electromagnetic radiation will not. If superconductivity of DNA has been held hostage for such purposes, this is, indeed, depressing in light of the path the field of biotechnology has gone down in the interim, and the implications of the processes involved with such DNA superconductivity for explicating pathogenesis of various degenerative and autoimmune diseases. Not to mention the health-undermining effects of the EMP generated by EMP bombs (see: Illustrated Terminology Glossary and "…and Kissinger begat…" and "Quantum Biophysics of the Homeopathic Ozone Effect") on the population of Baghdad.

Deeply related to all this is the ineptitude with which Newsweek produces its dark-gray propaganda and agitprop output. Having taught methods of white and black propaganda at Psychological Operations Group, JFK Special Warfare Center, using OSS Morale Operations and Office of War Information teaching aids, I certainly understand the effectiveness of factually-based propaganda created by selective inclusion. The closer the output is to incorporation of outright falsehoods, however, the darker the shade of gray. Writing a piece (“Serenity Rocks”, Newsweek, 10/14/02) on the concept of space in the Ryoanji Temple Zen garden, Hideko Takayama, Tokyo Correspondent, uses a quotation that gives a particularly pregnant false translation of the Japanese word ma: “…analyzing the structure of ma, empty space, which plays a big part in Japanese culture…” The space signified by the word ma is anything but empty! As any informed Shinto priestess will tell you, it is filled with kami dust. Ma is normally translated “sacred space”, not “empty space” (see, for instance, Ma, Space-Time in Japan, Exhibition Catalog, N.Y.: Cooper-Hewitt Museum, 1979); moreover, this space is laminated, a multi-sheet model of the cosmos, where each 2-D sheet is connected to the others by time-like properties. Ma is very much like the configuration-space conception proposed by Barbour in THE END OF TIME: The Next Revolution in Physics (N.Y.: Oxford U. Press, 2000) or Sakharov’s multi-sheet model of the universe based on strain-tensors and electrically-charged Novikov dust. It is the 2-D sheet property of ma that forms the basis of the whole framework of aesthetics in Japan, as the traditional Japanese house illustrates in so many ways. It is the dust on the 2-D sheets that informs the whole artistic production -- from her earliest teenage works to those of the present day -- of Yayoi Kusama, the artist who has perceptually entered the realms of quantum-gravity more thoroughly than any other since Rice Pereira. In this false translation by Newsweek of a very important word, a word that may have been the very first word verbally articulated by any member of the human species (for an account of ma as the onomatopoeic root homonym from which all other words in every language derive, see “Mother’s tongue and slang: Why and how thought, language, and culture began”, Journal of Unconventional History, Fall 1990 and Winter and Spring 1991, by MacArthur Foundation grantee Huynh Sanh Thong), we see very well illustrated America’s conception of global monoculture: every un-American conception, no matter how at variance with the contemporary American worldview construct, is to be assimilated to the American view by whatever false interpretation is required.

I wonder what John Lilly would have thought about the larger meaning -- relative to metaprogramming the human biocomputer -- of Russia using a gaseous form of ketamine to take down Chechen hostage takers and hostages? If you doubt this, check out the literature on co-use of ketamine and fentanyl, and which one tastes like the “bitter almonds” reported by the hostages. Also look up what the lethal dose of ketamine is usually regarded as. Would the suicidal metaprograms Lilly had so much to say about have been involved in his reflections? I think there can be little doubt -- as we are seeing the global consequences of John Lilly being “scientifically crucified for exploring the origins of spatial dimensions in sensory isolation” (as stated above). There is a parallel, on the collective unconscious level, between the Cambodian holocaust and the Moscow theater gassing. In Cambodia, the Khmer Rouge policy of forced de-urbanization was in collective unconscious compensatory relation (employing a concept of Jungian analysis) to the American policy of “forced-draft de-ruralization”, the verbal frame for that phase of the American attack on peasant animism (please see: Who Caused the Cambodian Holocaust, Anyway?). The collective unconscious always engages in thematic play -- as the contents of clinically recorded abreactions in Autogenic Therapy so lavishly illustrate -- a play which Jung and Pauli used the term “synchronicity” to designate (and which Barbour reframes somewhat reductively as “records”). Legitimizing a new era of chemical warfare by use of ketamine, which John Lilly used to systematically explore the origins of ma, is an extraordinary illustration of projective identification operating on the level of collective mind. There is no way the human species will deal sanely with the issue of identity transparency. Everywhere, on a daily basis, one can see illustrations of this. Every little lie, such as the Newsweek false translation of ma, sums to massive metaprogramming of the human biocomputer for collective suicide. As Derek forecast in MOON: in due course, there will be a Cambodian holocaust of the Whole Earth.

