THE SAIGON PAPERS

DEREK DILLON'S UNPUBLISHED ARTICLES



M-valuation in a Generalized Currency Basket©
Transcending Fitness-Landscapes in Quantum Economies
by Replacing the Wave-Function with Operator-Time

currency basket mode-locks time-shape
The global economy is a multi-scale dynamic system composed of individual and corporate actors, a multitude of market micro-environments, local economies, urban economies, mega-urban-region economies, national economies, regional economies, and the global macroeconomy. Economic information, in the form of price differentials, moves laterally, vertically, and by leaps throughout this dynamic system in the process of establishing fluctuating event gradients, or basins of attraction, which are allocative patterns. Currency exchange mechanisms interface one economic partition, with its unique behavioral configuration, to other economic partitions, with their own unique behavioral configurations. Relative values of currencies parameterize classes of activities peculiar to the unique behavioral configurations of given economic partitions. A currency basket, within this multi-scale dynamic, attempts to mode-lock the time-shapes of value fluctuations on given scale-levels by transforming inherent risk into uncertainty for deposition on other scale-levels of the global economy. To state this notion of risk transformation is to postulate a law of conservation of economic risk. Risk cannot be created or destroyed; it can only be transferred from one actor to another, moved from one scale-level to another. Risk, accordingly, has certain properties in common with non-zero-point physical energy. That being the case, risk must be systemically processed or metabolized if disjunctive behaviors on the macroscale are to be avoided. An additional function, then, of a currency basket must be mediation of systemic processing of risk, so as to minimize formation of macroscale imbalances.
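The postulated conservation law can be made concrete with a toy ledger: transfers change where risk sits, never how much of it there is. Everything below (the partition names, the `RiskLedger` class, the numbers) is a hypothetical illustration of the conservation property, not a claim about any actual risk-accounting practice.

```python
# Toy illustration of a "law of conservation of economic risk":
# transfers move risk between scale-levels; the total is invariant.

class RiskLedger:
    def __init__(self, partitions):
        # partitions: dict mapping scale-level name -> risk held there
        self.holdings = dict(partitions)

    def total(self):
        return sum(self.holdings.values())

    def transfer(self, source, sink, amount):
        # Risk is neither created nor destroyed -- only relocated.
        if amount > self.holdings[source]:
            raise ValueError("cannot transfer more risk than the source holds")
        self.holdings[source] -= amount
        self.holdings[sink] += amount

ledger = RiskLedger({"micro-market": 40.0, "national": 35.0, "global": 25.0})
before = ledger.total()
ledger.transfer("micro-market", "global", 15.0)  # hedging pushes risk upward
assert ledger.total() == before                  # conserved, merely redeposited
```

The point of the sketch is the invariant: any basket mechanism that "transforms risk into uncertainty for deposition on other scale-levels" is, on this reading, executing `transfer`, never creation or destruction.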

conservation of risk a necessary bridging concept
Von Hayek introduced the notion of time-shape into economics in discussing how to assess total capital stocks. Time-shape is not a classical concept; it is not classical in economics or in physics. Perhaps Von Hayek himself did not have a full appreciation of the implications of the notion he invoked. What is time-shape, really? In recent decades the term “chronotopology” has been loosely used, without adequate contextualization. If time has shape, then something must give it shape, or time itself is the active shaper. Time as active shaper has been called operator-time. This notion has a history: it is the concept of time implicit in the Chinese BOOK OF CHANGES; Von Clausewitz struggled with the idea in ON WAR; Pauli formulated the notion early in the history of quantum mechanics, but rejected it; Prigogine considered it in the 1970s, and again rejected it. But it appears -- as Von Hayek clearly sensed -- that we shall not have much further insight into total capital stocks, macroeconomic risk management, or the systemic effects of currency baskets in the absence of further understanding of time-shape. We need, therefore, to look again at operator-time. The notion that economic risk is a conserved quantity with some properties similar to energy is a necessary bridging concept in econophysics because operator-time is inevitably associated with the total energy of a system.

time-logics of the quantum reference space
Were operator-time to be brought into Schrödinger form, the first inclination of the physicist would be to conjugate the temporal operator to the Hamiltonian, H, the total energy operator. (In classical particle mechanics, H is the sum of generalized momenta times the corresponding generalized velocities, minus a Lagrangian function, L. If time is not explicit in L, then H represents the total energy of the system. In quantum mechanics, H is an operator which provides the equation of motion for the wave function.) Indeed, conjugating operator-time to the Hamiltonian was considered and rejected as early as the late 1920s by Pauli. More recently Prigogine has reconsidered this proposition and reached the conclusion that operator-time is incompatible with the standard interpretation of quantum mechanics. He observes that the generator of the time evolution group is also the operator representing the energy observable and hence is required to be bounded from below. The conclusion is that a conjugate operator-time would generate arbitrarily negative values of the energy, and this is regarded as meaningless. But conjugating operator-time to the Hamiltonian is to treat it simply as a measurable, not as an active topological operator! Authentic operator-time would have to enter the Schrödinger format in a much more complex manner involving a related reinterpretation of the wave function.
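Pauli's objection can be stated in a few lines (a standard textbook reconstruction, not a derivation from this essay). Suppose a self-adjoint operator \(\hat{T}\) canonically conjugate to the Hamiltonian existed:

```latex
% Pauli's argument: a self-adjoint time operator conjugate to H
% would force the energy spectrum to be unbounded.
[\hat{T}, \hat{H}] = i\hbar
\;\;\Longrightarrow\;\;
e^{\,i\varepsilon \hat{T}/\hbar}\, \hat{H}\, e^{-i\varepsilon \hat{T}/\hbar}
  = \hat{H} - \varepsilon
\quad \text{for every } \varepsilon \in \mathbb{R}.
```

Since \(\varepsilon\) is arbitrary, the spectrum of \(\hat{H}\) would have to be invariant under every shift, i.e. the entire real line -- incompatible with a Hamiltonian bounded from below. This is the precise sense in which a conjugate operator-time "generates negative values of the energy."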

When Dirac revised Schrödinger’s original time-dependent wave equation (which was unbalanced with regard to time reference), the resultant relativistic equation invoked a spin coordinate used to explain the observed splitting of spectral lines. No one quite knows, even yet, what this spin coordinate is, whose value, according to Dirac’s formulation, must be either plus or minus ½. Spin ½ is agreed not to be simple physical spin upon an axis, because the spin value cannot vary over a range as h (Planck’s constant) tends to zero. Such continuous variation would be required to yield a classical rotational analogy. It is the property of remaining either plus or minus ½ that identifies Dirac’s spin coordinate as a shadow of active operator-time. In order to see this, however, reinterpretation of the wave function, which is connected to the Hamiltonian operator, is first required.

The wave function, essentially, replaced Newton’s laws of motion; but in so doing, those laws were transformed. All the variables in the laws of motion are single-valued, whereas those associated with the wave function are multivalued. This transformation implied the discovery of a fundamentally different sort of reference space than that used by Descartes and Newton. A multi-dimensional function space, called Hilbert space, came to be used, which may not have been quite sufficient for full insight into the transformation implied by conversion from laws of motion to wave function. A point in Hilbert space, even though it has many degrees of freedom representing the intersection of many spatial dimensions, is still logically single-valued. Hilbert space was constructed with a calculus of propositions wherein each proposition must be either true or false. However, five years before Schrödinger wrote his wave equation, Emil Post at Columbia University produced an account of m-valued truth systems. In standard binary logic, a proposition is defined as any statement that can be ascertained to be either true or false: the statement must have one of two possible truth-values. Post demonstrated that logics could be constructed, without contradiction, such that a proposition could have one of an infinite number, m, of possible truth-values. This was not applied to the multivalued wave function of quantum physics because, even though Post demonstrated there is an infinite number of possible truth-values, any given proposition can still be ascertained to be single-valued. Since that time, though, fuzzy logic has shown there can be a “fuzziness” between true and false. Moreover, G. Spencer Brown, while elaborating proofs of Sheffer’s postulates for Boolean algebras, demonstrated that the notion of “distinction” is more fundamental than that of “truth-value”.
If we consider that the notion of “identity” is more fundamental, yet, than that of “distinction”, we may arrive at an expanded understanding of Post’s orders of logical-value and postulate full-blown m-valued logics, wherein a proposition may simultaneously embrace many logical-values.
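Post's construction is easy to exhibit. In one standard presentation of his m-valued system, the truth-values are 0, 1, …, m-1, with 0 the "truest"; negation cycles through the values and disjunction selects the truer (smaller) one. The sketch below is only that textbook presentation, where each proposition still takes exactly one value -- it is not a formalization of the "full-blown" simultaneously-many-valued logics proposed here.

```python
# One standard presentation of Post's m-valued logic:
# truth-values 0..m-1, with 0 most true and m-1 least true.

def post_neg(t, m):
    # Post's cyclic negation: each value shifts to the next, wrapping around.
    return (t + 1) % m

def post_or(t, s):
    # Disjunction returns the "truer" (smaller) of the two values.
    return min(t, s)

# With m = 2 (values: 0 = true, 1 = false) this collapses to Boolean logic:
assert post_neg(0, 2) == 1 and post_neg(1, 2) == 0
assert post_or(0, 1) == 0          # true OR false == true

# With m = 5 a proposition may hold any of five values, yet any given
# proposition is still ascertained to be single-valued -- Post's point:
assert post_neg(4, 5) == 0
assert post_or(2, 3) == 2
```

Note that nothing in these tables requires m = 2; the construction is consistent for any m, which is precisely what licenses the question of what a proposition embracing many values at once would be.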

A reinterpretation of the significance and meaning of orders of logical value is encouraged by recent developments in the field of mesoscopic physics, where it has clearly been demonstrated that some atomic entities, not only elementary particles, are multivalued and can exist at more than one place at a given time. The identity of such entities cannot be “simple”, cannot be absolutely self-identical. Such entities possess “complex” identity; that is, distinction between the given entity and other entities cannot be absolute, an opaque wall, but must be a matter of degree, must exhibit one or another order of identity transparency. This means that an order of logical value (the multiple values which a proposition can simultaneously and validly express) represents an order of identity transparency: one of the transfinite set of states between “no A is not-A” and “A is absolutely not-A”. The complete logical domain, implied by multiple orders of logical value, contextualizes the multivaluedness of the quantum wave function such that bringing operator-time into Schrödinger form necessarily involves the hypothesis that operator-time is the logical operator which determines the order of logical-value configuring the involved system dynamic.

Hilbert space, then, can repeatedly be reconstructed with each order of logical-value permitted by an m-valued calculus of propositions interpreted in relation to the notion of identity transparency. A given point in the resultant composite logic space would not only have many degrees of freedom, but, also being logically m-valued, would have many shadow selves holographically peppered throughout the space. Simple-locality and simple-identity would not be inherent properties of entities mapped into this densely stacked function space, which may be called a multivalued reference space (MVRS). Gödel’s discoveries concerning the limitations of mathematical systems constructed with a single-valued logic can be viewed as opening a window upon m-valued logics, identity transparency, nonlocality.

time-shape as quantal forcing of basins of attraction
What would a MVRS “look” like? It would look like an infinitely dense multi-sheeted Riemann surface, where each sheet is a cloud of dust, a Borel set of dimensionless logical-value points which activate and deactivate according to the Christmas-tree effect. We have here a pregeometry logic board where the light bulbs have m-states, not just on-and-off. Each bulb represents a q-bit, not a binary bit. Sakharov’s “multi-sheet model of the universe” was constructed of Novikov dust and geons, self-organizing gravitational fields. There is also the Cantor dust of complex, multi-scale, fractal, self-organizing, chaotic systems. Is it too much to think that all these different dusts are just the same dust? I think not. Any march of a function -- which is the time element! -- is a map on this MVRS. That which transforms a map on this MVRS into the march of a function in ponderable space is operator-time. Such mathematical evolutes are cascade phenomena, cascades in imaginary number spaces. Each sheet is 90 degrees imaginary to the next. Active operator-time, as authentic topological operator (not merely a measurable), would decompose and recompose functional maps on this MVRS. Much as Kozyrev imagined, such plus and minus cascade marches through the stack of sheets involve repeated 90 degree twists into and out of one imaginary dimension after the next: i.e., complex angular momentum cascades, switching on and off logical-value points as they go, thus determining the emergent time-shape. We have, then, an answer to the question, What is time-shape, really? Time-shape is time-logic, a given time-shape being the order of logical-value prevailing in a given dynamic regime. The logic maps on the MVRS are the quantal latencies forcing time tendencies associated with basins of attraction, basins of attraction being an expression of the underlying identity transparency at the root (i.e., more fundamental than “distinction” and “truth-value”) of m-valued logics. 
Taking a logically m-valued map and decomposing it into marches of its constituent single-valued functions is to generate a basin of attraction. Recomposing the map by fusing the marches into their composite m-valued map is to reconstitute the quantal latency -- which looks like just so much turbulence from a Boolean perspective. Dirac spin, plus and minus ½, is a shadow of these “downward” and “upward” evolutive logical and numerical cascades, a shadow viewed through the standard interpretation of the wave function as pertaining to probability amplitudes. A better way to look at it would be with Penrose’s twistors, where the twistor is treated as quantized operator-time -- which is basically what the wave function really is, according to this non-standard m-valued Postian interpretation.
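The decomposition-recomposition being described has a familiar, purely mathematical analogue: a multivalued complex function such as √z splits into single-valued branches (the "marches"), one per sheet of its Riemann surface, and composing either branch with squaring reconstitutes the original map. The code below is only this standard analogue, offered as an illustration, not the full m-valued machinery.

```python
import cmath

# The multivalued map w = sqrt(z) decomposed into its two single-valued
# branches -- one per sheet of the two-sheeted Riemann surface.

def sqrt_branches(z):
    principal = cmath.sqrt(z)          # branch living on sheet 1
    return principal, -principal       # branch on sheet 2, 180 degrees away

z = 3 + 4j
w1, w2 = sqrt_branches(z)

# Each single-valued "march" recomposes to the same underlying map:
assert abs(w1 * w1 - z) < 1e-12
assert abs(w2 * w2 - z) < 1e-12
# The two branch values are distinct shadows of one multivalued point:
assert w1 != w2
```

Decomposing into branches corresponds here to generating the separate marches; multiplying a branch by itself corresponds to fusing a march back into the composite map it was cut from.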

total capital stocks and relentless globalization
Von Hayek’s notion of total capital stocks is an exceedingly complex idea because the total time tendencies of myriad interactive factors must be incorporated into the equation set used to assess such stocks: all possible patterns of interaction are implied. The given array of factors, when mutually interfaced according to pattern 1, have a productive potential of x; the same array, when interfaced according to pattern 2, x + 1; the same array, when interfaced according to pattern 3, x + 2; and so on to the nth case. There is a similar regress with regard to the magnitude of the array of factors chosen, which will vary according to how fine-grained the economic analysis is, and to what degree externalities are or are not excluded. The analytical grid-mesh issue itself becomes complex when the attempt is made to factor in scale-levels -- market micro-environments, local economies, urban economies, mega-urban-region economies, and so on. An additional regress enters when the assumption of a fixed number of factors in the array is dropped and the factor array is allowed to change quantitatively over time, increasing and decreasing -- thus raising the issue of basins of attraction and, indeed, the full gamut of chaos-theory problematics. All of this and more is implied by the notion of time-shape of total capital stocks.
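The scale of the first regress is easy to quantify under a deliberately minimal assumption: if a "pattern of interaction" means only which pairs of factors are interfaced, an array of n factors already admits 2 raised to the power C(n, 2) patterns, before scale-levels, externalities, or a time-varying factor array are even considered.

```python
from math import comb

# Minimal reading of the interaction-pattern regress: a "pattern" is a
# choice of which pairs among n factors are interfaced (an undirected graph).
def interaction_patterns(n):
    return 2 ** comb(n, 2)   # each of the C(n, 2) pairs is on or off

for n in (4, 8, 16):
    print(n, interaction_patterns(n))
# n = 16 factors alone already yield 2**120 candidate interaction patterns.
```

Richer readings of "pattern" (directed, weighted, higher-order interfaces) only grow faster, which is the force of the claim that assessment of total capital stocks is exceedingly complex.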

This is not a matter of much concern in a global economy where information about faraway places is scarce, communications are infrequent, computing power is small, and money is slow to move around the planet. If the economic playing field is not a level one, people do not notice that fact very much. When new technologies increase information densities, communication frequencies, computing power, volume and speed of money movement, the importance of total capital stocks and their assessment increases as well. Relative values of currencies should reflect, in one way or another, regardless of the system employed, relative values of total capital stocks. Optimally, this is what allowing the market to determine relative currency values is all about: the value of a currency should reflect the market’s assessment of the time-shape of total capital stocks of the involved economy. Hence, the level playing field. But how does the market arrive at its judgement? How does it set its price on the value of a currency relative to its time-shape of total capital stocks? Since currency traders do not have information sufficient, or the mathematical skills required, to ascertain the time-shape of total capital stocks associated with a given economy’s currency, they ignore the issue of actual economic-value of the given currency and focus upon the speculative utility of the currency, which sets the currency’s market price. A certain manageable volatility of the total currency market is necessary if there is to be speculative utility of any particular currency, and so it is that the currency markets remain volatile. How horrible it would be, if this were not the case!

If the changing relative price of currencies truly reflected actual economic-value, there would be an enormous pressure to agglomerate economies, because the more elaborate the time-shape of total capital stocks, the greater the actual economic-value. Economies of scale would become the only economies. Small could not in any way be considered beautiful. But as information densities, communication frequencies, computing power, volume and speed of money movement increase, the divergence between speculative utility and over-time actual economic-value becomes more and more difficult to conceal. The general perception emerges from reading events that bigger is better, and the pressure to agglomerate economies builds accordingly: globalization. This implosive process relentlessly leads to one world, one economy, one currency, one news organization, one company, one boss, one store, one landlord, one language, one culture, and so on. But there is little in the natural sciences or economic history to suggest that the removal of lateral partitions and the collapse to one scale-level would make the resultant globalized company-town economy a vibrant one. Is there a real alternative in the face of the new technologies? What if currency traders actually had the information and mathematical skills required to assess time-shapes of total capital stocks, assess, that is, in full light of complexity and chaos modeling techniques?

currencies on Koch curves
Ironically, attachment to the notion of national borders is leading to global uniformity, leading inexorably to a uniform borderless world. The idea that a border is a continuous line closed on itself, and that inside the line is an economy with a currency, imposes enormous limits on what an exchange unit can be and what sort of information it can carry with it on its travels. There is a difference between information about an economy and information in an economy. Economic data is information about an economy. Price differentials in market activities are information in an economy, which can be historically tabulated as information about an economy. Currency units in movement chasing the best price are information in an economy. The time-shape of total capital stocks can never be adequately assessed by economic data, by information about an economy. As the above discussion has explicated, time-shape is logically prior to basins of attraction, to all possible marches of events. Time-shape of total capital stocks is a fused composite of the regresses in its constituent arrays of factors. Analysis of historical economic data can allow one to make inferences as to what the attractors might have been, and then one can make projections from that, and test the projections with the data that comes in, and revise the projections, and revise the inferences, and on and on. But none of this will allow one to arrive at the time-shape of total capital stocks. Hence, one can never know the economic-value of a currency, only its speculative utility.

What about changing the character of the exchange unit so it can carry more information? This cannot be done so long as the notion of a currency remains tied to the notion of a border as a continuous line closed on itself. Change the very idea of a border and suddenly changed also is the very idea of a currency. The western and eastern borders of the United States of America -- which, in part, bound the national economy where greenbacks are legal tender -- are coastlines. These coastlines are Koch curves -- a fact that has been overlooked by economists. Koch curves have no measurable length. The more you measure them, the longer they get. This is because they embody the properties of the Cantor set, which can be arrayed in a line that gets longer and longer because it gets more and more holes in it the closer you look at it. The more you measure the jigs and jags of these things, the more they simply aren’t there. This is because the partition drawn by a Koch curve is not a lateral severance; the severance drawn contains all possible vertical levels of that which is severed: nests nested inside of nests. The pattern of Cantor-set holes in any given Koch curve corresponds to a particular frequency pattern and hence functions as a barrier, almost as if two waves were interfering and canceling out, while allowing other frequency spectra through. Could it be used like a stack of band-pass filters? The border is a map of the whole space, not just a line therein or thereon: the border is a mathematical function of what transpires within it. With currencies on Koch curves there is the possibility of more subtle control. Presently, with existing exchange units, a country can only nationalize a foreign company, forbid repatriation of funds for a specified period, restrict investment for less than a specified period. Not subtle controls; not fractal entrapment. If a currency unit were defined on a Koch curve, the possible subtlety of self-organized band-pass filtering would be enormous.
Moreover, in order to assess the time-shape of total capital stocks, we need currencies referenced to Koch curves, not continuous lines closed on themselves. We need the information in economies such currencies would carry. What kind of currencies would these be? They would be m-valued exchange units.
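The claim that Koch curves have no measurable length is elementary to verify: each refinement replaces every segment with four segments, each one-third as long, so the measured length grows as (4/3) to the n-th power, without bound.

```python
# Length of the Koch curve after n refinements of a unit segment:
# each step replaces every segment with 4 segments, each 1/3 as long.

def koch_length(n):
    segments = 4 ** n
    seg_len = (1.0 / 3.0) ** n
    return segments * seg_len      # == (4/3) ** n

for n in range(7):
    print(n, koch_length(n))
# The measured length diverges: the finer the ruler, the longer the border.
assert koch_length(10) > 17.0      # already past 17 unit-lengths
```

The divergence is the whole point: a border that is a Koch curve cannot be summarized by a single length, which is why it can carry structure that a continuous closed line cannot.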

onto and into the multivalued reference space
Multivalued exchange units (MEUs), fully implemented within the global economy on borders that are Koch curves, would completely obliterate the distinction between a domestic economy and the international monetary system, while yet maintaining lateral partitions and vertical scale-levels. This is the beauty of q-bitic information systems constructed with m-valued logics: tiling patterns and hierarchies have a virtual presence in the midst of deeply holographic ordering (complete transparency between part and whole). In such a system, the inherent conflict between domestic Keynesian fiscal policy injunctions and international monetary needs would simply disappear. Presently, of course, things are going in the opposite direction, with the to-be-expected consequences. The fewer economic partitions (laterally, and subsystem-system-supersystem) there are in the global monetary framework, and thus the smaller the number of entities seeking to promote high employment and real income, the greater the threat to international monetary stability via enhanced conflict between domestic economic requirements and international monetary needs. This conflict promotes collusive intervention, thus undermining market self-organizational competency.

In a global monetary system based on MEUs, the money supply in any given economic partition can be governed by an algorithm of issuing, the values of the variables of which are continuously modulated by a global (for the given partition) wave equation, which, in turn, is a function of the relative-state of the nested currencies composing the system. If our equations were classical Newtonian econometric equations of behavioral motion with single-valued variables, the notion of an “algorithm of issuing” would constitute a reversion to centralized, planned, command economies, which would have no hope of adequately modulating the money supply because of multiple-bank credit expansion, and so on. However, a q-bitic quantum wave equation is absolutely a different sort of thing, with its m-valued variables characterizing the myriad interactions of a complex multi-scale system. It is this extraordinary degree of quantal self-organizing capacity which is responsible for the phenomenon of superconductivity, the best model we have of a truly sustainable system. Nor would such nested algorithms remove decision making on the part of commercial bankers and other micro-actors: the algorithms are interested only in the aggregate money supply per partition, not in the details of its origins. Individual entities are free in m-valued systems to a degree never permitted elsewhere.

Banking decisions would, however, become much more subtly cued by incentives and sanctions. Such algorithms of issuing, when functioning in an environment of MEUs, would modulate aggregate money supply per partition, not by brittle human-judgment-determined discrete macro-interventions (i.e., regulation by fraction of reserves required, by variations of central bank discount rates, by governments purchasing and selling commercial bank claims), but by wave-equation-determined mapping of categories of credit expansion in given partitions onto various weighted values on the MEU's value-array -- thus modulating the functions of credit categories in market exchange dynamics. That is, credit categories would become branches of the m-valued function defining the given MEU. Innovations within the financial industry occur because they fulfill systemic needs which have arisen due to changing economic circumstances. The variable properties of an MEU allow for incorporation of factors related to such innovations. This would, essentially, mode-lock forms of credit and financial derivatives into feedback loops stacked onto and into the MVRS (multivalued reference space, as discussed above) of the MEUs which would be the reserve currencies for clearing payments. Not only would the chosen values be different from partition to partition and change over time, but the weights on the chosen values would also change -- both variable categories being coupled to fluctuating externalities and measures of intersystemic currency flows. Employers and employees, buyers and sellers, creditors and debtors would have to electronically match the branched exchange-unit value-arrays in order to complete transactions, these matchings being fed back, on a radically decentralized per-partition basis, into the determining conditions for fluctuation of the nested wave equations setting the terms of the nested algorithms of issuing.
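What an "algorithm of issuing" modulated by relative-state might look like can only be caricatured classically, but even a caricature fixes ideas. The sketch below is entirely hypothetical: single-valued supplies stand in for m-valued ones, and the "wave equation" is reduced to a mean-reverting feedback on each partition's share of the aggregate.

```python
# Deliberately toy, classical caricature of nested "algorithms of issuing":
# each partition's money supply is nudged toward a target share of the
# aggregate; the gain stands in for wave-equation modulation strength.
# All names and parameters here are hypothetical illustrations.

def issue_step(supplies, target_shares, gain=0.2):
    total = sum(supplies.values())
    return {
        name: supply + gain * (target_shares[name] * total - supply)
        for name, supply in supplies.items()
    }

supplies = {"partition-A": 100.0, "partition-B": 300.0, "partition-C": 600.0}
targets  = {"partition-A": 0.25,  "partition-B": 0.25,  "partition-C": 0.50}

for _ in range(100):
    supplies = issue_step(supplies, targets)

total = sum(supplies.values())
shares = {name: s / total for name, s in supplies.items()}
assert abs(total - 1000.0) < 1e-6        # this toy only redistributes
assert abs(shares["partition-A"] - 0.25) < 1e-6
```

Because the target shares sum to one, this particular feedback conserves the aggregate and only redistributes it; a fuller algorithm of issuing would also modulate the aggregate itself, and would be interested only in per-partition totals, not in the micro-decisions producing them.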

Of course, such levels of complexity could be undertaken only by staged entry over a relatively long period, beginning locally at multiple points (with LETS, Local Exchange Trading Systems) and expanding outward geographically, and upward through the scale-level nest. Any attempt to impose quantum processes from the top down (as current technology infusions are doing by globalizing from macro to micro) would in time be disastrous. In order to make the complexity of relationships user friendly, color coding would initially be required, and, as the system complexified, more sophisticated means of representation, involving autopoietic Musculpt holography and VirFut Q-Pro (written about elsewhere), would be needed to make rapid recognition and precise communication possible.

an allelotropic theory of value in autopoionomies
Algorithms of issuing primarily focus on modulations of quantities of in-circulation monies and virtual monies associated with various economic partitions. These modulations are market mediated because: (1) the monies float in relative value according to supply and demand; and (2) their prevailing relative-state is a factor fed back into the algorithms of issuing. But an m-valued quantal market has a higher level of self-organizational competency than does a single-valued Smithian market, because a quantum as a fundamental unit is a multivalued function, while an atomistic fundamental unit is only a single-valued function. This multivalued quantum property is incorporated into the quantal exchange unit itself, thus allowing each individual unit to directly carry major components of explicit macroeconomic information into every single market transaction in a fashion impossible to atomistic exchange units. The macroeconomic state in a system incorporating MEUs, therefore, much more directly arbitrates incentives and sanctions in the market micro-structure than it does in a single-valued Smithian market. The quality aspect of money -- its “standard of value” and “unit of account” aspects -- is part of the burden of value carried by the multivalued information stacked onto the function space of an MEU. The fact that the state of a given MEU at time, t, is partially a function of the relative-state of all MEUs in the global monetary system at time, t - f(x), ensures that the allelotropic attributes of value are being adhered to. Allelotropic? Yes. Price is the information unit, the values of which set up action directives according to conventions of the market supply-and-demand dynamic, which in turn propel micro-actors via incentives and sanctions. Free market signals (properly functioning, accurate prices) require removal of price distortions. But, in truth, we have learned a great deal about the nature of information since Adam Smith published his book in 1776.
We know, for instance, that information has more properties than simply accurate or inaccurate. It can be accurate and have, or not have, one or more of these other properties. Of major importance is whether or not the information carried by an accurate price is single-valued or multivalued. If the “order of value” (the degree of branching) of an accurate price is factored into the system establishing a monetary standard of value, then value itself becomes the determining variable. The best arbiter of value is the free market, but by what fundamental criterion would it arbitrate in presence of accurate prices carrying m-valued information? Here enters an “allelotropic” theory of value. Allelon: of each other. Trope: a turning. Value as a measure of relative-state, of superintegration, of overdetermination. Value is the “quantum potential” in an “autopoionomy”, an economic process self-organizing on the basis of quantum principles. Value as a measure of the capacity to integrate the subsystem-system-supersystem composite. A wave function is required to represent it. Value is the index of a “turning to each other”; it is a metaphorical embodiment, an allegorization, of the “other awareness”, the non-simple identity amongst parts of a quantum system -- economic value being only a special case of the general principle which manifests in physics, chemistry, sociology, sexology, ethics, metapsychology, and so on. Behaviors that exhibit value are allegiant: loyal.

f(x), given above, is a time-lag period that is a function of processing variables operative in the global monetary system at given states of self-organizational competency. No adequate monetary system can be a mere mechanism; it must be an unfolding process, or the changing technology-driven economy will evolve beyond it. A self-regulating global monetary process is needed. This is not a homeostatic master control device, but a dynamic wave-guide on economic activity: a model analogically based upon superconductivity, not clockworks. Implementing MEUs is not a project in institution building; it is a project in deinstitutionalization. Nor will it promote empowerment; it will demote the very notion of empowerment. Since systemic electronic processing can never transpire absolutely in real-time, f(x) is one of the many measures of the level of self-organizational competency displayed by the system at time, t. This can all be described algebraically and geometrically. Measures of the level of self-organizational competency jump out of both representations.

forest fires, leaded-glass windows, ideal forms, and attractors
The quality link between money and physical wealth is fundamental to an MEU floating in the cyberspace of a quantal marketplace, but it is not at the forefront. The relative-state property of an MEU is quite similar to pegging to a basket, except that there are no predetermined central peg values or band margins associated with the branched value-array stacked on the currency base, which can be used as paper or metal alone for a limited range of transaction types. The currency base floats, like any unconstrained single-valued currency, while the branches of the stacked value-array are wave-equation modulated according to: (1) the relative-state of the currency bases in the given basket or scale-level nest; (2) measurements taken on changing macro-variables and externalities tagged to the m-values of the stack; (3) fluctuating total capital stocks as defined by Von Hayek in terms of their changing time-shapes, made increasingly mensurable as the m-valued framework evolves in complexity.

As chaos and complexity theory have, like quantum mechanics, abandoned the notion of causal connection between temporally adjacent events in a linear-time sequence, Stuart Kauffman’s idea of an “nk fitness landscape” can profitably be used to approach the problem of writing a wave equation for m-valuation in a generalized currency basket. The fitness landscape is an application of percolation theory, wherein an “influence” is allowed to percolate over time through a set of objects, and, via the monitoring of critical variables, the prediction of certain states such as the shift to chaotic behavior, the attainment of self-organized criticality, and so on becomes possible. Kauffman’s work has primarily been in the area of DNA codons. A better illustrative example of application of percolation theory is that of modeling forest fires. Given various values of critical variables (number of trees, average spacing, geographic topography, typical canopy patterns, number of fire sources, wind conditions, moisture content of the soil, relative humidity), what kind of forest fire is likely to transpire? Will there be a controlled homogeneous burn? Will the fire make jumps leaving spotty burn clusters? Will the fire move through the whole forest or stop at some invisible barrier? Kauffman’s idea simplifies the problem by reducing the number of variables to two: n and k, where n is the number of elements in the system and k is the number of correlations between those elements. Each of the elements can be in one of two states: on or off. So Kauffman’s model is actually that of a 2nk fitness landscape. A random number generator is used to turn the elements on and off, providing patternless input to the computer model. When the model is initialized, time starts running.
Kauffman has found that, simply by varying the number of elements and/or the number of correlation factors, it is possible to predict evolution of certain properties of the resultant system: whether or not it will enter a regime of chaotic behavior, whether or not it will stabilize into a state of self-organized criticality, how long a period is likely to be required before transitions of certain types are likely to occur, and so on. But the ability to make these predictions in no way assumes that the turning on/off of a given element causes the change of state of a spatially adjacent element (unless this, specifically, is designated as one of the k correlation factors defining the parameters of the given system), nor does it assume that the instantaneous totality of on/off states of the full set of systemic elements at a given instant of time causes the instantaneous totality of on/off states of the full set of systemic elements at the “adjacent” (i.e., subsequent) instant of time. Causality between system states (of the ensemble of constituent elements) of the evolutionary (march through time) sequence of states is not assumed -- either by spatial reference or temporal reference (adjacent in space, adjacent in time).
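The model described above can be sketched in a few lines of code. What follows is a minimal, illustrative boolean-network simulation, not Kauffman’s own implementation: the function names, the cycle-detection scheme, and the use of a fixed seed in place of his random number generator are all assumptions made for the sketch.

```python
import random

def make_nk_network(n, k, seed=0):
    """Build a random Kauffman-style nk boolean network: n on/off
    elements, each updated by a random boolean function of k
    randomly chosen elements (its correlation factors)."""
    rng = random.Random(seed)
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    # One random truth table (2**k entries) per element.
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    return inputs, tables

def step(state, inputs, tables):
    """Synchronous update: no element 'causes' its spatial
    neighbour -- each simply reads its k correlated inputs."""
    new = []
    for i in range(len(state)):
        idx = 0
        for j in inputs[i]:
            idx = (idx << 1) | state[j]
        new.append(tables[i][idx])
    return tuple(new)

def attractor_length(n, k, seed=0, max_steps=10000):
    """Run from a random initial state until a state repeats, and
    return the length of the cycle -- the basin's attractor."""
    rng = random.Random(seed + 1)
    inputs, tables = make_nk_network(n, k, seed)
    state = tuple(rng.randint(0, 1) for _ in range(n))
    seen = {}
    for t in range(max_steps):
        if state in seen:
            return t - seen[state]
        seen[state] = t
        state = step(state, inputs, tables)
    return None  # no cycle found within max_steps

# Low k tends to give short, orderly cycles; high k, long chaotic ones.
print(attractor_length(n=12, k=2))
print(attractor_length(n=12, k=8))
```

Varying n and k in such a sketch is exactly the manipulation Kauffman performs: the on/off values remain patternless input, and whatever order appears is a property of the emergent attractor, not of any element-to-element cause.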

The focus is on the emergent pattern of events (as the random number generator insures patternless input), not on causes thereof. What causes a given tree in a forest aflame to burst into incandescence? The fire jumping from the adjacent tree? The general temperature rise in vicinity of the burning section? The tornadic winds carrying incandescent gases in leaps from place to place? All these, and perhaps none of them, because burning behaviors change at transition thresholds seeming to have little to do with physical processes (to include critical temperatures) of forest fire burning: thresholds near entrance to the regime of chaotic behavior, near entrance to the regime of criticality. Forest fire events at critical states seem to defy physical explanation. Why does a roaring fire suddenly stop along a line where there appears no barrier? Why does a stand of trees, like any other, positioned in the middle of a burn not burn? And the like. These regimes are holistic, omni-transforming, overall configurational states, and they seem to “determine” the patterns of behavior of the individual elements of the ensemble under “influence” of their regime. So the percolation model manipulates variables characteristic of these regimes only, not of the physical processes of forest fire burning -- and is able to make more accurate predictions than possible by consideration of the physical processes. Thus, the percolation model has no model of the cause or causes of the events it models. On this model, then, what causes the on/off or burning/not-burning state of the elements? Nothing in the model! “Cause”, as a notion, is ignored, and the events are simply “treated” as randomly generated but teleologically attracted to certain emergent patterns. But something turns Kauffman’s light bulbs on and off at the input instructions of the random number generator.
The electric current involved in doing this, of course, is not part of the model: it is transcendental to the model; it is in another sphere altogether.

The fractal-attractor pattern on Kauffman’s fitness landscape, which he so assiduously studies as an emergent causing-from-the-future determining tendency or influence, is like the pattern on a leaded-glass window in a medieval cathedral, through which the Divine Light poured to illuminate the events of the incarnate world below. This transcendental influence (the Divine Light) was viewed by the neo-Platonic scholar of the Middle Ages as the first and only real cause of events. This electrifying influence from higher spheres (like the electric current transcendental to Kauffman’s light bulb model) was patterned in its incarnate play by the lead in the glass representing the strange attractor as Platonic Ideal Form. But these “perfect forms” came in multiples, hence many “windows” in the “church” for the “light” radiation to penetrate, thus casting a composite light-play music-sculpture upon the floor of the cathedral where the medieval mason focused his numerological imagination. Music-sculpture, that is, when the light-play was accompanied by polyphonic plainsong Gregorian chant, wherein the harmonic structures of the multiple lines were analogous to the leaded-glass patterns, and the projecting cantus firmus basso ostinato -- the Word of God -- was the aural equivalent of the penetrating Divine Light. In a very real sense, what Newton did -- and Leibniz tried to prevent him from doing with argumentation over monads and fluxions -- was disentangle the projected light-play composite and regard elements of the patterns cast through the individual viewing windows as, not only connected in time, but as antecedent and consequent, one element causing the next. Newton even came to regard the windows as causing one another! Quantum mechanics rediscovered the cathedral.

2nk becomes Mnk
The problem of an emergent property teleologically back-causing from the future before it exists in that future is a non-problem. The basin of attraction is a logic product of the quantal action of operator-time on the m-valued light bulbs in the multivalued reference space -- which is the “floor” of the cosmic cathedral. But there are several difficulties that must be resolved if Kauffman’s landscape model is to be used to enter this cathedral. First, the elements of any functional march, linear-time sequence, scenario (be they trees in a forest fire, units of exchange in a money meltdown crisis, or codons in a genetic information system) operate by graded levels of activity, not just in on/off states: trees burn faster or slower, to various stages of destruction; currency exchange rates vary over a range, not simply jump between no value and maximum value; because of the quantum wave properties of DNA codons, there are highly complex graded states of gene on/off-ness in response to ambient factors like radiation. This does not imply continuous smooth variation in violation of the basic idea of a quantum of action; variables step discretely from one stepping-stone value to the next. So, the implied 2 in Kauffman’s nk model must be able to take on any numerical value, thus becoming an M which represents the ability of an element to assume the full range of values over a graded scale. We thus move on to an Mnk landscape for each of our currencies on Koch curves.
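The move from 2nk to Mnk can be made concrete by letting each element hold one of M graded levels rather than on/off. The sketch below is a minimal illustration, assuming (hypothetically) that the update rules remain random lookup tables, now with M**k entries; the names are invented for the sketch.

```python
import random

def make_mnk_network(m, n, k, seed=0):
    """Mnk generalization of the 2nk model: each of n elements
    holds one of m graded levels (0..m-1) and is updated by a
    random m-valued function of its k correlated inputs."""
    rng = random.Random(seed)
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    # One random lookup table (m**k entries) per element, each
    # entry itself a graded level in 0..m-1.
    tables = [[rng.randrange(m) for _ in range(m ** k)] for _ in range(n)]
    return inputs, tables

def step_m(state, inputs, tables, m):
    """Synchronous update: variables step discretely between
    stepping-stone values, never varying continuously."""
    new = []
    for i in range(len(state)):
        idx = 0
        for j in inputs[i]:
            idx = idx * m + state[j]
        new.append(tables[i][idx])
    return tuple(new)

# m=2 recovers the binary 2nk model; m=5 gives five graded levels.
rng = random.Random(1)
m, n, k = 5, 8, 2
inputs, tables = make_mnk_network(m, n, k)
state = tuple(rng.randrange(m) for _ in range(n))
for _ in range(3):
    state = step_m(state, inputs, tables, m)
print(state)
```

Setting m = 2 here collapses the lookup tables back to boolean truth tables, which is one way of seeing that Kauffman’s model is the binary special case of the graded one.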

Secondly, suppose a given m-valued currency incorporates sustainable development indicators as variables of measurement. Where would they be represented on the landscape? The k factor, of course. These indicators are correlation factors, holistic properties of the system: the indicators are measures of correlations between physical processes driving evolution of the system’s states. k does not concern itself with what these indicators are in the concrete, only with the number of them operating in the system. Vary the number, and the system behavior moves toward or away from self-organized criticality, toward or away from the chaotic regime. Knowledge of these behaviors, both in abstract chaos theory and in the concrete case of the physical system under consideration, is necessary in order to choose the appropriate number of indicators for the given currency (what the chosen measures specifically measure in a physical sense is a secondary, logically derivative issue).

But the magnitude of the k factor makes a statement about the identity of the concrete elements correlated by the m-valued currency. The smaller the number of behavioral correlations, the more distinct the elements; the greater the number of behavioral correlations, the less the possibility of clearly distinguishing between those elements. This is one reason why total system behavior is critically regulated by the number of correlation factors present: the magnitude of k determines the relative-state (degree of identity transparency) of the elements composing the system. So, in a very real sense, the value of k modifies the “meaning” of the value of n, the number of elements in the system. High values of k can make n “fuzzy”. Consideration of this would lead us into “fuzzy sets”, a branch of contemporary set theory.

A fuzzy n, in turn, reflects upon the state of the element-fuzzed-to-elements in question. What graded level of activity -- the magnitude of the M factor, that is -- is the state of the fuzzy element-elements in, under high values of k, at time, t, say? No exact determination can be made. So M, too, becomes fuzzy. This would lead us into fuzzy logic, if, that is, we did not already know that fuzzy logic is a shadow of full-bodied m-valued logic in the same way that Dirac spin (and, indeed, Dirac’s “crossover time”) is a shadow of operator-time -- both shadows being viewed through Boolean lenses and the assumption that truth-value is the bedrock of logical semantics. No single-valued logic -- be it Boolean or Post’s m-valued truth systems, where a given proposition must represent one of 2 or m possible values -- is adequate to “resolve” the patterns visible in this domain where values of variables inherently cannot be exactly ascertained. Kauffman finds that in his studies, orderly behavior can be maintained only with low k values. As the value of k increases (i.e., more correlations between elements), the system transits ever more decisively toward chaotic, patternless behaviors. This is counter-intuitive, as a system whose elements are more highly correlated (e.g., a coherent system in a superconductant state) should be more self-organized than one whose elements are not well correlated. Because of his reliance upon binary on/off relations, Kauffman has not factored the fuzzy, relative-state, identity-transparent aspects of system behavior into his model. Nor has he realized that what he sees as random or disordered from a binary-logic perspective may, from the perspective of the appropriate m-valued logic, be highly patterned.
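As a concrete point of reference for m-valued truth systems, the sketch below tabulates the connectives of Łukasiewicz’s m-valued logic, a close relative of the Post systems named above. The choice of Łukasiewicz’s negation and implication here is purely illustrative, not something the article specifies.

```python
from fractions import Fraction

def lukasiewicz_values(m):
    """The m evenly spaced truth values 0, 1/(m-1), ..., 1 of
    Lukasiewicz's m-valued logic. Post's systems use the same
    number of values but a cyclic negation."""
    return [Fraction(i, m - 1) for i in range(m)]

def neg(x):        return 1 - x
def conj(x, y):    return min(x, y)
def disj(x, y):    return max(x, y)
def implies(x, y): return min(Fraction(1), 1 - x + y)

# With m=2 this collapses to Boolean logic; with m=3 the middle
# value 1/2 is a graded state a binary lens can only register
# as "random" or "disordered".
vals = lukasiewicz_values(3)
for x in vals:
    for y in vals:
        print(x, y, conj(x, y), implies(x, y))
```

Restricting vals to {0, 1} reproduces the familiar Boolean truth tables, which is the sense in which two-valued logic is a degenerate slice of the m-valued system.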

This takes us on to the third issue. From early childhood (especially in the West) our minds are enculturated to abhor fuzziness. To get rid of the fuzziness, we find it necessary to split a composite into multiples. The fused state of the system ensemble at high values of k is separated into identity components as if spatially located in different domains, or as if temporally located in different linear-time segments. So, let's make a model that takes this habit-of-mind into consideration, yet does not forget that the fused base state is an always-there reference space which at any of its points operator-time can decompose. Let S represent a given currency on a Koch curve. Then S cross Mnk, that is S x Mnk, is required to characterize the m-valuedness of the given currency’s landscape. Cross multiplies each element in S across every other element in S via matrix algebra. To carry this through would be to algebraically construct a composite of the multiple time-streams (each drawn to basins of attraction) characterizing the time-shape of total capital stocks, of the economy whose currency it is, under differing degrees of integration: i.e., various branch functions or fuzzy value-regimes of the variables M, n, and k. Call the global economy U and let it contain a number of currencies S1, S2, S3…Sm. Then Sm x Mnk = U characterizes the universal covering surface or landscape of the multiscale dynamic system under consideration, the global economy. And it is on U, not on the landscape of any S, that self-organized criticality is seen at high values of k -- with all that implies about globalization.
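The construction of U from the component currency landscapes can be caricatured in code as a product over landscapes. Everything specific here is an assumption for illustration: the dict representation of a landscape, and above all the combination rule (a plain mean of component fitnesses), which the article leaves entirely open.

```python
from itertools import product
from statistics import mean

def covering_landscape(landscapes):
    """Hypothetical sketch of U built from S1 x S2 x ... x Sm:
    each currency Si maps its own graded states to a fitness
    value; U assigns each joint state the mean of the component
    fitnesses. The mean is an illustrative stand-in for the
    unspecified cross operation."""
    keys = [list(s.keys()) for s in landscapes]
    U = {}
    for joint in product(*keys):
        U[joint] = mean(s[state] for s, state in zip(landscapes, joint))
    return U

# Two toy currency landscapes with graded (m-valued) states.
S1 = {0: 0.2, 1: 0.9, 2: 0.5}
S2 = {0: 0.4, 1: 0.6}
U = covering_landscape([S1, S2])
print(len(U))        # 3 * 2 = 6 joint states on the covering surface
print(U[(1, 1)])     # mean of 0.9 and 0.6
```

The point of the caricature is only structural: properties such as self-organized criticality live on the joint states of U, not on any single S, so nothing observed on one currency’s landscape alone settles the behavior of the global composite.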

Arguments can be made that M is operator-time, and that it replaces the wave function in a quantum wave equation as construed under the standard interpretation of quantum mechanics. A primitive equation set can be constructed from this canonical wave equation, but such mathematicizing is far less important than taking a LETS like, say, Ithaca dollars, and beginning to see what happens when externalities or quality of life indicators are attached to it as m-values. The mathematics should not be prescriptive relative to a global monetary process which ought to evolve through use like any natural language.

(This article was written in fragments during the first six months of 1998, while the author was working as a copy editor for The Saigon Times, a publication of the Ho Chi Minh City Party Committee. Though the man has no theoretical grasp of the concepts involved, the author would like to credit Tran Bach Dang for setting him on the path leading to their formulation. Tran Bach Dang was the Saigon Party Committee Chairman throughout the American period of the Viet Nam War. In 1968, the author was at Strategic Research and Analysis, MACV Headquarters, charged with being the resident expert on Tran Bach Dang and his organization -- which functioned like no bureaucracy seen before or since. Many of the concepts described above emerged in their initial formulation in 1968 as attempts to comprehend the dynamics of Dang’s underground political infrastructure. Within 18 hours of the author’s first arrival back in Viet Nam in 1993, he found himself sharing tea on Dang’s terrace, graciously accepting the man’s offer of visa sponsorship.)

