Part 1 -- Complexity

(The fog: Problem and Opportunity)

Christopher Spottiswoode
cms@metaset.co.za
 
15 October 1998 (minor updates 27 October)

(This document is part of the "Ride The Mainstream!" paper.)

The Divine Programmer Syndrome, and the consequent mess

As usual in The Mainstream, someone else identified it long ago. It was the gist of Charlie Bachman's 1973 ACM Turing Award address, in which the thereby-recognized "father of DBMS" pointed out that programming must undergo a Copernican Revolution: the CPU, like the Earth, is not the centre of the universe. The programmer must navigate in a universe of data.

We share … we even exist … through the wisdom of our accumulated common knowledge, not our own momentary thoughts.

The change in perspective is fundamental, but it has still not been carried through to its logical conclusion. As we begin to see in this section, conventional DBMS and OO architectures are still largely stuck in a "naïve realtime" outlook, analogous to classic notions of an absolute and all-embracing space with a naïve concept of simultaneity (even though some of it may be "slightly behind"). Just as physics has thoroughly integrated the fact that the speed of light is finite, so also must programming architectures fully recognize that no programmer can assimilate or even grasp the entire data-universe out there, and no practical user even considers the notion.

Much though programming has yet to catch up, in The Mainstream the recognition of complexity has always been implicit. We all accept and face it practically. It shapes every facet of our lives. As individuals or in groups we live asynchronously, more simply, in our own "relativistic" data-perspectives or data-worlds. Each one has its own conceptual space and relevant time. Communication involves transforming one view into the perspectives of another. Hence MACK's "relativistic realtime" concept, as we shall see. (It would also be a key addition to XML as it is currently aimed or at least presented.)

Even the whole of biological evolution can be viewed as the story of how species have found their own simplified yet viable niches, each one ignorant of what its senses cannot perceive, as senses and brains and response-patterns have evolved in partnership. (Is the frog ignorant or blind when it does not see an immobile though live grasshopper as food? Take your pick, but its view is very rational, considering the various constraints of the species.)

The human mind has taken the natural evolutionary process an amazing step further: what took a whole species evolution to achieve, for us merely requires the creation, testing, refinement and practice of a new perspective or mental model, or the taking to heart of an existing one. (And what a marvellous relief to a nexus of problems that can be! Or how exhilarating with possibilities! But how tempting and deluding such reductionism can be. We need to be more awarely rational than that frog…) We can flip from one perspective to another in a blink of an eye or a twinkle of a thought. That's one reason why we like multi-window applications. It's also a key aspect of that flickering mosaic of groups in which we live. Each view has its own conceptual sensory apparatus, ignoring what is deemed irrelevant to it, and -- ideally -- focusing efficiently on the effective goals and activities of the moment.

But we can only see it, and see how we already manage it, by squarely facing the immensity of our universe and the complexity of its possibilities. Then we might manage it better. The opportunity is immense too.

Those observations underlie MACK's "relativity". It closely maps the evolution of species and of our conceptual artifacts. But it is also the natural resolution of many of the big practical problems in information systems, from relevant and coherent view composition and input management with access-security and privacy, through version-management, schema migration and efficient processing, to the resilient integrity of the multi-user distributed database.

Meanwhile, programming architectures are still stuck in the pre-Copernican Dark Ages.

The programmer is still assumed to occupy some divinely-ordained central position of omniscience and omnipotence, and he (or she) tries happily to oblige (though somehow the masculine instances do also seem more typical here...).

He comes to feel in full control. Majestically orchestrating the CPU, he commands services (functions, procedures, methods, ...) to do his bidding, exactly as he imagines them to be. And lo!, computing does indeed often seem to have the track record to have him confidently expect his whim to become law. System software providers work hard to pamper him, surrounding him with a dazzling display of more sophisticated instruments (APIs, Objects, Interfaces, Components, ...), from which he may pick and choose to his heart's content. Especially with the Internet, he has tremendously extensive access to an entire universe of data and other resources. Analogies with hard engineering componentry flatter him into the attitude that his impulsive compilations are sure to meet his customers' needs. He is at the centre of his universe. How exciting! Yet how deluding.

The resulting mess -- for are we programmers not all too human? -- has all too familiar symptoms. Lack of reuse in his own orchestrations results in widespread duplication of code between programs and applications. The program client bloats. Unmanageability looms. (And the "thin client" and "network computer" tempt architects to throw the baby out with the bathwater, for "off-line" work is not a downgrade but an opportunity for natural simplification, the challenge being to feed it in an appropriate way and relevantly to the user's purposes.)

Flexibility evaporates, and now the world even has a Y2K problem, excellently spotlighting a severe and profound architectural malaise, despite the underlying stark simplicity of Y2K's cause and real implications. The problem is not the missing digits but the silly complications of making such a minor change as adding them, when it has to be done in so many places, each often with its own unobvious side-effects. All due to lack of reuse of essentially simple functionality. And conventional OO, even had it been in effect for decades, would not have eliminated the problem. Many side-effects of what is usually cast as a storage issue extend beyond the Classical Object Model's "island classes", and stir up the other tiers of the 3-tier model, impacting both presentation and workflow. (Since we don't always want to see or type those extra digits, consider the many application-dependent ways of handling default input and output centuries.)
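The point about reuse can be made concrete with a small sketch. Nothing below comes from the paper itself; the names (`PIVOT`, `expand_year`, `display_year`) and the particular windowing rule are illustrative assumptions. The idea is simply that the two-digit-year convention, had it been captured once as a shared rule rather than scattered through storage, input, and display code, would have been a one-place change:

```python
# A single reusable century-window rule, instead of the two-digit-year
# assumption being duplicated across storage, input, and presentation code.

PIVOT = 30  # assumed application-dependent cutoff: 00-29 -> 2000s, 30-99 -> 1900s


def expand_year(two_digit: int) -> int:
    """Interpret a two-digit year via the one shared windowing rule."""
    if not 0 <= two_digit <= 99:
        raise ValueError("expected a two-digit year")
    century = 2000 if two_digit < PIVOT else 1900
    return century + two_digit


def display_year(full_year: int, short: bool = True) -> str:
    """Presentation-tier choice: show 2 or 4 digits, decided in one place."""
    return f"{full_year % 100:02d}" if short else str(full_year)
```

With the rule centralized like this, changing the pivot, or switching every display to four digits, touches one definition rather than "so many places, each often with its own unobvious side-effects".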

The Divine Programmer succumbs into self-centredness and lives in a world so different from his neighbour's that applications interoperate clumsily at best. (Are you happy with how your e-mail messages and web browsings are integrated with your own database?)

His divinity crumbles as it becomes evident that he cannot master the purported wealth of service offerings. (Have you tried to absorb the OMG's proposed full CORBAservices and CORBAfacilities? Or Microsoft Windows' many thousands of API calls? Generally not, of course, so Procrustean packages from higher in the heavenly hierarchy then treat us fallen angels with contempt. (Happily, the aptly ancient image of Procrustes, the legendary bandit who cut or stretched his victims to fit his bed, has come to personify the figurative and sometimes literal mutilation of other people so that they might fit one's own preconceptions.))

Insidiously, in the "confusion of algorithm time with real time" that my 1996 paper pointed out and its faq question 1 reply elaborated on, it is too easy for the programmer to imagine that users' thought processes and activities can be simply mirrored in his own procedural code with its special perspectives and narrow horizons. What fragile oversimplifications result, pretentious "wizardry" notwithstanding!

In the most immediate user predicament, when the all-too-human programmer has misunderstood a service, or there are undocumented or otherwise unforeseen interactions between services, the user is landed with an error message which seems to assume that the Divine Programmer's angelic and similarly omniscient emissary is ever watching, and can omnipotently solve the problem there and then. Alternatively, his lordship or one of his minions might be prevailed upon to provide some expensive or at least status-enhancing support services, thus closing the vicious circle that maintains the priestly caste.

It is such ransoms which support the coteries of technical experts living symbiotically under the wings of the large software package providers. Thus the circumstances of the Divine Programmer's very profession are further cause of the polarization of the market's supply-side into ever fewer large package providers with inversely-proportionate size and concomitant neglect of locked-in customers.

Resignation spreads as ever larger numbers of users regard the occasional new release's meagre improvements as something to be grateful for. Imagination and initiative are stunted. The well-known "hidden backlog" of unvoiced yet reasonable demands grows inexorably.

One could continue in that vein, but the litany of related woes already pervades both the specialist and the popular media. It will become clearer as we proceed how they amount to a syndrome largely caused by the implicit myth of the Divine Programmer.

Fortunately, of course, something can be done about it.

However, the cure for the syndrome is not mere incremental improvement of program, programmer, or programming or scripting language quality. (Even less is it a little dose of anti-trust lawsuit.) First we need a good diagnosis:

The problem is more fundamental: it lies in the well-known analysis and design technique that goes by such names as encapsulation and complexity-hiding.

The technique pervades the mainstream of programming evolution. From Fortran's functions, through Modular and Structured Programming, to Object Orientation and component-based design, so-called complexity-hiding reigns widely supreme.

Another aspect of the problem is apparent when we review the syndrome, not as caused by the programmer's mere humanity, but by that of the system architect.

Since the programmer alone can evidently not handle the big problem, system architects have followed the same principle at a higher level. Thus complexity-hiding has resulted in the three-tier and other layering architectures which seek supposed cleavage planes that might cleanly split reality's rough diamond into smaller orthogonal pieces which the programmer might handle with greater success.

But even after the three-tier split, the Divine Programmer is still not divine enough, so the next step has been UI, Middleware and DBMS packages. But they have had a hopeless job from the start, as there are no such clear orthogonalities in reality (as we saw above even in the simple Y2K case), so those inevitably Procrustean products have merely further spoilt the gem of rich complexity.

Alternatively, 4GL designers have maintained a more global approach. But that requires the support of built-in frameworks or skeletons which, not being divine products either, merely add their own inflexibilities, and with mixed results. Each one's niche is limited. That is bad for reuse and interoperation.

Finally, therefore, the ultimate Procrustes of the big application package has run rampant, bulldozer-like, trampling and mutilating the complexity of the user's needs. The user would prefer not to take it lying down but pays up nonetheless. For many there is evidently no alternative.

Clearly, the Divine Programmer caste, so well treated by it, is quite naturally a bit confused about complexity. More specifically, as we shall see, the foggy conception of complexity-hiding needs a major clarification and repositioning, and it is a very central feature of MACK that it better situates that key mainstream technique. (It improves the programmer's job too, but we come to that later...)

The Syndrome is even more serious than may at first appear, for in mistaking complexity, programming has misapprehended arguably the most basic and far-reaching matter that we can possibly talk about objectively, as we shall now see.

One consequence of a more correct software-architectural positioning of complexity will be history's judgment that present so-called complexity-hiding is almost tantamount to a gigantic fraud. It takes inordinate amounts of common intellectual property which should be openly recognized as everyone's heritage from The Mainstream, bundles it in with small measures of true invention and more validly private property, and presents the package as hidden complexity ... expensive of course ... when it would be more accurate to call it bad and cheap artificial complication. Users are being had for enormous suckers.

Architecturally better recognizing true product advances, and better supporting them infrastructurally, together make fine yet powerful supply-side stimulation. Coming on top of a market running on qualitatively better groupware, that can only have a major depolarizing effect on present software supply.

Even without looking very deeply, though, is it not utterly plausible that a better appreciation of complexity -- and our various poor human responses to it -- should help us towards better shared knowledge and more successful joint action? It is, after all, deemed a characteristic of our times that we are bumping almost everywhere into a complexity that is beyond our means to comprehend.

Thus the Divine Programmer's so-called complexity-hiding is merely a mistaken though historically-explicable way of simplifying real complexity that has rather naturally led to the systemic complications of his profession that we have noted.

It will be easier to see that in the context of the alternative. (Medawar's Dictum: Theories are not displaced by facts, they are replaced by better theories.)

Epistemology is about knowledge despite complexity, and applies to itself

I once again apologize for the autobiographical aspect of this part of my presentation, but it is difficult to talk simply about complexity! (That is especially so when it makes one so critical of many others.)

However, I am in good company with such difficulty, as "the complexity of reality" is a basket-phrase trying to hold much of the most difficult matters which philosophy has tried to deal with. That makes it a personal adventure as well, a pilgrim's progress, an Odyssey, invariably with its own lessons. Those of my own story are quite reasonably relevant to this project, throwing light on its present strengths and weaknesses so that you may build on or correct them.

Since philosophy seeks wisdom despite the often-overwhelming complexity of reality, complexity may be construed as the "raw material" of philosophy. One of the major branches of philosophy, epistemology, the "science of knowledge", has the core task of trying to show how, despite that complexity, human knowledge can still exist, and hence, more particularly, how we may better trust and improve our knowledge.

Most desirably, epistemology should be relevant to every branch of knowledge, from theology through ethics, politics, sociology, psychology, the arts, the natural and physical sciences and mathematics, to all computerized knowledge or coded information.

The nub of my metastory is then that I had put together a quite extensive epistemology addressing all the areas on that list except the last, before I had even encountered any computerized information system. Next -- to anticipate the story below -- that epistemology has turned out to be amazingly applicable to computerized knowledge models too, even down to some quite fine detail, some of which has only recently transpired to be the case. Finally, it has even had a considerable degree of more obviously predictive value.

That is the classic formula for the "proof" of a scientific theory or applied abstract model, so -- I am hoping -- will help with my credibility problem.

The most significant proof for me has been the way the epistemology has on the whole (that is, despite some regrettable consequences of overexuberance when trying to put the message across or refine its implications...) kept me on track for many years, notwithstanding the allegedly "postmodern" times in which we live.

I shall further show how it has enabled me to predict, well enough ahead of time, and in significant detail, how one of the currents of software-architectural standards efforts has flowed during the past year or two.

So by now I am pronouncing in a rather bold way on epistemology and how we might better deal with complexity in the future. And I am worried primarily about how to get others to help get that project into a more accessible shape and take it further, with a minimum of bother and loss of time, as you and they are essential characters in the further unfolding of the story. (To begin with, as I have already asked, you can be part of addressing that very challenge of presentation of what I believe is, after all, our common heritage in The Mainstream.)

But let us leave the metastory and story, and get back to the basic matter.

The epistemological problem has generally boiled down to trying to bridge the gap between knower and known, between the subject and object of knowledge. (And although that formulation is criticized for being unduly dualist (the monist position being that reality is one), we shall provisionally ignore that objection as it has little immediate bearing on an Architecture for Common Knowledge that must fully recognize individual subjects.)

But it is likewise difficult to get down to any practical detail beneath that bald statement of the problem. The history -- and the present state -- of philosophy can only offer us images.

For example, "reality as a black box" is a modern attempt at representing the object side of the dichotomy and implies how we the subjects are expected to deal with it. "Science" is then the answer to the detailed practical questions. But the resulting world is still not as it seems it should or could be.

Plato's image of the cave, representing our knowledge as shadows on the wall of a cave while we have our backs turned permanently to the true reality of the light of day, likewise aims to portray the ultimate unknowability of reality. Kant's world of "noumena", of the unknowable elemental ding-an-sich or thing-in-itself that we can maybe just conceive, similarly distinguishes a supposed reality from the "phenomena" that we can experience. But neither image seems to get us anywhere practical.

In reaction to such difficulties, Empiricism concludes that we should simply ignore any conception of any "reality" supposedly "behind" or "beneath" our experiences. Especially, its proponents imply, we should stop seeking any "transcendent reality" or "transcendence": there is nothing to discover, they assert, we can only invent models on whose basis we may predict. And Instrumentalism further concludes that those models only have meaning that is evident from their "method of verification". Pragmatism is an appealingly innocent philosophical position that is in line with such epistemological views.

But while such "mainstream science" attitudes are very useful and admirably modest, they are unduly narrow, and do not adequately take account of or integrate the many practical limits of conventional scientific method. That is particularly true of positions which would purport to have "reduced" all human phenomena to the laws of physics but are not practically applicable in many domains very close to our human concerns, and probably never will be. Such reductionisms are in that way the most radical of blinkered oversimplifications, and the smugness of their authors when others disagree is one of the saddest illustrations of the aberrations that complexity typically brings about in us mere humans. (There is a hint for you to examine my own simplifications.)

To jump to the end of my own overall epistemological representations, I conclude, based on all this history, our present experience, and all the ambient talk and theorizing about complexity (and chaos, uncertainty, etc), that we may indeed posit a "complex unknown and ultimately unknowable reality that underlies our knowledge" (or some such wording) and then characterize it as "infinitely complex" (It would be infinitely more complex than any such simple thing as the most intricate computer program or quantum field equation or DNA configuration or econometric model that one might imagine). It is the logical counterpart of the accepted fact of the various inescapable imperfections and incompletenesses of our knowledge. I don't think that is controversial any more. It may safely be asserted as a philosophical, epistemological, and knowledge-modelling axiom. It is also inherent in the two epistemological images below.

As we shall see, that is where the complexity lies that "complexity-hiding" should be hiding. And there certainly does seem to be an infinite wealth that we can discover.

That "infinitely complex underlying reality" appears -- amazingly -- to be "indefinitely further discoverable" in terms of human simplifications. That observation is a projection from the entire "phenomenon of knowledge" by mere humans despite that complexity. Expressed theologically ("Theology is the finite trying to talk about the infinite."), that might be put in words such as these: "God created the universe in such a way that we can always discover more about it." That, I think, would be one way of expressing part of what is meant (by people who are far more perspicacious and wise than I) by the "humanity" and "goodness" or "grace" of God. We may observe that there is no "Problem of Evil" in that rendering of those two imputed qualities of a Creator, if and only if we accept our knowledge-making responsibilities and opportunities as we know them and might get to know them better. (And there are all the delights of the creativity that enables such further discovery, whether of human or physical or any other truths, especially when its products have meaning sought by others, and when in a wider market all that is accessible to everyone.)

Thus such theological fragments may offer some encouragement in better Riding The Mainstream. That very assumption of "indefinite further discoverability" is also, as far as I can make out, implicit in every theology and optimistic view of our human situation. We do not need it as a knowledge-modelling axiom, but it -- or something like it -- forms a neat conceptual closure of many presently apparent efforts in The Mainstream or outside it. And it certainly does imply a belief in progress despite all the turbulences in that massive river.

Moreover, it is as common, objective, or sharable -- though abstracted -- a view of such matters as one is likely to find. Even the physical reductionists just alluded to, while they would presumably have problems with the above wording, could assent to what it means in practice, and it is the latter that ultimately counts for them.

In these fine-printed paragraphs I have presumed to spell out part of what I would guess was the intended -- though unstated -- more general message in an article (whose reference I have lost) by Eugene Wigner, entitled The Unreasonable Effectiveness of Mathematics in the Natural Sciences. In that spirit, the above generalization might be entitled: The compelling message of the phenomenon of merely human knowledge.

Now, our horizons may well be infinite, but how -- specifically and in practice -- might we better extend our scope within them? Some more helpful models of our knowledge-making predicament and opportunity seem to be indicated...

We shall see below how The Mainstream of epistemology has led me to two different images for representing the general problem. And how, thanks to the later notion of "conceptual system" or "abstract system" from more reflectively scientific contexts (a development to which Kant so famously -- or so notoriously! -- contributed with his Critique of Pure Reason), the second image is far more refined and detailed in its encapsulation of the indirect relationships between knowledge and known, and hence is more directly applicable to knowledge modelling.

The more explicit epistemology behind The Mainstream dates back to September 1966, when the image of "Scylla and Charybdis" struck me as uniquely well portraying the human aberrations occasioned by complexity, and the recommended generic strategies for avoiding them.

Summarizing for now, Scylla is the many-headed monster of oversimplification by whom we are always partially caught as we veer away from Charybdis, the whirlpool of complexity. Later, with our simplifications shown to be vain, we find temporary refuge on the figtree growing over the whirlpool, representing the intertwined and encrusted systems of established practice, until we may make our escape as Odysseus did, on his mast and keel. But there is much more to it.

To see how the image and its details can be interpreted in many further interesting and relevant ways, see at these points in my 1996 paper (with its own further link, in which also find "Scylla" or "Charybdis"), and in my 1997 paper and later "reflections" (in which find "complexity" for where the classifying in Scyllan and Charybdian terms starts).

The image was created by Homer -- I allege -- with that specific intent (in bold above). Why was the message not spelt out more explicitly or directly? Any "epistemological" or "philosophical" issues, to the degree to which concepts could at the time identify any such matters, could only be cast and experienced in metaphors that we like to call mythological. (We may recall that "Homer", whether a man or a lineage, represents the story-telling medium for an age-old wisdom-passing tradition. That largely explains how "his" two full works have survived, as the bible of their times, while only the merest fragments from the much later "pre-Socratic" Greek philosophers are known.)

Demythologized, however, the interpretability of the allegory is uncannily detailed. The image and its terms remain as modern -- even postmodern! -- as one can find (That is why the monsters, the whirlpool and the figtree are depicted on the cover of my book introduced below).

But the Homeric image portrays mainly the human implications of dealing with complexity. We still need one which more usefully represents the more formal or symbolic aspects. Hence, also from September 1966, a twin image, that of the agate. (It seems difficult to credit, and you can see why I did not dare mention it in any of my previous papers, considering the well-known alchemist phase in the history of what people have at various times liked to call philosophy, but the model for the MACK knowledge model is actually a "philosopher's stone"!)

The agate image portrays the structure and growth of symbolic knowledge to an amazing extent. It has remained with me as a usefully-elaborated portrayal of "abstraction as applied to the real world", and has much direct relevance to any computerized knowledge models. It has subsequently even become a detailed model for MACK. (It is described in detail below, and then applied in Part 2.)

But where did those images come from? Under what circumstances would a 25-year-old even have any such need?! Are such pictures not merely arbitrary or chance metaphors? More importantly, are they not themselves restrictive or misleading, further Scyllan dangers lurking uncomfortably close to the unwary philosopher or knowledge architect or system designer?

To answer the first three of those sceptical questions, I can safely assert that the images most definitely grew in the most relevant environment. In 1963, aged 22, I arrived in Cambridge, England, to do research in pure mathematics. But as an English-speaking South African (We typically saw ourselves on the centre/left of the political spectrum) who had just spent over 4 years at an Afrikaans-medium university once the home of apartheid's creators and at the time attended by their children, I also brought with me many close-up experiences of a very different political outlook.

So in November of that year a number of individually-commonplace matters came together in my own little mind: (1) the lesson from pure and applied mathematics that pure abstract systems have a life of their own, independent of reality, and that "real things", once we admit them in symbolic terms when "applying" mathematics, go the same inexorable but often useful though thereby-simplified way; (2) my own observations of how apartheid's so well-intentioned theorists, like any social engineers, put people-reality through the same kind of conceptual mill; (3) the frustration of my own persistent attempts at "explaining" the latter behaviour in the naïve hope of thereby changing it; and finally, (4) the sudden realization that such "explanations" are themselves error-prone for exactly the same potentially oversimplifying reason! (Those "explanations" were in evolutionary terms, as intimated above, and clearly beg their own questions.)

"Very obvious!" you might rightly say, but that very "meta-" realization, as the prototype of "seeing Scylla", and with its rather intense history, was of course my own small "Damascus experience" with the usual personally-significant effects. It immediately led to my dropping all thoughts of research in pure mathematics and concentrating on a book to be entitled The Phenomenon of Knowledge (in attempted emulation of Teilhard de Chardin's The Phenomenon of Man).

But if you see Scylla, can Charybdis be far away? So that plan grew, via the 1966 images and the course reported under 1975 in my own more detailed story, until it finally appeared under the title Beyond Apartheid in 1986 (See also 1984 and 1985).

There is a further epistemologically and autobiographically relevant aspect of the simplification process as embodied in the Homeric injunction "Hug Scylla's rock!" and related recommendations.

We may build on the extraordinary degree to which that episode -- like the others too -- may be read as an epistemological lesson. "Hug Scylla's rock!" means of course that we have to keep on the simplification side if we are to avoid the fearful Charybdian whirlpool of perplexity by complexity, despite the tragic fates -- "I have never had to witness a more pitiable sight than that!" -- of the inevitable oversimplifiers, as Scylla with her six heads snatched and munched six of Odysseus' crew, and "the ablest hands" amongst them too. Cliché once more (Mere reductionism, of one kind or another). But further, "Call on Cratais, Scylla's mother, who brought her into the world to prey on men. She will prevent her from making a second sally." That is, if we can try to understand the real nature of simplification and oversimplification, we are better protected against its further excesses. Ancient yet simple and wise advice for perennial problems! (And there is far more in the Scylla episode, in that very connection, that I shall not even mention here, as it is not quite as immediately relevant.)

The most basic kind of simplification, and one that is incorporated, in detail, in its simple axiomatic terms, into the MACK reality-model, and is presently taking programmatic shape in Metaset, is the "seeing-as" or "interpreting-in-the-context-of" process whereby any experience or thing may be seen or interpreted differently by applying a different conceptual schema to it (See the vignette in the box). That represents a creative and conceptually-synergetic step, and is the very basis of how MACK works in practice. It is the operation in a semantic net, given precise meaning by its relativistic context.

Creativity is a further application of the normal simplifying processes of perception or scientific explanation.

"Seeing as" is the key complexity-filtering process throughout. Thus an otherwise fleeting sense of colour and shape we "see as" an apple in a tree. A response by a person we may "see as" a sign of love. The frog sees a grasshopper as food, though only when it moves.

Thus while we simplify by filtering out "inessential noise", we also add meaning thanks to the context in which the observation process places the experience, or through which we "interpret" the input.
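
The two-sided process just described, filtering out "inessential noise" while adding meaning from context, can be sketched in a few lines of code. This is only my own illustrative sketch in Python, with invented schema names and fields; it is emphatically not MACK's actual mechanism, merely a picture of "seeing-as" as schema application.

```python
# A minimal sketch (not MACK itself) of "seeing-as": the same raw
# observation yields different facts depending on the conceptual
# schema applied to it. All names here are invented for illustration.

raw_observation = {"colour": "red", "shape": "round", "location": "tree"}

def see_as(observation, schema):
    """Apply a conceptual schema: keep only the 'relevant' aspects
    (filtering out inessential noise) and add the meaning that the
    schema contributes but the observation does not carry itself."""
    kept = {k: v for k, v in observation.items() if k in schema["relevant"]}
    return {**kept, **schema["adds"]}

# Two different schemas applied to the one experience:
botanical = {"relevant": {"shape", "location"}, "adds": {"kind": "fruit"}}
culinary  = {"relevant": {"colour", "shape"},   "adds": {"use": "eat"}}

print(see_as(raw_observation, botanical))
print(see_as(raw_observation, culinary))
```

The point of the sketch is only that interpretation is an operation taking two inputs, the experience and the context, so the same input may legitimately yield many different "seen" results.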

Our preconceptions may be supplemented by yet further ones: Newton "saw" gravity when the apple fell. Love may obscure what we also come to "see as" an undue element of self-interest.

New conceptual interpretations may supersede old ones, as in falling out of love, or as when, with a wholly new kind of perspective, Einstein with his General Theory of Relativity "saw" the phenomena of gravity rather in terms of some larger conceptions of time and space and energy. Or as did the Athenians when the citizenry rather than the ruler was regarded as the proper source of political authority. Their vague inner sense or intuition of justice (which the classical Greeks are well known to have felt about very strongly and very extensively) came to be partly "seen" in democratic terms.

Any experience or any thing may be "seen as" one or more of a potentially-infinite number of further things: we need only liberate our vision and escape from our habitual contexts into new mental planes. Such is creativity, whether scientific, artistic or humorous, as presented by Arthur Koestler in his magnificent book, The Act of Creation (Hutchinson, 1964). Koestler's picture was for me a signal lesson in that year, as I acknowledged in my 1996 paper and 1997 reflections (Find "Koestler" there).

When new conceptual schemas, for all their abstraction, thus add value to our real experience, it is in a synergy that further enables our very lives and can enrich them indefinitely.

That is also the very basis of the "profound congeniality" (as distinct from mere "user-friendliness") that MACK will help us cultivate in our natively-ignorant computers, as already much mentioned in my earlier papers.

But let us return to the other sceptical question about those images. As metaphors they may be most highly relevant, but have they not been taken too far? Does their use not oversimplify in its turn? Certainly, those far-back dates do situate me squarely in the "silly Sixties", that climactic period of many grand dreams which have since turned to dust! So, what considerations might exempt me from dismissal as a simplistic fool of one of the many kinds that have come to characterize that decade, from the misled would-be mystic through the ineffectually idealistic to the dangerously technocratic?

Maybe, on the other hand, the answer lies precisely in those 1966 events, which have left me with all these admittedly simplifying images? How plausible, after all, that the Homeric image of Scylla and Charybdis, so graphically and fully capturing the Sixties' temptations, might have helped me survive the rest of the Sixties syndrome, which is basically Scyllan or oversimplifying?

In another simplifying contrast, if the subsequent decades of world history might be characterized by the resulting confusion and chaos on which the denizens of the static and overcomplicating Charybdian figtree feed, would it not make sense that such a widely-applicable image from the objective "phenomenon of knowledge" might have proven a good mental stabilizer and star to navigate by? It would not be surprising if "hugging Scylla's rock" and "calling on Cratais" have helped here too.

And so you may see how actual events have unfolded as a story of an ever-growing convergence with what I am asserting has transpired to be The Mainstream of history ... which history of course includes that of software architectures.

As merely a few examples, classical OO with its Divine Programmer and naïve realtime roots is Scyllan, UML and the Big Methodologies are on the Charybdian figtree (with "process" still outstanding), while MACK follows Odysseus' "mast and keel" formula for eventual escape from both (with "process" implicit).

The continuity and the coherence, as the story and the metastory converge, are massive. The story relates actual events which -- as the metastory -- significantly extend the compelling objective evidence in favour of the proposed MACK standard with its implementation project. (The story also builds up to the same process, pursued in the following section, in respect of the already-published MACK papers.)

So in the separate document linked to below there is a rather blow-by-blow account of Metaset/MACK's further history. But why do I risk wasting your time with such subjective trivia?

I try there to show that many of the lessons learnt have been highly relevant to the design and implementation of something like Metaset with its underlying abstracted architecture.

I also hope to learn from you, whether by direct communication on the gaps you spot, or in due course by yourself moving into the MACK-compliant market, collaboratively or competitively.

And in the short term, it also functions as a résumé or CV in support of my application for some role in any MACK-promoting or -implementing project you might put together (As we shall see in Part 2).

The story also shows how the development has been guided by the original epistemological lessons and images, with each aspect buttressing the other.

In addition, the MACK background is of course not really just about South Africa, as will hopefully be evident from the detail I recount. But the South African aspect is relevant too.

Firstly, MACK is about human knowledge generically and certainly aims to have global applicability. Together, with awareness, we can better abstract the universal aspects into it.

Secondly, when it later comes to the full marketing of compliant products, especially for the needs that evidently drove the design at first, South Africa is an excellent microcosm in which to test them. In its groupings, histories, disparities and shared resources, it is a globally-relevant and almost universally-eager laboratory for change, and as a proving-ground could quickly give results that might be scaled-up or translated elsewhere. We are in fact already used, and have long been used, as guinea-pigs by a variety of global companies test-marketing high-tech products.

Therefore it might help for you to see from the separate document how few of the IT-related lessons of MACK's history are after all specifically South African, while yet not ignoring its South African taints and qualities.

But why is it necessary to bring South Africa or a South African into the picture at all? Can't MACK be argued and pursued objectively, on its own merits? Unfortunately not yet very well, and only partly because, as this paper surely shows quite amply, it is such a big and still poorly-described scene. Metaset in its present state cannot help out either in illustrating MACK's qualities and potential in a widely-accessible way, and certainly not without giving away many of its still trade-secret details. So it might help for you to know more about where MACK comes from.

I am not just being embarrassingly narcissistic: there are so many opportunities and roles for you too. The Mainstream would make no sense unless it can be widely ridden, and the sooner other people can help make it so, the better for us all. That is my premise.

On the contrary, therefore, I have tried to draw objective rather than subjective lessons: what I have described is a course that has long been converging on The Mainstream as it has become increasingly apparent in far more recent years.

The story might also help you spot more clearly those areas where I am still lacking, as I am also asking you to help me. Do please therefore read this seemingly irrelevant story with some care, matching it against your own experience and abilities to comment on knowledge architectures. (In that connection, do also take to heart the opening limerick of my 1996 paper...)

Surely also, if indeed there is any merit in MACK, it is thanks to the evidently rather unusual course, by an otherwise-usual person in unusual circumstances, which has made it possible.

A significant degree of confirmation from published MACK papers

To start with, I may point out that this paper, my other ones linked to, and their further links to my pages on this site are very consistent with one another. (Reading the earlier papers might also help consolidate in your mind the various Metaset/MACK backgrounds and concepts. (Just one wording change to remember: the "typology" of earlier papers is here simply "model".))

More importantly, in the light of subsequent events, they also constitute some significant confirmation of my thesis, with only minor refutations.

The most evident major confirmation is the way that software-architecture directions as I have conceived them have remained the same. Even the most sceptical can regard that fact as significant in view of the rapidity of change as it is generally perceived in our industry. So at the global level there is at least a good case to be made for my assertion that it shows that I do have my own meagre finger on the pulse of The Mainstream, in our industry at least. My papers all tell the same consistent story, while the only things that have been changing are the current events to be commented on and the new ways I have been trying to get the whole big message across.

That comment applies even more to MACK as it has long been evolving, but those autobiographical facts are not so easy for me to prove to you, though the metastory in this paper does try to convince you of it through force of overall consistency and coherence.

However, there are some specific aspects of my accuracy that are easily pointed out.

For example, when my 1996 paper was published I was unaware of XML, and I first learnt of it from another paper for Jeff Sutherland's 1997 BO Workshop, Mark Baker's Revisiting Sims (See also here). My second introduction is built around the remarkable convergence between longstanding MACK and that particular current in The Mainstream. (Thank you Mark! Thank you, Oliver Sims too!) The current of user-definable semantics has in fact run for quite long already. Tracing it backwards one might mention the lineage of (at least) repository, data-dictionary, self-describing file, DB schema and COBOL copylib. Neither XML documents nor Sims' semantic packets, however, come close to MACK's pervasive approach and framework.

In general, there has also been a blunting of industry expectations -- even a deepening of disillusionment -- in respect of all conventional OO, such as CORBA and Java, and indeed, most of my documents have consistently poured scorn on that whole scene (Hence my MACK "BOO!" stance, "MACK is Beyond OO", which I started adopting in 1992).

After OO, what else is left? "Components!" is the new life-jacket. However, loudly though marketeers have been trying to push it, strongly though market demand has been trying to pull it, and much though it works not badly in view design, it cannot escape present-day OO and its poor inheritance mechanism, the mechanism meant to provide flexible reusability. That flexible reusability is perhaps the most fundamental of all the good things presently missing from conventional technologies and infrastructures, and its absence is consistent with my various criticisms of conventional OO (Most particularly in my 1996 paper and its faq q 1 reply, and my True Love play).

Unfortunately, of course, at this stage it is difficult for me to convince you of the better alternative, as MACK's inheritance-equivalent is part of the trade-secret core of MACK that I am not yet disclosing (On that aspect see further in Part 2). In due course, though, you will not be able to stop me from enthusing about how just right MACK's abstraction/refinement feature is, how it pervades "core MACK", and how responsible it is for the impending explosion of reuse in the MACK market! It is even so simple. But the fog of complexity-hiding is at present obscuring its possibility.

In all my papers, the easiest potential refutation of which I am aware concerns my quantified time predictions for Metaset/MACK's arrival and penetration into wide use in the market.

Thus in my June 1997 Background document I had predicted "that it will be between one and two years from now that MACK will convincingly replace the OMA (and DCOM, HTML, Java Beans, etc) as the architecture of choice for Internet-leveraging applications." I believe there is still a fair chance that that one will come off within the longer period (and not the shorter one, seeing that my "sooner and better tactic" of that document has not worked).

But in my 1997 paper (of August), it was as if I had expected that "sooner and better tactic" to materialize: "During 1998, d.v., Metaset will start quickly bootstrapping the MACK-compliant market and will begin hosting massive compliant application-development by ISVs." Well, maybe all I can say there, is "d.n.v." in respect of that tactic. "You and I" still need a bit more time.

You might well say that the time prediction is the only kind that counts. To that I can only beg that you put yourself in the shoes of the manager who knows that the programming project is behind but still has many good marketing and technical reasons for not being unduly concerned by such a situation, rather normal as it is, especially for pioneering applications...

On the other hand, there is confirmation in the way the Metaset programming has been turning out. While I am of course disappointed that my poor concentration on it has not allowed it to progress faster, the shape it has been taking is the most beautifully tight one that I have long been predicting. For example, in the question 12 reply in my 1996 faq I had expressed a certain confidence: "The basic MACK model is really so simple and so general-purpose that there isn't all that much programming to the Boot Product's kernel." After this year's rewrite that aspect is stronger now than I had thought it would be. But I am keeping those details under wraps still, so in this paper I shan't insist or count on that further.

However, there have certainly been no refutations -- and only confirmations -- of my far more deep-reaching and detailed predictions in the OMG finds True Love play, published June 1997, concerning the OMG's BOF quest. (You may also go there via Jeff Sutherland's kind introductory warning on his OOPSLA'97 Business Object Workshop III index page.)

I must preface my review with my own apology for seeming, rather small-mindedly, to say "I told you so!" to some individuals for whom I have the highest respect and admiration. But this type of argument is far more accessible to the average reader than much of the rest of my writings, so I beg your indulgence, and particularly that of the said individuals, for this different kind of attempt at conveying the vast coherence yet correctness of all my wide-ranging argument. I had also, in anticipation, prepared for this very situation with these words in that play, under the "signifying nothing" subheading (and referring to the widespread misconception of encapsulation and complexity-hiding, here characterized as the "fog"):

Though -- I hasten to add! -- the idiot here is not any of our characters, it is the situation with its constraints. Such, of course, is the best dramatic tragedy: marvellous people, ineffective, caught in a naturally-misconceived situation.

In that play, under the "Some key details" heading I referred to the OMG's "current inertial and predictably tragic course". Then under the subheading "Thus all five BOF submissions must be dismissed" I predicted that, even if only for the secrecy reasons given, the OMG would not choose the MACK alternative, with the result, clearly referring to the outcome of the BOF RFP, that:

the OMG will have a tragically-unnecessary and messy short-term way ahead.

In the event, that RFP was withdrawn at the OMG meeting in Helsinki in July 1998, and quite possibly at least for the very reasons which the "True Love" play also indicated, as we now see.

In the play I had criticized all the BOF submissions for their lack of "synergetic and exciting semantic substance inside their hollow shells", and how could it be otherwise, as they were all "cast into forms of insipid and indigestible IDL"? Nonetheless, I had praised the JBOF submission for failing less badly than the others, so I had "bet on JBOF". In the event JBOF did become the front-runner, evolving partly into the "BOCA" mentioned below that was so very nearly adopted in Helsinki.

At the time that play was first e-mailed, early May 1997, the OMG's Analysis and Design Task Force's RFP had recently been answered by the initial Submission of Rational's UML. There was clearly so much overlap with the very concept of a BOF that I could not resist this paragraph in the play (where, just in case you had wondered, "ArchiBoard" is the character representing the Architecture Board of the OMG, while "the castle", where the "Grand BOF Banquet" is being planned, represents its dogmatically-imposed OMA):

In addition there is both injury and insult: our embattled ArchiBoard is apprehensively eyeing his brilliant and friendly neighbour, Maestro Rational. Frictions are looming over the Maestro's venerable UML tree. Its deep state-roots have long been opening cracks in the castle walls, spoiling its ideal beauty, and now its spreading branches are dropping leaves into the BOF dishes.

Later, in August 1997, the final UML Submission appeared, UML 1.1, with an entirely new addition, OCL (Object Constraint Language). It not only significantly added semantics external to IDL but, in its extensive reliance on class-invariants, is state-based rather than behaviour- or interface-based (The interface basis of the OMA, and hence of the BOF, was of course the play's big bête noire).

Even later, at the OMG's Salt Lake City meeting in February 1998, the BODTF Minutes report this: "Jim Rumbaugh said that BOCA is at the same level as UML, when you look at it. You can map between them, but you don't gain anything, because you're not shifting levels. I thought the idea of this BOF was to raise the semantic level [...]".

And indeed, the August 1998 replacements for the original BOF RFP (the first being this) all concern a requested "UML for Business Objects".

So it is appropriate to end this episode of the story and metastory with this quote (italics now added) from the True Love play, which immediately followed my already-mentioned "bet on JBOF" rather than on MACK for the BOF:

That said, as time passes I am expecting the perceived risk to fade in any minds that can persevere and piece together all my argument, image and confirmed prediction. That picture's very high degree of coherence and growing cogency is, I guess, not so obvious at this stage. (After MACK's launch -- if necessary without the [...] teammates hereby sought -- the whole ballgame changes anyway, as the completed picture's clarity will indeed shine out as the targeted "complexity simplified".)

Hopefully more convinced now that MACK is on the right kind of track, despite the horrendous complexity and the associated Scylla and Charybdis Syndrome of which the Divine Programmer Syndrome is part, and despite my own expository fog, let us -- at last! -- get on with the actual design of MACK and Metaset.

The point of MACK, and a high-level lesson from the epistemological images

At least the impetus and shaft to the point are very much in line with the familiar mainstream. Consider the delightfully apt very first paragraph of Chapter 1, Complexity, of Grady Booch's Object-Oriented Analysis and Design with Applications (2nd ed, 1994, Benjamin/Cummings):

A physician, a civil engineer, and a computer scientist were arguing about what was the oldest profession in the world. The physician remarked, "Well, in the Bible, it says that God created Eve from a rib taken out of Adam. This clearly required surgery, and so I can rightly claim that mine is the oldest profession in the world." The civil engineer interrupted, and said, "But even earlier in the book of Genesis, it states that God created the order of the heavens and the earth from out of the chaos. This was the first and certainly the most spectacular application of civil engineering. Therefore, fair doctor, you are wrong: mine is the oldest profession in the world." The computer scientist leaned back in her chair, smiled, and then said confidently, "Ah, but who do you think created the chaos?"

Booch then reviews "The Inherent Complexity of Software", with its features and causes, so by page 15 he can stand back and reflect that there is a "factor that dominates: the fundamental limitations of the human capacity for dealing with complexity."

How true! But the final point to the thrust of The Mainstream is both ancient and refreshingly simple: we deal with complexity better by means of simple systems, not complex systems.

Mere naïve innocence and persistent folly? Not really. Thanks to the present state of IT, we may surprisingly easily follow The Mainstream of history. We may project and take it into the future, in a very consciously fine-tuned fashion, as the natural way of dealing with complexity by means of simplicity:

We need merely better pursue The Mainstream of the evolution of species and artifacts.

We need demand no great adaptation by our users. We shall cater to them as they are. The right seeds already exist and the good ground is in place. The Mainstream is not some Utopia. It is where we are already and always have been. We need merely see through the fog, complete a bit of non-divine programming, then everybody will soon ensure that the mess is cleared up and we shall really "Ride The Mainstream!" further.

There will be no appeal to any grand idealism or extraordinary capabilities or superhuman efforts.

(That is just as well, as history has also shown how such appeals -- if they are heard at all -- tend towards the slippery slope into dictatorship or totalitarianism. We shall see below how more modern conceptions of leadership protect us from that Procrustean danger.)

But is such a grand project feasible? On page 8, op cit., Booch laments:

Our failure to master the complexity of software results in projects that are late, over budget, and deficient in their stated requirements. We often call this condition the software crisis, but frankly, a malady that has carried on this long must be called normal.

Yes, there is that obvious crisis (the mess and its underlying Syndrome), but, pursuing "the point", I have to insist that "normal" may certainly be redefined somewhat, and yet remain natural! We can clarify The Mainstream and ride it better.

Booch then (still on p. 8) provides a handy point of departure for us: "the underlying problem springs from the inherent complexity of software". From there we now dig deeper than he set out to do in that book.

The Mainstream epistemology giving the MACK reality-model

It may seem mere wordplay when MACK shifts the perspective from that of Booch's cited book: it is not software, but reality, that is inherently complex.

(I suspect that Booch would agree with that, as the philosopher he did not, in that book, claim to be. (However, I must add that his approach in that book, as quoted above, and though it is doubtless not his true self, does make him appear a most typical instance of the classic inhabitant of the Charybdian figtree: the professional who appears to have the edge in supposedly mastering complexity, when in reality it is mainly artificial complication that is being addressed, and no breakaway from that static position seems imminent.))

In the MACK perspective, software can at most be "intricate" or "elaborate", or "complicated", as in "artificially complicated" or "convoluted", as it is mere humans that make it so. It is indeed a "complex task" to sort out the mess, but that is solely because of the industry's humanly real component. People are involved. But the "complexity" is elsewhere, and certainly not "hidden" or even overt in any Divine Programmer's attempted orchestrations!

Within that perspective we may certainly still talk of "complex adaptive systems", a useful concept which Jeff Sutherland has most pertinently placed at centre-stage for this Workshop, and which will recur in this paper. However, its "complexity" aspect derives from how the real world, via its various and generally unpredictable inputs into the adaptive system, adds a reality component. The thus-"complexified" whole system may then be considered a truly interesting part of reality. But the model component not only remains artificial, it should even be made and kept as simple as possible ("but no simpler" being the added tail in what I called "Einstein's Imperative" in my 1996 paper).

Thus the shift in perspective is not mere wordplay. The difference is absolutely fundamental, far-reaching, and squarely part of The Mainstream of the evolution of species and artifacts as it has been represented in this paper. It is the axiom already posited. Restated for this context we have:

It is not the IT system that is complex, but the reality it models. The model itself should be simply structured, simply accessible and simply adaptable.

Here I merely present the end-result of the whole long story, where, as usual, "a picture is worth a thousand words." It will only be a mental picture ("One thousand graphic words is worth one million purely abstract words.") as any one actual picture brings its own distortions, and here I can surely rely on architects' good visual imaginations!

It is also worth recalling from the metastory that the picture is of both kinds of model: knowledge models in our minds or on our computers. We shall even see how the model itself portrays what is possibly the major difference between the two.

We now look at the detail of the agate, as already introduced.

Despite and yet also because of its rock-solid apparent stability -- like the flint that sparks, it is a form of chalcedony, a microcrystalline quartz -- the agate in many significant ways portrays "the phenomenon of knowledge" as we observe it, with its structure and evolutionary dynamics, the way we humans live it, within the ultimately unknowable complexity it models.

All images having limited applicability, I might preface the incongruity of the very idea with the reassurance that we shall also see aspects where the image breaks down. It is not followed slavishly.

The picture is of agates as they grow in their geological situations.

They typically form in cavities in a rock matrix, as left by condensing gases in igneous rock that has solidified under low pressures near the surface of the earth. (In South Africa the rock is basalt, from deep in the Earth's crust, but now forming the Drakensberg mountains, the highest in the sub-continent. My own agate specimens, then on my desk in Cambridge, I had found on Mont Aux Sources, where our dry country's three largest rivers arise. But that specific background is a merely poetic touch!) Such once-molten and quickly-cooled rock is characterized by its formlessness, having no layered or visible crystal structure. The rock matrix represents our given, vast, unstructured, complex reality from which all knowledge derives.

Think of us cognitive beings as those cavities, individual or group minds or databases to be filled with true and usable knowledge. The image is quite the opposite of that of the classical tabula rasa, a smooth surface on which our knowledge will be neatly inscribed. The insides of the basalt cavities, on the contrary, are formless and especially rough, even jagged and inhospitable, yet ultimately fertile like that of lava flows, destined for much smoothing, shaping, transformation and growth from appropriate seeds.

Now the rock matrix, the womb, over the ages secretes aqueous solutions, rich in silica and other mineral components leached from the surrounding reality. They percolate through networks of channels in the matrix and in due course start crystallizing out in the cavities.

The first deposits are from colloidal solutions and produce the microcrystalline chalcedony. They line the rough boundary of the cavity, so have no apparent form but that which is given by that unstructured reality. Thus the outer surfaces of the agates represent our formless sensory and intuitive world-contacts or experiences, and in the MACK model the "RE" or "Realworld Equivalent" outer layer, as they are the portions of our representations or models that most closely match the given irregular shape of reality, less influenced by any subsequent structures that we might later gather and shape in our minds, though of the same basic and partnering material. (For MACK's RE concept, see here in my 1997 paper and follow the links from there. The practical and technical implications are taken up in Part 2 of this paper.)

Note that sensory experiences as we can talk about them, and the REs similarly, are not reality itself, but only the closest we can get to it in our formulations. As our own concepts, they are still our own representations, part of our abstract models, just as the rough outer surface of the agate is part of the agate, not part of the containing, limiting and shaping basalt of reality.

Now, as the agate's nutritional liquids change in time and source, so do their mineral concentrations vary. That produces layers of different colours, depending on the other mineral components in the solvent. Layered chalcedony is agate. The layers are moulded to the now-smoothed form of the cavity, each one being parallel to the previous one, often in attractive patterns, and tending progressively to smooth yet further the native forms of the cavities, often becoming largely regular and even circular, as displayed when the agate is broken or cut through.

The agate layers represent our abstract systems or models, each one related to the adjacent layers, and with increasing abstraction becoming ever more attractive to our neatness-seeking minds' eyes, though ever narrower in variety and reality-hugging refinement. But they can lead our eyes quickly from point to point of the outer reality, representing chains of fluent logical deduction. That represents Russell's "thinking without thought".

Here the model falls rather short: our abstract systems and computer models are all far more richly many-dimensional than the agate layers as we may see them in a three-dimensional solid or a two-dimensional cross-section. But mathematicians easily generalize from two to three to many dimensions, so we can remind ourselves of the immense wealth of content, shapes and textures of systems in this zone.

The inner layers are of the basetype, CBO (Common Business Object) or foundation-framework kind, while the outer layers are the derived types, models and ultimately reality-hugging applications that are more closely moulded by the given cavity-walls of complexity.
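
As a loose illustration of that layering (my own sketch, in Python with invented type names, and in no way MACK's actual or trade-secret scheme), the movement from inner foundation layers to outer application layers resembles successive refinement in an ordinary class hierarchy, each layer moulded on the previous one:

```python
# An illustrative sketch of the agate layering as type refinement.
# The names BusinessObject, Party and Customer are invented here.

class BusinessObject:            # inner layer: foundation / "CBO" kind
    def describe(self):
        return "business object"

class Party(BusinessObject):     # intermediate layer: derived model type
    def describe(self):
        return "party: " + super().describe()

class Customer(Party):           # outer layer: application type, closest
    def describe(self):          # to the rough cavity-wall of reality
        return "customer: " + super().describe()

print(Customer().describe())     # each layer builds on the one beneath it
```

The outermost type carries all the layers beneath it, just as the reality-hugging application carries the whole stack of abstractions it refines.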

The general image of a cross-section is however not one of concentric circles, but of the very irregular and rough polygon of the native cavity, from whose flatter segments the layers seem to grow like slices of a cake, each one starting with its own concentric layering, but then gradually merging in patterns which are often the more attractive and at the same time unique features of each agate. That represents the eventually interacting nature of otherwise orthogonal applications or activities, and is thus the counterpart of multiple inheritance as well as the relativistic realtime contexts or perspectives as found in kaleidoscopic variety in different applications or user settings (Part 2 will distinguish between those two manifestations).

The most interesting of all those orthogonalities, however, are those which permit reflectivity, and most particularly of the kinds which enable the self-organization of our own mental models and the automatic self-management of the MACK database too. Part 2 takes that "applied epistemology" down to further detail.

Finally, the inside of the agate is typically of fully-crystalline and clear quartz, representing the pure logical core of our abstract systems and models, that which is the very furthest from the messy details of reality, but representing the clear essence of the structuring and deductive properties of the intervening layers. Depending on the ambient mineral "impurities", the quartz crystals may be in beautiful colours, rose quartz or amethyst being examples. The various colours represent different styles of possible logical structure, but they are all members of the quartz family, with its minimum standard commonality.

In knowledge modelling the large quartz crystals represent the more "pure" or "meta-" kinds of knowledge, such as axiomatic principles and inference mechanisms, the MOF's meta-metamodel concept, UML's OCL (if the latter two's proponents will excuse my thus bundling or generalizing them together in some apparent confusion…), and of course MACK's still trade-secret core metadata.

In MACK, however, there is no distinct "meta-meta-" concept. All is just "model", including both form and content, metafacts and facts, types and instances. All entities, facts and systems are structured according to identical rules (that is, after all, the way we think reflectively). So, what do the crystals represent in MACK that the layered zone does not? (This one I just love!) The crystals represent "bound-and-code-generated-and-compiled model", the most highly-efficient form of logical deduction. The more a particular agate, or portion of one, is set up for rigid and routine work, depending on the formal commitments its user-community is prepared to make, the more it has of efficient logical paths.

In effect, more of the application logic is bound together with our more common and "pure" logic. The agate has fewer of the less-transparent or translucent, MACK-characteristic interpreted model layers, and more of the clear, conventional object-code.

The quartz crystals are deposited last during the formation of an agate, no longer from colloidal solution but from pure solution, after all the other application details have been settled.

There is never just one crystal in an agate, but an interlocking set of them, in some stones more regular than in others. That of course represents the dynamic linking without which applications would not be able to change.

Other agates may have no such clear crystals, consisting only of chalcedony, while some of those may even have no layers at all, being more pearl-like, though they may also show more smoothly continuous changes of colour or texture. That great variety of formal structuredness represents people's or groups' varying degrees of formality or logical structure and rigour, from the more intuitively to the more strictly organized. Thus the unlayered varieties or zones model only our minds or intuitions, and not our computer models. So the epistemological model even models where it does not apply to our IT task!

Meanwhile, of course, the matrix of reality has many cavities, in communities or networks of individuals or federated or distributed databases, with common characteristics dependent on the ambient geology, though each one is totally unique.

There is even a common medium interconnecting the individuals in their networks of channels, representing the communications between them, and carrying compatible material that further builds the material of the mental or computer models. Those are the XML-like semantic packets (more interesting in MACK!) which will mediate the intelligent connections between participants in our future more mediated exchanges.

So there we have a most amazingly detailed and evidently quite highly applicable model! It models knowledge as such, as for me it has done since 1966, while now it models MACK too.

Interestingly, MACK was not consciously modelled on that image during its early years. As indicated under 1987 in my more detailed story, it grew from far more conventional roots, unusual though one of them may have been. Then, around 1991/1992 I started to use an "onion" image, and only subsequently realized that the agate fitted so much better. I mention that bit of history in order to allay fears that MACK might have been unduly moulded by the image. However, it does also afford yet another example of how, in my own mind, the various currents of The Mainstream, in this case "pure epistemology" and computerized knowledge-modelling, have converged with a great synergy of coherence.

Even that latter observation has a further parallel. In biological evolution, many species show a phenomenon of "ontogenesis following phylogenesis", meaning that the development of the individual embryo goes through stages exhibiting characteristics of various ancestor species. The human embryo, for example, is said to pass through an ape-like hairy stage. So the way The Mainstream has converged in my mind shows similarities to the way it has been converging in history.

But does such alleged "convergence" not imply an impoverishing reduction in the wealth and diversity of knowledge or structure? Quite the contrary. It is an eclectic convergence towards an applicable abstraction, a distillation of an abstract essence, the very stuff of which OO and knowledge modelling and standard architectures are made! And which can enable a further synergetic flourishing.

Let us test the model a bit. In the agate model everything is made of silica, silicon dioxide. Does that not twist the vision along the lines of an unduly simplistic grid? I have, after all, already said that the MACK model uses binary entity-relationships, also a good candidate for scepticism as a universal yet flexible basis.

As already said in my earlier papers, that is not necessarily any more limiting than the way we use simple number systems for all of arithmetic, or simple alphabets for all our language. Ternary or higher-order relationships can easily be modelled in binary relationship terms and re-presented in n-ary terms if one wishes. But that still doesn't answer the question as to why that specific primitive concept is suitable for building structures of the kind we seek in knowledge models.
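The reduction of higher-order relationships to binary ones can be sketched concretely. The following is a minimal illustration in Python, with hypothetical names of my own choosing (MACK's actual core metadata remains, as noted, a trade secret): a ternary fact is reified as a fresh fact-entity joined to its three participants by three binary relationships, and can be re-presented in n-ary terms at will.

```python
# Illustrative only: hypothetical names, not MACK's actual representation.
# A ternary fact -- "supplier S supplies part P to project J" -- is
# reified as one fact-entity plus three binary entity-relationships.

from dataclasses import dataclass

@dataclass(frozen=True)
class Entity:
    name: str

@dataclass(frozen=True)
class BinaryRel:
    relationship: str   # the "silicon" of the molecule, as it were
    source: Entity      # one "oxygen"
    target: Entity      # the other "oxygen"

def reify(kind: str, roles: dict) -> list:
    """Model an n-ary fact as a surrogate entity plus n binary relationships."""
    fact = Entity(f"{kind}-fact")  # a fresh surrogate fact-entity
    return [BinaryRel(role, fact, entity) for role, entity in roles.items()]

supply = reify("Supply", {
    "supplier": Entity("Acme"),
    "part":     Entity("Widget-7"),
    "project":  Entity("Apollo"),
})

# Re-presented in n-ary terms if one wishes:
nary = {rel.relationship: rel.target.name for rel in supply}
print(nary)   # {'supplier': 'Acme', 'part': 'Widget-7', 'project': 'Apollo'}
```

Nothing is lost in the binary form; the n-ary view is simply a derived presentation of it.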

The obvious practical answer is of course that ER and specifically binary ER have proven themselves in practitioners' minds as highly intuitive and in their models as highly useful. But we are now trying to think in a more high-level epistemological, architectural and even universal way, as we should do too, considering the key importance of what knowledge-modelling and architectures are all about.

Well, until a fully-working Metaset can finally clinch the matter, even here we have a kind of answer, or indication, if you wish, in the terms of the agate model. The binary entity-relationship molecule, consisting of a relationship with two entities (or association with two association-ends, to use analogous UML terms), looks remarkably like the silicon dioxide molecule of the silica of agate and quartz. It is even the case that in MACK the relationship, like the silicon, is of the two elements the one which contributes most to the interest of the resulting structures, at least as far as MACK's uniquenesses are concerned. (That is also what you would expect from an architecture claiming to follow a Generalized rather than the Classical Object Model.)

Now of course that "molecular parallel" does look irrelevant and even far-fetched, but there is a very fundamental parallel here too: both literal and figurative molecules have the clear simplicity and easily-extensible combinability that enables large crystal structures which are both strong and transparent, where the latter two mineral properties parallel the logical properties of stable structuring and clear logical deducibility that we seek for our automatically-reflective knowledge models. The parallel is absolutely central to what we are all seeking.

One might even pursue the parallel yet further. What is the compiled-code counterpart of the quartz but the basic op-code with two operands that characterizes RISC instruction-sets?
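That parallel too can be made concrete with a toy sketch (my own illustration, modelled on no real instruction set): each RISC-style instruction is itself a little "molecule", one operation binding two operands.

```python
# A toy two-operand, RISC-flavoured evaluator: each instruction is one
# op-code ("relationship") binding two registers ("entities").
# Hypothetical example only, not any real ISA.

OPS = {"add": lambda a, b: a + b, "mul": lambda a, b: a * b}

def run(program, registers):
    for op, dst, src in program:        # (op-code, operand, operand)
        registers[dst] = OPS[op](registers[dst], registers[src])
    return registers

regs = run([("add", "r0", "r1"),        # r0 = 2 + 3 = 5
            ("mul", "r0", "r1")],       # r0 = 5 * 3 = 15
           {"r0": 2, "r1": 3})
print(regs["r0"])   # 15
```

Long chains of such uniform two-operand steps are exactly the "efficient logical paths" of the crystal zone.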

Having thus insisted on the binary form, there is however no reason why another MACK-conformant agate, still receptive to the binary form of the standard communications, should not also provide internally for more direct representation of higher-order forms -- much like the amethyst variety of quartz in the agate core. (And RISC does not preclude some CISC facets.)

Perhaps needless to say, that molecular parallel was an ex post facto reading. As I write it is just a few days old, in fact. But maybe the literary analyst's attempted elucidations of a poem do sometimes discover interpretations that were indeed most plausibly and relevantly merely in the poet's subconscious mind? For example, I have long and hard repeated (publicly in my 1975 course, my 1986 book and my 1996 and later papers, always quoting Russell's "thinking without thought") that the easy and automatic manipulability of symbolic calculi (the "little stones" of the proto-abacus ("Chalcedony" has the same root!)) is the key to stable and useful logical construction, and the agate parallel did arise in that theoretical or epistemological context. Now I am just applying that theory to the practice of computerized models, and it all seems to tie together very tightly. That is of course further support for the MACK thesis, though only of the coherence kind rather than the predictive-value kind.

In Part 2 -- Simplification (Product and Project), we put the agate image into practice. We shall also see why it enables our escape from the Divine Programmer Syndrome.

Here we must conclude the complexity aspect with some most relevant and important meta-generalities.

Despite all our own efforts to remain undistorted by our models and images, we have to take care not to impose the lessons of any always-fallible images onto anyone else. That is an extremely far-reaching aspect, as we shall now see.

Leadership has "changed, changed utterly"

As this entire "Complexity" part of the paper has shown, either by implication or at length, MACK has been greatly inspired by epistemological images. In historical order for me, they were (1) the concept of the abstract mathematical system as a model for the generic conceptual system, (2) biological evolution as a model for conceptual system evolution, (3) reality as a blackbox and reality-abstraction as "modelling", (4) Koestler's "bisociation" between two contexts as an image of mental creativity, and finally (5) Scylla and Charybdis and (6) the agate.

Later IT images were (7) Bachman's astronomical image and (8) the physical relativity that is its natural extension. But applying DBMS-inspired images does not prompt scepticism or require re-examination to quite the same degree!

Like all images or models, they have their strengths and weaknesses. I have tried to show their strengths in my own story, as guides and even bases for successful prediction. But they must be fallible too.

(We may even note that Plato's image of the cave was in his Republic, in which he even insisted that mathematics should be a compulsory part of the education of the "philosopher king". However, Plato's main experiment in that line was an abject failure! On the other hand, in the light of the issues of this section, we may also note that Plato was no proponent of democracy.)

But what about the image of "The Mainstream"? It easily looks ominous. Is it not the ultimate Procrustes? (Or, worse, does it even hint that I might be tempted to be one?!) Whereas the epistemological images were all part of the innocence of youth (my 22 to 25 years of age, in the early to mid Sixties too), the more sociological "mainstream" image of 1990 came after many more political lessons. Great care is called for!

Is the very idea of "The Mainstream" at all desirable anyway? Shouldn't everybody be alarmed when someone attempts to characterize and build on it with a computer-based architecture?

Fortunately, of course, one does not have to swing all the way from complexity and perplexity to the Procrustean extreme, as we may observe through the eminently authoritative words of Pericles the classic statesman, in his Address to the Athenians (my rendering of the French rendering by Cornelius Castoriadis in Le Monde Diplomatique of August 1998):

We are the only ones who don't let reflection inhibit our actions. The others either don't reflect and become foolhardy and do stupid things, or else upon reflection they don't manage to do anything, as they say there is this view and there is the opposite view.

(One may be forgiven for observing that the message of "Scylla and Charybdis" seems to have been very much alive and well and living in Periclean Athens!)

But Leadership is not a simple matter either. We would do well to expect no Pericles to recur in this millennium or the next. (If he tried to, he would be thrown straight out!)

We may recall the absence as described by the more familiarly Twentieth-century Yeats as quoted in my 1997 paper: "The best lack all conviction, while the worst are full of passionate intensity."

He had opened that poem with an image of leaderlessness that still resounds (and rings a bell here as I write, in South Africa, evidently like Yeats' revolutionary Ireland to a disturbing degree):

Turning and turning in the widening gyre
The falcon cannot hear the falconer;
Things fall apart; the centre cannot hold;
Mere anarchy is loosed upon the world,
The blood-dimmed tide is loosed, and everywhere
The ceremony of innocence is drowned;
The best lack all conviction, while the worst
Are full of passionate intensity.

And he had concluded with a gloomy foreboding of altogether the wrong style of Leadership:

And what rough beast, its hour come round at last,
Slouches towards Bethlehem to be born?

Procrustes haunts, as ever, and was not unduly exaggerated in that poem, The Second Coming, so clearly reflecting that little can be as heartless and pitiless as the tragic misconceptions of good teachings. And who shall rightly claim to interpret or reinterpret them so that we may comprehend?

Fortunately, we do not need to. The answers are already all around us.

We shall merely further trace and reinforce this very mainstream set of clichés:

Leadership in our more complex modern world is both more creatively and more safely diffused throughout the supply-side of the market. IT strongly supports and enhances that process. If it could do so better, we might more clearly see and manage "the invisible hand of the market" as the collectively participative effort it has always been. Together, more responsively and responsibly since better informed, people might ever more efficiently and penetratingly meet infinitely-complex human demand, thanks to humanly-simpler and hence more manageable systems.

The market as a whole effectively constitutes one great big real "Complex Adaptive System" of people and systems. That was already the essence of Adam Smith's "invisible hand", over two centuries ago. (Or shall we say the "invisible" essence?) He had also justified it in terms of the inability of any possible "Sovereign" to manage the complexity of the modern economy that was then emerging. He had even emphasized the incongruity of the very notion of such old-style Leadership.

The very nature of Leadership has long been in the process of transformation.

All changed, changed utterly:
A terrible beauty is born.

Where we may borrow from Yeats once more, as he had earlier contemplated Ireland's own messy and irreversible people-driven process (That poem, Easter 1916, reviews the ordinary individuals he had known, as they had risen to the occasion and transformed it).

Leadership in the market manifests itself on the supply-side. (To the extent to which an individual consumer becomes an activist apparently on the demand-side, that person has shifted to the supply-side, whether in some friendly collaboration with the supplier or in an opposition of some kind, catering also to the needs of other consumers in the same boat.)

And the lasting basis of such leadership lies of course in the one word, "marketing". Leaders must have the ability to simplify demand appropriately and deliverably, and sell such simplifications to their followers or "customers". We should ignore the charismatic aspects of leadership, as the salesman passes away, leaving what is basically a testimony to good or bad marketing.

It should be easy, as we all like simplicity, but it is ourselves and our perceived needs that are thus being simplified. Other people too easily get that wrong. So we demand better respect: the would-be leader should "look again" (that being what "respect" means).

For example, the conventional leader of the conventional political party finds himself or herself in the most unenviable situation, for he or she does not have the time for such respect, so cannot avoid being ultimately responsible for the most tragically supply-driven basket of oversimplifications that one can find in our modern world.

And the MackWeb will complicate their lives yet further, as the medium itself will better elicit complexity by making it easier for the finer details of problems and needs to be recorded in relevant places. Then what fertile ground that will be for a responsive supply-side! Thus representative democracy will be progressively displaced (to a degree) by a more direct democracy.

Simplicity is successfully encapsulated in any product that survives in its market niche, whether the product is a physical good, a service, a system, or any other human creation that other people want badly enough for their demand to drive the requisite market mechanisms.

The total market is -- or should be -- a dynamic synergy of virtually countless such simplifications. The introduction on groupware elaborated enough on that.

The market is thus clearly visualizable as encompassing all social activity. "The Mainstream" again!

That slightly ominous view is further mitigated by observing that the commonality that we seek in Common Knowledge is, like Leadership, a sympathetic quality that relates interacting partners, and is not necessarily universal or grandly unifying. Indeed, it very seldom is, and rightly so. The agates come in a most extraordinary variety yet still interrelate in their respective kinds of matrix. As Shakespeare put it, four centuries ago already, of an only slightly different sympathetic quality:

The quality of mercy is not strain'd,
It droppeth as the gentle rain from Heaven
Upon the place beneath...

Those words were from the mouth of Portia, the great lawyer in The Merchant of Venice, and a woman, as we may note with some interest in respect of the epoch, and with warm recognition of her humanity. We may also note how Shakespeare, well known elsewhere as no liker of lawyers, further saw fit to emphasize through her that the law alone is not justice. As he continued a few lines later:

But mercy is above this sceptred sway,
It is enthroned in the heart of kings,
It is an attribute to God himself,
And earthly power doth then show likest God's
When mercy seasons justice.

In that spirit, there have been many indications in this and my earlier papers of how MACK has a particularly fundamental and consistent way of providing for common ground yet with ample room both for nuance and for creatively-different perspectives.

But is it deliverable? Is all that fine and high-flown intention practical?

Shakespeare said it all for us supposed moderns:

The quality of mercy is not strain'd,
It droppeth as the gentle rain from Heaven
Upon the place beneath: it is twice bless'd;
It blesseth him that gives and him that takes:
'Tis mightiest in the mightiest: it becomes
The throned monarch better than his crown...

A free transaction is not a zero-sum game. It permits a powerful synergy. It would happen if the market mechanisms really allowed and promoted it. If they did, then it would be practical too. And even the existing Internet-leveraging market infrastructure is certainly quite adequate for the degree of collaboration required for the launch and self-bootstrapping of the MackWeb.

We have thus encountered several characteristics of good marketing and lasting leadership: respect, compassion, nuance, creativity yet practicality. Real motherhood clichés once more, of course, but it is worth leveraging the extent to which they are also the natural extensions of the whole of biological and human history as a knowledge-accumulation phenomenon. They are therefore safely promotable, particularly now that we can all, like Newton, "stand on the shoulders of giants", see through the fog, and as we shall see in Part 2, really leverage the Internet.

The Mainstream is thus a carefully self-correcting though fluid organization.

Thus the antidotes for any fears about "The Mainstream" are in The Mainstream itself.

Hopefully more comfortable about the principle, therefore, we proceed to Part 2 -- Simplification (Product and Project).