Hypertext Publishing and the Evolution of Knowledge (paje.fe.usp.br/~mbarbosa/dpi/drexler.doc)


Hypertext Publishing and the Evolution of Knowledge

K. Eric Drexler

Abstract

The evolution of knowledge · Media and knowledge · Functions and consequences · Needed inabilities · Architectural sketch · Existing work · Advantages and problems · Evaluation · Some general objections · What it might be like · Quantifying its value · Economizing intellectual effort · Extending the discussable · Getting there · Conclusion · Acknowledgments · References

Media affect the evolution of knowledge in society. A suitable hypertext publishing medium can speed the evolution of knowledge by aiding the expression, transmission, and evaluation of ideas. If one aims, not to compete with the popular press, but to supplement journals and conferences, then the problems of hypertext publishing seem soluble in the near term. The direct benefits of using a hypertext publishing medium should bring emergent benefits, helping to form intellectual communities, to build consensus, and to extend the range and efficiency of intellectual effort. These benefits seem numerous, deep, and substantial, but are hard to quantify. Nonetheless, rough estimates of benefits suggest that development of an adequate hypertext publishing medium should be regarded as a goal of first-rank importance.

From Social Intelligence, Vol. 1, No. 2, pp. 87-120; an edited version of a paper originally submitted to the Hypertext '87 conference.


Knowledge is valuable and grows by an evolutionary process. To gain valuable knowledge more rapidly, we must help it evolve more rapidly.

Evolution proceeds by the variation and selection of replicators. In the evolution of life, the replicators are genes; they vary through mutation and sexual recombination and are selected through differential reproductive success. In the evolution of knowledge, the replicators are ideas; they vary through human imagination and confusion and are likewise selected through differential reproductive success - that is, success in being adopted by new minds. (These ideas are memes, in Richard Dawkins' terminology [1].)

Evolutionary epistemology [2] maintains that knowledge grows through evolution. Animals - and even plants - can be said to know of certain regularities in their environments; this knowledge, embodied genetically, certainly evolved. Like genes, folk traditions are passed on from generation to generation; surviving traditions tend to embody knowledge that aids survival. Karl Popper describes science in evolutionary terms, as a process of conjecture and refutation, that is, of variation and selection [3].

The scientific community evolves knowledge with unusual effectiveness because it has evolved traditions and institutions that foster the effective replication, variation, and selection of ideas. Teaching, conferences, and journals replicate ideas; the lure of recognition helps bring forth new ideas; peer review, refereeing, calculation, and direct experiment all help select ideas for acceptance or rejection. Every community evolves ideas, but science is distinguished by unusually rigorous and reality-based mechanisms for selection - by the nature of its critical discussion.

To improve critical discussion and the evolution of knowledge, we can seek to improve the variation, replication, and selection of ideas. To aid variation, we can seek to increase the ease and expressiveness of communication. To aid replication, we can seek to speed distribution, to improve indexing, and to ensure that information, once distributed, endures. To aid selection, we can seek to increase the ease, speed, and effectiveness of evaluation and filtering. The nature of media affects each of these processes, for better or worse.

Media and knowledge

The nature of a medium can clearly affect critical discussion and hence the evolution of knowledge. Consider how the lack of modern print media would hinder the process: Imagine research and public debate in a world where all publications took ten years to appear, or had to contain at least a million words apiece. Or imagine a world that never developed the research library, the subject index, or the citation. These differences would hinder the evolution of knowledge by hindering the expression, transmission, and evaluation of new ideas. If these changes would be for the worse, then a medium permitting faster publication of shorter works in accessible archives with better indexing and citation mechanisms should bring a change for the better. The naive idea that media are unimportant in evolving knowledge - that only minds matter - seems untenable.

The effects of media on variation, replication and selection can be described in more familiar terms as effects on expression, on transmission, and on evaluation. These categories provide an analytical framework for examining how media affect critical discussion and the evolution of knowledge.

Broadcasting

The newest of major media is television, but it seems poorly suited for critical discussion. Its cost limits access, limiting the range of ideas expressed; political regulation worsens the problem. Its nature - a stream of ephemeral sounds and images pouring past on multiple channels - does not lend itself to the expression of complex, interconnected bodies of information. Transmission of new information is often very fast, but in a form awkward to file, index, and retrieve. Viewers cannot easily or effectively correct televised misinformation. It is hard to imagine researching anything by watching television, save television itself. Similar remarks apply to radio.

Paper publishing

The medium of paper publishing does better. It is relatively open, inexpensive, and expressive. Paper books and journals have been the medium of choice for expressing humanity's most complex ideas. Published items endure, and they can be copied, filed, and quoted. Paper books and journals, however, suffer from sluggish distribution and awkward access.

Paper publishing's greatest weakness lies in evaluation. Here, refereed journals are best - but consider the delay between having a bad idea and receiving public criticism, that is, the cycle-time for public critical discussion:

An author's (bad) idea leads to a write-up, then to submission, review, rewriting, resubmission, publication, and distribution: only then (after months' delay) does it become public. This then leads to reading by a critic, an idea for a refutation, write-up, submission, publication, and distribution: only then (after further months) has the idea received public criticism. This cycle can easily take a year or more, though the total thinking-time required may be only a matter of days. And even then the original publication exists in a thousand libraries, unchanged and unmarked, waiting to mislead future readers.

Group communication

The sluggishness of paper publishing forces heavy reliance on communication in small groups. There, cycles of expression, transmission, and evaluation are fast and flexible, but operate within the narrow bounds of the community. This limits both the criticism of bad ideas and the spread of good ones.


Computer conferencing

Computer conferencing systems aim to combine the speed of electronic media with the text-handling abilities of paper media. They can combine some of the virtues of small-group interactions with those of wide distribution. The better computer conferencing systems have much in common with hypertext publishing systems, though all presently lack one or more essential characteristics. Since they are diverse and rapidly evolving, it seems better to describe what they might become than to try to take a snapshot of their present state.

Hypertext publishing

A hypertext publishing medium is a system in which readers can follow hypertext links across a broad and growing body of published works. Hypertext publishing therefore involves more than the publication of isolated hypertexts, such as HyperCard stacks. This paper follows Jeff Conklin [4] in taking 'a facility for machine support of arbitrary cross-linking between items' as the primary criterion of hypertext.

Hypertext publishing systems can provide an open, relatively inexpensive medium having the expressiveness of print augmented by links. Electronic publication of reference-links, indexes, and works will speed the transmission of ideas; criticism-links and filtering mechanisms will speed their evaluation. The nature and value of such systems is the topic of the balance of this paper.

Hypertext publishing

Randy Trigg has stated an ambitious long-term goal for computer media and publishing:

In our view, the logical and inevitable result will be the transfer of all such activities to the computer, transforming communication within the scientific community. All paper writing, critiquing, and refereeing will be performed online. Rather than having to track down little-known proceedings, users will find them stored in one large distributed computerized national paper network. New papers will be written using the network, often collaborated on by multiple authors, and submitted to online electronic journals. [5]

In spirit, this embraces a broader goal: transforming communication within the community of serious thinkers, including those outside the scientific community. It also embraces a narrower goal: transforming communication within smaller communities which still must use paper media to publicize their results. None of these goals entail competing with local newspapers, glossy magazines, or popular books; they aim only at providing better tools for communities of knowledge workers.

Kinds of hypertext


With these goals in mind, it may help to distinguish among several sorts of hypertext.

Full vs. semi-hypertext: Full hypertext supports links, which can be followed in both directions; semi-hypertext supports only pointers or references, which can be followed in only one direction. As we shall see, true links are of great value to critical discussion, and hence to the evolution of knowledge.
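The distinction can be made concrete with a small sketch. The class and field names below are illustrative assumptions, not drawn from the paper or any actual system; the point is only that a full-hypertext store indexes each link at both endpoints, while semi-hypertext records only the outgoing direction:

```python
from collections import defaultdict

class LinkStore:
    """Toy full-hypertext link store: every link is indexed at both
    endpoints, so it can be followed in either direction."""

    def __init__(self):
        self.outgoing = defaultdict(set)  # doc -> docs it links to
        self.incoming = defaultdict(set)  # doc -> docs linking to it

    def add_link(self, source, target):
        self.outgoing[source].add(target)
        # the backward index below is exactly what semi-hypertext lacks
        self.incoming[target].add(source)

    def links_from(self, doc):
        return sorted(self.outgoing[doc])

    def links_to(self, doc):
        """Backward traversal: lets a reader of `doc` find criticism of it."""
        return sorted(self.incoming[doc])

store = LinkStore()
store.add_link("critique-17", "paper-3")
store.add_link("review-2", "paper-3")
print(store.links_to("paper-3"))  # ['critique-17', 'review-2']
```

With only the `outgoing` table, a reader of paper-3 would never discover the two commentaries; the `incoming` table is what makes published criticism visible at the point of reading.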

Fine-grained vs. coarse-grained hypertext: This embraces two issues. First, can one efficiently publish short works, such as brief comments on other works? Second, can a critic link to paragraphs, sentences, words, and links - or only to author-defined chunks of text? Fine-grained linking has value chiefly in a critical context: given fine-grained publishing, authors can structure their work to match their ideas, but critics will often want to pick nits or blast small, vital holes in parts of an author's structure - parts that may not be separate objects. To do so neatly requires fine-grained linking.

Public vs. private hypertext: A public hypertext system will be a hypertext publishing system - if it is any good. A public system must be open to an indefinitely large community, scalable to large sizes, and distributed both geographically and organizationally; no central organization can control access or content. Closed or centrally-controlled systems are effectively private. Public systems will aid public discussion.

Filtered vs. bare hypertext: A system that shows users all local links (no matter how numerous or irrelevant) is bare hypertext. A system that enables users to automatically display some links and hide others (based on user-selected criteria) is filtered hypertext. This implies support for what may be termed social software, including voting and evaluation schemes that provide criteria for later filtering.
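A minimal sketch of the bare/filtered distinction, under assumed link records (the author, rating, and type fields are invented for illustration):

```python
# Hypothetical link records attached to one popular document; none of
# these field names or values comes from the paper.
links = [
    {"author": "alice", "rating": 4.6, "type": "criticism"},
    {"author": "bob",   "rating": 1.2, "type": "criticism"},
    {"author": "carol", "rating": 3.9, "type": "index"},
    {"author": "dave",  "rating": 4.8, "type": "advertisement"},
]

def filtered(links, min_rating=0.0, hide_types=()):
    """Bare hypertext shows every link; filtered hypertext applies
    reader-selected criteria before anything is displayed."""
    return [l for l in links
            if l["rating"] >= min_rating and l["type"] not in hide_types]

visible = filtered(links, min_rating=3.0, hide_types={"advertisement"})
print([l["author"] for l in visible])  # ['alice', 'carol']
```

The reader, not a central editor, chooses `min_rating` and `hide_types`; the ratings themselves would come from the social software mentioned above.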

'Hypertext publishing': This paper will use the terms hypertext publishing and hypertext medium as shorthand for filtered, fine-grained, full-hypertext publishing systems. The lack of any of these characteristics would cripple the case made here for the value of hypertext in evolving knowledge. Lack of fine-grained linking would do injury; lack of any other characteristic would be grievous or fatal. Most important is that the system be public: the difference between using a small, private system and using a large, public system will be like the difference between using a typewriter and filing cabinet and using a publisher and a major library.

Functions and consequences

To support the evolution of knowledge effectively, a hypertext publishing medium must meet a variety of conditions. Nelson [6,7] and Hanson [8] have specified some of them; the following describes an overlapping set and relates it to the evolution of knowledge. Several conditions are included because they conflict with common practice in computer systems administration, yet seem necessary for a functioning publishing system.

Must support effective criticism

Hypertext publishing must support links across a distributed network of machines, and these links must be visible regardless of the wishes of the linked-to author. The resulting medium can greatly enhance the effectiveness of critical discussion. Since this conclusion is pivotal to the argument of this paper, it deserves detailed consideration. Consider how the critical process works in paper text, the current medium of choice, and how it may be expected to work in hypertext:

In each case, we start with a published paper making a plausible statement on an important issue - but a statement that happens to be wrong. Imagine the results in the medium of paper text and in hypertext. In both, some readers see that the statement is wrong. In both, a few know how to say why, clearly and persuasively. Then the cases diverge.

Faced with a paper publication, these critics may (1) fume, (2) complain to an officemate or spouse, (3) scribble a cryptic note in the margin, or (4) write a critical letter that may (5) eventually be published in a subsequent issue. Steps (1-3) contribute little to critical discussion in society: they fail to reach a typical reader of the offending paper and leave no public record. Step (4) is an expensive gamble in time and effort: it demands not only the effort of handling paper and addressing an envelope, but that of describing the context, specifying the objectionable points, and stating what may (to the critic) seem a stale truism that everyone should know already. Depending on editorial whim, step (5) then may or may not result. Even at best, readers won't see the critical letter until weeks or months after they have read and absorbed the offending paper.

In a hypertext publishing medium, critics can be more effective for less effort. Those who wish to can write a critical note and publish it immediately. They can avoid handling papers and envelopes because the tools for writing will be electronic (and at hand). They can avoid describing the context and the objectionable points because they can link directly to both. They can quote a favorite statement of the truism by linking to it, rather than restating it; if its relevance is clear enough, they needn't even write an explanatory note. And not only is all this easier than in paper text, but the reward is greater: publication is assured and prompt, and links will show the criticism to readers while they are reading the erroneous document, rather than months later.

In short, criticism will be easier, faster, and far more effective; as a consequence, it will also be more abundant. Abundant, effective criticism will decrease the amount of misinformation in circulation (thereby decreasing the generation of further misinformation). Abundant, effective criticism of criticism will improve its quality as well. Reflection on the ramifying consequences of this suggests that the improvement in the overall quality of critical discussion could be dramatic.


Must serve as a free press

To maximize the effectiveness of criticism, a hypertext publishing system must serve as a genuine free press. In addition to being scalable, open, and having diverse ownership, it should allow anonymous reading (and perhaps authoring under partially-protected pseudonyms). These conditions all facilitate broad participation with a minimum of constraints, aiding expression and criticism.

As Ithiel de Sola Pool notes, in the U.S., restrictions on free speech in new media have typically stemmed from their identification as tools of commerce, rather than as forms of speech or publication [9]. To reduce the chance of bad legal decisions regarding First Amendment rights in hypertext publishing, we should recognize that the participants are authors, publishers, libraries, and readers; we should avoid commercial terms such as information providers, vendors, and buyers.

Must handle machine-use charges

To have a free press, it seems that one must charge for machine use. Computer time and storage space have become cheap and abundant, but not free and unlimited. Even cheap and abundant resources must be rationed - imagine a hacker deciding to store the integers from one to infinity on a 'free' system. The choice is not whether to ration, but how. One can ration machine resources by reserving them for free use by a small, subsidized elite that is implicitly subject to strong social controls: this is a solution used by institutions on the ARPANET. One can ration storage space by having a privileged editor delete authors' material: this is a solution used by many computer conferences and bulletin boards. One can ration by imposing wasteful costs on people, making them wait in lines long enough to cut demand to match supply. Or, one can charge what the service costs, so that additional users will pay for additional machines, allowing indefinite expansion and access without editing or discrimination. Charging is the solution that has made on-line services available to high-school kids and retired farmers.

It is worth noticing just how low those charges can be. The cost of long-term storage of data on a spinning disk drive is now in the range of cents per kilobyte - this makes text cheaper to store than to write, even if one types at full speed without thinking and values one's time at minimum wage. The cost of an hour's rental of a processor and a megabyte of RAM is again a fraction of minimum wage. (Telecommunications is a greater expense, but its charges are harder to fudge.) In short, the main cost of using computers (telecommunications aside) is already the value of the time one spends. On the whole, charging will increase openness and convenience, as it does in the free-press system of conventional publishing.
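The storage-versus-writing comparison can be checked with rough numbers. The figures below (typing speed, 1987-era minimum wage, a per-kilobyte storage price at the top of the "cents per kilobyte" range) are my assumptions, not the paper's:

```python
# Rough check that a kilobyte costs more to type than to store,
# using assumed 1987-era figures.
chars_per_kb   = 1000
typing_cpm     = 300    # ~60 wpm * 5 chars/word, flat-out typing
min_wage_hour  = 3.35   # assumed US federal minimum wage, dollars/hour
storage_per_kb = 0.05   # assumed storage price, dollars per kilobyte

minutes_to_type_kb = chars_per_kb / typing_cpm          # ~3.3 minutes
writing_cost = minutes_to_type_kb / 60 * min_wage_hour  # ~$0.19
print(f"writing 1 KB: ${writing_cost:.2f}, storing it: ${storage_per_kb:.2f}")
```

Even at full typing speed and minimum wage, composing a kilobyte costs several times what storing it does, which is the paper's point.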

Must handle royalties

To have the familiar incentives of a free press, hypertext publishing must handle royalties. Royalties can eventually enable people to make a living as writers, and will encourage the production of boring but valuable works, such as indexes. The experience of conventional publishing suggests that royalties will be inexpensive for readers: if a hardcover book costs twenty dollars and takes six hours to read, typical author's royalties amount to roughly fifty cents per reading-hour. Paperback royalties and magazine-writers' earnings are less.
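The fifty-cent figure follows from a typical hardcover royalty rate; the 15% rate below is a common industry figure and is my assumption, not stated in the paper:

```python
# Reproducing the paper's royalty-per-reading-hour estimate.
cover_price   = 20.00  # dollars, hardcover (from the text)
royalty_rate  = 0.15   # assumed typical hardcover author's royalty
reading_hours = 6      # from the text

royalty_per_hour = cover_price * royalty_rate / reading_hours
print(f"${royalty_per_hour:.2f} per reading-hour")  # $0.50
```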

Must support flexible filtering

An open publishing medium with links presents a major problem: garbage. If anyone can comment on anything, important works will become targets for hundreds or thousands of links, most bearing comments that readers will regard as worthless or redundant. A bare hypertext system would become useless precisely where its content is most interesting.

To deal with this problem, authors must have exclusive rights to unique names, so readers can use those names as indicators of quality. Readers must be able to rate what they read, so that their judgments can aid later readers' choices. Readers must be able to use automatic filters (configured to match their preferences) to sift sets of links and choose which are worth displaying. Making it easy for readers to send each other pointers to documents would aid personal recommendation. Further, readers should be able to attach triggers to items - for example, a trigger that sends a message whenever a (highly-rated) item appears in a place of special interest. This could dramatically reduce the effort of scanning and re-scanning the key writings in a field to find links to relevant advances.
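The trigger mechanism can be sketched in a few lines. The class and predicate interface below are invented for illustration; the paper specifies only the behavior, not a design:

```python
# Toy trigger mechanism: a reader attaches a predicate to a document
# of special interest and is notified when a matching link appears.
class Document:
    def __init__(self, name):
        self.name = name
        self.links = []
        self.triggers = []  # (predicate, callback) pairs

    def attach_trigger(self, predicate, callback):
        self.triggers.append((predicate, callback))

    def add_link(self, link):
        self.links.append(link)
        for predicate, callback in self.triggers:
            if predicate(link):          # fire only on matching links
                callback(self.name, link)

inbox = []
doc = Document("classic-paper")
# notify me only of highly rated new links to this key document
doc.attach_trigger(lambda l: l["rating"] >= 4.0,
                   lambda name, l: inbox.append((name, l["id"])))

doc.add_link({"id": "nitpick-9", "rating": 2.1})  # below threshold: silent
doc.add_link({"id": "advance-1", "rating": 4.7})  # notification sent
print(inbox)  # [('classic-paper', 'advance-1')]
```

The reader never re-scans the document; relevant advances arrive as they are published, which is the effort savings the text describes.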

Without such mechanisms, critical discussion would choke on masses of low-quality material. With them, as we shall see, effective processes seem possible.

Needed inabilities

As important as functions are inabilities - in some ways, they are more important, because they are harder to add as afterthoughts. The above goals imply that no one should be able to:

retract or alter publications, save by annotation
hide published comments on a piece of work
read works from libraries without paying royalties
monitor who is reading published documents
trace pseudonymous authors without a warrant
publish under another's unique name or pseudonym

Architectural sketch


These functions can be mapped onto a set of levels, sketching a system architecture. A more detailed exploration of constraints and design approaches may be found in Hanson [8].

Database level

At the core of a hypertext publishing system will be a network of library machines holding overlapping portions of the hypertext literature. These machines will have a database level (designed to store hypertext data, not traditional database data). This level will support distributed, fine-grained, full-hypertext service, together with triggers to notify users when specific changes occur. It seems desirable to seek general standards for representing links, text, graphics (at least for simple graphs and diagrams), and access and accounting information.

In developing these standards, one should avoid trying to standardize too much, lest the result be unimplementable, inefficient, or excessively restrictive. Conversely, one should avoid standardizing too little, lest the result be a set of incompatible publishing systems. Careful definition of a database interface and a few basic representation schemes seems a good compromise, leaving decisions about representational idioms, user interfaces, and much else free to evolve.

Access and accounting level

Closely related to the database level is the access and accounting level. This level ensures that authors are who they say they are, that royalties are paid when documents are read, and that readers can only see public documents, or documents for which they have been given access, and so forth. These constraints will reflect access and accounting information stored at the database level.
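A minimal sketch of these checks, under assumed data structures (the field names, the grant set, and the flat per-read royalty are all illustrative, not from the paper):

```python
# Access-and-accounting sketch: readers see public documents or ones
# they have been granted; authors earn a royalty on each reading.
documents = {
    "paper-3":  {"public": True,  "author": "alice", "royalty": 0.02},
    "draft-11": {"public": False, "author": "bob",   "royalty": 0.00,
                 "granted": {"alice"}},
}
balances = {"alice": 0.0, "bob": 0.0}

def read(reader, doc_id):
    doc = documents[doc_id]
    # readers can only see public documents, or documents for which
    # they have been given access
    if not doc["public"] and reader not in doc.get("granted", set()):
        raise PermissionError(f"{reader} may not read {doc_id}")
    balances[doc["author"]] += doc["royalty"]  # royalty paid per reading
    return doc_id

read("carol", "paper-3")   # public: allowed, alice earns a royalty
read("alice", "draft-11")  # private but granted: allowed
print(balances)
```

Note that the library machine checks access and credits the royalty in the same step, reflecting the paper's point that this information lives at the database level.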

Agent level

The agent level consists of a computational environment near the database (in a cost-of-communication sense); this environment can contain agents to which users delegate rights, resources, and tasks. In particular, agents can examine large numbers of links and items at low cost, apply filter functions, and send users only those most likely to be of interest. Agents can also implement social software functions - for example, applying voting-and-rating algorithms to sets of reader evaluations and publishing the results.

An agent level might use a secure, general-purpose language running under an accounting interpreter and accessing a set of secure, pre-compiled software tools. The latter could perform standard operations (such as reading, filtering, sorting, and merging) in a series of increments of bounded size. Secure in this context means able to operate only on data objects to which access has been given (the core of the Scheme language appears to have this property); for further discussion of language security, see [10,11]. To serve its essential function, accounting need only keep charges roughly proportional to incurred costs.
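The accounting interpreter's core idea, charging per bounded increment of work and halting when the delegated budget is spent, can be sketched as follows. Everything here (names, flat per-step cost) is an illustrative assumption, not a design from the paper:

```python
# Minimal "accounting interpreter": agent work runs in bounded
# increments, each metered against the budget the user delegated.
class BudgetExceeded(Exception):
    pass

def run_metered(steps, budget, cost_per_step=1):
    """Run an iterable of thunks, charging per step; halt on overrun."""
    spent, results = 0, []
    for step in steps:
        if spent + cost_per_step > budget:
            raise BudgetExceeded(f"spent {spent} of {budget}")
        spent += cost_per_step
        results.append(step())
    return results, spent

# an agent filtering ten items on a budget that covers only seven
try:
    run_metered((lambda i=i: i * i for i in range(10)), budget=7)
except BudgetExceeded as e:
    print("agent suspended:", e)
```

Charges here are exactly proportional to steps executed; as the text notes, rough proportionality to incurred cost is all that accounting needs.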


Telecommunications level

The telecommunications level consists of facilities for communications among library machines and with users. This involves interfaces to existing networks and protocols for identifying communicating parties, accessing remote data, and so forth.

User-machine level

The user-machine level should include a local database acting as a local cache and workspace for hypertext material, together with support for local filtering and display agents. Any of a variety of user interface engines might reside here. Different forms of hypertext might require different interfaces; a modular design in which these interfaces could be downloaded during a session could be very useful.

Existing work

Jeff Conklin's Survey of Hypertext [4] describes existing work at some length. Only Memex [12], NLS/Augment [13], Xanadu [6], and Textnet [5] are described as being (existing or proposed) 'macro literary systems', a category that includes hypertext publishing systems. None has yet been implemented as an open system with true links and filtering. A recent, evolving design aimed at meeting all the conditions described above is the Linktext proposal [8].

Much work has been done on issues such as versions and the semantics of hypertext links [7,14,15], mechanisms for support of argumentation [5,16,17], and user interfaces [16,18,19,20,21,22]. This work will shape both the nature of hypertext publications and the software used to manipulate and display them. A goal for design of a database level is to enable use of the full range of higher-level facilities and idioms that have evolved in the current generation of private hypertext systems. This will enable us to use knowledge we have already evolved.

Advantages and problems

How can one judge the advantages and problems of a nonexistent medium? Perhaps not very well, yet an attempt may be worth the effort. Several approaches seem reasonable. One is to reason by means of analogies to present paper media, seeking analogous problems and solutions. Another is to apply solid, elementary economic principles: lower cost draws greater use, higher cost reduces it; greater reward draws greater contribution, reduced reward reduces it. Another is to place advantages and problems within the analytical framework of expression, transmission, and evaluation, relating them to critical discussion and the evolution of knowledge.


Expression

Hypertext publishing will aid expression in several ways. It will lower per-word publication costs by orders of magnitude: the cost of publishing a book's worth of material will fall from tens of thousands of dollars to tens of dollars. It will lower per-idea writing costs, sometimes by orders of magnitude: by enabling writers to link pithy ideas to an existing context, rather than forcing them to use hundreds or thousands of words to establish a context, it will make single-sentence publications useful and practical. Further, it will allow writers to express networks of facts and relationships by building corresponding networks of statements and links [23], extending the range of what can readily be said. These advantages in cost and quality of expression seem great.

Some problems:

'Who will write for it?' Hypertext won't be a powerful medium of expression if no one writes for it. One might object that there will (at least initially) be too few authors, in part because the market will at first be too small to reward authors with either money or recognition. Can this start-up problem be overcome?

Given permission or lapsed copyright, hypertext can carry existing paper works. Scanners can input text with adequate accuracy, and readers can mark errors for correction. Royalties will encourage authors to give permissions. Hence the world of documents on the system need not be limited to those written for the system. Further, hypertext can be connected to the paper literature as well as that literature is to itself: Given unique names, each medium can reference works in the other. Thus, the hypertext literature need not suffer greatly from its initial small size.

In fact, one should expect to see much new writing from the start. Small, computer-oriented communities will be early users of hypertext publishing. New, interdisciplinary topics will be good candidates for the early elaboration of a hypertext literature. (One early topic will be hypertext itself.) Communities interested in such topics will generate their own incentives for publication and recognition. The amount of discussion that already occurs over computer networks suggests that a hypertext publishing medium need not starve for lack of material.

'Copyright won't work' Copyright is intended to provide incentives for writing - but can copyright work in a computer-based medium, given the ease of copying? For all but bestsellers [8], it seems that it might. A hypertext system would sell not information, but the service of providing information, complete with current evaluations, links to further information, and so forth. To duplicate this service would require copying, advertising, and selling large bodies of information; since this can't be kept secret, reasonably effective enforcement of copyright seems practical.

Experience with expression

Of the advantages cited above, perhaps the least quantifiable was that of extending expressiveness by enabling authors to represent more complex relationships. Is this advantage real and substantial? In 'Theory Reform Caused by an Argumentation Tool' [23], Kurt VanLehn reports his experience with expressing and developing theories in cognitive psychology using the NoteCards hypertext system [16]. This medium enabled him to play with organizations of facts and theories in ways that revealed (and helped correct) serious flaws. Reflecting on his experience with hypertext, he writes:

The NoteCards database is about as close as any written artifact can get to expressing a whole theory. . . . we can expect NoteCards to help theorists clarify their ideas and make them rigorous. . . . Nowadays, I view my work as building a NoteCards database qua theory. To theorize without building a NoteCards database seems like programming without building a program. One can do it, but it's harder. . . . Because NoteCards databases are accurate representations of theories, they have excellent potential as vehicles for collaboration. . . . Being halfway between lab notes and journal articles may also make NoteCards a unique aid to graduate-level teaching. A NoteCards database would allow young theorists to crawl around inside a classic theory, getting to understand it more deeply than they could from journal articles. Incidentally, this is one answer to what could happen to NoteCards databases after their active development ceases. They might rest in graduate schools, embalmed in computational display cases for students to dissect.

This illustrates the utility of hypertext for expressing (and hence evaluating) complex ideas. And with a hypertext publishing medium available, a theory expressed in hypertext need not be embalmed, but can become part of a living, evolving literature.

Transmission

Hypertext publishing will aid transmission in several ways. It will reduce delays in distribution (by orders of magnitude), placing published material in front of readers in under one day, instead of hundreds. It will eventually reduce the cost of placing the capabilities of a research library at a site (by orders of magnitude), from tens of millions of dollars to the cost of some user machines. It will increase the speed of accessing referenced material (again, by orders of magnitude), retrieving it in seconds rather than the minutes or days required in a paper library system. It will increase the ease of finding a reference (by an additional, hard-to-guess factor), because it will encourage a more reference-dense writing style and provide a market for free-lance indexers [6]. These advantages in transmission seem great.

Some problems:

'The public won't be interested' All these advantages would be of no value if no one used them, and most people won't use hypertext publishing any time soon. Most may never use it.

But most people don't read scientific journals either, and most never will. Nonetheless, journals influence scientists, and their content spreads outward through books, magazines, newspapers, television, and conversation, ultimately having broad effects. Likewise, hypertext publishing might reach only a tiny minority directly, yet greatly affect the evolution of ideas and the course of events.

'Experts won't be interested' It might seem that leading experts in a field will have little use for hypertext publishing, since their colleagues keep them well informed. And this might be so, if all fields were narrow and well-established. But many fields are broad and interconnected, and of interest not only to their experts, but to other scientists, engineers, policy-makers, scholars, and students. And even experts can benefit from ideas and criticism from foreign fields.

'Readers will get lost' Transmission will fail if readers get lost; this has been a problem in experimental hypertext systems [4]. It might seem that the vast amounts of material in a publishing medium must worsen the problem.

But styles of use would evolve to suit readers, and success requires only that one scheme work well, even if most schemes have fatal problems. A conservative approach would emphasize hierarchical index structures, like those found in outline processors, allowing evolution of competing hierarchies suited to different fields and perspectives. Lowe's SYNVIEW work [17] suggests how a hierarchical structure can integrate indexing with critical discussion.
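To make this conservative approach concrete, a hierarchical index can be modeled as a tree of topics whose nodes link into a shared document pool, with competing hierarchies indexing the same documents from different perspectives. The class and identifiers below are illustrative inventions, not part of any system described in the text.

```python
class IndexNode:
    """A node in one competing index hierarchy: a topic with
    sub-topics and links into the shared document pool."""
    def __init__(self, topic, doc_ids=None):
        self.topic = topic
        self.doc_ids = list(doc_ids or [])  # links to documents
        self.children = []                  # sub-topics

    def add_child(self, node):
        self.children.append(node)
        return node

    def all_docs(self):
        """Documents reachable from this topic, depth-first."""
        docs = list(self.doc_ids)
        for child in self.children:
            docs.extend(child.all_docs())
        return docs

# Two competing hierarchies index the same documents differently.
by_field = IndexNode("computing")
by_field.add_child(IndexNode("hypertext", ["d1", "d2"]))
by_field.add_child(IndexNode("networks", ["d3"]))

by_problem = IndexNode("indexing problems")
by_problem.add_child(IndexNode("getting lost", ["d2", "d3"]))
```

Because the hierarchies hold only links, not copies, a document can appear under many competing outlines at once - which is what lets different fields and perspectives evolve their own views of the same literature.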

It may be that most users of hypertext publishing will chiefly read fairly conventional overview documents, dipping into tangled networks of representation and debate only in their own field, or in search of deeper understanding. Though these summaries might resemble conventional documents, their quality would reflect criticism based on knowledge evolved in the underlying hypertext debate.

'Reading will be too difficult' Hypertext publishing has obvious problems with equipment cost, and with the speed and cost of telecommunications. If this made reading too difficult or expensive, it would create disadvantages in transmission.

These problems clearly limit the value of hypertext publishing, but by how much? The price/performance ratio of personal computers is already impressive, and improving rapidly, so equipment cost is a modest and declining problem. Telecommunications speed and cost are tolerable for serious work, but they remain the major problem. This motivates a search for ways to minimize the telecommunications bottleneck.

Several approaches might eventually be worth pursuing. One is to monitor the reading frequency of works on the library machines, observe which are read frequently over several months or more, then (subject to authors' permissions) copy the most-read half-gigabyte or so of this material and sell it as a CD-ROM, paying royalties in proportion to previous on-line readerships. This would let users buy personal databases containing several hundred books' worth of the most-read material on the system. A complementary approach would assemble information in a similar way, but with selection biased by a user's interest profile and filter criteria; the result might be distributed on floppy disks and stored on writable media. Users could, of course, cache downloaded material on a local disk, and in on-line use a user's agent might pre-fetch material to a user's machine when a popular link came into view.
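The CD-ROM approach amounts to a greedy selection by reading frequency under a fixed capacity. A minimal sketch, with invented names and toy sizes:

```python
def select_for_cdrom(works, budget_bytes):
    """Greedy sketch: take the most-read works until the disc is full.
    `works` maps work-id -> (monthly_reads, size_bytes)."""
    chosen, used = [], 0
    for wid, (reads, size) in sorted(
            works.items(), key=lambda kv: kv[1][0], reverse=True):
        if used + size <= budget_bytes:
            chosen.append(wid)
            used += size
    return chosen

works = {
    "popular-survey": (9000, 300),
    "classic-theory": (4000, 500),
    "niche-note":     (12,   100),
}
```

The interest-profile variant would differ only in the sort key: instead of raw readership, rank works by a score computed from the user's filter criteria before applying the same capacity cut.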

Together, strategies like these might greatly reduce the costs and delays of following a typical link, even without improved telecommunications services. And when compared to a conventional library system, even a fairly awkward and expensive system would shine.

'Who wants to read at a computer?' Who wants to read at any workstation set up for businesslike typing? Experience shows that a Macintosh set up for reading-chair (rather than secretarial) ergonomics is an acceptable reading device (lean back, swing in the Mac, put up your feet. . . aah!), and reasonable for writing as well.

Experience with transmission

In speed, hypertext publishing will resemble electronic mail, and this is its least exotic advantage in transmission. Common Lisp is widely considered to be an excellent design (at least for a committee-designed standard). In Common LISP: The Language, Guy Steele writes:

The development of Common Lisp would most probably not have been possible without the electronic message system provided by the ARPANET. Design decisions were made on several hundred distinct points, for the most part by consensus, and by simple majority vote when necessary. Except for two one-day face-to-face meetings, all of the language design and discussion was done through the ARPANET message system, which permitted effortless dissemination of messages to dozens of people, and several interchanges per day. . . . It would have been substantially more difficult to have conducted this discussion by any other means, and would have required much more time. [24]

Simply speeding transmission can make a difference in problem solving.

Evaluation

Hypertext publishing will aid evaluation in several ways. With a suitable interface, it will enable readers to transmit evaluations with a mouse-click rather than a letter, reducing labor-costs by orders of magnitude. It will enable critics to attach their comments directly to a target work, making them available almost immediately (instead of months later) and potentially visible to all future readers (instead of just those who later happen across them). It will improve filtering, enabling readers to benefit from others' judgment, read a richer mix of material, and save considerable skimming-and-rejecting. More indirectly, higher-quality criticism will foster higher-quality review articles, enabling readers to survey fields with greater ease and confidence, easily retreating to simpler explanations when needed. Finally, it will enable authors to attach a 'retracted' annotation to obsolete views, making them disappear as seen through a typical filter. These advantages in evaluation seem great.

Some problems:

'Bulletin boards are boring' Computer bulletin boards and mailing lists might seem like models for hypertext publishing, and they are often full of trashy material. All, however, seem to lack one or more essential features, such as links (to enable effective criticism) or evaluation and filtering mechanisms (to make trashy material invisible). Further, they aren't archival, as journals are: writers know they are writing for the garbage can, and act accordingly. Nothing about a computer medium per se degrades the quality of writing.

'Most writing will be trash' Assume that 99.99% of published material will be trash. With charging for use, its storage needn't cost anyone but the author. With suitable database algorithms and organization, its presence needn't slow access to other material. With suitable filters (which display only favorably-rated material), its existence needn't be visible. Thus trash, however abundant, is irrelevant: what matters is what has value.

'Evaluation won't work' This argument places a heavy burden on evaluation and filtering. How are they to work? A simple majority vote of readers seems a poor basis for evaluation (and how would people vote?). Expert evaluators would be hard to choose, harder to agree on, and likely to refuse the job. Distributed evaluation and filtering systems deserve serious research; there are many issues to consider, ranging from the game-theoretical analysis of vote-weighting schemes through the details of effective user interfaces.

Regarding the latter, opinion-capture in a window-based system might be simplified by providing several go-away boxes, with meanings ranging from 'what a waste of time!' to 'so-so' to 'that was great!'. A simple facility for tipping authors for pithy insights (and recording the amount as an evaluation) might be of value. In general, evaluations could be associated with their source (or with characteristics of their source) so that filters could weight different evaluators differently.
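One simple point in the design space of vote-weighting schemes - offered here only as an illustrative sketch, not as the scheme the text proposes - is a trust-weighted average of signed evaluations, where each reader's filter supplies its own trust weights:

```python
def weighted_rating(evaluations, trust):
    """Combine signed evaluations (+1 good, -1 bad), weighting each
    evaluator by the reader's trust in them; unknown evaluators get
    a small default weight rather than none."""
    num = sum(trust.get(who, 0.1) * score for who, score in evaluations)
    den = sum(trust.get(who, 0.1) for who, _ in evaluations)
    return num / den if den else 0.0

evals = [("alice", +1.0), ("bob", -1.0), ("carol", +1.0)]
trust = {"alice": 2.0, "carol": 1.0}  # bob is an unknown evaluator
```

Because the trust table belongs to the reader, two readers applying different weights to the same evaluations see different ratings - which is exactly the property that lets filters differ without requiring agreement on a single panel of experts.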

A good hypertext publishing medium need not begin with good filtering and evaluation mechanisms; it need only provide a medium for evolving good mechanisms. Are good mechanisms possible? We know that editors and reputable journals work, and their basic principle of reputation-and-recommendation seems extensible.

'Filtering will block novelty' If filters pass only material with positive evaluations, how will new material ever be seen or evaluated? Material from some established authors might have high ratings a priori, but what about material from new authors, and the occasional good ideas from bad authors?

If the system supports easy passing of references via electronic mail, this problem evaporates. A bad or unknown author can pass references to friends and colleagues; if the work is good, they can give it a high rating, and pass references to their friends and colleagues. In a half-dozen or so steps with a fan-out ratio of ten, this process would reach millions. Before then, a good work would accumulate enough favorable ratings to pass a typical filter without a personal recommendation. Public reading and evaluation then takes off.
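The arithmetic behind the claim is simple exponential fan-out. Since it ignores overlap among recipients, it gives an optimistic upper bound:

```python
def reach(fan_out, steps):
    """People newly reached at step `steps` of pass-along,
    ignoring overlap between recipients (an upper bound)."""
    return fan_out ** steps

def cumulative_reach(fan_out, steps):
    """Everyone reached through step `steps`, including the origin."""
    return sum(fan_out ** s for s in range(steps + 1))
```

With a fan-out of ten, six steps yields 10^6 = 1,000,000 people - the "millions" of the text - and a work need only accumulate enough favorable ratings along the way to start passing typical filters.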

Experience with evaluation

Today, electronic mail carries considerable electronic trash; the problems of automatic evaluation and filtering are broadly similar to those in hypertext publishing. In their work on the Information Lens system for filtering and disseminating electronic messages, Thomas Malone et al. [25] have made a substantial start toward developing the sorts of mechanisms that would be needed in a hypertext publishing system.

They note that:

many of the unsolved problems of natural language understanding can be avoided in intelligent information-sharing systems through the use of semistructured templates (or frames) for different types of messages. These templates can be used by senders to facilitate message composition. The same templates can then be used by recipients to facilitate construction of a set of rules for filtering and categorizing messages.

This approach (like many others they discuss) has application to hypertext. In particular, it suggests that fine-grained publication will have particular advantages when coupled with standard templates. (See also Lowe's work on structured, fine-grained argumentation systems. [17])
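In this spirit, a reader-side rule that tests fields of a semistructured message template can be sketched as follows. The field names and rule format are invented for illustration, not taken from the Information Lens itself:

```python
def matches(rule, message):
    """A rule matches when every field it mentions equals the
    message's value for that field (unmentioned fields are free)."""
    return all(message.get(f) == v for f, v in rule["if"].items())

def categorize(rules, message):
    """Apply the first matching rule; fall through to 'unclassified'."""
    for rule in rules:
        if matches(rule, message):
            return rule["then"]
    return "unclassified"

# Rules are ordered: specific before general.
rules = [
    {"if": {"template": "meeting-announcement", "topic": "hypertext"},
     "then": "show"},
    {"if": {"template": "meeting-announcement"}, "then": "skip"},
]
msg = {"template": "meeting-announcement", "topic": "hypertext"}
```

The point of the template is that the sender fills in structured fields once, and every recipient's rules can then test those fields without any natural-language understanding.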

Some general objections

'It would be too hard to build' Designing and coding hypertext publishing systems will be a challenging task. Indeed, it would be easy to draw up specifications that would make this impossible. A sensible goal is to avoid really hard problems (such as ensuring tightly-coupled consistency across a loosely-coupled network, or designing a versioning mechanism with ideal semantics) while designing a database kernel that provides essential basic capabilities. Likewise, open-ended problems, such as evaluation, filtering, and user-interface design, can be left to the open-ended process of evolution, so long as the design provides support for the basic mechanisms. There are suggestions for such designs that seem implementable [8].

'If it were good, we'd have it now' The idea of hypertext publishing is over two decades old, and several attempts at implementing it have been made - without the dramatic results anticipated here. One might argue that hypertext publishing is an already-tried, already-failed idea, that mere theory supports the idea of its great value, while solid experience shows its value to be quite limited. But in fact, though the idea may be old, it hasn't really been put to the test. Past software either hasn't been available in working form (Xanadu), or has been based on old technology and sold at a high price to a small community (NLS/Augment). No past systems have been full, filtered, and public. (And none, of course, has been based on next year's computer, disk, and telecommunications technology.) In short, the implementation and use of this sort of system has not yet been tried, hence experience hasn't yet had its chance to contradict theory. The theory might even be true.

'If successful, it will be abused' A successful hypertext publishing medium will surely be abused. A distributed system would be resistant to the 1984 problem of revised history, but lesser problems will remain. Some readers will use filtering to help them keep their minds closed, or to seek out their favorite brand of falsehood. Some will use the system for criminal purposes.

But everything since the rock has been abused by someone. And there is a presumption that the advantages of hypertext publishing for the expression, transmission, and evaluation of ideas will, on the whole, be a good thing - at least if one regards thought and communication as good things.

What it might be like

In an established hypertext publishing system, the operation of many minds on a shared, linked literature should aid several valuable emergent phenomena. Among these are the growth of intellectual communities and fields, the evolution and use of standard conceptual tools, the ease of seeing holes (and the lack of holes) in arguments, and the growth of clearly-summarized consensus. The following sketches these phenomena in a fictional context.

Forming intellectual communities

In a hypertext publishing medium, authors will typically sign their work and readers will often sign their evaluations and comments. Landmark writings will collect many evaluations. Those sharing an interest in a set of landmark writings form a potential intellectual community; any one of them can identify others by their signed publications, and can compile and distribute a mailing list to facilitate communication. Intellectual communities thus should form more easily.

For example, assume that researchers studying hypertext argumentation and those studying connectionist models (a.k.a. artificial neural systems, a.k.a. parallel distributed processing [26]) are publishing in a hypertext medium. Someone might notice an overlap between these communities. A mailing to this group could establish a landmark publication asking: 'In a fine-grained argumentation structure, the various "supports-" and "undermines-" link-types formally resemble excitatory and inhibitory connections in connectionist models. In both fields, we seek to derive coherent patterns from conflicting data. Might connectionist network-relaxation algorithms be useful for deriving a robust, largely self-consistent consensus position from a network of conflicting argumentation relationships?' This initial publication provides a natural coordination point for attaching criticism and elaboration if the basic idea proves fruitful. Thus the community is born possessing something like a journal.

Building a field

The growth of a field involves the growth of a community of researchers who share a literature and a set of problems. Speed of publication, ease of referencing, ease of finding who has referenced what to make what points - all of these facilitate building a new field.

In our fictional example, colleagues circulate abstracts with references to the new body of work, and some researchers are intrigued. Since links have the functionality of a citation index (but always current), new work becomes visible from the older work it builds on; this attracts attention from the authors of that older work, and again, some researchers are intrigued. The nascent field of connectionist hypertext grows rapidly.

In an early move, someone publishes a landmark item which says, in effect, 'Link statements of unsolved problems in connectionist hypertext here'. Evaluation, filtering, and display mechanisms then make it easy to sort these links to show the problems the community presently regards as most important. Other landmarks accumulate links to discussions of particular subproblems, algorithms, and so forth. Some publications consist of classifications and evaluations of other ideas and approaches.

The first statement of the connectionist hypertext idea is soon amended: someone notes that people implicitly relate themselves to items by 'agree' and 'disagree' links, and can relate themselves to each other by 'respect' and 'don't-respect' links. These links can have varying weights, and again have a formal resemblance to excitatory and inhibitory connections in neural models. The revised proposal, then, is to take all these links among points and people, filter them in some way, and run a connectionist network-relaxation algorithm on the resulting system to identify coherent sets of thoughts and thinkers. Discussion soon centers on this new idea.
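A toy relaxation over signed links (support = +1, attack = -1) illustrates the flavor of the proposal. This is a hypothetical sketch loosely in the spirit of the fictional connectionist-hypertext idea, not an algorithm given in the text:

```python
def relax(links, nodes, rounds=10):
    """Iteratively set each node's state to the sign of its neighbors'
    weighted support; returns node -> +1/-1 after `rounds` sweeps.
    `links` maps node -> [(neighbor, signed_weight), ...]."""
    state = {n: 1 for n in nodes}
    for _ in range(rounds):
        for n in nodes:
            total = sum(w * state[m] for m, w in links.get(n, []))
            if total:
                state[n] = 1 if total > 0 else -1
    return state

# Claims A and B support each other; claim C attacks both.
links = {
    "A": [("B", +1.0), ("C", -1.0)],
    "B": [("A", +1.0), ("C", -1.0)],
    "C": [("A", -1.0), ("B", -1.0)],
}
coherent = relax(links, ["A", "B", "C"])
```

The network settles into one coherent set (A and B together) with C pushed to the opposite pole - a miniature of deriving a "largely self-consistent consensus position from a network of conflicting argumentation relationships."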

Using standard conceptual tools

Having existed for some time and accumulated some of the best material from the paper literature, the publishing system holds a wealth of crisp statements of useful points, distinctions, theorems, schemata, fallacies, logical principles, general system principles, economic principles, definitions of terms, and other conceptual tools. These are linked to discussions of their truth or falsity and to paradigmatic examples of their use and misuse. The availability of these standard conceptual tools economizes intellectual effort.

Soon after the connectionist hypertext idea surfaces, someone applies the evolution schema (with no need to restate it and trot out the standard examples). The network settling process has aspects that meet the criteria for variation and for selection, but nothing in it corresponds to replication. Therefore, network settling is a non-evolutionary form of spontaneous order: this observation sinks an idea floated the day before.

One specialized conceptual tool is a taxonomy of connectionist models. A member of the connectionist hypertext group applies this to the proposal. The idea involves use of relaxation algorithms, but not learning algorithms: connection weights are set by people, not by algorithms operating on the network itself. Placing connectionist hypertext in this taxonomy clarifies its nature without redundant explanation and indicates relevant parts of the connectionist literature (categorized by that same taxonomy).

Seeing holes

Several weeks after the proposal of connectionist hypertext, a member of the community notices a publication describing an unfamiliar network relaxation algorithm. Is it worth relating to connectionist hypertext, or is it old hat to the rest of the group? A quick check shows no links between the landmark publications on the algorithm and the landmark publication on algorithms for relaxing connectionist hypertext networks. This shows a hole in the literature and an opportunity to contribute (in the paper media, this would have required a tedious literature search, with results that might well be out of date).
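The "quick check" amounts to a symmetric link-existence query over the hypertext graph. A minimal sketch, with an invented representation of the link structure:

```python
def is_hole(links, a, b):
    """True when no link runs between landmarks a and b in either
    direction - a gap in the literature, and a chance to contribute.
    `links` maps publication-id -> set of linked publication-ids."""
    return b not in links.get(a, set()) and a not in links.get(b, set())

links = {
    "relaxation-algorithms": {"connectionist-hypertext"},
    "new-algorithm-paper": set(),
}
```

Because links are part of the published record, this query is always current - unlike a paper citation search, whose results may be out of date before it is finished.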

The researcher who noted this absence wonders how fast the algorithm works on large networks - a point seemingly not covered in the literature. The researcher posts this question to the algorithm's authors, and to readers in general; this highlights another hole in the literature (or at least in its indexing) and hence another opportunity to contribute.

A skeptic about connectionist hypertext wonders about its strategic stability. If someone wanted to bias the results of the relaxation process, couldn't they play games with their statements so as to gain credibility and abuse it? The skeptic looks at the landmark compilation of problems - game playing isn't mentioned! A moment later it is, and the skeptic, having seen a hole in the list of problems, has enriched the field.

Each problem labels a hole and encourages work to fill it. Proposals to deal with the game-playing problem soon accumulate: they include using multiple algorithms for relaxing argumentation-networks and multiple algorithms for filtering and mapping the hypertext structures into a connectionist model, followed by a comparison of the different results. This is argued to make effective game-playing more difficult. A further proposal is to try to identify and screen out game-players' contributions as part of the filtering process. Finally, someone notes that this would be a problem only if the basic idea of connectionist hypertext has considerable merit - why else would game-playing be a problem?

Seeing a lack of holes

Weeks later, another skeptic examines the connectionist hypertext literature, and finds that the landmark compilation lists every major problem that the skeptic can think of. The skeptic's filter places these problems in roughly worst-problem-first order, but in going down the list, all the problems seem well in hand. The remaining objections to the basic idea aren't rated highly by the skeptic's filter; the answers to them are. Some of the wilder early proposals have been refuted, but the expected devastating criticism of the basic idea just isn't there. The remaining questions demand experimental test, and three groups report work underway.

The skeptic concludes that the idea should, at least provisionally, be regarded as sound. After several months of hypertext debate, the idea has been tested more thoroughly and visibly than it would have been in several years of papertext debate. The skeptic adds a bit of support to a call for increased research funding.

Summarizing consensus

One result of all this activity is what amounts to a review article, developed incrementally, thoroughly critiqued, and regularly updated. It takes the form of a hierarchy of topics bottoming out in a hierarchy of result-summaries; disagreements appear as argumentation structures [17]. When new results are accepted, their authors propose modifications to the summary-document; they become visible (to a typical reader) to the extent that they become accepted.

A free-lance writer on the system publishes a popularized account of the wonders of connectionist hypertext, but this account is more moderate in tone than one might expect. In a hypertext publication, readers expect links to the primary literature and to technical review articles. And knowing that readers will be able to see any criticism added by the actual researchers keeps the writer from speaking of scientists racing to develop a Giant Social Brain. In hypertext publishing, one must be careful of one's reputation, since so many readers' filters exclude work by unreliable authors.

A typical reader mostly browses popular articles and technical reviews, seldom following links deep into the argumentation network. But more accurate and current summaries benefit such readers nonetheless. People specialize in different domains, and everyone benefits from the resulting division of intellectual labor.

(Note: Although the process just described and the resulting consensus are imaginary, the connectionist hypertext idea is to be taken as a serious proposal for a line of inquiry. On mentioning it to members of the connectionist community at a recent conference, I was told that connectionist groups are interested in the related idea of social models inspired by neural nets.)

Quantifying its value

Implementation of a hypertext publishing system is one goal among many, all competing for our funds and attention. How important is it? The following attempts to examine its value in a way that lends itself to crude, quantitative estimates. This is a difficult and risky enterprise, but we are likely to have a better idea of its value if we try to estimate it than if we don't.

Economizing intellectual effort

We have some sense of the value of intellectual effort; trained people and innovative ideas are considered major assets. We also hear much about using resources efficiently and wisely. Though this is often applied to tangible resources (land, petroleum) it may be still more important to apply to the intangible resource of the human mind.

The human intellect is a limited resource

Human intellectual effort is, at any given time, a limited resource. There are a limited number of knowledgeable people in any field and a limited number of hours in a year. Increasing the number of people and the quality of their training is slow and difficult; increasing the number of hours is impossible.

This limited resource is wasted

Our limited intellectual resources are wasted in many ways. The history of the rise and fall of the (fictitious) square-wheel research program illustrates some familiar patterns.

Bad ideas adopted through ignorance of refutations. Transportation researchers, concerned with bumpy wheels, pursue work on the square wheel. They reason that it is superior to higher polygons, since it has fewer bumps; further, since its fewer corners probe the height of the ground at fewer points, it is less sensitive to typical bumps on a road. Bearing researchers are familiar with arguments that the decisive issue is bump magnitude rather than number, but the transportation research community remains ignorant of them. Work on the square wheel goes forward under a major defense contract, and major intellectual effort is misinvested.

Bad ideas maintained despite outsiders' refutations. Later, when financially and intellectually committed square-wheel researchers hear of the bump-magnitude issue, they ignore it in their publications and research proposals. Lacking links, critics can't easily make their arguments visible. With sufficient effort they might make their point, but they have no real incentive to try. Investment of intellectual effort in the square-wheel program continues, and the knowledgeable say, 'That's life.'

New thinking twisted by misinformation. Observing the major effort in square wheel development, others make plans for square-wheel vehicles. They focus formidable engineering skills on developing tough suspension systems and motors with extraordinarily high starting torque. An exploratory research effort begins on the more challenging triangular wheel, with its promise of eliminating a bump.

New ideas generated but not pursued. One researcher looks beyond polygons and considers the idea of a round wheel. But this doesn't fit with the researcher's other interests and seems like too small a point for a paper, and so is not published. The idea remains as a marginal note scrawled in a copy of the Journal of Earth/Vehicle Interfaces. Investment continues in what should have become an obsolete idea.

Good ideas neglected through ignorance. When the round wheel is finally proposed, few know whether to take it seriously. Most readers of the proposal have no way to know whether it makes sense, since it involves abstruse, interdisciplinary considerations of geometry, structures, and kinematics. Investment still continues in obsolete ideas.

Good ideas neglected because refutations are suspected. Mutterings are heard: "Round wheels - wouldn't they violate conservation of friction, or something? In any event, they sound too good to be true." Again, investment continues in obsolete ideas.

New thinking undermined by ignorance. The round wheel is at last accepted by a substantial community, and development is under way. The promise is clear, but many haven't heard of it. The failure of the square-wheel program to produce commercially viable results (despite its use for rough-terrain military vehicles) has left the transportation community wary of wheels. Considerable effort is invested in plans for sled-based systems for several years.

Old ideas redundantly pursued out of ignorance. In later years, this becomes proverbial, and is called reinventing the round wheel.

Effort consumed by research and publication. All of the above ways of squandering intellectual effort could be avoided, given thorough-enough searches of a complete-enough literature. But in reality, the costs of search (which may be fruitless) are high enough that it often makes more sense to risk wasting effort on bad or redundant work.

Hypertext can help economize it

Hypertext publishing won't eliminate wasted intellectual effort by instantly and effortlessly eliminating bad ideas and spreading good ones. But its many advantages in the expression, transmission, and evaluation of ideas - often reducing monetary and labor costs by orders of magnitude - can be expected to have a major positive effect.

The expected improvement in the efficiency of intellectual effort depends both on the degree of waste today and on the effectiveness of hypertext in reducing it. If one's standard of efficiency involves applying our best information to the most important problems, then one may well conclude that much of today's intellectual effort is wasted. If hypertext publishing can substantially reduce that waste (cutting it by tens of percent or more?), its benefits will be quantitatively huge. (And if its benefits will be huge, then the paucity of effort in the field today indicates that much effort in computer science is, relatively speaking, wasted; this, in turn, further increases one's estimate of the potential benefits of a hypertext publishing medium, which indicates. . . )

Extending the discussable

The last section outlined how intellectual resources are wasted today and noted that hypertext publishing systems promise to reduce that waste significantly, with quantitatively great benefits. This benefit involves doing what we already do, but more efficiently. Harder to quantify are the effects of extending the range and improving the quality of debate.

The result of arguing complex policy questions in modern media has typically been caricature, polarization, and paralysis. Recognition of this problem has led even thoughtful analysts to advocate simplistic policies: if ideas will be caricatured or ignored, it makes sense to cast them as slogans from the outset. In a sense, these questions are beyond the range of what can practically be discussed in present media. Why is this, and what effects might a hypertext medium have?

Breadth, complexity, and qualification

Many discussions seem to operate at or beyond practical limits related to breadth, complexity, and the need for qualifying remarks. Some are interdisciplinary, requiring backgrounds broader than real people have. Examples are arms control and missile defense systems, topics that combine issues such as weapons, sensors, space systems, orbital dynamics, software, strategy, diplomacy, and much else. Topics of this sort are up against the breadth limit.

In addition to sheer breadth, some topics are complex, in that any non-trivial statement about a single part may depend on its relationship to a complex whole. Again, arms control and missile defenses are examples, with the emphasis this time on the complex relationships among their parts. A statement on either subject may make sense only in a certain set of scenarios (involving assumed technologies, measures and countermeasures, relative costs, goals. . .) but each individual scenario may easily be too complex to allow brief description. Topics of this sort are up against the complexity limit.

Finally, some topics are highly controversial, requiring that statements be carefully qualified. Examples, again, include arms control and missile defenses. This time, the emphasis is on the difficulty of stating how a particular proposal might be improved or how it might fail (subject to the assumption that other technologies, costs, goals, etc., etc., are such-and-such) while not being seen as taking a stand either on those assumptions or on whether there is or isn't any sense in the general idea of controlling arms or of shooting down missiles. Topics of this sort are up against the qualification limit.

Big problems

We face many other big, messy problems where discussion is up against one or more of the above limits. Examples include acid rain, ozone depletion, nuclear winter, genetic engineering, nanotechnology, economic policy, and military strategy. Many of these issues are cross-disciplinary, involving chemistry, physics, biology, ecology, economics, political science, and so forth. All are complex, involving economic systems, ecosystems, multiple technologies, international politics, and so forth. All are subjects of contention. In all of them, an improved chance of avoiding major mistakes could be of enormous value.

These problems involve complex tangles of facts, values, and policies. With more accurate facts known and available, and with better means for discussing policy choices, we could hope to find policies that would better serve our values (such as survival). A better medium for representing and debating these problems could help substantially.

Extending debate

Today, discussion of big problems tends to be simplistic, polarized, repetitive, and ineffective. In part, this is because we lack effective ways to debate technical issues point by point and to make the results available as factual building blocks for further arguments. In part, this is because we lack effective ways to represent complex problems and the overlapping scenarios they spawn. In part, this is because we lack ways of representing large contexts to which people can easily append small, incremental insights. Hypertext debate promises to be more detailed (hence less simplistic) and more cumulative (hence less repetitive). This, in turn, should make it more effective and perhaps less polarized. Thus, it should help us better discuss big problems [27,28].

Basically, hypertext publishing will let people express ideas, relationships, and criticisms more effectively. But at a higher level, this will help groups represent whole networks of beliefs, including broad summaries rooted in detailed evidence and argument. At a yet higher level, this will help groups do battle over their worldviews, enabling direct point-by-point comparison. Overall, this process of expression, transmission, and evaluation will aid the evolution of knowledge in society by knitting minds and ideas together more closely.

There are many open questions, ranging from how best to represent links in machines to how best to represent abstract argumentation structures on screens. But even now, we can see how to improve on paper publishing in crucial ways. Even now, we can foresee great benefits from systems that seem within our grasp.

Improving problem solving

A hypertext publishing medium will have abilities beyond supporting improved critical discussion. Since it is computer-based, it can naturally support software for collaborative development of modeling games and simulations [29] (and enable effective criticism of published model structures and parameters). Social software could facilitate group commitment and action: individuals could take unpublicized positions of the form I will publicly commit to X if Y other people do so at the same time. Once Y people take a compatible position, everyone's commitment (to making a statement, forming a group, making a contribution, etc.) could be automatically published. The possibilities for hypertext-based social software seem broad.
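The conditional-commitment mechanism described above can be sketched in a few lines. This is a hypothetical illustration, not a design from the paper; the class and method names are invented for the example.

```python
# Hypothetical sketch of the conditional-commitment mechanism: pledges of
# the form "I commit to X if `threshold` others do" stay unpublicized
# until enough compatible pledges accumulate, then all are published at
# once. Names and structure are illustrative only.

class ConditionalCommitment:
    def __init__(self, statement, threshold):
        self.statement = statement
        self.threshold = threshold
        self.pledges = []        # held privately until the threshold is met
        self.published = False

    def pledge(self, person):
        self.pledges.append(person)
        if self.published or len(self.pledges) >= self.threshold:
            self.published = True
            return list(self.pledges)   # everyone's commitment, now public
        return None                     # below threshold: stay unpublicized

proposal = ConditionalCommitment("form a hypertext working group", 3)
assert proposal.pledge("alice") is None     # unpublicized
assert proposal.pledge("bob") is None       # still unpublicized
print(proposal.pledge("carol"))             # ['alice', 'bob', 'carol']
```

A real system would of course need authentication and tamper resistance, but the core logic is just a threshold trigger over privately held pledges.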

But let us focus on the narrower value of hypertext publishing in debating facts and policies. It will drop the costs of such operations as publishing an idea and following a reference by factors of ten to a thousand. It will greatly improve our ability to see what is and isn't in the literature, and to find the best available arguments for and against a point. By speeding work (and redirecting effort from square to round wheels), these characteristics of hypertext publishing will make intellectual effort more efficient.

By how much? A factor of ten? Given all its advantages, perhaps so, but this seems too much to count on. By a mere ten percent? This, too, may be an overestimate, but for a well-established system it seems more like a gross underestimate. Therefore let us consider what it would be worth to improve the problem-solving ability of various research and policy communities by this modest (and ill-defined) ten percent.

The value of solving problems

More efficient application of intellectual effort to a wider range of problems will mean more useful knowledge. More useful knowledge will mean many things.

Knowledge generates wealth. Take our ten percent figure to mean a ten percent faster rate of advance in technology (and in the understanding of how to use it wisely). The value of this would rapidly mount into the billions of dollars.

Better economic knowledge would shape better economic policy. Take our ten percent figure to mean a ten percent less ridiculous economic policy (by some measure), and again the benefits rapidly mount into the billions of dollars.

Better knowledge would improve health care. Take our ten percent figure to measure an improvement in the rate of biomedical advance, or in the effectiveness of applying existing knowledge to the problems of medicine. The number of lives saved would rapidly mount into the millions.

Better knowledge can aid survival. Take our ten percent figure to mean a ten percent lower chance of making a major mistake regarding arms control or missile defenses. What is the value of a significantly lower chance of nuclear war?

In short, an estimate of the long-term value of a hypertext publishing medium can be conservative, yet enormous. Knowledge is a primary asset of our civilization, crucial to all our goals. Hypertext publishing promises to speed its evolution, bringing broad and great benefits.

Why hasn't this been obvious?

Claims of wonderful benefits within reach are rightly suspect; the more wonderful, the more suspect they should be. The heuristic here is that great opportunities are apt to be visible and appealing, and hence already exploited; those that seem to be sitting around within reach are usually illusory. It seems wise to consider how the benefits outlined here might be real and yet underappreciated - lest we suspect that hypertext publishing must violate some law of conservation of (social) friction, dismiss its value, and pursue wheels with corners instead.

The chief value of hypertext publishing lies in how it can aid the evolution of knowledge - but evolution, like spontaneous order in general, is a thoroughly misunderstood subject. People tend to think that order must result from orders and that progress must result from design. Schools teach evolution (if at all) as an allegedly controversial theory in biology. They do not speak of the evolution of economic systems (but who designed markets or corporations?) or of technologies (but who designed the modern automobile?) or of science (but who invented its practice and content?) or of language (but who designed English?). Each of these achievements emerged through many trials, many errors, and the slow accumulation of what works. This describes evolution (including the evolutionary process we call 'design'). Ignorance of evolution makes it hard for people to see the value of improving society's ability to express, transmit, and evaluate a myriad of small, interrelated ideas.

People share other cognitive blind spots [30], and the value of a hypertext publishing medium falls into several of them at once. People pay more attention to the tangible than to the intangible, and seek direct solutions to their problems more often than indirect solutions. Accordingly, they tend to undervalue basic research (with its unknown benefits) compared to research aimed at particular problems. Basic research is an indirect approach to the intangible product of useful knowledge. Hypertext publishing is likewise an indirect approach to this intangible product, but it isn't even aimed at a specific field, like physics or molecular biology. Its benefits, being more diffuse, seem less concrete.

People also tend to underrate the importance of media, thinking that thinking depends just on the quality of individual minds. Further, a hypertext publishing system (supporting an open-ended set of user interfaces and information structures) is better seen as a medium for the evolution of media than as a medium in its own right. Finally, people tend to think that only majorities matter, encouraging them to neglect the value of better intellectual tools in the hands of minorities. And as for developing a better medium for evolving media for a minority - what good could that be, in a world that has achieved color television?

Might it be that the goal of hypertext publishing is achievable, valuable, and yet radically undervalued? It seems so, and that presents us with a great opportunity.

Getting there

If hypertext publishing is promising enough, we should consider its implementation. Questions include how this might be done, when efforts might bear fruit, and what reasons there are for making a focused effort.

How?

Various paths could lead to a hypertext publishing medium. Incremental paths might start with existing hypertext systems and electronic mail; these paths run the risk of setting standards that are incompatible with long-term needs. Another kind of incremental path would start with a design aimed at meeting long-term needs but would first implement functions having independent, short-term value. The resulting system would provide markets for many profitable activities: library services; telecommunications; publishing; input and output of paper documents; the development and sale of hardware, software, and integrated systems for businesses and laboratories.

How difficult?

Discussions with researchers reveal some common confusions and misconceptions regarding hypertext publishing. Some of these lead to underestimation of its difficulties, others to overestimation.

Underestimation of difficulties chiefly results from failures to understand the difference between a distributed publishing medium with two-way links and the distribution of hypertexts joined (at best) with one-way references. The issues raised by true hypertext publishing are explored at length in [8].
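The distinction can be made concrete with a small data-model sketch (an assumed illustration, not the architecture of any system discussed here): a one-way reference is stored only with the citing document, while a medium with two-way links must also let the cited document enumerate everything that points at it, without the cited author's cooperation.

```python
# Minimal sketch of a shared link store supporting two-way links.
# Document names are invented for illustration.

from collections import defaultdict

class LinkStore:
    def __init__(self):
        self.outgoing = defaultdict(set)  # doc -> docs it cites (easy part)
        self.incoming = defaultdict(set)  # doc -> docs citing it (hard part)

    def link(self, source, target):
        # Recording the back-link requires a shared medium: the author of
        # `target` need not cooperate, or even know, when it is cited.
        self.outgoing[source].add(target)
        self.incoming[target].add(source)

store = LinkStore()
store.link("critique-17", "square-wheel-proposal")
store.link("round-wheel-paper", "square-wheel-proposal")
# With two-way links, a reader of the proposal can find its critics:
print(sorted(store.incoming["square-wheel-proposal"]))
# -> ['critique-17', 'round-wheel-paper']
```

Distributing one-way references needs no coordination at all; maintaining the `incoming` index across many publishers is what makes a true publishing medium harder.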

Overestimation of difficulties stems in part from the assumption that hypertext publishing, to succeed, must reach a large fraction of the population and contain a corpus of knowledge on the scale of a major library. These grand goals are inappropriate for a new medium (though one should seek system designs that do not preclude such achievements). This paper has argued that a hypertext publishing medium could reach the threshold for usefulness and growth with only a small community of knowledge workers, and that it could be of great value while used by only a minute fraction of the population. With this realization, the fear that hypertext publishing must be an enormous, long-term undertaking seems unmotivated. No positive arguments have been advanced to support this fear.

Overestimation of difficulties also stems from the notion that the challenges of hypertext publishing include all the challenges of lesser hypertext goals - that a publishing medium would be of no value unless isolated hypertexts had proven their competitiveness with books, magazines, movies, schools, or whatever. This seems mistaken. Isolated hypertexts compete with authored, organized documents; a hypertext publishing system would compete with the disorderly tangle of material found in journals and libraries. One can imagine that linear textbooks are always superior for organized presentations of established knowledge, while simultaneously believing that the linked, non-linear organization of a scientific literature would greatly benefit from computer support. This shows the difference between the goals, and the lesser challenge of certain aspects of hypertext publishing.

When?

It seems that the technology is in hand to develop a prototype hypertext publishing system, but how long will it take for such a medium to grow and mature to the point of practical value? If one's measure is number of users and one's standard of comparison is telephone, radio, television, or the (failed) goals of videotex, the answer is a long time, perhaps never. It would be hard to reach a readership as large as that of serious books, much less that of books in general, or of newspapers and magazines.

A better measure of value is the evolution of knowledge - and knowledge, once evolved, can have a broad impact through conventional channels. By this measure, payback can begin while the system is small. What would it mean to improve the effectiveness of 1,000 competent people by 10%? In the course of a few years, it would more than repay the person-years required to implement the system, giving a good return on the investment of intellectual resources. The value of hypertext publishing does not depend on its becoming a dominant medium, or even very large (though in time it may do both). Its value and richness will grow as software evolves and the literature expands, but its value may be substantial while the system is still small and unpolished.
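The arithmetic behind this estimate can be made explicit. The user count and the ten percent gain come from the text above; the implementation cost is an assumed figure for illustration, since the paper gives no specific number.

```python
# Back-of-envelope version of the estimate above. Only `users` and
# `gain_percent` come from the text; `implementation_cost` is assumed.

users = 1_000                                   # competent knowledge workers
gain_percent = 10                               # the "modest" ten percent
recovered_per_year = users * gain_percent // 100  # person-years gained yearly
implementation_cost = 50                        # assumed person-years to build
payback_years = implementation_cost / recovered_per_year
print(recovered_per_year, payback_years)        # 100 0.5
```

Under these assumptions the system recovers 100 person-years of effort annually, repaying even a substantial implementation effort within a year or two, consistent with the paper's claim of "a good return" within a few years.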

Why try?

Why bother trying to make this happen, rather than saying 'Let's wait and see'? True, it might happen anyway, sooner or later - but with billions of dollars and millions of lives seemingly at stake, a little sooner is far better than a little later. And a blind, incremental approach could lead to bad initial choices, setting inadequate standards and blocking progress for a long time. It seems that we can best serve our interests by focusing on the goal of an adequate system and trying to move toward it with all deliberate speed.

Conclusion

Knowledge evolves, and media are important to the evolution of knowledge. Hypertext publishing promises faster and less expensive means for expressing new ideas, transmitting them to other people, and evaluating them in a social context. Links, in particular, will enable critics to attach their remarks to their targets, making criticism more effective by letting readers see it.

Hypertext publishing should bring emergent benefits in forming intellectual communities, building consensus, and extending the range and efficiency of intellectual effort.

In A Survey of Hypertext [4], Jeff Conklin summarizes the advantages of hypertext as:

ease of tracing references: machine support for link tracing means that all references are equally easy to follow to their referent;

ease of creating new references: users can grow their own networks, or simply annotate someone else's document with a comment (without changing the referenced document);

information structuring: both hierarchical and non-hierarchical organizations can be imposed on unstructured information; even multiple hierarchies can organize the same material;

global views: browsers provide table-of-contents style views, supporting easier restructuring of large or complex documents; global and local (node or page) views can be mixed effectively;

customized documents: text segments may be threaded together in many ways, allowing the same document to serve multiple functions;

modularity of information: since the same text segment can be referenced from several places, ideas can be expressed more modularly, i.e., with less overlap and duplication;

task stacking: the reader and writer are both supported in having several paths of inquiry active and displayed on the screen at the same time (this is also a feature of window systems in general). In addition, each 'digression' occurs in a separate window, leaving the database and display in essentially the same state until the digression is ended;

collaboration: several authors may collaborate, with the document and comments about the document being tightly interwoven.

For full, fine-grained, filtered public hypertext systems - hypertext publishing, in the terminology of this paper - we can add the following, emergent advantage:

speeding the evolution of knowledge in society.

Knowledge is a primary asset of our civilization, crucial to all our goals. By improving the quality of debate and speeding the evolution of knowledge, hypertext publishing will not only further our goals, but help us choose them more wisely.

Acknowledgments

I have benefited greatly from discussions on these matters (some spread over many years) with Jim Bennett, Roger Gregory, Robin Hanson, Kirk Kelley, Mark Miller, Theodor Nelson, Chris Peterson, Phil Salin, and Randy Trigg.