
The Impact of Electronic Publishing on the Academic Community


Session 7: Supplementary papers

Riding the knowledge waves of the centuries to come

Heinz-Dieter Böcker

GMD-IPSI, Dolivostrasse 15, 64293 Darmstadt, Germany
bocker@darmstadt.gmd.de

©Portland Press Ltd., 1997.
Prediction is difficult, especially of the future (Niels Bohr)

The process of knowledge accumulation

The process of knowledge accumulation in society has undergone major changes, even revolutions, over centuries and millennia. Initially, knowledge was accumulated in the heads of the elders of a group or tribe and was communicated through tales, taboos, commonly agreed-upon rules of behaviour, and myths. This tradition of collecting and communicating knowledge orally was the dominant method for hundreds of thousands of years of human existence. The knowledge accumulated was informally evaluated through the mechanism of survival of the fittest.

In most parts of the world the oral tradition has been replaced by a written one; the idea of a book as we know it today, or variants thereof, has been with us for a few thousand years now, with the Gutenberg press a more recent invention. It was only yesterday that electronic information processing and storage technology entered the stage, and already the rules governing the process of accumulating and preserving knowledge are no longer what they were. The amount of available information and knowledge is exploding, and since information and knowledge consume attention, we are all suffering from it. To make things worse, it is widely believed that the current ripples are only the precursors of the giant waves yet to come.

The following is an attempt to take a closer look at the process of knowledge production, aggregation and communication, and to identify some of the major factors influencing this process.

A very simple model

In the abstract, knowledge is generated by somebody (an author), kept by society and eventually reused by somebody else. Solutions to real-world problems are generated to serve the wants and needs of humans, and they are kept in knowledge repositories (e.g. libraries). When similar problems are encountered later, one hopes to find these stored-away solutions rather than having to reinvent the wheel by laboriously working from first-order principles. Problem-solving knowledge that is saved after the problem at hand has been solved fits the standard producer–consumer model with an intermediate buffer to hold the goods of trade (see Figure 1).


Figure 1. The producer–consumer model
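
In code, the model is simply the familiar producer–consumer pattern with the repository as the buffer in between. The following is a toy Python sketch, not a serious proposal; the data layout and the matching test are invented for illustration:

    from collections import deque

    # The repository (e.g. a library) is the intermediate buffer
    # between the producers and the consumers of knowledge.
    repository = deque()

    def produce(author, problem, solution):
        # An author deposits a solution to a real-world problem.
        repository.append({"author": author, "problem": problem,
                           "solution": solution})

    def consume(problem):
        # A later problem-solver looks in the buffer first, hoping not
        # to have to work from first-order principles again.
        for entry in repository:
            if entry["problem"] == problem:
                return entry["solution"]
        return None  # nothing stored away: reinvent the wheel

    produce("Archimedes", "buoyancy", "displacement principle")
    print(consume("buoyancy"))  # -> 'displacement principle'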

For society as a whole, it is crucial that only 'correct' problem-solving solutions are allowed to be added to the repository. 'Correct' is meant in a very broad sense; e.g. it may translate into 'consistent with already existing knowledge', 'socially accepted' or 'useful at a given historical time'. To guard against erroneously adding 'wrong' solutions, societies have always been willing to accept the high overhead costs of ancient priests, the Holy Inquisition, medieval librarians, an elaborate patent system or modern reviewing procedures for scientific papers. Only knowledge that passed these input filters was allowed to enter the repositories of accumulated knowledge.

Similarly, output filters have been installed by society to control the reuse of the accumulated knowledge. Brotherhoods and guilds are just two types of institution appointed by society as gatekeepers, handing knowledge out to the general public in piecemeal fashion. This filter is at the core of the German proverb "Wissen ist Macht" (knowledge is power). Other output filters may exist, among them implicit ones caused by language or cultural differences between the consumer and the initial producer of knowledge.
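
The two kinds of filter slot into the sketch above as simple predicates guarding entry to and exit from the repository. Again, this is only an illustration; the acceptance tests are placeholders for the social mechanisms just described:

    def input_filter(entry):
        # Write-time gatekeeping: priests, librarians, patent examiners,
        # reviewers. A placeholder test for 'correct' in the broad sense.
        return entry.get("solution") is not None

    def output_filter(entry, consumer):
        # Read-time gatekeeping: guilds handing knowledge out piecemeal,
        # or implicit barriers such as a language difference.
        return entry.get("language", consumer["language"]) == consumer["language"]

    def produce_filtered(entry):
        if input_filter(entry):          # only 'correct' solutions enter
            repository.append(entry)

    def consume_filtered(problem, consumer):
        for entry in repository:
            if entry["problem"] == problem and output_filter(entry, consumer):
                return entry["solution"]
        return None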

What is causing problems today?

Given the model above, we can try to trace the now almost proverbial information overload from which everybody claims to be suffering today, and pinpoint a few simple reasons for it.

(i) Far more potential authors

Compared with just a few decades ago, far more people are able to read and write. Education to at least high-school level has become the standard in the industrial nations. Moreover, ambitious, young, though still economically poor, nations are beginning to produce scientists by the thousands.

(ii) Far easier production processes

Some ten years ago, with the advent of user-friendly desktop publishing systems, the physical production of information ceased to require much specialized knowledge and became more affordable. The whole process was taken out of the hands of typists and typesetters and placed in the hands of the authors. We all know the consequences: the paperless office once foreseen was swamped by a flood of easily produced papers. Note that this event predates the rise of the Internet.

(iii) Malfunctioning filters

Desktop publishing, however, was only the first effect of information technology. The Internet is now gradually taking the distribution process out of the hands of the publishers. The input filters no longer work: they are bypassed by the authors, and the centralized repositories are becoming distributed ones. This undermines the very concept of scholarly knowledge, in that the boundary between scholarly publishing and publishing for the general public becomes blurred. Of course, there are differences between disciplines: highly specialized fields of the natural sciences are less affected than the humanities.

Likewise, the traditional output filters mentioned above have been largely demolished (at least in some parts of the world), and the new ones, search engines, though more democratic in nature, are not yet powerful enough. It is all too easy to pose a question in such a way that they deliver either too much or not enough.
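
The too-much-or-not-enough failure is easy to reproduce with a naive keyword search. A toy Python example (the corpus is invented for illustration):

    corpus = [
        "electronic publishing and the academic community",
        "the economics of academic publishing",
        "electronic music in the twentieth century",
        "knowledge accumulation over the centuries",
    ]

    def search_any(query):
        # A broad query: a document matches if ANY query word occurs in it.
        words = query.lower().split()
        return [doc for doc in corpus if any(w in doc for w in words)]

    def search_all(query):
        # A narrow query: a document matches only if ALL query words occur.
        words = query.lower().split()
        return [doc for doc in corpus if all(w in doc for w in words)]

    print(search_any("electronic"))                    # too much: music turns up too
    print(search_all("electronic academic journals"))  # not enough: nothing at all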

One could even go so far as to argue that the whole model described above is about to break down. The traditional, well-established model of accumulating knowledge in the sciences follows what I call the bricklayer's model: putting bricks upon bricks to build the edifice of science. The building is erected in planned, principled ways, and the whole process is organized and supervised by academia, with its institutions, and the patent office, among others. Libraries are the cathedrals of this approach. It is becoming increasingly obvious that this brick-by-brick model is largely inadequate as a description of what has evolved over the past 20 years with the advent of information technologies. Variations among scientific disciplines notwithstanding, there is no longer anyone in control of a systematic building process; the building codes established by academia, and monitored by learned societies, are violated en masse. New ideas are now invented and reinvented every day under largely unorganized circumstances; they fall through interdisciplinary cracks and get published in an increasing number of conference proceedings, books and journals, in languages sometimes understood by only a subset of the participating (scientific) community.

A more adequate metaphor for what is happening today can be borrowed from the kitchen. The accumulated body of knowledge corresponds to a pot of boiling soup to which many cooks try to add ingredients. New ideas and variations of old ones are continuously thrown into it by an unlimited number of cooks. The ideas briefly float on the surface, lump together to form little islands showing a certain degree of local consistency, submerge, surface again already transformed in shape and gradually melt away into unrecognizable entities. The level of the soup is constantly rising, i.e. the amount of knowledge is increasing, but there is hardly any recognizable structure to it whatsoever. There is more soup than ever before, and it is believed to taste better.

What options do we have?

Using the simple model presented above as a guide, we can try to point out ways to avoid or calm the knowledge waves threatening us, some of them involving computers, some effective on different grounds. We briefly sketch a few possibilities (in some cases admittedly tongue-in-cheek) relating to the process of writing, to the input and output filters, and to the way knowledge repositories are run by society.

(i) Enforcing self-censoring

Traditionally, quality assessment and the choice of publishing media are coupled in obvious ways: what is considered more important is published on more durable material. If in the future scientific papers were acceptable for conferences only if they were submitted as scientific stones, i.e. the text had to be chiselled by hand into, say, basalt, many a publisher would soon close down and talking would again be preferred over writing. These initial successes in limiting the incoming information stream might be outweighed by the durability of the material, though. In the long run they may not save us from eventually being flooded with many more stones than we will ever be able to read; remember, I am concerned about the millennia to come.

(ii) Tuning the input filter

Much tighter reviewing procedures could be adopted for journals, books and conferences. Any author would be allowed a maximum of, say, one new stone per year. Alternatively, the readers could be given more control by various schemes, e.g. the authors paying for a publication and the readers refunding them.
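
The quota is nothing more than rate limiting applied at the input filter. A minimal sketch, with the one-stone-per-year limit from above (author names and return values are, of course, invented):

    from collections import defaultdict

    accepted = defaultdict(int)   # (author, year) -> publications accepted
    QUOTA_PER_YEAR = 1            # one new 'stone' per author per year

    def submit(author, year, paper):
        # The tightened input filter: reject anything over quota.
        if accepted[(author, year)] >= QUOTA_PER_YEAR:
            return "rejected: annual quota exhausted"
        accepted[(author, year)] += 1
        return "accepted: " + paper

    print(submit("Boecker", 1997, "knowledge waves"))  # accepted
    print(submit("Boecker", 1997, "a second stone"))   # rejected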

We could also adopt an idea invented by Thales of Miletus in the first half of the 6th century B.C., which may be called an incremental reviewing procedure: first thoughts were to be written in sand, the sounder ones would be transcribed onto silver plates, and the best of the silver plates would eventually be transcribed onto a single gold plate.
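
Read as an algorithm, the Thales procedure is a promotion pipeline between media of increasing durability and decreasing capacity. A playful sketch; the capacities are invented, only the three media come from the story:

    # Three media of increasing durability and decreasing capacity.
    tiers = {"sand": [], "silver": [], "gold": []}
    capacity = {"sand": None, "silver": 10, "gold": 1}   # None = unbounded

    def write(thought):
        # First thoughts always go into sand.
        tiers["sand"].append(thought)

    def promote(thought, from_tier, to_tier):
        # The incremental review: only the sounder thoughts are transcribed
        # to the more durable medium, whose scarcity enforces selectivity.
        cap = capacity[to_tier]
        if cap is not None and len(tiers[to_tier]) >= cap:
            raise ValueError(to_tier + " is full: nothing more may enter")
        tiers[from_tier].remove(thought)
        tiers[to_tier].append(thought)

    write("water is the first principle")
    promote("water is the first principle", "sand", "silver")
    promote("water is the first principle", "silver", "gold")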

(iii) Don't worry, rely on natural decay

Information technology has not only provided novel and more efficient ways of generating information. Fortunately enough, it has also invented new storage media at a pace that guarantees that what is written today cannot be read tomorrow for lack of surviving reading devices. Or, vice versa: where the device survives, the tapes and disks have lost their magnetization. Maybe we will simply have to acknowledge that knowledge is a perishable good. How do we know whether the flames that destroyed the library of Alexandria were a curse or a blessing?

We could, however, take a more active role in limiting our knowledge repositories to a manageable size. Instead of, for example, collecting each and every thing, we could impose an upper limit on the number of books a library is allowed to hold. As science academies limit the number of their members, accepting a new one only when a previous one steps down or dies, the library would acquire a new book only by giving up an old one. Maybe this is what is already happening today, dictated by the limited space for setting up bookshelves.
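
In computational terms this is a fixed-capacity collection with an eviction policy. A minimal sketch, with a deliberately crude discard-the-oldest rule (the titles and the limit are invented):

    MAX_BOOKS = 3
    library = ["book A", "book B", "book C"]

    def acquire(new_book):
        # One in, one out: like an academy of fixed size, the library
        # admits a new holding only by giving up an old one.
        if len(library) >= MAX_BOOKS:
            discarded = library.pop(0)   # crude policy: evict the oldest
            print("deaccessioned:", discarded)
        library.append(new_book)

    acquire("book D")   # 'book A' leaves the shelves to make room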

Another option would be to have more institutions whose sole task is to condense and weigh knowledge. Whenever a question is posed to the repository, we would start procedures to generate answers from first (or second) order principles.

(iv) Homoeopathy

Computers! We could employ the very technology that caused the problem in the first place to solve it. If we store everything in machine-readable form, and if we dramatically improve our search engines, we may be able to cope with the problem.

Note that each of the aforementioned methods is able to solve the problem on its own; we do not necessarily need a combination of them. We could do away with reviewing procedures if search engines were selective and intelligent; alternatively, if reviewing procedures were strict and systematic enough, we could spare ourselves the search engines. The trade-off between input and output filters is another instance of the very basic, ever-recurring trade-off between information processing at read-time and information processing at question-time.
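
That trade-off has a familiar computational form: invest effort when information is stored (build an index once) or when a question is asked (scan everything each time). A minimal sketch of both ends, on an invented three-document repository:

    documents = {
        1: "knowledge is power",
        2: "prediction is difficult",
        3: "the sciences of the artificial",
    }

    # Processing at read-time: build an inverted index as documents arrive.
    index = {}
    for doc_id, text in documents.items():
        for word in text.split():
            index.setdefault(word, set()).add(doc_id)

    def query_indexed(word):
        # Question-time is now cheap; the cost was paid at read-time.
        return index.get(word, set())

    def query_scan(word):
        # Processing at question-time: no preparation, but every
        # question pays for a full scan of the repository.
        return {i for i, text in documents.items() if word in text.split()}

    print(query_indexed("knowledge"))  # {1}
    print(query_scan("knowledge"))     # {1}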

The problem with all these measures (which are just examples; many more could be conceived), except perhaps the last one, is of course that currently nobody is able to enforce them. There is no central authority controlling the whole process.

On the other hand, the very fact that "there is so much knowledge around" may not matter for practical purposes. After all, in practice we look for satisficing solutions; we usually do not have the time or money to prove that we have found optimal ones.

So, what?

Having said all this, one may well ask oneself: isn't this much ado about nothing? Do we need to care? Won't there always be a stable equilibrium, by definition, anyway?

I am not interested in short-term solutions. I am confident they will be found; most probably the market, i.e. money, will decide. I am, however, interested to learn about long-term strategies for coping with the knowledge and information explosion; and by long term I mean 100 or more years. Is there anything we know today that we believe to be of some value to humans living 100 years from now? If so, what is it, and what measures do we take to preserve it? To me, it is surprising to see the same humans who pride themselves on living in a cultural tradition spanning hundreds if not thousands of years being unable to imagine and plan for a mere 50 years to come. Obviously, this is just another instance of Simon's [1] principle of bounded rationality.


Reference

1. Simon, H.A. (1981) The Sciences of the Artificial, 2nd edn, MIT Press, Cambridge, MA.

