Karl-Erik Tallmo,
lectures

This text may not be re-published, printed or copied without the author's permission. Copyright © Karl-Erik Tallmo

Speech given at "ABM Day" (ABM, in English ALM: Archives, Libraries, Museums) about electronic publishing, at the Royal Library in Stockholm on December 15, 1997.



The Electronic Word: An Epistemological Dilemma

When the edition, the individual copy, and maybe even the author disappear
- what can one rely on?

It is said that we live in the Information Age, but many of us feel as though we are surprisingly uninformed. Information technology has long had the tendency, so to speak, of being more about technology than about information.

This paradox is somewhat akin to a physicist discovering that increased particle activity at the molecular level does not result in increased heat. Or rather, heat is generated but nothing gets warm.

When everything becomes information - advertisements, political propaganda, school subjects, monetary transactions, measurements and bits and bytes in computers - then a couple of important distinctions are lost. Namely, the difference between data, information and knowledge.

Data are basic measurements, instructions or observations of some sort - 32 km, three seconds, last year, to the right - that can be building blocks of both information and knowledge.

Information is acquired only when these pieces of data are interpreted in some way by being put into a context. Then, if a deeper understanding of that information is acquired, enabling it to be used again, then one has reached the level of knowledge. Having the correct direction pointed out to you is information; knowledge is being able to find your own way.

In other words, being well informed, as it is so often called, does not necessarily mean that one knows anything. An agent who is forced to reveal secrets through torture may be well informed, bursting with strange information he does not himself understand at all.

Of course, the accuracy of the information is enormously important. Data, unlike facts, do not have to be true. That is why information must also be evaluated. We usually instinctively associate knowledge with some form of truth, or at least a correlation with reality and its demands. In other words, information can be false, but knowledge should have some element of truth in it.

This is where we find many of the shortcomings in this new information technology as it is applied today. For information to truly be informative, that is, to be possible to use as knowledge, it must be true, relevant, understandable and, most of all, accessible.

Access to information has, of course, both positive and negative aspects. Incredible amounts of previously unreachable information have been made globally accessible through the Internet, for example, and all of a sudden it seems as though the amount of information in the world has increased dramatically. To some extent this is an illusion; it is mostly the actual flow that has increased. And if one has too much access, without the proper tools for selecting and evaluating the material, then one really doesn't have access to anything at all.

One often hears that in the Information Age we just need to know how to find information. The underlying presumption is that traditional learning is no longer as essential. But in order to judge the relevance and correctness of the information retrieved, a basic cultural and scientific knowledge is required. In fact, such knowledge might be even more important in this day and age where digital data are so easy to falsify - there are no splices, no eraser marks, no trace of changes.

In the Middle Ages, before the innovation of book printing, when things were still written by hand, the individual copy was the constant in the process. This does not imply that the copy was always true, but at least it was fairly invariable.

With the advent of book printing, the possibility to fix text extended from the copy to the edition - an incredible advancement. At the moment of printing, text was frozen and each and every copy came out the same as all the others.

In that respect, however, we have now taken a couple of steps backward: electronic texts today do not actually have any fixed format at all.

When we deal with the electronic word, the problem of authenticity and preservation becomes apparent as soon as we type a few characters on the keyboard. We have no guarantee that they will be saved, or saved correctly, onto the hard disk. And when texts are subsequently distributed via networks or on diskettes - or, for that matter, on CD-ROM disks, which now exist in recordable form - then we can no longer be sure that texts will look the same as they did when they left us.

For a long time, people have talked about the problem of safely transferring account numbers over the Net in order to facilitate electronic commerce. And of course that is a very important question. But people are just now starting to realize that transferring and storing texts on different servers in such a way that they cannot be falsified is also a very big problem. Why is that so important, and who would get the idea of falsifying something?

It is important because we will soon be living in a society where a large portion of education and decision making will be based on electronic documents.

Why, then, would anybody want to falsify information? The most obvious motive is probably political: revisionist historians who want to rewrite history books and delete the atrocities of Nazism, for example. Of course, it doesn't have to involve such spectacular issues; it could also, for example, be about withholding information which one does not believe should be distributed at a given point in time because political opinion is leaning a certain way.

Researchers who want to advance their careers could alter electronic texts that they make reference to in such a way as to support their hypotheses. Supposedly, this already happened in the United States a couple of years ago. When decision making is automated, it is important that the computer programs and the legislation that such decisions are based upon have not been manipulated by someone who could profit from a certain outcome.

The question then becomes: is there a way, in the electronic realm, to restore the confidence in and prestige of the source? With the aid of electronic signatures and other tricks, one can at least try. Every publisher could, perhaps, reserve a special server that he or she has control over, and which could be constantly monitored. Otherwise the problem is the endless number of backup copies and mirror sites circulating with the same information. Such a special server would be the officially authorized source for researchers and academics to turn to when they need reliable information.
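The underlying idea can be illustrated with a simple checksum scheme. The following is only a minimal sketch in Python: the function name `fingerprint` is my own illustration, and it is assumed that the authorized server publishes a cryptographic digest alongside each document. A real system would use public-key signatures rather than a bare hash, so that the digest itself could not be forged.

```python
import hashlib

def fingerprint(text: str) -> str:
    """Return the SHA-256 digest of a text, as a hex string."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

# The authorized server publishes this digest alongside the document.
original = "At the moment of printing, text was frozen."
published_digest = fingerprint(original)

# Copies found elsewhere can be checked against the published digest.
mirror_copy = "At the moment of printing, text was frozen."
tampered_copy = "At the moment of printing, text was revised."

print(fingerprint(mirror_copy) == published_digest)    # True: the copy is intact
print(fingerprint(tampered_copy) == published_digest)  # False: the copy was altered
```

A reader who fetches a copy from a mirror recomputes the digest and compares; any silent alteration, however small, changes the digest completely.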

The existence of several copies of the same file out on the Net is both a problem and an advantage. As they say, one should not put all one's eggs in one basket, and the more copies of a document there are, the safer it is, of course.

But at the same time, this creates a problem with authenticity. Some of the copies could be falsified. Or all of them could be false, creating, for example, a big problem for journalists who use the classic method of checking two independent sources before writing an article. How does one know that information found at two different locations on the Internet is independent? The two could be direct copies of each other.

So, perhaps the most important thing is to foster a level of healthy skepticism. One must learn to consider information taken from the Net as being equal to having heard it from an acquaintance; and one must continually take into consideration the reliability of that acquaintance. Is this a person who without hesitation passes on legends or who usually embellishes his or her stories, or is this a reliable and truthful person who does not usually comment on things that he or she has no personal knowledge about? Is the person speaking in the capacity of a professional, a private individual or merely out of general interest?

It is typical of the Internet that so many non-professional publicists have a voice there. This is both the Net's strength and its weakness. All types of information may co-exist here: commercial and non-commercial, authorized and unauthorized. My understanding is, however, that these formats complement each other. Almost daily, I use information from the Encyclopedia Britannica, but I also frequently glean information about obscure subjects from enthusiastic amateurs; things which simply are not available from traditional sources.

At the same time, I am glad that I am somewhat well-rounded. For the same reasons that I would not dare trust a calculator without having some idea of the multiplication tables, I would not trust the Britannica without having at least a cursory overview of history, geography and other basic facts. That is why I think it is incredibly important that our schools do not make it their primary goal to turn our children into full-blown multimedia producers, but rather to teach basic subjects. Multimedia technology can easily be learned on the job, but few companies teach, for instance, the rivers of Africa or the capitals of South America.

The more we progress into an information society - where information is the basis for purchases and sales, where the content of the employees' brains is the most important resource in the company - the more clearly I think it will become that all knowledge (or at least basic knowledge) is in reality a meta-knowledge. This knowledge about knowledge - or knowledge essential for us in order to acquire and adopt other knowledge - will help us evaluate the relevance, truthfulness and context of the information that search engines and other mechanisms find for us.

Three years ago, I wrote an article in the Swedish daily Svenska Dagbladet about this problem, and I concluded then that we could leave the tech part of information technology to the technicians. But the information part of information technology we would have to jealously guard ourselves.

Today, I would like to revise my position a bit. I am afraid that we must also monitor the technology part with a degree of suspicion. There are now many systems being created that handle information and knowledge in a way that will perhaps change the entire human knowledge process.

First of all, systems and programs are now being created to assist decision makers by automating certain decisions. As I have already suggested, not only must the legislation that the decisions are based on not be falsified; the very selection mechanisms and other criteria that the programs utilize must also be protected from unauthorized access.

It should also be noted that the means by which the government provides itself with information - committees, reports and investigative operations - will surely change. In Sweden, for instance, the forms for this are being discussed: one-man investigations versus parliamentary investigations. More than likely, entirely new forms of contact between experts, representatives in parliament and the grass roots will be developed.

Secondly: for some time now we have heard about so-called data mining - the extraction and refining of connections and relationships from databases, a sort of intellectual, mathematically defined processing of raw materials. Now there is even talk of text mining, which involves exploring grammatical relationships with the help of artificial intelligence-like procedures that can extract new and unexpected facts and correlations between and within texts.
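As a toy illustration of what such mining might look like, here is a minimal sketch in Python. The function name `cooccurrences` and the sentence-level window are my own assumptions, far cruder than the AI-like procedures mentioned above: it merely counts which word pairs occur in the same sentence across a small corpus, a simple proxy for a correlation within texts.

```python
import re
from collections import Counter
from itertools import combinations

def cooccurrences(texts):
    """Count word pairs that appear together within a single sentence."""
    pairs = Counter()
    for text in texts:
        for sentence in re.split(r"[.!?]", text):
            # Lowercase, deduplicate, and sort so each pair has one canonical form.
            words = sorted(set(re.findall(r"[a-z]+", sentence.lower())))
            pairs.update(combinations(words, 2))
    return pairs

corpus = [
    "Data are basic measurements. Information is interpreted data.",
    "Knowledge is information that can be used again.",
]
pairs = cooccurrences(corpus)
print(pairs.most_common(3))
```

Real text-mining systems add grammar and statistics on top of counts like these, but the principle - treating text as a raw material to be processed - is the same.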

I suspect that this idea of viewing information and text as a sort of mineral or raw material will, in many respects, permeate an increasing number of fields. Even within the private realm, people will probably tinker with some form of text mining.

As an example, I myself have often brutally butchered electronic books that I thought were too slow to handle or otherwise idiotically made. In those cases, I lifted out the pure text from its graphical interface or search engine or presentational form that the producer had chosen. Then I could use some other form of search tool, a regular word processing program for example, in order to more efficiently search through the text. This poses, of course, numerous questions regarding redefining concepts such as reproduction for one's own use, the properties that constitute an artistic work that can be legally protected and other issues which I won't go into here. The most interesting thing here is, however, the fact that this is paving the way for a new, more concrete appearance of the personal reading concept.
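The butchering described above can be as simple as stripping a text of its markup so that ordinary tools can search it. A crude sketch in Python, assuming the book's pages are wrapped in HTML-like tags; the function name `strip_markup` is illustrative, and a regular expression is of course no substitute for a robust parser:

```python
import re

def strip_markup(html: str) -> str:
    """Remove tags and collapse whitespace, keeping only the running text."""
    text = re.sub(r"<[^>]+>", " ", html)      # replace each tag with a space
    return re.sub(r"\s+", " ", text).strip()  # collapse runs of whitespace

page = "<h1>Chapter I</h1><p>The <i>electronic</i> word has no fixed edition.</p>"
plain = strip_markup(page)
print(plain)                      # Chapter I The electronic word has no fixed edition.
print("fixed edition" in plain)  # True
```

Once the text is lifted out of its presentational form, any word processor or command-line tool can search it - precisely the freedom to choose one's own reading tool.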

Until now, when we have been referring to a person's own reading of a certain work, we meant his or her inner staging, version or mental interpretation of it. In the future, a more tangible, personal reading style will totally reformulate what a work is. Works will change and become something else with every new reader or user, depending on which reading tool he or she chooses.

Soon, I believe, one will simply purchase a raw body of text or other information which one will then read using one's own tools of choice, retrieving information and finding connections, structures and other relationships within the texts.

Man is the measure of all things, and perhaps one of the results of the new technologies, especially within artificial intelligence, will be that after centuries of dreaming, we will finally be able to have a fleeting sense of what it is like to view humankind from the outside. Perhaps we will be able to let a non-human subject give us momentary insights from another perspective.

To summarize, I believe that many trends are in conflict with one another, and that the outcome is uncertain.

As I have already mentioned, one of the issues before us is the issue of democracy. As recently as yesterday, Bo Södersten (in an article in Dagens Nyheter, December 14th, 1997) discussed the special refuge in society reserved for monetary policy, and it is striking how many of his criteria may be applied also to information. The question, in other words, is: will information become yet another isolated and protected area, or will it lead to greater influence through direct democratic methods and increased public access to official sources through IT?

In a similar manner, there is a conflict between freedom of information and the greedier guarding of every little thought as a potential gold nugget. There is a lot of talk in the business world about knowledge sharing. According to a recent study by the Delphi Group in Boston, this is happening very slowly. More than half of the 650 IT managers interviewed saw current business culture as a hindrance to knowledge management. Jeff Held, of Ernst & Young's technology center in the United States, was quoted in Computer Sweden on Friday (no. 82, December 12th, 1997) saying:

"One can talk about knowledge management until one is blue in the face, but nobody shares what they know before finding out what sort of advantage they will get themselves."

Thomas Jefferson is often quoted in the discussion of freedom of information. He said that one can share information without losing anything, in the same way that someone "who lights his taper at mine, receives light without darkening me". It is, of course, a very appealing thought. Still, one has to wonder if that was not an attitude more easily taken, and more affordable, when the material world was still the focal point of commerce and trade.

Freedom of information versus information protectionism. This is where the current battle over copyright plays a big role. Usually, copyright is viewed as having two dimensions, the economic aspect and the moral, the latter dealing with the integrity of the work - how it is presented and distributed, so that it does not appear in a form or in a context that the originator has not envisioned.

The more we become an information-based society, with free-flowing data unconstrained by fixed editions, the more important the moral aspect of copyright will probably become. Authenticity and moral right seem, under the current situation, to be a "marriage made in heaven". In the long run, I think, other means will be required. Who knows, perhaps we will return to a "medieval" system where not even the author is always a constant.

For the time being we will probably need these old tools in order to maintain at least a few fixed navigational points in the present flood of digital information, which, from an epistemological point of view, is truly confusing.



Translated from Swedish by Henrik Nordström.

