7
The Hyperdocumentalists of Our Lives

Papers:

I want all my papers, manuscripts, notes and documents to be preserved after my death in their unity: those which served me to prepare the Palais Mondial and what must link the International City and the intellectual synthesis on which I worked. What I have been able to realize and publish is only a small part of what I have conceived, projected, studied, imagined. I wish this ensemble to remain one, as in my thought it is one, and even if I should not have had the time to give it the intellectual and material (documentary) unity I propose to give it, may my friends honor my memory by realizing this wish. This desire is formulated in the conviction that my life was one, and that, just as alive I could still have greatly increased the work of the P.M., dead I could, through these manuscripts, continue to work on it. (Excerpt from Otlet’s will, addition 45, dated September 30, 1923, author’s translation)

Documentary production with personal or individual value implies thinking about its transmission. Otlet’s will, which he amended throughout his life, especially after the announcement of the death of his son Jean during the First World War, is an interesting document for understanding a man’s determination to ensure the transmission of a work that is difficult to fully distinguish from his life itself. The Mundaneum is thus an Otletaneum, and the reciprocal is probably true, as the epitaph he chose attests: “Here lies Paul Otlet. He was nothing but a mundanean.”

But Otlet is an exceptional case that Ron Day sums up well:

By means of literary devices, Otlet’s text goes beyond its own time, projecting humanity into a future that Otlet desired to create, both through information technologies and techniques and through the very rhetorical force of his texts. (Day 2001, p. 10)

How do we approach the documentation of our own existence? What should really be transmitted? Should we consider drawing up specific wills, or even codicils, for our documentary productions and personal data? Should we, like Otlet, ask for certain works to be collected for publication? To whom should we bequeath our accounts on social networks, and for what purpose?

These are all issues that require attention, and knowledge of how to select the custodians of the documentary legacy. Who are the documentalists of our lives, that is to say, the actors who participate in the collection and description of our lives? Are they competent to ensure the preservation, transmission and even synthesis of the legacy? Are they capable of linking our documentary production in an interrelated and reticular way? Who will write the descriptive record that accompanies our obituary? While we find this type of competence among archivists, documentalists, historians, biographers and other prosopography enthusiasts, the creation of detailed records seems to be the prerogative of enthusiasts, as in the case of this character described by Georges Perec:

He was no longer a professor or Cultural Attaché, he worked in the library of the Institute of Religious History. An ‘old scholar’ whom he had apparently met on a train was paying him 150 francs a month to put the Spanish clergy in files. In five years, he wrote 7,462 biographies of clergymen in office during the reigns of Philip III (1598–1621), Philip IV (1621–1665) and Charles II (1665–1700) and then classified them under 27 different headings (by an admirable coincidence, he added with a sneer, 27 is precisely the number reserved for the general history of the Christian Church in the Universal Decimal Classification, better known as the U.D.C.). (Perec 1980, p. 251, author’s translation)

Today’s reality appears less trivial. The creation of records and their transformation into searchable and comparable profiles has become a routine activity for intelligence services and marketing departments alike, not to mention political groups and lobbies, even if hypocrisy often reigns when categorization data, generated from the traces we leave voluntarily or involuntarily online, becomes visible. The acquired and permitted freedom of expression is also a license to say a great deal about oneself, such as sharing one’s own corpus.

We showed earlier that Ferraris envisaged a form of resurrection via documentary development, which consists of producing traces that can subsequently be consulted and transmitted. “This is my corpus” thus becomes the new adage of a documentality that allows the transmission of traces and other documents once we are no longer physically present. The transmissible corpus here joins the idea of a double that could resist time a little longer: the documentary double we would like to transmit to others, a positive double, a documentarized ka (Le Deuff 2009) that arises with the first documentary heralds (ultrasound scans) and continues into post-mortem pursuits:

Information began to precede the person. It became possible for information to draw up persons as if out of nowhere. We became coddled from cradle to coffin by so many check boxes on so many scraps of paper. We began being born onto forms: the ubiquitous birth certificate certifying the inauguration of a lifelong paper trail that would outlive even the eventualities of our death certificates. (Koopman 2019, p. 6)

It is also the awareness that our existences require forms of recognition: forms to prove that we exist and that we have the qualities required to claim this or that entitlement.

We are paper beings about to become data beings, surrounded by metadata. We may as well consider that a reticular weave now takes the place of enclosure as the solid base on which our existence can rest and bounce back.

7.1. The hyperdocumentalists of self

Marie Martel defended the idea of “self-librarians”1 as a means of managing the quantity of data and personal elements that can be captured during a lifetime. She was reacting in particular to the Total Recall project of Gordon Bell and Jim Gemmell, a title borrowed from Philip K. Dick, whose goal is to film every moment of one’s life so as to be able to review it at any time. Martel reminded us that storing everything in no way guarantees the relevance of the construction of a “document of life” as a work to be transmitted. She showed (Martel 2011) the stakes of a hyperdocumentation correlated with the construction of the future, and the challenge for librarians and documentalists to seize the opportunities:

Collecting, organizing, indexing, storing and backing up represents a prodigious amount of work: how can we do it for our data, archives, photos, texts, statutes, our stories, our own book? It could be that, deprived of this science of memory, we could find ourselves somewhat lost and may well need this librarian of ourselves to succeed (the document of) our life. And then, as we will have participated in the production of the contents, we should participate in this process of managing and archiving our productions while making the decision to enter alive into the collections of our public library or archive center. And, before we die, we will soon make arrangements, as with our organs, to give scientific or public institutions permission to use our data. Librarians and archivists could be promised a great future as accomplices in our lives in documents, as trainers to guide us through public or Open Data projects for the good of our data, as artisans of a context and meaning for us and those who will succeed us, as guardians of the memory of our digital project inscribed in a framework that we will come to visualize within a planned future perhaps, with new clarity and distinction. (Martel 2011, author’s translation)

The radiant future that Marie Martel sketched in 2011 does not really seem to have materialized 10 years later. Not only because intermediaries are still very much the leaders of the Web and have not stopped increasing their decisive and decision-making role on the issue of personal data despite legislation, but also because professionals seem to have been unable to fully grasp these opportunities, for lack of training and of real ambition in this area. It is true that, contrary to what Otlet, Briet and Pagès had wished, it cannot really be said that the “documentalist” vision has taken precedence over the “librarian” vision. The claim of libraries to fight against disinformation and against GAFAM seems somewhat disconnected from reality: first, because the quality of information is not always present, and then because the need for specialized professionals comes up against a vision that is too literary and insufficiently technical to provide truly effective mediation and training. So how can we really manage to play a role in the construction of the “life document”?

Hyperdocumentalists must be mediators as well as technicians, mechanologists as well as documentologists and even datalogists. But where are these professions now? How can they emerge when the representation of their institutions is often dated, and when the success of mediation depends on renewed trust? We have the feeling that archivists have sometimes managed to seize these opportunities better, by anticipating ever more effectively which documents should be preserved in the present with their future uses in mind.

Training issues appear crucial at this stage, and the need to build or reinforce i-school-type programs is paramount in this respect, because it is clearly a matter of going beyond procedural visions and of making the need for information part of everyday life.

The user was already registered in the library as a borrower, but the logic of the library as a whole has in fact spread to all activities, including the most ordinary ones, which are not necessarily cultural or intellectual. Amazon is a good example of this expansion of the library model as a content-rich online bookstore. The platform now sells just about anything and everything; the goal is to create a kind of world platform that can meet any need, not just the need for information.

This transformation ultimately resembles somewhat the situation of the hero of Haruki Murakami’s Kafka on the Shore, who takes up the library as a living space:

‘There is one piece of good news, though,’ he says. ‘We’ve decided to take you in. You’ll be a staff member of the Komura Memorial Library. Which I think you’re qualified for.’

Instinctively I glance at him:

‘You mean I’m going to be working at the library?’

‘More precisely, from now on you’ll be a part of the library. You’re going to be staying in the library, living there. You’ll open the doors when it’s time for the library to open, shut them when it’s time to close up. As I said before, you seem to be a pretty self-disciplined sort of person, and fairly strong, so I don’t imagine the job will be very hard for you. Miss Saeki and I aren’t all that strong physically, so it’ll really help us out a lot. Other than that, you’ll just help with small day-to-day things. Nothing to speak of, really. Making delicious coffee for me, going out shopping for us. We’ve prepared a room that's attached to the library for you to stay in. It’s originally a guest room, but we don’t have any guests staying over so it hasn't been used for a long time. That’s where you’ll live. It has its own shower, too. The best thing is you’ll be in the library so you can read whatever you like.’ (Murakami 2005, p. 143)

In fact, the initial discourse of Web 2.0 suggested that the user was at the center of the system as its controller; “you control the information age” was the promise taken up by Time2. But this was a deception: the user is indeed at the center of the system, but as the object of all the attention, and not necessarily the most benevolent kind. The individual is no longer a simple borrower’s card or a police record card; he or she is recorded and registered again and again, on multiple occasions.

7.2. From the found friend to the “caring” lover

I asked several private detective agencies to find, in France or elsewhere, the trace of a young woman between twenty and thirty years old, tall, blond, with pale eyes, with a small spot under the right eyelid, a slight stutter; the information sheet also mentioned that she might have been wearing a perfume called ‘Sampang’, that she might have been called Véronique Lambert, that her real initials might possibly be E.B., that she had been brought up in the South of France, had lived in England and spoke English very well, had studied, was interested in archaeology, and that her mother, finally, was, or had been, a singer. (Perec 1980, p. 179, author’s translation)

An instrument of information research, and ultimately as much an instrument for the indexing of knowledge as for the indexing of lives, the index card has not totally disappeared from our current interfaces. It covered different intellectual realities, from the note-taking slip, which made notes easier to manipulate, to the playing card used as a library catalog card. Compilers such as Conrad Gesner (1516–1565), who valued it above all else, selected it for its manipulatory potential. The playing card later became the appropriate format and enjoyed an almost viral success: initially used by the physicist and mathematician Georges-Louis Lesage (1724–1803) and then by Rousseau, the card came to be used for bibliographical purposes, as we have previously mentioned.

Much later, Paul Otlet and Henri La Fontaine worked on the standardization of records and launched an intense production, whose extent Otlet recalled:

The number of books printed since Gutenberg reached 12 million at the beginning of this century; the production recorded in the Universal Bibliographic Repertory of the International Institute of Bibliography and Documentation, in its authors and subjects sections, already contains 15 million records. (Otlet 1935, p. 387, author’s translation)

The transition to information technology and the integration of bibliographic standards within the MARC formats gradually distanced interfaces from the representation of the cardboard card. Cards are nevertheless still present, but their philosophy is changing, as we showed in the introduction: the search for information gives way to the search for attention. Indeed, it is increasingly a question of putting forward a self-image designed to arouse the desire to know more.

As far as social networks are concerned, the card technique used in a graphical interface can be traced back to the origins of Facebook, with the famous Facemash “hack” created by Zuckerberg using photos from Harvard’s student facebooks. The device consisted of voting for the more attractive of two people, whose photos were displayed as if they were playing cards. The results were then compiled into a ranking.
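The compilation of pairwise votes into a ranking is commonly reported to have relied on an Elo-type rating scheme. As a minimal, purely illustrative sketch (the profile names, starting rating and K-factor are assumptions, not drawn from the actual system), the mechanism can be written as:

```python
# Elo-type rating from pairwise votes: each "duel" between two cards
# updates both scores, and the scores yield a ranking.

def expected(r_a: float, r_b: float) -> float:
    """Probability that A is preferred to B under the Elo model."""
    return 1 / (1 + 10 ** ((r_b - r_a) / 400))

def duel(ratings: dict, winner: str, loser: str, k: float = 32) -> None:
    """Update both ratings after one vote: `winner` was preferred."""
    gain = k * (1 - expected(ratings[winner], ratings[loser]))
    ratings[winner] += gain
    ratings[loser] -= gain

ratings = {"profile_a": 1400.0, "profile_b": 1400.0, "profile_c": 1400.0}
duel(ratings, "profile_a", "profile_b")   # one vote: a preferred to b
duel(ratings, "profile_a", "profile_c")   # one vote: a preferred to c
ranking = sorted(ratings, key=ratings.get, reverse=True)
```

The point of the scheme is that a vote against a highly rated card counts for more than a vote against a low one, so the ranking stabilizes as votes accumulate.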

This logic is now duplicated in online dating interfaces. Tinder does not hesitate to present profiles as playing cards. The platform centralizes user profiles through logics of geographical proximity, for the purpose of potential encounters. It is not a matter of searching for contacts by affinity or by search fields, but rather of capturing attention. The goal is to encourage the potential contact to linger long enough on the profile, presented in the form of a card, for the user to signal interest via a “like” (a heart); otherwise, the user can drag the card away and never see it again (the “swipe”). A reciprocal like then produces a match. In Tinder’s logic, the like and the dislike are thus turned into playful gestures, except that the underlying data is known only to the device, not to the users.
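The like/swipe/match mechanism described above reduces to a simple reciprocity test, which can be sketched as follows (a minimal illustration under assumed names and data structures; a real platform would add persistence, card queues and recommendation logic):

```python
# Likes are recorded only on the platform's side; a match is declared
# only when a like turns out to be reciprocal.

likes = {}     # who has liked whom (known only to the device)
matches = set()

def swipe(user: str, card: str, liked: bool) -> bool:
    """Register a swipe on a profile card; return True on a match."""
    if not liked:                        # swipe left: card is discarded
        return False
    likes.setdefault(user, set()).add(card)
    if user in likes.get(card, set()):   # reciprocal like -> match
        matches.add(frozenset({user, card}))
        return True
    return False

swipe("alice", "bob", True)    # no match yet: bob has not liked alice
swipe("bob", "carol", False)   # swipe left: nothing recorded for carol
swipe("bob", "alice", True)    # reciprocal like -> match
```

The asymmetry the chapter points to is visible in the code: `likes` belongs to the device alone, and users only ever see the derived `matches`.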

The Grindr application described by Mélanie Mauvoisin shows this predominant attention-grabbing logic:

The interpretation in Grindr allows you to sort the information. In this sense, the construction of the application gives rise to a ‘picking’ logic, from which users reconstruct the digital identity of the owners of the profiles consulted. The same applies to the profile itself, which the user accesses after clicking on the clickable thumbnail (profile image). Through the profile image and the information contained in the profile, it is rather a matter of the user capturing elements or bits of information. (Mauvoisin 2017, pp. 279–280, author’s translation)

The data sheet is thus reduced, the related information minimized in favor of an attentional effect. On dating applications such as Tinder, the object is then “to attract attention more than to actually present oneself” (Mauvoisin 2017, p. 28, author’s translation). The stakes in terms of retrievability eventually diminish over time and remain the prerogative of intelligence services or Open Source Intelligence (OSINT) professionals. These uses are decreasing, even if personal queries remain frequent on search engines. Technically, finding a person through a search engine has become commonplace. The question is now whether it is useful to get in touch with an old acquaintance, whether it may be appropriate to follow his or her publications, for example on Twitter or Instagram, and whether a professional contact may be appropriate on LinkedIn. Being found has become easy; being interesting is much more complex.

The card is also used as an instrument of denunciation on social networks, where a politician, for example, is presented in the form of a card bearing his photo and name, highlighting the fact that he voted for a decision judged negatively by his denunciators. Sharing then allows the viral circulation of the card, which gives new force to the denunciation. This willingness to denounce “enemies” rather than to find friends is increasingly evident.

The shift from information seeking to attention seeking is also coupled with a quantitative rather than qualitative logic. The goal is to count likes, to enable “matches”, to facilitate sorting, rankings and comparisons. Profiles are kinds of forms filled in by the individuals themselves, but whose calculation is carried out by the managers of the systems. It remains to be seen whether these managers are calculators or new archivists.

7.3. Computing centers or archive centers

More and more emphasis is being placed on algorithmic processing and on the potential of artificial intelligence, while the importance of corpora built up and collected over time is probably neglected.

While the overall impression is one of workflow management, issues of digital archive management do arise: without this accumulated mass, it is not possible to apply algorithmic processing, just as it is difficult to apply recognition schemes in artificial intelligence.

It seems appropriate to recall that computer science, and the devices it produces, can be considered an archival discipline, as Yves Jeanneret rightly points out (Jeanneret 2014). But this discipline of the archive does indeed involve the management and organization of its archives, and therefore their ordering, and therefore their command:

The archons are first of all the ‘documents’ guardians. They do not only ensure the physical security of what is deposited and of the substrate. They are also accorded the hermeneutic right and competence. They have the power to interpret the archives. Entrusted to such archons, these documents in effect state the law: they recall the law and call on or impose the law. To be guarded thus, in the jurisdiction of this stating the law, they needed at once a guardian and a localization. Even in their guardianship or their hermeneutic tradition, the archives could neither do without substrate nor without residence. (Derrida 1995, p. 10)

Archive management assumes the exercise of a power that also requires a symbolic place representing that power. Yet these places are complex, multiple and, as it happens, almost indestructible, at least in their collective entirety. Power is exercised as much in the interfaces and visible devices as in the back office and the places where data is stored, the datacenters. The logic of replication and duplication of content protects against any attempted assault on the castle. Dancing in the cloud makes it possible to guard against a partial attack, following the example of Bob Morane’s enemy, the Yellow Shadow (Vernes 1999), who cannot die because copies of himself exist everywhere on the planet.

The weakening of state authorities in this management expertise can also be observed in areas of sovereign power such as health data, which is increasingly entrusted to large IT and Web companies. It is to be feared that the certification of identity, and perhaps even the enactment of related rights, will end up entirely beyond the control of states, if not in the hands of dictatorships. This is the role of the “consignation” described by Derrida, which goes beyond the probabilistic issue to seek the articulation of meanings:

The archontic power, which also gathers the functions of unification, of identification, of classification, must be paired with what we will call the power of consignation. By consignation, we do not only mean, in the ordinary sense of the word, the act of assigning residence or of entrusting so as to put into reserve (to consign, to deposit), in a place and on a substrate, but here the act of consigning through gathering together signs. It is not only the traditional consignation, that is, the written proof, but what all consignatio begins by presupposing. Consignation aims to coordinate a single corpus, in a system or a synchrony in which all the elements articulate the unity of an ideal configuration. In an archive, there should not be any absolute dissociation, any heterogeneity or secret which could separate (secernere), or partition, in an absolute manner. The archontic principle of the archive is also a principle of consignation, that is, of gathering together. (Derrida 1995, p. 11)

From the defense of a unified science, advocated in particular by Otto Neurath, follows a new project: that of a unified prescience that claims not to do evil. This is not the end of science (Anderson 2008), but rather antescience. Consignation is the key element that follows what Jeanneret called “requisition” and Stiegler called “retention”. The scientific project of studying these transformations consequently appears complex, because it is not only a matter of producing an archaeology of knowledge and of the new archival institutions that the GAFAM now are, but of considering a retrospective that forces the researcher to anticipate by imagining different scenarios. While the logic of prescience can imagine anticipating our existences, mixing personal data, cultural data and DNA-type data and comparing them with schemas, it is a question of understanding the crucial stakes that consequently weigh on existences.

Paradoxically, work in the human and social sciences increasingly seeks to denounce abusive forms of power, dominant cultural representations and systemic forms of racism in traditional institutions. While these approaches can be interesting, especially when they also examine the observable drifts of platforms, they run the risk of a splintering, of a hyperseparatism that defends particularisms at the risk of ultimately being dominated by a cultural cosmopolitanism of platforms, whose common and shared forms will reside in media and publication formats defined by the leaders of the Web. Denunciation is ultimately carried out within the same technical and media frameworks as the abuses denounced; if criticism of the system has become frequent, it is nonetheless constitutive of the system it denounces. What is at stake is above all the questioning and study of the system’s infrastructures, because several different forces are at work there, unless one wants to produce only simplistic visions, at best ideological, at worst conspiracist.

It is not possible to decouple messages from their context of production, enunciation and exchange. The archive is not only the content and the document preserved, but something more, marking an exteriority as described by Derrida. Unfortunately, this something more is often difficult to find and consult, as the archive is engrammed in writing and archiving processes from which it cannot fully be removed without loss: the recovered data is reticular, even rhizomatic, to the point that tearing it out is as much a form of liberation as a form of killing.

The archivists of our lives have now changed. Even retrieving our own data proves difficult, despite promises of interoperability. The transmission of archives reduced to their personal and individual unity often proves disappointing, as they lose their collective dimension. One can download the elements one has written and liked, but these are often archives without depth, as if they had lost their collective and interactive dimension; the context of production is stripped of interaction, despite metadata dating the moment each message was sent. Anyone who has ever tried to download their personal data from Facebook or Google has experienced this disappointment.

An archival ill then occurs, that of a lack, of a tearing. This means that one should succeed not only in transmitting data or documents, but in transmitting them with their “fluid”, their existential process.

If we imagine processes that retrieve personal archives in order to transmit them at the organizational level, it is probably necessary to go even further than the OAIS (Open Archival Information System) standard, which anticipates format changes and guides processes to ensure the best possible conservation of the documents produced. Individual and organizational temporalities differ, in particular in the ante and the post.

7.4. Post-mortem hyperdocumentation

The documentality of existences takes place before birth and continues after death. What to keep or preserve, and how to manage this succession? Hyperdocumentation does not cease with death. Several forms are thus conceivable:

– The first, which corresponds to a transmissive hyperdocumentation, is simply based on a documentary transmission that takes into account the documents produced during one’s existence and those that have been acquired and that one wishes to transmit. In this case, care must be taken to transmit only what is really desired, which means not neglecting the hidden corners of one’s own documentation, as in the example of Ulrich’s father in The Man Without Qualities:

It was a loose bundle of all kinds of things that had inadvertently been pulled out with the will from a corner of the desk drawer where it might have lain for decades without its owner knowing. Ulrich looked at it distractedly as he picked it up and recognized his father’s handwriting on several pages; but it was not the script of his old age but that of his prime. Ulrich took a closer look and saw that in addition to written pages there were playing cards, snapshots, and all sorts of odds and ends, and quickly realized what he had found. It was the desk’s ‘poison drawer.’ Here were painstakingly recorded jokes, mostly dirty; nude photographs; postcards, to be sent sealed, of buxom dairy maids whose panties could be opened behind; packs of cards that looked quite normal but showed some awful things when held up to the light; mannequins that voided all sorts of stuff when pressed on the belly; and more of the same. The old gentleman had undoubtedly long since forgotten the things lying in that drawer, or he would certainly have destroyed them in good time. (Musil 2011, p. 126)

Nowadays, the risk invoked in Musil’s quotation probably lies in the storage spaces of our machines, as well as in clouds or various applications that can be traced by recovering certain passwords. The same address and a similar password, and one can discover data that should have remained hidden. Even knowing that it is possible to find out whether an e-mail address appears in data leaked from this or that application, it is difficult to think in advance of all the data that may have been left in a particular place. It is therefore impossible to have complete control over the data and documents one wishes to transmit. One can nevertheless imagine that transmission is optimized with methods of classifying what is transmitted, with the possibility of also using a search engine in addition to tree navigation. But not all professions are equal at this level: knowledge workers, and especially researchers and writers, are large producers of documents. Some may be tempted to destroy manuscripts and drafts; increasingly, however, these drafts and versions are found on hard drives or in cloud spaces. The writer and publisher François Bon rightly wondered whether he should consider transmitting his hard disks, since it is ultimately a question of preserving not only documents but also instruments.

– Reactive hyperdocumentation brings the past back to life by relying on the first level described above. In some ways it consists of making sense by linking documents and data together so as to make the departed reappear. Audiovisual documents often carry greater emotional weight in this context, as they allow the deceased to be seen and heard again and again. If modes close to holograms are used, one then approaches the invention of Orfanik, who managed to “revive” the singer Stilla:

By means of glasses inclined at a certain angle calculated by Orfanik, when a light was thrown on the portrait placed in front of a glass, La Stilla appeared by reflection as real as if she were alive, and in all the splendour of her beauty.3 (Verne 1900, p. 209)

As with celebrity news stories, it is also possible to include documents that talk about the deceased, whether they precede or follow the death. At this level, these are elements that can now accompany research archives, especially when it comes to reconstructing the originality of a researcher’s thought. Tributes, writings and colloquia that mention a deceased author or researcher are part of this post-mortem hyperdocumentation, whose challenge is then to successfully connect the splintered fragments that may exist. Bibliographic analysis tools can be used in this context to measure the continuities and influences of an author or researcher years after his or her death.

– Total hyperdocumentation comes from science fiction: the objective would be to capture all the moments of an existence in order to store, preserve and reactivate them. One obviously thinks of the Total Recall project, but also of the idea of producing a film of one’s own existence, as in the film The Final Cut, in which Robin Williams plays a documentary filmmaker of the deceased’s existence, working from an implant that has captured every moment. But within this framework we ultimately find logics close to the first level, where it is often a question of making a selection that sets aside the most difficult, even the most ignoble, moments. The idea of the life film, as a “document of life” in Marie Martel’s sense, in any case hardly manages to go beyond the second level of a reactive hyperdocumentation. Total hyperdocumentation presupposes, somewhere, an externalization of thought through a complex documentary and informational system. It cannot really be a reduction, and its examination requires a long time. If we think, of course, once again, of Paul Otlet and his Mundaneum, which we know is also, in more than one way, an Otletaneum, these hyperdocumentary forms fascinate science fiction as much as they do many Silicon Valley actors.

Hyperdocumentation thus joins other, more complex destinies that run up against a number of limitations. According to Otlet, access to knowledge allows one to understand the world and oneself better, but also to rebuild the world.

7.5. Post-human hyperdocumentation?

The hyperdocumentary project conceived by Otlet questions the limits of what can be done with knowledge, as well as with the instruments that make it possible to manipulate it. To do so, it is probably necessary to go beyond traditional disciplinary boundaries, and much further still according to Otlet:

The greatest experimental discoveries were not made in the field of the ancient, well-recognized sciences, but in the frontier zones, the no-man’s-land of sciences. (Otlet 1935, p. 360, author’s translation)

From then on, the limitations are those of frontiers, as defined by posthuman theories that evoke the High Frontiers, the original name of the magazine Mondo 2000. The high frontier corresponded to a logic of space colonization and referred to O’Neill’s work (O’Neill 1976). By extension, the logic consisted in pushing back the limits of the impossible and the acceptable thanks to new instruments:

Modify human beings to solve enigmas. The ultimate enigma of the universe (why all this is, why it is the way it is) calls, pushes and directs more and more men today; it pushes, calls and directs them to focus all their efforts on science and philosophy in order to solve it. Speculative methods being impotent, new experiments are born. And those that modify the human being himself will enable him to conceive other relations between himself and things, and thus to deepen and enrich his notion of being. Thus the epic and adventurous life of man and humanity towards ‘the never seen’ and ‘the unheard of’ leads to a closer knowledge of the answer to the enigma. What would we know about the universe if America and the poles had not been explored, and if children playing with lenses had not suggested their lenses and telescopes to Huygens and Galileo? (Otlet 1935, p. 398, author’s translation)

For Otlet, the goal was to get closer and closer to knowledge and truth. We find this vision in the symbol of the Institut International de Bibliographie created by Otlet and La Fontaine, which is accompanied by the Latin motto “qui scit ubi scientia sit, ille est proximus habenti” (he who knows where knowledge is, is close to possessing it).4

Finally, Otlet combined an information-retrieval approach, the act of formulating queries and finding answers to them through knowledge organization devices such as those he wished to develop, with a more general logic: that of a quest for truth enabled by science and by the creation of new instruments.

Proximity then joins the possibilities of augmentation, so that the organology of knowledge tends to merge with the human organism itself:

A less absolute, but still very radical hypothesis would suppose that all the knowledge, all the information could be made compact enough to be contained in a certain number of works arranged on the work table itself, therefore within reach of the hand, and indexed in such a way as to make consultation as easy as possible. In this case, the World described in the set of Books would really be within everyone’s reach. The Universal Book formed by all the Books would have become very approximately an appendix of the Brain, substratum itself of the memory, mechanism and instrument external to the mind, but so close to it and so apt for its use that it would really be a kind of annex organ, exodermal appendix. (Let us not reject here the image that the structure of the ectoplasm provides us.) This organ would have the function of making our being ‘ubiquitous and eternal’. (Otlet 1934, p. 428, author’s translation)

This fusion between the document and the instrument, which Otlet described in the fifth stage of documentation as a condition of the ultimate stage of hyperdocumentation, is also understood as a logic that will allow another fusion, with man himself. It makes him a truly augmented man, endowed with Promethean powers not only to understand the world and access knowledge, but with demiurgic powers to remake the world, and it envisages a new cosmogony:

The ultimate problem of scientific knowledge: to know so well all reality, its beings, its phenomena and its laws that it is possible to disintegrate everything that exists, to reconstitute it, to order it in different ways. The ultimate problem of technology: One man having only to push a button so that all the factories of the world, perfectly adjusted to each other, start to produce all that is necessary for all humanity. (Otlet 1935, p. 390, author’s translation)

If Otlet conceived the improvement of the world as tending towards an ideal of peace and of access to knowledge for humankind, the new augmented human he envisioned seems somewhat modeled on his own ideals. The most important question Otlet finally raises is: who is this human?

For Otlet, it seems that the Promethean human, omniscient and omnipotent, is the one who finally manages to reach a higher stage, as in a form of quest. However, this sidelines the organizational processes with which Otlet was familiar, as well as the management processes that rely on people who “record” information and data in order to process them and make decisions. This obsession with a total and totalizing view, found in Otlet’s encyclopedic approach, particularly in his museum projects with didactic aims and in his description of the need for real-time accounting statements that embrace the whole situation, refers to a managerial ideal whose most emblematic artefact today is probably the smart city.

A documentary ideal that merges the map and the city, the smart city becomes a new milieu that requires new ways of living. Above all, it requires finally going beyond Otlet’s desire to develop a science of the document, a documentology, and correlating it with other studies: an organology, as a study of various organizational and institutional forms; a mechanology, as a study of instruments and devices; and, strange as it may seem, a cosmology, as a study of the monads and representations of the world that are being constructed. Henceforth, hyperdocumentalists are not mere observers but programmers. However, one must understand what the programming logic is, its presuppositions, its underlying and inherited forms, its conceptual and encoded architectures, the forms that one tries to impose and replicate. Cosmology here comes to study ideology and its variants as they may have been written, programmed and, increasingly, “engrammed”.

The point is important here, since the risk of an apparently optimized functioning is that it reinforces effects of self-evidence and the discourses that naturalize technical devices and objects. On this point, Flusser was a forerunner in identifying the main actors of existence design:

The real post-industrial society – bureaucracy – will be automated and invisible and efficient. Human and automated officials will function as the cogs inside a black box. On the surface, it will be the black box programmers who will govern. Technocracy. In reality, it will be the machines themselves that will govern. The programmers themselves will be programmed to program the devices, and they will be programmed by other devices. In reality, it will be a ‘trans-human’ society. (Flusser 2019, p. 68, author’s translation)

While we had previously wondered about Homo documentator, sometimes a professional documentalist, sometimes a simple individual documenting his or her life, Flusser considers that technocrats (who must be distinguished from mere technical executors) are these new men, homines novi, especially when they turn out to be real programmers, aware that they have built a virtuality in which they are the players who manipulate programs, codes and symbols. They ultimately correspond to the new hyperdocumentalists of our existences, documenting the existential process of individuals, but also “programming” them, sometimes without their knowledge.

  1. Martel, M. (2011). Le bibliothécaire de soi, le document d’une vie. Bibliomancienne, May 16 [Online]. Available at: https://bibliomancienne.com/2011/05/16/le-bibliothecaire-de-soi-le-document-dune-vie/.
  2. Cover of Time Magazine, December 25, 2006. Vol. 168, No. 27/28.
  3. “We find ourselves somewhere between science and the imaginary with the realization of ‘ghost machines’” (Bauduin and Berton 2015, author’s translation).
  4. The ancient proverb was brought up to date by Ferdinand Brunetière, who used it in 1898 in his Manuel de l’histoire de la littérature française to refer to the bibliography of the works on which his book is based. Brunetière, F. (1898). Manuel de l’histoire de la littérature française. C. Delagrave, Paris.