3
Hyperhuman or Hypermachine?

By the machine, man makes up for the lack and failings of the organs of his body. By the written word, he makes up for those of his spirit. By the one and by the other, he equals himself, he becomes fungible and universalized. (Otlet 1935, p. 374, author’s translation)

The machine is often presented as an instrument to improve organizational efficiency and increase production. In documentary logic, it is a question of improving documentary processing and the documentary circuit in an almost Taylorist or Fordist manner, if we take up the descriptions given by Paul Otlet or Suzanne Briet. The aim is clearly to improve documentary processes by drawing inspiration from the high-performance tools then being developed in the industrial and administrative spheres:

In front of our eyes, a huge machinery for intellectual work is being built up. It is constituted by the combination of the various existing particular machines of which, in spite of the individualism and particularism of the inventors, the necessary connections are glimpsed. This machinery is today almost exclusively at the service of industry, commerce and finance. Tomorrow it will be put at the service of administration and scientific work, and then marvelous general results will be gathered from it. (Otlet 1934, p. 387, author’s translation)

The development of infrastructures for scientific and technical information appears more and more necessary at the time of the drafting of the Traité de Documentation. While Otlet cannot hide his enthusiasm in imagining such achievements for documentation, this very industrial vision nevertheless deserves questioning. This is what Robert Pagès undertook 10 years later. He sought to integrate machines into theory, beyond enchanted or, on the contrary, alarmed discourses. They are thus fully part of the documentology that he wished to construct as a theory or science of the document:

The first effect of the current industrialization of intellectual work (often inspired by industrial administrative techniques) is to alter the old intellectual craftsmanship of both the researcher and the librarian, and to transform both into fragmented workers, not individually holding their means of production, linked to an increasingly massive and complex physical and social machinery over which the majority of them have no control, but which is nevertheless susceptible to ‘planned’ organization as never before. In front of this ‘machinery’, there is no recourse to a consciousness or an intelligence that ‘exceeds’ it, because it is this machinery that is precisely the main and most efficient factor of the current human intelligence. Rather than moaning about it or dreaming about lost paradises and Leonardo da Vinci (lost geniuses), it is better to analyze the process, in order to seek an adapted, effective reaction. (Pagès 1948, p. 63, author’s translation)

This loss of control, judiciously described by Pagès, raises the question of the relationship between man and technology, and in particular man’s relationship to machines. Otlet also considers organizations as machines. Although he sees exciting prospects in them, he warns against many of them.

In the first place, there are the machines instrumentalized for war:

Living next to these machines becomes frightening, transforms existence into a nightmare. (Otlet 1935, p. 460, author’s translation)

Secondly, he considers that certain organizations or administrations can be formidable machines:

The administration then appears as an enormous apparatus, a machinery that it is important to know how to control so that it does not in turn make itself master of its masters. (Otlet 1934, p. 351, author’s translation)

It therefore seems opportune to think of hyperdocumentation within a systemic framework that questions and categorizes the different types of machines in which the processes that compile and manipulate data and documents take place.

3.1. Desiring machines?

What machines are we ultimately talking about? If Otlet considers the book and the library as forms of machine whose processes can be divided into successive tasks or stages, it remains to identify more clearly the systemic objectives of these “devices”, these “machinic arrangements” that ultimately precede the technique itself.1 The challenge lies precisely in the anima or breath that impels the arrangements and gives life to the whole.

Deleuze and Guattari had put forward the existence of “desiring” machines to describe these industrial and informational devices that developed in the second half of the 20th century:

Desiring-machines are binary machines, obeying a binary law or set of rules governing associations: one machine is always coupled with another. The productive synthesis, the production of production, is inherently connective in nature: ‘and...’ ‘and then...’ This is because there is always a flow-producing machine, and another machine connected to it that interrupts or draws off part of this flow (the breast–the mouth). And because the first machine is in turn connected to another whose flow it interrupts or partially drains off, the binary series is linear in every direction. Desire constantly couples continuous flows and partial objects that are by nature fragmentary and fragmented. Desire causes the current to flow, itself flows in turn, and breaks the flows. (Deleuze and Guattari 1983, p. 5, author’s translation)

If there is an expressed desire, it is in the quest for data flows, without which the systemic cannot be operational. Here Deleuze and Guattari describe a phenomenon of naturalization that confers on machines human and animal qualities that are, after all, basic. This vision is not totally different from Flusser’s description of a media animal with similar capacities and almost infinite digestive properties, a Vampyroteuthis like a media Cthulhu:

Until all life on Earth disappears once again, whether by catastrophe or because the general tendency of nature towards disorganisation will re-establish itself. This is an infernal, vampyroteuthian vision. Such is the ‘model’ of life proposed by biology, and with this model it prepares itself to initiate a new ‘industrial revolution’, the one of biotechnology and of genetic engineering that will replace inanimate machines with animate machines and apparatus. Thus this model is nothing but the skeleton of the vampyroteuthian being-in-the-world. The present fable tried hard to cover this skeleton with the muscle of human suffering, with the fluids of human desire and the nerves of human sensibility and intelligence. And it is thanks to this rich biological model that the present fable hopes to be able to exorcise Vampyroteuthis, and to make him emerge alive. (Flusser 2001, p. 124)

Yet the naturalization of the device is also a means of defending the system and its legitimacy. If Deleuze, Guattari or Flusser describe an organic system close to the digestive system, this implies a metaphor that is not very flattering. However, current discourse tends to naturalize the system in order to make of it a quasi-ecological argument, or one inscribed in sustainability. Better still, the ideal is then to bring the machine as close as possible to physical and bodily activities, to the point that it can no longer be distinguished from them. Finally, the ultimate point of the tension between increase and reduction occurs in this objective of “indistinction” between the different actors and players, between machines and living beings, since the man-machine continues to molt.

How can we finally qualify the processes that couple systemic flow logics with bodily logics via sensors that allow us to record this fluid mechanism?

This man-machine coupling concerns monitoring as much as it does territories close to the unconscious, or at least the territories of desire. It is difficult, moreover, to qualify these devices fully as desiring machines, even if the necessary flows are present. If we take an example of such machines, the pornographic platforms that instrumentalize the indexing of desires, they are desiring machines in a specific sense insofar as they are both aspirating and inspiring. Aspirating, because the objective is to capture, and thus suck in, a maximum of video content, not always legal, in order to propose a choice, a catalog of viewable possibilities. This is the basic model of Web 2.0.

But the goal is also to be inspiring, that is to say, to become able to promote ways of seeing and thus produce trends or at least new ways of describing the world. In this, we find a classic extension of advertising methods and strategies that consist equally of impelling, generating and detecting trends, to the point that a whole sector of activities and professions is now devoted to these issues, as Devon Powers shows (Powers 2019). But the objective is to go further in the indexing of desires, as we will see in Chapter 8.

The monetization of desires, including the most hidden ones, constitutes an extremely serious territory that deserves analysis. Here, the machinery consists more and more of getting the user to participate so that he does not remain a simple video consumer. Live services are integrated into the platform, allowing live viewing or even participation. The goal is to imagine a continuous production in which one is likely to take part, either through fantasy or through active participation.

Once again, this seems to remain within the Deleuzian descriptions:

The rule of continually producing production, of grafting producing onto the product, is a characteristic of desiring machines or of primary production: the production of production. (Deleuze and Guattari 1983, p. 7)

The challenge is to produce vitality around the machine.

3.2. Typology of hyperdocumentary machines

What remains to be studied is the new informational and media ecosystem formed from different machines and their organology, which is that of hyperdocumentation. Undeniably, the evolution of machines and their current study presupposes a renewal of analysis that requires not only observation but also participation in their design, especially for those involved in digital humanities or digital studies. Bernhard Rieder thus sees the need for a new “mechanology” (Rieder 2020) that follows on from the work of Gilbert Simondon. We therefore consider that it is as necessary to realize these machinic arrangements as to study them, according to new variations that can be based on the following typology:

  • Delirious machines: in the line of the logic of desiring machines, of which Deleuze tells us that the fact that they can be derailed is part of the overall logic, it is tempting to envisage delirious machines that lose all rational sense, at least in appearance. Several cases can be found here. First of all, that of the machine that ends up doing something other than what it was initially asked to do and that finally steps outside the initial programming logic. Sometimes symbolized by the computer HAL, the machine changes its logic of use to operate a reconfiguration of possibilities that seems more favorable to it, without this necessarily being the case for the human. The machine can, moreover, become delirious yet remain perfectly rational, even hyperrational, by making choices that could seem contrary to ethics or at least to morals. The delirious machine may be a machine to which too much power has been delegated in the decision-making process, to the point that it makes decisions that are potentially dangerous to humans. Algorithmic processes are used to define decision-making criteria that have been established by humans. But the multiplication of criteria, and the possibility sometimes offered of improving sorting processes through machine learning, ends up distancing humans from the decision-making process. The delirious machine is also the machine that, etymologically, deviates from the furrow (lira in Latin) initially traced. It leaves the writing process, reworking the boustrophedon to trace its own furrows and set its own limits.
  • Delivering machines: the case here is a priori the opposite of that of the delirious machine, which ends up freeing itself from the initial conditions imposed on it. The delivering machine returns to the very principle of the machine that increases man’s potentialities. It frees up time for the mind or for physical exercise. We thus join the lineage of intellectual machines and of projections around workstations. Delivering machines are never guaranteed to remain so: everything depends on the evolutionary potential of the machines themselves and of those who use them. Gilbert Simondon shows that such objectives are only possible through the establishment of associated environments, that is, environments sufficiently open and delineated to avoid phenomena of manipulation and proletarianization. Bernard Stiegler showed that associated environments could increasingly be short-circuited to produce dissociated environments (Stiegler 2006).
  • Derivative machines: placed by essence in derivation, these machines end up so far removed at the end of the process that we no longer really know where they are located, to the point where they end up turning the process around, sometimes becoming new centralities. Derivation places them at the margin and then allows a salutary bifurcation that leads to forms of progressive autonomy, offering innovation new avenues that can change the initial situation to the point of producing new centers that gradually manage to capture data. Search engines were formed in this way, as a derivation of the Web. The derivative machine is conceived as a kind of measuring and locating instrument, like a paper index placed at the end of a book, or even outside an encyclopedic work, such as the index produced by Jean Hautfuney and his team on the Speculum historiale of Vincent de Beauvais (1184/94–1264). In the case of the tabula by Jean Hautfuney (13.–1358), its transformation into a printed version took the form of an integration into the printed edition of the work of Vincent de Beauvais. The innovation carried out at the margin allowed its integration with the original work in a form renewed by printing. The derivative machine ends up producing a new, derivative and augmented work. Bibliographies of the universal type functioned in the same way in relation to documentary flows, constituting themselves in the margins of tedious bibliographical work, but ending up producing a new centrality, because the bibliography can thus constitute a starting point, a new centrality in the accessibility of knowledge. The Bibliotheca universalis of Conrad Gesner (1516–1565) represents this phenomenon well by responding to several initial “desires”: that of the greatest possible accessibility and that of saving time. 
Library catalogs produced the same effect, with a reinforcement made possible by the development of a new knowledge environment: the physical place then constituted by the library. It should be remembered that the word bibliotheca was most often used in these early applications to designate not physical places but lists or directories of books, which we would translate today as “bibliography”. The derivative has produced a centrality that has materialized. Search engines took the same path, so that machines like Google reversed the initial problem to the point of becoming a control point, whereas the engine was initially only a derivative machine. Today, it is centralizing, and in turn produces derivative machines that it tries more or less to control through its APIs. But it is not unthinkable to imagine that other derivative machines could in turn overturn the centrality of the moment. At the informational level, the flip-flop movement of the index shows that centrality moves each time from content to metadata, which Michael Buckland explains well:

The first and original use of metadata is for describing documents, and the name metadata (beyond or with data) along with its popular definition, ‘data about data’, are based on this use. A second use of metadata is to form organizing structures by means of which documents can be arranged. These structures can be used both to search for individual documents and also to identify patterns within a population of documents. The second role of metadata involves an inversion of the relationship between document and metadata. These structures can be considered infrastructure. (Buckland 2017, p. 120)
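
To fix ideas, Buckland’s inversion can be sketched in a few lines of code (an illustration added here, not drawn from Buckland; the documents and field names are invented): the same metadata that describes individual documents, once inverted, becomes an organizing structure that is queried first, before the documents themselves are ever touched.

```python
# Role 1: metadata describes individual documents ("data about data").
documents = {
    "doc1": {"title": "Traite de documentation", "author": "Otlet", "year": 1934},
    "doc2": {"title": "Monde", "author": "Otlet", "year": 1935},
    "doc3": {"title": "Transformations documentaires", "author": "Pages", "year": 1948},
}

# Role 2: the same metadata, inverted, becomes an organizing structure,
# an index leading from values back to documents. The relationship is
# reversed: one no longer asks "what describes doc1?" but
# "which documents match 'Otlet'?". The index now serves as infrastructure.
index = {}
for doc_id, meta in documents.items():
    for field, value in meta.items():
        index.setdefault((field, value), []).append(doc_id)

print(index[("author", "Otlet")])  # -> ['doc1', 'doc2']
```

The inverted structure can also reveal patterns across the whole population of documents (all authors, all years), which is precisely the second role Buckland describes.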

More simply, it is possible to produce, if not derivative machines, then at least the means of obtaining information by derived or diverted means. Health monitoring applications that seek to alert users that they have been in contact with a virus carrier can lead to many drifts, which are not attributable to the creators of the application or to the authority controlling the device. It is thus possible to know whether a future employee is potentially affected by placing a specific phone nearby during the interview.2 These examples are important because they demonstrate that it is very often possible to retrieve information or personal data without necessarily being the creator or manager of the device.

Professionally, the fact of derivation makes it possible to envisage evolutions in terms of professions. The profession of documentalist is thus a derivation of that of librarian, and according to Pagès, the derivation is such that the librarian is only one type of documentalist. It is therefore possible to envisage new derivations that will allow higher levels of competence to be reached:

From the very fact that he does not have to create new documentary ‘content’ (of an empirical, extra-documentary origin), the documentalist is led to create ‘content’ of an ‘interdocumentary’ type. Because of the symbolism which he has created (e.g. classification), he may become a researcher of a superior type – combinatorial. But this could only be through a lateral outcome of his current principal function as distributor, an outcome which would surely be accompanied by a division of labor. (Pagès 1948, p. 62, author’s translation)

If Pagès describes here the development of complex and potentially more efficient intermediate tools, it is indeed the idea of developing a combinatorial art that allows the creation of inter-documents (like punched cards) that can be combined afterwards. Robert Pagès had developed an entire system in this sense with CODOC, or coded analysis, which he created in 1954 to apply in his laboratory. This combinatorial artificial language was inspired by the work of Georges Cordonnier and his Selecto system, a system for the optical reading of perforated cards. The cards work on a mechanographic logic.
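
The selection logic of such perforated-card systems can be sketched as follows (a hypothetical illustration; the card names and descriptors are invented, and the actual Selecto device worked optically, not in software): each card carries a set of descriptor punches, and a query superimposes criteria, retaining only the cards punched for every criterion.

```python
# Each card is a document bearing a set of descriptor "punches".
cards = {
    "card_A": {"chemistry", "1947", "report"},
    "card_B": {"chemistry", "1948", "article"},
    "card_C": {"psychology", "1948", "article"},
}

def select(cards, criteria):
    """Keep only the cards punched for every criterion: the software
    equivalent of light passing through superimposed perforations."""
    return sorted(cid for cid, punches in cards.items() if criteria <= punches)

print(select(cards, {"1948", "article"}))  # -> ['card_B', 'card_C']
```

Combining descriptors by intersection in this way is what makes the language combinatorial: new queries, hence new inter-documents, can be formed without creating any new primary content.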

Today’s semantic documentary products enable the creation of new forms of inter-document that link data and document repositories from different institutions and organizations. It remains to be seen who are and who will be the leaders of these metadocumentary machines, which always end up imposing a form of centrality, that is, a reflex starting point for searching for information that is fragmented across several places. Although there are several players in the field of scientific information retrieval, there is also a strong presence of major web leaders such as Google, who sometimes manage to produce more powerful tools. One thinks in particular of the recent tool that lists open datasets, which manages to list data sets more efficiently than institutional alternatives. The same goes for Google Scholar, whose metadata quality is sometimes questionable, but which has opted for exhaustiveness or near-exhaustiveness. While other tools provide more adequate metadata, it is clear that documentary work leaves more and more room for improvement in terms of scientific and technical information. All of this implies choices in terms of scientific and technical information, at both the political and professional levels. Finally, it is not so easy to become an indispensable entry point for information, even if it sometimes seems technically easy to harvest repositories.

Where this gets more complex is that becoming a centralizing machine implies costs, because accumulating documents or data is an expensive activity:

Accumulation introduces irreversibility: its cost is always high. It is never free. Competing with it therefore requires resources that are at least as considerable. But it is now possible (i.e. with the advent of computing) to make it more subtle. (Robert 2010, p. 50, author’s translation)

Robert specifies that this supposes a logistical rationale, which is undeniable, and requires metamachines, or rather megamachines, which are infrastructures. These infrastructures are places of storage (datacenters), but also places of calculation. To use Bruno Latour’s expression, they are gigantic centers of calculation.

Whatever the type of machine, issues of filiation, more or less accepted or well experienced, are played out and tied up here, if one wants to place oneself within the Deleuzian demonstration, which then foresees points of disjunction, what we would probably call attempts at disruption if we wanted to adopt a fashionable vocabulary or to refer to the work of Bernard Stiegler (2016).

The derivations can also be the result of a change of support or even a quasi-civilizational rupture. This does not mean that a tabula rasa takes place, on the contrary:

Finally, still on this same downward line, a higher-dimension intellectual technology can always reappropriate, include and use a lower-dimension intellectual technology without denying it as such: it is, in a way, a process of hybridization. Inserting a list into a printed book is less a disappearance of the list than a way of sublimating it. It is a mutual enrichment: the printed book takes on the value of facilitating the reproduction of lists that are now easily comparable; the list acquires a new value in the very efficiency of the comparison. (Robert 2010, p. 45, author’s translation)

It is here that we must remember that what resists and transforms, or even ultimately evolves, are those forms that constitute the archi-text, which must be understood as a medium of power as much as of knowledge. Archi-texts are the pre-written frameworks within which we exercise our actions of writing, communication and documentary production, the forms written by others in which we try to write in our turn (Jeanneret 1999).

Several players and companies are trying to secure their positions on centralizing devices while betting on derivative machines. This is the case of developments made possible by APIs. If we take the case of Twitter, most of its innovations have taken place within this framework. Except that the virtuous circle of the associated environment is never guaranteed. In fact, openness allows the externalization of new applications and the possibility of testing innovative potentialities without really assuming the internal costs. Several scenarios are then possible: purchase, if the functionalities are deemed promising and can then be integrated; short-circuiting, which consists in directly integrating the functionality; or the strategic limitation of access to the API, to force the third party to accept a deal advantageous to the platform. This strategy allows web leaders to minimize the risk of a derivative machine recovering forms of leadership. The API environment is therefore a semi-open environment, seemingly welcoming, but which can close on itself like a carnivorous plant.

Here, therefore, we see that the game is by no means fair, since leaders are in a position of strength to activate or control the flow of information and documents. They can all the more readily militate for strong accessibility of open public data since they are capable of developing powerful tools to exploit it. Google’s Public Data Explorer is exemplary in this respect, as it is far more powerful than the hundred or so databases spread around the world. The problem is that, in the opposite direction, we only have limited access to the data of Google and its parent company, Alphabet.

The universe machines that these leaders are trying to develop are a source of imbalance in our innovation potential. Their goal now is to anticipate potential competitors in order to be able to “swallow” them. However, there remains a limit to this model, which is that it is part of a pre-existing or at least pre-written model on which the derivative machines are based.

However, this strategy forces us to think with bases theorized and concretized by others. We find ourselves innovating with what is already in place, or even thinking with what we already know. Pierre Lévy showed in 1987 that the models that seek to internalize, mathematize and finally translate our knowledge universe into computerized form are in fact limited; they do not necessarily allow the production of new knowledge, and if they did, it is not certain that we would be able to recognize its true value, because of our theoretical models.

The time of facts takes place within a problematic, while the passage from one world to another punctuates a temporality of another order. Following Heidegger, let us call historical the time of effectiveness or calculation within the domain controlled by the same transcendental, and historial that of the modification of problems:

  • the historial designates the mode of alteration of the transcendental region, the successive choices between the possible worlds;
  • the historical describes the sequence of operations within a chosen world.

But this last formulation of the difference of temporalities is incorrect: the transcendental is properly unselectable, indeterminable, since it does not pre-exist its advent. A problematic, before it actually projects a world for the subjects it informs, is rigorously impossible, because it is not contained in any enumeration, however infinite, and is not determinable from any combinatorial of simple elements. The historial lies in the gap between the various ways of questioning being. It suggests the coexistence of the impossible, the time of the incalculable. The historical, on the other hand, indicates the dimension of time in which events are interdependent and the impossible never happens. (Lévy 1987, p. 28, author’s translation)

It is not certain that quantum computing can change the game for all that remains incalculable, at least not by machines. As for artificial intelligence, all too often human hopes are placed in it, when its devices are based on patterns, require masses of human traces to function, and demand a great deal of human regulation as well.

3.3. Towards hyperdocumentality?

Documentality shows that the document is an essential element for human societies. The example of our daily use of cell phones, writing and recording devices that multiply documentary forms and the massive production of metadata, shows that we are now in a regime of hyperdocumentality, as most of our actions are recorded, from the most futile to the most intimate, not to mention our actions as professionals or citizens on various information systems.

Hyperdocumentality joins hyperdocumentation by constituting this breeding ground in which the least of our actions is recorded, at least potentially, because we try as best we can to resist. Our permeability to the recording of our actions tends to increase over time for reasons that are based as much on habit and on a growing acceptance that is linked to the presence of services on a daily basis, but also ultimately to the discretization of surveillance potentialities.

Consequently, the powers exercised there are complex, as they are deployed amid a growing overall distrust that questions the most traditional forms of power and is beginning to reach the new ones, especially those of the big web companies, particularly in view of the various revelations that have taken place. Attempts at legislation and the new conditions proposed to Internet users prove to be the source of a fable of consent rather than of its real fabrication. We do not really consent, even if the general conditions of use are quickly pointed out to us and finally imposed if we want access to the information and the service. From a constraint of compliance with GDPR-type rules for web actors, we have moved to a constraint of signature and accessibility for users. Worse, the newly signed conditions are sometimes more restrictive than the previous ones, those that were cordially ignored.

Mistrust and even distrust are not sufficient resources for serious resistance, unless we wish to see the triumph of generalized doubt, of conspiratorial thinking that goes in all directions and aims to put everything on the same level, of absurd thinking, delirium, jokes, lies and, finally, all other less-than-rational thoughts. The current documentary and informational regimes mix different approaches. An examination not very far removed from the time of hyperdocumentation finds it difficult to distinguish what comes from the “old” regimes and what from the new ones, whose novelty seems so incessant that it constantly calls the established powers into question. However, if we look from a little more distance, we can better perceive the divisions and continuities, the resistances and the real turning points.

  1. Deleuze and Guattari even specify that the machine should not be reduced to a tool: “Desiring-machines are not fantasy-machines or dream-machines, which supposedly can be distinguished from technical and social machines. Rather, fantasies are secondary expressions, deriving from the identical nature of the two sorts of machines in any given set of circumstances” (Deleuze and Guattari 1983).
  2. A group of researchers has thus produced a document explaining the risks of tracking applications, even when they are dedicated to public health issues. Available at: https://risquestracage.fr/.