
1. Digital Evolution

Coevolution of Humans and Technology
Anastasia Utesheva, Mullumbimby, Australia

We are a product of what we create, as much as what we create is a product of us. The last two centuries have brought into sharp focus a curious relationship between humans, cultures, ways of thinking, and the technology that we create, appropriate, and normalize into our lives.

Digital technology is evolving at such a phenomenal rate that shifts in global social dynamics can now be seen in less than 12 months. There is increasing public recognition of the exponential nature of digital progress and innovation. The most obvious example is Moore’s Law: the observation that, since the 1960s, the number of transistors in a dense integrated circuit has doubled roughly every 2 years while the cost of computing has halved. Along with the changes in the price-performance of computation (i.e., as the cost per unit of production decreases and the number of computations per dollar increases), more and more powerful digital technology is becoming globally available and adopted. Our mobile phones are more powerful now than supercomputers were in the mid-20th century. They are also far cheaper and more accessible. The social changes due to this trend of technological advancement are vast and often overlooked as the use of digital technology becomes an ever more constitutive part of our everyday lives.
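
To make this compounding concrete, here is a minimal sketch of idealized doubling (the starting transistor count, time span, and strict two-year period are rough illustrative assumptions, not measured data):

```python
# Idealized Moore's Law: transistor counts double roughly every 2 years.
# All figures below are illustrative assumptions, not measured data.

def doublings(years: int, period: int = 2) -> int:
    """Number of doubling periods that fit into a span of years."""
    return years // period

start_count = 2_300        # transistors in an early-1970s microprocessor (rough figure)
years_elapsed = 50         # early 1970s to early 2020s

growth = 2 ** doublings(years_elapsed)
print(f"Growth factor over {years_elapsed} years: {growth:,}x")
print(f"Projected transistor count: {start_count * growth:,}")
# ~33-million-fold growth: the 'exponential' in exponential digital progress.
```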

Now we carry around more computational power than we once thought possible, and what we find ourselves doing with it has been equally surprising. We are now able to reach, see, create, and influence more than at any previous point in history. Digital media reflects our increased connectivity and our adaptation to new forms of being. We now live in a time where information is more obviously significant in our lives than ever before. It is what we crave, what we go to extremes to get, what we base our sense of self and identity on, and what we build our habits around. This increased desire for connectivity has enabled us to explore ourselves more fully than ever before, and our exploration of ourselves has aligned our desire for self-expression with the desire to design our technology to suit our needs (rather than to constrain us).

To comprehend the how and why of the social trends we embody and design for, we need to look at the changing nature of the relationship between digital technology and humans. Such changes are not limited to digital innovation or the increasing complexity of information. Global social dynamics are becoming more complex as a result of technological progress. We can now see and feel change within one generation. These are changes that go beyond the superficial and to the core of who we are. Lingering assumptions of gradual progress or long-term stability are becoming increasingly difficult to maintain. We constantly witness the emergence of social, technological, and cognitive paradigm shifts that reflect the exponential rate of progress and, all too often, the misinformation by-products we become wrapped up in. We now have more information to access, more to comprehend, more to create than ever before. And we cannot do this without technology. We cannot evolve in the way we want unless we comprehend the role that technology plays in our latest evolutionary step.

How does this all relate to designing products?

Types of Information

To design for a user is to design for a complex, almost foreign perspective of someone else or “other.” In order to do so with any success, we, as designers, need to look beyond the superficial and see this individual as part of a complex system: a system of interrelated and constantly evolving information types.

To help untangle the different types of information that constitute our new technologically enhanced selves, we need to look at four different, yet interrelated elements of our complex reality: biology, technology, culture, and cognition. These four types of information constituting our being are important to keep in mind in digital product and/or service design, as we are not static beings that stay the same over time. We are in fact the opposite, and to forget that is to design products and/or services that simply do not resonate or add much value to users’ lives (or not for long).

How might we design for purposeful and meaningful change and, most importantly, assume our full responsibility in creating and driving these changes?

We first need to immerse ourselves in the patterns of information we see evolving and deeply comprehend how and why these changes occur.

Let’s look at an example to illustrate this point. Consider the following technology interaction: a 37-year-old woman from New Zealand is playing a game on her phone. If we just look at the surface, we reduce our analysis of the interaction to what we can literally observe in front of us. Our view here: it’s simply a person who is interacting with a product. Adopting this reductionist view, we do not take into account that the user is female, of a certain age and nationality. We reduce our perspective of the interaction from someone who has a rich history, intent, and purpose to someone with thumbs who can interact with the phone. If we then only focus on functionalities within the game without considering who is playing and why it appeals to them over another game, we do product design a disservice again. At best, we know what the game does and how someone can play it, and design within these parameters of our focus. At worst, we make all the wrong assumptions when we redesign or change the game, so that our design decisions make the game less appealing to this human, rather than more. If we do make this game more appealing, are we using positive reinforcement that betters the human, or are we using patterns of addiction to make it harder for the user to stop engaging with the game? Limited perspectives on evolving phenomena typically skew product design in an unhealthy and/or unethical way, and may be a significant contributor to how products become obsolete and/or become stuck on a path of incremental, linear change.

Alternatively, we can look at the user as an evolving being on a journey that is unique to them, acknowledge their history, and strategically design a product/service to enhance their journey rather than hinder it or unintentionally skew their behavior and/or perception. We look at the woman playing the game and ask: why is she doing it at this time, in this way? We would ask her to reflect upon her choices, help us (and likely her) gain a deeper insight into her internal decision-making processes, and evaluate her feelings toward the product and its place in her life. From there we would be able to “get to” the root cause and effect of her interacting with this product, where it fits in her life, and how she feels about it. Unravelling core details about the lives of those we design for is fundamental, as the more we know about them and how they came to be, the more we are able to design products around their core drivers, the core themes that govern who they are and form their sense of self, and the identity that they develop to comprehend and operate in the world. More importantly, we can begin to answer the questions: how did this person come to be who they are (and not someone else)? How might this product/service fit into their lives? How might we design a better product and/or service that fits their specific needs and wants?

To consider these crucial details and use them effectively in design, the exploration of the four different evolutionary streams of biology, technology, culture, and cognition is a useful starting point. Biology refers to the physical form of living beings. Technology refers to any innovation that extends the “reach” of life beyond the baseline of biology. Culture refers to the information set pertaining to social interactions among members of the same and/or similar species, which was traditionally inherited through birth into a culture and is now converging into a single culture of the “digital global.” Cognition refers to the information sets and processes through which a being operates in order to comprehend and function in the world.

Biology is the most consistent of the four in its rate and method of change. Technology has been a constant companion of our species, enhancing our reach beyond what we may otherwise biologically be limited to. Technology here refers to both the material (e.g., hardware, tools, machines, etc.) and the immaterial (e.g., language, class structures, social norms, belief systems, abstract reason, logic, digital, etc.). The line between biology and technology is arguably growing more and more blurry over time, as the boundaries between human and digital dissolve through their ongoing entanglement (i.e., continuous coevolution). Culturally and cognitively, humans have archives of historical data illustrating how and why we, as a species, have progressively been changing. The major shift in the 20th century that accelerated the observable changes has arguably been digital technology (the closest mirror to life that we have been able to create entirely through human effort).

Consideration of the exponentially increasing rate of change is important in untangling and comprehending the elusive causality of our reality, of life in an increasingly complex world. We can now observe the intentional way in which humans are placing less emphasis on traditions and locales, and instead connecting and evolving through digital. So much so that this is starting to blur the lines between the digital world and the material world that birthed it. Without a stable cultural base, we have also become free to think in ways unseen before and to explore and create new ways of thinking.

In a lot of ways, digital was the atomic bomb that converged all culture, wisdom, knowledge, information, and, most recently, data. Since the wide adoption and proliferation of this technology, we have stopped being “local” and started to form our sense of self (and our identities) through interaction on the global stage. We now have a relatively uniform pastiche of culture we can relate to no matter where we are geographically. A lot of localized traditions have died out or have become integrated into the global set. We are now united through common values, ethics, and preferences, rather than by the minor details in which we enact them. What this means is that we no longer embody the disparate cultures we were born into and instead pick and choose what we want to uphold and be. Through this process, we have become more reliant on our identity (i.e., the set of attributes we use to classify ourselves) in the way we perceive ourselves and make sense of what happens.

Because identity has become so important to the modern human being, let’s get a bit technical about how this new conceptualization of identity fits into our evolution. Let’s assume for a moment that information is constantly evolving. There are four types of information that converge as one human being: biology, technology, culture, and cognition. In essence, the human sitting near you using a mobile phone is the convergence in the material of billions of years of this evolutionary process. As are you. Pretty incredible to think that we are what this evolutionary journey has culminated in thus far. What we do with it once we have had this realization, well, that’s the true test of intelligence.

The core assumption in this perspective is that information evolves over time in different forms through different media. As its complexity increases, information transforms into new forms. As information is refined over time and across media, the patterns that remain the same, and those that differ, become the focal point. For example, humans have communicated with one another for centuries, yet this century is the first time that they can do so over digital text to someone that they may never meet in person. What is the same here is the need to communicate. What is different is how and to whom. The why may be the same or may be different. We won’t know unless we explore the details of each interaction and compare the core drivers, themes, and value of the experience.

To comprehend the why, we need to keep in mind that the reality we live in is relational. As it is relational, we cannot assume that there is a single “truth” out there, some unchanging dataset that always was and always will be. Most of what we, as humans, have considered “truth” turned out to be nothing more than a collection of fragmented information generated from obscure and often heavily biased sources (usually an “authority” of some kind). We as a species, in the last 60 years in particular, have invested a significant amount of time and effort finding our way out of the global misinformation matrix we created. We came to see that the reason we propagated and mandated biased/linear assumptions (even to the point of threatening life) is that, historically, we humans have skewed our perception of reality to benefit the few at the expense of the many. This is why, in our most recent focus on liberating information, it has become a common pattern in most cultures that what we consider “fact” today may be considered “false” tomorrow. In the digital age, there is no longer such a thing as a “single source of truth” and its accompanying concept of “authority.” We now look for “experts” (i.e., those who have spent considerable time growing and refining a knowledge base for the benefit of all), knowing that “authority” is a fabrication and has, historically, often been skewed by a few to attempt to control others.

“Truth” for a designer is not a hard unchanging fact written by one individual without explanation or explication; rather, “truth” is a measure of the resonance of someone with an idea, a feeling, or a concept. As symbolism and the symbolizing mind have been core technologies fueling the evolution of culture, technology, and cognition, the concept of “truth” needs to be viewed as emerging from a specific context and the local variations resulting in it. Hence a focus on change and historicity is key to design. What we design today may not be useful tomorrow. Or how we imagine the future today may be the opposite of what tomorrow actually brings. However, if we look at patterns of change over time, we may become better equipped to adapt to change as it comes. To design for it, if not to fully predict it.

To learn how to design for evolving users, let’s adopt compatible relational evolutionary perspectives on information. Evolutionary perspectives focus on changes among patterns and on the fluctuations in context that constitute those changes (rather than linear perspectives, which attempt to predict the future based on information collected in the past, typically implementing linear assumptions). Evolutionary perspectives assume constant holistic change. For this reason, evolutionary perspectives are valuable as a basis for digital design for evolving users. This chapter presents such perspectives through the framework of Universal Darwinism, which provides us with a way to view how we evolve, where we came from, and how this came to be. We can then use this perspective as a basis for comprehending and designing for evolving users.

Universal Darwinism

The concept of evolution originated in observations of biological life on Earth, and its core mechanisms have since been applied to many areas beyond the original theory of biological evolution. The concept has consistently provided valuable perspectives on phenomena, accounting for relational causality over time. For example, you can study the evolution of building styles in a neighborhood, or how your chocolate chip cookie recipe changed over the years. You can look back through childhood photos and see how your body, your sense of style, and your feelings and perspectives have shaped who you are today. Evolution is the concept of change over time, such that the change enables (or constrains) adaptation to changes in the entity’s internal/external context. In a sense, the process enables the survival of the fitting (i.e., the most suitable to the broader whole).

The mechanisms and processes of evolution vary greatly among media (e.g., evolution of biology vs. evolution of the automobile), yet the overall pattern remains the same. This is why we can successfully use the meta-theoretical framework of Universal Darwinism to explore changes and patterns in all forms of media in a specific, analytically bound, context.

A meta-theoretical framework is a way to see similar patterns in different perspectives, which form the assumptions and mechanisms through which phenomena are examined and interpreted (and designed). It is a template of sorts that we can use to select different evolutionary theories and see whether or not they are compatible (i.e., whether they operate from the same set of mechanisms and/or assumptions, and can be used together). Let’s start with a brief overview of the framework, focusing on its core mechanisms and common misconceptions.

Popularized by Richard Dawkins in 1982, the term Universal Darwinism refers to a variety of approaches and theories that extend the original theory of evolution beyond its application to the domain of biological entities on Earth. Universal Darwinism focuses on change in entities of a certain type over time and in their context. Although the mechanisms for the specific entity evolving may be different (e.g., biological change happens very differently from changes in clothing choices in a fashion magazine), the mechanisms constituting both patterns of change are similar.

Universal Darwinism posits that evolution occurs in any group of entities if there are mechanisms for introducing variation, a consistent selection process, and mechanisms for preserving and/or propagating different patterns in the “population.” Universal Darwinism can be applied in any domain as long as the patterns that constitute the changes can be explained through the core mechanisms of evolution (in their most abstract form). A set of genes (i.e., biological information segments that constitute the formation of said entity) can be viewed to change through the same abstract patterns as a news article on the Web (i.e., digital information segments changing over time to shape the story)—both start off as a specific pattern at a point in time and, depending on what happens to them and other entities in that context over time, change in pattern to be most suitable to their context (survival of the fitting) or stop existing if they are no longer relevant or advantageous (extinction). Think of the evolution of a breaking news segment over time—it changes as the selected information and its presentation generate the emotional charge and the political spin of the story.
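
A minimal sketch of these three mechanisms (variation, selection, retention) operating together, assuming an arbitrary bit-string “pattern,” a toy fitness function, and an invented mutation rate, none of which the framework itself prescribes:

```python
import random

# A "pattern" is a bit string; fitness (a stand-in for suitability to
# context) simply counts ones. Variation: random bit flips. Selection:
# the fitter half survives. Retention: survivors propagate imperfect copies.

def mutate(pattern, rate=0.05):
    return [bit ^ 1 if random.random() < rate else bit for bit in pattern]

def fitness(pattern):
    return sum(pattern)

population = [[random.randint(0, 1) for _ in range(20)] for _ in range(30)]

for _ in range(50):
    survivors = sorted(population, key=fitness, reverse=True)[:15]  # selection
    population = survivors + [mutate(p) for p in survivors]         # retention + variation

print("Best fitness after 50 generations:", max(map(fitness, population)))
```

Whether the “population” is genes, news stories, or app features, this three-part loop is what the framework asks us to look for.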

It is important to note that Universal Darwinism itself is not a theory that can be used to explain or predict low-level phenomena (e.g., survival of a company or specific behavior of an individual), but is very helpful in comprehending changes in trends and patterns in specific sets of entities over time and to compare changes in very different sets of entities. In essence, we look for patterns of change in hindsight (rather than attempting to make predictions) and see patterns emerge that indicate the core mechanisms at play. Upon comprehending the core mechanisms, we can design accordingly to enable, evolve, or disrupt.

Indeed, specific theories and mechanisms for biological evolution cannot adequately explain or predict technological, economic, cognitive, or cultural evolution, and vice versa. However, they can provide a useful way to view change, especially when we look at change in what we can call a “composite entity”: part digital, part biology (e.g., a human user interacting with the Skype app on their phone).

In the digital product world, we are constantly having to draw boundaries between what is digital and what is not, and we often come across challenges in making the distinction. When a digital product goes from concept to implementation (i.e., a live website or app that humans can interact with), it literally transforms from an abstract idea in someone’s head, to a set of emails, to a set of whiteboard drawings and post-it notes, to a set of requirements/user stories, to a set of designs, to code, to a prototype, to a live product. Not to mention a whole collection of supporting documentation (and social affordances and constraints) that marks its evolution across media (human/analog/digital) over the course of the project. The humans who are the intended end users then use the digital product, and it changes again, with their feedback and with what the end users (and the product team) deem an “improvement” shaping subsequent releases. So what is the digital product in each of these stages of change? To meaningfully examine that, we need to look at the patterns of change in and of themselves (i.e., what remains the same and what changes over each instantiation).

To do so, we need to be a little defiant with traditional notions of linear causality (i.e., A causes B, B causes C, etc.). Rather than linear and simple cause and effect, we look at causality as relational and multiplicitous. An outcome or material instantiation of a digital product at a certain point in the project (e.g., a set of UI designs) is assumed to have many influences and mechanisms behind its emergence over something else that could have happened.

As such, Universal Darwinism adopts the principle of “universal causation,” which implies that every event has a cause (i.e., there cannot be an event without a cause) but the cause cannot be assumed to be linear, singular, or simple to identify. In metaphor, think of a wave in an ocean, rather than a trigger on a gun. It is important to note that this perspective does not undermine the potential for novelty or imply that every event is predetermined and predictable. Quite the opposite. The focus becomes tracking patterns of change back to influences/influencers that have more “gravity” (i.e., that are more prominent over time in that context and to which the most change can be traced). There may be more than one (e.g., a stubborn product owner pushing their own worldview, and budget constraints in that financial year) that skew the evolution of a digital product in a certain direction (e.g., to have a certain set of product features or look a certain way).

In the example of digital product emergence, we can see that what evolves over time is not only the product (across varying media) but also the humans involved (through their knowledge, preferences, and ability), the company or group they are a part of, and eventually the end users and the wider social dynamics the digital product constitutes. To meaningfully be able to see what works better and to design digital products that have positive social impact, we need to examine a complex and constantly evolving tapestry of changes. To make this analysis possible, these changes can be delineated for our purposes analytically into four types: biological, technological, cultural, and cognitive. Examining these four types, we need to focus on recurring patterns, especially when comparing phenomena at multiple levels of analysis and/or rates of change. Hence, the Universal Darwinism framework is helpful, as it provides a means to examine the symbiotic relationship of biology, technology, culture, and cognition through focus on similar patterns of change in different entities at different analytical levels.

Biological Evolution

At the core of some of the largest historical paradigm shifts in both scientific and social realms is the theory of biological evolution by natural selection as first proposed by Charles Darwin in On the Origin of Species (1859). Challenging the dominant assumptions of the time (i.e., those of fixed or constant typological “classes” in biology that accorded with an unknown divine plan), the theory presented a unified way to explain and track the observed diversity of biological life on Earth. Although the idea of evolution itself was not new (it emerged in many recorded forms, going back to the ancient Greeks), the major contribution of Darwin’s perspective was the overcoming of a powerful Platonic bias against the idea of descent of seemingly unrelated species from a common ancestor. Pre-Darwin, “essences” or forms of beings were assumed to be unchanging: a being couldn’t change in “essence” and new “essences” couldn’t be born. A reptile could not turn into a bird, no more than silver into gold. This perspective was problematic for explaining and classifying the large diversity of similar biological forms and any observed changes over time in already classified species.

To aid the classification of species and to better comprehend how different species came to be, Darwin (1859) proposed an alternative view. In his view, differences in biological forms stem from the process of natural selection, whereby traits that enhance the survival and reproduction chances of an organism become more common in subsequent generations. Darwin spent years studying different bird species in different neighboring habitats and observed that members of the same species looked and acted differently depending on the specific habitat he found them in. He also observed that the same species’ behavior changed over time, as one bird learned something new and others learned from it (e.g., changing bird song patterns). Darwin believed that the gradual process of accumulation of small changes, over enough time, could create speciation (i.e., the divergence of a genetic path to a “new” subspecies). This perspective sparked a pivotal transition in scientific thought concerning the very nature of biological change and our own historic journey of becoming human.

Human beings began to see themselves not as an entity that was “designed” by God and left unchanging over time, but rather as something that emerged through a long process of accumulated change from simpler life forms that came before it. Change that resulted from gradual adaptation to the broader changes to other life forms in our habitats. Although different, human beings began to see themselves once again as part of nature (along with all other life forms on Earth). As changes to our habitats affected our access to resources, those most capable of surviving and thriving despite (or because of) the changes multiplied in numbers, while those that were not able to do so as well decreased in numbers. Over a large enough period of time, large shifts in our form and behavior became observable, as any behaviors and/or characteristics that provided an advantage (no matter how minor) became amplified over time as they were “passed down” through subsequent generations.

Darwin also argued that multiple species can evolve from a common ancestor, each species exhibiting divergent heritable traits resulting from their unique adaptation to changes in context and their capacity to refine the advantageous traits through future generations. The same species of plant, over enough time, may look completely different and require completely different environmental conditions to thrive if planted in very different habitats. Darwin collected empirical evidence to illustrate how individuals within a species most able to adapt to contextual changes survive and reproduce, thus passing their adaptive advantage to subsequent generations in an iterative ongoing process (for detailed analysis and examples, see Dawkins, 1976). As this perspective proved very useful in explaining the patterns that Darwin’s contemporaries observed in the flora and fauna they studied, it was gradually adopted and later became the foundation for the principles of Universal Darwinism.

Darwin’s contribution to modern paradigms was to unify and apply the principles of variation, selection, and heredity/retention to explain biological diversity and change. Interestingly, the original theory of evolution was not based on genetics, as Darwin struggled to explain specifically how heredity occurs or why divergence of observable traits may occur in a contextually bound population. Despite this initial limitation, the theory continued to be developed to the form that is widely acknowledged today. Importantly, the work of James Watson and Francis Crick (1953) on the structure of DNA, Stephen Jay Gould and Niles Eldredge’s (1972) theory of punctuated equilibrium, Donald Johanson’s (1974) discovery of the oldest known hominid skeleton, Richard Dawkins’ (1989) conceptualization of behavior and cognitive habits as the “extended phenotype,” and the mapping of the human genome (2000) have extended Darwin’s original ideas and provided more detail on the specific mechanisms for the processes of variation, selection, and heredity/retention missing from the original.

Core mechanisms for biological evolution are now recognized to include natural selection, biased mutation (i.e., difference in probabilities of mutations occurring), genetic drift (i.e., change in the frequency of a gene variant in a population due to random sampling), genetic hitchhiking (i.e., low rate of gene recombination leading to “linked” genes being inherited together), and gene flow (i.e., the exchange of genes between populations and/or species). The outcomes of the process include adaptation (i.e., developmental patterns that enable or enhance survival/reproduction probability), coevolution (i.e., development of a matched set of adaptations, such as those of predator and prey), cooperation (i.e., development of mutually beneficial relations between species), speciation (i.e., divergence of a species into two or more descendant species), and extinction (i.e., disappearance of an entire species). Although these mechanisms originated in biology, they have since been applied in abstract form to explain change in other contexts, such as media studies, cultural studies, art, fashion, technology, architecture, social studies, and economics.
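
As a tiny illustration of one of these mechanisms, here is a sketch of genetic drift, in which a variant’s frequency wanders through random sampling alone, with no selection pressure at all (the population size, starting frequency, and generation count are arbitrary assumptions):

```python
import random

def drift(pop_size=100, start_freq=0.5, generations=200):
    """Track one gene variant's frequency under pure random sampling."""
    freq = start_freq
    for _ in range(generations):
        # Each individual in the next generation inherits the variant
        # with probability equal to its current frequency.
        carriers = sum(random.random() < freq for _ in range(pop_size))
        freq = carriers / pop_size
        if freq in (0.0, 1.0):  # the variant is lost or fixed
            break
    return freq

print("Final variant frequency:", drift())
```

Run it a few times: the variant often disappears or takes over entirely, even though nothing “selected” it.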

Biological evolution thus forms the foundation for the meta-theoretical framework of Universal Darwinism and helps us comprehend how and why it emerged. The value of Universal Darwinism is that it can be applied beyond biology and to other areas of change. We can use this perspective to explore patterns in what we now call the “Information Age.” We, as a species, have developed complex social dynamics that (enabled by the tools we created) focus on creating, sharing, and refining information. In our modern civilization, we can see that everything can be reduced to the level of information, and it is patterns of information that evolve across different media through instantiation and transformation over time. In this perspective, there is not much difference between genetic information and software languages in how they evolve over time (although of course the specific mechanisms and rates are very different).

What works best at that point in time occurs again in the future. What doesn’t work simply does not occur again (over a long enough timeline).

The total information in a certain area (e.g., the human genome, Unix technology, etc.) at a point in time is called a “design knowledge base.” This design knowledge base changes and becomes more detailed and/or refined over time (a process we have termed evolution). Some design knowledge bases evolve without conscious human intervention (e.g., naturally formed human facial features), while some evolve through conscious human intent (e.g., Unix technology).

As we have reached an era where information has become the core focus, four distinct types of design knowledge bases can be separated analytically to help explain the complexity of our modern existence: biology, technology, culture, and cognition. As the human species is arguably one that is most reliant on technology for survival (and digital products are the latest in the progression of technological evolution), the next section looks at patterns in technological change that may help us design better digital products for our evolving users.

Technological Evolution

Technological evolution is the rate and patterns of technological change that constitute the developmental passage of the human species. “Technology” in this view is anything that humans have created (intentionally or not) in order to achieve some value. This includes fire, tools, machines, buildings, computers, language, and even memory and voluntary memory recall. Keep in mind that in this view “technology” is not strictly material or external to our biological forms (e.g., a pencil); rather, it is anything that is not directly responsible for the core biological functions of the human body (although, technically, these too can be considered a form of “biological technology,” as biological processes are heavily focused on what we have, in the age of digital, termed “information communication” and “appropriate timely response”).

But before we dive into the complexities of the micro perspective, let’s look at technology external to the human body first (as it makes more sense due to our intentional experiences of using it). In the last century, we have gone through the evolution of tools to machines and, more recently, automation. The emergence of each signifies major paradigm shifts in global cultural, cognitive, and biological change of the human species (and their environments). In this view, a tool is defined as any material entity that extends the potential of human achievement by overcoming the limits of biology (e.g., a knife or axe). Machines are defined as complex tools that substitute an element of human effort but require human input for operation (e.g., a boat or car). Finally, automation is defined as a machine that is capable of executing commands without direct human input (e.g., robotic equipment).

We can go so far as to view each transition as enabled by previous stages such that as the design knowledge base for technology improves, so does our capacity to create more complex and elegant technology. Curiously, the emergence of each new type does not lead to the extinction of its predecessors; rather, it marks a transition of the human species into the next stage of technological, cultural, and cognitive evolution.

Think simply: we may now have automobiles, but we still use knives. Once we developed the automobile, to think we will be driving flying cars next is an example of linear thinking/reasoning. To hope that we get to proper teleportation in our lifetime is an example of exponential convergence thinking/reasoning. The core concept of teleportation addresses the core need (i.e., get from A to B as a cohesive unit/entity) with the current pain points of travel eliminated (i.e., instant transfer over any distance). Although teleportation as a material reality is still far-fetched, we can see how it would hugely benefit a lot of current systems, and we can go so far as to predict that some groups may fight it (were it to be invented) to retain the state of the current systems that they are reliant on for survival. The challenge for designers is to go far enough in evolving the core without taking it beyond the reach of current state reality (though futuristic ideas in and of themselves are hugely beneficial to the process of innovation because they allow for exploring all possible ways the situation or product might evolve and the effect it may have if released “into the wild”).

Similar to biological evolution, the evolutionary passage a technology takes is largely dependent on whether or not the technology is the most suitable for a certain outcome (or provides specific value). If it is, it remains, and other technology gets “added” alongside it. If a new technology emerges that is an improvement upon the previous one, then it takes the place of its predecessor, such that the earlier form is no longer commonly used (e.g., think of the progression from vinyl records to CDs, MP3 players, the iPod, and streaming apps such as Pandora).

Even more curiously, we can observe that technological advances are cumulative. This means that once something is created, new innovation comes at a quicker pace than the pace between previous innovations. If we look at the recording of written language, there is a larger gap in time between the invention of early written media and the printing press than between the printing press and digital media. Within the innovation patterns of digital media itself, we can see exponential change and improvement in the last 60 years alone. Along with the rapid evolution (and improvement) of technology, we can also observe very significant changes in global social dynamics, including changes in culture, business practices, social habits, identity, and the concept of “self” for a human being. We, as a species, have begun to question our existence in a way previously unexplored: through the interplay between what we now call our “digital self” vs. our “material self.” Our sense of self, and its operational identity as “John” or “Mary,” has become far more complex than in previous centuries, where it was uniquely linked with our biological forms and our social roles in our local communities. Indeed, the concept of community has also changed: we are now players on a global stage, constantly straddling our global (distributed) and our local (biocentric) roles and identities.

How did we reach this level of complexity? Due to the vast volumes of technology we have created to the present moment and the complex history of its emergence, an overview of all documented technological change sequences is beyond the scope of this chapter. However, we can at least note the most prominent technological developments relevant to digital design: those that specifically involve information creation, storage, distribution, and refinement. These include memory (prehistory), language (prehistory), painting/writing media (prehistory), the printing press (15th century AD), the telegraph (18th century AD), the telephone (19th century AD), radio (early 20th century AD), television (early 20th century AD), computational devices (mid-20th century AD), satellites (mid-20th century AD), early Artificial Intelligence (mid-late 20th century AD), and, most recently, the Internet and World Wide Web (late 20th century AD). Plotted on a timeline, the pattern of emergence suggests that the rate of technological paradigm shifts is increasing and may be linked to the context of the creators of technology (i.e., accumulated design knowledge), though not necessarily to the rate of their biological evolution. The evolution of technology is much faster than the slow evolutionary processes that constitute our biological forms. We have gone from a paper-based society that used horses as its fastest mode of data transport, to a global network of digital connectivity with 24/7 access to a vast repository of information, in less than 200 years.

Futurists, such as Raymond Kurzweil, in analyzing these patterns of technological evolution, predict that sooner than we might expect we will have another major shift. Referred to as the Singularity, the next paradigmatic shift is argued to involve the transcendence of the limits of human biology through merging with emerging technology. It is important to note that this perspective does not assume a dystopian society in which the essence of “being human” is irrevocably altered or lost; rather, the current assumed ontological separation of machines and humans is argued to become increasingly obsolete to the point where it will be impossible to establish meaningful analytical boundaries between the two. Indeed, when we look at a user on Facebook, it is almost impossible to distinguish the boundary between the two except in the most obvious (and least meaningful) way: where the biology ends and where the phone begins. Digital technology, as with all previous technologies, extends the reach of the human being beyond what they can accomplish with biology alone. In this sense, we have already “merged” with technology: albeit not necessarily by literally placing it inside our bodies, but by making it inseparable from us in our daily activities. Technology has been with us, as part of us, the whole time. It is now just becoming more complex and more obvious in its irreplaceable value and significant impact on, not only human life, but life in general.

Kurzweil echoes the perspective of the 20th-century media theorist Marshall McLuhan, who argued that media are extensions of man, far more than we realize. His views on the medium of television, and its effect on culture and cognition, suggest that humans have already merged with technology. Not by implanting microchips, as early dystopian fantasies suggested, but by adopting and evolving with and through technology, and using our extended reach to become more than what we were without it. Indeed, the push medium of television has arguably provided the foundation for our creation of a uniform, values-based, digitally enabled “melting pot of culture” across the world. Yet, recently, we have outgrown the push medium, as distributed, on-demand digital innovations have proven more valuable for our changing needs than the early medium of broadcast television. Streaming services are rapidly overtaking the home entertainment and news markets because someone else telling us when to watch something, or interrupting the show with advertising, became unacceptable, and new media emerged to address our need to consume content how and when we want it. We evolved, and so did our needs and expectations, and so did the technology that we extended ourselves through. Thus, in a very tangible way, our choices constitute the emergence of “new” media and the slow death of certain “old” media (also termed in this book “legacy systems”), based on what we want and need to do/experience.

Realizing this, we can begin to reimagine our process of digital design and its intended purpose. If we are able to focus on core mechanisms/drivers of human needs, we can design better digital products to meet those needs and help us all to transform into ever new states of being. To do so, we need to shift from being reactive to being proactive in our design. Furthermore, we can start to consciously shape the direction of human evolution through digital product creation and, by extension, of all life on our planet.

To do so, we need to look deeper at these interrelated patterns and start to map out how different media have impacted social dynamics in the past, identify root drivers/mechanisms/needs, and use the technical design knowledge base already accumulated to create things that make life better (whatever the evolving concept of “better” may mean for our rapidly evolving selves). Before we go any further, as designers, we need to navigate the ethics of what we design and the design process itself such that no deliberate or accidental negative effects (i.e., those that detract from, rather than improve, health and the capacity to independently thrive) occur at any point.

First, we need to realize that survival applies to technological evolution in a similar sense as observed in the broader animal kingdom. Survival of the most fitting in context. Analyzing the historical emergence of technological diversity, we can conclude that both creation and adoption are very much shaped by the social relations prevalent in a context and by the value of the technology in the context of those relations. In a very real sense, technological novelty, selection, and replication arise from the interplay of psychological, intellectual, socioeconomic, and cultural needs (among others). Throughout history we can see that technological evolution has been inseparably intertwined with the cultural and cognitive developmental shifts of humans, shifts which have not always arisen from basic survival needs and cannot always be deemed a natural extension of technological progress.

Think, for instance, of the relatively recent realization of human beings that they may not necessarily be “of nature” in the same way as their garden is. In a very literal sense, they both are (as humans too have been “selectively bred” through culture) and are not (as humans have historically “lived in” a simulated reality that formed as a by-product of the development of the symbolizing human mind). A still relatively common example of an ongoing simulated reality can be seen in religious or cultural practices, business, economics, politics, etc. The question of who we are is currently more strongly felt through our social reality, a reality which has little to do with (and has often been at odds with) nature itself. By living in and focusing predominantly on the social reality we embodied and created, we have in a very real way excluded ourselves from other life forms on Earth, and evolved new and more influential mechanisms for evolution that are social, rather than purely biological. Technology is at the core of this artificial (i.e., predominantly cognitive) reality that has long ago become the primary focus of human attention. Think how often one ponders the disconnect/misalignment between one’s own logic and feelings, and why the disconnect/misalignment exists in the first place. After all, surely our aim is to be fully integrated and effortlessly functioning as a being. Why then is it so challenging to be alive? Perhaps because of the disconnect between life itself and the conceptualization of the human who (for some reason or other) believes that they can control life, govern it, skew it, force it, constrain it, or exploit it. Aside from the obvious ethical issues raised by this line of thought, the core question remains: why have humans prioritized anything over life? How did this come to be? What might we do about it?

Through an increasingly isolated focus on our social reality, humans have arguably developed a different relationship to “nature” than other living beings. For humans, what is “natural” and what is “artificial” is not clear cut, as we are arguably “natural” but live almost entirely within a reality that we created and actively orchestrate (e.g., the stock market is not directly of “nature,” but it shapes the behavior and opinions of humans far more than the weather). If left to our own devices, most of us would not survive long in “nature” without the comfortable buffer of technology we have created to distance ourselves from it. Or the technology we have created to, later, bring ourselves closer to it, as we started to see the toll that the artificial reality we came to predominantly exist in had taken on us: it turned out that our “modern” social reality was in opposition to our own (biological) nature. We realized that we have access to more data and information than ever before, but we have lost knowledge and wisdom we once had that was of “nature.” We realized that what we create is always artificial, if for no other reason than it being “dead,” unable to change on its own without our interference (or us determining its change parameters). We became unsatisfied with that notion, yet unclear on what to do about it.

Applications, virtual realities, chat bots, artificial intelligence, and more have called upon us to further question the distinction between natural and artificial, of human and “other,” life and death. They have forced us to confront the idea that, perhaps, we too are no more than complex algorithms running on biology, and brought us full circle to the idea that, perhaps, we have not been of “nature” for longer than we thought. Perhaps we have always lived, to some extent, in a world that we created through various technologies, especially since the early days of language, where the distinction (and confusion) between nature and our representation of it originated. As of the 21st century, it is impossible to argue that technology is not a driver of global change, as we now live predominantly in a world that does not exist outside our screens and minds. We are further removed from “nature” than ever before, and it is more important than ever to recognize the critical role technology (specifically digital media) plays in our lives, examine how it came to be, and decide where we go next (with and through it).

This may sound and feel pessimistic, yet it is quite the contrary: we are in a more empowered position than ever before to do whatever we want to do however and whenever we want to do it. All we need to do is be increasingly intelligent about what we create and adopt into our existing social dynamics, paying close attention to its impact on health and welfare of all life constituting our planetary ecosystem. We can learn to do so by looking for patterns, learning from the past, and designing for the future we want (rather than focusing on short-term goals, such as profit or social status).

We can begin by analyzing what constitutes the modern lifestyle: what can we not live without and why? From there we can look at how these technologies emerged and what specifically made them indispensable to our lifestyle.

Unfortunately, doing so is not as straightforward as it is for biological evolution. Unlike biological evolution, tracking technological change at a micro-level (each individual entity) is difficult, as we need a unit of heredity at a suitable analytical level to do this. For example, when comparing two social media apps, how might we meaningfully derive lines of heredity and definitively trace where a specific feature (e.g., the “Like” button) emerged from, or its full impact on our emergent social dynamics? Think of two prevalent social content–focused technologies: Facebook and Reddit. Although both are heavily reliant on their user base for content (and adoption rates), they have two marked differences in design. The first difference is the Up/Down Vote system on posts in Reddit vs. the “Like”-only vote system in Facebook. The addition of an extra button (affording users the ability to dislike content or “vote it down”) shifts focus from providing the content poster with (arguably) meaningless positive external validation (i.e., if there is no choice but to approve, then there is no real choice at all) to the potential of positive or negative external validation, thus placing in focus the idea itself (and its delivery through written/visual/interactive text). The second major distinction is the focus on the users’ (fictitious or not) self-curated identity. In Reddit, where the identity is hidden and placed on the furthest periphery of users’ interaction (i.e., it remains private and known only to the individual user), the content/ideas themselves take priority. In Facebook, where the self-curated identity is the core focus (and the core product that users voluntarily create for the company, with murky ownership rights to their self-generated content), interaction is based entirely on who you know and who likes what you say/do, rather than on the stand-alone merit of the idea itself. Both digital products have evolved through their user base and feature set over their lifetime, yet to this day they have very distinct evolutionary passages and very different evolving user bases.
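
To make the design difference concrete, here is a minimal sketch contrasting the two feedback mechanisms (the class names and scoring rules are invented for illustration; neither reflects Facebook’s or Reddit’s actual implementation):

```python
from dataclasses import dataclass

@dataclass
class LikeOnlyPost:
    likes: int = 0

    def react(self):
        self.likes += 1          # the only possible reaction is approval

    @property
    def score(self):
        return self.likes        # the score can never decrease

@dataclass
class UpDownPost:
    ups: int = 0
    downs: int = 0

    def react(self, approve: bool):
        if approve:
            self.ups += 1
        else:
            self.downs += 1      # disapproval is a first-class signal

    @property
    def score(self):
        return self.ups - self.downs   # the score reflects net judgment

a, b = LikeOnlyPost(), UpDownPost()
a.react(); b.react(False); b.react(False); b.react(True)
print(a.score, b.score)  # 1 -1: only the second design can say "no"
```

The single extra branch is the whole design difference: one system can only accumulate validation, while the other can evaluate the idea itself.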

So what exactly evolves in the example above? The users, the digital product, the organization creating it, the broader context of humans relating through it? All of the above and more. Hence it is important to look at technology (and evolution in general) holistically. Although the “artifact” has been suggested as the core unit for the study of technology, there is no clear unit of heredity—thus far, technological change has not been independent of human beings (i.e., technology has historically been unable to replicate, select, or vary its own design); hence, the unit of heredity has to be transferrable between biology and technology. Technology-specific accumulated design knowledge cannot be isolated in a specific medium, nor can its progression be assumed to be temporally and spatially bound in the latest sequential artifact instantiation (unlike accumulated biological design knowledge contained in the genome).

To add to the complexity, the recent emergence of immaterial digital media has shifted focus from material tools and/or machines to those that are spatially and temporally distributed and iteratively change in real time (e.g., software, such as the Google search engine). What is deemed a “technology,” or even a “tool” or a “machine,” can no longer be confined to definitions based on the concept of a static “thing.” For example, “immaterial” digital media, such as software, although devoid of fixed material properties that biological entities can directly interact with, are not devoid of “utility functions” (i.e., properties that evolve over time). Comprehending digital media as an entity is a little like trying to solve the challenge of consciousness: where is it, what is it, and how might we consistently measure and track it? How might we evolve consciously through fully informed choice?

With digital media, to track change, we need to shift focus from chronological technological instantiations (i.e., material artifacts observed at specific points in time) and look at how that technology came to be and all the actors that played a role (direct and indirect) in its emergence. This may include teams of people, product designs, software code, wireframes, organizational structures, user perceptions, and/or development processes. What we need to do is look at how an abstract pattern (i.e., an idea) transformed through instantiations across different media over time, such that it became a digital artifact that was then used by humans in their day-to-day lives. Doing so, we recognize that an app is more than just pixels on a screen, or code, or lists of features, or what it enables and/or constrains the user to do. It is all those and more: the whole is greater than the sum of its parts. Digital media is a composite entity that is instantiated through many different relations across various media and evolves through each. Effectively, we can trace its evolution through a string of decisions. By focusing on the decisions that constitute the product’s emergence, we can unravel the process of its creation and pinpoint when, where, how, and why the significant decisions were made that later had a significant impact on how the product was used and how it evolved over time.

Let’s look at a common industry example. A client approaches a digital agency and asks them to build an online shop. The technology emerging is an “online shop,” and by the time it reaches the brief and project sign-off, the technology (still a concept at this point) is translated across multiple people and other technologies, such as Word docs and emails. It then becomes a project plan, a set of user stories, design concepts, low- and high-fidelity prototypes, until it becomes code, and an instantiated e-commerce website on the World Wide Web. By the time real-life users can interact with it, it has evolved so substantially from the original concept that only the core utility function remains true to its first instantiation. Constituted by other technology, the e-commerce store evolves through every decision made, every value and effort of every human (and technology) it comes in contact with. Some design decisions win over others, and some fade away only to re-emerge later in the process. The final version is akin to a biological being, its process of emergence hidden by its current instantiation, yet simultaneously a true embodiment of it.
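
One way to picture this “string of decisions” view of the online shop’s emergence is as a trace recording each significant decision, the medium it occurred in, and its rationale. A minimal sketch (the fields, decisions, and rationales are invented for illustration):

```python
from dataclasses import dataclass

@dataclass
class Decision:
    medium: str      # e.g., "whiteboard", "user story", "code", "release"
    change: str      # what was decided
    rationale: str   # why it won over the alternatives

# The product's evolution across media, as a trace of decisions.
trace = [
    Decision("whiteboard", "single-page checkout", "fewer steps, less drop-off"),
    Decision("user story", "guest checkout allowed", "stakeholder pushback on forced signup"),
    Decision("code", "saved-cart feature cut", "budget constraint that quarter"),
    Decision("release", "saved cart restored in v2", "top request in user feedback"),
]

# Reading the trace backward reveals why the product looks the way it does.
for d in reversed(trace):
    print(f"[{d.medium}] {d.change} <- {d.rationale}")
```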

This is the true power of digital design. To create something that comes from nothing, exists as nothing, and yet drives human evolution in a powerful way by emerging of us, for us, because of us. Ideally in a symbiotic, rather than parasitic way. It is a part of us, just made of different matter to our biological form. How might something so different to us be a part of us? To begin to answer that question, we need to look next to the evolution of culture and human cognition.

Cultural Evolution

The term “culture” refers to a body of socially transmitted knowledge accumulated and refined over time by a species. This includes representational content and the ways that that content is interpreted and created. When we commonly use the term “culture,” we, of course, primarily refer to the human species. Although the phenomenon is not unique to human beings, the ongoing transmission and refinement of cultural knowledge has been fundamental to human evolution and our observed disconnect from the flow of other life in our ecosystem.

Cultural knowledge accumulation, refinement, and sharing practices play as important a role (if not a more important one) in our evolutionary passage as biological adaptations do. Typically, these evolved as patterns of static attributes used to normalize behavior/perception, which were historically used to homogenize/control populations. A simple thought experiment can be used to test the prevalence and rootedness of cultural attributes. Forget, for a moment, that you are an “American,” a “New Yorker,” a “man” or a “woman,” “rich” or “poor,” “educated” or “street-smart.” You are left with just being “human,” the same as everyone else in your species. You are life itself, along with all the other species. What would you now do, or no longer do, given the lack of cultural attributes that defined you before this thought experiment?

It is not difficult to see how “labels” or categorizations (which one, of course, chooses to internalize for one’s own personal reasons) determine quite a lot of behavior. To remove them would be akin to becoming a blank canvas, no different in self-awareness from what existed before we evolved the capacity for voluntary memory recall (a key “technology” in establishing and refining culture), symbolic systems of thought, and, later, an individualistic identity-based sense of self. One would be acting entirely on “instinct” and have no capacity to repeat an action at will (i.e., without a direct contextual trigger). Without this ability, what we now term “cultural knowledge” could not have been deliberately refined or accumulated. We would only have patterns of behavior that are inseparable from contextual triggers, and the same pattern could be “reinvented” countless times as it would have no way to be remembered and passed on to future generations.

Continuous accumulation and refinement of a cultural design knowledge base has historically accelerated our evolutionary progress, as well as deepened our comprehension of the universe and ourselves. It did so by giving us a basis from which to operate and build upon, rather than constantly having to reinvent the same knowledge. For example, hundreds of instances of the same design breakthrough (e.g., the wheel) could have occurred if we had no capacity to remember and pass down this information. The value of culture (as our design knowledge base) is that it improves over time, as more and more cultural information is created, recorded, and built upon. Cultural lines are akin to biological lines in that a component can, in theory, be traced back to the first instance of that particular pattern. A fascinating example of this can be seen in the evolution of philosophical paradigms and axioms about reality in different human subcultures since the beginning of recorded history, and their role in shaping our perception of ourselves, shaping social practices, and enabling advances in technological innovation.

The historicity of cultural evolution suggests that what we are as human beings is very much a product of culture. It is culture that shapes our constantly changing perception of ourselves, and our ways of being. Arguably, the evolution of cultural elements (both material and cognitive) continuously enabled improvements in our lives across time. Developments in culture have always been symbiotic with developments in technology and human cognition, with culture evolving through us and the technology we chose to externalize and internalize. The more we were able to pass down, the better our cultural design knowledge foundation, the quicker we were able to evolve culturally, technologically, and cognitively. We can argue that our uniqueness can, perhaps, be best characterized by our ability to remember, transform, and pass down elements of information that we can externalize and another individual can internalize with little distortion through patterns of interpretation.

So what is a unit of heredity we can use to examine culture? Richard Dawkins (1976)7 first proposed the term “meme” to describe a semiotic unit of cultural inheritance. A meme, defined by the Merriam-Webster Dictionary as “an idea, behavior, or style that spreads from person to person within a culture,”8 refers to replicating semiotic information units within the “primordial soup” of the whole infosphere that constitutes a culture and the concept of culture more broadly. Memes follow a similar pattern to genes, competing for survival and adhering to the principles of variation, selection, and heredity/retention. Interestingly, memes have at times been in direct competition with genes and vice versa (e.g., the meme for celibacy vs. the biological instinct to procreate). This is why culture is as significant as, if not more significant than, genetic variation in the modern ecosystem. There are few cultures that have not historically placed some kind of social rules on who can reproduce with whom, when, and why. In a very literal sense, we modern humans have been selectively bred by our cultures and our historical roles within them.

Unlike biological information transmission, which is linked to a stable and traceable language (i.e., DNA), a unit of culture (e.g., a song) can manifest in a variety of media (e.g., recording, sheet music, live performance, memory, etc.), each of which employs a medium-specific language for information translation, processing, and transfer, and requires different units of analysis at multiple analytical levels. Furthermore, the transfer path of memes across media (e.g., technology and humans) is difficult to trace, as there is no clear mechanism or pattern of actions that can be repeatedly observed as the process of memetic selection, variation, and/or inheritance. Hence, a “meme” cannot be used to create a taxonomy of culture in the same manner as genes can be used to create a taxonomy of biological forms, but it is a useful way to look for meaningful patterns of change in our cultural knowledge base.

Both memes and genes can be viewed as different types of replicator entities, and parallels between them may provide a suitable foundation for deepening our comprehension of cultural evolution. For instance, a “meme” is to the characteristics of an instantiated object (i.e., artifact, idea, thought, song, lyric, etc.) as a “gene” is to a biological organism’s phenotypes. By extension of the analogy, individual memes (i.e., patterns of semantic relations) can be linked in a memeplex just as genes are linked in a genome (i.e., all the genetic information of an organism), with the relations between memes in a memeplex constituting the specific phenotypes (i.e., observable behaviors or characteristics). Similarly to genes, the selection of a meme or a memeplex is shaped by its surrounding environment (i.e., multilevel context), which includes other memes being selected. It is important to note that, due to its semiotic nature, memetic evolution is viewed as relatively independent from biology in the sense that the survival of a meme may be only advantageous to itself at the expense of biological evolution (e.g., meme of celibacy hindering/skewing biological evolution rather than directly enabling it, for the entity hosting the meme).
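
As a loose illustration of that shared replicator logic, the following Python sketch runs the generic variation, selection, and heredity loop that applies to genes and memes alike. The toy “environment,” the fitness rule, and all names are invented assumptions, not a model of either biology or culture:

```python
import random

def vary(pattern: str, rate: float = 0.1) -> str:
    """Heredity with variation: copy a pattern with occasional random errors."""
    alphabet = "abcdefghijklmnopqrstuvwxyz"
    return "".join(random.choice(alphabet) if random.random() < rate else c
                   for c in pattern)

def fitness(pattern: str, environment: str) -> int:
    """Selection pressure: how well the pattern matches its context."""
    return sum(a == b for a, b in zip(pattern, environment))

environment = "cooperate"            # stand-in for the surrounding context
population = ["xxxxxxxxx"] * 20      # initial pool of identical replicators

for _ in range(100):
    population = [vary(p) for p in population]              # variation
    population.sort(key=lambda p: fitness(p, environment),
                    reverse=True)                           # selection
    population = population[:10] * 2                        # heredity

print(max(population, key=lambda p: fitness(p, environment)))
```

Run repeatedly, the surviving patterns drift toward the environment: no individual copy is faithful, yet the selected pattern persists and improves across generations.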

Unlike DNA replication, which involves high-fidelity copying, minds or brains alter the information based on the schemata and other interpretation means that a particular human has developed throughout the life of their biocomputer. This process is exemplified by the children’s game of “Chinese Whispers” or “Telephone,” where the initial sequence of words rarely survives transmission unaltered. This is important to consider in digital design, as we are often designing products and services that enable translation and recombination of information patterns (e.g., file sharing, messenger apps, etc.), yet rarely stop to consider the cultural implications of these processes: specifically, the common phenomena of misinterpretation and mistranslation. For instance, variation in a cultural context more often comes from replication issues (i.e., copying the pattern wrong) than from deliberate and conscious innovation.
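
A toy simulation makes the fidelity point tangible. This Python sketch (the vocabulary, message, and error rate are all invented assumptions) passes a message down a chain of five “minds,” each of which occasionally substitutes a word, mimicking the low-fidelity copying described above:

```python
import random

# Invented vocabulary and error rate; purely illustrative.
vocabulary = ["ship", "sheep", "shop", "sleep", "slip"]

def transmit(message: list[str], error_rate: float = 0.2) -> list[str]:
    """One person repeats the message, occasionally mishearing a word."""
    return [random.choice(vocabulary) if random.random() < error_rate else word
            for word in message]

message = ["ship", "shop", "sheep"]
print("original:", message)
for person in range(5):
    message = transmit(message)   # each hop is a low-fidelity copy
    print(f"after person {person + 1}:", message)
```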

The capacity to represent experienced events in the mind (in our case, in the minds of the predecessors of the modern human being) involved storing the representations in memory and voluntarily recalling stored representations. This process provided the initial conditions in which complex communicative patterns emerged, which formed the basis for cultural knowledge accumulation. The process, in a very literal way, allowed for acts (and the experiences they represented) to be remembered, rehearsed, and improved upon. In this sense, memory and voluntary memory recall can be viewed as one of the earliest “technologies” that human beings evolved. To this day, memory is one of the most important aspects of cultural evolution, with many digital products aimed specifically at extending and improving cultural memory and reach (e.g., Google, Wikipedia, etc.).

So we developed the capacity to remember events that happened and communicate those events (no matter how accurately) to someone else. We then learned how to share our knowledge of those events with even more individuals through basic externalized representations (e.g., written text). What happened then? As more complex representations emerged through abstraction of events, objects, experiences, and the relationships between them, we developed more complex meaning systems. These complex meaning systems and the ways to represent them culminated, over time, in the emergence of distinct cultural paradigms. Prior to digital technology, these cultural paradigms were localized geographically and significantly shaped the evolution of technology and, to an extent, biology in those localized areas.

With the creation and proliferation of digital technology, those local cultural groups merged in a global melting pot of cultural knowledge, such that a more or less uniform global culture has emerged, with certain local variations based on geographical locales and the history of the area. Even now, digital technology is directly enabling the further refinement of our shared cultural design knowledge base and leading to an increased shared comprehension of who we are and where we want to go. It enables us to share more information and compare ways to represent and interpret information. It also allows us to track in close to real time how cultural knowledge evolves (e.g., forums, chat rooms, etc.).

How did we get to this point? Cultural evolution can be classed into four distinct stages: episodic culture (i.e., experiences as representationally stored events in memory), mimetic culture (i.e., experience representation classes as actions that can be voluntarily recalled, rehearsed, and passed down through physical action-based communication), mythic culture (i.e., development of syntax; cultural meaning and value structures passed down through spoken language), and theoretical culture (i.e., development of semiotic structures; cultural transmission based on written symbols and paradigmatic thought). We can observe a distant echo of this process in how modern humans mature through cultural exposure from baby to toddler to child to young adult. It is important to consider this process when designing digital products for different age groups and cultural subsets.

Due to these mechanisms of cultural evolution, we can argue that this process, over time, may have contributed to us becoming further removed from nature. We became caught in waves of misconceptions and ever more elaborate conceptual webs of representations that became our core focus instead of the experiences those representations originally pertained to. The problem is that the representation of something is not the thing itself. It is inherently challenging to create a representation accurate enough that there is absolutely no ambiguity in what is being represented. It is even more challenging to bring this idea to a wide audience without risk of misinterpretation. For example, if I ask you to imagine a “cat,” the only thing I can be sure of is that every reader imagines a different cat. It is likely, though, that most imagine some kind of furry four-legged creature that meows and purrs. For those of you who thought of something other than what Google Images deems a “cat,” take some time to think about how and why your unique representation came to be attributed to the word “cat” within your cognitive system.

As we can see, a represented experience admits many possible representations and relies on similar cognitive models (and the constitutive experiences those are founded on) for accurate transmission and, hence, transference/propagation across humans. The danger is in creating more and more representational content in cultural knowledge bases such that future generations are led further astray, spending copious amounts of time untangling (or worse, trying to “prove”) representations, rather than experiencing life and moving closer to living whatever the “ideal” life may be for them. This danger has been lessened significantly by the content circulating in digital media (especially the World Wide Web), as multiple types of content (text, video, images, etc.) can be linked together, helping users triangulate representations and get closer to the core of the “thing” being represented much faster and more efficiently than in previous generations.

Due to the nature of culture, when examining cultural evolution, focus on material instantiations alone is not enough; we need to examine how knowledge (and culture) comes to be, and that means looking at how meaning is made and shared. Only then can we meaningfully distinguish between the “fake information field” components of culture and those that are aligned with and truly enhance life. Arguably, that is one of the core benefits of digital and a core responsibility of digital designers: enhancing life through what we create, and allowing users to navigate their reality more easily, efficiently, and effectively.

Where do we begin? We can start by looking at how cultural knowledge is accumulated, refined, and inherited. We can examine different memes and memeplexes in the culture we originate from (and hence comprehend the most). Memetic evolution can occur through three distinct processes, which can be broadly termed as:
1. copy-the-product
2. copy-the-instructions
3. enhance-design

The process of “copy-the-product” involves mimicking a pattern of behavior or reverse engineering a materially instantiated object (e.g., watching a chef prepare a dish; i.e., mimicry). The process of “copy-the-instructions” involves creating a product from an established algorithm (e.g., a soup recipe; i.e., imitation). The “enhance-design” implies a recombination of materially uninstantiated memes into a materially instantiated form (e.g., combining multiple recipes to create an original; i.e., innovation). In these ways, the concept of a “meme” is closer to that of a “software algorithm” in computer science than it is to a “gene” in biology.
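
To ground the analogy, here is a hedged Python sketch of the three processes using the chapter’s recipe example. The recipes, step names, and the crude “observation” rule are invented for illustration:

```python
# Invented recipes and rules; a sketch of the three processes, not a model.
soup_recipe = ["boil water", "add carrots", "add salt", "simmer 20 min"]
stew_recipe = ["brown meat", "add carrots", "add stock", "simmer 60 min"]

def copy_the_instructions(recipe):
    """Imitation: reproduce the product from the established algorithm."""
    return list(recipe)   # a high-fidelity copy of the instructions themselves

def copy_the_product(observed_dish):
    """Mimicry: reverse engineer the steps from the finished dish (lossy)."""
    # The observer can infer only the visible ingredients, not the method.
    return [step for step in observed_dish if step.startswith("add")]

def enhance_design(recipe_a, recipe_b):
    """Innovation: recombine uninstantiated memes into a new instantiation."""
    return sorted(set(recipe_a) | set(recipe_b))

print(copy_the_instructions(soup_recipe))   # full recipe survives
print(copy_the_product(soup_recipe))        # only the visible "add" steps
print(enhance_design(soup_recipe, stew_recipe))
```

Notice how copy-the-product loses the method while copy-the-instructions preserves it: this is exactly why knowing the underlying algorithm matters for fidelity.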

A meme can be viewed as a relational pattern of representational content that is capable of variation, selection, and inheritance, where variation is often (problematically) introduced by copying errors and/or differences in interpretation that are unique to the medium under examination (e.g., divergent interpretations of a symbol structure). A copying error of a meme’s representational content (i.e., a memetic mutation) would thereby introduce phenotype changes in instantiations that can then be mimicked, leading to further mutations (e.g., errors in “copy-the-product” processes). If the algorithm is unknown, these may become representations without a direct logical link to the experienced phenomena the original meme pertained to. As such, this perspective on the mechanisms underlying cultural change is similar to Jean Baudrillard’s (1981)9 concept of “simulacra,” which is used to argue that copies of copies (i.e., representations of representations) mutate further from the original over time, primarily due to the inability of representational content to capture the entirety of instantiated complexity. Hence, unless the underlying algorithm is known, the “memetic drive” may lead to the spread of representational systems that have no experiential grounding for their content and may inadvertently contribute to the degradation of quality of life, skewing the cultural design knowledge base through the spread of misinformation.

How might we analyze how a specific meme in a culture came to be? First, we need to recognize that cultural replicator entities come in two types: semiotic and structural. Semiotic refers to meaning, and structural refers to how meaning is created through relations between concepts. The adaptation of complex mimetic skills is arguably the foundation for all cultures. The process of mimesis involves a direct representation of experienced events and the communication of these experiences through the use of the body, such as tone, expression, movement, signs, and gestures. A cultural example of mimesis is a game of charades or interpretive and/or ritual dance. Complex mimetic skills allowed the human species to transition from nonsymbolic forms of intelligence still observable in the animal kingdom to a symbolizing mind. These mimetic skills are still a core part of modern cognitive processes, and they too are replicating entities passed down through culture.

Combined with memetic skills, mimesis forms the basis for cultural evolution, and thus we can define two types of cultural replicator entities. The core difference between the two: memes shape the what (representational content; meaning), while mimetic skills shape the how (relational structures; cognitive schemata). In the digital ecosystem, mimesis relates to structure and functionality rather than readable representational content (i.e., text). Both types move through social groups in waves, inherited through pedagogy and altered through life experiences, where transmission/interpretation differences increase cultural variety and influence replicator entity selection over time. A chorus of a song (a meme) can be traced through multiple media (e.g., sheet music, CDs, live performances, etc.). We know it is the same meme because the notes are arranged in a certain pattern and that pattern maintains its integrity across media. If someone hears the chorus performed by an orchestra and then by a computer, they should be able to recognize the same pattern without difficulty. In this sense, it is a pattern recognition activity at its core.
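
As a small illustration of that pattern-recognition claim, the following Python sketch (the pitch numbers and the transposition are invented) reduces two renditions of a “chorus” to the relational pattern they share, which survives the change of medium and key:

```python
# Invented pitch numbers; the point is the medium-independent pattern.
def intervals(notes: list[int]) -> list[int]:
    """Reduce a melody to the relational pattern between successive notes."""
    return [b - a for a, b in zip(notes, notes[1:])]

orchestra_performance = [60, 62, 64, 60]   # MIDI pitches, original key
computer_rendition    = [67, 69, 71, 67]   # same chorus, transposed up

# Same meme: every surface "note" differs, yet the pattern holds.
print(intervals(orchestra_performance) == intervals(computer_rendition))  # True
```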

To help comprehend how humans operate cognitively, let’s delve deeper into how a human being currently gains access to the shared cultural design knowledge base. In childhood, as a human being matures, complex social skills are passed down and imprinted through various experiences. These experiences and the way the child makes sense of them (or is taught to) form the foundation for subsequent cultural inheritance. A child first learns culture based on the information available to them and their capacity to interpret and make sense of that information. They are then able to order it and add to it by changing existing elements as they see fit. Arguably, one of the most valuable aspects of cultural inheritance is the capacity of a child to interpret the information in their own way, thereby evolving the culture at a micro-level. They may then share their interpretation with others and create a paradigm shift if the interpretation is different enough and significant enough in its value.

Hence, cultural evolution is closely linked not only to technological evolution but also to cognitive evolution.

Cognitive Evolution

Cognition has been the fascination of the human species for as long as we have attempted to comprehend who we are and our life purpose. One of the most valuable contributions of the analysis of cognition is the realization that how we think shapes our experience of reality, our choices, and the cultural/technological inheritance we pass down to future generations. The study of human cognitive development involves a large body of knowledge spanning various disciplines, including linguistics, neuroscience, psychology, education, anesthesia, anthropology, biology, and computer science. The term “cognition” refers to the internal processes and abilities that relate to the formation of knowledge, such as attention, working and long-term memory, analysis, evaluation, reasoning/“computation,” problem solving, decision making, comprehension, and creation/use of language.

It is important to note that evolutionary accounts of the cognitive development of the human species (i.e., those founded on the principles of Universal Darwinism) retreat from the dualist isolated-mind doctrine, which rests on the central assumption that the mind is “inside” the brain and is to a large extent predetermined. Instead, these perspectives recognize that cognition evolves with and through culture. The “outside” (context) of an entity constitutes the development of its cognitive architecture (the “inside”), following the outside-in principle popularized by developmental psychologists. As such, cognitive evolution is viewed as part of a symbiotic relationship of culture and biology and, by extension, technology.

Cognition refers to all processes and cognitive architectures that constitute “thinking,” with the brain viewed as the biological processor used to “compute” structured cognitive algorithms that produce the observed effect of the “mind” or “consciousness.” When we examine the evolution of cognition, we look to patterns of biological, cultural, and technological change.

Although genetic code provides an important foundation for biological development, context and subsequent life experience play an arguably greater role in cognitive evolution. Genes constitute the basic biological structures that facilitate cognition but cannot be viewed as inherently producing it; continuous exposure to culture is necessary for mimetic and memetic skills to develop. These skills shape the evolution of cognition of a being. As we evolve, mimetic replicator entities (e.g., pedagogical techniques, rituals, roles, taboos, social relations) form the foundation for the subsequent assimilation/accommodation of memetic replicator entities (e.g., language, syntax, semiotic networks, paradigms, ideas, theories, etc.) into a being’s cognitive architectures.

As humans evolved, the increasing complexity of social organization and cultural mimetic skills accelerated the development of cognition. The assumption here is that, historically, the ability to develop more complex representational systems provided a social adaptive advantage (i.e., the more intelligent a human, the more they were able to adapt to context and pass on their knowledge). The implication of this assumption is that humans have advanced as far from the rest of the species on Earth as they have due to their capacity to represent experience and operate almost entirely within the immaterial realm of these shared representational systems (e.g., culture). Over time, the increasing complexity of social structures and communicative patterns led to the development of more abstract representational systems, including complex abstract signs, symbols, and syntax.

Studies of modern cognition isolate three major systems core to mental representation in the human mind/brain: permanent semantic memory (i.e., the mental lexicon), episodic memory (i.e., the current text), and working memory and attention (i.e., the current speech situation, external context). Each system is essential to processing contextual information, as well as creating “knowledge” from representational events/types and semantic relations recorded in memory (e.g., mind models). Over time, increases in information storage and processing skills facilitated creation of more elaborate ways to remember and interpret phenomena and enhance predictive capacity for outcomes.

The core mechanism for cognition is the formation of schemata. Schemata organize knowledge and experiential events into small networks of information that become activated by contextual or internal triggers and shape the response of the entity (including processing of contextual input, analysis, and decision making). Schemata are vital high-level conceptual structures or frameworks that organize prior experiences, aid in the interpretation of context, reduce cognitive load, allow us to predict or infer unknown information, and frame the semantic content of situations. They play a critical role in the development of a sense of self and/or identity of an individual, and in the way that sense of self and identity evolve over time. Schemata encompass both mimetic and memetic replicator entities and evolve over time and through the experiences of the individual.

Schema formation involves two core processes: assimilation and accommodation. The process of assimilation involves the reuse of schemata to fit new information (e.g., if a familiar object is observed, it will be integrated into the existing schemata of that object). If information is encountered that does not fit any existing schema, then a new schema is formed (i.e., the process of accommodation). Equilibrium refers to the necessarily temporary stability of schemata, when no new information is encountered that necessitates the alteration or formation of schemata. Each process constitutes the development of increasingly complex schemata, which allows for more complex knowledge to be internalized, comprehended, refined, and externalized. Arguably, each generation is able to form more complex schemata more easily within their biological timeline through the process of pedagogy (as they are able to use the accumulated cultural design knowledge base as their foundation, rather than having to re-create it each time).
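
A minimal Python sketch of these two processes (with an invented similarity threshold and toy feature sets, not a cognitive model): assimilation reuses an existing schema when the input fits well enough, while accommodation creates a new one when nothing does:

```python
# Invented schema store and similarity threshold; a sketch, not a cognitive model.
schemata: dict[str, set[str]] = {"cat": {"furry", "four legs", "meows"}}

def encounter(label: str, features: set[str]) -> str:
    for name, schema in schemata.items():
        overlap = len(schema & features) / len(schema | features)
        if overlap > 0.5:              # familiar enough: assimilate
            schema |= features         # the existing schema absorbs new detail
            return f"assimilated into '{name}'"
    schemata[label] = set(features)    # nothing fits: accommodate
    return f"accommodated as new schema '{label}'"

print(encounter("cat", {"furry", "four legs", "meows", "purrs"}))
print(encounter("snake", {"scaly", "no legs", "hisses"}))
```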

As digital designers, we constantly use schemata in design (even if we may not be consciously aware that we are). Schemata are invaluable in reducing cognitive load when dealing with familiar contexts (i.e., the development of automatic reaction patterns to stimuli, or habits); however, they are problematic in dealing with unfamiliar contexts. In an unfamiliar context, many new procedures are constructed or modified, then tested for effectiveness, and the relevant schema is changed to the new pattern. Hence, schemata are emergent (not fixed) and constantly evolve through experience. As schemata are the primary way for paradigms to emerge and be inherited, a change in schema is needed for an existing paradigm to change or a new paradigm to emerge. The challenge with evolving schemata is enabling change and (re)formation at a pace that is comfortable for the individual and, at the same time, useful for their functioning in their context. Evolutionary trends of recorded history indicate that the “new” has been the adaptive advantage (across all media), while the “old” created significant limitations as the surrounding context changed and limited adaptation to those changes.

Digital media has arguably been responsible for greater and quicker schema evolution than any other medium in recorded history. Not only has digital media allowed us to become aware of and compare the dominant paradigms constituting local cultures and ways of life, but also to form entirely new paradigms and subcultures that originated in the digital realm. Digital media enabled us to do this not only through rapid comparison of meaning systems but also by letting us compare how the meaning was put together. For example, the Google search engine changed how we think through the very process of using the product. How we learned to access and structure information through using the search engine altered how we organize information in our minds. Social media has altered how we think of social networks and what attributes we value in them. Forums and chat rooms have altered the way that we express ourselves, as well as the complexity and form of the language that we use online and offline. Most importantly, digital has fundamentally changed our relationship with the world and with ourselves by redefining our perception of our role and place within it.

A practical example of how designers use schemata is the global UX and UI standards that have emerged over the last two decades. As each digital product entered the market, we as designers learned from its success or failure, and each new product we created borrowed design elements and interaction patterns from the previous ones. Our aim was to improve what we created (even if only a little). Over time, the UX and UI standards that have emerged enable users to quickly figure out what to do in each digital product without needing a lengthy tutorial (as was common in the early days of digital technology). This is a practical example of the accumulation of the technical design knowledge base of digital products, one that is also reflected in the simultaneous coevolution of technology and cognitive schemata.

Evolution of Our “Selves”

Based on the meta-theoretical framework of Universal Darwinism, this chapter discussed compatible evolutionary perspectives on the symbiotic evolutionary trajectories of biology, technology, culture, and cognition. These evolutionary trajectories share a common core: the process of evolution is constituted by patterns of information creation and refinement through variation, selection, and inheritance. In other words, we can observe that what evolves across disparate media is essentially information.

Through the perspective outlined in this chapter, we can conclude that a significant constituent of what makes us “human” is the way that we create, store, process, refine, and share information. This capacity has been iteratively inherited, refined, and selected within the human species through the symbiotic processes of biological, technological, cultural, and cognitive evolution. We need to consider this in digital design and ensure that we are designing for users using assumptions that are reflective of their state of development, their needs, and their aspirations. We also need to constantly be aware that our design decisions fundamentally alter our evolutionary trajectory and ensure that our designs enable and reflect the future that we want to inhabit.

Biological evolution provides the most historically stable example, as genetic code has limited methods of propagation and can be traced through a common medium. The unit of heredity is a gene (i.e., a segment of information on the construction and function(s) of a biological entity). Cultural, technological, and cognitive evolution are more challenging to analyze, as the media across which replicator entities circulate encompass both biological and technological components. However, we can still analytically trace these replicator entities across media through two broad types: memes and mimes. Using these in our analysis, we can observe that what we think about, how we think about it, and what we do with this information has largely been a product of our social reality rather than the “natural” world from which we have historically separated ourselves through technology and culture.

As digital designers, we have a responsibility now more than ever to create products that enable us to shift our evolutionary trajectory in the direction we want to go. Digital technology is an extension and amplification of us and of our abilities, such that we are now able to do, think, and feel in ways we could not through biological affordances alone. Arguably, without technology, human beings would still exist in a primitive culture, passing down knowledge directly from entity to entity via mnemonic devices, stories, myths, and rituals. Technology is core to comprehending the passage of human evolution as it simultaneously reflects the development of the human species and enables it; innovation, over time, provides the necessary conditions for further innovation. Furthermore, it allows us to reimagine who we are, our role and purpose, and to consciously shape our evolutionary trajectory.

Digital designers have a responsibility to design products that are meaningful and useful for our users and that also create a positive holistic impact. To do this, we need to really get to know our users. So let’s jump in and go to the core of modern human beings: the development of our sense of self, focusing specifically on identity (a core driver in our decisions and preferences). Who we consider ourselves to be has become inextricably linked to the technology that we use and the reflection of ourselves that we are able to cultivate through it. The next chapter deepens our exploration of how and why we are what/who we are and do what we do, using human identity as the core driver of the modern human being. We next look at digital users specifically and begin to come up with ways to design digital products that fit seamlessly into their/our lives and create as much holistic value as possible.
