6
The Human Body

Jean-Baptiste Poquelin – or Molière, as he is more widely known – (1622–1673) is one of the most famous French playwrights in the world. Performing the main role in most of his plays, he explored all the resources of comic theater. Suffering from a lung disease, he collapsed on stage while performing his last play, Le Malade Imaginaire, and died shortly afterwards [MOL 73]. Molière was wary of the doctors of his time: the remedies they proposed had as much chance of curing a patient as of hastening his death. The figure of the aging practitioner imbued with outdated knowledge appears in several of his plays. Such is the character of Diafoirus, administering care to Argan, a hypochondriac who, for his part, dreams of becoming a doctor. The final tableau of the play features Argan as he takes the examination that opens the doors of the profession. To the university teachers who question him about cures for all kinds of diseases, his answer is invariably the same: enema, bleeding and purging. What if the disease persists? The same cure! These answers earn him the ovations of the jury, a chorus enthusiastic to welcome a new member worthy of practicing the respectable profession. To reinforce the solemnity of the examination, and to underline with humor and irony the ridicule of the characters, Molière wrote it in Latin:

“SECONDUS DOCTOR – Quæ sunt remedia quæ in maladia ditte hydropisia convenit facere?

BACHELIERUS – Clysterium donare postea seignare ensuitta purgare.

TERTIUS DOCTOR – Quæ remedia eticis, pulmonicis, atque asmaticis trovas à propos facere.

BACHELIERUS – Clysterium donare postea seignare ensuitta purgare.

QUARTUS DOCTOR – Et pena de respirare: Veillas mihi dire, Docte Bacheliere, Quid illi facere?

BACHELIERUS – Clysterium donare postea seignare ensuitta purgare.

QUINTUS DOCTOR – Si maladia opiniatria non vult se garire quid illi facere?

BACHELIERUS – Clysterium donare postea seignare ensuitta purgare, reseignare, repurgare, et reclysterizare.

CHORUS – Bene, bene, bene, bene respondere: dignus, dignus est entrare in nostro docto corpore!” [MOL 73]1.

6.1. A digital medicine

The medicine that Molière mocks is not that of the 21st Century: like many other sciences, it has broken with Diafoirus through innovative approaches. It evolves in relation to other areas of research and knowledge, and benefits from the contributions of new techniques.

Applied and computational mathematics extend the knowledge accumulated by physicians over a long period of time (Figure 6.1). They are an integral part of the techniques used in health research, which sits at the interface between medicine, physiology, biomechanics, physics, biology and imaging. As explained by Anne-Virginie Salsac, researcher at the CNRS*:

“Three main methods now make it possible to increase the understanding of living organisms: ‘in vitro’ experiments (on experimental devices representative of living organisms), ‘in vivo’ tests on animal models (mice, pigs, sheep, etc.) or human models (clinical tests, etc.), and ‘in silico’ modeling. The latter has two main objectives: the first is to contribute to the understanding of the mechanical behavior of the human body under physiological and pathological conditions, helping to explain degenerative processes and to support the diagnosis and planning of medical procedures; the second is to allow the development of medical devices (prostheses, catheters, heart valves, etc.).”


Figure 6.1. Excerpt from Jean-Baptiste Sénac’s book, Traité de la structure du cœur, de son action et de ses maladies, Éditions Méquinon, Paris, 1783

(source: www.biusante.parisdescartes.fr)

In vivo experiments have been part of biomedical research since its earliest origins: in ancient times, the first Greek scientists interested in medicine, such as Aristotle (384–322 BC) and Erasistratus (304–258 BC), learned through experiments on live animals. Ethical debates on animal testing have been ongoing since the 17th Century. Nowadays, the sacrifice of animals, made necessary for certain research tests, is the subject of strong criticism from groups of citizens involved in animal protection and the defense of their rights. Various countries, including France, are legislating to regulate practices and, if they cannot yet do without them completely, to ensure animal welfare [HAJ 11]. Numerical simulation helps to integrate these ethical concerns into scientific research, as Jean-Frédéric Gerbeau, a researcher at INRIA*, explains:

“Digital simulation makes it possible to prepare in vivo test campaigns and significantly reduce the use of experiments on living organisms. It also offers researchers the possibility of developing alternative models to living organisms, by validating ‘in vitro’ devices or by carrying out ‘in silico’ experiments.”

Let us also note that in vivo tests include experiments to which human beings consent. The development of cardiac prostheses and their approval by health authorities requires the demonstration of their safety and robustness, established in particular by implantations on voluntary patients [COU 18, EDW 18].

The difficulties in modeling the human body are as follows:

  • the mechanical behavior of human tissues and organs is heterogeneous and anisotropic – and, in most cases, it is necessary to use “complex” behavior laws (blood behaves like a non-Newtonian fluid and liver deformations are described by a viscoelastic model, for example; a sketch of one such law follows this list);
  • the geometry of the human body varies greatly from one patient to another and is not standardized: it is necessary to develop algorithms capable of quickly producing models adapted to each patient by identifying the different anatomical parts (skin surface, bone, fat, muscles, etc.) in medical imaging data;
  • the boundary conditions to be applied to a biomechanical model (forces on a tissue, muscle or tendon; pressure or velocity of air, blood, etc.) are not easily accessible to modelers. They also have a great influence on the result of the calculation, and one of the major challenges is to represent the various biophysical phenomena at the boundaries of the models as realistically as possible;
  • the reference configuration of the human body is not easily characterized. Operating at slightly higher pressure than its environment, the body is in a naturally deformed configuration. The shape of the blood vessels “at rest”, in the absence of a stress field, is not known, and a return to a neutral configuration is not easily achieved.
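Blood’s non-Newtonian character, mentioned in the first point, is often described by a shear-thinning law such as the Carreau model. Below is a minimal sketch in Python; the parameter values are representative figures from the literature, not patient data:

```python
import numpy as np

def carreau_viscosity(shear_rate, mu0=0.056, mu_inf=0.00345,
                      lam=3.313, n=0.3568, a=2.0):
    """Apparent blood viscosity (Pa.s) as a function of shear rate (1/s).

    Shear-thinning behavior: viscosity decreases from mu0 at rest towards
    mu_inf at high shear rates. Parameters are literature orders of
    magnitude for blood, not patient-specific measurements.
    """
    shear_rate = np.asarray(shear_rate, dtype=float)
    return mu_inf + (mu0 - mu_inf) * (1.0 + (lam * shear_rate) ** a) ** ((n - 1.0) / a)

# Viscosity drops by an order of magnitude between slow and fast flows:
print(carreau_viscosity([0.1, 10.0, 1000.0]))
```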

In the biomechanical field, simulation is still a research and development tool, as Anne-Virginie Salsac summarizes:

“With the tools currently available, researchers are calculating realistic biomechanical quantities, but we are only at the beginning of their use in hospitals. Generalizing their use requires rapid modeling and execution: developing efficient and robust computational codes is a task that nowadays occupies specialists in numerical biomechanics.”

In addition, a legal dimension arises: in order to generalize the use of simulation tools in the biomedical field, it is necessary to demonstrate their robustness, in the same way that calculation codes are qualified in industry, and to adapt current regulations accordingly – a task that has yet to be undertaken. However, biomechanical modeling has been undergoing continuous development in recent years, and the future of numerical simulation lies in the life sciences, according to Édouard Lété, expert in numerical simulation at Dassault Systèmes and operational director of Digital Orthopaedics:

“The new ‘frontier’ of numerical simulation is that of life sciences! It combines many of the latest innovations in scientific computing, such as the modeling of ‘non-linear’, ‘multi-physical’ or ‘multi-scale’ phenomena. Many mathematical models developed in fundamental sciences, such as soil or fluid mechanics, make it possible to accurately represent certain situations encountered in the human body.”

The challenge of simulations is to develop digital models of organs – such as the heart (Figure 6.2), liver, brain or joints – to accompany the practice of surgeons, for example by correlating a symptom with a treatment or by simulating an operation.


Figure 6.2. “In silico” model and new generation “in vitro” model. For a color version of this figure, see www.iste.co.uk/sigrist/simulation2.zip

COMMENT ON FIGURE 6.2.– Numerical simulation makes it possible to calculate realistic physical and physiological quantities for the organs studied and thus contributes to representing the functioning of some of them. “About 15 digital heart model projects are underway in the world,” explains Édouard Lété. The “Living Heart Project”, initiated by the French company Dassault Systèmes, is one of them. The model is based on the electromechanical behavior of the organ, allowing for simulations that reproduce the natural beating of the heart. The simulation also permits virtual experiments to be carried out, for example to develop models of synthetic organs. The latter are potentially useful in the preparation of a surgical procedure or in the production of prostheses. The choice of materials and manufacturing processes is based on the analysis of medical data and on a calculation that can provide the mechanical characteristics representing human tissues. “These techniques, which practitioners discover at scientific conferences, may help to renew their practices for the benefit of their patients. However, they are still at a research stage today,” explains Pierre-Vladimir Ennezay, a cardiologist specializing in vascular diseases.

6.2. Medical data

Since medical data are one of the key elements in the construction of a digital model, let us start by briefly discussing some methods for obtaining them from living organisms.

6.2.1. Medical imaging

Medical imaging has made tremendous progress in recent decades and is helping to change the practice of many medical specialists. Several techniques provide access to increasingly accurate data on patients’ health status or on the functioning of organs that are still largely unknown, such as the brain.

Ultrasound imaging, based on the analysis of ultrasound propagation in human tissues, is used by cardiologists, particularly in emergency settings, to assess quantities such as cardiac output or valve filling rate. The examination provides rapid diagnostic data (Figure 6.5).


Figure 6.5. Cardiac ultrasound

(source: www.shutterstock.com)

Magnetic resonance imaging (MRI) uses the magnetic properties of matter:

  • Anatomical MRI allows us to observe the resonance of hydrogen nuclei, which are present in abundance in water and fats in biological tissues, under the effect of an intense magnetic field. With MRI, it is possible to visualize the structure of an organ: this method can be used to diagnose cancerous tumors or to locate certain malformations (for example, in the brain, those that cause epilepsy). It makes it possible to construct an image of the chemical composition of the biological tissues explored (Figure 6.6), and thus of their nature and distribution in a living organ.

Figure 6.6. Cardiac MRI

(source: www.123.rf.com/Weerapong Tanawong-udom)

It requires more substantial resources and offers potentially more accurate information than ultrasound, such as analysis of the dimensions of the heart (overall volume, valve size, position of arteries, etc.) or tissues (to identify those that would be poorly perfused, for example). Non-invasive and painless, MRI also makes it possible to accurately describe the movement of the heart, such as torsion or contraction, and to deduce certain mechanical properties of this muscle.

  • Diffusion MRI is a powerful tool for measuring the movements of water molecules at the microscopic level. It is used, for example, to establish the fine architecture of neural tissue in the brain and to determine its variations at scales below the millimeter (Figure 6.7). Jean-François Mangin, a researcher at the NeuroSpin brain study center and an expert in this technique, explains:

    “This imaging technique works like a real probe to understand the anatomical structures of the brain and other organs at microscopic scales. Decoding the images produced by the signals of molecular agitation of water in living tissues remains a difficult task. Numerical simulation can help experts develop data processing algorithms: by varying different parameters in tissue modeling, we understand their influence on the signal to be analyzed.”
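To give an idea of the kind of model involved, the attenuation of the diffusion MRI signal is classically described by the Stejskal-Tanner relation, S/S0 = exp(−bD). A minimal sketch, assuming typical textbook values for the apparent diffusion coefficient D:

```python
import numpy as np

def diffusion_signal(b_values, D):
    """Stejskal-Tanner attenuation S/S0 = exp(-b*D).

    b: diffusion weighting (s/mm^2), set by the imaging sequence.
    D: apparent diffusion coefficient (mm^2/s), a tissue property.
    """
    return np.exp(-np.asarray(b_values, dtype=float) * D)

b = [0.0, 500.0, 1000.0, 2000.0]
# Textbook orders of magnitude: free water vs. white matter
for D in (3.0e-3, 0.7e-3):
    print(f"D = {D} mm^2/s:", diffusion_signal(b, D).round(3))
```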


Figure 6.7. Identification of the main fiber bundles of the brain based on diffusion MRI (source: Jean-François Mangin, Vincent El Kouby, Muriel Perrin, Yann Cointepas, Cyrille Poupon/Commissariat à l’Énergie Atomique, NeuroSpin center). For a color version of this figure, see www.iste.co.uk/sigrist/simulation2.zip

COMMENT ON FIGURE 6.7.– Diffusion magnetic resonance imaging is used to map the brain’s “information highways”. These are the large bundles of fibers that allow neurons from different brain regions to communicate. The NeuroSpin center was created in France by the CEA more than 10 years ago to support the development of certain life sciences, including the neurosciences. The latter are undergoing a significant evolution in their practices, due to the development of digital techniques. “This evolution is similar to that experienced by physics with the emergence of means such as particle accelerators,” explains Jean-François Mangin. “With the development of magnets capable of generating intense magnetic fields, it becomes possible to significantly improve the resolution of images from diffusion MRI and to ‘zoom into the brain’ at unprecedented scales. More than a century ago, the first studies on the brain made it possible to understand, by means of dissections, how connections between brain zones are organized. Diffusion MRI allows smaller scale mapping of connections, providing a detailed understanding of how different brain regions communicate with each other. Brain imaging also helps to change research in psychiatry. While brain structure varies from one individual to another, the analysis of imaging data can identify characteristics specific to certain individuals, such as those who have experienced atypical neurobiological development. With diffusion MRI, for example, a first map of brain connections ‘at short distance’ was established, and the researchers found that these connections were underdeveloped in people with Asperger’s syndrome. Beyond the genetic factors influencing the expression of a pathology, imaging offers the possibility of understanding the origin of certain neurological diseases by means of their signature in the brain structure. A first step towards being able to treat them.” The development of these imaging tools requires the collaboration of different experts: physicists, computer scientists and researchers in numerical techniques thus contribute equally with clinicians and physicians to the development of brain sciences.

  • The electroencephalogram (EEG) gives access to the electrical signature of the brain, and to the contribution of the neural areas involved in a movement or a perception, for example. This signature is obtained by placing electrodes on the surface of the skull (Figure 6.8).
  • Magnetoencephalography (MEG) makes it possible to monitor the activity of groups of neurons with a very fine temporal resolution, in the millisecond range – allowing complex mechanisms to be studied with great precision. This involves measuring variations in magnetic field intensity resulting from brain activity. MEG requires significant resources, currently reserved for research or medical expertise centers (Figure 6.9).

Figure 6.8. Electrodes placed on a skull to perform an EEG

(source: www.123rf.com/Dmitriy Shironosov)


Figure 6.9. Implementation of the MEG at the CEA NeuroSpin Centre

(source: © P. Stroppa/CEA)

COMMENT ON FIGURE 6.9.– No less than 300 sensors, with a sensitivity of the order of a femtotesla (10⁻¹⁵ tesla), continuously record the magnetic fields emitted by the currents circulating in the brain. Brain activity is thus located in space (on a square-millimeter scale) and in time (on a millisecond scale), giving access to the dynamics of the brain’s information processing. In its NeuroSpin center, the CEA is also developing a new generation of MEG based on new magnetic sensors, providing access to even finer spatial and temporal resolutions.

These measurement and visualization techniques allow us to collect useful data in order to understand the complexity of the brain, linking it to the way we react or behave in given situations. They also provide an understanding of how certain areas can be damaged or how they synchronize – and what the consequences are for brain function. They can help to understand and prevent certain brain diseases or help a surgeon prepare for surgery. Visualizations of brain activity remain largely intended for study purposes: “brain imaging […] leads to a map of the brain: it tells us ‘where it happens’ in the brain – but does not really explain to us ‘what is happening’ and ‘why’” [WAA 17].

6.2.2. Genetic information

Inspired by a speculative novel by the British writer Aldous Huxley (1894–1963), the film Gattaca imagines a society based on DNA testing. Orchestrated by the powerful Gattaca Aerospace Corporation, it maintains a genetic segregation of individuals [NIC 97]. In the perfect world of Gattaca, choosing a partner and accessing a profession and a social status are based on purely rational criteria. Vincent (Ethan Hawke), deemed inferior due to an imperfect genetic heritage, dreams of taking part in the space missions open to humans with exceptional physical and intellectual qualities. He deceives the controls put in place by the eugenicist institution with the help of Jerome (Jude Law), who possesses all the required genetic qualities – Jerome was literally designed for this purpose… Confined to a wheelchair following an accident, Jerome is nevertheless unable to express them. He gives some of his genetic traces (blood, urine, bits of skin, etc.) as a contribution to Vincent’s plans – who, in turn, gives him a reason to live. At first, the fraud goes unnoticed; unforeseen events and suspicions later threaten to expose it, jeopardizing Vincent’s project. Fortunately for him, a final human decision is the flaw in a system where technology aims to leave nothing to chance. The conclusion of this filmed fiction allows one of its protagonists to realize his dream – and the spectator to keep hope, taking with him many questions about a possible and desirable future.

Genetic engineering tools are nowadays a reality, developed in many laboratories: the CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) system is one of them. It allows the DNA of plants and animals to be modified with great precision and can theoretically be used to prevent certain genetic diseases of the embryo [REG 15]. In 2018, research in China suggested that this tool had become operational for manipulating human DNA: have the first children with a tailor-made genetic heritage, resistant to the AIDS virus, already been born? The announcement of the work of the Chinese researcher He Jiankui provides an answer and raises questions about the future of humanity [REG 18].

The DNA molecule encodes genetic information in a four-letter alphabet, that of the nucleotide bases that constitute it: A (adenine), T (thymine), C (cytosine) and G (guanine). It was discovered in the cells of living beings at the end of the 19th Century by the Swiss biologist Friedrich Miescher (1844–1895) and for a long time remained an enigma [DAH 08]. In 1953, the British biologist Francis Crick (1916–2004) and the American geneticist James Watson showed that DNA has a double helix structure [JAC 87]. Within the double helix, the DNA bases are organized in pairs, A with T and C with G. Crick and Watson deduced the structure from X-ray diffraction images obtained by the physicists Maurice Wilkins (1916–2004) and Rosalind Franklin (1920–1958). This discovery earned the three men the Nobel Prize in Medicine in 1962 – Franklin’s contribution was long overlooked and has only recently been highlighted [LAW 18].
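The pairing rule lends itself to a minimal illustration in code: computing the strand paired with a given sequence (the sequence below is an arbitrary example, not real genomic data):

```python
# Base-pairing rule: A with T, C with G.
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complementary_strand(sequence: str) -> str:
    """Return the paired strand, read in the conventional direction
    (i.e. the reverse complement of the input)."""
    return "".join(COMPLEMENT[base] for base in reversed(sequence.upper()))

print(complementary_strand("ATGCGTAC"))  # -> GTACGCAT
```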

These DNA characteristics help us understand how genetic information is copied and transmitted. Partly responsible for who we are as living beings, genetic information can also help predict what we will become – in particular, the probability of developing a disease when it is linked, for example, to DNA damage. The genetic human heritage, consisting of some 30,000 genes, has been accessible to scientists since 2003 thanks to the development of sequencing techniques (Figure 6.10).


Figure 6.10. Sequencing techniques allow scientists to decode the human genome from the molecular structure of the DNA. For a color version of this figure, see www.iste.co.uk/sigrist/simulation2.zip

COMMENT ON FIGURE 6.10.– The figure represents a molecular model of DNA, organized in a double helix. The atoms are represented by colored beads: carbon in white, oxygen in red, phosphorus in purple and nitrogen in blue (to lighten the figure, hydrogen, an atom present in abundance in DNA, is not represented). The background of the image shows the fluorescent bands corresponding to the genes carried by the molecule, as scientists can obtain them from an automatic sequencing machine (source: Peter Artymiuk/Wellcome Collection/www.wellcomecollection.org).

An emerging discipline, genomic medicine aims to identify certain genetic abnormalities in patients in order to provide them with targeted treatment. It is beginning to transform the way in which diseases are prevented, diagnosed and treated, and the way their progression is predicted. Sequencing DNA is a health issue with multiple aspects: technical, economic and ethical. The high-performance computing resources used in numerical simulation also serve DNA sequencing techniques, which are undergoing a real revolution and are now bringing genomics into the era of data processing [JOL 17, QUI 17].

For many countries, genomic medicine appears to be a major public health issue because it revolutionizes the development of clinical research, therapeutic management, the care pathway and, therefore, the organization of public health. In 2016, France deployed the “France Genomics 2025” plan, which includes producing several dozen petabytes of data per year, including data from DNA sequencing, in order to contribute to the development of genomic medicine. The latter potentially involves all stakeholders in the health field (attending physician, university expert, biological analyst and, of course, the patient himself and his family).

6.3. Mechanical behavior of muscles and organs

Modeling the human body helps to understand and heal it. The finite element technique, particularly suited to mechanics in general (Chapter 1 of the first volume), contributes to the development of models of human body behavior [LUB 15], whose complexity defies standardization. This uniqueness is partly what makes the human body beautiful, and partly what makes it difficult to propose a mechanical model of it.


Figure 6.11. Finite element model of a leg’s soft tissue (source: image owned by Texisense and the TIMC-IMAG laboratory). For a color version of this figure, see www.iste.co.uk/sigrist/simulation2.zip

The mathematical models for the mechanical behavior of living tissues are complex, and the elasticity characteristics depend on each patient: muscles, fat, tendons and cartilage react very differently when external stresses are applied to them. The rigid parts of the body – mainly the skeleton – undergo small deformations, while the organs can deform much more significantly. Depending on the objectives, simulations rely on elastic models, valid for example to describe bone strength, or on viscoelastic models, suited to soft tissue and organ deformations. In the latter cases, the models are nonlinear and their calculation quickly becomes complex and expensive.
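To fix ideas, the simplest one-dimensional viscoelastic law is the Kelvin-Voigt model, which adds a viscous term to Hooke's elastic law: σ = Eε + η dε/dt. The sketch below uses illustrative parameter values; real tissue laws are three-dimensional and nonlinear:

```python
import numpy as np

def kelvin_voigt_stress(strain, dt, E=5.0e3, eta=50.0):
    """1D Kelvin-Voigt model: stress = E*strain + eta*d(strain)/dt.

    E (Pa) and eta (Pa.s) are illustrative orders of magnitude for a
    generic soft tissue, not measured values.
    """
    strain = np.asarray(strain, dtype=float)
    strain_rate = np.gradient(strain, dt)
    return E * strain + eta * strain_rate

t = np.linspace(0.0, 1.0, 101)
strain = 0.1 * np.sin(2.0 * np.pi * t)          # imposed cyclic deformation
stress = kelvin_voigt_stress(strain, dt=t[1] - t[0])
```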

In order to be used in the preparation of an operation, simulations must give the most accurate results possible (and in some situations, with the shortest possible calculation times). Their constraints are numerous: in particular, it is a question of producing a model that can be used by practitioners, subject to the imperatives of urgency. Biomechanical researchers propose simulations based, on the one hand, on the rapid generation of meshes (for a model specific to each patient, adapted to his morphology) and, on the other hand, efficient calculation methods (close to real time, allowing for example interactive simulations) [PAY 14].

What solutions are proposed by mechanics to overcome these difficulties? Algorithms capable of quickly producing models adapted to each patient, identifying the different anatomical parts (skin surface, bone, fat, muscles, etc.), as in the sketch below. By aggregating the imaging data (scanner, MRI) of the body part concerned, a calculation model can be produced in a few minutes. With a calculation tool, it is then possible to evaluate a state of stress in a muscle or part of the body (Figure 6.12), and to anticipate, prevent and treat contact pain, felt when a weakened foot is placed on the ground, for example [LUB 14].
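As a deliberately crude illustration of this labeling step, tissue classes can be assigned from image intensities by simple thresholding (the intensity ranges below are rough CT orders of magnitude; real patient-specific pipelines use far more sophisticated segmentation algorithms):

```python
import numpy as np

def label_tissues(hu):
    """Map a CT-like intensity array (Hounsfield units) to integer labels:
    0 = air, 1 = fat, 2 = soft tissue, 3 = bone. Thresholds are rough
    textbook ranges, for illustration only."""
    hu = np.asarray(hu, dtype=float)
    labels = np.zeros(hu.shape, dtype=np.int8)
    labels[(hu >= -120) & (hu < -60)] = 1   # fat
    labels[(hu >= -60) & (hu < 300)] = 2    # muscle and other soft tissue
    labels[hu >= 300] = 3                   # bone
    return labels

print(label_tissues([[-1000, -90], [40, 1200]]))  # -> [[0 1] [2 3]]
```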


Figure 6.12. Pressure calculated in a foot under pressure on the ground (source: image owned by Texisense and the TIMC-IMAG laboratory). For a color version of this figure, see www.iste.co.uk/sigrist/simulation2.zip

COMMENT ON FIGURE 6.12.– “Customized models in orthopedics are in some respects more mature than those in cardiac surgery,” comments Édouard Lété. “Orthopedic modeling requires comparatively less accurate medical imaging, performed on an organ that does not move, and is not subject to the requirements of urgency. The calculation makes it possible to evaluate the stresses in tissues and cartilage, which give a good picture of the comfort or pain experienced by a patient. Simulation allows practitioners to imagine therapeutic solutions and to anticipate the possible consequences of an intervention.”

Each organ of our body has a unique function and, at the same time, interacts in different ways with the others. Mechanical interactions are the simplest; biological or physiological interactions are more complex. Modeling requires a good understanding of these interactions. Characterizing living organs is thus one of the crucial issues in biomechanics. Researchers collaborate to build databases and knowledge relevant to their practice – in particular, to select the appropriate behavior laws for modeling each living organ [PAY 17]. These data are useful for building simulation models, for example to rehearse a surgical gesture on a digital model of an organ or on a three-dimensional model obtained by 3D printing (Figure 6.2).

6.4. Blood circulation

In Molière’s century, the English physician William Harvey (1578–1654) discovered the laws of blood circulation and published a summary of his work in 1628 (Figure 6.13).


Figure 6.13. Excerpt from William Harvey’s book, Exercitatio anatomica de motu cordis et sanguinis in animalibus, Fitzer Publishing, Frankfurt am Main, 1628

(source: www.biusante.parisdescartes.fr)

Based on experimental anatomy, Harvey analyzes the data at his disposal. By studying hearts of all shapes, he measures the average amount of fluid contained in its cavities and the rate of heartbeats. According to his calculations, the heart moves nearly 250 kg of blood per hour. Set in motion by the heart pump, the blood circulates within a large network of arteries and veins whose functioning he discovers. The flow of blood through the body is divided into two circuits:

  • the pulmonary circulation ensures the exchange of oxygen and carbon dioxide in the lungs. From the right ventricle of the heart, blood low in O2 and high in CO2 is sent to the lungs via the pulmonary artery. There, it releases carbon dioxide and takes up oxygen. Returning to the heart through the pulmonary veins, it arrives in the left atrium and passes into the left ventricle;
  • the systemic circulation provides the oxygen necessary for cellular metabolism. From the left ventricle of the heart, blood is sent throughout the body via the aorta and then the arterial system. It then returns to the heart through the venous system, enters the right atrium and passes into the right ventricle.

Blood is composed of 55% plasma and 45% cells (red blood cells, white blood cells and platelets). Plasma, made up of water and mineral salts, ensures the transport of blood cells, nutrients of food origin (sugars, amino acids, etc.), proteins (hormones, antibodies, etc.) and metabolic waste. Red blood cells are rich in hemoglobin, the protein that can carry oxygen or carbon dioxide, while white blood cells are responsible for the body’s defense. Platelets are responsible for blood clotting. Blood rheology is thus complex, but, as a first approximation, the Newtonian fluid model is suitable for simulations in large arteries. However, blood flow models cover a wide range of vessel sizes and characteristic velocities. As we will see in the following sections, low Reynolds numbers occur in microcirculation, while flows in large arteries involve higher Reynolds numbers: the physics of flow is thus very different from one situation to another.
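This difference can be quantified by the Reynolds number, Re = ρUd/μ, the ratio of inertial to viscous forces. A short sketch with rough literature values (note that the constant Newtonian viscosity used here is itself questionable in the smallest vessels):

```python
def reynolds(velocity, diameter, density=1060.0, viscosity=3.5e-3):
    """Re = rho*U*d/mu, with typical SI values for blood."""
    return density * velocity * diameter / viscosity

# Rough orders of magnitude (mean velocity in m/s, diameter in m):
for name, (u, d) in {"aorta": (0.4, 0.025),
                     "arteriole": (5.0e-3, 3.0e-5),
                     "capillary": (5.0e-4, 8.0e-6)}.items():
    print(f"{name}: Re ~ {reynolds(u, d):.1e}")
```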

6.4.1. Blood microcapsules

Some treatments for cardiovascular diseases, such as myocardial infarction or chronic heart failure, are based on angiogenesis, a therapeutic solution that stimulates the growth of blood vessels. Angiogenesis takes advantage of the targeted diffusion of growth factors that promote the migration and proliferation of vascular cells.

In order to optimize the process, the treatment uses microcapsules to ensure the controlled release of substances promoting the angiogenic process. Consisting of a membrane that isolates their contents from the external environment, microcapsules are small enough to circulate in the vascular network and release their contents where needed [BAN 11]. Numerical simulation helps to size such microcapsules. Anne-Virginie Salsac and her colleagues have developed calculation codes adapted to microfluidic flow conditions. The researcher explains the purpose of the simulations:

“As they pass through certain areas of the vascular system, microcapsules can undergo significant deformations. The simulation seeks to ensure, for example, that their integrity is not compromised in these situations, or that they do not clog the vessels in which they circulate. The flows involved are controlled by a balance between viscous and pressure forces, and by strong interactions with the deformation of the membrane.”

The numerical simulation, validated by means of dedicated experimental devices, uses a calculation code developed for this class of flows with very low Reynolds numbers. The calculations make it possible to represent various conditions, close to those encountered in physiological flows, and to understand how a capsule deforms under the shearing of the blood layers. They provide useful data for selecting membrane materials, their strength or the shape of the capsule to withstand flow conditions (Figures 6.14 and 6.15).


Figure 6.14. Flow of a microcapsule (100 microns wide) in a channel of the same size with a sudden widening (transition to a rectangular channel [SEV 16] or Y-shaped bifurcation [CHU 13]). For a color version of this figure, see www.iste.co.uk/sigrist/simulation2.zip


Figure 6.15. Numerical simulation of ellipsoidal microcapsules in a single shear flow [DUP 13, DUP 16, VAL 11]

COMMENT ON FIGURE 6.15.– The figure gives examples of the successive shapes taken by the same ellipsoidal capsule subjected to a simple shear flow, as encountered in the flow of physiological fluids such as plasma. Three possible flow regimes are observed, corresponding to different shear rates of the flow: rotation of the capsule as a solid body, transition, and rotation of the membrane around the capsule contents in a so-called “tank-treading” movement.
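The regime obtained is governed in particular by the ratio of viscous stresses to the elastic resistance of the membrane, often expressed as a capillary number Ca = μγ̇a/Gs. A sketch with hypothetical orders of magnitude (none of these values come from the studies cited above):

```python
def capillary_number(viscosity, shear_rate, radius, membrane_modulus):
    """Ca = mu * gamma_dot * a / G_s: viscous forces over membrane elasticity.

    viscosity (Pa.s), shear_rate (1/s), radius (m), membrane modulus (N/m).
    """
    return viscosity * shear_rate * radius / membrane_modulus

# Hypothetical example: plasma-like fluid, 50-micron capsule, soft membrane
print(capillary_number(1.2e-3, 1000.0, 50.0e-6, 1.0e-2))
```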

6.4.2. Angioplasty simulation

Angioplasty is one of the most common techniques in minimally invasive surgery. It consists of restoring failing arterial circulation in a narrowed vessel by dilating it with an inflatable balloon, introduced endovascularly from one of the vessels passing through the patient’s groin. It may also be accompanied by the installation of a prosthesis, or stent, a metal spring that holds open the vein or artery being treated (Figure 6.16).


Figure 6.16. Principle of angioplasty

(source: www.123rf.com/ Roberto Biasini)

Numerical simulation is of interest for predicting the potential success of an intervention:

“The simulation is based on a numerical model built from medical imaging data: the patient-specific geometry of the vessel concerned can thus be imported into a calculation code to solve the equations controlling blood flow and vessel deformation. In addition to the construction of a ‘realistic’ geometry, i.e. the one that is most faithful to the patient’s anatomy, the calculation presents two major difficulties: on the one hand, the pulsatility of the flow and, on the other hand, the deformation of the vessels, which locally influences the blood flow. This is a problem of ‘fluid/structure interaction’, also encountered in other industrial problems…”
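The full three-dimensional fluid/structure problem is beyond a few lines of code, but reduced-order models of pulsatile flow are classically used to supply boundary conditions for such simulations. One of the oldest is the two-element Windkessel model, C dP/dt + P/R = Q(t), which lumps the compliance of the vessels into a capacitance C and the downstream resistance into R. A sketch with illustrative parameter values, not patient data:

```python
import numpy as np

def windkessel_pressure(q, t, R=1.0e8, C=1.0e-8, p0=1.0e4):
    """Integrate C*dP/dt + P/R = Q(t) by explicit Euler.

    R (Pa.s/m^3), C (m^3/Pa) and p0 (Pa) are illustrative values chosen
    so that the mean pressure comes out in a physiological range.
    """
    p = np.empty_like(t)
    p[0] = p0
    for i in range(1, len(t)):
        dt = t[i] - t[i - 1]
        p[i] = p[i - 1] + dt * (q[i - 1] / C - p[i - 1] / (R * C))
    return p

t = np.linspace(0.0, 2.0, 2001)                           # two cardiac cycles (s)
q = 5.0e-4 * np.clip(np.sin(2.0 * np.pi * t), 0.0, None)  # half-sine ejection
p = windkessel_pressure(q, t)                             # pressure in Pa
```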

This type of simulation is made possible after a long period of development work, making it possible to have calculation algorithms adapted to the specificities of fluid/structure interaction for biomedical applications (Box 6.1). In order to validate the simulation, researchers are developing dedicated devices, such as a silicone vein model based on imaging data (Figure 6.17).


Figure 6.17. Experiments on a silicone model make it possible to validate the simulations [DEC 14]

Its geometric and mechanical characteristics are known with great precision and the calculation can reflect them more easily than on living organisms. The simulation compares flow conditions before and after the intervention (Figure 6.18).


Figure 6.18. Numerical simulation of blood flow in an arteriovenous fistula [DEC 14]. For a color version of this figure, see www.iste.co.uk/sigrist/simulation2.zip

COMMENT ON FIGURE 6.18.– The figure shows the blood flow in the arm vessels of a patient with renal failure treated by hemodialysis. To allow treatment, i.e. to bring the blood to the hemodialysis machine for filtration, access to a vein with a high blood flow rate is required. This access was achieved beforehand by surgically connecting a vein (bottom vessel in the images, where blood flows from right to left) to an artery (top vessel, where blood flows from left to right). The purpose of the numerical simulation is to understand the impact that the stenosis which has appeared on the patient’s artery has on blood velocity, pressure and flow rate, and to test the conditions necessary for the successful treatment of the stenosis by angioplasty. Numerical simulation allows different possible scenarios to be tested and helps to find the parameters that restore physiological pressure conditions. In order to be integrated into surgeons’ practice, this type of simulation must display several properties: speed, adaptability and, of course, precision.

Image processing, modeling and calculation techniques are constantly being refined so that numerical simulation can become a real help to practitioners in preparing an operation. David Perrin [PER 15, PER 16], founder of PrediSurge, a French start-up in the field, explains:

“We have developed a tool to model the insertion of an aortic prosthesis in patients at risk of an aneurysm. This prosthesis is made of a textile material, onto which are sewn ‘stents’, metal springs that deploy the prosthesis and press it against the walls of the vessel. Simulation helps to design this type of prosthesis and, in some cases, to adapt its design to the patient.”

The approach encounters two difficulties already mentioned for other applications: the solution proposed by the simulation is effective when the mechanical characteristics of the tissues and of the materials constituting the prosthesis are well known, and when the modeling can be adapted at low cost to the patient’s morphology.

“We worked on algorithms generating the geometry of the model from the imaging data produced in the preoperative phases. The calculation model then created with a finite element method has several thousand degrees of freedom. It is suitable for a ‘static’ calculation to visualize the shape of the prosthesis at the end of the operation. The task is completed in just under a day!”

The calculations are carried out using a commercial tool widely used in industry and allow practitioners to anticipate possible complications in the course of the operation. Three-dimensional imaging data are used to develop a computational model, on which a finite element mesh is based (Figure 6.20).


Figure 6.20. Simulation of stent placement in angioplasty: personalized arterial geometry, CAD model and finite elements and aortic prosthesis placement [PER 15]. For a color version of this figure, see www.iste.co.uk/sigrist/simulation2.zip

The resulting simulation reproduces the main stages of the operation: insertion of the prosthesis into the artery, adaptation to the patient’s geometry, and prediction of the mechanical condition and shape of the treated vessels.

Researchers and engineers are thus working to develop numerical models of the human body based on algorithms developed for industrial problems (fluid flow, material resistance) and, in return, industrial simulations also benefit from biomechanical research results, as in the case of “fluid–structure interactions” (Boxes 2.2 and 6.1).

6.5. Cosmetics

The Birth of Venus is one of the most famous paintings by the Italian painter Sandro Botticelli (1445–1510). Venus emerges gracefully, gathered in the conch of a giant shell, rocked by rough seas. Zephyr’s breath carries the goddess’ long blond hair in its wake. Some verses by Charles Baudelaire, taken from a poem from Les fleurs du mal, remind us of this illustrious painting:

“Oh curls! Oh, fragrance full of nonchalance!

Ecstasy! To populate the dark alcove tonight

Memories sleeping in this hair,

I want to shake it in the air like a handkerchief! […]

Blue hair, a flag of strained darkness,

You give me back the blue of the immense and round sky;

On the edges of your twisted strands, covered in down,

I get drunk on the mingled scents”

(La chevelure, [BAU 57], translation from French)

Nowadays, beauty products offer to restore the natural reflections of women’s hair, to play with various colors in order to imitate the grace of Venus or to arouse the poet’s delight. Developing cosmetic products that give hair a subtle and long-lasting color also requires mathematical techniques. These are based on data that complement equations expressing certain chemical and optical phenomena. Eva Bessac, a researcher at L’Oréal’s R&D center, explains:

“Modern hair tints are obtained with the help of dyes. Trapped inside the hair, they absorb some of the light to which they are exposed and restore the desired shade. This mechanism depends, among other things, on the composition of the applied product. Equations describe the phenomena of light absorption and reflection. They involve many parameters that must be characterized, by means of colorimetry tests, carried out on a range of different hair.”

A large part of the cosmetics industry’s know-how lies in the use of the best basic components and their mixtures, in proportions that allow a desired color to be obtained. From about 15 dyes, it is possible to create an almost infinite number of colors. Given a target color, what mixture should be made? This question cannot be answered by systematic testing, even for experts with long experience in this field: it would take too long! Statistical methods make it possible:

“In order to develop new products or to offer a ‘personalized’ shade, we develop mathematical models to define an optimal mix or to invent new ones with the expected qualities. The approach is based on equations that reflect how the product applied to the hair returns light. These equations are supplemented by experimental data specifying the model parameters. The data come from mixtures of known dyes, produced in different concentrations and applied to hair samples. They are obtained according to a reproducible protocol, because their reliability determines the accuracy of the calculation.”
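The mathematical core of such an approach can be sketched as follows: if each dye contributes to the measured absorbance in proportion to its concentration (a Beer-Lambert-like additivity assumption), then finding a formula for a target shade reduces to a least-squares problem. The spectra below are random stand-ins, not measured data:

```python
import numpy as np

# Hypothetical data: absorbance spectra of 15 dyes at unit concentration,
# sampled at 31 wavelengths (random stand-ins for measured spectra).
rng = np.random.default_rng(0)
n_wavelengths, n_dyes = 31, 15
spectra = rng.uniform(0.0, 1.0, (n_wavelengths, n_dyes))

# A reachable target shade, built here from known concentrations:
true_conc = rng.uniform(0.0, 0.5, n_dyes)
target = spectra @ true_conc

# Least-squares recovery of the formula; a real tool would also impose
# non-negativity on the concentrations (e.g. scipy.optimize.nnls).
conc = np.linalg.lstsq(spectra, target, rcond=None)[0]
print(np.allclose(conc, true_conc))  # True: the mixture is recovered
```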

With a model capable of characterizing all the achievable colors, it is possible to predict which one will be obtained with a given formula! In order to evaluate the effect produced by a shade, experts use established standards. A color may be represented by three quantities: its lightness and the amounts of red and yellow it contains. The mathematical model offers to perform this rating in addition to the sensory evaluations conducted by experts:

“We can now correlate sensory evaluations with the rating criteria obtained from mathematical models with great reliability. Modeling allows hair coloring experts to save time, raw materials and unnecessary experimentation. It is a creative tool and does not replace human sensitivity!”

The subjective judgment as to the aesthetics of a coloring, whose rendering is a subtle mixture between a color and its reflection, remains essential. In the digital age, Venus still keeps all her secrets.

6.6. Neurosciences

Ada Lovelace, a contemporary of Edgar Allan Poe, turned her own mind into a laboratory: “I hope one day to succeed in equating brain phenomena” (cited in [CHE 17]). She thought mechanically and mathematically about the brain, which in her time was avant-garde: she anticipated certain concepts that are now being developed by neuroscience. Understanding how the brain works, and how it contributes to intelligence, is one of the scientific challenges of our time. While Edgar Allan Poe thought it unlikely in the middle of the 19th Century, 21st Century technologies promise it. Researchers can nowadays measure and visualize brain activity. They have different means at their disposal that are used in a complementary way, offering to know the structure of the brain, to understand its activity in different regions and to monitor its evolution over time. This information helps to understand many brain functions [NAC 18].

Complementing these visualizations, some scientists are currently trying to model the brain and simulate its activity. Thus, just as it has become a tool for other sciences that use it ever more systematically, numerical simulation now accompanies some brain research. Researchers in neuroscience are developing models that are very similar in principle to those used in the physical sciences. They allow virtual experiments that contribute to understanding the organization of the brain or aim to predict some of its evolution. We have seen that this task remains very difficult (and still beyond the current reach of scientists) for complex physical phenomena, such as turbulence. Though it seems even more daring for the brain, it is nevertheless becoming a reality.

To date, major international projects have set out to provide numerical calculations to simulate, using data and physiological models of brain cells and their connections, the functioning of certain parts of the brain. They are the subject of significant investments in the United States, China, Japan or South Korea, countries with the world’s largest computational resources, and Europe. The European Human Brain Project (HBP2) is one of them. It aims to provide a digital model of the brain, calling for collaboration between researchers from different scientific disciplines (including numerical simulation). It also aims to contribute to the emergence of a computer infrastructure available to researchers in brain sciences and other sciences to conduct part of their research. These are based in particular on techniques for data collection and storage, which contribute to the development of theoretical and numerical models [AMU 16].

Markus Diesmann, a neuroscience researcher and contributor to the HBP project, explains the challenges of simulation:

“The brain has a very complex functioning that we are only beginning to understand. It is organized at different levels. The molecules of the matter of which it is made up; the nerve cells, divided into different layers (with a density and variety of cells specific to each species), themselves organized by the proximity allowed by their multiple connections; more global regions, linked to each other. What does this complex, ‘multiscale’ and ‘non-linear’ organization mean? How is it related to brain activity? Simulation in the general sense seeks to know which level of modeling is most appropriate to understand brain function, and aims to reproduce its activity.”

For neuroscientists, digital reconstructions and simulations provide new tools for their work3. Scientists use different tools: developed mostly in a collaborative and international context, they are adapted to the description of the brain4. At the level of neurons and their connections, the models represent the electrochemical reactions occurring in these cells, as well as the transmission of information between them. Modeling uses mathematical equations, supplemented by data (such as the type of cells represented in the different neural layers, their density in the areas studied, etc.). Researchers refer to simulation based on both models and data – in this respect, the methodology used is identical to that used by scientists studying, for example, weather or climate (Chapter 4).
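To give an idea of the building blocks of such models, one of the simplest point-neuron models is the leaky integrate-and-fire neuron, far cruder than the detailed cell models used in these projects but illustrating the principle of equation-plus-data simulation. Parameter values are common textbook figures:

```python
import numpy as np

def simulate_lif(current, dt=1.0e-4, tau=0.02, R=1.0e7,
                 v_rest=-0.07, v_threshold=-0.05, v_reset=-0.07):
    """Leaky integrate-and-fire: tau*dV/dt = -(V - v_rest) + R*I(t).

    Returns the membrane potential trace (V) and spike times (s).
    Textbook-style parameters: 20 ms time constant, 10 MOhm resistance.
    """
    v = np.full(len(current), v_rest)
    spike_times = []
    for i in range(1, len(current)):
        v[i] = v[i - 1] + dt / tau * (-(v[i - 1] - v_rest) + R * current[i - 1])
        if v[i] >= v_threshold:          # threshold crossed: spike and reset
            spike_times.append(i * dt)
            v[i] = v_reset
    return v, spike_times

v, spikes = simulate_lif(np.full(5000, 2.5e-9))  # 0.5 s of constant 2.5 nA input
print(f"{len(spikes)} spikes")
```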

An example of a detailed simulation of a mouse brain area is provided by Markram et al. [MAK 15]. Like the astrophysical models that only take into account a “part of the Universe” in order to remain tractable (Chapter 3), this simulation covers a limited brain region, sufficient to perform calculations from which information can be extracted. The model is built from a database representing the biological variability observed in the brains of rodents. This describes the anatomical organization of neurons in the brain, their statistical distribution in different areas and their connections. These are rendered by an algorithm that randomly draws neuron positions and morphologies in the studied region. The model represents about 30,000 neurons and nearly 10 million connections between neurons, through axons and dendrites (a total length of 350 m for the former and 215 m for the latter). These connections are expressed by means of an algorithm representing biological rules, and the data are supplemented by mathematical equations representing the main electrical and physiological phenomena at work in nerve cells.

The brain model is thus described as general: it is not designed to represent a particular type of activity, but rather to perform digital experiments. According to its authors, the study validates the ability of a simulation to correctly reproduce certain functions observed in vivo in an animal. It also makes it possible to produce data that are currently inaccessible to experiment:

“This study demonstrates that it is possible, in principle, to reconstruct an integrated view of the structure and function of neocortical microcircuitry, using sparse, complementary datasets to predict biological parameters that have not been measured experimentally” [MAK 15].

The current limitations of brain simulations are numerous and discussed by the scientific community – to go into detail would take us far beyond the scope of this book. Through scientific and ethical debate, researchers establish the rationale, limitations and contributions of digital simulation for their practice [DUD 14, ROS 14]. Markus Diesmann enlightens us on this subject:

“The scientific community is developing a wide range of opinions on brain simulations. For some researchers, its use is premature: the knowledge available to scientists today is too limited for calculations to produce exploitable and interpretable results. The structure of the brain is extremely complex and simulations are based on data that do not represent all the scales necessary to reproduce a complete brain. However, the idea of simulating the brain is nowadays relevant to many other researchers. This technique is used pragmatically, becoming a tool accessible to scientists and contributing to their research.”

The American author Siri Hustvedt turned to neuroscience out of philosophical and intellectual interest [HUS 18a]. She notes a limitation that other researchers in this field point out with her [DUD 14]: “It is impossible to isolate the brain from what is happening outside it (what we call our environment and our experiences), as well as from other parts of the body that interact with it…” [HUS 18b]. A complete simulation of the brain should therefore account for these bodily and environmental interactions, as well as for the variability of living organisms. A task that has so far proved impossible to achieve by means of modeling. Markus Diesmann reminds us in concrete terms that the computing resources called for by complex simulations remain very significant:

“Reproducing one second of operating time from a brain area required, for example, nearly 15 minutes of computing time on a supercomputer. The human brain contains just over 85 billion neurons, each with an average of about 10 thousand synapses. It evolves on very different time scales. The plasticity of the brain, its ability to reorganize itself, involves phenomena occurring on extended time scales: from a few minutes to a few months… even years! Fully simulating the dynamics of the human brain is currently beyond the reach of the models and computing machines available to scientists – and therefore remains a very ambitious goal!”
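The orders of magnitude quoted here can be made concrete with naive arithmetic (the extrapolations are purely illustrative):

```python
# Figures quoted above, combined naively:
neurons = 85e9                   # neurons in a human brain
synapses = neurons * 1.0e4       # ~10,000 synapses per neuron
slowdown = 15 * 60               # 900 s of computing per simulated second

print(f"connections to represent: ~{synapses:.1e}")                      # ~8.5e+14
print(f"one simulated hour: ~{slowdown * 3600 / 86400:.0f} days of computing")
```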

Brain sciences nowadays benefit from digital visualization techniques, which will be reinforced by simulation techniques. In both cases, it is worth noting the role played by supercomputers, allowing the processing of large quantities of data or the performance of calculations. Simulation provides new ways for researchers to understand in depth the complexity of the brain, its structure, connections (Figure 6.23) and activity.


Figure 6.23. Simulation of the electrical activity of a portion of a virtual brain consisting of seven reconstructions of neocortical microcircuits (source: © The Blue Brain Project/www.bluebrain.epfl.ch). For a color version of this figure, see www.iste.co.uk/sigrist/simulation2.zip

Simulation is not yet able to offer the possibilities of visualization – will it ever be? Despite its current limitations, it will most certainly contribute to the evolution of the way some brain science research and other disciplines are conducted:

“The road to simulation of the human brain, or even only part of its cognitive functions, is long and uncertain, […] but on the way, much will be learned. […] New methodologies and techniques are also expected that will benefit neuroscience at large and probably other scientific disciplines as well” [DUD 14].

Artificial intelligence might be one of the scientific fields that can benefit from this research. Presented as one of the flagship techniques of the 21st Century, it fascinates, intrigues and raises many questions, as we mentioned in the first volume of this book. For some researchers, the future of this technique lies in its convergence with other disciplines. Thus Geoffrey Hinton thinks: “Overcoming AI’s limitation involves building a bridge between computer science and biology…” (remarks reported by [SOM 17]). An example of this possible connection is that of neuro-morphic chips, whose properties Timothée Lévi, a researcher in this field [LEV 18], explains:

“These chips approach the energy performance of biological brains, which surpass machines in terms of the ratio between consumption and execution speed. They find applications in robotics and artificial intelligence, offering computational and learning capabilities with greater efficiency and performance than current processors…”

Inspired by life, computing systems reproduce the propagation of nerve impulses between neurons:

“Some ‘neuro-morphic’ systems are based on deep learning techniques and are designed to develop their connections automatically: these are reinforced when they are solicited. This so-called ‘unsupervised’ learning is close to ‘neural plasticity’. The artificial neural networks of which these systems are composed only perform calculations when they are used. They operate efficiently with few computing and energy resources.”
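The reinforcement idea can be sketched with a toy Hebbian update (“connections that are used are strengthened”); this illustrates the principle only, not the design of any particular chip:

```python
import numpy as np

def hebbian_step(weights, pre, post, eta=0.01):
    """Strengthen each connection in proportion to the joint activity of
    its input (pre) and output (post) neurons - a toy plasticity rule."""
    return weights + eta * np.outer(post, pre)

weights = np.zeros((3, 4))                   # 4 inputs -> 3 outputs
pre = np.array([1.0, 0.0, 1.0, 0.0])         # active inputs
post = np.array([0.0, 1.0, 1.0])             # active outputs
weights = hebbian_step(weights, pre, post)   # co-active pairs reinforced
```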

Nowadays, several research laboratories and computer companies are announcing the industrial development of such devices, chips and computers [LEO 18, OBE 18]. The structure of these chips, intended for the computers of the future, mimics that of the human brain. Will computers be closer to our brains tomorrow?


Figure 6.24. Galway neuro-morphic processor (source: Timothée Lévi, Université de Bordeaux). For a color version of this figure, see www.iste.co.uk/sigrist/simulation2.zip

  1. This may be naively translated in the following terms:

    SECOND PHYSICIAN – What are the appropriate remedies for this disease called dropsy (water retention)?

    CANDIDATE – Perform an enema, then a bleed and a purge.

    THIRD PHYSICIAN – What remedies do you think are appropriate for asthma and pneumonia?

    CANDIDATE – Perform an enema, then a bleed and a purge.

    FOURTH PHYSICIAN – And for respiratory failure, would you tell me, aspiring doctor, what you would recommend?

    CANDIDATE – Perform an enema, then a bleed and a purge.

    FIFTH PHYSICIAN – And if the disease persists, if the patient does not heal, what should be done?

    CANDIDATE – Perform an enema, then a bleed and a purge; then bleed again, purge again and give the enema again…

    CHORUS – Good, good, good! He is worthy to join our learned corporation!

  2. Available at: http://www.humanbrainproject.eu.
  3. Let us add that some scientists even think that numerical simulation, like visualization, may help us to understand the origin of certain brain diseases and prevent their development.
  4. In the toolbox available to scientists, we find in particular models of nerve impulse dynamics and a programming language appropriate to their description. The models, on the other hand, are implemented in different open-source programming codes [GEW 07, PLO 16].