Yves Lasne's conference
CEIA* Novotel Bercy 7 December, 1996
* Centre Européen d'Informatique et d'Automation (European Centre for Computing and Automation)

As a foreword, introduction and general preamble, we can say that the "Théorie des Hautes Dilutions, Aspects Expérimentaux" (THD) ("Theory of High Dilutions, Experimental Aspects") is a truly multidisciplinary work, as many scientific critics have already told us. It is not simply a juxtaposition of scientific fields: mathematics, physics, statistics, biology... This is true to a certain extent, but it is also a coherent interleaving of all these disciplines. It has been hailed as possibly the first multidisciplinary work. This book and the work behind it were produced by a youthful group (about 70 working years between them) and involved considerable personal work performed outside working hours and outside the context of subsidies. This last point deserves to be mentioned because, as Rolland Conte would say, in return it gives us great freedom.

Today, I work above all in the biological and medical field. If you are working in the medical field, particularly with the application of the THD known as Homoeopathy, this book provides protection, an umbrella, a pyramidion which, between science and non-science, lays the foundations of this field, until now taken to be magic. The book first treats Homoeopathy in a form we have called the 3 Ps. What are the 3 Ps? - first, the Product, which we discuss in chapters 2, 3 and 4; - next, the Patient, in chapters 5 and 6; - finally, the Practitioner. The practitioner does not come into this book. He appears, so to speak, in the second part of the THD, the second book, which is already almost complete (Theory of High Dilutions: Application to Life).

Why did these originally independent authors tackle the field of High Dilutions? It can be summarized by this phrase of Hahnemann's, which was not necessarily our own. We were interested in this problem because it posed a simple question, very simple for a researcher: Why does it work? How does it work?
Hahnemann had already said it: what actually happens must at least be possible. This is what all four of us had felt for many years, each in his own corner. It was the combination of research performed by each of us using our own resources which convinced us to produce the synthesis presented in this book.


What is the idea of introducing Solovay's axiom?

The left-hand side of this figure shows the axiom of choice which, as you know, and as its name implies, at unit level, is YES or NO.
You belong or you don't belong.
This axiom is based on the excluded middle:
it's YES or NO. In a way, you have no choice in the matter.

What does this mean from the point of view of nature, since that is our approach today?
If you take the growth of a tree, from this point of view there is no branching, i.e. the trunk has subdivisions which are real, whatever you do (everyone has seen a tree). From the point of view of axiomatic modelling, a lot is missing.
We can see it on the slide, which is much prettier:
the tree of the axiom of choice, or excluded middle, is constructed in this way.
It is a species which is fairly rare, although the most widespread.
The alternative, now offered by Solovay's axiom, is to move closer to a description of nature.
How? In some situations we have shown the influence of the tree on its environment, you see that the colours are inside the trunk.
This is the conventional biological approach:
you analyze the inside but you don't want to know what has happened, what caused the phenomenon.
Here you bathe in the environment in the broadest sense of the term and find a form which emerges.
From A will come B and C as before, naturally, because it's the same tree, but before you didn't go through it:
You cut, it was discrete. That is chemistry.
You combine materials and try to see how it works.
It works very badly as you know.
In the theory of Ethers and Solovay's axiom, when the conditions "environment + tree" are brought together at A, the alternative is not only left or right, not only YES or NO, but also YES and NO, i.e. a situation of undecidability.
Since it is undecidable, the two choices cancel each other out. This is a fundamental point, which later allowed the processes to be modelled.

We shall look at Henri Berliocchi's tree. You will agree that it looks much more like what we are used to seeing.
This undecidability which you see illustrated at each point, corresponds to the natural reality you observe, what we call branches or bifurcations.
It is clear that we can influence this undecidable branching using external resources: it is the individual who causes this branching.
But we can also arrange for this branching zone to be decidable: by pruning, or by fitting a sleeve to help it grow, if we keep our example of a tree. This is one of the main fundamental constructions.
The other important point we shall see illustrated is an overview, if we can call it that, of the idea of space-time.
For most people, space-time is really considered in this way, i.e. through three spatial dimensions, with time's arrow determining a cone of the future, and an accompanying wave for every piece of matter. It is postulated that this space-time is homogeneous.
Figure 1 below

The alternance, or alternative, or perhaps both, as we shall see, is to replace this smooth, homogeneous space-time by an inhomogeneous space-time. This is our point, from a symbolic point of view. Figure 2 below
Inside the cone of the future of this space-time, there is no more matter; there is only the wave which accompanied this matter. Outside the cone of the future, there is the matter and its wave, as before.

Figure 1 Figure 2

What is Contonian Statistics?
It is a deterministic statistics, unlike many of the types of statistics you know, which transforms the appearance left by a dynamic system in its measuring medium into a frequency called contonian.
(See "Théorie des Ethers". Henri Berliocchi. Editions Economica, Paris, 1994)
Let's break it down into stages.
In practice, contonian statistics can be simply broken down into two stages:
This is what we call a Lagrangian calculation.
Here you have curves observed experimentally, which were previously called semi-chaotic or chaotic.
The Lagrangian method consists of calculating the areas under the curve using the trapezoid method, following the colourful process presented below. The signal is plotted against the levels of dilution-dynamization (the CH); the integral, i.e. the sum of the trapezoidal areas under the curve, is the cumulative sum shown here.
As predicted by the theory, when the process is contonian the Lagrangian calculation is linear.
It belongs to a family of linear graphs.
It is linear to an extremely high degree in the classic statistical sense, because the coefficient of determination R² is sometimes 1, sometimes 0.9999998.
This linearity is almost strict when the process is contonian.
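The two stages just described, cumulative trapezoidal areas followed by a check of linearity via R², can be sketched numerically. Everything below is illustrative: the function names and the synthetic signal are ours, not the authors', and the synthetic curve simply stands in for an experimental measurement.

```python
import numpy as np

def lagrangian(ch_levels, signal):
    """Cumulative trapezoidal areas under the measured curve,
    one value per dilution level (a sketch of the 'Lagrangian' stage)."""
    widths = np.diff(ch_levels)
    heights = (signal[:-1] + signal[1:]) / 2.0   # mean of consecutive points
    return np.concatenate([[0.0], np.cumsum(widths * heights)])

def linearity_r2(x, y):
    """Coefficient of determination R^2 of the best straight-line fit."""
    slope, intercept = np.polyfit(x, y, 1)
    ss_res = np.sum((y - (slope * x + intercept)) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Synthetic example: a noisy but roughly constant signal gives a
# near-linear cumulative area, hence R^2 very close to 1.
ch = np.arange(1, 31, dtype=float)        # hypothetical CH levels 1..30
sig = 5.0 + 0.3 * np.sin(ch)              # hypothetical measurements
L = lagrangian(ch, sig)
print(round(linearity_r2(ch, L), 4))
```

A break in the fitted slope at some CH level would then flag that abscissa, in the spirit of the quality control described later.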


We know this; we won't repeat it 100 times. "OK guys, it's time, we can go." This is how children perceive contonian statistics, which, we must agree, appears extremely simple in practical use.
Although developing it was extremely complicated and sophisticated for us, its practical application is almost easier than what you are used to doing, i.e. arithmetic mean, variance and standard deviation.

Application of this type of statistics

This possibility of moving out of a chaotic world, in which it is difficult to know where you are (you may see that one curve is higher than the other, but what does that mean?), starts to make itself felt very strictly, absolutely mathematically and elegantly, and you begin to see what this type of statistics will allow you to do.

Apart from this aspect of the Lagrangian method, its calculation has a practical consequence for us because it lets us check that a manipulation has worked: if there is a break in the slope of the relationship between the Lagrangian and the CH, it is because something happened at that abscissa.
This is generally a problem with dynamization: a change in frequency or energy which results in the steps no longer being regular in the energetic sense of the term. This regularity, it has to be said, underlies the concept of CH, which corresponds both to a dilution factor and to a dynamization energy reproduced in the same way every time between the different CH.
This point is fundamental to obtaining this type of phenomenon, but we can go further.

Contonian statistics is used to calculate a value H, called the contonian frequency of the dynamic process in the form of its appearance as we see it.
We do not have access to the basic dynamic system itself, but we can access it through its appearances which will be reflected in each measuring system.

This contonian frequency is literally very simple because, in its formulation, it equals 2π divided by the Lagrangian at the maximum value of X. It is also a frequency in the dimensional sense because, from a physical point of view, it is the inverse of Planck's constant which, you will remember, is expressed in joule-seconds. The inverse of joule-seconds gives us an equation of dimensions which brings us to something per second, and hence, by definition, to a frequency in the physical sense of the term.
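Taken at face value, the formula stated here (2π divided by the Lagrangian at the maximum X) is a one-line computation. The sketch below is ours: the flat signal is hypothetical, and the exact normalisation the authors use may differ.

```python
import math
import numpy as np

def lagrangian_total(ch_levels, signal):
    """Total area under the curve (the Lagrangian at the maximum X),
    computed by the trapezoid rule."""
    return float(np.sum((signal[:-1] + signal[1:]) / 2.0 * np.diff(ch_levels)))

def contonian_frequency(ch_levels, signal):
    """H = 2*pi / Lagrangian(X_max), as stated in the talk.
    Treat this as an illustrative sketch, not the authors' exact code."""
    return 2.0 * math.pi / lagrangian_total(ch_levels, signal)

# Hypothetical flat signal of height 5 over CH levels 1..30:
# total area = 5 * 29 = 145, so H = 2*pi / 145
ch = np.arange(1, 31, dtype=float)
sig = np.full_like(ch, 5.0)
print(contonian_frequency(ch, sig))
```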

The ability to calculate this value for each manipulation opens the way to an immense range of possible combinations, comparisons of observed and measured processes. There is a frequency you can calculate, process phases and the phases can be compared for different experiments or between different substances and so on. This comes within a field which is now very well-developed and which you know well: wave manipulation or telecommunications.

So there is a complete practical arsenal, orchestrated by Gabriel Vernot, which lets you plan quality controls on the basis of all these frequencies and phases of remanent waves. With two appearances which were chaotic to the eye, it was difficult, until now, to say whether the results obtained for a given substance were reproducible or not. Now it can be done: we can make a strict comparison of two semi-chaotic aspects of the same substance.

What happens in homeopathic solutions?
There are 1/10, 1/100, 1/1000... etc. dilutions. The volume of "solvent + solute" remaining constant, the matter initially in the solvent partly disappears. When this matter disappears, there is what is called a branching in space-time, leading to what we have called white holes, at the quantum level.
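The arithmetic behind this disappearance of matter is standard: each centesimal (CH) step dilutes the solute by a factor of 100, so the expected number of solute molecules falls as 100^n. The starting quantity below is an assumption of ours (roughly 1 mL of a 1 mol/L mother tincture), chosen only to show where the count drops below one molecule.

```python
# Expected number of solute molecules after n centesimal (CH) dilution steps.
# Assumed starting point: ~1 mL of a 1 mol/L solution, i.e. about 6.022e20
# molecules (Avogadro's number is 6.022e23 per mole).

def molecules_after_ch(n, initial_molecules=6.022e20):
    """Each CH step dilutes the solute by a factor of 100."""
    return initial_molecules / 100.0 ** n

for n in (4, 8, 10, 12, 15):
    print(n, molecules_after_ch(n))
# Beyond roughly 10-12CH, fewer than one molecule of the original
# substance is expected to remain in the sample.
```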

What are these white holes from a conceptual point of view?
A white hole is something immaterial, mathematical. These are fields, in the physical sense of the term. These fields will express the properties of the matter which has disappeared. The mechanism responsible is as follows: a complete succession or cascade of quantum phenomena takes place: dilution, disappearance of matter, appearance or generation of a physical field, white holes. A neutron of the solution, following its line of universe in Einstein's sense, will come into contact with the physical field emanating from the diluted substance.

Henri Berliocchi has demonstrated that, for the particle, this situation is undecidable. It does not know whether it will be destroyed in the field or continue along its line of universe. This moment of hesitation causes it to dematerialize, this dematerialization being linked to its dilution, and leaves quantum singularities and physical fields in the solvent.

Then there are succussions, a second operation with which you are familiar in homoeopathy, which multiply the situations of undecidability.
You can see the breakdown, at the physico-mathematical level, of two phenomena which, I repeat, are expressed and formalized at the most fundamental level.
The consequence of these two steps, dilution-dynamization or dilution-succussion, is thus a complete change in the solvent in which the matter has disappeared; but this does not take place just anyhow.

How can we remove the undecidability?
A first practical remark at this theoretical level: undecidability can indeed be removed.
- By other physical fields: exposing the dilution to solar radiation, or heat above 120°C. This may seem odd to you because water boils at 100°C, but it is at 120°C that the phenomenon is annealed.
- By shaking too hard during the succussion phase, which can also change the undecidability zone and make it decidable, deterministic, i.e. you will return to the solvent itself.

The quantum possibility is thus described, but, and we always come back to practice, in a narrow energy band.

We can produce these white holes and their consequences but we can also destroy them, luckily for us in a way.
You know that matter disappears in all permanent environments: in rivers for example, it is clear that solar radiation always sets the clocks back to zero.
We also know, by chemical means, that the sun sterilizes water.
In fact, the fields provided by solar radiation compete with and extinguish the physical fields induced by dilution.

What are the consequences on the medium in question?

These consequences exist in all media but here, we have focussed on the problem of homeopathic dilutions.
We shall consider an aqueous or alcoholic medium, such as you use.
Dematerialization is illustrated by the example of the neutron we used just now, which, presented in this way, is not a neutron in the classic sense of the term you know.
This neutron will disintegrate according to a model comparable to the one you know well for classic particles.
You all remember that a neutron can give a proton + an electron + an antineutrino, balanced in the way you know.

By analogy, and to get the message across, we write this first equation for an aqueous medium which, as a result of this creation, this dilution and succussion, is invaded by entities which are not really protons but which can become so, called Hyperprotons (or virtual protons), + virtual electrons + what we call an antigraviton (not an antineutrino), which translates and illustrates the influence of gravity on matter-energy exchange phenomena.
The possible reactions propagating these phenomena have led to several ways of explaining the materialization of these Hyperprotons in an aqueous or biological medium.

The classic reaction: an association of 2 Hyperprotons gives rise to a proton or a hydrogen atom, with emission of negative Beta rays, which are accelerated electrons.
When you are close to the atomic mass unit, dM close to 1, deuterium (²H) or tritium (³H) may form in the solvent.
These equations have been verified experimentally.
It has been detected and shown that, over a period of time, certain infinitely small quantities of tritium, nevertheless measurable using current instruments, are produced by homeopathic solutions.
We shall now talk about extensions of this phenomenon which lead to the aspect I wish to move on to: start, propagation, undulation, remanent wave.
Of course we can always go back to the classic interactions you know in biology or chemistry.
Do you remember how, in high school or medical school, in all biochemical or chemical reactions, you always had to balance the protons? Sometimes you got 5/11, 3/2, 4/3 protons to balance both sides of the reaction.
This means, if you think about it, that protons were omnipresent in your biochemistry and chemistry lectures, although they sometimes played an insignificant part in pH variations.

We have taken the opposite approach and made it the basic regulatory phenomenon.
We shall say this: dilution, white holes, physical fields, virtual hyperparticles. The end point of all these mechanisms can be illustrated by the last equation, which indicates that all the matter expressed by these physical fields and white holes produces this remanent wave, which must be analyzable through Beta minus type emissions.
As you have seen, Beta minus emissions are produced by all the exchanges in the solvent with the specific physical field of the vanished material, and they correspond to a multitude of spectra which we were able to detect.
The sum of these spectra, which vary with gravitational force as has been shown, is an image of the initial material which has disappeared.
In practice, at least two methods will allow us, and already have allowed us, to measure the consequences of this theory experimentally, these methods involving the change of protons in aqueous or alcoholic solutions:

- NMR, Nuclear Magnetic Relaxation, for which I shall simply remind you of the principle. Some odd-spin nuclei, notably the proton ¹H, are magnetically sensitive, which means that if you put them in a homogeneous magnetic field and irradiate them with radio-frequency waves, the nuclei, including the proton, will move into a certain orientation. You switch off the radio-frequency beam and measure the time the nuclei take to return to equilibrium, which is what we call the Relaxation Time; it means exactly what it says. The proton is there, it is sensitive to radio-frequency waves and the magnetic field; it reorientates, you stop, it returns to its previous position, and you measure what is called the FID (Free Induction Decay), linked to the excitation coil, which was emitting and now becomes a receiver of magnetic signals. This relaxation time therefore reflects the dynamics of relations between water protons.
- radioactivity counters, known as Beta counters. If there really is an emission of Beta minus radiation, we should be able to measure it.
Indeed it can be measured, in homeopathic solutions or individually, using specific conventional technology in the form of so-called Beta radiation counters.
These two points therefore correspond to what can be done.
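The relaxation-time measurement just outlined amounts to fitting a decay constant to the FID envelope. Here is a textbook-style sketch of that step; the mono-exponential model, the function name and the synthetic T2 value of 0.8 s are our assumptions, not the protocol actually used in the experiments described.

```python
import numpy as np

def estimate_t2(t, envelope):
    """Estimate T2 from a mono-exponential FID envelope
    A(t) = A0 * exp(-t / T2), via a log-linear least-squares fit."""
    slope, _intercept = np.polyfit(t, np.log(envelope), 1)
    return -1.0 / slope

# Synthetic FID envelope with a hypothetical T2 of 0.8 s
t = np.linspace(0.0, 2.0, 200)
fid = 3.0 * np.exp(-t / 0.8)
print(round(estimate_t2(t, fid), 3))   # → 0.8
```

Real FID data are noisy and often multi-exponential, so a nonlinear fit would normally be preferred; the log-linear version keeps the idea visible in a few lines.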
Hahnemann's Idea
Hahnemann had already described, in his way, what we have just mentioned. He had broken down the phenomenon, and we measured it.
After the formal description, let's move on to the practical aspects.
Using different homeopathic solutions (8CH, 18CH, 14CH, 29CH), we broke the process down into stages. First a dilution, which you see here, a 1/100 dilution, for which we measured the relaxation time called spin-spin, or T2, using NMR methods. Then we measured this relaxation time T2 again after dilution, without shaking, both dilution and handling being done as carefully as possible.
Concerning solutions below Avogadro's number, we are sure they were "home-made", because we knew exactly how many molecules were initially put into the solution.
What do you notice about these measurements?

You see that in the solutions where there is still matter, the phenomenon is expressed automatically. Why? If you followed the beginning of the discussion: the matter has been dematerialized again, i.e. we have again induced the formation of white holes, because there was still matter in the previous solution. This takes place at the primary level, by which I mean in the physical fields of these white holes.
For dilutions above Avogadro's number, no more white holes are created on dilution. White holes are transferred from the previous solution, but on the other hand there is still dilution of the virtual particles we called Hyperprotons.
Nevertheless, you can see that, without shaking, there are extremely important differences between an 8CH dilution and a simple dilution of 15, 19 or 30CH.

What does succussion cause in these cases?

We demonstrated earlier that succussion multiplies undecidability and dematerialization, and multiplies communication of the remanent wave in the solvent.
Indeed, in the 1st case, you multiply, you amplify the T2 signal.

Up to Avogadro's number, you continue increasing the propagation of the remanent wave in the medium in a quite amazing way. The same thing occurs, less intensely, with the other solutions.
You can see that, physically, in accordance with the theory, you can break down the preparation of a homeopathic solution into two apparently separate stages, which are actually complementary:
- the "fundamental" process of dilution which will generate physical fields and create white holes
- the improvement side, which is succussion which will increase the power of the remanent wave.
You have now understood almost everything.

Here are white holes as seen by children:
They are not only different colours but also different weights.
As you have understood, and we have deliberately refrained from saying it earlier, white holes do not contain matter. They are the inverse of the black holes of astrophysics, which are hyperdense. This is what the children wanted to illustrate.


We have seen the theory and envisaged the technological way of measuring the consequences of this phenomenon.
Here is a real example. You have already seen the breakdown of dilution-succussion. Here is a transverse comparison between methods of dilution-dynamization. This sequence is split into 3, the solutions being prepared from the same original flask and the dilutions made using the same batch of water.
In this figure you have:
- in purple, a simple vortex dilution;
- in red, the classic, so-called Hahnemannian solution process, classified until now, as we have said, as frankly chaotic;
- in blue-green, still the same solution, but "raised", as homeopaths say, using the Korsakovian method.
It is immediately clear that these look really different. Until recently, you could simply see it by looking, but there was no way of exploiting this chaotic or semi-chaotic type of graph: a deterministic chaos, because on preparing the same solution several times we found the same peaks in the same places, and a probabilistic chaos, because one solution did not seem to give the same peaks as another.
We still didn't have a real tool for learning about the phenomenon.
The solution is now prepared; we have spoken about it through diagrams. On the basis of the quantum theory of relativistic fields, the model predicts the influence of gravitational force.

Gravity, as Einstein said, is a curvature of space-time. If this is the case, all these equations and terminal measurements made on appearances have a reality which we can use. For man, who produces white holes (we shall see how to deduce it by analogy), there must be a relationship between gravitational variations, which we know how to calculate, and pathologies.
It is true that remanent waves can influence living matter, as we saw just now. This is what is revealed by the variations in the frequency of cerebrovascular lesions depending on the time of year. These data, which you may have seen elsewhere, are not ours (this is a reinterpretation of results published in many articles, on which we are currently working). In red, you have the Fz gravity component, the comparison being made by the cosinor method. You can see that there is a relationship between gravitational variations, which we cannot escape, neither on Earth nor elsewhere, and pathologies.
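The cosinor mentioned here is a standard method for detecting a rhythm of known period in time-series data: fit y ≈ M + A·cos(2πt/period + φ) by linear least squares on cosine and sine regressors. The sketch below uses synthetic monthly numbers of our own invention; whether this matches the exact cosinor variant used in the articles cited is an assumption.

```python
import numpy as np

def cosinor(t, y, period):
    """Single-component cosinor fit: y ≈ M + A*cos(2*pi*t/period + phi),
    solved as a linear least-squares problem."""
    omega = 2.0 * np.pi / period
    X = np.column_stack([np.ones_like(t), np.cos(omega * t), np.sin(omega * t)])
    (mesor, beta, gamma), *_ = np.linalg.lstsq(X, y, rcond=None)
    amplitude = float(np.hypot(beta, gamma))        # A
    acrophase = float(np.arctan2(-gamma, beta))     # phi
    return float(mesor), amplitude, acrophase

# Hypothetical monthly series with a yearly rhythm
months = np.arange(12, dtype=float)
series = 100.0 + 20.0 * np.cos(2.0 * np.pi * months / 12.0 - 1.0)
m, a, phi = cosinor(months, series, period=12.0)
print(round(m, 1), round(a, 1))   # → 100.0 20.0
```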


This analogy has a meaning, so that if what has been described, mathematized and formulated for in vitro solutions does occur in vivo, it should produce the same curves as those presented just now. We should have a change in relaxation time for liquids, serums, urine... and the emission of low-energy negative Beta radiation. We have shown all that directly, but here we illustrate the emission of Beta rays through some very old work in which the radiation (in inverted commas at the time) had already been detected, through an effect you may already know: the Kirlian effect.

The very old photo shown on this slide has been processed. Why? Because, as Kirlian and his pupils said, you had to try to raise the voltage higher and higher so that the electric field really blackened the plate, so that people would take notice. Nobody wanted to look at the details of what they obtained.
What we wanted to see was not a plate as black as possible. On the contrary, we wanted to see the details of the photographic print. Using normal scanning and image digitization techniques, we saw, even on this very old photo, network aspects at the level of the flesh and its projection. It was what we were looking for, the pores of the skin.


Recent colour photos showed the same phenomena. What you should notice on these three photos is the difference in the intensity of the distribution of Beta minus radiation. Why is there a difference? As we shall see later, from some of our own work, the measurements made on serum, urine or any biological liquid from an individual are proportional to what you call homeopathic types, i.e. carbonics, fluorics and phosphorics.

Here is Kirlian's diagram


This test consists of placing two electrodes (one on the tibia and the other under the foot), a film, and a protective dielectric plate, applying an electric field, and observing how the plate blackens.

There is Beta minus radiation in the body, and it can be measured. If you reverse the accelerating plates, and some authors have described this, the Kirlian effect does not take place: no emission occurs at the foot, because the Kirlian effect accelerates Beta minus radiation. If you put the electrodes back in the right direction, with the anode (positive plate) under the foot, you attract and project Beta minus radiation outside the body, and it is printed on the photographic plate.

This view, which could be considered a simple anecdote, is nevertheless fundamental. It provides an explanation of the Kirlian effect, if one was needed (I don't know). It also provides an explanation of the mechanism and validates measurements such as counts on serum, urine and other bodily liquids.


This is an autoradiograph of a thumb, i.e. a film taken without applying an electric field. If there are any Beta rays, they leave through the pores of the skin, as already detected on Kirlian's photos. We have no other hypotheses to put forward, because we measured the rays and calculated their wavelength. These rays do not leave the skin at the dermal level; on the other hand, they leave perfectly well via the pores. The colouring shows the variations in intensity, white being used by convention to show the most irradiated points. This shot shows a right thumb placed for 5 minutes on a commercial film specifically designed to register low-energy Beta radiation. This type of shot can, naturally, be produced by any of you.

Digitization of the previous photo is used to quantify the activity at each pore during the 5-minute exposure.

This other film, which was not taken by us, experimentally confirms our model. This autoradiograph was sent to me by someone who was curious enough to take a film of the brand indicated in our book, place his thumb on it, and ask me to see what was on the film. The film was blackened, and I ran a three-dimensional integration on it to compare it with the shot we took ourselves, which I showed you earlier. The shot taken by this person shows all the little peaks and, if you look closely, you can see that although it is not always very sharp, you can just about detect the pores, flesh and fingerprint of the person (circles and swirls very visible in green-yellow).

Verification.

In this presentation we are continually changing from in vivo to in vitro and vice-versa. This is deliberate. It is to give you the idea of the type of continuum which exists between what happens in homeopathic solutions and what happens in the body. By applying the theory you can change from in vivo to in vitro without any difficulty.

This slide displays the results obtained by placing impregnated and non-impregnated granules on the same type of film as was used for the autoradiograph of the thumb, the duration of this experiment being naturally much longer than 5 minutes. On developing this autoradiograph of granules, we revealed black spots on the film (conventional terminology) only at the locations of the impregnated granules. This autoradiograph was digitized to quantify the activity of the impregnated granules.

Finally, the blackened spots on the autoradiograph were integrated and the integration analyzed by erosion, using a classic algorithm from image-processing software. This type of processing reveals, reasonably accurately, the object initially placed on the film.
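Morphological erosion of this kind is indeed a classic image-processing operation: a pixel survives only if its whole neighbourhood is set. The minimal sketch below (a 3x3 square structuring element, implemented with plain array shifts) is our own illustration of the technique, not the software actually used on these autoradiographs.

```python
import numpy as np

def binary_erosion(img, iterations=1):
    """Binary erosion with a 3x3 square structuring element."""
    out = np.asarray(img, dtype=bool)
    for _ in range(iterations):
        padded = np.pad(out, 1, constant_values=False)
        h, w = out.shape
        # A pixel survives only if all 9 pixels of its 3x3 neighbourhood are set
        neighbours = [padded[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
        out = np.logical_and.reduce(neighbours)
    return out

# A filled 5x5 square inside a 7x7 image erodes to a 3x3 square (9 pixels)
img = np.zeros((7, 7), dtype=bool)
img[1:6, 1:6] = True
print(binary_erosion(img).sum())      # → 9
print(binary_erosion(img, 2).sum())   # → 1
```

Applied to a thresholded autoradiograph, repeated erosion shrinks each blackened spot towards its core, which is what makes the originally placed object stand out.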
Verification in vivo, verification in vitro: we went virtually as far as we could go. We therefore assimilate the organism's regulatory functions to the regulation of a wave resulting from all the dematerializations, giving a remanent wave of the entire organism.

You will immediately see the therapeutic approach: you manipulate waves which you produce by dematerializing specific products, and combine them in the organism so that there is interruption, phase opposition or amplification of remanent waves in that same organism.

The following shot shows the results of numerous Beta radiation counts performed on serums belonging to different homeopathic types.
- carbonic thuyas with 50,000 counts/min which is not negligible;
- fluorics which, on the other hand, are generally around 30,000 counts/min;
- phosphorics, at about 21,000 counts/min, which always had the smallest number of Beta counts in these experiments.

If this negative Beta radiation exists, and I think now that the reply can only be YES, it passes through the body. We know its wavelength, which is less than 10 nanometres and which, by chance or not, for most of its components, can interact with DNA structures, i.e. with the conformation of a DNA helix. If these waves can react with DNA, DNA can be perceived as a receiving antenna for this radiation, i.e. for remanent waves. If this is the case, we should find optical systems at the nuclear level. Indeed, beams of waves cannot arrive just like that from any part of space, through the nuclear walls, to irradiate DNA. In view of the frequencies concerned, this would inevitably cause the antenna, i.e. the DNA helix, to explode.
We therefore searched the literature to see whether there was any kind of optical regulation in nuclei. I don't know if I'm telling you something new but, personally, during my medical studies and even after, I looked at a great many slides under the electron microscope and never noticed that the nucleus is a veritable sieve. As you can see on this shot, the nucleus consists more of holes than walls. Each of these holes has a diaphragm which works in exactly the same way as the diaphragm in your camera. We showed, using various photos taken from the literature (because we do not take them ourselves; you search in big books and find plenty), that the diameter of the holes varies from one organ to another. There is also a great variety of holes in various pathologies.
The only comment we found in a book of over 1,000 pages on anatomopathology by electron microscopy was the following: it was noted that some nuclei had holes and diaphragms. This comment was on one of the first pages of the book, and no further mention was made of either holes or diaphragms.
The picture we are showing was not made to study the nuclei but to locate mitochondria.

An enlarged version of this photo clearly shows the diaphragm in the nuclear holes.

The next figure presents a reconstitution of a nucleus, produced, with some calculations, from a shot of an actual nucleus taken by electron microscopy. This reconstitution was made to illustrate the arrival, through the nuclear walls, of all the radiation we described, calculated and measured previously in the DNA, shown by the straight lines converging towards the centre of the picture.


Although it is true that Beta rays are produced, reach the nucleus and are filtered, they must still cause the DNA to vibrate, which means that DNA cannot be a fixed helix. Indeed, in cell cultures, DNA looks like it does in the photo. As soon as cells are alive, the DNA is in permanent vibration, as the authors said, without being able to explain why. The picture shows a strand of DNA photographed at very short exposure times, which gives this multi-strand appearance. Our interpretation has been much appreciated by molecular biologists of my acquaintance because it broadens their horizons from the point of view of modelling.
These biologists have nothing to do with homeopaths; they are often a long way from homoeopathy. But their main problems were to know what there is before the genetic dogma, and why DNA, even immobilized in extracts and crystallized, sometimes displays what they call piano strings: there are nodes.

If you take other cells on a different day and repeat the same experiment, you find the nodes again but they are different. You would say it was a different wave, a word they already used. This is why our model seems to be very useful to them right now.


In conclusion, this short work, which occupies very few pages as has been said, contributes, in our opinion (but you will be the judges of that), a certain number of points considered fundamental, not only for homeopathic solutions but also, and even more importantly according to the reactions we have had from scientists, for molecular biologists; the model of animal matter we have presented gives them a start with DNA. Indeed, it is all very well to talk about DNA, but at the moment there is nothing before it.
How is that possible?
How can such and such a protein be produced from a fragment of DNA if there is no manufacturing order? Why does DNA, which is the same in all cells, not express the same factors according to its spatiotemporal position?

This is part of the quantum view of the phenomenon which was put into an equation by Henri Berliocchi and used for modelling. Of course there is still a lot of experimental work to be done.

We think we have provided you with a relatively solid basis.