The International Research Area on Foundations of the Sciences

English Version

Selected News from the World of Science


Newsletter n° 3
February 2016






2016 opened with the important news of the discovery of four new heavy elements of the periodic table. Here we recall other news items that marked the year 2015.


1. Several new elements have been discovered: elements number 113 (Ununtrium), 115 (Ununpentium), 117 (Ununseptium) and 118 (Ununoctium), synthetic elements with very short half-lives.


2. On October 6th, 2015, the Royal Swedish Academy of Sciences awarded the 2015 Nobel Prize in Physics to Takaaki Kajita, of the University of Tokyo, Japan, and Arthur B. McDonald, of Queen's University in Kingston, Canada. The citation reads: "for their key contributions to the experiments which demonstrated that neutrinos change identities. This metamorphosis requires that neutrinos have mass. The discovery has changed our understanding of the innermost workings of matter and can prove crucial to our view of the universe". The official Press Release continues: "For particle physics this was a historic discovery. Its Standard Model of the innermost workings of matter had been incredibly successful, having resisted all experimental challenges for more than twenty years. However, as it requires neutrinos to be massless, the new observations had clearly showed that the Standard Model cannot be the complete theory of the fundamental constituents of the universe. (…) Now the experiments continue and intense activity is underway worldwide in order to capture neutrinos and examine their properties. New discoveries about their deepest secrets are expected to change our current understanding of the history, structure and future fate of the universe".

3. On March 12th, 2015, the second run of the LHC inaugurated a new era of physics. The energy to which the proton beams are subjected is double that of the previous season, namely 6.5 TeV per beam (remember that the Higgs boson was discovered in 2012 at an energy of 4 TeV per beam, for a total of 8 TeV). 13 TeV for the two colliding beams, therefore, to try to complete our knowledge of the particles of the Standard Model and to venture into the study of dark matter and dark energy, which together should make up 95% of the universe. On June 3rd, the initial collisions were made: we are now expecting the first results.
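The beam-energy arithmetic quoted above can be checked in a few lines (a minimal illustration added for this newsletter, not part of the original report): for two identical beams colliding head-on, the available collision energy is simply the sum of the two beam energies.

```python
# For two identical counter-rotating beams colliding head-on, the
# available centre-of-mass energy is the sum of the two beam energies
# (unlike a fixed-target setup, where most of the energy goes into
# the recoil of the target).

def collision_energy_tev(energy_per_beam_tev: float) -> float:
    """Total collision energy for two identical head-on beams, in TeV."""
    return 2 * energy_per_beam_tev

run1 = collision_energy_tev(4.0)   # 2012 Higgs-discovery run
run2 = collision_energy_tev(6.5)   # 2015 second run

print(run1, run2)  # → 8.0 13.0
```

This matches the totals in the text: 8 TeV for the 2012 run and 13 TeV for the 2015 run.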

4. Dark matter could interact with other dark matter differently than by gravitational interaction. By observing the galaxy cluster Abell 3827, astronomers have detected the presence of a gravitational lens (i.e., a special magnifying effect, explained by General Relativity, due to the deflection of light caused by the presence of a large mass) that distorted the image of a galaxy much farther away than that mass. This gravitational lens was not, however, produced by any of the four visible galaxies, or by any other galaxy, and its invisible source was slightly shifted. The only explanation of the phenomenon is that, during the collision of the galaxies, the clumps of dark matter associated with each galaxy interacted with each other: the dark matter of one of them would thus have been left behind, due to a kind of friction.


5. To reconcile Maxwell (light as a wave) and Einstein (light as a particle) has been the interest and dream of many researchers for nearly a century. It has now been announced that wave and particle were at last seen together. The detection of the dual behavior of light was made with an electron microscope, through the interaction between electromagnetic radiation in a nanocavity and an electron beam. So far only films had tried to "say something" and to indicate at least a "significant experience" (if not an actual experiment): in 1976 a film by Merli, Missiroli and Pozzi won a prize for physics by showing the interference of matter at the elementary level. More recently Tonomura, using an electron microscope, filmed for one minute the interference fringes of single electrons, revealing at the same time the two aspects of matter; a video of his experiment is available online.

6. January 2015: the experiment of Michelson and Morley was reproduced at Berkeley on the quantum scale, with results more than 100 times more accurate than previous experiments. The experiment of Michelson and Morley is one of the most famous in the history of science: through it the invariance of the speed of light, regardless of the direction of propagation, was discovered. It also showed that there is no medium for light called "ether" and that space is isotropic. The isotropy of space (i.e., space is identical in all directions) is a prerequisite for Lorentz invariance, used by Einstein first in Special Relativity and afterwards in high-energy (relativistic) quantum physics. The recent test, conducted by a team of physicists at Berkeley, shows that space gives no sign of being "squeezed" by the motion of the Earth (as Michelson and Morley thought possible); the experiment used electrons (Michelson and Morley used photons instead), with an entanglement effect. In practice, the experiment took advantage of electrons oscillating simultaneously in two configurations, like qubits in quantum computers.


7. On Saturday, June 14th, 2015, the crew of Mission Futura returned to Earth, after having been launched toward the International Space Station in February 2015, with the Italian astronaut Samantha Cristoforetti on board. Among the experiments conducted on the Space Station were the study of the reduction of bone mineral mass, induced by permanence in space or by aging on Earth, together with possible countermeasures, and the study of the influence of microgravity on gene expression, in order to understand how to cure diseases related to the cytoskeleton of cells.

8. Hubble Space Telescope. Launched out of Earth's atmosphere on April 24th, 1990, the Hubble Space Telescope celebrated its 25th anniversary in 2015. It is the telescope that has shown us beautiful images of the universe, reaching a spatio-temporal distance (12 billion light-years) not so far from the big bang, dated about 13.8 billion years ago. Hubble will continue to work for a few more years: its successor will be the James Webb Space Telescope, whose launch is planned by NASA for 2018.

9. "Gravitational" telescopes are designed to search for gravitational waves: the waves predicted by the Theory of General Relativity and so far never observed. Among the most famous detectors of this type are LIGO (at Hanford and Livingston, USA) and Virgo (in Cascina, near Pisa, Italy). Now, in the United States and Canada, a great program for the search for low-frequency gravitational waves is about to begin, with two of the best radio telescopes in the world: the Green Bank Telescope and the Arecibo Observatory. These instruments will complement the observations of the PLANCK telescope. The first problem is the variability of the sources needed to generate gravitational waves. In addition, there is the difficulty of having to seek such sources in a region of sky 20 times the diameter of the Moon, and in a relatively short time.

Hubble Image

Image for the celebration of the 25th anniversary of the Hubble Space Telescope: a region of new star formation, located in the nebula Gum 29, about 20,000 light-years away in the constellation Carina.


10. Touch and brain. Proprioceptive and other sensory information do not travel on separate channels, as was previously believed, to be integrated only in higher brain areas: the primary somatosensory cortex already processes information in a much more complex way than previously thought. For the exact perception of an object (haptics), both cutaneous and proprioceptive stimuli are necessary. To understand how they interact, experiments on monkeys have been carried out, identifying the brain areas involved in touching something: both the perception of movement and that of touch turn out to be carried by the same group of neurons. This discovery will help to connect bionic prostheses to the brain better, in order to produce more natural behavior.

11. A new brain-imaging technique helps to understand the functional differences of the autistic brain. The information may also be useful in the understanding of other disorders, such as attention-deficit/hyperactivity disorder (ADHD).

12. A connection between the brain and the lymphatic system has been discovered by a group of researchers at the University of Virginia in Charlottesville. The researchers reversed the standard procedure, which involved cutting the meninges before observing them; by examining them whole, they could detect the minute structures of the lymphatic system in the brain. This discovery could radically change the approach to diseases such as multiple sclerosis and Alzheimer's.

Lymphatic system



13. Well before Homo habilis, our early ancestors were forging tools. Until now it was thought that the oldest tool in human history was the one found in the village of Gona, Ethiopia, dating back to 2.6 million years ago, at the dawn of our genus. Researchers at Stony Brook University (State University of New York) have found instead that the oldest working instrument, discovered in a region of Kenya, dates back to 3.3 million years ago. That takes us back another 700,000 years, to the time of Australopithecus. This seems consistent with the finding that the oldest fossils of Homo habilis show similarities with Australopithecus.


Some of the greatest astronomers of the seventeenth century carefully described the appearance of a new star in the sky in 1670. It was long thought to be a nova, but the idea never entirely convinced. Now it turns out that Nova Vul 1670 was actually an even rarer and more violent type of stellar collision.



In 2013 the Nobel Prize in Physics went to François Englert and Peter Higgs, two theoretical physicists (a third, Robert Brout, died in 2011) who, with their mathematical models, hypothesized the existence of the Higgs boson, then experimentally observed at CERN in 2012. The high energies reached by the LHC (Large Hadron Collider) are needed to study subatomic particles. For composite particles, part of the mass is the so-called "rest mass", derived from the masses of the component particles (quarks) and from their binding energies. In the case of the proton, for instance, the rest mass is given by the sum of the masses of its components (quarks) and of their binding energies within the proton, just as the mass of the atomic nucleus is given by the sum of the masses of the nucleons (protons and neutrons) and of their binding energies. Now, the mass of the proton, of which most of the matter of atoms, and therefore of our bodies, is made, totals about 1 GeV/c² (1 giga-electron-volt, one billion eV, 10⁹ eV). If you add up, however, the rest masses of its three component quarks, you do not reach even a hundredth of the mass of the proton, which is therefore made up of about 99% binding energy.
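The "not even a hundredth" claim can be checked with a rough calculation (a sketch added for illustration; the quark masses are approximate current-quark values of the order quoted by the Particle Data Group, and the exact figures depend on the renormalization scheme):

```python
# Rough check: the rest masses of the proton's three valence quarks
# account for less than 1% of the proton mass; the rest is binding
# (gluon-field) energy. Values below are approximate.

PROTON_MASS_MEV = 938.3   # proton rest mass, MeV/c^2 (~1 GeV/c^2)
UP_QUARK_MEV    = 2.2     # approximate up-quark mass, MeV/c^2
DOWN_QUARK_MEV  = 4.7     # approximate down-quark mass, MeV/c^2

# The proton is a uud bound state: two up quarks and one down quark.
quark_sum = 2 * UP_QUARK_MEV + DOWN_QUARK_MEV

fraction = quark_sum / PROTON_MASS_MEV
print(f"quark rest masses: {quark_sum:.1f} MeV "
      f"({100 * fraction:.2f}% of the proton mass)")
# → quark rest masses: 9.1 MeV (0.97% of the proton mass)
```

So roughly 99% of the proton's mass, and hence of the mass of our bodies, is interaction energy rather than rest mass of constituents.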

You understand, then, why, when you want to produce a fundamental particle artificially in our accelerators, you must produce kinetic energies at least of the same order of magnitude as the mass of the desired particle, by accelerating and colliding beams of massive particles (electrons and anti-electrons at LEP, protons at the LHC): making two counter-rotating beams collide head-on doubles the available energy with respect to a single beam hitting a fixed target. The Higgs field, with its boson, is needed to preserve the law of conservation of charge when the fields of massless particles interact (e.g., the fields of bosons that, per se, according to the fundamental Goldstone theorem, ought to be massless, as is effectively the case for photons and gluons), so as to give rise to the tremendous mass of the Z boson (which, practically, "weighs" as much as an iron atom) and of the W bosons and, finally, to the mass of the quarks and leptons of which all ordinary matter is constituted. In other words, the masses are generated when the Quantum Vacuum (QV) produces the Higgs field, a kind of viscous medium that "brakes" the propagation of other force fields, so that they manifest a sort of "inertia", and therefore a mass of the corresponding force-field quanta (e.g., the Z and W± bosons of the weak force by which quarks and neutrinos interact). On the contrary, other fields, like the electromagnetic one, whose quanta are the photons, by which quarks and electrons interact, or the strong field, whose quanta are the gluons, by which quarks interact, do not couple with the Higgs field. In short, the mass of every elementary particle is proportional to the degree of coupling of its field with the Higgs field, with a key role for the Z and W± bosons and hence for neutrino physics.

The Higgs boson, at least of the type observed so far, therefore explains the deepest symmetries we know in nature, at the level of ordinary matter. If future research helps us to improve our knowledge, going beyond the Standard Model, which effectively "explains" less than 5% of the matter constituting our universe, i.e., "ordinary" matter (see figure), we will be able "to shed light" on the ultimate constitution of matter. That is, to shed light on what we metaphorically denote as "dark matter", to signify the mystery enveloping the missing mass necessary to justify the gravitational force binding together the galaxies and the clusters of galaxies. And to shed light on what we metaphorically denote as "dark energy", to signify the missing energy necessary to justify the acceleration of the expansion of the universe, which "recently" (i.e., some billion years ago) became faster than predicted by the Hubble law.

The cosmic "pizza"


From the theoretical standpoint, the best candidate for such a paradigm shift in fundamental physics is so-called "Quantum Field Theory" (QFT). Until now it has been conceived as an extension of "Quantum Mechanics" (QM), i.e., as the so-called "second quantization" of P. Dirac and R. Feynman. In this interpretation, the study of the fundamental electromagnetic interactions of "Quantum Electrodynamics" (QED) and of the color-charge interactions of the strong force in "Quantum Chromodynamics" (QCD) continues to work according to the mechanistic scheme of Newtonian Mechanics (particles isolated from forces in the mechanical vacuum) which, from Laplace on, in the study of many-body dynamical systems, systematically uses the so-called "perturbation methods". These study particle behavior by systematically invoking the so-called "asymptotic condition": they represent the system by separating the objects at infinite spatio-temporal distances, so as to isolate the particles from interactions by "cutting them off", and thus artificially re-create the condition of particle isolation of Newtonian Mechanics.

The supposition is that such a modeling does not falsify the observed phenomena. The consequence of such an approach is the absolute distinction between particles and interaction force fields, which also constitutes the core of the Standard Model. Think of the fundamental calculation device of the so-called "Feynman diagrams" in QED and QCD (see figure below), where the particles (fermions) are represented by straight lines, the quanta of the force fields mediating the interactions (gauge bosons) by wavy lines, and the coupling strengths of the interactions by the different angles of the diagrams (in the figure, for the sake of simplicity, this variable is not represented). The intuitive model is that particles interact mechanically by exchanging force quanta with each other, like two ice-skaters who move apart by exchanging a basketball.

Feynman diagrams

In other terms, in the Standard Model the ontological particle-force distinction is introduced by interpreting the distinction, in itself only statistical, between fermions and bosons as the difference between the particles constituting the "building bricks" of ordinary matter (quarks and leptons (electrons, neutrinos, etc.)), divided into three families or generations (the first three columns of the figure below), and the quanta of the three fundamental quantum forces, electromagnetic and nuclear strong and weak (photons, gluons, and Z and W bosons, respectively), with the addition of the Higgs field with its boson (see the last two columns of the figure).

The Standard Model

Incidentally, because of the principle of interaction with the Higgs field, the electromagnetic and weak forces were unified in the electroweak force, which therefore mediates all the interactions between quarks and leptons (electrons, muons and neutrinos). For this discovery, Steven Weinberg, Sheldon L. Glashow and Abdus Salam, who proposed the theoretical model, and Carlo Rubbia with Simon van der Meer, who experimentally validated it at CERN's proton-antiproton collider (the SPS, a predecessor of the LHC), were awarded the Nobel Prize, in 1979 and 1984, respectively. Because of this amazing success, CERN decided to build the LHC to search for, and then find, the Higgs boson: the keystone of the Standard Model, but also the starting point of a new era beyond it. The rest is history.


In the Standard Model, indeed, the neutrinos would have to be massless, while on the contrary they have a mass, as is also shown in the figure. This is the deep reason for the 2015 Nobel Prize to Kajita and McDonald, who experimentally observed that neutrinos have mass and continuously change their nature, thereby falsifying the Standard Model as the fundamental physics of matter, even though it remains extraordinarily good for ordinary matter. In science, any true progress depends indeed on a refutation of the preceding theory, but only in its false pretension of "completeness", of being the ultimate theory in some field: it is refuted in its false "metaphysical" pretension of "ultimate truth". Anyway, the empirical evidence of neutrino mass, combined with the growing disaffection of physicists with perturbative methods in quantum physics (it being impossible to consider a quantum system as isolated from the QV fluctuations in which it is immersed from within), are the deep reasons for the growing interest in the alternative paradigm of QFT. Even so, we cannot yet speak of a definitive "paradigm shift": a lot of work is still necessary, as the Stockholm Academy rightly emphasized.

Anyway, in QFT every particle, fermionic or bosonic, is considered as the quantum of a corresponding force field. Roughly speaking, there exist material force fields (fermionic) and interaction force fields (bosonic). In this framework, the suggestion regarding neutrino oscillations is that they consist in so many phase transitions of the same neutrino field. On the other hand, the association of any portion of matter with a force field, and therefore the existence of the QV as the totality of the quantum force fields, is an immediate consequence of the Third Principle of Thermodynamics. It affirms that it is impossible for any physical system to reach absolute zero (0 K). This means that near 0 K there is a mismatch between the variation of the body's energy content and the supply of energy from the outside. We can avoid such a paradox only by supposing that this mysterious inner supplier of energy is the vacuum. Intuitively, the QV can be interpreted as a sort of universal "energy reservoir" of all the energy forms in the universe(s) (the temperature of the QV is indeed above 0 K, even though its energy is bound energy, for lack of any "ordering"), as something including and dynamically connecting everything. In this framework, any physical system, at whatever degree of complexity, is immersed "from within" in the QV.


From this dynamic substratum, all particles and all their systems "emerge dynamically" as so many "spontaneous symmetry breakings" (SSB) of the QV at its "ground state", i.e., without any input from the "outside", because there is no "outside" of the QV: everything is "inside" it. Now, the SSB principle is strictly related to the already quoted Goldstone theorem. According to it, each SSB corresponds to a local "phase coherence domain" among some oscillating fields of the QV, each with its own quanta, so as to constitute a "dynamic system" with its particles, "freeing" at the same time some energy from the bound-energy reservoir of the QV, so as to produce some physical "work". Intuitively, it is like when the sound wave propagating from a guitar string playing an A makes a tuning fork tuned to that same frequency vibrate visibly, while a nearby fork tuned to a different frequency stays still. Pay attention: what makes possible the transmission of "free" energy, able to produce "work", from the guitar to the resonating fork and not to the other one, is the "mode", the "form", of their mutually coherent oscillations, which makes them "entangled" even before the physical signal of the vibrating air arrives.

Now, in relativistic quantum physics the propagation of the phase velocity is practically instantaneous and does not depend on the propagation of the physical signal, so that the latter is "channeled" by the former. Analogously, in the QV, each SSB corresponds to a phase-coherent mode of oscillation of some of the force fields. The quanta of these coherent modes, which necessarily appear in the equations and are experimentally observed and measured, are another type of boson, the so-called "Nambu-Goldstone bosons" (NGB), the silent protagonists of all the Nobel Prizes in Physics quoted in this note. They are, however, a new type of boson because, although they follow the same statistical distribution as the gauge bosons of the fundamental forces, they are not quanta mediating a new interaction force field. In other terms, they do not mediate any energy exchange: they are quanta of the phase-coherent modes of whatever force field through which an energy exchange might occur. Roughly speaking, they are not quanta of "energy" but quanta of "form", and for this strange nature they are called "quasi-particles" in the literature. Indeed, they vanish without residue, and without violating the First Principle of Thermodynamics (energy balance), when the dynamic system they "order" is destroyed. For instance, in a crystal the NGB are named "phonons", and they disappear as soon as the crystal structure is destroyed, e.g., for a diamond, at a temperature over 4000 °C. NGB, indeed, take different names according to the different force fields whose possible coherent modes of stable interaction they "control".

So, in the QFT of condensed-matter physics (atomic and molecular), closest to our everyday experience, the NGB take specific names: in solid-state physics (mechanics), for instance, they are named "phonons". They determine, indeed, the breaking of the "Galilean symmetry" in the propagation of the vibrational motion of molecules. Namely, they determine either its longitudinal propagation alone, corresponding macroscopically to the "liquid state", or both its longitudinal and transverse propagation, corresponding to the "solid state". In this latter situation, in the case of a rigid crystalline lattice of oscillating atoms/molecules, their coherent oscillation modes determine the regular distribution of the particles in the lattice of a crystal.

In the case of magnets, the NGB are named "magnons", because the symmetry they break is the rotational symmetry of the electromagnetic field, so that the magnetization points in a specific direction and, macroscopically, the metal acquires magnetic properties.

In organic matter and in water, in which alone biological molecules are active (this is the deep reason why 80% of our bodies is made of water, and 90% of our molecules are water molecules), the complex structures of the biomolecules and the ordered sequences of chemical reactions constituting each biological function ultimately derive, at the fundamental level, from the NGB named here "polarons". Indeed, what characterizes both these kinds of molecules, water and organic, is a strong electric dipole field. In this way, the basic hypothesis of QFT applied to living matter is that "at the fundamental dynamic level, living matter can be considered as a set of electric dipoles whose rotational symmetry is broken down" (E. Del Giudice).


In such a new theoretical framework, the very wave-particle duality of QM acquires a dynamic meaning. The "waves" of QM, or more precisely the Schrödinger-De Broglie "wave function", is a statistical entity, intrinsically related to the measurements made by an observer, and not a dynamical entity like the oscillations of a force field. On the contrary, in the fundamental framework of QFT, both the wave-particle duality as a dynamic entity and the quantum entanglement related to the wavelike behavior of a dynamic system derive naturally, without any oddity even for common sense (think of the phenomenon of resonance). In fact, when the material fields and their quanta oscillate with one definite phase (are entangled), it is meaningless to describe the system in terms of particles. In such a case it is the collective behaviors that make sense, with the emergence of new properties with respect to the mere sum of the particles (think of crystals or of ferromagnets).

Vice versa, when the phase coherence is lost, it makes sense to describe the aggregate (no longer a single dynamic system) in terms of individual particles. In such a case, the "probability measure", and hence the statistical properties of quantum systems, are also dynamically justified: they are not dependent on the observer's choices.

This change of ontological and epistemological perspective has, of course, many implications, some of them very practical. To maintain the coherence of the Schrödinger wave function (for instance, to use it in quantum computing based on QM, or in the very powerful quantum cryptography), it is necessary to "insulate" the system from any perturbation. The consequence is that a quantum computing device based on such a QM principle must work practically at 0 K (−273 °C): a strong limitation indeed. Think, on the contrary, of the stability of the dynamic phase coherence characterizing a diamond: it is stable up to over 4000 °C.

In QFT, interpretable as a "thermal field theory", as we have seen, the stability of a system depends on its thermal bath, on its "mirroring" with it, on which it intrinsically depends also in the mathematical formalism; it does not depend on the absolute insulation of the system. I.e., in the QFT framework all systems are "open" systems: we are far beyond the Newtonian paradigm. For this reason, at the recent Davos Conference one of the most interesting communications was about optical quantum computers as the protagonists of the computational revolutions of the very near future (2020 is the date suggested), given that quantum-optics nanotechnology has been one of the natural fields of application of QFT for decades.

To conclude, from the philosophical standpoint, the universe of QFT is a dynamic universe of "causal interactions", much closer to the Aristotelian one than to the universe of the "seeing" or "blind" clockmaker of classical or statistical mechanics. The QV is much closer to the próte dynamis, the Aristotelian "primary dynamism" (improperly translated as "prime matter" from the Middle Ages on), than to the "mechanical vacuum" of Newton, of which it is exactly the opposite. The próte dynamis is indeed, for the Aristotelian philosophy of nature, the inner substrate of all material things. And from it "all the corporeal forms are caused into matter, not as if they had been imparted by an immaterial form, but from a matter reduced from potency to act by some physical agent" (Aquinas, Summa Theologiae I, 65, 4).


For these reasons, from the theological standpoint, using such a powerful Aristotelian notion, Aquinas explained that what is proper to the Christian "creation from nothing" is not to be understood in the neo-Platonic way, common also to all non-biblical religions: that is, as putting "order into the chaos" from outside the universe, or as putting "forms into the matter" from outside. "Creation from nothing" consists in affirming that everything, with nothing excluded, including the formless "primary dynamism" of matter from which all derives, is within the creative act of God.

More precisely, the primary term of the creative divine act is precisely the próte dynamis from which everything derives, apart from the human spirit and from the spirit of all intelligent creatures: spirit intended as relation with God, on which their personal freedom is necessarily founded. And this creative act is "outside time", because for Augustine and Aquinas, as for us, and differently from Newton, time is "inside", not "outside", the universe. So that, for them, as for Aristotle and for us, and differently from Descartes, for whom God "triggered" the inertial machine of the universe, it is impossible "from inside time" to demonstrate that time has an "absolute beginning". In a word, the "In the beginning" of the Bible is timeless: it is "metaphysical", not "physical". This is also in accordance with the literary sense of the biblical text, where the counting of time (the six days) starts after God gives existence to the limits of the universe (heaven and earth), to distinguish it from Himself, so as to remain "outside" the universe, with nothing "inside" it apart from the primordial chaos. So, given this preamble, which is very precise in distinguishing this position from the "stories of creation" of other religions, the biblical author can freely use these other stories in the remaining chapters of Genesis about the origins, up to chapter 11.

The beginning of Genesis indeed reads:
(1) In the beginning God created heaven and earth. (2) Now the earth was a formless void, there was darkness over the deep, with a divine wind sweeping over the waters (Gen I).

Those physicists, even eminent ones like S. Hawking (Hawking & Mlodinow, 2010) or L. Krauss (Krauss, 2012), assisted by a biologist, the ubiquitous R. Dawkins, with his afterword to Krauss's book, are therefore completely missing the point when they pretend to explain the creatio ex nihilo using the infinite SSBs of the QV. The QV is not "nothing", just as the próte dynamis of Aristotle or the "formless void" earth of Genesis are not "nothing". If anything, it is the Newtonian "mechanical vacuum" that bears some resemblance to the metaphysical "nothing". It is not polite, in such a case, to play on people's ignorance by confusing the QV with the mechanical vacuum.

Anyway, what these scholars are effectively criticizing is the neo-Platonic theology of a God who, like the Platonic Demiurge, designs and creates things by impressing "models into the sand" from the outside: a vision systematically incompatible with an evolutionary vision of the universe(s) and of life like the contemporary ones. On the contrary, if we use Aquinas' scheme, physicists and natural scientists, on the one hand, and metaphysicians and theologians, on the other, can each do their respective jobs, with reciprocal respect and without any interference.

The former must suppose the existence of the QV and rewrite all of fundamental physics, and all the natural sciences, on this basis. The latter have to try to find a reasonable answer to the ultimate, eternal questions, such as "why is there something rather than nothing?", by rewriting a metaphysics and a theology able to interface with the extraordinary progress, and risks, of contemporary and future science and technology. The aim is to avoid the risk that, to scientifically educated people, their disciplines look like tales for children "amusing themselves on the shore". Anyway, as we see, in our age of "paradigm shift" there is a lot of work to do, not only for physicists, but for philosophers and theologians too.


Hawking, S., & Mlodinow, L. (2010). The grand design. New answers to the ultimate questions of life. London: Bantam Press.
Krauss, L. M. (2012). A universe from nothing. Why there is something rather than nothing. Afterword by Richard Dawkins. New York: Free Press.


These are some of the most important events in which members of IRAFS were involved.

1. 2015, YEAR OF LIGHT. A major conference was held in Rome to celebrate the International Year of Light, involving prominent physicists and philosophers of our age. Its title was Fiat lux, that is, "Let there be light" ( fotoni-dal-Big-Bang-a-Roma/612421/ ). We also refer to the interesting historical and philosophical treatment in the Interdisciplinary Dictionary of Science & Faith ( della-luce-2015 ).

2. DUBROVNIK, CROATIA, INTER-UNIVERSITY CENTRE. IUC-2015. INTERNATIONAL CONFERENCE ON: “FORMAL METHODS AND SCIENCE IN PHILOSOPHY. DUBROVNIK (CROATIA), MARCH 26-28, 2015”. The Conference was aimed at presenting many high-level contributions in formal philosophy, that is, philosophy using the formal methods of logic, above all modal and philosophical logic, going far beyond the results obtained in recent decades by analytic philosophy using only mathematical and Fregean logic. Two of our professors, Gianfranco Basti and Flavia Marcacci, presented contributions at this Conference.

3. JOAO PESSOA, BRAZIL, FEDERAL UNIVERSITY OF PARAIBA. FIRST INTERNATIONAL CONFERENCE ON: “LOGIC AND RELIGION. JOAO PESSOA (BRAZIL), APRIL 1-5, 2015”. The Conference was aimed at presenting, for the first time in history, the results of formal philosophy as a theoretical tool for inter-religious dialogue, helping to avoid misinterpretations and linguistic confusion. Several scholars presented high-level papers formalizing religious doctrines from the different Traditions to which they belong. Professor Gianfranco Basti delivered an Invited Plenary Speech on “A Formalization of Aquinas' Theory of Creation as Participation of Being”.

4. UNILOG 2015, 5th WORLD CONGRESS AND SCHOOL OF UNIVERSAL LOGIC, ISTANBUL, TURKEY, JUNE 20-30, 2015. Workshop: Representation and Reality: Humans, Animals and Machines. OVERVIEW: Our workshop can be considered the continuation of part of the symposium “Computing Nature”, organized by Gordana Dodig-Crnkovic and Raffaela Giovagnoli at the AISB/IACAP World Congress 2012, and of “Representation of Reality: Humans, Animals and Machines” at the AISB50 Convention at Goldsmiths 2014. We aimed at offering a further occasion to discuss the problem of “representation” in humans, other animals and machines. This problem is closely related to the questions of which capacities can plausibly be computed and which approaches to solving the problem are most promising. At this Conference, with about 300 speakers from all over the world, Prof. Basti and Prof. Raffaela Giovagnoli, chair of the Workshop, presented two invited papers.

5. SUMMER SCHOOL LILLE, 1st International Summer School for Sciences, History and Philosophy of Sciences & Science Education: New Educational and Fundamental Insights for Sciences and History-Epistemology-Philosophy of Sciences, & Science Education, with Roundtable & Open Debate: Exploring Changes in How the Histories and Philosophies of Sciences Have Been Written: Interpreting the Dynamics of Change in these Sciences and Interrelations Amongst Them—Past Problems, Future Cures? June 22nd-26th, MESHS, Lille, France. With the participation of Joseph Agassi (Israel), Jean Dhombres (France), Helge Kragh (Denmark), Nicholas Maxwell (UK) and Patricia Radelet de Grave (Belgium) as Lecturers and Keynote Speakers. Other lecturers: Raffaele Pisano, Giuseppe Bellavia, Didiel Christeller, Flavia Marcacci, Romano Gatto, Snezana Lawrence and others. The aim of this first international Summer School (ISSHPSE 2015) was to provide a platform for young researchers, post-docs, Ph.D. candidates, teachers and practitioners from both academia and schools to meet and share cutting-edge developments in the field. In particular, ISSHPSE 2015 aimed mainly at improving and innovating scientific, historical and philosophical techniques of investigation within science.


Science, Philosophy, and Religious Commitment: Catholic Engagement in Philosophy of Science. University of St. Thomas, St. Paul, Minnesota, June 26-28, 2016. The aim of the conference is to cultivate a sober perspective on, and insight into, the history and current state of engagement with philosophy of science among Catholic intellectuals, with an eye to “What now?” sorts of questions. Invited speakers: Paul Allen, Nicanor Austriaco, Stephen Barr, Gianfranco Basti, Robert Deltete, David Diekema, Flavia Marcacci, Patrick McDonald, Meghan Page, Anne Peterson, Lidia Obojska, Philip Sloan, Brendan Sweetman, Nicholas Teh.

The 7th International Conference of the ESHS will be held in Prague, September 22-24, 2016.



February 9, seminar “Ontos e Logos. Ontologia formale, linguaggi, culture”

3:00-5:30 p.m., Aula Papa Francesco

During the seminar, the volume by G. Basti and S. Mobeen, "Ontologia formale" (Rome: Apes Editrice, 2015), will be presented, containing contributions by Habermas, Searle, Ales Bello, Basti, Kanakappally, Poli, Mobeen and Giovagnoli.

Moderator: Philip Larrey (Pontifical Lateran University). Introduction: A. Iodice (President of the Istituto di Studi Politici San Pio V). Talks by
Guido Traversa (European University of Rome) and Daniele Santoro (CNR-IRPPS - Luiss). Discussion introduced by A. Ales Bello, G. Basti, R. Giovagnoli, B. Kanakappally, C. Ariano.


April 11, seminar “Logos e Pathos. L’immagine tra filosofia e scienza”
3:00-6:30 p.m., Aula Papa Francesco.

Moderators and introduction: Patrizia Manganaro and Gianfranco Basti (Pontifical Lateran University). Talks: Roberta Lanfredini (University of Florence), “Res viva. Il sentire in Maurice Merleau-Ponty”; Andrea Pinotti (University of Milan), “Violenza all’immagine, violenza in immagine. Questioni di empatia”; Gianitalo Bischi (University of Urbino), “Il logos delle forme: generare immagini dal caos”.

May 9, seminar “Logos e Pathos. Aspetti epistemologici e terapeutici della cura”. 3:00-6:00 p.m., Aula Papa Francesco

Moderator and introduction: Patrizia Manganaro (Pontifical Lateran University), “Empatia e complessità. La svolta fenomenologica della psichiatria”. Talks: Luigina Mortari (University of Verona), “La pratica di cura: posture dell’esserci”; Cristina Trentini (Sapienza University of Rome), “Il sé intersoggettivo tra psicologia e neurobiologia. Origini e sviluppi terapeutici”.