Our brain doesn’t think (and neither does yours)

“Human beings use only 10% of their brains.” “The adult brain does not change.” “The reptilian brain governs children’s behavior.” “The more neurons a person has, the more intelligent they are.” Who among us has never heard these statements? And yet they are all false.

These misconceptions about the brain (“neuromyths”) often spread through the population via certain forms of science communication. They even reach the field of education: a study published in 2014 found that teachers in various countries, both Western and Eastern, tended to believe these kinds of statements.

The spread of these misconceptions is not trivial: it can lead to unscientific and harmful educational strategies, such as excessively enriching children’s environments or obsessively trying to teach them as much as possible before the age of six.

Confusing the part with the whole

Another frequent error in neuroscience communication is perpetuating the so-called “mereological fallacy”: assigning to the part (the brain) psychological attributes that in reality belong to the whole (the human being).

Through a quick search on the internet we can come across expressions such as “the brain thinks”, “the brain remembers”, “your brain sees”, or even “your brain hates”.

These types of expressions are used not only by science popularizers but also in areas such as teaching and even professional science. An example of the latter is one of the objectives of the Australian Brain Initiative, which its promoters describe as “understanding and optimizing how the brain learns in childhood.”

This mereological fallacy forms the conceptual basis of what the philosopher Carlos Moya describes as a new (and paradoxical) “materialistic dualism”. Having overcome the Cartesian dualism of soul and body, we now tend to think of a brain that is independent of, or isolated from, the body, which comes to seem somehow expendable. This does not match reality: the brain is only one part of the nervous system, which in turn is only one part of the body. That body, moreover, is embedded in a social context (it is not a “brain in a vat”) that decisively shapes the development and life history of the individual.

Your feet don’t walk, and your brain doesn’t think

You will surely agree that your feet do not walk: you are the one who walks, using your feet. Similarly, it is not your brain that thinks, remembers, hates or loves; you are the one who does all this, using your brain.

One might object that the comparison between brain and feet is inadequate, since the brain, unlike the feet, exerts great control over the other parts of the body. However, it should not be forgotten that the brain depends, in turn, on other organs for its survival and functioning, especially (but not only) the heart.

The brain is in no way independent of, or the governor of, the rest of the body, as the dynamics of its development show: the first synapses do not appear in humans until the twenty-third week of prenatal life, and the brain does not finish developing until after the age of twenty. In fact, the brain continues to change until the day we die. Simply put, without a body there can be no brain, either functionally or chronologically.

To some extent, it is understandable that scientists or popularizers trained in neuroscience tend to transmit the mereological fallacy, consciously or unconsciously. After all, their specialized knowledge can lead them to overemphasize the importance of one part of reality.

For this reason, just as it has become normal for a “science of the part” such as neuroscience to decisively shape the understanding of the social sciences and humanities, which study the human being as a whole, the complementary path should also become normal: these “sciences of the whole” should contribute to a more complete (and realistic) understanding of the nervous system.

To achieve this, neuroscience should be more receptive to studying and genuinely engaging in dialogue with other disciplines (psychology, education, communication, law, philosophy). Interdisciplinary collaboration could thus help curb the proliferation of neuromyths and of reductionist views of the human being that ultimately hinder the advancement of neuroscience itself. Methodological rigor should not be accompanied by a lack of argumentative rigor. Communicating about the brain, after all, does not mean limiting oneself to the brain.

Author Bios: Jose Manuel Muñoz is a researcher at the International Center for Neuroscience and Ethics (CINET) of the Tatiana Pérez de Guzmán el Bueno Foundation and at the Mind-Brain Group, Institute for Culture and Society (ICS); Javier Bernácer is a researcher at the Mind-Brain Group, Institute for Culture and Society (ICS). Both are at the University of Navarra.
