Tennis player Serena Williams does not change her socks during a tournament.
Some athletes perform specific actions before or during a match, actions that are exact repetitions of what they did before or during earlier successful matches. They wear a certain set of clothes, eat a specific meal before a match, or clip their fingernails during timeouts. Is this superstition? Do they really think there is a cause-and-effect relationship? We are inclined to look for a magical connection between causes and effects that are actually unrelated. Finding magical connections is normal for children; it is commonly regarded as part of their development. But when adults do it, we perceive it as rather odd. So why do sensible people think or act like this? Maybe it gives them a sense of control over what might happen. There is no doubt a good reason for this behavior, but we do not know it.
It is appealing to think in terms of causes and effects. High blood pressure, smoking, high cholesterol and obesity are often referred to as causes of cardiovascular disease. But they are, of course, just risk factors, albeit important ones. If high cholesterol were the cause of cardiovascular disease, everyone with high cholesterol would develop cardiovascular disease. This is not the case. We are also inclined to think in terms of causes when treating diseases. For example, scientific studies have established a correlation between running and recovery from depression. Does this mean that running is the cause of the recovery? No, it does not. After all, many people who suffer from depression do not benefit from running. If there were a cause-and-effect relationship, all people with depression would recover through running. So this is not a case of cause and effect, yet it is often perceived and communicated that way.
A pattern is not the same as a cause-and-effect relationship
When analyzing large amounts of research data, you may discover patterns. But beware: a pattern is not the same as evidence of cause and effect. Sometimes the relationship is exactly the other way around. In an interesting TED talk [1] about correlations, Ionica Smeets pointed out that research conducted in the US in the 1970s showed that children with high self-confidence often performed well at school. This research received a lot of attention, and for decades parents tried to enhance their children's self-confidence, believing it would improve their school performance. Many years later, new research showed that the opposite is true: children who do well in school gain more self-confidence, and not the other way around.
A pattern can be perceived as proof of a relationship between two things, while another factor, a third factor, actually causes this relationship. For example: ‘Compass needles always point to where most of the polar bears are.’ This deliberately overlooks the fact that most polar bears live near the North Pole and that compass needles always point to the North Pole. Likewise, it is not difficult to find a pattern connecting obesity to cardiovascular disease. Yet we now know that cardiovascular disease is influenced by many factors.
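For readers who like to experiment, the third-factor effect can be illustrated with a small simulation. The numbers below are entirely hypothetical (they are not taken from any study mentioned in this chapter): a hidden factor influences two variables that have no direct effect on each other, yet a strong correlation between them appears.

```python
import random

random.seed(42)

# Hypothetical hidden third factor: overall metabolic health (0 = poor, 1 = good).
health = [random.random() for _ in range(10_000)]

# Two variables that both depend on the hidden factor, but not on each other:
# poorer health raises both body weight and heart-disease risk (plus random noise).
weight = [100 - 40 * h + random.gauss(0, 5) for h in health]
disease_risk = [0.5 - 0.4 * h + random.gauss(0, 0.05) for h in health]

def correlation(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Weight and disease risk correlate strongly, even though neither causes the
# other; the shared hidden cause (health) produces the pattern.
print(round(correlation(weight, disease_risk), 2))
```

Running this prints a correlation well above 0.8, a ‘pattern’ that vanishes as an explanation the moment the third factor is taken into account.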
Focus on causes and their relationships
If cause-and-effect thinking is so appealing, let us take all causes into account. For this we can use the concepts of upward and downward cause-and-effect relationships: upward causality and downward causality [2]. When we use the concept of upward causality, we focus on a single cause-and-effect relationship. In doing so we try to gather knowledge: we attempt to understand how something works, and we look for scientific evidence that corroborates a cause-and-effect relationship. An example: gravity causes an apple to fall.
In addition to this one-on-one relationship between gravity and a falling apple, we can study causes (and their consequences) that act together, and at the same time, in a subject’s environment. This approach, so-called downward causation, is especially effective when dealing with complex subjects such as the weather, which is affected by a wide array of factors: the sun, wind, humidity, air pressure, the effects of land and sea, and, of course, butterflies.
Upward causation gets the most attention
We are used to focusing on upward causation. The medical model, for instance, tends to follow the route problem-examination-diagnosis-treatment, with the aim of eliminating (or reducing) the problem. This approach is based on the idea of a one-on-one relationship between a problem and its solution. When applied to relatively stable issues it serves healthcare practitioners, and healthcare in general, well. A bacterium is the cause of an infection and must therefore be removed. A blocked blood vessel causes an infarction and must be opened or bypassed surgically. Iron deficiency is the cause of anemia; iron must be supplemented. The development of evidence-based medicine (EBM), which led to the implementation of protocols, was a logical consequence of this approach. In a broader sense, this approach is also referred to as evidence-based practice (EBP), a term used by the paramedical professions, government bodies, regulators and health insurers.
Despite being broadly accepted among healthcare professionals, criticism of EBM is on the rise. An important objection concerns EBM’s tendency to focus on populations instead of individuals. Research findings deduced from a group study say something about the (average) effect in the group that was studied. However, an individual may have very different characteristics and move within a different context than the average of the study group. When a drug has been tested on a group of healthy young men, what do the test results tell us about its effect on an elderly lady who has various illnesses and is also taking other drugs? In 2017, the Dutch Council for Public Health and Society (RVS) published the report ‘No evidence without context’ (Zonder context geen bewijs) [3]. The report asks how evidence deduced from scientific research can be applied in different situations. No evidence without context… beautiful, it could have been the title of this book. Why? Because this is our argument in a nutshell. Influences from the context, the many other causes, cannot be ignored. Upward and downward causality both matter.
Which p-value is low enough?
When is scientific evidence sufficient? There is an ongoing debate [4] about tightening the p-value threshold. Should the classic 5 percent threshold be reduced to 0.5 percent? This threshold is often used to indicate whether a result is statistically significant or not. The higher the threshold, the greater the risk that research findings are due to chance. Furthermore, researchers may feel pressure to come up with significant results, leading them to scavenge through research data until they find something with a sufficiently low p-value: so-called p-hacking. Discarding results that don’t fit the narrative while publishing findings obtained by p-hacking is hardly scientific.
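How easily chance alone produces ‘significant’ findings can be shown with a small simulation. This is a sketch with made-up numbers, not a reproduction of any real study: we test a coin that really is fair many times over, so every null hypothesis is true, and count how often the classic 5 percent threshold is crossed anyway.

```python
import math
import random

random.seed(1)

def p_value_fair_coin(heads, n):
    """Two-sided p-value for 'is this coin fair?' via a normal approximation."""
    z = (heads - n / 2) / math.sqrt(n / 4)
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# 1,000 'studies' of 100 flips each, with a coin that really is fair,
# so every apparent effect is pure chance.
n_flips, n_studies = 100, 1000
false_positives = 0
for _ in range(n_studies):
    heads = sum(random.random() < 0.5 for _ in range(n_flips))
    if p_value_fair_coin(heads, n_flips) < 0.05:
        false_positives += 1

# Roughly 5 percent of the studies reach 'significance' by chance alone --
# exactly the findings a p-hacker would keep while discarding the rest.
print(false_positives / n_studies)
```

Around one in twenty of these null studies comes out ‘significant’, which is why running many analyses and reporting only the significant ones is so misleading, and why a lower threshold such as 0.5 percent has been proposed.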
Researching irregular subjects
There is a growing awareness that researching irregular subjects is difficult. After it became known that quite a few studies, including a good number of famous ones, yielded different results when repeated, the scientific world started to pay more attention to replication research: the repetition of research that has been conducted by others. The results are remarkable. Let us present two examples. In 2005, Ioannidis [5], a researcher who has done much to verify previous research, stated that more than half of all results in the biomedical literature were false. And in 2012, researchers from the biotechnology company Amgen [6] reported that they were able to confirm the findings of only 6 out of 53 landmark studies in oncology and hematology. You may wonder whether repeating research on highly variable subjects is even possible at all. (Remember the Butterfly Effect?) Yet repeating a study to the best of one’s ability is, of course, always possible.
The discrepancies found in replication studies show how difficult it is to research complex and dynamic subjects. And it’s not just about conducting the research. On top of that, there are the ethical issues about what should or should not be published. In 2018, the Royal Netherlands Academy of Arts and Sciences (KNAW) published a set of recommendations [7] for improvement.
Summary Chapter 6:
- A pattern is not the same as a cause-and-effect relationship.
- We predominantly focus on upward causation. A good example is the medical model where the route problem, examination, diagnosis and treatment is followed, with the aim of eliminating (or reducing) the problem. But criticism is increasing.
- There is a growing awareness that researching dynamic subjects is difficult.
- Many studies on complex subjects yield different results when their procedures are repeated in ‘exactly’ the same conditions.