
Don't be fooled! Why modern science is largely a masquerade for real science.

  • leogabe
  • May 8, 2023
  • 18 min read

Updated: Oct 14, 2024

'When one sees Eternity in things that pass away and Infinity in finite things, then one has pure knowledge. But if one merely sees the diversity of things, with their divisions and limitations, then one has impure knowledge'

Lord Krishna, the Bhagavad Gita


Modern science's great problem - a dearth of creativity and imagination.


The issue is that modern scientists, and modern science itself, so often try to simplify life and her biggest mysteries, and then, out of insecurity, end up ridiculing the unexplainable yet most interesting aspects of life - all in their quest to verbalise an official set of truths to be followed, separating fact from fiction. In this way, modern science is now modern society's equivalent of what religion was to past societies, and has stretched itself into an overbearing, dogmatic authority that serves to comfort and dull the masses, and that ostracises the creative, open-minded and curious. Modern science resembles the specialist-driven, uncreative pursuit of knowledge, and that is doomed to failure and dogma. As Krishna states, 'if one merely sees the diversity of things, with their divisions and limitations, then one has impure knowledge'. In contrast to pure knowledge, modern science is over-systematic and over-categorised; over-specialised and dry; over-analytic and short-sighted.


This is a view commonly shared by outspoken experts. The essayist and mathematical statistician Nassim Taleb pointed out that 'we have managed to transfer religious belief into gullibility for whatever can masquerade as science.' In an interview, Kary Mullis, the biochemist and Nobel Prize winner who invented the Polymerase Chain Reaction (PCR) technique, described how 'the CDC (Centers for Disease Control and Prevention)… have an agenda which is not what we like to have, given we pay for them to take care of our health. They are considered the final arbitrators of what's good for the planet or what's bad for the planet, and they haven't the slightest of ideas. Instead of wearing white robes they wear white lab coats, and they don't have to understand what it is they are making you do. I think people fall naturally into it because there is a need in humanity for something like a religion.' The fall in the importance of priests has seen a rise in the importance of scientists in modern-day culture.


The psychiatrist and writer Iain McGilchrist highlights how the crucial problem is a dearth of creativity and imagination among modern scientists, something the scientific establishment itself does not permit, writing: 'most scientists have got where they are by being good at following procedures, not by seeing when to (discredit them)... for all except the best scientists, creativity is not their strong suit.'


He explains that this is partly out of fear of suffering obloquy, and also because 'creativity is one of those areas where there just aren't rules and procedures.' As Schopenhauer observed, reasoning 'can give only after it has received'; and as Poincaré put it, 'it is by logic that we prove, but by intuition that we discover.' The issue is that if truth requires intuition and creativity to grasp, it can only be discovered, not mathematically or verbally constructed and calculated. Thus science, with its insistence on proof, will always fall short of explaining truth - it is a fatally flawed discipline if that is its goal, and most of the most successful and groundbreaking scientists acknowledge as much.


One of the co-founders of Quantum Theory, Max Planck, observed: 'we have no right to assume that any physical laws exist, or if they have existed up to now, that they will continue to exist in a similar manner in the future.' No matter how well you describe yellow to someone colour-blind, and no matter how much science that person reads about the colour yellow, if he one day wakes up able to see colours he will be shocked by yellow itself - the science can't come close, serving only as a dull aid to the imagination.


Yet science is still extremely important, and at its best totally creative; but for the most part modern scientists are not, and therefore they are being wasted. A small minority of scientists have fortunately found themselves untrapped by spirals of logical thinking and past scientific dogma, and this is what allowed them instead to instinctively use their creative sides, giving science its greatest meaning and uses. It is telling that the majority of great scientific discoveries through the ages have come from people who weren't paid to be great scientists, and countless were stumbled upon by the most extravagant characters. Consider Gregor Mendel, for instance, who developed the three principles of inheritance describing the transmission of genetic traits before anyone knew genes existed, and who died over a decade before his work was even taken seriously. Further, Einstein was working in a low-level job as a patent examiner during the time he 'hatched his most beautiful ideas'. He famously stated that in science 'imagination is more important than knowledge'. Regarding knowledge, he further emphasised that 'pure logical thinking cannot yield us any knowledge of the empirical world; all knowledge of reality starts from experience and ends in it'. He even wrote: 'I very rarely think in words at all. A thought comes, and I may try to express it in words afterward.'

Whilst Einstein thought in musical shapes, Niels Bohr - the other co-founder of Quantum Theory alongside Planck - kept notebooks containing no words or formulae, just pictures. Moreover, Thomas Edison was an applied chemist whose numerous inventions and businesses were enabled by his chemical knowledge; yet he only attended school for a few months, as it didn't suit his personality, and he gained a reputation for taking naps 'whenever and wherever he needed them'. Likewise, Charles Darwin (who theorised evolution by natural selection) dropped out of university after two years of studying medicine, then trained to be a parson (an Anglican clergyman) before travelling to the Galapagos as a guest of Captain FitzRoy. He had no institutional support for his scientific discoveries, nor even a particularly scientific background. It's clear that unorthodoxy comes part-and-parcel with great discoveries and genius individuals, and that relaxation stimulates creativity. Funnily enough, science has now shown that 'rationalising and verbalising have a damaging effect on creativity, at least during the generative process itself'. Of course, modern-day scientists don't tend to be promoted to positions of authority for taking siestas and breaking social norms like Edison, or for being generalists like Darwin; and universities distinctly separate the study of the arts from the study of the sciences in their bachelor's courses - as if the two, despite the Renaissance, weren't necessarily overlapping but stood as polar opposites on the academic spectrum.


It is equally telling that the majority of the great scientific discoveries of the ages have come from young adults. Iain McGilchrist points out that 'original work in a field like physics was often in the past achieved by those least worn down by the pressures of conformity and most open to fresh thinking: the young. Newton was 23 when he developed his theory of gravity. Einstein was 25 when he discovered the theory of relativity; Pauli was 25 when he announced his exclusion principle; Heisenberg was 26 when he presented his uncertainty principle; Dirac was 26 when he discovered his relativistic equation; Bohr was 28 when he developed his atomic theory; De Broglie was 28 when he made his breakthrough in wave mechanics; and even Schwinger and Feynman found the renormalisation solution before the age of 30, despite the interruption of World War II. This is not because young people are cleverer, and it cannot be that they know more: it is surely that their minds are more open than those of their seniors.'



The scientific community needs liberating from the paper and peer-review system (and its specialists and administrators) so that science can be understood again.


Nobel Prize-winning chemist Kary Mullis summed up the situation of modern science pessimistically but realistically: '(People) do not know the best and brightest. They don't really know the difference (from other scientists). I like humans, don't get me wrong, but the vast majority of them do not possess the ability to judge who is and who isn't a really good scientist. That's a problem, that's the main problem of science in this century… There are no wise old men out there at the top of science... I would have thought till '68 that if you tried to publish a dumb paper in a journal like Nature it wouldn't get published; but I later tried... to publish a good paper, the invention of PCR, in the same journal, and they didn't take it… There isn't an "up there"... the academy of science are just like everyone else, the editors of journals (etc.)… there are no wise men at the top making sure that we don't do something really dumb.'

Truthfully, modern conventional science is in disarray. A sizable survey of 1,576 researchers across the full spectrum of scientific disciplines, published in Nature, revealed that 'more than 70% of researchers had tried and failed to reproduce another scientist's experiments, and more than half had failed to reproduce their own experiment.' Over half of the researchers thought there was a 'significant crisis' in research reproducibility, and 97% that there was some sort of crisis.


Stanford professor John Ioannidis published in 2005 a famous paper titled 'Why most published research findings are false'. He deduced that competition for grants, awards and financial incentives was a principal reason why research findings lack accuracy, stating: 'The hotter a scientific field (with more scientific teams involved), the less likely the research findings are to be true... With many teams working on the same field and with massive experimental data being produced, timing is of the essence in beating competition.'


There are multiple reasons why research papers are unreliable and cannot be reproduced. Firstly, scientists are as prone to bias as anyone else (perhaps more so, as some studies suggest higher-educated people are more ideologically biased than those with less education). Secondly, it is common practice, and almost acceptable, for scientists to cheat. In an even larger survey of 3,200 scientists, a third admitted they had engaged in at least one common misdemeanour during the past three years. McGilchrist points out: 'Since this finding is based on a questionnaire survey, with a response rate of about 45%, the figure may be a serious underestimate, since more corrupt scientists are less likely to participate in a survey of this kind. Another study, which involved inducements to be honest, found that "even raw self-admission rates [frank admissions of guilt] were surprisingly high, and for certain practices, the inferred actual estimates approached 100%, which suggests that these practices may constitute the de facto scientific norm".' To get published, scientists disproportionately aim for positive findings, since negative findings are more commonly ignored, and they regularly data-mine. McGilchrist writes: 'in order to achieve this desirable outcome, researchers may keep analysing the results in slightly different ways until they get a significant p-value (a measure of statistical significance, conventionally less than 0.05)', which 'makes one's research much more likely to be published.'
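
To see how this kind of data mining manufactures significance, here is a minimal simulation - a sketch of the general statistical mechanism, not of any particular study (the twenty 'analyses' per dataset are an assumed, illustrative number):

```python
# Minimal p-hacking simulation: generate pure-noise data, analyse it
# many arbitrary ways, and keep only the best p-value. A single test
# is 'significant' (p < 0.05) just 5% of the time under the null;
# the best of twenty is significant far more often.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def best_p_value(n=100, n_analyses=20):
    """Analyse one pure-noise dataset n_analyses different ways."""
    data = rng.normal(size=n)        # no real effect exists
    best = 1.0
    for _ in range(n_analyses):
        # each 'analysis' splits the sample by an arbitrary covariate
        split = rng.normal(size=n) > 0
        p = stats.ttest_ind(data[split], data[~split]).pvalue
        best = min(best, p)
    return best

results = np.array([best_p_value() for _ in range(2000)])
rate = np.mean(results < 0.05)
print(f"noise datasets yielding a 'significant' finding: {rate:.0%}")
# prints roughly 64% (about 1 - 0.95**20), versus the nominal 5%
```

Take twenty nearly independent looks at the same noise and roughly two out of three datasets will hand you a publishable p-value.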


Thanks in part to the explosion of scientific knowledge over the last few centuries, McGilchrist claims, 'all scientific enquiry has now to be highly specialist in nature.' It is a field full of experts - the expert famously defined as one 'who knows more and more about less and less'. Generalists stand less and less chance of contributing to science because papers are now so specialised and acronymic that they are too tedious and complex to be understood by anyone other than fellow specialists in the same narrow field. Worse, papers are not even necessarily written by a human. French computer scientist Cyril Labbé generated over a hundred fake papers under an invented author, Ike Antkare, who duly became the 21st most cited scientist in the world. This led two supposedly highly reputable science publishers, Springer and IEEE, to remove more than 120 gibberish computer-generated papers from their databases. Overall, it's estimated that only 20% of cited papers have actually been read; British biologist Peter Lawrence illustrated this when he publicly revealed that of the 48 citations of one of his articles, 37 were irrelevant and three completely wrong. In China you can pay to be cited on someone else's paper, or to have 'prestigious authors attached to your own' - such is the extent of the blatant corruption of science.


Richard Smith, who was editor of the British Medical Journal and chief executive of the BMJ Publishing Group for 13 years, and therefore had decades of experience with peer review, opened up about its actual realities, describing it as 'a subjective and, therefore, inconsistent process … something of a lottery' and joking that 'most scientists believe in it as some people believe in the Loch Ness monster'. Worse, the peer review system invites the sabotage of good research. McGilchrist states: 'prestigious investigators may suppress via the peer review process the appearance and dissemination of findings that refute their findings, thus condemning their field to perpetuate false dogma. It is clearly also possible to produce an unjustly harsh review to block or delay a competing publication and even to steal a competitor's ideas...' More to the point, history shows such bias is inevitable. McGilchrist explains: 'The certainty that what one was taught to believe must be right afflicts even the best scientists. It meant that Tycho Brahe could never accept Copernicus's discovery of the circulation of the earth round the sun, and that Justus von Liebig, the founder of organic chemistry, could not accept the germ theory of disease put forward by Louis Pasteur and Robert Koch. In the words of Bernard Barber, "because of their substantive conceptions and theories, scientists sometimes miss discoveries that are literally right before their eyes."'


The peer review process is so ineffectual that when the British Medical Journal conducted several studies in which 'major errors were deliberately inserted into papers and then sent to many reviewers', they reported that 'nobody ever spotted all of the errors: some reviewers did not spot any, and most reviewers spotted only about a quarter'. Stephen Ceci and Douglas Peters took 12 published papers from prestigious institutions and resubmitted them to the same journals, changing only the names of the authors and institutions to fictional ones. Only three were detected as resubmissions, and of the other nine, eight were rejected - one for 'serious methodological flaws'. They pointed out that 'none of the twenty reviewers who recommended rejection even hinted at the possibility that a manuscript might be acceptable for publication pending revision or rewriting.' Even Linus Pauling, a winner of two Nobel Prizes, was rejected and left unfunded by the peer review system for his work on vitamin C and cancer.


All of this suggests that peer review and citations are nearly pointless systems, even potentially harmful ones. This should come as no surprise: science advanced considerably before peer review was in fashion. McGilchrist points this out: '[When] Rutherford discovered the nucleus of the atom, he published it in a paper with just a single author: himself. By contrast, the two 2012 papers announcing the discovery of the Higgs particle had roughly a thousand authors each. This is a recognisable trend... For hundreds of years during the great age of Western science, papers were reviewed by the editor or editors alone... Of Albert Einstein's 301 publications there is evidence that only one underwent peer review (in 1932): "interestingly, he told the editor of that journal that he would take his study elsewhere".' If peer review and citations didn't matter in the past, why should they matter now?




Modern Science - a great waste of money and human effort? (compared to the past)


‘The feedback loop between science, empire and capital has arguably been history's chief engine for the past 500 years... the twin turbines of science and empire were latched to one another, and both were hitched up to the money pump of capitalism.’

Yuval Noah Harari


Almost all pivotal contributions to science in history came from scientists following their intuitions and questioning old data, not from scientists conducting expensive experimental work and gathering new observations. Take Copernicus and Galileo reinterpreting the motions of the planets in the sky as revolutions around the sun, or Newton deducing gravity after being inspired by how apples fall. Take Einstein's proposal of the theory of relativity after questioning the behaviour of light, without having performed a single experiment on the subject. Or, more recently, Watson and Crick, who won the Nobel Prize for their model of the structure of the genetic molecule without ever experimenting on DNA themselves - they based their model on existing data produced principally by Rosalind Franklin and Maurice Wilkins.


The big point is that it is in the last century that almost all pivotal contributions to science should have been made! We live in a time when, relative to population, thousands more scientists exist than ever before - more highly educated than ever before, with access to billions more in financial investment - yet science is advancing more slowly than in the centuries before. In 1969 it was calculated that 'of every eight scientists who ever lived [in the history of the world], seven are alive today (in 1969); yet we cannot find among them the eight modern Galileos, Plancks, Einsteins, Kochs, Pasteurs or Mendels that these statistics predict.' This figure may even have been an underestimate, since Derek de Solla Price, considered the father of scientometrics (the science of studying science), estimated that 90% of all the scientists who ever lived were alive in 1961. Over the last fifty years the number of scientists has continued rising, and a huge number more have PhDs (see graph below).
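
As an aside, the 'seven of every eight' figure is just what steady exponential growth produces. Here is a back-of-envelope sketch (the doubling time and career length are my illustrative assumptions, not de Solla Price's actual method):

```python
# Toy model: what fraction of all scientists who ever lived are alive
# 'today' if the workforce doubles every 15 years and a career lasts
# about 45 years? (Both numbers are assumptions for illustration.)
DOUBLING_YEARS = 15
CAREER_YEARS = 45
YEARS = 400                                     # horizon, e.g. 1600-2000

growth = 2 ** (1 / DOUBLING_YEARS)              # annual growth factor
cohorts = [growth ** t for t in range(YEARS)]   # scientists starting in year t

alive_now = sum(cohorts[-CAREER_YEARS:])        # careers still running
ever_lived = sum(cohorts)
print(f"fraction alive today: {alive_now / ever_lived:.1%}")
# prints ~87.5%, i.e. 'seven of every eight'; shorten the doubling
# time to 10 years and the figure rises past 95%
```

In other words, those striking ratios follow directly from the explosion in scientist numbers - which makes the absence of proportionally more Galileos all the more damning.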




In recent times scientists have specialised in conducting incredibly costly research requiring high-tech equipment, and a focus on constant experimentation has replaced traditional contemplative thought and analysis. Science wasn't a field with major investment before World War II: total (public and private) research and development funding in the US, for example, amounted to roughly $250 million per year. It was only when war pushed governments to enlist scientists - first to build the atomic bomb, then to win the arms race against Russia - that science became more about the money. The nuclear bomb was invented after a team of scientists led by Oppenheimer was equipped with $2 billion, and science never looked back. Disregarding private research, 'the federal share alone on investment in science (in 1993) became half of all research and development spending in the United States at $76 billion'.

According to Forbes, finding the Higgs boson particle alone cost $13.25 billion, funded by the European Organization for Nuclear Research (CERN).


As aforementioned, academic papers are the business of science. As an NIH biomedicine paper reports, '[although] scientific publishing was initially non-commercial, it has become a profitable industry with a significant global financial turnover, reaching $28 billion in annual revenue before the COVID-19 pandemic.' This is a completely modern phenomenon, as the graph below shows:

In 1800 there were only around 100 published journals in existence; by 1900, there were slightly below 10,000 Science Citation Index Expanded (SCIE) publications. Fast-forward a century: it's believed that 'at least 64 million academic papers have been published since the year 1996, with the growth rate of newly published articles increasing over time. As of 2022, over 5.14 million academic articles are published per year, including short surveys, reviews, and conference proceedings.' The growth is vast, yet mammothly underwhelming considering how proportionally few scientific advancements it has yielded.


The outspoken and ostracised German-American molecular biologist Peter Duesberg, a professor of molecular and cell biology at the University of California, Berkeley, bemoaned the state of affairs, writing: 'Few scientists are any longer willing to question, even privately, the consensus views in any field whatsoever. The successful researcher - the one who receives the biggest grants, the best career positions, the most prestigious prizes, the greatest number of published papers - is the one who generates the most data and the least controversy.' The statistics frighteningly back up his claim, and it seems money is being thrown down the drain, considering science was advancing at a far greater rate before there were so many qualified scientists and so much expensive scientific research.



Modern science's pursuit of total truths - a lost cause and fuel for linear, constrictive dogma and grandiose pomposity


'A very great deal more truth can become known than can be proven.' - Richard P. Feynman


‘It is the hallmark of any deep truth that its negation is also a deep truth... there are two sorts of truths: trivialities, where opposites are obviously absurd, and profound truths, recognised by the fact that the opposite is also a profound truth’ - Niels Bohr


The Necker Cube can be perceived from two different angles, as anyone can see - a clear demonstration of the ambiguity of space. According to Richard Dawkins, it is an example of 'transfiguration'.


Modern science's pursuit of truth has resulted in ‘sustained incoherence’ - a term coined by particle physicist and quantum theorist David Bohm - meaning that whilst it strives (for the most part earnestly) for truth, it unfortunately achieves the precise opposite. Iain McGilchrist explained the issue concisely: 'truth is uncertain not because it is empty, but because it is full – rich, complex, manifold... Very little that we take for granted as most essential to life – love, energy, matter, consciousness – can be convincingly argued about, or even described, without becoming ultimately self-referential. You have to experience it to know it: all we can do is point.' Science, according to Wikipedia, is a 'rigorous, systematic endeavor that builds and organizes knowledge in the form of testable hypotheses and predictions about the world' - in other words, science is completely detached from the mysterious nature of feelings and experiences and thus, in reality, from truth. After all, nothing is deemed scientific because it merely feels right, yet our feelings are most certainly more real than any words attempting to describe and condense them - words are mere abstractions for feelings and actual things.


Therefore, modern science, with its burden of proof (requiring certainty), finds itself severely limited in addressing the truths we care most about in life - those relating to our feelings. For instance, modern science can give insight into the process of love through research in biology, particularly on neurotransmitters. You can talk about chemicals, but very few people would equate love with mere oxytocin and endorphins. No one would say 'wow, her smile makes me enjoy the bliss of oxytocin' - and hopefully no one ever will! We understand love principally through our shared self-knowledge of the activities which trigger these chemicals in each of us, e.g. our experience of the effect of touch and of the sweet softness of compassion; not truly through rationalisation and chemical equations.


More pertinently, science then often attempts to reduce love (at least romantic love) to a singular purpose - the purely selfish (or selfless) reproductive needs embedded in our DNA for the good of the species. But that is deeply condescending to the hundreds of millions of people who feel romantic love yet have no urge to reproduce, not to mention to homosexuals. What's clear is that romantic love is not merely a means to a singular practical end (e.g. reproduction), but at once a demonstration of individuality and an expression of unity - something science will struggle to understand linearly and completely fail to explain definitively.


Science does not completely explain life, nor does it explain away God, and the best scientists most of all acknowledge this. Iain McGilchrist highlighted this, writing: 'according to a book-length study of the beliefs and characteristics of Nobel Laureates, overall only 10.5% described themselves as "atheist, agnostic, freethinker or otherwise nonreligious at some point in their lives". What is striking, however, is that while the figure reaches as high as 35% for Laureates in literature, the figures for science are 8.9% in physiology/medicine, 7.1% in chemistry, and 4.7% in physics. Since Nobel prizes are a twentieth-century invention, that means that, as far as we know, 95.3% of the most acclaimed physicists of the twentieth and twenty-first centuries were consistent theists – still more striking when one realises that agnostics and "freethinkers" counted, for the purposes of this exercise, as nonbelievers.'


The personal cost of modern scientific culture to scientists


To finish, consider how, once Charles Darwin became a respected scientific leader, the originally generalist discoverer remade himself into the model scientist in its most extreme form - sacrificing his emotional side. In 1857 he wrote in a letter to fellow biologist T. H. Huxley: ‘alas, a scientific man ought to have no wishes, no affections - a mere heart of stone.'


Darwin took this seriously, as shown twenty years later in his autobiography, where he reflected: 'I have said that in one respect my mind has changed during the last twenty or thirty years. Up to the age of thirty, or beyond it, poetry of many kinds, such as the works of Milton, Gray, Byron, Wordsworth, Coleridge, and Shelley, gave me great pleasure, and even as a schoolboy I took intense delight in Shakespeare, especially in the historical plays. I have also said that formerly pictures gave me considerable, and music very great delight. But now for many years I cannot endure to read a line of poetry: I have tried lately to read Shakespeare, and found it so intolerably dull that it nauseated me. I have also almost lost my taste for pictures or music … My mind seems to have become a kind of machine for grinding general laws out of large collections of facts … and if I had to live my life again, I would have made a rule to read some poetry and listen to some music at least once every week; for perhaps the parts of my brain now atrophied would thus have been kept active through use. The loss of these tastes is a loss of happiness, and may possibly be injurious to the intellect, and more probably to the moral character, by enfeebling the emotional part of our nature.'

Scientists need to protect themselves from the culture of certainty - the belief that strict scientific servitude makes them uniquely unbiased, when in reality it is tantamount to turning one's heart to stone in order to follow a cult. That is the culture that leaves them feeling like machines 'grinding general laws out of large collections of facts'; slaves to a scientific dogma that falsely explains the world as a merely mechanical thing rather than a creative process - a wish which deep down troubles and enfeebles even the best scientists who fall down its rabbit hole.




References

(1) Antifragile: Things That Gain from Disorder, Nassim Nicholas Taleb, 2012

(2) Why Most Published Research Findings Are False, John P. A. Ioannidis, 2005, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1182327/

(3) Peer Review: A Study of Reliability, Stephen J. Ceci and Douglas P. Peters, 1982, https://www.jstor.org/stable/40164010

(4) Peer Review: A Flawed Process at the Heart of Science and Journals, Richard Smith, 2006, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1420798/

(5) 90% of All the Scientists That Ever Lived Are Alive Today, Eric Gastfriend, 2015, https://futureoflife.org/guest-post/90-of-all-the-scientists-that-ever-lived-are-alive-today/

(6) Scientific Publishing in Biomedicine: A Brief History of Scientific Journals, Asghar Ghasemi, 2022

(7) Denying to the Grave: Why We Ignore the Science That Will Save Us (Revised and Updated Edition), Sara E. Gorman and Jack M. Gorman, 2021

(8) Number of Academic Papers Published Per Year, WordsRated, 2023, https://wordsrated.com/number-of-academic-papers-published-per-year/

(9) The Life and Letters of Charles Darwin (including the Autobiography), with The Descent of Man, A Naturalist's Voyage Round the World, Coral Reefs, The Voyage of the Beagle, On the Origin of Species and The Expression of the Emotions in Man and Animals, Charles Darwin, 2015

(10) The Matter With Things: Our Brains, Our Delusions, and the Unmaking of the World, Iain McGilchrist, 2021
