  • Maths

  • Physics

  • Chemistry 

  • Biology

  • Computer Science


Culture vs Science


Science has clashed with culture throughout history. From Galileo’s battle with the Church to the modern fight over global warming, there have always been disputes rooted in scientists challenging the norm. Here, we discuss various times when scientists have challenged a culture, for the better, only to be shut down by those in power.

The traditional example is Galileo. Born in 1564, he championed Copernicus’ heliocentric model of the solar system, much to the disdain of the Catholic Church. Such a model would broaden our horizons, quite literally, as we went about investigating the world above our heads. Had we clung to the old model, we would never have had a successful space programme, never been to Mars and never decided to fly drones on a moon hundreds of millions of miles away (the Dragonfly mission - do check it out). Yet, instead of seeing progress for what it was, the culture of the time rejected it and commissioned an inquisition to brand it heretical, foolish, and absurd.

Another, more serious, rejection of ideas came from the scientific community itself. Standard medical practice was to operate on patients after carrying out autopsies with the same tools, without washing one’s hands in between. Now, especially after the COVID-19 pandemic, we cannot imagine going about with unwashed hands, spreading disease between dead bodies and living patients. Yet when a doctor named Ignaz Semmelweis theorised, with evidence, that unhygienic doctors were killing their patients, he was met with rejection and died in an asylum. The scientific culture of the time did not want to admit to being wrong in such a serious way, and so chose to continue the lie.

Before the 1940s, smoking tobacco was not only commonplace but considered beneficial for health. Cigarette companies put medical doctors in their adverts, promoting the benefits of smoking. In retaliation against the publication of studies in the 1940s that linked smoking to cancer, the tobacco companies, instead of re-evaluating their product, insidiously commissioned their own studies to ‘prove’ that smoking was safe. It was not until the U.S. Surgeon General released a report in 1964 concluding that smoking caused cancer - after over 7,000 articles on smoking had appeared in the medical literature - that the public took the dangers seriously. Despite the mounting evidence, companies refused to change their ways until forced by law, and the issue still plagues us today. The rise of vaping and e-cigarettes shows how the tobacco industry has simply re-branded, rather than society truly tackling nicotine as a public health issue. While our culture has turned on smoking, commercial industries still refuse to learn the ethical lessons taught by science, and popular culture prefers an attractive product to scientific fact.

Another area where the wider culture refuses to help itself, and industry ignores, denies, and discredits the scientific community, is climate change. Research into the impact of carbon dioxide (CO2) on the global temperature began over a century ago; the term ‘greenhouse effect’ was first coined around 1909. By the late fifties, more scientists were arguing that our CO2 emissions were harming the environment, including Edward Teller, who declared that the temperature rise associated with just a 10% rise in CO2 would be enough to put New York underwater. He further warned that atmospheric CO2 was increasing at an exponential rate, yet he was ignored, and nothing was done for years. Despite the scientific consensus and the accumulation of evidence over the past century, we have continued using fossil fuels ubiquitously; the cultural norm was, and still is, one of burning fossil fuels to power our every need. Even though there have been cultural movements, like Extinction Rebellion, and global agreements signed, the world still gets over 80% of its energy from fossil fuels. [1] Had we acted sooner, rather than blindly accepting the status quo, had we listened to the warnings of science rather than adhering to the voice of business and power, we would not be in today’s dire situation.

While, in the twenty-first century, many in the public choose to listen to science rather than to companies that will happily kill their client base to make a few extra bucks, the clash between culture and science continues. There have been some infamous situations where culture and science have fought, and culture has won (at least temporarily). Just because a view is popular does not mean it is true. [2] If everyone could learn to be open to alternative viewpoints, while respecting the research that develops scientific theories, many of the issues that have troubled us for years would never have been issues at all.

Written By Eleanor & Ella (Physics subject reps.)



[1] [accessed 14.9.21] 

[2] Confirmation bias - Conservapedia




Lectures – ‘The mathematics of love’ by Hannah Fry, ‘How they found the world’s biggest prime number’ by Matt Parker


YouTube Channels – 3Blue1Brown, blackpenredpen, Numberphile


Books – The Code Book by Simon Singh, The Housekeeper and the Professor by Yoko Ogawa, The Simpsons and their Mathematical Secrets by Simon Singh.


Favourite: The Code Book by Simon Singh. This book takes you through how codes and ciphers have developed over time, from the simplest forms of steganography to the modern age, which relies on prime numbers and a great deal more maths. It is one of the applications of maths that makes use of problems that haven’t yet been solved, rather than only what we already know.

The Koch Snowflake

Niels Fabian Helge von Koch (1870-1924) was a Swedish mathematician famous for his discovery of the Koch snowflake curve. The Koch snowflake is built in iterations, in a sequence of stages. Starting with an equilateral triangle, you divide each side into three equal parts and build a smaller equilateral triangle on the middle third of each side. This is repeated indefinitely, and as it happens, a snowflake shape emerges. The Koch snowflake is an example of a fractal curve (in fact, it was one of the first fractals to be described): a shape that shows a similar pattern at any magnification. For example, if you look at one part of the shape in its third iteration, you will see a very similar structure at a later iteration, when the snowflake is magnified.


The Koch snowflake has some quite unique properties: it has an infinite perimeter, but a finite area. With every iteration, both the perimeter and the area increase. Each side is replaced by four sides a third as long, so every iteration multiplies the perimeter by a factor of 4/3. As the number of iterations tends to infinity, the perimeter grows without bound.
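Taking the starting triangle’s side length to be 1 (so the initial perimeter is 3), this can be written compactly as:

```latex
P_n = 3\left(\frac{4}{3}\right)^n \;\longrightarrow\; \infty \quad \text{as } n \to \infty
```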


As the number of iterations increases, the triangles added get smaller and smaller. The snowflake never escapes the area enclosed by the orange hexagon on the right, and the total area added at each stage forms a convergent geometric series (each stage adds 4/9 as much area as the one before). For this reason, the Koch snowflake has a finite area.
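As a quick numerical check, here is a minimal sketch (the `koch_stats` helper is our own, assuming a starting triangle of side length 1): the perimeter keeps growing, while the area settles towards a limit of 8/5 of the original triangle’s area.

```python
import math

def koch_stats(iterations):
    """Return (perimeter, area) of the Koch snowflake after some iterations,
    starting from an equilateral triangle of side length 1."""
    side = 1.0
    n_sides = 3
    area = math.sqrt(3) / 4  # area of the starting equilateral triangle
    for _ in range(iterations):
        new_side = side / 3  # each new triangle has a third of the side length
        # one new triangle sprouts on every existing side
        area += n_sides * (math.sqrt(3) / 4) * new_side ** 2
        side = new_side
        n_sides *= 4  # each side is replaced by four shorter ones
    return n_sides * side, area

for n in (0, 1, 5, 20):
    perimeter, area = koch_stats(n)
    print(f"n={n}: perimeter={perimeter:.4f}, area={area:.4f}")
```

The perimeters follow 3 × (4/3)^n, while the area approaches 2√3/5 ≈ 0.693, which is 8/5 of the starting triangle’s area of √3/4 ≈ 0.433.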


Following the same concept, many other mathematicians created variants of the Koch snowflake, using different initial shapes, angles, and planes. After many iterations, intricate and beautiful shapes can be created.

Tessellations can also be created using snowflakes of the same or different sizes. In the gaps, the same snowflake is created, demonstrating its repetitiveness and versatility.

The Koch snowflake is used to illustrate that ‘it is possible to have figures that are continuous everywhere but differentiable nowhere’. Fractals are also used to model important scientific phenomena, for example the growth of bacterial colonies and the structure of brain waves.


The Life and Work of Maryam Mirzakhani


Maryam Mirzakhani was an Iranian mathematician, born in Tehran, who became a professor at Stanford University. By her own estimation, she was fortunate to grow up after the Iran-Iraq war, when the political, social, and economic situation had stabilised, so she could focus on her studies. As a teenager, she showed her mathematical skill by winning gold medals at the International Mathematical Olympiad in 1994 and 1995. Mirzakhani gained her bachelor’s degree in mathematics from the Sharif University of Technology in Tehran in 1999 before going on to earn her PhD from Harvard University in 2004. She then became a Clay Mathematics Institute research fellow at Princeton University before finally becoming a full-time professor at Stanford University in 2008.



Her PhD concentrated on the topic of Riemann surfaces, and by the time she became a professor she was considered a leader in the fields of hyperbolic geometry, topology, and dynamics; her technique involved using the moduli spaces of surfaces. Her contributions to her field led to her being awarded a Fields Medal, often regarded as one of the highest honours a mathematician can receive, in 2014, when the award committee commended her for her ‘outstanding contributions to the dynamics and geometry of Riemann surfaces and their moduli spaces’. She is the first, and to date only, female winner of the Fields Medal since its inception in 1936. In 2014, the Mirzakhani Society was founded by students at the University of Oxford: a society for women and non-binary students studying mathematics at the university, whose members Mirzakhani met whilst visiting Oxford in 2015. Mirzakhani also became a member of the National Academy of Sciences in 2016, making her the first Iranian woman to be officially part of the academy. She said she enjoyed pure mathematics the most, due to the elegance and longevity of the questions she studied.



Mirzakhani was diagnosed with breast cancer in 2013 and died on 14 July 2017, at the age of 40, at Stanford Hospital. Many paid tribute to her in the days following her death, and her birthday (12 May) was later agreed by the International Council for Science to be International Women in Mathematics Day. On 4 November 2019, it was announced that The Breakthrough Prize Foundation had created the Maryam Mirzakhani New Frontiers Prize, to be awarded annually to outstanding women in mathematics.

Written by Samantha and Saachi (Maths subject reps.)


Computer Science:

The Origins of the Hacker

Hacker culture is defined as a subculture of individuals who enjoy creatively overcoming the limitations of software to achieve 'clever' outcomes. Whatever those 'clever' outcomes may be, the word 'hacker' does not have positive connotations for most people. Nowadays, 'hacking' could mean a whole company going bankrupt from a single exploit in its code, or an individual falling victim to identity theft online. With today's rates of cybercrime, law enforcement is struggling to keep up, and these incidents are no longer easy to identify and prosecute. However, the origins of hacker culture were quite different.

In the 1960s at MIT, the word "hacker" originated as a term for an extremely skilled individual who programmed in what we may now think of as ancient computing languages like FORTRAN or LISP. Hackers were people who shut themselves in a room, programming all day, and in the 60s no one seemed to mind them. Most people did not even own a personal computer, let alone know what hacking was. Yet those who knew about these programmers viewed them positively and welcomed their attempts to challenge computer systems and software in order to improve them.

However, in 1971, one of the first major hacks was carried out by a veteran named John Draper, who figured out how to make free phone calls; this act later became known as... phreaking. Although this may not resemble the large-scale hacking you may be used to, at the time it was considered completely groundbreaking. Those who followed John Draper included groups called the "Legion of Doom" and the "Chaos Computer Club", two of the largest and most respected hacker groups ever formed, as well as Kevin Mitnick, still the world's most famous hacker.

As technology and code progressed, so did hacker culture. Hackers found ever more ways of exploiting holes in software and remote machines, including things you would not even think you could hack, like a boiler system or an electric iron. Hackers can also find and disclose vulnerabilities, which is very useful for software engineers, who can then fix them before those vulnerabilities cause even worse problems.

Although hackers are popularly portrayed as the evils of cyberspace, real hackers only want to learn more about a program and tend to be more helpful than problematic. While some hackers do commit malicious attacks, their acts should not be considered part of 'hacker culture', something that originated from a place of curiosity and love for programming.

Written by Ailin (Computer Science rep.)




  • The Immortal Life of Henrietta Lacks

  • Genome

  • The Epigenetics Revolution

  • The Selfish Gene (Year 10+)

  • The Man Who Mistook his Wife for a Hat

  • Sapiens

  • Medicine-related books:

  • This Is Going to Hurt

  • When Breath Becomes Air

  • Do No Harm


  • The BBC has great science pieces, e.g. one recently on antibiotic resistance

  • The BMJ (use school subscription)

  • The Lancet & Nature (both are journals that require subscriptions, but there are some free articles)



  • edX has really good courses, e.g. Harvard's Fundamentals of Neuroscience is good for A-Level students who want to learn Y2 course content in advance



  • Stanford Lectures on human behavioural biology

  • TED-Ed clips

  • Ologies podcast (really interesting and covers such a wide range of topics)

My favourite recommendation would be The Man Who Mistook his Wife for a Hat as it gives an insight into incredibly interesting neurological cases encountered in clinical practice. Each story teaches you about various mechanisms you may learn about at the university level and their neurobiological manifestations. I would also really recommend the Stanford lectures because, in each session, you are able to develop your understanding of how and why humans behave in the way they do, learning about fascinating examples that are relevant to daily life. For example, one of the topics discussed is whether a person’s genetic makeup can influence their religious and political views.

Human Errors by Nathan H. Lents is a really great book that explores the evolutionary flaws in humans. Most biology books explain how well evolved human anatomy is, but this book shows why humans still have so many pointless bones, why we are susceptible to vitamin deficiencies, and how our bodies are not entirely equipped for childbirth. This book is accessible for anyone to read and enjoy.


Another book I recommend is The Gene: An Intimate History by Siddhartha Mukherjee. This book explains how genes were discovered and the ethical implications gene technology may have in the future. It involves some GCSE knowledge, but again anyone can enjoy this book.

Written by Christina and Catriona (Biology reps.)




Written by Iris (Chemistry rep.)