Science communication

[Image: Communication]

As this will be the last post on my blog, I thought it fitting to end with a discussion of the history of science communication itself. That is, to provide an overview of the major changes that have occurred over the past few hundred years in the way science has been conducted and communicated.

During the 18th and early 19th centuries, scientists usually depended on patrons for financial support and credibility. Those with independent means funded their own research, and papers were largely produced by people working in the private sphere.

At this time, science was popular for both education and entertainment. Ordinary people were exposed to scientific ideas through exhibitions and museums as well as public lectures and demonstrations. Members of the middle classes pursued academic hobbies and amateur scientists made breakthroughs in fields such as astronomy, geology, botany and zoology. People who were literate could also read about science in books, newspapers and periodicals. A number of early Australian newspapers including the Sydney Morning Herald, Hobart Mercury, Melbourne Argus and Brisbane Courier published science articles, weather reports and agricultural information in the 19th century.

The late 1800s saw a shift in the way that science was studied and communicated. Scientists began to conduct research in laboratories away from the public eye and to discuss their findings with other members of newly established scientific societies. There was a greater focus on sharing findings with others working in the same field through the publication of scientific journals and the process of peer review. Most fields became dominated by experts, and amateur scientists no longer made such important contributions to scientific knowledge. Overall, science became increasingly professionalised and more closely aligned with government and research institutions such as state-funded universities.

In industrialised nations this trend continued until the middle of the 20th century, when Big Science emerged. From the Second World War onwards the emphasis has been on large-scale, government-funded projects and international partnerships. Research scientists increasingly focused on collaborative work and were able to draw on large budgets in multi-million-dollar laboratories. Scientists also began to work for large corporations that invest in research and development. To draw an example from the biological sciences, the Human Genome Project is typical of the kind of work being done by the late 20th century.

These changes caused a significant shift in the way that scientific knowledge was communicated to the public. A widening gap between researchers and the general public meant that scientists were no longer able to educate people directly. The media largely took over the responsibility of communicating science and journalists played a key role in mediating between scientists and the general public. This kind of science communication used a top-down approach known as the deficit model. This strategy followed the assumption that most people had a low level of scientific literacy and simply needed to be fed the correct information about the issue at hand. This one-way communication strategy did little to support public engagement with science as it left no room for dialogue about scientific issues. Furthermore, it meant that science stories were selected based on newsworthiness and communicators were limited in their ability to target specific groups.

Over the past few decades, the relationship between scientists and the public has begun to reflect that of earlier times. An explosion of different forms of media has meant that there are new ways of interacting with the public, and science is no longer just channelled passively to people through third parties. Many scientists don’t just research and publish papers but perform roles as communicators too. Scientists can discuss their work and interact with both other researchers and the general public using social media such as Facebook, Twitter and LinkedIn. Others blog or have YouTube channels. While the rise of Big Science has meant that individual scientists are generally less visible, the media has enabled some to become well known. Sir David Attenborough has been presenting nature documentaries for decades, and most Australians are familiar with Dr Karl Kruszelnicki’s regular science talkback on Triple J radio. Science is now easily accessible through multiple channels such as news media, radio, the internet, books, magazines and television.

Non-professionals have also started to assume a more central role in scientific research through citizen science programs whereby members of the public can participate in research projects. Science cafes, TED talks, themed events and open days at universities and research institutions are also popular. For example, Australian National Science Week has been an annual event since 1997. Science issues also receive attention as they frequently spill over into other areas of public interest such as politics, the environment, health, travel and education.

Although the way that scientific research is conducted has transformed over the past few centuries, in many ways the communication of science has come full circle, returning to an approach that promotes public engagement and participation.

Image credit

Feliz, E. (2009). Communication [Image]. Retrieved from https://www.flickr.com/photos/elycefeliz/3224486233

References

Burns, M. (2014). A brief history of science communication in Australia. Media International Australia, 150, 72-76.

Konneker, C. & Lugger, B. (2013). Public science 2.0 – back to the future. Science, 342(49), 49-50.

Logan, R. A. (2001). Science mass communication: its conceptual history. Science Communication, 23(2), 135-163.

Re-evaluating scurvy in the Irish famine

Historical records from the 18th and 19th centuries document cases of scurvy at a level that is unsupported by archaeological evidence. Scurvy is a nutritional condition that results from vitamin C deficiency, and it commonly occurs during times of famine. The characteristic bone lesions formed upon the re-introduction of vitamin C into the diet have long been used by archaeologists to identify the disease in skeletal remains. This post explains how recent improvements in bioarchaeological techniques have been used to identify the disease in victims of the Irish Famine (1845-1852) and suggests that early studies showing lower rates of scurvy may have missed signs of the disease.

“Unhinged her nerves completely”: late Victorian attitudes towards female cyclists

Ever since the invention of the bicycle in the late 19th century, cycling has been promoted as a healthy and invigorating outdoor pursuit. However, concerns have also abounded about the potential health risks of engaging in the sport. As explained in a post by Professor Hilary Marland over at the Wellcome Trust history blog, in late-19th century Britain there were conflicting attitudes towards cycling. It was regarded by some as a therapeutic activity while others saw it as a dangerous pastime for women.

Dieting since the 19th century

[Image: A corpulent physician diagnoses more leeches for a young woman]

Alarmingly high levels of obesity in Western nations mean that dieting and weight-loss programs feature heavily (pardon the pun) in our popular consciousness. Yet an obsession with weight goes back at least 150 years, to a period well before the current obesity crisis.

Prior to the mid-19th century, carrying a little excess weight was seen as a good thing. In the age before vaccinations and antibiotics it was commonly believed that being fatter enabled people to better withstand infectious diseases. Weight gain was also regarded as desirable because most people were thin as a result of not having enough to eat. Being overweight was a marker of prosperity, as it signified that a person had the means to buy plenty of food and indulge themselves. Only wealthy merchants, businessmen or members of the aristocracy would have been able to afford to become corpulent, so dieting to lose weight was relatively uncommon in this period.

The mechanisation and prosperity that accompanied the industrial revolution brought with them weight gain and the beginnings of modern diet plans. Reduced energy expenditure and increased access to (often poor quality) food meant that obesity began to rise among the working classes. The trend of previous centuries was reversed: obesity became increasingly associated with the lower classes, while physical exercise fads and a quest for thinness became prominent amongst the wealthy upper echelons of society.

The social and economic changes that contributed to rising obesity influenced the emergence of the field of nutritional science. In the late 1800s, advice on balancing proteins, carbohydrates and fats featured in government health publications. Vegetarianism began to be promoted for good health from the early years of the 20th century. The first government food guides, forerunners of today's food pyramid, appeared around the time of the First World War. Calorie counting and diet pills emerged in the 1920s, and vitamins designed to correct nutritional deficiencies resulting from restrictive diets were sold from the following decade. Weight-for-height charts similar to the BMI charts of today first appeared in the 1940s, and doctors began to advise their overweight patients to cut down on saturated fats in the 1950s.

As you can see, a concern with weight control has existed for quite some time. For a more in-depth look at the history of dieting you can read Louise Foxcroft’s entertaining and informative book Calories and Corsets: A History of Dieting Over 2000 Years.

Image credit

Numa, P. (1833). A corpulent physician diagnoses more leeches for a young woman, who lies drained and bedbound [Image]. Retrieved from http://commons.wikimedia.org/wiki/File:A_corpulent_physician_diagnoses_more_leeches_for_a_young_wom_Wellcome_V0011771.jpg

References

Rao, N. (2011). Dieting since the 1850s. The Journal of Health, Ethics, and Policy, 10(9), 38-39.

A festival of science: Christmas traditions in the Victorian age

I love Christmas and eagerly start looking forward to the festivities about this time every year. But what does Christmas have to do with science? Well, more than you may think.

The Victorian obsession with rationality and progress saw massive changes in science and technology throughout the 19th century, and the growing interest in science spilled over into all aspects of public life. This post from the Guardian shows how the festive season was a time of scientific celebration for our Victorian ancestors, when scientific lectures, entertainments and gifts were common features of Christmas celebrations.

Resurrection men and anatomists

[Image: Resurrectionists, by Phiz]

The availability of human bodies is critical to the study of anatomy. Cadavers are usually made available for research purposes through programs where people bequeath their bodies to medical schools and universities when they die. This wasn’t always the case, as during the 18th and early 19th centuries bodies were illegally procured for dissection.

The rapid growth of the biological sciences during this period was matched by an increased demand for human cadavers for dissections in medical schools and anatomy demonstrations. Up until the early 19th century, the only legal means of securing corpses for anatomical research in the United Kingdom was by claiming the bodies of those condemned to death and dissection by the courts. As only those who committed the most serious felonies were sentenced to this fate, only a few bodies were made available to anatomists each year and there was a severe shortage of cadavers.

Spotting a lucrative market in the trade of human bodies, people took to stealing the bodies of the recently deceased from fresh graves and selling them to medical schools. They would dig up the head end of a recent burial under cover of night, break open the coffin, tie a rope around the neck of the corpse and pull it out. They earned the nicknames “resurrectionists” or “resurrection men”. As stealing from cemeteries was not a crime that was punished harshly in the courts, and many medical schools were willing to pay handsome sums for bodies, there was a significant incentive for criminals to engage in body snatching.

Many people feared dissection as it was believed that the soul of a person who had been dismembered was unable to enter Heaven in the afterlife. The prevalence of body snatching caused fear amongst the public, and people went to extreme lengths to prevent the bodies of their loved ones from ending up on anatomists’ dissecting tables. Oftentimes, the friends and relatives of someone who had died would watch over their grave day and night until the body had decayed to the point of being useless for medical dissection. In other cases, people were buried in heavy iron coffins, or cages called mortsafes were built around graves to prevent their contents from being exhumed.

In Britain, a spate of murders committed to obtain fresh bodies to sell to medical schools led to the passing of the Anatomy Act of 1832. This Act permitted unclaimed bodies and those donated by relatives to be used in the study of anatomy. It also regulated anatomy instruction through a licensing system that monitored private medical schools. By providing a larger, regulated supply of corpses for scientific research, this legislation finally brought the practice of grave robbing to an end.

Image credit

Knight Browne, H. (1847). Resurrectionists [Image]. Retrieved from http://en.wikipedia.org/wiki/Resurrectionists_in_the_United_Kingdom#mediaviewer/File:Resurrectionists_by_phiz.png

References

Quigley, C. (2012). Dissection on display: cadavers, anatomists and public spectacle. Jefferson: McFarland & Co.

Joseph Lister: the father of antisepsis

[Image: Joseph Lister, 1902]

Hospitals have not always been the places of cleanliness we know today. For centuries they were places where people were just as likely to die as they were to be cured. Even if a person was able to survive the ordeal of surgery without anaesthesia, the unsanitary conditions of operating rooms meant that a postoperative infection was likely to result in their demise. By the late 19th century this began to change, thanks to the work of English surgeon Joseph Lister.

Born in 1827, Lister graduated as a doctor in 1852 and spent much of his early career working in Scotland. It was there that he noticed a mortality rate of nearly 50% in patients following surgery. Infections in wounds resulted in fatal systemic inflammation known as sepsis, and this phenomenon was so common in hospital settings that it earned the nicknames “ward fever” and “hospitalism”.

Infection was poorly understood at the time and most people subscribed to one of two alternative theories. The first was known as miasma and stated that infectious diseases were caused by impure air and noxious gases. The second was called contagionism and proposed that infections in wounds arose spontaneously through an unknown action of the tissue itself. Neither explanation connected the practices of doctors to the outcomes of surgery, and although French chemist and microbiologist Louis Pasteur had demonstrated the role of micro-organisms in putrefaction by the mid-1860s, germ theory was not yet accepted by the medical establishment.

After witnessing how patients with simple fractures survived whilst those with compound fractures, in which the bones pierced the skin, often died, Lister became convinced that infections in surgical patients were being caused by outside agents. After reading the work of Pasteur, he took measures to kill the pathogens that he believed were causing infections in wounds. Aware that carbolic acid was used to treat sewage, Lister suspected that it would act as an antiseptic; he diluted it and applied it to wound dressings. He also used it to sterilise surgical equipment and wash his hands, sprayed it around operating theatres to eliminate airborne pathogens, and soaked catgut sutures in the solution in further attempts to reduce infection.

His techniques were remarkably successful and the incidence of infection was drastically reduced: the death rate among Lister’s surgical patients fell from 45% in 1866 to just 15% by 1870. His measures worked both to eliminate pathogens that had already entered wounds (antiseptic technique) and to prevent others from entering the sterile operating environment (aseptic technique).

After Lister published his results in the Lancet, his work began to receive a great deal of attention. While some hailed his findings as a breakthrough in surgical technique, others viewed them with scepticism. His methods were not immediately adopted; it took over a decade of work before he could convince others of his theories and sanitary surgical procedures became accepted as common practice. Yet once doctors began paying better attention to hygiene in hospitals, patient health dramatically improved and the field of surgery was able to advance rapidly. Joseph Lister continued to refine his surgical techniques for the rest of his life and was knighted for his services to medicine in 1883.

Image credit

Unknown. (1902). Joseph Lister [Image]. Retrieved from http://commons.wikimedia.org/wiki/File:Joseph_Lister_1902.jpg?uselang=en-gb

References

Osborn, G. G. (1986). Joseph Lister and the origins of antisepsis. The Journal of Medical Humanities and Bioethics, 7(2), 91-105. Retrieved from JSTOR.

The forgotten anatomist

Many people have heard of the medical textbook Gray’s Anatomy. First published in 1858 by English anatomist and surgeon Henry Gray, the book remains an authoritative textbook and continues to be revised and republished more than 150 years later.

Yet Gray’s Anatomy was the work of two people, not one. While Henry Gray was responsible for the text, the book owes much of its success to the fine engravings of the illustrator Henry Vandyke Carter. A brilliant anatomist in his own right, Carter was overshadowed by his colleague’s success and has now been largely forgotten. To find out more about him visit the Dittrick Museum blog.

Or, if you are interested, take a look at some of the books written on the subject. I recommend The Anatomist: A True Story of Gray’s Anatomy by Bill Hayes and The Making of Mr Gray’s Anatomy: Bodies, Books, Fortune, Fame by Ruth Richardson.