Sh!t Sc!ence
Our regular science column
At Sh!t Sc!ence we like to defend the lil’ science, the science that gets bullied, the science that gets wedgies from the bigger sciences when it tries to speak up, because we believe that every science, no matter how puny, has value and should be respected. This week: archaeology. Need I say more?
Yes, archaeology, or the discovery of things that happened literally thousands of years ago. Going to an archaeology conference is like watching an episode of Sherlock: “This fragment of a piece of bone was positioned two centimetres east of the remains of a fire pit, which is evidenced by these three flecks of carbon matter. Therefore, Australopithecus tried to kill Homo erectus!” Elementary, until another archaeologist finds another tiny fragment of bone on the other side of the planet, and the entire field is shaken to its core. It is a strenuous, arduous, ridiculous nerd of a science, but it is not useless, because looking into the past gives us a better grasp of what is coming in our future.
Such is the case with a new study published in Clinical Anatomy this week. Sometimes all that is left of a time period is human remains, so archaeologists have learned to read anatomical clues about the conditions in which people lived. One such clue is a condition called cribra orbitalia (CO), in which the bone inside the eye socket becomes porous. It is generally accepted to be a tell-tale sign of anaemia due to iron deficiency, periods of malnutrition or infection. If archaeologists find a high prevalence of CO in a population, they take it as a sign that these poor people had a rather rough time of it. “But there’s been a lot of debate about the prevalence of CO in modern populations, with some saying it had effectively disappeared,” says Ann Ross, co-author of the study.

The scientists compared 245 prehistoric, 381 historic (pre-20th century) and 218 modern skeletons. Surprisingly, they found that not only was CO not extinct, it was relatively common: 12.35 percent of modern North Americans and 16.8 percent of modern South Africans displayed it. Even more surprisingly, that is a higher prevalence than in their historic counterparts. “We think the increased prevalence of CO in the modern skulls may be due to intestinal parasites in some populations and iron-poor diet,” Ross says.
And so, as many times before, knowledge of prehistory sheds light on modern life. These findings, Ross says, show that “disadvantaged socioeconomic groups, and parts of the developing world, are still struggling with access to adequate nutrition.”