
Why do humans drink animal milk?

Posted: 29 July 2022

Humans are the only species to drink the milk of another, and researchers think the answer may lie in the desperation caused by famine and disease outbreaks.

Milk and dairy products are so ubiquitous in our industry that it’s easy to forget humans are in fact the only species that drinks the milk of another.

So, why do we enjoy literally billions of litres of milk each year? Until now, it was widely assumed that lactose tolerance emerged because it allowed people to consume more milk and dairy products. But new research, led by scientists from the University of Bristol and University College London (UCL) alongside collaborators from 20 other countries, shows that famine and exposure to infectious disease best explains the evolution of our ability to consume milk and other non-fermented dairy products.

While most European adults today can drink milk without discomfort, two-thirds of adults in the world today, and almost all adults 5,000 years ago, can face problems if they drink too much milk. This is because milk contains lactose, and if we don’t digest this unique sugar, it travels to our large intestine, where it can cause cramps, diarrhoea and flatulence – a condition known as lactose intolerance. However, this new research suggests that in the UK today these effects are rare.

“To digest lactose we need to produce the enzyme lactase in our gut. Almost all babies produce lactase, but in the majority of people globally that production declines rapidly between weaning and adolescence,” explained Professor George Davey Smith, Director of the MRC Integrative Epidemiology Unit at the University of Bristol and a co-author of the study.

“However, a genetic trait called lactase persistence has evolved multiple times over the last 10,000 years and spread in various milk-drinking populations in Europe, central and southern Asia, the Middle East and Africa. Today, around one third of adults in the world are lactase persistent.”

By mapping patterns of milk use over the last 9,000 years, probing the UK Biobank, and combining ancient DNA, radiocarbon and archaeological data using new computer modelling techniques, the team were able to show that the lactase persistence genetic trait was not common until around 1,000 BC, nearly 4,000 years after it was first detected around 4,700–4,600 BC.

“The lactase persistence genetic variant was pushed to high frequency by some sort of turbocharged natural selection. The problem is, such strong natural selection is hard to explain,” added Professor Mark Thomas, Professor of Evolutionary Genetics and study co-author from University College London.

In order to establish how lactase persistence evolved, Professor Richard Evershed, the study’s lead from Bristol’s School of Chemistry, assembled an unprecedented database of nearly 7,000 organic animal fat residues from 13,181 fragments of pottery from 554 archaeological sites to find out where and when people were consuming milk. His findings showed milk was used extensively in European prehistory, dating from the earliest farming nearly 9,000 years ago, but that its use increased and decreased in different regions at different times.

To understand how this relates to the evolution of lactase persistence, the UCL team, led by Professor Mark Thomas, assembled a database of the presence or absence of the lactase persistence genetic variant using published ancient DNA sequences from more than 1,700 prehistoric European and Asian individuals. The variant first appeared around 5,000 years ago; by 3,000 years ago it was at appreciable frequencies, and it is very common today.

Professor George Davey Smith’s team, probing the UK Biobank data comprising genetic and medical information for more than 300,000 living individuals, found only minimal differences in milk-drinking behaviour between genetically lactase persistent and non-persistent people. Critically, the large majority of people who were genetically lactase non-persistent experienced no short- or long-term negative health effects when they consumed milk – in fact, it was a necessity for some.

“If you are healthy and lactase non-persistent, and you drink lots of milk, you may experience some discomfort, but you’re not going to die of it. However, if you are severely malnourished and have diarrhoea, then you’ve got life-threatening problems. When their crops failed, prehistoric people would have been more likely to consume unfermented high-lactose milk – exactly when they shouldn’t.”

To test these ideas, Professor Thomas’ team incorporated indicators of past famine and pathogen exposure into their statistical models. Their results clearly supported both explanations: the lactase persistence gene variant was under stronger natural selection when there were indications of more famine and more pathogens.

“Our study demonstrates how, in later prehistory, as populations and settlement sizes grew, human health would have been increasingly impacted by poor sanitation and increasing diarrhoeal diseases, especially those of animal origin,” concluded the authors.

“Under these conditions consuming milk would have resulted in increasing death rates, with individuals lacking lactase persistence being especially vulnerable. This situation would have been further exacerbated under famine conditions, when disease and malnutrition rates are increased. This would lead to individuals who did not carry a copy of the lactase persistence gene variant being more likely to die before or during their reproductive years, which would push the population prevalence of lactase persistence up.

“It seems the same factors that influence human mortality today drove the evolution of this amazing gene through prehistory.”
