The perils of detecting pathogenic bacteria in foods

Posted: 1 February 2017

Testing foods for microbes may seem archaic when compared to testing for the presence of chemicals. After all, chemists get to use very expensive grey boxes with blinking lights and labelled with an alphabet soup of acronyms. Not only that, they can sometimes do screens of hundreds of analytes in a single run. So, why can’t you do the same for microbes? Andrew Hudson explains why it would be fantastic to have a screen for all bacterial pathogens in a single run.

Pathogen contaminants in food

Well, yes it would, but microbiological testing is still largely constrained by an initial step in the process, known as the “enrichment”. This is needed because there is no accepted means of extracting either the bacteria themselves or their DNA from food by chemical or physico-chemical means. And even if you could, the detection methodologies available are not (yet) able to detect the small numbers of cells involved. So, to get around these problems the bacterial targets are “grown up” (enriched) to a detectable concentration using a culture procedure. This typically means homogenising the food in a sterile liquid containing nutrients which encourage the target bacteria to grow, along with chemicals to suppress the growth of other bacteria likely to be present and interfere with the detection system. For example, the nutrient sorbitol is added to media targeting Yersinia enterocolitica and the antibiotic nalidixic acid is used to suppress competitors of Listeria monocytogenes. Enrichment exploits the ability of bacteria to replicate exponentially under the right conditions. Some organisms can divide in a very short period indeed; for example, under optimum conditions Clostridium perfringens is reported to double in number every 6.3 minutes, which approximates to almost a 1000-fold increase every hour.
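As a rough sketch of why enrichment works, the fold increase from repeated doubling can be computed directly. The function name and scenario here are mine; only the 6.3-minute doubling time comes from the figure above:

```python
def cells_after_enrichment(initial_cells, doubling_time_min, hours):
    """Exponential growth: N = N0 * 2^(minutes / doubling_time)."""
    doublings = (hours * 60) / doubling_time_min
    return initial_cells * 2 ** doublings

# C. perfringens under optimum conditions: one doubling every 6.3 minutes.
fold_per_hour = cells_after_enrichment(1, 6.3, 1)
print(f"Fold increase per hour: {fold_per_hour:.0f}")  # roughly 700-fold, i.e. approaching 1000
```

Real cultures only grow like this during their exponential phase, so the figure is an upper bound rather than a prediction.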


Unfortunately, not all of the bacteria that you might want to detect behave in a similar way, and that means that there cannot be a one-size-fits-all enrichment procedure. For example, it is usual for an enrichment to be done at the optimum growth temperature of the target. That might be OK for growing Salmonella and Escherichia coli O157 at 37°C, but not so good for Y. enterocolitica, which likes it a little cooler. Similarly, Campylobacter needs a low oxygen concentration to grow, whereas others prefer normal atmospheric conditions. Also, Campylobacter grows painfully slowly (one doubling per hour) in comparison to the other foodborne pathogens commonly tested for, and that means that the enrichment might be too short for Campylobacter to reach detectable levels, or so long that any organism which can grow will have grown to massive numbers and possibly interfere with detection.

Let’s assume, though, that there is a universal enrichment medium. The next step is to detect the bacteria present in the sample and, again, people would ideally like to detect all of the pathogens in a single “run”. There are currently three main methods for detecting bacteria:

  1. “traditional” plating
  2. DNA detection (e.g. the Polymerase Chain Reaction, PCR)
  3. immunological

The plating method relies on bacteria growing from a single cell to produce large piles of cells, known in the trade as colonies, on solid agar surfaces. This growth is encouraged by using conditions favouring the target organism, and the media usually contain chemicals which allow for easy visual recognition of the colony type sought. Sometimes, though, the media are poor both at selecting for the pathogen and at allowing its recognition; a recognised example is the medium for Yersinia, on which almost all of the colonies, target or not, look like a bull’s eye with a red centre. For the same reasons that there is no universal enrichment broth there is no single plating medium for all foodborne pathogens, but there is no reason why the same enrichment could not be used to inoculate several different plating media at the same time.

Observing a colony that looks like a pathogen on a plate is hardly sufficient evidence to say that the food is contaminated. The “presumptive” isolate must then be “confirmed” as the pathogen in question. There are several ways this can be done, but many protocols still rely on conducting a battery of biochemical tests, a process which is very time consuming even with the availability of many disposable kits to help.

Skewed results

Another issue with the cultural approach is that closely related species might grow at different rates during the enrichment and skew the results of the test. An example is with Listeria, where Listeria innocua, a harmless organism, is able to outcompete the well-known pathogen L. monocytogenes. If there are 200 colonies on a plate which look like Listeria and the benign species has swamped the pathogen, then your chances of detecting the pathogen by confirming the identity of the standard five colonies picked are much reduced: a false negative result.
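The arithmetic behind this false negative risk can be sketched with a simple combinatorial calculation. This is a simplification that assumes colonies are picked at random without replacement, and the numbers below are illustrative, not drawn from any real test:

```python
from math import comb

def p_false_negative(total_colonies, pathogen_colonies, picked=5):
    """Probability that none of the picked colonies is the pathogen,
    assuming colonies are picked at random without replacement."""
    benign = total_colonies - pathogen_colonies
    if picked > benign:
        return 0.0  # not enough benign colonies to fill the pick
    return comb(benign, picked) / comb(total_colonies, picked)

# 200 Listeria-like colonies, of which only 4 are L. monocytogenes:
print(round(p_false_negative(200, 4), 2))  # about 0.9, i.e. a ~90% chance of missing it
```

Even a modest imbalance between the two species therefore makes a false negative the most likely outcome of picking five colonies.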

Nucleic acid detection methods such as PCR can be devised to detect more than one pathogen but, generally speaking, the more targets included per reaction the less efficient the system becomes and the more difficult it is to design the protocol. However, successful multiplex applications (detecting more than one target per reaction) have been reported for at least three targets, and commercial test kits have been developed. A potential method for the future is MLPA (Multiplex Ligation-dependent Probe Amplification), which offers the prospect of detecting many targets in the same reaction. Papers reporting sensitive detection of 10 pathogens have been published, and instances of detecting more than 10 targets in other applications have been reported. Nucleic acid detection techniques need enrichment both to ensure that the targets detected are live (DNA from dead organisms would also be detected) and to dilute out chemicals in the sample that might interfere with the reaction.

The remaining commonly applied detection method is based on immunology and is often delivered as a lateral flow device. These are similar to pregnancy testing kits: a coloured band appears when bacteria in the sample bind to antibodies on the test strip as the sample flows along the porous material. Coloured particles also bound to the bacteria become visible when the bacteria are captured by the antibodies at the band. The devices are simple to use but not very sensitive, and so rely on the enrichment to bring the bacteria to a sufficiently high concentration. Testing protocols for more than one target have been described, for example for the detection of various kinds of mycotoxin.

Depending on the application it might be necessary to confirm the results of a PCR or immunological test using classical cultural methods. While there is some momentum towards accepting PCR test results on their own, protocols generally still require the confirmation of a colony. As such, PCR and immunological tests can be used as rapid screening methods, on the assumption that most of the samples tested will be negative for the pathogen. If a sample screens positive, then it’s back to the enrichment and plating methods. This can be a problem: Yersinia, for example, is detected in far more samples when PCR is the test used than when culture is used, so cultural testing of a PCR-positive sample may well be fruitless. But which of the two results is correct?


This has been a brief account of the trials and tribulations of testing foods for bacterial pathogens, and it must all seem very old fashioned compared to testing for chemicals. The possible bright light on the horizon is high-throughput sequencing, in which all of the bacteria in a sample could be detected. The two most obvious issues for this approach to contend with are the taxonomic level that can be discriminated and the concentration of cells that needs to be present for detection.

It should be remembered that foodborne viruses like norovirus and protozoa such as Giardia cannot be enriched in a medium as they do not grow in the absence of live host cells. For some of these organisms you can “enrich” but it is a completely different system to that used for the bacteria. Please note too, that the above discussion applies only to those tests for bacteria where a present/absent answer is needed. If counts are needed then enrichments cannot be used as the number after enrichment will not be the same as that prior to enrichment.

As noted above for Listeria, an aspect of enrichment which has not received a great deal of attention is the potential for closely related species to grow at different rates and skew the results of the test. Since only a limited number of colonies are confirmed, there is potential for the presence of L. monocytogenes to be obscured by the more abundant L. innocua, leading to a false negative result. It is also possible that bacteriophages (viruses which kill bacteria) may be present and lead to significant killing of the target organism.

An obvious drawback of enrichment is that the concentration of bacteria at the end of the process is unlikely to be related to the initial concentration if the incubation time is long enough, as cultures eventually stop growing once nutrients are exhausted.
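A toy logistic-growth model illustrates the point: two samples starting with very different numbers of cells end an overnight enrichment at essentially the same concentration. All of the parameters below (doubling time, carrying capacity, incubation length) are illustrative assumptions, not measured values:

```python
import math

def logistic_growth(n0, doubling_time_min, capacity, hours):
    """Simple discrete logistic model: exponential growth at first,
    levelling off as the culture approaches its carrying capacity."""
    r = math.log(2) / doubling_time_min  # per-minute growth rate
    n = float(n0)
    for _ in range(int(hours * 60)):     # one-minute time steps
        n += r * n * (1 - n / capacity)
    return n

# Two samples: 10 cells vs 10,000 cells at the start of enrichment.
low = logistic_growth(10, 30, 1e9, 24)
high = logistic_growth(10_000, 30, 1e9, 24)
# After 24 hours both sit near the 1e9-cell capacity, so the final
# count says almost nothing about the starting contamination level.
```

This is why enrichment-based tests can only give a present/absent answer, never a count.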

About the author

Andrew Hudson is a Fellow of the Institute of Food Science and Technology. He has a BSc (Hons) from Bristol University and doctorate from the University of Waikato in New Zealand. Andrew has published more than 90 peer-reviewed papers, reviews and book chapters on various topics in food microbiology.

