Wastewater’s role in antibiotic resistance in fresh produce



A carefully controlled lettuce irrigation study reveals that while secondary-treated wastewater can still carry antimicrobial resistance risks, tertiary treatment dramatically limits what reaches the crop.

Study: Impact of treated wastewater reuse in agriculture on the transfer of antimicrobial-resistant bacteria and genes to edible crops: a One Health perspective.

The use of treated wastewater to irrigate food crops conserves water resources, but its associated risks are poorly understood. A recent study published in the journal Frontiers in Microbiology examines the spread of antimicrobial resistance genes via treated wastewater used on crops in a controlled experimental setting.

Balancing water scarcity with food safety risks

Water is among the most highly valued natural resources, as it is the basis of life and agriculture. Sustainable food production is a major challenge amid increasing water scarcity, prompting the use of alternative water sources such as recycled water.

Wastewater, whether treated or untreated, is used to irrigate crops in over 50 countries and on more than 20 million hectares of land across almost every continent, in areas that face water shortages. For instance, the European Commission (EC) promotes the use of urban wastewater treatment plant (WWTP) effluents as a freely available substitute for freshwater in irrigation.

However, wastewater recycling carries the risk of crop contamination with foodborne pathogens. This risk is most significant with fresh produce, as it is eaten raw.

Wastewater also carries antibiotics, antibiotic-resistant bacteria (ARB), and antimicrobial resistance genes (ARGs). Wastewater treatment conditions promote the emergence of drug-resistant bacterial strains and the transmission of ARGs, contributing to the spread of antimicrobial resistance (AMR). This is especially important when dealing with resistance genes to last-resort antibiotics, such as extended-spectrum beta-lactamase (ESBL) genes, which inactivate a broad range of beta-lactam antibiotics.

AMR caused 1.27 million deaths in 2019, and is linked directly or indirectly to nearly five million deaths globally. Thus, efforts to maintain microbiological and AMR standards for wastewater reuse in agriculture are extremely relevant.

Prior research has demonstrated that WWTPs decrease ARB concentrations but do not eliminate ARGs. However, wastewater irrigation research has produced conflicting findings, perhaps because of variations in environmental conditions, soil properties, crop types, and irrigation methods. The current study examined the transmission of ARB and ARGs from treated wastewater used for irrigation to a lettuce crop under controlled experimental conditions.

Testing resistance transfer using lettuce and reclaimed water

The researchers used a three-arm experimental design to compare ARB and ARG transfer to lettuce grown under controlled conditions and irrigated with potable tap water, secondary-treated wastewater, or tertiary-treated wastewater. Each arm contained 936 plantlets, and the whole experiment was replicated to ensure repeatability.

The wastewater used came from a WWTP that applied:

Primary treatment

  • Aeration
  • Solids and suspended solids separation
  • Grit removal
  • Degreasing

Secondary treatment

  • Activated sludge process with coagulation, flocculation, and lamella clarification

Tertiary treatment

  • Sand filtration
  • Ultraviolet-C disinfection

The researchers measured the growth in culture of the fecal bacterium Escherichia coli (E. coli) and ESBL-E. coli (representing ARB). The limit of detection was one colony-forming unit (CFU) per 100 mL for water, and 0.08 CFU per gram of lettuce, equivalent to 1 CFU per 100 mL filtered leaf wash, for the produce.
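The two stated detection limits are linked by simple wash-volume arithmetic. A minimal sketch of that back-calculation, assuming roughly 12.5 g of lettuce rinsed in 100 mL of wash (a sample mass inferred so the numbers match, not stated in the article):

```python
# Hypothetical back-calculation of the lettuce limit of detection (LOD).
# The 12.5 g sample mass is an assumption chosen so the result matches
# the article's 0.08 CFU/g figure; the study does not state it here.

lod_wash_cfu_per_ml = 1 / 100   # 1 CFU per 100 mL of filtered leaf wash
wash_volume_ml = 100            # volume used to rinse the sample (assumed)
sample_mass_g = 12.5            # mass of lettuce washed (assumed)

# CFU detectable in the whole wash, divided over the sample mass:
lod_cfu_per_g = (lod_wash_cfu_per_ml * wash_volume_ml) / sample_mass_g
print(lod_cfu_per_g)  # 0.08
```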

Additionally, they used quantitative polymerase chain reaction (qPCR) to assess the absolute and relative abundances of four ARGs, normalized to 16S rRNA gene copies: blaCTX-M-1, blaTEM, sul1, and tetA. These are important environmental AMR markers and are widely used for AMR surveillance.
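The normalization step reduces to dividing each gene's qPCR copy count by the 16S rRNA gene copies in the same sample. A sketch of that calculation, with all copy numbers invented for illustration rather than taken from the study:

```python
# Sketch of qPCR normalization: relative abundance = ARG copies / 16S copies.
# All copy numbers below are hypothetical placeholders, not study data.

copies_16s = 2.0e7  # total 16S rRNA gene copies in the sample (assumed)
arg_copies = {      # absolute ARG copies from qPCR (assumed)
    "blaCTX-M-1": 1.2e3,
    "blaTEM": 4.5e4,
    "sul1": 3.0e5,
    "tetA": 8.0e3,
}

relative_abundance = {gene: c / copies_16s for gene, c in arg_copies.items()}
for gene, rel in sorted(relative_abundance.items()):
    print(f"{gene}: {rel:.2e} ARG copies per 16S copy")
```

Reporting abundances per 16S copy lets samples with very different total bacterial loads (water versus leaf wash) be compared on a common scale.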

Study findings

Contamination of water

Potable water had the lowest bacterial load of the three water types.

Both E. coli and ESBL-E. coli were undetectable in potable water and tertiary-treated wastewater samples. Conversely, both were detectable in every secondary-treated wastewater sample tested, at concentrations several log units higher than in potable or tertiary-treated water.

Similar patterns were seen for ARGs. Potable water had low levels of the ARGs sul1 and blaTEM, while the other two were undetectable. In contrast, all treated wastewater samples contained detectable ARGs. Both absolute and relative abundances of ARGs were lowest in potable water and highest in secondary wastewater.

Contamination of lettuce

With lettuce, E. coli was detected in 94 % of plants grown with secondary-treated wastewater, versus 33 % when either tertiary-treated or potable water was used. ESBL-E. coli was detected in 61 % of plants in the secondary wastewater arm but was undetectable in the other two arms.

Interestingly, seedlings showed detectable levels of sul1 and tetA at baseline. This indicates the need to examine contamination at the seedling level, independent of irrigation or soil contamination, while supporting a low net transfer of ARGs from irrigation water, particularly with tertiary-treated wastewater.

Post-irrigation, blaCTX-M-1 was primarily associated with lettuce irrigated with treated wastewater, while blaTEM, sul1, and tetA were detectable across all treatments, including potable water, consistent with background ARGs present in seedlings or plant-associated microbiota. Again, levels were highest with secondary wastewater irrigation. Tertiary treatment substantially reduced ARG abundance, though the genes remained detectable at low levels.

Notably, ARG concentrations detected in lettuce accounted for only about 6 % of those in secondary-treated irrigation water and about 4 % of those in tertiary-treated water, indicating limited transfer under the experimental conditions.
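The transfer figures above amount to a simple ratio of ARG abundance detected on the crop to that in the irrigation water. A sketch of that ratio with made-up concentrations (the study's actual measurements are not reproduced here):

```python
# Percentage transfer = 100 * ARG abundance on lettuce / ARG abundance in water.
# The copy-number arguments below are illustrative placeholders only.

def transfer_pct(lettuce_copies: float, water_copies: float) -> float:
    """Fraction of the irrigation-water ARG load recovered on the crop, in %."""
    return 100.0 * lettuce_copies / water_copies

print(transfer_pct(6.0e3, 1.0e5))  # 6.0  (secondary-treated scenario)
print(transfer_pct(4.0e1, 1.0e3))  # 4.0  (tertiary-treated scenario)
```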

The study suggests that the bacterial load in irrigation water depends on the water source. Biological (secondary) treatment is insufficient to eliminate detectable fecal bacteria and ARB, leaving residual bacterial levels several orders of magnitude higher than those in potable or tertiary-treated water. Such effluent is therefore a potential reservoir for these pathogens, albeit a smaller one than untreated wastewater.

The findings highlight the need for tertiary treatment of wastewater intended for irrigating fresh produce crops, to minimize bacterial transfer to the plants.

Regardless of the source of irrigation water, total bacterial abundance on plants, as measured by 16S rRNA gene copies, remained similar. This suggests that other factors play a major role in bacterial colonization of plants. These could include plant health, ultraviolet exposure, and competition with native bacterial strains.

Notably, the study traced the occurrence of both bacteria and ARGs on plants throughout the entire growth cycle. The results partly corroborate earlier studies, indicating a low risk of ARG transmission via irrigation with treated wastewater under controlled conditions with low microbial loads and indirect leaf exposure.

In contrast, other research indicates that ARGs can be directly transferred to edible plant parts and to soil through irrigation. This occurs especially with high microbial loads in the irrigation water, in contrast to the relatively low microbial burden of treated wastewater in the current experiment.

Overall, meaningful ARG transfer occurs mainly when water quality is low, microbial loads are high, or irrigation brings water into direct contact with leaves.

Future field-based studies are required to improve the generalizability of these results by addressing real-world factors such as rainfall, seasonal variations, soil-plant interactions, and environmental microbial contamination independent of irrigation water. Longitudinal studies of soils would also help understand how ARGs in soil fare over the long term.

Advanced wastewater treatment minimizes resistance transfer risks

The study shows that secondary-treated wastewater remains a potential reservoir for the introduction of fecal bacteria and ARB into crops. Neither potable nor tertiary-treated wastewater contained detectable levels of either E. coli or ESBL-E. coli.

All treated wastewater samples contained ARGs at low abundance, albeit with low transfer into plants, with higher abundances in secondary- versus tertiary-treated wastewater. Among the genes assessed, only tetA showed statistically significant differences in abundance across irrigation treatments on lettuce.

In this controlled study, tertiary-treated water appeared to pose a risk comparable to that of potable water with respect to antimicrobial resistance transmission, but it should not be assumed to be equivalent under field conditions. Future studies should address issues of generalizability, the presence of ARGs in seedlings, and the role of environmental and agronomic factors in AMR transmission through fresh produce.
