A carefully controlled study of lettuce irrigation reveals that while secondary treated wastewater may pose risks of antimicrobial resistance, tertiary treatment dramatically limits what reaches the crop.
Study: Impact of agricultural wastewater reuse on the transfer of antimicrobial resistant bacteria and genes to food crops: a one health perspective.
Using treated wastewater to irrigate food crops conserves water resources, but the associated risks are poorly understood. A recent study published in the journal Frontiers in Microbiology examines the spread of antimicrobial resistance genes through treated wastewater used on crops in a controlled experimental environment.
Balancing water scarcity with food safety risks
Water is one of the most valuable natural resources, as it is the basis of life and agriculture. Sustainable food production is a major challenge amid increasing water scarcity, prompting the use of alternative water sources such as recycled water.
Wastewater, whether treated or untreated, is used to irrigate crops in more than 50 countries, covering more than 20 million hectares of land in water-stressed areas on nearly every continent. For example, the European Commission (EC) promotes the use of effluent from municipal wastewater treatment plants (WWTPs) as a readily available substitute for fresh water in irrigation.
However, recycling wastewater carries the risk of contaminating crops with foodborne pathogens. This risk is especially significant for fresh produce, which is often eaten raw.
Sewage also carries antibiotics, antibiotic-resistant bacteria (ARB), and antimicrobial resistance genes (ARGs). Wastewater treatment conditions can promote the emergence of drug-resistant bacterial strains and the transmission of ARGs, contributing to the spread of antimicrobial resistance (AMR). This is particularly important for genes conferring resistance to last-resort antibiotics, such as extended-spectrum beta-lactamase (ESBL) genes, which encode enzymes that inactivate a wide range of beta-lactam antibiotics.
AMR directly caused an estimated 1.27 million deaths in 2019 and is directly or indirectly linked to nearly five million deaths worldwide. Efforts to maintain microbiological and AMR standards for agricultural wastewater reuse are therefore extremely important.
Previous research has shown that WWTPs reduce ARB concentrations but do not eliminate ARGs. However, research on wastewater irrigation has yielded conflicting findings, perhaps due to variations in environmental conditions, soil properties, crop types, and irrigation methods. The current study examined ARB and ARG transmission from treated wastewater used for irrigation in lettuce cultivation under controlled experimental conditions.
Resistance transfer test using lettuce and recycled water
The researchers used a three-arm experimental design to compare ARB and ARG transfer to lettuce grown under controlled conditions and irrigated with treated wastewater or drinking water. Each arm contained 936 plants, irrigated with drinking tap water, secondary-treated wastewater, or tertiary-treated wastewater, respectively. The entire experiment was repeated to assess reproducibility.
The wastewater used came from a WWTP employing:

Primary treatment
- Aeration
- Separation of solids and suspended solids
- Grit removal
- Degreasing

Secondary treatment
- Activated sludge process with coagulation, flocculation, and lamella clarification

Tertiary treatment
- Sand filtration
- UV-C disinfection
The researchers measured culturable faecal bacteria, namely Escherichia coli (E. coli) and ESBL-producing E. coli (ESBL-E. coli, representing ARB). The limit of detection was one colony-forming unit (CFU) per 100 mL for water and 0.08 CFU per gram of lettuce, equivalent to 1 CFU per 100 mL of filtered leaf wash, for the produce.
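As a rough sanity check, the per-gram detection limit follows from the wash-based limit once a leaf mass per sample is assumed. The Python sketch below back-calculates it under the illustrative assumption of about 12.5 g of leaf tissue washed per 100 mL; this mass is inferred from the reported figures, not stated in the article.

```python
# Back-calculating the lettuce limit of detection (LOD) from the leaf-wash LOD.
# Assumption (illustrative, not stated in the article): ~12.5 g of leaf
# tissue is washed per 100 mL of filtered leaf wash.

wash_lod_cfu = 1.0   # minimum detectable count: 1 CFU per 100 mL of wash
leaf_mass_g = 12.5   # assumed leaf mass washed per sample (g)

lod_cfu_per_gram = wash_lod_cfu / leaf_mass_g
print(f"Lettuce LOD: {lod_cfu_per_gram:.2f} CFU/g")  # 0.08 CFU/g, matching the study
```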
In addition, they used quantitative polymerase chain reaction (qPCR) to assess the absolute abundance of four ARGs and their relative abundance normalized to 16S rRNA gene copies: blaCTX-M-1, blaTEM, sul1, and tetA. These are important environmental markers of AMR and are widely used for AMR surveillance.
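For readers unfamiliar with the normalization step, the sketch below shows how relative abundance is conventionally derived from qPCR copy numbers. The copy-number values are placeholders, not data from the study.

```python
# Relative abundance of ARGs: gene copies normalized to 16S rRNA gene
# copies measured in the same sample. All numbers are placeholders.

arg_copies = {"blaCTX-M-1": 1.2e3, "blaTEM": 4.5e4, "sul1": 2.0e5, "tetA": 8.0e3}
copies_16s = 3.0e8  # 16S rRNA gene copies in the same sample

for gene, copies in arg_copies.items():
    relative_abundance = copies / copies_16s
    print(f"{gene}: {relative_abundance:.2e} copies per 16S copy")
```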
Study findings
Water pollution
Drinking water had a lower bacterial load than either type of treated wastewater.
Both E. coli and ESBL-E. coli were undetectable in drinking water and tertiary-treated wastewater samples. In contrast, both were detectable in every secondary-treated wastewater sample tested, at concentrations several log units higher than in drinking water or tertiary-treated water.
Similar patterns were observed for ARGs. Drinking water had low levels of sul1 and blaTEM, while the other two genes were undetectable. In contrast, all treated wastewater samples contained detectable ARGs. Both the absolute and relative abundance of ARGs were lowest in drinking water and highest in secondary-treated wastewater.
Lettuce contamination
In lettuce, E. coli was detected in 94% of plants grown with secondary-treated wastewater, but in 33% when either tertiary-treated wastewater or potable water was used. ESBL-E. coli was detected in 61% of plants in the secondary effluent arm, versus none in the other two arms.
Interestingly, seedlings showed detectable levels of sul1 and tetA at baseline. This suggests the need to examine contamination at the seedling level, independent of irrigation or soil contamination, while supporting a low net transfer of ARGs from irrigation water, particularly from tertiary-treated wastewater.
After irrigation, blaCTX-M-1 was largely confined to lettuce irrigated with treated wastewater. Meanwhile, blaTEM, sul1, and tetA were detectable in all treatments, including drinking water, consistent with background ARGs already present in seedlings or in the plant-associated microbial flora. Again, levels were highest with secondary-treated wastewater irrigation. Tertiary treatment significantly reduced ARG abundance, although the genes remained detectable at low levels.
Specifically, ARG concentrations detected in lettuce represented only about 6% of those in secondary treatment irrigation water and about 4% of those in tertiary treatment water, suggesting limited transport under the experimental conditions.
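Those percentages amount to a simple ratio of ARG abundance on the crop to that in the irrigation water. The sketch below makes the calculation explicit; the abundance values are placeholders chosen to mirror the reported figures, not measurements from the study.

```python
# Illustrative ARG transfer ratio: abundance detected on lettuce expressed
# as a percentage of the abundance in the irrigation water. The inputs are
# placeholders, not data from the study.

def transfer_percent(abundance_on_lettuce: float, abundance_in_water: float) -> float:
    """Lettuce ARG abundance as a percentage of the irrigation water's."""
    return 100.0 * abundance_on_lettuce / abundance_in_water

print(transfer_percent(6.0e3, 1.0e5))  # ~6%, as reported for secondary treatment
print(transfer_percent(4.0e1, 1.0e3))  # ~4%, as reported for tertiary treatment
```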
The study suggests that the bacterial load in irrigation water depends on the water source. Biological (secondary) treatment is insufficient to eliminate detectable faecal bacteria and ARB, leaving residual bacterial levels several orders of magnitude higher than those in drinking water or tertiary-treated water. Such water can be a potential reservoir for these pathogens, although less so than untreated sewage.
The findings highlight the need for tertiary treatment of wastewater intended for irrigation of fresh produce crops to minimize bacterial transfer to plants.
Regardless of the source of irrigation water, overall bacterial abundance on plants, as measured by 16S rRNA gene copies, remained similar. This suggests that other factors play an important role in bacterial colonization of plants. These could include plant health, UV exposure and competition with native bacterial strains.
Notably, the study detected both bacteria and ARGs in plants throughout the growth cycle. The results partially confirm previous studies, indicating a low risk of ARG transmission through treated wastewater irrigation under controlled conditions with low microbial load and indirect leaf exposure.
In contrast, other research shows that ARGs can be transferred directly to edible plant parts and soil through irrigation. This is particularly the case with high microbial loads in the irrigation water, in contrast to the relatively low microbial load of the treated wastewater in the current experiment.
Overall, substantial ARG transport occurs primarily when water quality is low, microbial loads are high, or irrigation brings water into direct contact with leaves.
Future field studies are needed to improve the generalizability of these results by addressing real-world factors such as rainfall, seasonal variation, soil–plant interactions, and environmental microbial contamination independent of irrigation water. Longitudinal soil studies would also help clarify how long ARGs persist in soil.
Advanced wastewater treatment minimizes resistance transfer risks
The study shows that secondary-treated wastewater remains a potential reservoir for the introduction of faecal bacteria and ARB into crops. Neither potable water nor tertiary-treated wastewater contained detectable levels of E. coli or ESBL-E. coli.
All treated wastewater samples contained detectable ARGs, at higher abundance in secondary- than in tertiary-treated wastewater, although transfer to plants was low. Among the genes evaluated, only tetA showed statistically significant differences in abundance between irrigation treatments in lettuce.
In this controlled study, tertiary-treated water appeared to pose a risk comparable to that of drinking water with respect to the transmission of antimicrobial resistance through irrigation, although the two should not be assumed equivalent under field conditions. Future studies should address generalizability, the presence of ARGs in seedlings, and the role of environmental and agronomic factors in AMR transmission through fresh produce.
