Because of the extensive use of automated processing machinery, maintenance tools and mixing equipment, metal remains the most prevalent contamination risk in food manufacturing, with the potential to cause both product quality and consumer safety issues. Metal detectors can be deployed to reduce this food safety risk. However, because of the physics of how a contaminant disturbs the electromagnetic field, the orientation of a metal contaminant can affect whether it is effectively detected.
This phenomenon is known as the orientation effect. It is most noticeable with long, thin wire-type metal contaminants, which are more easily detected when they pass through the metal inspection system in one orientation rather than another. However, any non-spherical contaminant, not just wire, exhibits this effect. A typical example is equipment calibrated to detect a stainless-steel sphere measuring 2mm in diameter. While it may identify and reject this contaminant, the machine may fail to detect a stainless-steel wire that is slightly smaller in diameter but longer than 2mm, depending on the orientation of the wire as it travels through the detector.
Often, it is easier to detect stainless steel and non-ferrous wires when they pass through the aperture sideways or upright, rather than in alignment with the conveyor. This is because of the magnetic permeability of the metal, which is much lower for stainless steel than for ferrous metals.
Reducing the aperture size in relation to the product size can be a simple and effective way to increase metal detector sensitivity. This is because sensitivity is expressed as the smallest detectable spherical contaminant travelling through the geometric center of the aperture, which is always the least sensitive position. The idea is to challenge the worst-case scenario: if the contaminant is detectable in this position, it will be detected more easily closer to the aperture walls.
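To make the worst-case logic concrete, here is a minimal sketch in Python. It assumes only the simplified model described above, in which the geometric center of the aperture is the least sensitive position; the function name and values are illustrative, not a manufacturer specification.

```python
# Illustrative sketch only: assumes the simplified model described in the text,
# where the geometric center of the aperture is the least sensitive position,
# so a contaminant detectable there should be detectable anywhere in the aperture.

def passes_worst_case_test(sphere_diameter_mm: float,
                           center_sensitivity_mm: float) -> bool:
    """Return True if a test sphere of the given diameter should be detected
    when passed through the geometric center of the aperture, i.e. the
    worst-case position used to express detector sensitivity."""
    return sphere_diameter_mm >= center_sensitivity_mm


# Example: a detector specified at 2.0mm ferrous sensitivity at the center
# of the aperture should flag a 2.0mm ferrous test sphere in any position.
print(passes_worst_case_test(sphere_diameter_mm=2.0, center_sensitivity_mm=2.0))  # True
```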
Features such as single-pass product learning and automatic calibration can also help, enabling operators to compensate for a changing product effect and thereby maintain the highest performance levels and the lowest rates of false detections.
Sensitivity signatures
Dairy product applications are typically wet and conductive, which presents an additional challenge for metal detectors. Cheese, for example, with its high moisture content combined with salt, can be highly conductive and produce a signal similar to that of metal. This product effect can result in the product being falsely rejected and good food being discarded.
To identify a metal contaminant within conductive products, a metal detector must remove or reduce this product effect. Single-pass calibration is designed to do this; however, the underlying operating frequency of the metal detector determines how effectively a calibration can eliminate the product effect. With single-frequency metal detectors running ‘wet’ products, there is often a trade-off between ferrous and stainless-steel performance depending on the selected frequency. Typically, higher frequencies improve the detection of stainless steel relative to ferrous metals. The best approach is to find a frequency that balances the lowest product effect with the detection of the target contaminants.
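The sketch below illustrates that balancing act for a single-frequency detector. The candidate frequencies and scores are hypothetical placeholders, not measured values; the selection heuristic is an assumption used purely to show the trade-off described above.

```python
# Illustrative sketch only: hypothetical frequencies and scores, not real data.
# Mimics the selection logic described above for a single-frequency detector on
# a 'wet' product: pick the operating frequency that best balances a low
# product-effect signal against detection of both ferrous and stainless steel.

CANDIDATES = {
    # frequency_khz: (product_effect_signal, ferrous_score, stainless_score)
    50:  (0.2, 0.9, 0.3),
    300: (0.4, 0.8, 0.7),
    800: (0.9, 0.5, 0.9),
}

def balanced_frequency(candidates: dict) -> int:
    """Pick the frequency whose weaker metal score (ferrous or stainless)
    remains highest after penalising the product-effect signal."""
    def score(stats):
        product_effect, ferrous, stainless = stats
        return min(ferrous, stainless) - product_effect
    return max(candidates, key=lambda f: score(candidates[f]))

print(balanced_frequency(CANDIDATES))  # 300 for this hypothetical data set
```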
Using simultaneous multi-frequency technology is the most reliable way to remove product effect without compromising the sensitivity of a metal detector. This processing technology powers the Fortress Interceptor, enabling it to run real-time analysis of low-frequency and high-frequency signals in parallel.
This increased sensitivity was one of the primary reasons dairy processor Vepo Cheese selected seven Interceptor metal detectors to integrate with its vertical packing machines. The cheese giant specifically requested “state of the art inspection equipment that could deal with variations in density and product effect.”
Technical Operations Manager at Vepo, Hugo van Put, said, “These metal detectors are really sensitive. This helps us to feel confident that the risks of contaminants are minimal, with less chance of a food safety issue. Having the double readings within the Interceptor system also lowers the risk of false-positive rejects, which saves on food waste.”
Test spheres: shape and science
The standard technique for measuring the sensitivity of metal detectors in food inspection is to use metal test spheres. Yet, metal contaminants typically enter the production line as flat metal flakes, shards, swarf or thin wires, rather than globular shapes. So why test using spheres?
The main rationale is that it provides machinery suppliers and food processors with a comparative sensitivity control. A sphere does not exhibit the orientation effect and will always produce the same signal when passed through the same position in a metal detector’s aperture.
The food metal detection industry has general sphere size guidelines. For example, a wet block of cheese measuring approximately 75mm high currently has sphere size parameters of 2.0mm for ferrous metals, 2.5mm for non-ferrous metals and 3.5mm for stainless steel. However, these levels are not one size fits all, as the product effect from different types of cheese, for example, can vary greatly.
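For reference, those guideline figures can be captured in a short sketch. The values are the ones quoted above for a wet cheese block of roughly 75mm height; as noted, they are indicative only, since product effect varies between cheese types.

```python
# Guideline test-sphere sizes quoted above for a ~75mm wet cheese block.
# Indicative only: the appropriate sizes depend on the specific product effect.

TEST_SPHERE_GUIDELINES_MM = {
    "ferrous": 2.0,
    "non-ferrous": 2.5,
    "stainless steel": 3.5,
}

for metal, diameter in TEST_SPHERE_GUIDELINES_MM.items():
    print(f"{metal}: {diameter}mm test sphere")
```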
Frequency focused
There are multiple variables that can affect a metal detector’s performance, from the potential size and composition of possible contaminants to the liquid content and consistency of the product matrix.
It’s equally important to note that there is no ‘best’ metal detection frequency. There are only ranges of frequencies, each better suited to different purposes. As with any aspect of food safety, there is always a cause and a consequence. It is important to have a sufficient scientific understanding of how dairy products behave and of the conditions that can trigger a false-positive reaction. If in doubt, seek expert guidance.
Eric Garr is a Regional Sales Manager for Fortress Technology.