FLAME DETECTION HISTORY
Flame detection has been extensively used for over 50 years to address the need for quick detection and, in many cases, to respond to fast growing fires.
These devices monitor the optical radiation emitted by the fire in the ultraviolet (UV), visible, and/or infrared (IR) wavelengths, and issue an alarm when their measurements indicate that a hazardous flame is present. For high-risk areas, particularly outdoors, optical flame detectors are the preferred solution. This is because, unlike with smoke and heat detectors, detection is ‘taken’ to the fire rather than waiting for the products of the fire to reach the detector.
Flame detection technologies have come a long way since the first phototube (UV) and photo cells that detected the photons emitted by flames.
Modern industry has ever-growing requirements for higher performance and reliability, i.e., to continuously detect fire as early as possible with the highest sensitivity, and in all weather conditions, yet to be highly immune to false alarms.
This was addressed in the late 1970s by the introduction of combined UV/IR detectors. Yet the inability to detect fire at long range without a high false alarm rate was still a problem: increasing the sensitivity of the UV/IR detection system, e.g., by appropriately lowering the threshold level, extended the detection range, but it also increased the false alarm rate.
False alarms may be caused by spurious radiation sources, such as direct or reflected sunlight, blackbody radiation, artificial lights (particularly halogen lamps), welding, electrical heaters, and ovens. Some spurious radiation sources might not be large enough to activate short-range detectors but can be large enough to activate detectors whose sensitivity has been increased to extend their detection range. UV/IR detectors were commonly used and are still used in some onshore industries, mainly indoors where false alarm sources are less prevalent or long detection range is less important.
The actual damage of false alarms extends beyond the immediate cost of activating unnecessary fire emergency procedures. Unacceptable false alarm rates lead to loss in credibility, which might lead to personnel ignoring future fire alarms or even to the disconnection of crucial fire detection elements. The consequences in case of a real fire might be disastrous.
To address the need to reduce false alarms and increase detection distances, the late 1990s saw a major breakthrough in flame detection with the introduction of IR3 (triple IR) technology. This technology revolutionized the field of fire safety by providing a long-range, highly sensitive flame detector with exceptionally improved false alarm immunity.
IR3 detectors can detect a standard 1 ft² (0.1 m²) gasoline fire at a distance of up to 200 ft (60 m) and operate in extreme weather and harsh industrial conditions with a very low false alarm rate. This is done by concurrently monitoring the detection area with three IR sensors: one sensitive to the infrared radiation emitted by the hot CO2 product of fire (wavelength around 4.3 µm), and two reference sensors sensitive to background radiation (at longer and shorter wavelengths). These signals are further analyzed mathematically with respect to their ratios and correlations.
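The ratio-and-correlation analysis can be illustrated with a simplified sketch. The function below is a hypothetical decision rule, not any vendor's actual algorithm: it assumes a flame dominates the 4.3 µm CO2 band relative to both reference bands, and that the flame's flicker makes the three channels vary together; the thresholds and helper names are illustrative assumptions.

```python
from statistics import mean, pstdev

def pearson(x, y):
    """Pearson correlation between two equal-length signal traces."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
    sx, sy = pstdev(x), pstdev(y)
    return cov / (sx * sy) if sx and sy else 0.0

def is_flame(co2, ref_short, ref_long, ratio_thresh=2.0, corr_thresh=0.8):
    """Illustrative IR3-style check (hypothetical thresholds):
    the CO2-band signal must dominate both reference bands (ratio test)
    and all three channels must flicker together (correlation test)."""
    r_short = mean(co2) / max(mean(ref_short), 1e-9)
    r_long = mean(co2) / max(mean(ref_long), 1e-9)
    correlated = (pearson(co2, ref_short) > corr_thresh
                  and pearson(co2, ref_long) > corr_thresh)
    return r_short > ratio_thresh and r_long > ratio_thresh and correlated

# A flickering flame: strong, modulated CO2-band signal.
flame = is_flame([10, 12, 9, 13, 10, 12],
                 [3, 4, 2.5, 4.5, 3, 4],
                 [3, 4, 2.5, 4.5, 3, 4])   # → True

# Sunlight-like interference: all bands roughly equal, so the
# ratio test rejects it.
sun = is_flame([5, 5.1, 4.9, 5, 5.1, 4.9],
               [5, 5.1, 4.9, 5, 5.1, 4.9],
               [5, 5.1, 4.9, 5, 5.1, 4.9])  # → False
```

Real detectors perform this analysis on sampled waveforms in real time and combine it with further signal processing, but the sketch shows why a broadband source such as sunlight, which raises all three channels together, fails the ratio test even when it is intense.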
IR3 detectors have enhanced flame-detection reliability with a longer detection range, combined with the unprecedented false alarm immunity required by high-risk, high-value facilities and processes in general, and by the offshore and onshore Oil & Gas industries in particular.