Current affairs

commissioning – guidance is available from the RSA, the British Automatic Fire Sprinkler Association, LPCB, IFC or FIRAS.

Multi-sensor detectors

With false alarms costing the UK an estimated £1bn annually, Raman Chaggar of the Building Research Establishment (BRE) stated that they pose ‘consequences for all’, including fire and rescue services (FRSs), businesses and the public. With one false alarm occurring every two minutes, BRE studies over the last few years have examined false alarms in buildings and conducted a live investigation of false alarms. Recommendations from the first study included increased use of multi-sensor detectors, which in certain test cases reduced false alarms by 35.1% and 49.5%. Another recommendation was to establish the performance variability and capabilities of units, with more research required. These units operate later than standard alarms to avoid false detection, but there is a ‘huge spectrum of performance’, and Mr Chaggar asked ‘what is out there, and how does it perform?’.

The BRE and Fire Industry Association study therefore comprised three parts: it would review all alarms on the market, perform tests, and finally conduct false alarm tests. Together, these aimed to identify the ‘relative benefits’ of the alarms in use. BRE quickly found that most units operated later than regular alarms in false alarm scenarios, but ‘some, not all’ reacted to certain tests. Devices were ‘more sophisticated and less prone’ to false activation, but some false alarm tests were ‘tough to replicate’ and repeat. Those that could be replicated included dust, water mist, aerosol, cooking smoke and toaster smoke. The methodology of a previous BRE study of building materials was used for the test fires, with 35 units from 12 manufacturers each tested against two reference optical smoke detectors. Multi-sensor detectors were categorised as having basic, intermediate or advanced false alarm resistance.

Mr Chaggar showed that in PU foam and toast tests, differences could clearly be seen between optical and multi-sensor units. Water mist and aerosol tests followed the same pattern of performance from basic through to advanced. Multi-sensor detectors operate ‘far later’ than optical detectors, on average 200% later. Domestic units activate more quickly, and there was ‘very little change’ from basic to advanced. With the latter, there was ‘greater false resistance’. Mr Chaggar said he was working towards an LPS standard for false alarm resistance, and concluded that multi-sensor detectors ‘have the same response to fire, but a delayed response to false alarms’.

Desktop assessments

Niall Rowan, chief executive of the Association for Specialist Fire Protection (ASFP), opined that there was always a ‘need for an assessment procedure’. He had met Dame Judith Hackitt and asked if she would ban desktop studies, stating that he thought she would not. He ‘got a fierce response’, but noted there were ‘not enough test labs to cope with demand’. Her interim report criticised desktop studies, and aimed to ‘significantly restrict’ them or force those undertaking them to demonstrate competence. Mr Rowan said the problem was that ‘we have the same situation with installers, fire risk assessors and fire engineers – is it any wonder we have issues?’.

As set out in Building Control Alliance technical guidance note 18, there are ‘four ways to satisfy’ Approved Document B (ADB) in relation to high-rise cladding: materials of limited combustibility, BS 8414 tests, desktop studies and fire engineering. Mr Rowan said that most ‘don’t want to do’ the first or last, while BS 8414 tests cost £50,000 each. In contrast, desktop studies are the cheapest option, and ‘not surprisingly’ there has been a large rush to carry them out. However, he admitted that going straight to desktop studies without awareness of the issues reflects a ‘relatively immature’ industry. There need to be ‘suitably qualified’ specialists, but the phrase ‘desktop study’ is a ‘deprecative term’ that does assessments a disservice. Requirements set out in Appendix A of ADB note that engineers ‘might’ be expected to have the necessary expertise, so without UKAS accreditation they cannot carry out such assessments. Accreditation is needed because there can be many separate tests with different material combinations and test performances. Desktop assessments are required because you ‘could easily get to 10,000 tests’, and there needs to be a ‘simpler and cheaper’ way. Mr Rowan then discussed previous legislation.
