systems), insulation, and cavity barriers. Applying the same logic, if an end user wanted to know whether what was proposed for their building was safe, they should ensure that the detailing tested matched their end use exactly. If the system was penetrated by vents, then vents should feature in the test. If the aluminium composite material (ACM) was folded and hung, as opposed to flat sheet and riveted, the ACM under test should be folded and hung. The same would apply to window detailing (the thickness of protective metal sheet over the fire), cavity barrier quantities and specification, panel lengths, and so on. However, when testing is used to support product sales of the components, or indeed, as the former Department for Communities and Local Government has done, to ascertain which buildings should be deemed safe or unsafe, there is little or no knowledge of the specific design detailing of the ‘built system’, and ‘generic’ designs are used instead. Some of the attributes of these generic designs might be considered deficient (no penetrations included), inaccurate, or even fortified against fire in their detailing (unreasonably so?). So, whilst a good result on a generic system
is good for sales and for allaying fears, does it really confirm the safety of systems that, whilst using similar materials, are put together in a very different way? At the end of the day, built-up system testing requires trust that it is addressing your specific issues, and it should ideally be conducted and controlled by the end user. Anyone can ‘tinker’ with tests to turn a fail into a pass, given enough resource; with no requirement to describe the failures, and no concept of safety factors, the end user can be left quite blind to performance and to tolerance of deviation. So, to close, built-up system testing is
absolutely appropriate when undertaken by or for the end user, provided it is modified to be accurate to the system they actually have. For tests undertaken by product manufacturers, perhaps their input should be limited to component tests that provide core ‘building block’ data, such as combustibility, ignitability, structural integrity, toxicity, and function (for cavity barriers). If the individual components perform well, the performance of the built-up design should follow, and it could prove more resilient to the challenges of installation accuracy and of wear and tear over time.
Dr Jim Glockling is technical director of the FPA and director of RISCAuthority