The difficulty in many CCTV operations is that one cannot accurately measure whether the CCTV is effective simply by counting the number of incidents detected.
On the one hand, the absence of detected incidents may mean that no criminal activity has occurred thanks to tight and exemplary security; equally, it could mean that the place is rife with crime but that nobody has been caught in the act of theft due to poor CCTV system performance. Even where incidents are detected, one cannot be sure whether they reflect the real number of incidents actually occurring.
There is only one site, of all those I have visited across the world, where one can be confident that an incident is unlikely to have occurred undetected. At this site, the precise volume of goods is measured at every step of the operation. The steps are small in terms of the overall production process, the measurement is extremely precise, the interval between measurements is short and manageable, and the audit function is uncompromising. While proactive CCTV is still important for detecting threats and incidents in some parts of the production process and enterprise, in general the measurement of loss drives the CCTV, and any deviation is explored with video footage and searches until it is explained. As soon as a shortfall is noticed, a response can be made, and the security manager involved has her finger closely on the pulse of the operation.
For the rest of us and most of the industry, it appears that managers, mostly general managers, are getting increasingly uncomfortable with the ambiguity in spending so much money on CCTV and other security systems, and not knowing whether they are getting their money's worth. This trend is likely to continue, and I foresee security managers increasingly having to prove the worth of their efforts in time to come.
So, how can one do this? One could compare one's incident rate to those of similar operations, as is done in some other industries, but site conditions tend to differ appreciably from place to place, and this does not work so well with CCTV.
One could measure the performance of each CCTV operator and compare it against the others. Under Jack Welch, General Electric had a management approach in which the executive brief was to 'change out, always humanely, that bottom 10%, and do it every year' - the bottom 10% of performers would be 'managed out' of the organisation annually. While monitoring detection rates and using them for performance management does have its advantages, trying to fire the bottom 10% of performers on a regular basis in South Africa would lead to massive union action and legal conflict.
If we have problems in measuring performance validly, the other approach open to us is to measure whether we are doing things properly. If we are, then we can provide the general manager with a context that allows us to say we are doing everything we can, and that it is being done effectively.
The UK tends to use codes of conduct to guide its standards of CCTV, largely driven by city or town centre schemes. The emphasis in these codes is often on what should be in place and on adherence to procedures and rules. Given the weight of privacy legislation and information protection in the UK, the codes often focus on what should not be done rather than on what should be done. They are structural (what should be there) and procedural rather than results-orientated, and they are limited by nature in this context. Undoubtedly, they are necessary and need to be considered as part of any evaluation, but they do not deliver the assurance of performance that an industry general manager would expect from the security system.
An alternative is to adopt a peer review system. This is a mixed blessing - we need only look at Nepad in operation in Africa to see that it has advantages but can also have difficulties that limit the nature of, and confidence in, its outcomes. Peer operations tend either to promote cooperation or to involve extensive rivalry. However, when properly done they can be constructive.
The dangers of this approach include people not wanting to offend those close to them or with whom they work extensively, as well as 'groupthink', where the whole group of peers shares a similar approach and thinks the same way. If these reactions occur, it is difficult to get a real insight into how well things are going relative to the outside world.
An interesting parallel for benchmarking comes from the health and safety industry. Although tied to the same kinds of concepts as codes of practice, the health and safety area has far more of a focus on the audit process.
Interestingly enough, a CCTV audit process I am involved in at the moment was originally initiated by union requests to review the control room environment for issues relating to ergonomics and health and safety. In discussions in a common forum with the security personnel, it was decided to take this further into a more extensive security audit of human factor processes and security principles. While fairly common in the UK and Europe, we do not see much of this in South Africa. Yet it ties in well with a process of continual improvement and getting a good perspective on how well prepared you are relative to the industry. It is also a confidence booster for the security function - general management get an independent perspective, the strengths of the current operation are confirmed, and development options are usually welcomed.
I have seen the benefits of a major organisation conducting annual audits of all its operations and using this process to position itself as one of the primary users of CCTV worldwide. Surprisingly, the impact of stopping these audits is also quite significant. Audits give a focus - there is nothing that motivates more than being assessed, however friendly this is. Also, it gives a chance for contact, for discussions, seeing what you are doing through different eyes, and at times a reality check. It also allows you to get an industry perspective relative to other operations.
Visiting other operational sites is a useful and informative activity and is recommended for all managers; however, you will seldom get the full picture visiting someone else's site as a guest, since common courtesy means that you do not cross certain lines in such situations. On the other hand, having impartial experts give views on current developments and assist in providing a context for future planning can be extremely useful, and can highlight strengths and opportunities that have not been considered before. They can also draw all of the above methods together with a qualitative analysis of results to describe the impact of the operation.
I see this becoming an increasing trend in this country. We seem to be going through a watershed period in CCTV - both security managers and GMs/CEOs are going to be increasingly interested not just in whether equipment is in place, but in whether it is delivering to the organisation what it is supposed to.
Dr Craig Donald is a human factors specialist in security and CCTV. He is a director of Leaderware which provides instruments for the selection of CCTV operators, X-ray screeners and other security personnel in major operations around the world. He also runs CCTV Surveillance Skills and Body Language, and Advanced Surveillance Body Language courses for CCTV operators, supervisors and managers internationally, and consults on CCTV management. He can be contacted on 011 787 7811 or email@example.com
© Technews Publishing (Pty) Ltd | All Rights Reserved