Unintended consequences of technology

June 2019 Surveillance, Integrated Solutions

The development of general AI capabilities has been accompanied by growing concern among key players in the IT industry, as well as academics internationally, who are calling for some perspective on what these capabilities mean for humankind and future society.

Wired.com, for example, has featured discussions on this issue, one observation being that engineers need philosophy training in their curricula to help them cope with the new realities and responsibilities they will be facing. While this debate is happening at a very macro level, most of those responsible for the day-to-day security of people and things are getting on with putting new technologies into place wherever they see a possible benefit.

Commercial viability and profits are driving adoption. There are, however, a number of unintended consequences of these actions for the very people and organisations these technologies are meant to protect.

Impact on existing technologies

One of the most common knock-on effects of introducing new technology is the impact on the existing technology within the organisation. Issues include battling to integrate older technologies, compatibility with new formats and standards, processing speed or interface drag on the new technology, and new interfaces that require replacement and retraining.

Sometimes the old systems or infrastructure have to be thrown out entirely and everything aligned with the new technology. Sometimes the new equipment is installed in such a way that it obscures or gets in the way of old functions or features. In other situations, parallel systems are put in place to allow old and new to function at the same time. This is why the role of integrators is so important at the outset, when new technology upgrades and systems are first being considered.

New technology may bring entirely new working and conceptual demands, and current operators may struggle to cope with the scope or complexity of these demands. At times, partial or even entire work processes and human interfaces need to be revised and adapted, and jobs redefined or even replaced. The scope of work and interfacing with other departments may increase, and the level of data transfer may change in both volume and nature. The design of these human interfaces is often neglected, and that neglect is frequently the source of failure. Training needs to be increased, transitional processes put in place, and people tested along the way.

Adapting to the machine

One of the most concerning aspects of new technology is the expectation that it will simply work within the existing, standardised environment. Then users realise it isn’t working properly and that they need greater control in order to provide a standardised environment in which intelligent AI systems can operate. In this case, the failure of the technology to work means that the surroundings and the people have to change in order to allow it to work. This generally isn’t a problem, as people are adaptable and willing to take on new things that will assist them.

The biggest issue is where the owners of the technology, or the implementers, have not considered the implications for the people and processes in the environment into which they are introducing these new things. In some situations, forced changes in behaviour, working conditions, or social movement create the potential for accusations of social control. Effectively, there are situations where we are forcing people to behave in certain ways so these ‘intelligent’ systems can work.

Where people feel they are being coerced into this, they will react, and either passive or even active resistance can occur. Where they feel the changes are not in their best interest, they will resist even more. Yet many implementers of the technology feel that people will simply have to fall in line.

The process of change

Failure to treat technology implementation as a change process is one of the key reasons new technologies fail. As part of this change, the environment needs to be prepared and people need to be informed effectively. Attempting to manipulate people at this stage is likely to increase pushback, especially if it is seen as taking advantage of them.

The contractual provisions for signing up to the e-tolls system in Gauteng, for example, from my perspective showed a total disregard for the rights of users and an opportunistic attitude in compelling people into contractual obligations and the use of their information.

Recently, I have encountered a spate of notices on websites I use that say my privacy is important and that they want to protect it, yet they want to force me into an arrangement permitting anything from tracking to the distribution of personal information, including themes of my correspondence, not only to themselves but to associated companies and clients. Yahoo is a good example of this, where new conditions of use and ‘protection’ take substantial rights away from you for marketing and general social use.

We can hardly be surprised, then, when organisations, social movements or institutions start querying why some parties are implementing technology without consultation, explanation, or a firm framework of policy or governance as to why the technology is being implemented or information is being collected, especially where this can impact on people. The population at large is generally receptive to things they think can help them and are good for the community.

Taking this for granted, however, is a sure way to invite pushback. San Francisco prohibiting facial recognition and other electronic monitoring is one example of this. Querying why people are avoiding facial recognition cameras, as has happened in public places in the UK, is another. Even the best systems are not going to be accepted if people don’t believe in them, whether at the corner shop, on the website you visit, at the company you work for, or in the community you live in.

Dr Craig Donald is a human factors specialist in security and CCTV. He is a director of Leaderware which provides instruments for the selection of CCTV operators, X-ray screeners and other security personnel in major operations around the world. He also runs CCTV Surveillance Skills and Body Language, and Advanced Surveillance Body Language courses for CCTV operators, supervisors and managers internationally, and consults on CCTV management. He can be contacted on +27 11 787 7811 or [email protected]


