Unintended consequences of technology

June 2019

The development of general AI capabilities has been accompanied by concern among key players in the IT space, as well as academics internationally, who are calling for a clearer perspective on what these capabilities mean for humankind and future society.

Wired.com, for example, has featured discussions on this issue, one observation being that engineers need philosophy training in their curricula to cope with the new realities and responsibilities they will be facing. While this debate is happening at a very macro level, most of those responsible for the day-to-day security of people and things are getting on with putting new technologies into place wherever they see a possible benefit.

Commercial viability and profits are driving adoption. There are, however, a number of unintended consequences of these actions for the very people and organisations they are protecting.

Impact on existing technologies

One of the most common knock-on effects of introducing new technology is its impact on the existing technology within the organisation. Issues include the struggle to integrate older technologies, compatibility with new formats and standards, processing speed or interface drag on the new technology, and new interfaces that require replacement of equipment and retraining.

Sometimes the old systems or infrastructure have to be thrown out entirely and everything aligned with the new technology. Sometimes the new equipment is installed in such a way that it obscures or gets in the way of old functions or features. In other situations, parallel systems are put in place to allow old and new to function at the same time. This is why the role of integrators is so important at the outset, when thinking of implementing new technology upgrades and systems.

New technology may bring entirely new working and conceptual demands, and current operators may struggle to come to terms with the scope or complexity of these demands. At times, partial or even entire work processes and human interfaces need to be revised and adapted, and jobs redefined or even replaced. The scope of work and interfacing with other departments may increase, and the level of data transfer may change in both volume and nature. The design of these human interfaces is often neglected, and the potential for failure is often rooted in such neglect. Training needs to be increased, transitional processes put in place, and people tested along the way.

Adapting to the machine

One of the most concerning aspects of new technology is the expectation that it will work within the existing, standardised environment. Suddenly, users realise that it is not working properly and that they need greater control in order to provide a standardised environment in which intelligent AI systems can operate. In this case, the failure of the technology to work means that the surroundings and the people have to change in such a way as to allow it to work. This is generally not a problem, as people are adaptable and willing to take on new things that will assist them.

The biggest issue is where the owners of the technology, or the implementers, have not considered the implications for the people and processes in the environment into which they are introducing these new things. In some situations, forced changes to behaviour, working conditions, or social movement create the potential for accusations of social control. Effectively, there are situations where we are forcing people to behave in certain ways so that these ‘intelligent’ systems can work.

Where people feel they are being coerced, they will react, and either passive or even active resistance can occur. Where they feel the changes are not in their best interests, they will resist even more. Yet many implementers of the technology feel that people will simply have to fall in line.

The process of change

Failure to look at technology implementation as a change process is one of the key reasons why new technologies fail. As part of this change, the environment needs to be prepared and people need to be informed effectively. Manipulation at this stage is likely to increase pushback, especially if it is seen to be taking advantage of people.

The contractual provisions for signing up to the e-tolls system in Gauteng, for example, from my perspective showed a total disregard for the rights of users and an opportunistic attitude in compelling people into contractual obligations and the use of their information.

Recently, there has been a spate of notices on websites I use which say that your privacy is important and that they want to protect it, yet they want to force you into an arrangement involving anything from tracking to the distribution of personal information, including the themes of your correspondence, not only to themselves but to associated companies and clients. Yahoo is a good example of this, where new conditions of use and ‘protection’ take substantial rights away from you for marketing and general social use.

We can hardly be surprised, then, when organisations, social movements, or institutions start querying why some parties are implementing technology without consultation, explanation, or a firm framework of policy or governance covering why the technology is being implemented or the information is being collected, especially where this can impact on people. The population at large is generally receptive to things that they think can help them and are good for the community.

Taking this for granted, however, is a sure way to get pushback. San Francisco’s prohibition of facial recognition and other electronic monitoring is one example; querying why people are avoiding facial recognition cameras, as has happened in public places in the UK, is another. Even the best systems are not going to be accepted if people don’t believe in them, whether at the corner shop, the website you visit, the company you work at, or the community you live in.

Dr Craig Donald is a human factors specialist in security and CCTV. He is a director of Leaderware which provides instruments for the selection of CCTV operators, X-ray screeners and other security personnel in major operations around the world. He also runs CCTV Surveillance Skills and Body Language, and Advanced Surveillance Body Language courses for CCTV operators, supervisors and managers internationally, and consults on CCTV management. He can be contacted on +27 11 787 7811 or [email protected]





