Scientists showcase SIMPLE TRICK that STOPS invasive AI SURVEILLANCE recognising YOU – Express.co.uk
As technology continues to expand, so too do the capabilities of artificial intelligence (AI) – from security cameras to self-driving cars – and so too have concerns about a technology that was once confined to science fiction. However, a demonstration by two Belgian researchers from the university KU Leuven has shown how a colourful patch can trick such a system into ignoring you completely. In the video, the two scientists face a camera working in tandem with an algorithm designed to identify objects and humans in its surroundings.
One of the researchers is wearing the vinyl-record-sized patch, whereas the other is not.
The AI program identifies one of the researchers as a person, along with a chair in the shot.
The demonstrator holding the patch then passes it to the other: the passer is immediately identified by the program, while the new holder seemingly becomes invisible to the computer.
The pair said the cloaking pattern currently works against only one type of algorithm, but they are looking to see whether they can develop it further.
In a paper published on arXiv, they said: “We believe that, if we combine this technique with a sophisticated clothing simulation, we can design a T-shirt print that can make a person virtually invisible for automatic surveillance cameras.”
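The idea behind such a patch, known in the research literature as an adversarial patch, can be sketched in miniature. The example below is a toy illustration, not the researchers' actual method: a hypothetical fixed linear "person logit" stands in for a real neural-network detector, and the patch pixels are adjusted by gradient descent to drive the detector's confidence down.

```python
import numpy as np

# Toy illustration of adversarial-patch optimisation. A real attack
# backpropagates through a full person detector; here a fixed linear
# "person logit" stands in for the detector, purely as a sketch.
rng = np.random.default_rng(0)
w = rng.normal(size=(16, 16))  # hypothetical weights of the detector's score head

def person_score(patch):
    """Sigmoid of the linear logit: higher means 'more person-like'."""
    return 1.0 / (1.0 + np.exp(-np.sum(w * patch)))

patch = rng.uniform(0.0, 1.0, size=(16, 16))  # start from random noise
initial = person_score(patch)

# Gradient descent on the logit: the gradient of sum(w * patch) with
# respect to the patch is just w, so each step nudges every pixel in the
# direction that lowers the detector's confidence, clipped so the patch
# stays a valid image.
for _ in range(100):
    patch = np.clip(patch - 0.1 * w, 0.0, 1.0)

final = person_score(patch)
print(f"person score before: {initial:.3f}  after: {final:.3e}")
```

In a real attack the same loop runs through the detector's own gradients, with extra terms that keep the patch printable and smooth enough to survive being photographed on clothing.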
Experts say the trick could be used for many purposes – though not all of them benign.
The same technique could be used by thieves or criminals hoping to go unnoticed by surveillance cameras running artificial intelligence software.
This would make criminals harder for police to apprehend, as officers would lose one of their most valuable sources of information.
As AI finds its way into more self-driving cars, the trick also raises the possibility that such vehicles could fail to recognise every individual on the road, resulting in accidents.
The use of cloaking against facial recognition software has become a hot topic, as critics warn that overuse of the technology could give way to a culture of mass surveillance.