
Artificial intelligence as a weapon for criminals


Europol, in collaboration with UNICRI (the United Nations Interregional Crime and Justice Research Institute) and Trend Micro, has recently published a report with the eloquent title “Malicious Uses and Abuses of Artificial Intelligence”, which focuses on new technologies, such as artificial intelligence, that are being used by criminals.

The document is quite extensive, at 80 pages, and outlines the current scenario and the use cases of AI for criminal purposes. It then examines future scenarios and how attacks based on this technology will evolve. Finally, it offers recommendations to minimize the possible risks and damage.

Several pages are dedicated to the well-known deep fake phenomenon, i.e. the technique that makes it possible to replace a face in a video at will and make the result look as real as possible.

For the moment, known and well-documented cases are found mainly in the world of porn, where this technique is used to insert the faces of actresses from famous and lesser-known films into videos with sexual content; for obvious reasons, these videos are not found on the main erotic portals.

This is not a limited phenomenon, as it could be applied in any sector, for example in politics, using the faces of famous or important people to put them in a bad light. Imagine the face of a company’s CEO ending up in a porn video or other questionable content: the consequences would affect not only the person involved, but also the image of the company they manage.

Criminal uses of artificial intelligence

Deep fakes are obviously just the tip of the iceberg, as the use and abuse of artificial intelligence goes far beyond simply replacing a face in a video. It is in fact possible to train malware to bypass protections or adapt to the system into which it is injected, making it self-sufficient and extremely difficult to eradicate.

We are not talking about attacks on individual users, but rather on entire companies, possibly even competitors. The same could happen to strategic structures such as public administration, from municipalities to hospitals, so it is easy to see how this type of attack could become a real scourge.

Malware could be created that encrypts systems and demands a ransom according to parameters it deems suitable, both in amount and in method: it could be clever enough to ask for the ransom in BTC or XMR and independently verify whether or not the transaction has been carried out, without any intervention from its creator, making it virtually untraceable.
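
To give an idea of how little infrastructure such an automated check requires, here is a minimal sketch (not taken from the report) that asks a public block explorer whether a given Bitcoin address has received at least a certain amount. It assumes Blockstream’s Esplora API, and the address and amount are hypothetical; this is the same mechanism any wallet or merchant checkout uses to verify an on-chain payment.

    import json
    import urllib.request

    # Blockstream's public Esplora API (an assumption for this sketch; any block
    # explorer with an address endpoint would do).
    ESPLORA_ADDRESS_URL = "https://blockstream.info/api/address/{address}"


    def satoshis_received(address: str) -> int:
        """Total amount (in satoshis) ever received by a Bitcoin address, confirmed on-chain."""
        with urllib.request.urlopen(ESPLORA_ADDRESS_URL.format(address=address), timeout=10) as resp:
            stats = json.load(resp)
        # 'funded_txo_sum' is the sum of all outputs ever paid to this address.
        return stats["chain_stats"]["funded_txo_sum"]


    def payment_received(address: str, expected_btc: float) -> bool:
        """True if the address has received at least `expected_btc` (1 BTC = 100,000,000 satoshis)."""
        return satoshis_received(address) >= round(expected_btc * 100_000_000)


    if __name__ == "__main__":
        # Hypothetical address and amount, purely for illustration.
        addr = "bc1qexampleaddress0000000000000000000000000"
        print("paid" if payment_received(addr, 0.05) else "not yet paid")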

As Edvardas Šileris, Head of Europol’s Cybercrime Centre, points out:

“AI promises the world greater efficiency, automation and autonomy. At a time where the public is getting increasingly concerned about the possible misuse of AI, we have to be transparent about the threats, but also look into the potential benefits from AI technology”.

Vincenzo Ciancaglini, Senior Threat Researcher at Trend Micro, also expressed concern, but remained optimistic on the subject:

“Cybercriminals have always been early adopters of technologies, and artificial intelligence is one of them. As the study points out, it is already used to guess passwords, break CAPTCHA and clone voices, but other uses are in the process of being defined, we are very pleased to team up with Europol and UNICRI to increase the level of awareness of these threats, creating a safer digital future for all”.

Finally, it is worth noting that, just like criminals, companies and entire sectors need to invest resources in this area and adapt to this type of threat: a company that develops a prototype AI without putting appropriate security measures in place could be fooled by another AI able to pass itself off as secure code and bypass its protections.

It is clear that action must also be taken on advanced and biometric security, as criminals could create software that emulates the biometric credentials needed to access particular information. Digitally fabricating a fingerprint would be enough to defeat a level of security that was previously considered safe. This is particularly serious if the fingerprint is used, for example, to access a nuclear repository, or to unlock and start an electric car.

