[Scary?] AI Can Now Learn To Manipulate Human Behavior


The Conversation just published something I have been worried about for a while now. Scary? It could be getting that way sometime soon. They said: "Artificial intelligence (AI) is learning more about how to work with (and on) humans. A recent study has shown how AI can learn to identify vulnerabilities in human habits and behaviours and use them to influence human decision-making.

It may seem cliched to say AI is transforming every aspect of the way we live and work, but it’s true. Various forms of AI are at work in fields as diverse as vaccine development, environmental management and office administration. And while AI does not possess human-like intelligence and emotions, its capabilities are powerful and rapidly developing.

There’s no need to worry about a machine takeover just yet, but this recent discovery highlights the power of AI and underscores the need for proper governance to prevent misuse.

How AI can learn to influence human behaviour

A team of researchers at CSIRO's Data61, the data and digital arm of Australia's national science agency, devised a systematic method of finding and exploiting vulnerabilities in the ways people make choices, using a kind of AI system called a recurrent neural network and deep reinforcement learning. To test their model, they carried out three experiments in which human participants played games against a computer." Here is the article: https://theconversation.com/ai-can-now-learn-to-manipulate-human-behaviour-155031
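To get a feel for how an agent can learn to steer choices through trial and error, here is a minimal toy sketch. It is our own illustration, not the CSIRO/Data61 code: a simple bandit-style learner discovers which choice framing nudges a simulated participant toward a target option. The participant's "vulnerability" (an 80% bias toward whichever option is presented as the default) and all numbers are invented for illustration.

```python
import random

random.seed(0)

# Simulated participant with a simple bias: picks the option shown
# as the default 80% of the time (an invented "vulnerability").
def participant_choice(default_option):
    return default_option if random.random() < 0.8 else 1 - default_option

TARGET = 1             # the option the agent wants the participant to pick
q = [0.0, 0.0]         # estimated reward for each framing (default = 0 or 1)
alpha, epsilon = 0.1, 0.1

for trial in range(2000):
    # Epsilon-greedy: mostly exploit the best-known framing, sometimes explore.
    if random.random() < epsilon:
        action = random.randrange(2)
    else:
        action = max(range(2), key=lambda a: q[a])
    reward = 1.0 if participant_choice(action) == TARGET else 0.0
    # Incremental update of the reward estimate for the framing used.
    q[action] += alpha * (reward - q[action])

print(q)  # the agent learns that presenting the target as the default pays off
```

The point of the sketch is the feedback loop: the agent never needs a model of human psychology up front; it simply learns, from observed decisions, which presentation reliably produces the outcome it wants.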

How AI detects your emotions by scanning you with radio waves

The same week, an article in Futurism reported: "If new research is to be believed, you may find yourself coming home from work one day in a rotten mood — just to have your smart speaker automatically scan your emotions and start to play soothing music.

That’s one use case for a new neural network that Queen Mary University of London engineers taught how to automatically interpret certain human emotions — by blasting people with radio waves and picking up on emotional cues like changes in their heartbeat. The algorithm can detect feelings including fear, disgust, joy, and relaxation with 71 percent accuracy, according to research published earlier this month in the journal PLOS One. That’s far from perfect, but impressive enough that it could find some real-world use in our lives."
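The classification step can be pictured with a deliberately simplified sketch. This is not the Queen Mary model (which uses a deep neural network on radio-wave signals); it is a toy nearest-centroid classifier over two heartbeat-derived features — mean heart rate and heart-rate variability — with entirely invented per-emotion values, just to show how physiological cues map to emotion labels.

```python
import math
import random

random.seed(1)

# Invented per-emotion centroids: (mean heart rate in bpm, variability in ms).
# These numbers are illustrative assumptions, not measured data.
EMOTIONS = {
    "fear":       (95.0, 20.0),
    "disgust":    (85.0, 30.0),
    "joy":        (80.0, 55.0),
    "relaxation": (65.0, 70.0),
}

def synth_sample(emotion, noise=5.0):
    """Generate a noisy synthetic reading for a known emotion."""
    hr, hrv = EMOTIONS[emotion]
    return (hr + random.gauss(0, noise), hrv + random.gauss(0, noise))

def classify(sample):
    """Label a reading with the emotion whose centroid is nearest."""
    return min(EMOTIONS, key=lambda e: math.dist(sample, EMOTIONS[e]))

# Score the toy classifier on noisy synthetic samples.
correct = total = 0
for emotion in EMOTIONS:
    for _ in range(100):
        total += 1
        correct += classify(synth_sample(emotion)) == emotion
print(f"accuracy: {correct / total:.0%}")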

 Combining a few of these technologies could mean a whole new level of social engineering.  Watch out.

Inside Man Season 3 Now Available

'The Inside Man' is an award-winning KnowBe4 Original Series. Security awareness principles are embedded in each episode to teach your users key cybersecurity best practices, making it fun and engaging to learn how to make smarter security decisions.

From social engineering, insider threats and physical security, to vishing and deepfakes: 'The Inside Man' reveals how easy it can be for an outsider to penetrate your organization’s security controls and network.

Want access to 'The Inside Man' series and all our great security awareness training content?

It’s easy! You can now get access to the KnowBe4 ModStore Preview Portal to see the world's largest library of security awareness content; including 1000+ interactive modules, videos, games, posters, and newsletters. See how entertaining security awareness training can be!

Get Started!

PS: Don't like to click on redirected buttons? Cut & Paste this link in your browser:

