Cybersecurity is an ever-changing field of research, with new forms of security threats appearing often. One increasingly common threat, surveillanceware, involves a malicious individual installing software on a victim’s mobile device that enables secret remote monitoring of the device’s activity. Using a new $1.2 million grant from the National Science Foundation, researchers at the University of Florida will study surveillanceware and develop new AI-based defenses against it.
The project, titled “Countering Surveillanceware Using Deception-Based Generative Models and Systems Mechanisms,” will use AI techniques in combination with system security mechanisms to curtail the effects of surveillanceware. The project will also help broaden cybersecurity research to include the concerns of vulnerable individuals and groups, such as survivors of domestic abuse, whose cybersecurity needs have historically been neglected.
The UF research team is led by principal investigator Vincent Bindschaedler, Ph.D., an assistant professor in the Department of Computer & Information Science & Engineering (CISE), and co-principal investigator Kevin R. B. Butler, Ph.D., a professor in CISE and the associate director of the Florida Institute for Cybersecurity (FICS) Research.
“The system we envision is a deception-based system that uses artificial intelligence techniques, specifically deep generative models, to produce fake but plausible (“synthetic”) data,” Dr. Bindschaedler said. “The synthetic data will be fed to the surveillanceware instead of the victim’s real sensitive data. This will mitigate the privacy threat of surveillanceware actively monitoring the victim’s device, even when the surveillanceware itself cannot be uninstalled.”
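The idea Dr. Bindschaedler describes can be illustrated with a toy sketch. None of the code below comes from the actual project: the names, data, and trust check are all hypothetical, and a simple random sampler stands in for the trained deep generative model the researchers envision. The point is only the architecture — an untrusted caller is served synthetic data in place of the victim’s real data.

```python
# Toy illustration of a deception-based defense (NOT the UF system):
# a data-access gate returns fake but plausible records to suspected
# surveillanceware instead of the real ones.
import random

# Hypothetical sensitive data the victim wants to protect.
REAL_CONTACTS = [
    {"name": "Alice Smith", "phone": "352-555-0101"},
    {"name": "Bob Jones", "phone": "352-555-0102"},
]

FIRST_NAMES = ["Jordan", "Casey", "Riley", "Morgan"]
LAST_NAMES = ["Lee", "Patel", "Garcia", "Nguyen"]

def synthesize_contacts(n, seed=None):
    """Stand-in for a deep generative model: emit fake but
    plausible-looking contact records."""
    rng = random.Random(seed)
    return [
        {
            "name": f"{rng.choice(FIRST_NAMES)} {rng.choice(LAST_NAMES)}",
            "phone": f"352-555-{rng.randint(0, 9999):04d}",
        }
        for _ in range(n)
    ]

def read_contacts(caller_is_trusted):
    """Hypothetical system-level mechanism: trusted apps see real data;
    a suspected surveillanceware caller is silently fed synthetic data,
    even if the surveillanceware itself cannot be uninstalled."""
    if caller_is_trusted:
        return REAL_CONTACTS
    return synthesize_contacts(len(REAL_CONTACTS), seed=42)
```

A trusted caller receives `REAL_CONTACTS` unchanged, while an untrusted one receives decoys of the same shape and size, so the monitoring software has no obvious signal that it is being deceived.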