Leveraging Computational Intelligence Techniques for Defensive Deception: A Review, Recent Advances, Open Problems and Future Directions.
- Source: Sensors (14248220); Mar2022, Vol. 22 Issue 6, p2194, 26p
- Publication Year: 2022
Abstract
- With information systems worldwide under daily attack, analogies from traditional warfare are apt, and deception has historically proven effective as both a strategy and a technique for defense. Defensive Deception involves thinking like an attacker and determining the best strategy to counter common attack strategies. Defensive Deception tactics are effective at introducing uncertainty for adversaries, increasing their learning costs, and, as a result, lowering the likelihood of successful attacks. In cybersecurity, honeypots, honeytokens, camouflaging, and moving target defense commonly employ Defensive Deception tactics. Deceptive and anti-deceptive technologies have been created for a variety of purposes. However, there is a critical need for a broad, comprehensive, and quantitative framework that can help deploy advanced deception technologies. Computational intelligence provides an appropriate set of tools for creating such frameworks; it comprises two significant families of artificial intelligence technologies, machine learning and deep learning, which can be applied in a variety of Defensive Deception settings. This survey focuses on Defensive Deception tactics deployed with the help of deep learning and machine learning algorithms. Prior work has yielded insights, lessons, and limitations that are presented in this study. The survey culminates with a discussion of future directions, which helps address important gaps in present Defensive Deception research. [ABSTRACT FROM AUTHOR]
Details
- Language: English
- ISSN: 14248220
- Volume: 22
- Issue: 6
- Database: Complementary Index
- Journal: Sensors (14248220)
- Publication Type: Academic Journal
- Accession number: 156096670
- Full Text: https://doi.org/10.3390/s22062194