HICSS-53 Digital Government Track
53rd Hawaii International Conference on System Sciences
January 7-10, 2020 - Grand Wailea, Maui, HI, USA

Cyber Deception for Defense

Description

Creating a system that is always protected and secure in all situations against all attackers is a far-reaching and likely impossible goal. It is important for researchers to continue moving systems closer to absolute security, but it is also essential to create techniques so that a system can adaptively defend against an attacker who circumvents its current security. Deception for cyber defense works toward that goal: it rebalances the asymmetric nature of computer defense by increasing the attacker's workload while decreasing that of the defender.

Cyber Deception is one defensive technique that considers the human component of a cyber attack. Deception holds promise as a successful tactic for making an attacker’s job harder because it does more than just block access: it can also cause the attacker to waste both time and effort. Moreover, deception can be used by a defender to impart an incorrect belief in the attacker, the effects of which can go beyond any static defense. Understanding the human cognition and behavior of both the cyber defender and cyber attacker is a critical component of cybersecurity.

In the cyber world, an attacker knows only what can be perceived through observation of the target network. The intruder is often thousands of miles away from the network to which he or she is attempting to gain entry. Networks often unintentionally provide more information to an attacker than defenders would like. However, the network owner also has the opportunity to reveal information he or she wants the attacker to know, including deceptive information. Because network information is often complex and incomplete, it provides a natural environment in which to embed deception since, in chaos, there is opportunity. Deception can alter the mindset, confidence, and decision-making process of an attacker, which can have more significant effects than traditional defenses. Furthermore, using deception for defensive purposes gives the defender at least partial control of what an attacker knows, which can provide opportunities for strategic interaction with an attacker.

These research efforts require an interdisciplinary approach, and this minitrack solicits papers across multiple disciplines. It is essential to understand attacker and defender cognition and behavior in order to effectively and strategically induce (for attackers) and reduce (for defenders) cognitive biases and cognitive load, making it more difficult for cyber attacks to succeed.

Topics of interest include (but are not limited to):

  • Science of Deception (e.g., evaluation techniques, deception frameworks applied to cyber);
  • Practice of Cyber Deception (e.g., case studies, deception technology, deception detection);
  • Understanding/influencing the cyber adversary (e.g., adversary emulation, measures of effectiveness);
  • Psychological and social-cultural adversarial mental models that can be used to estimate and predict adversarial mental states and decision processes;
  • Cognitive Modeling of cyber tasks;
  • Adversary observation/learning schemes through both active multi-level “honey bait” systems and passive watching, in conjunction with active learning and reasoning to deal with partial information and uncertainties;
  • Oppositional Human Factors to induce cognitive biases and increase cognitive load for cyber attackers;
  • Metrics for quantifying deception effectiveness in driving adversary mental state and in determining optimized deception information composition and projection;
  • Experimental Design, approaches, and results;
  • Theoretical formulation for a one-shot or multiple rounds of attacker/defender interaction models;
  • Identification of social/cultural factors in mental state estimation and decision manipulation process;
  • Cyber maneuver and adaptive defenses;
  • Protecting our autonomous systems from being deceived;
  • Policy hurdles, solutions, and case studies in adoption of cyber deception technologies.


Minitrack Leaders

Kimberly Ferguson-Walter is a Senior Research Scientist with the National Security Agency's Information Assurance Research Group. She earned a BS in Information and Computer Science from the University of California, Irvine and an MS in Computer Science from the University of Massachusetts Amherst, both specializing in artificial intelligence. She is currently a PhD candidate at the University of Massachusetts Amherst with a focus on adaptive cybersecurity (degree expected in 2019). Her research interests are focused on the intersection of computer science and human behavior. She has a background in machine learning, has focused on adaptive cybersecurity at the agency for the past eight years, and is the lead for the Research Directorate's deception for cyber-defense effort. She organizes an annual International Cyber Deception Workshop and also organized a 2018 International Workshop on Autonomous Cyber Operations. She acts as an advisor to the Science & Technology Advisor Council on matters involving Cyber and Autonomy.

Dr. Sunny Fugate is a civil servant for the Navy's SPAWAR System Center Pacific and the center's Senior Scientific Technical Manager (SSTM) for Cyber Warfare. During the last 16 years, Dr. Fugate has run numerous research programs exploring the intersections of cyber defense, cognitive science, game theory, and artificial intelligence. Dr. Fugate earned his BS in Electrical Engineering from the University of Nevada in 2002 and his PhD in Computer Science from the University of New Mexico in 2012. Dr. Fugate has also worked in several embedded positions, including the Joint Task Force for Global Network Operations, the Defense Threat Reduction Agency, and the Naval Information Operations Center Hawaii. Dr. Fugate's current efforts are focused on improving the human factors of cyber defense and on exploring opportunities to improve cyber defense using defensive deception and game theory. Dr. Fugate hosted the 2018 Cyber Shorelines workshop, which focused on the use of cyber deception to protect safety and privacy and on how we might simultaneously protect autonomous systems from being deceived.

Cliff Wang graduated from North Carolina State University with a PhD in computer engineering in 1996. He has carried out research in the areas of computer vision, medical imaging, high-speed networks, and, most recently, information security. He has authored over 50 technical papers and 3 Internet standards RFCs. Dr. Wang has also authored/edited 18 books in the area of information security and holds 3 US patents on information security system development. Since 2003, Dr. Wang has been managing the extramural research portfolio on information assurance at the US Army Research Office. In 2007, he was selected as the director of the computing sciences division at ARO while at the same time continuing to manage his program in cyber security. Over the past ten years, Dr. Wang has managed over $250M in research funding, which has led to significant technology breakthroughs. Dr. Wang also holds adjunct professor appointments in both the Department of Computer Science and the Department of Electrical and Computer Engineering at North Carolina State University. Dr. Wang is a Fellow of the IEEE. Dr. Wang organizes the International Workshop on Cyber Deception and Defenses.

Co-Chairs

Kimberly Ferguson-Walter
(Primary Contact)
Department of Defense
Email: kjfergu@spawar.navy.mil


Sunny Fugate 
SPAWAR System Center Pacific
P.O. Box 163
Email: fugate@spawar.navy.mil


Cliff Wang 
Army Research Office
Email: xiaogang.x.wang.civ@mail.mil