
The Center for Advanced Red Teaming

The Global Focal Point for Red Teaming Research, Training and Practice

About

The Center for Advanced Red Teaming (CART) is an interdisciplinary Research Center within the College of Emergency Preparedness, Homeland Security and Cybersecurity at the University at Albany.

 

Learn More About CART

The complexity inherent in the security context and other competitive environments of the 21st century presents multiple opportunities for Red Teaming – which encompasses a range of activities that seek to emulate the actions of adversaries – to play a role beyond its military origins. For example, the Red Teaming approach could benefit a variety of areas, from counterterrorism, cybersecurity and emergency response, to corporate espionage and business strategy. 

Potential contributions of Red Teaming include, but are not limited to, the following:

  • Testing of defensive systems and architectures to evaluate their effectiveness;
  • Identifying previously unrecognized vulnerabilities at the operational level;
  • Helping to identify emerging threats and in turn set new requirements for policy and investment at the strategic level;
  • Training of response personnel by exposing them to realistic threats that they might face;
  • Providing validation of novel tools and models that would otherwise be infeasible or too expensive to validate using empirical data.

However, both the art and science of Red Teaming are underdeveloped, with limited research into how Red Teaming is best conducted and no accredited academic training or education programs. As the first academic center devoted to advancing the art and science of Red Teaming, CART seeks to address a conspicuous need, identified by practitioners, for both research and education in this growing area of security studies.

What We Do
Advanced Red Teaming

CART develops and employs new methodologies to assist a variety of sponsors in designing, conducting and evaluating advanced Red Team exercises. We engage with entities in the public and private sectors to enhance current risk, threat, and vulnerability assessments, and to improve their ability to anticipate, prevent and mitigate unwelcome shocks to their operations. Leveraging deep expertise in social and behavioral sciences, management research and engineering principles helps ensure that Red Teaming provides sponsors with maximum value added in terms of their mission goals.

Research

CART applies cutting-edge scientific research techniques to validate and enhance existing Red Teaming practice. CART researchers conduct a variety of research efforts to improve the art and science of Red Teaming for the betterment of the field overall, drawing on research and analysis methods from a wide range of disciplines, including controlled experiments, surveys, statistical analysis, and computational modeling. CART is committed to publishing the results of its research in ways that provide direct support to Red Team practitioners and scholars.

CART explores new applications for Red Teaming. CART researchers assess the extent to which Red Teaming can contribute to new domains, as well as how existing practices need to be modified to adapt to new contexts. It also works to integrate Red Teaming into a wide array of risk assessment tools, including computational risk models and response exercises.

Clearinghouse

CART serves as a clearinghouse for Red Teaming best practices and other resources. CART will serve as a repository of Red Teaming literature and guidance documents. It will also facilitate linkages across the broader Red Teaming community by providing both online forums for collaboration and convening in-person meetings to advance the science and practice of Red Teaming.

Training and Education

CART seeks to design and conduct accredited education and training in Red Teaming. CART's offerings will supplement existing training modules available at the Federal Law Enforcement Training Centers and within the U.S. Army's University of Foreign Military and Cultural Studies. Initially, CART's activities will provide experiential learning opportunities for students at the University at Albany and beyond. Planned offerings include a Graduate Certificate in Red Teaming for Defense for both University students and working practitioners, as well as a concentration in Red Teaming for CEHC Master's Programs.

CART will explore the pedagogy of Red Teaming. CART will examine the extent to which Red Teaming activities can function as a case-based learning approach that can be embedded in a variety of courses at various levels of education, from K-12 through graduate studies.

Primary Goals

  • Employing interdisciplinary research and analysis techniques to advance the art and science of Red Teaming in support of the public and private sectors.
  • Serving as a clearinghouse for Red Teaming best practices.
  • Advocating for scientific approaches to Red Teaming.
  • Educating and training the next generation of Red Teaming practitioners, emphasizing the strengths that diversity brings to the Red Teaming enterprise.
  • Facilitating the recognition and implementation of Red Team results into policy and practice.

Contact CART
ETEC
University at Albany

1220 Washington Avenue
Albany, NY 12226
United States

Our Team

 

Leadership

Gary Ackerman
Associate Professor and Associate Dean
College of Emergency Preparedness, Homeland Security and Cybersecurity
Director, Center for Advanced Red Teaming
Brandon Behlendorf
Assistant Professor
College of Emergency Preparedness, Homeland Security and Cybersecurity
Deputy Director, Center for Advanced Red Teaming

 

Staff

Jenna Latourette

Research Associate

[email protected]

 

Mike Mieses

Research Associate

[email protected]

 

Hayley Peterson

Research Associate

[email protected]

 

Anna Wetzel

Research Associate

[email protected]

 

 

Affiliated Faculty

 

Advisory Board

Ana Aslanishvili
Red Team Lead - Facebook

Amaury T. Cooper
Deputy Director, Global Security - Global Communities

Dr. Scott Crino
Co-Founder and CEO - Red Six Solutions LLC

Michael G. Deal Jr.
Intelligence Community Government Contractor, Red Team Lead Analyst

Mark French
Director - University of Foreign Military and Cultural Studies, United States Army Combined Arms Center

Lt. Col. William Johnson
Deputy Director - School of Advanced Warfighting

Dr. Kathleen Kiernan
CEO and Founder - Kiernan Group Holdings

Dr. Mark Mateski
Founder - Red Team Journal

Brian McDermott
Red Team Analyst / Facilitator

Jason Pinegar
Director - Red Team Index Division, Transportation Security Administration

Chad Treboniak
Owner - Critical Ops LLC

 

Spring 2022 Interns

Erik Chung

Tyler Flood

Ethan Fowler

Melissa Jennings

Eric Marsden

Cailee Navarro

Deirdre Occhino

Elyssa Thomas

Sophie Vieni

Resources

CART is committed to providing user-friendly, open-access resources for the Red Teaming community. Included below are resources produced by CART, as well as resources produced by other organizations. Please check back often as new resources will be added periodically.

 


Best Practices

A collaborative effort between CART and the wider Red Teaming community to distill a set of best practices that can help improve the conduct of Red Teaming.

As a collaborative effort, CART wants the Red Teaming community to have input into this process. We therefore encourage members of the Red Teaming community to submit feedback, which can be used to suggest modifications to a listed best practice or its wording, to supply additional evidence for or against a listed best practice, and – hopefully – to recommend additional best practice candidates for CART researchers to analyze.

The best practices listed here, together with their ratings, are thus always provisional and will be updated as new evidence comes to our attention. What is posted at any point in time represents the best information currently available to the CART team.

For each best practice candidate, CART researchers have conducted an analysis to determine:

  1. What, if any, empirical support there is for the best practice.
  2. The degree of consensus among Red Teaming experts on the best practice.

These two factors are then assessed together to determine the relative “strength” of the claimed best practice, according to the key below. Each best practice also lists the number of sources that mention the best practice, as well as the Red Teaming contexts in which the best practice is likely to be applicable.
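To make the combination rule concrete, the assessment could be sketched as a simple decision function. This is purely illustrative: the factor values ("multiple", "some", "none" for empirical support; "broad", "limited" for consensus) and the combination logic are assumptions for the sketch, not CART's actual rubric.

```python
from enum import Enum


class Strength(Enum):
    """The three strength ratings used in the key below."""
    STRONGLY_SUPPORTED = "Strongly Supported"
    MODERATELY_SUPPORTED = "Moderately Supported"
    WEAKLY_SUPPORTED = "Weakly Supported"


def rate_best_practice(empirical_support: str, expert_consensus: str) -> Strength:
    """Combine the two assessment factors into a provisional strength rating.

    empirical_support: "multiple" (several streams of evidence),
        "some" (partial or narrow-context evidence), or "none".
    expert_consensus: "broad" or "limited".
    The thresholds here are hypothetical, chosen only to mirror the
    qualitative key descriptions.
    """
    if empirical_support == "multiple" and expert_consensus == "broad":
        return Strength.STRONGLY_SUPPORTED
    if empirical_support == "some":
        return Strength.MODERATELY_SUPPORTED
    # Widely proposed but with little, no, or contradictory evidence.
    return Strength.WEAKLY_SUPPORTED
```

Because the ratings are provisional, a practice's output would shift as new evidence moves it between the "multiple", "some", and "none" categories.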

 

Strongly Supported

Multiple streams of empirical evidence from a number of Red Teaming contexts argue in favor of the best practice.


None identified so far.

Moderately Supported

Some, but not conclusive, empirical evidence in favor of the best practice, or the scope of the best practice is limited to a narrow context, such as only one type of Red Teaming.


Almost always avoid groupthink when Red Teaming

Groupthink introduces powerful biases that skew simulation results and strenuous efforts should be made to avoid it. The sole exception is when simulating adversaries who tend to display groupthink in their own decision making, in which case intentionally introducing groupthink effects can increase the fidelity of the Red Teaming.

Sources Discussing the Best Practice:

Contexts Represented in Sources: General Red Teaming

Overall Assessment of Best Practice: While the cognitive phenomenon of groupthink is well-documented in the psychology literature, there is little direct evidence of its impact on Red Teaming specifically. However, it is widely asserted in the Red Teaming domain that groupthink is generally to be avoided, and indeed that Red Teaming can be a corrective to groupthink in organizations.

Weakly Supported

Widely proposed as a best practice, but little to no empirical evidence, or empirical evidence that is contradictory.


A Red Team must be able to operate independently

For the results of its activities to be credible and analytically useful, a Red Team cannot be influenced by, or appear to be influenced by, parties or concerns outside of the simulation process.

Sources Discussing the Best Practice:

Contexts Represented in Sources: General Red Teaming; Other (Military Planning)

Overall Assessment of Best Practice: There is consensus in the Red Teaming literature, with anecdotal evidence but no empirical validation.


Leadership must buy into Red Teaming process

Without support from organization leadership (especially direct supervisors), it becomes less likely that the resources necessary for appropriate Red Teaming activities will be sustained or that the results of the Red Teaming will shape organization decisions or behaviors. 

Sources Discussing the Best Practice:

Contexts Represented in Sources: General Red Teaming; Cyber Penetration Testing; Other (Military Planning)

Overall Assessment of Best Practice: There is consensus in the Red Teaming literature, with anecdotal evidence but no empirical validation. Most sources cite the 2003 Defense Science Board report.


Avoid mirror imaging bias

Mirror imaging (attributing one's own beliefs and thinking to the adversary) is detrimental to Red Teaming because it prevents Red Team members from considering the adversary's point-of-view and thus undermines a core aim of the Red Teaming approach.

Sources Discussing the Best Practice:

Contexts Represented in Sources: General Red Teaming

Overall Assessment of Best Practice: Although there are no direct empirical tests in the Red Teaming literature, this appears to be universally accepted among the Red Teaming community as an essential best practice.


Red Teams generally benefit from diversity

In almost all cases, Red Teaming is more effective if there is diversity among Red Team members in terms of their knowledge, experience, demographics and/or cultural background. This allows for a broader range of perspectives, more multi-faceted analysis and more synergistic collaboration amongst team members.

Sources Discussing the Best Practice:

Contexts Represented in Sources: General Red Teaming; Cyber Penetration Testing

Overall Assessment of Best Practice: There is consensus in the Red Teaming literature, with anecdotal evidence but no empirical validation.


A Red Team needs a clear mandate

Successful Red Teaming begins with defining the scope and objectives of the team. This helps to ensure that the team remains focused on the intended issues, assists with creating benchmarks for evaluating the team's performance, and increases the likelihood of its outputs being accepted and implemented. 

Sources Discussing the Best Practice:

Contexts Represented in Sources: General Red Teaming

Overall Assessment of Best Practice: There is consensus in the Red Teaming literature, although there has been no empirical validation.


Judicious Application

Red Teaming should be applied as needed to solve the prescribed problem being faced by the organization, but no more than that.

Sources Discussing the Best Practice:

Contexts Represented in Sources: General Red Teaming; Cyber Penetration Testing

Overall Assessment of Best Practice: The low number of sources for this Best Practice is likely because many sources focus on the process or best practices for a single Red Team engagement, rather than a Red Team program or series of engagements.


Record All Outputs

Capturing and documenting all outputs from a simulation is important to ensure successful delivery of feedback to participants. Equally important is providing a review for participants during the simulation, when learning opportunities are presented by the identification of vulnerabilities or poor performance (Kleiboer, 1997).

Sources Discussing the Best Practice:

Contexts Represented in Sources: General Red Teaming; Other (Military Planning)

Overall Assessment of Best Practice: Many sources appear to imply this Best Practice without explicitly talking about it. Data collection seems to be considered a fairly obvious, implied function of a Red Teaming exercise. Even though the basic idea of data collection is implied, there are few discussions of methodology or best practices for this particular function.


Collect as much information as possible about the target

A critical preliminary step in Red Teaming is gathering thorough information on as many aspects of the targeted entity as possible, including such aspects as its infrastructure, operations, stakeholders, defenses and competitors. Of particular importance is understanding where the entity places the most value, i.e. its “crown jewels”, since this will differ across targets.

Sources Discussing the Best Practice:

Contexts Represented in Sources: General Red Teaming; Cyber Penetration Testing

Overall Assessment of Best Practice: This has been recommended by several sources, but there have been no direct empirical tests in the literature.


The Red Team must be involved in all stages of an effort, from initial planning through implementation and reporting

This ensures that crucial Red Team principles are considered during both the design and conduct of an exercise or penetration and allows for alternative analysis to play a role throughout.

Sources Discussing the Best Practice:

Contexts Represented in Sources: General Red Teaming; Other (Military Planning)

Overall Assessment of Best Practice: There appears to be consensus in the Red Teaming literature, although there has been no empirical validation.


Inhabit the Adversary’s Mindset

The Red Team must attempt to accurately emulate the adversary’s mindset, by internalizing (but not necessarily sympathizing with) the motives of the adversary and making sure that their portrayal is precise and detailed. Although Red Team members might be able to perceive multiple possibilities for action, they should make decisions based only on factors which reflect the adversary’s cultural influences and biases rather than their own or those of the organization conducting the Red Teaming. Red Teamers should actively avoid ethnocentrism, which causes the Red Team to be “blind to the ability to see the world through the eyes of another national or ethnic group.” (Ken Booth, Strategy and Ethnocentrism, New York: Holmes & Meier, 1979, 15.)

Sources Discussing the Best Practice

Contexts Represented in Sources: Military Wargaming, General Red Teaming

Overall Assessment of Best Practice: The context rating is based on the assessment that this practice applies only to Red Teaming in which the adversary is simulated by a human, so it does not apply to some contexts.


Maintain a Cooperative Attitude

It must always be remembered that the Red Team exists to improve the defense (i.e., to serve the Blue Team). This requires that the Red Team maintain a cooperative and constructive attitude throughout the process, especially during after-action briefings to the Blue Team. It is important to present exercise results in a manner that explains how the organization can utilize the results to improve its practices. The emphasis should thus be on learning, progress, and mutual trust, while smug or condescending “Gotcha!” attitudes should be avoided.

Sources Discussing the Best Practice

  • Craig, Susan. "Reflections from a Red Team Leader," Military Review, 60 (2007).
  • Lauder, Eles, and Banko. "The Glaucus Factor: Red Teaming as a Means to Nurture Foresight," Canadian Army Journal (2012).
  • Zenko, Micah. Red Team: How to Succeed by Thinking Like the Enemy. New York: Basic Books (2015).

Best Practices Feedback


Newsletter

Subscribe to our Newsletter

Thank you for your interest in the Center for Advanced Red Teaming (CART) and "The Red Siren," our quarterly newsletter.

The newsletter is designed to be a compact, interesting resource for the Red Teaming community. Most issues of the newsletter will feature the latest news about CART, recent additions to CART’s ongoing compilation of Red Teaming Best Practices, an opinion piece from a leading Red Teaming practitioner or expert, and a catalog of broader developments and publications associated with Red Teaming.


Research

Launched in November 2019, CART has undertaken a number of projects and activities designed to advance the art and science of Red Teaming in support of both the public and private sectors. In addition to the projects introduced below, CART has also supported private sector threat assessments through Red Teaming, conducted Red Team training simulations for the National Defense University, and piloted the use of Red Teaming as a teaching/training tool for homeland security students.

For more information, please contact Douglas Clifford at [email protected].

 

Validating Adaptive Behavior Models of Adversaries for Risk Assessment (VABMARA) Framework

Funder: Department of Homeland Security, Science & Technology Directorate through the Center for Accelerating Operational Efficiency at Arizona State University.

Questions:

  • Year 1 - How do the presence and knowledge of Computed Tomography (CT) screening capabilities influence adversary decision-making in an aviation environment?
  • Year 2 - Do organizational structure and operations influence the selection of hard vs. soft targets by adversarial organizations?
  • Year 3 - Which security infrastructures serve to visually deter a potential adversary from moving through a passenger screening environment?

Approach:

  • Year 1 – Distributed Red Teaming exercise involving 178 novice and expert red teamers focused on the development of attack plans and subsequent modifications due to experimental injects about CT screening capabilities. Results were triangulated across multiple methods, including historical case studies, utility decision models, and game theoretic counterterrorism models.
  • Year 2 – Distributed Red Teaming exercise involving 200 novice red teamers assigned organizational profiles, assessing variation in target preferences and ultimate selection of specific hard vs. soft targets.
  • Year 3 – Two phases, including a nationally representative conjoint experiment of 2,000 participants testing perceived deterrence cues based on simulated security infrastructure at passenger screening environments in aviation, intercity passenger rail, and cruise POEs. The second phase incorporates results from Phase 1 in a distributed Red Teaming exercise focused on attack planning through passenger screening.

Innovation: First attempt to assess whether Red Teaming results could be empirically similar to those from historical case studies and thus validate advanced decision models. Also developed DESSRT, or Distributed Empirical Structure Scalable Red Teaming, which allows implementation of tactical Red Teaming at scale.

Outputs/Outcomes: Results from Year 1 find that Red Teaming can validate models of adaptive adversary behavior, especially for questions or environments where historical data is limited. Results also show that novices and experts exhibit similar Red Teaming behavior, allowing for the expansion of Red Team role players within the security environment for generalizable results. Finally, the availability of CT scanning equipment and information led to some changes in adversary tactics, specifically in security evasion and weapon package selection. Analysis of results from subsequent years is underway.

 

Employing Red Teaming for Countering International Proliferation (CIP Kit)

Funder: Export Control and Border Security program, Department of State

Questions: Can Red Teaming be used by foreign partners to self-identify key vulnerabilities in export control and licensing operations related to origination and transshipment of proliferation-sensitive technologies?

Approach: Developed a self-administered Red Teaming “kit,” allowing partner nations to design, execute, and analyze their own Red Teaming operations with Customs and licensing personnel. The kit was translated into three languages (Ukrainian, Georgian, and Azeri) and pilot tested with stakeholders.

Innovation: First “Red Teaming in a box” kit for foreign partners within a counterproliferation context.

Outputs/Outcomes: Kit developed, and delivery via international engagements with key foreign partners is currently underway.

 

Red Teaming the Post-COVID-19 Biological Weapons (BW) Threat Landscape (2021)

Funder: Department of Defense, Defense Threat Reduction Agency

Questions: How might COVID-19 impact the strategic decision making of states that currently do not possess a robust BW program? Which decision elements might precipitate changes in strategic BW decisions by state leaders?

Approach: An asynchronous, immersive Red Team simulation focused on 30 selected countries not known to currently be pursuing offensive BW. Each country was evaluated by 8 experts and 2 naïve participants who role-played the country's leaders, split equally between government and non-government backgrounds and between country and technical expertise. Both initial assessments and counterfactual prompts were used to assess the strategic direction and characteristics of possible BW programs.

Innovation: Asynchronous Strategic Dynamics Red Teaming - a distributed, low-resource tool to simulate multiple red perspectives and provide preliminary threat assessment and early warning of strategic change in WMD postures.

Outputs/Outcomes: An overall threat ranking of potential future pursuers of BW that enumerated pre- to post-COVID changes and yielded insights into the decision making underlying these choices.

 

Experimental Red Teaming to Support Integration of Information in Joint Operations (2021)

Funder: Strategic Multilayer Assessment program of the Department of Defense

Questions: The project explored a dozen different hypotheses regarding the nature of disinformation and the optimal response to disinformation.

Approach: Six scenario-based Red Team experiments using 223 U.S.-based proxy participants from cultural backgrounds similar to actual adversary target populations (Taiwan for the Asian context and several Southeast European countries for the European region). These experiments collected data on several measures of messaging effectiveness to investigate a dozen insights regarding the competitive information environment with respect to Great Power Competitors.

Innovation: Exposing the hypotheses generated by experts to realistic simulations involving disinterested participants at scale.

Outputs/Outcomes: Several counterintuitive results were obtained regarding the best way to respond to disinformation. The project demonstrated how the use of an integrated human simulation approach (experiments plus table-top exercises) can both validate insights provided by experts and reveal new dynamics in complex systems.

 

 “The Storm After the Flood” (2021)

End User: Mad Scientist Initiative of the U.S. Army Futures Command

Questions: How might weaponized information evolve when used against the U.S.?

Approach: A live, virtual wargame across three rounds, consisting of six government and academic experts playing various high-level U.S. government roles as the Blue Team, as well as over 250 attendees acting as a “Pink Team,” deciding on the adversary’s next moves by selecting from a range of prepared audiovisual “injects.” The scenario began with U.S. military forces stepping in to assist after a major flood in Southeast Asia and led to an adversary initiating a complex, multi-modal and multi-system information operation against the United States over three rounds of play.

Outputs/Outcomes: The wargame was well received, with over 93% of attendees viewing the exercise as useful and several important insights arising regarding how to defend against weaponized information.

 

Red Teaming Great Power Competition in the CENTCOM AOR (2020)

End User: Strategic Multilayer Assessment program of the Department of Defense

Question: How might Great Power Competition and regional dynamics change following the targeted killing of Qassem Soleimani?

Approach: Multi-round simulation, with the PRC, Iran, and Russia as Red Teams, the United States as the Blue Team, and Saudi Arabia, Israel and the EU as Green (or allied) Teams. Four simulation sessions were conducted (three expert sessions and one student session), collecting a variety of strategic information, including: strategic objectives, assumptions, and risk proclivities; shorter-term “operational” objectives, overt and covert actions; and a post-exercise strategic assessment.

Innovation: Developed Strategic Dynamics Red Teaming (SDRT), a wargaming technique that varies Red Team players across multiple simulations, while keeping Blue and Green Teams constant.

Outputs/Outcomes: Beyond merely narrative output, the multiple simulations allowed for sophisticated analysis and demonstrated that SDRT is capable of rapid, low-cost explorations of complex strategic dynamics in an AOR.

Publications

News

 

2021

Red-Teaming the Post-Covid-19 Biological Weapon Threat Landscape. Global BioDefense.

 

2020

CEHC Red Team Exercise Studies Threat of Biological Warfare. UAlbany News Center.

UAlbany Red Team Exercise Studies Threat of Biological Warfare. Homeland Security Today.

CAOE researchers use strategic dynamic red teaming to study the post COVID-19 biological weapons threat landscape. CAOE.

Insights from the Mad Scientist Weaponized Information Series of Virtual Events. TRADOC (October 19, 2020).

CART Co-Hosting Wargaming Exercises to Prepare for “The Storm after the Flood” - Weaponized Information: The storm after the flood. U.S. Army.

CEHC to Launch Center for Advanced Red Teaming (CART) This Semester. UAlbany News Center.

Red Teaming Great Power Competition in the USCENTCOM AOR. NSI.

 

2019

UAlbany Launching Nation's First Red Teaming Center. WAMC.

University at Albany Launches Nation’s First Center for Advanced Red Teaming. Homeland Security Today.

UAlbany College of Emergency Preparedness, Homeland Security and Cybersecurity to Launch Nation’s First Center for Advanced Red Teaming. NewsWise.

 

Presentations

Beyond Pen-Testing: Tips for Red Teaming the Cyber-Physical Adversary. A recording from the Cyber Salon Series - The Center for Cyber Strategy and Policy (CCSP) by Dr. Gary Ackerman on February 18, 2021.

Validating A New Validation Approach: Comparing Risk Models, Human Simulation And Ground Truth In Adversary Tactical Choice. Presented by Dr. Gary Ackerman and Anna Wetzel at the Society for Risk Analysis (SRA) 2021 Annual Meeting on Risk Science and the Policy Interface, Decision Analysis and Risk, on December 7, 2021.

Extension of Red Teaming Into Strategic Risk Analysis. Presented by Dr. Brandon Behlendorf and Douglas Clifford at the Society for Risk Analysis (SRA) 2021 Annual Meeting on Risk Science and the Policy Interface, Decision Analysis and Risk, on December 7, 2021.

Targeting Adaptation and the 'Stickiness' of Initial Selection: a Simulation Approach. Presented by Dr. Brandon Behlendorf at the Society for Risk Analysis (SRA) 2021 Annual Meeting on Risk Science and the Policy Interface, Decision Analysis and Risk, on December 7, 2021.

Operational Implications: Modeling, Validation and Red Teaming. Presented by Dr. Gary Ackerman and Dr. Jun Zhuang at the Society for Risk Analysis (SRA) 2021 Annual Meeting on Risk Science and the Policy Interface, Decision Analysis and Risk, on December 7, 2021.

Validating adaptive behavior models of adversaries for risk assessment (VABMARA). Presented by Dr. Gary Ackerman, Dr. Brandon Behlendorf, and Douglas Clifford at the Society for Risk Analysis (SRA) 2021 Annual Meeting on Risk Science and the Policy Interface, Decision Analysis and Risk, on December 7, 2021.

Simulations to Assess the Post-COVID-19 Strategic Biological Weapons Risk Landscape. Presented by Jenna LaTourette and Hayley Peterson at the Society for Risk Analysis (SRA) 2021 Annual Meeting on Risk Science and the Policy Interface, Security and Defense, on December 7, 2021.

Spotlight Webinar: Red Teaming the Post-COVID-19 Biological Weapon Threat Landscape. Presented for the Center for the Study of Weapons of Mass Destruction by Dr. Gary Ackerman and Ted Plasse on August 3, 2021.

MadSci Weaponized Information: Lessons Learned from Vignette Wargame. A recording from the Mad Scientist: Weaponized Information Virtual Conference by Dr. Gary Ackerman and Douglas Clifford on July 21, 2020.

SMA CENTCOM Panel Discussion - Black Swan Scenarios. An audio recording from the SMA CENTCOM Speaker Series on Black Swan Scenarios by Dr. Gary Ackerman on March 27, 2020.

 

Publications

Gary A. Ackerman and Douglas Clifford. “Red Teaming and Crisis Preparedness.” Oxford Research Encyclopedia of Politics (2021).

Gary A. Ackerman, Brandon Behlendorf, Douglas Clifford, Hayley Peterson, Jenna LaTourette and Anna Wetzel. "Red Teaming the Post-COVID-19 Biological Weapons Threat Landscape, Final Project Report" (Center for Advanced Red Teaming: Albany, New York, 2021).

Gary A. Ackerman, Douglas Clifford, Anna Wetzel, Jenna LaTourette and Hayley Peterson. “Experimental Red Teaming to Support Integration of Information in Joint Operations.” Prepared for the Strategic Multilayer Assessment, Office of the Secretary of Defense Joint Staff J-39 (University at Albany, SUNY: Albany, NY, 2021).

Gary A. Ackerman and Hayley Peterson. "Red Teaming the Post-COVID-19 Biological Weapons Threat Landscape, Project Overview and Preliminary Report." (Center for Advanced Red Teaming: Albany, New York, 2021).

Gary A. Ackerman, Brandon Behlendorf, Jun Zhuang, Kyle Hunt, Douglas Clifford, Anna Wetzel, Hayley Peterson and Jenna LaTourette. “Validating Adaptive Behavior Models of Adversaries for Risk Assessment (VABMARA) Framework Report.” Prepared for the Center for Accelerating Operational Efficiency, a Department of Homeland Security Center of Excellence (University at Albany, SUNY: Albany, NY, 2020).

Gary A. Ackerman, Anna Wetzel, Douglas Clifford, Hayley Peterson, and Jenna LaTourette. “Red Teaming Great Power Competition in the CENTCOM AOR.” Prepared for the Strategic Multilayer Assessment, Office of the Secretary of Defense Joint Staff J-39 (Center for Advanced Red Teaming: Albany, New York, 2020).