Key Takeaways
- Collecting high‑quality tips from the public hinges on four steps: recognizing concerning behavior, knowing it should be reported, knowing how to report it, and providing sufficient detail; each step is a potential point of failure.
- An overload of tip‑reporting channels (school‑based, local, state, and law‑enforcement systems) confuses the public and hinders fusion centers’ ability to aggregate information effectively.
- Direct, face‑to‑face community engagement—such as road‑shows and local outreach—is seen as more effective than passive PSA campaigns for teaching the public what to look for and how to report it.
- Many civilians remain unaware of what fusion centers do, which undermines public trust and reduces willingness to share information.
- Ongoing research, both pre‑ and post‑implementation, is essential to evaluate the usability, accessibility, and question design of SAR platforms across jurisdictions.
- A national experiment comparing web forms and chatbots, while varying the information solicited, aims to empirically identify which prompts yield the most accurate tips with the fewest errors.
- AI’s potential extends beyond the initial intake stage to triage, validation, and information sharing among response teams; digital intake is only the “tip of the iceberg.”
- Adoption of AI tools will be cautious, requiring attention to civil‑liberty safeguards (28 CFR § 23), legislative review, demonstrated value through case studies, and education of both analysts and policymakers.
Introduction and Framework Overview
On May 12, the National Counterterrorism Innovation, Technology, and Education (NCITE) center hosted a webinar that introduced a lifecycle framework for integrating next‑generation technologies—such as artificial intelligence (AI), spatial computing, and autonomous systems—into the entire suspicious activity reporting (SAR) process. The presenters emphasized that these tools are intended to augment, not replace, human analysts’ judgment. Panelists included Tin Nguyen (Director of Research Translation & Technology Transition), Erin Kearns (Director of Law Enforcement Partnerships), Joel Elson (Director of IS&T Research Initiatives), Roman Asaro (Crime Analyst, NE Information Analysis Center), and Deepak Kukade (Lead Intel Analyst, MN Fusion Center). The discussion centered on how emerging technologies can improve each phase of SAR while preserving accountability and civil‑liberty protections.
Challenges in Sourcing High‑Quality Public Information
Tin Nguyen opened the conversation by outlining the four‑step chain that determines whether a tip becomes useful intelligence: the public must (1) recognize concerning behavior, (2) understand that it warrants reporting, (3) know the appropriate reporting mechanism, and (4) supply enough detail for analysts to act. He noted that “there’s a lot in the chain of events that can go wrong.” Erin Kearns added two common psychological barriers: reporters’ desire to remain anonymous and their reluctance to “waste law enforcement’s time.” Together, these factors create multiple points where valuable information can be lost before it ever reaches a fusion center.
Oversaturation of Tip‑Reporting Mechanisms in Minnesota
Deepak Kukade described how Minnesota’s recent rise in targeted violence and school‑related mass shootings has prompted a proliferation of distinct tip‑reporting systems—school‑run hotlines, district portals, local government apps, law‑enforcement websites, and state‑level platforms. While each channel is marketed as a safe, confidential avenue for juveniles and the public, the sheer number of options creates confusion. Kukade observed that this “wide field of options makes it difficult for the public to know where to turn” and consequently hampers fusion centers’ ability to gather a complete picture of threats.
Impact of Reporting‑Channel Overload on Public Behavior and Fusion Centers
Kukade went further, suggesting that the abundance of reporting avenues may have unintentionally reduced overall tip volume. Although official messaging stresses confidentiality, security, and anonymity, the fragmentation can lead to reporting fatigue or uncertainty about which channel is “correct.” As a result, some potential reporters may opt not to submit information at all, undermining the very force‑multiplier effect the systems aim to achieve. Roman Asaro echoed this concern, noting that the public’s lack of awareness about which entity receives a tip diminishes trust and willingness to engage.
The Need for Direct Community Engagement
To overcome the limitations of passive outreach, Kukade advocated for a return to “old school” methods: fusion centers and tip‑reporting agencies should conduct road‑shows, attend community events, and engage directly with residents—especially youths—so they can learn in real time what behaviors are concerning and how to report them effectively. He argued that billboards, TV spots, and digital ads struggle to convey the nuanced judgment required (e.g., distinguishing between innocuous odd behavior and genuine threats). Face‑to‑face interaction allows agencies to tailor messaging, answer questions, and build rapport, thereby improving both the quality and quantity of tips.
Public Unawareness of Fusion Centers and Trust Issues
Both Kukade and Asaro highlighted a fundamental obstacle: the general public largely does not know what fusion centers are, what they do, or how tip reporting fits into broader intelligence‑sharing efforts. Asaro stated, “That is a huge limitation we have, and with that comes public trust.” Without a clear understanding of the endpoint of their reports, citizens may doubt whether their information will be used appropriately or fear misuse, which further suppresses reporting. Building transparency about fusion‑center missions, data‑handling practices, and oversight mechanisms is therefore essential to foster confidence.
Ongoing Research Before and After Tool Implementation
Erin Kearns explained that her team has been evaluating SAR platforms nationwide, examining factors such as ease of access, user‑interface design, and the specificity of required fields. Their review revealed striking variability—even neighboring states like Iowa and Minnesota employ markedly different data‑collection approaches. Kearns stressed that any technology integration must be preceded by rigorous research to ensure tools are usable and that they do not inadvertently introduce bias or error. Post‑deployment evaluation is equally important to measure real‑world impact on tip quality, analyst workload, and investigative outcomes.
National Experiment Comparing Web Forms and Chatbots
To empirically determine which solicitation methods yield the most accurate tips, Kearns and Joel Elson are designing a national experiment. Participants will be presented with a realistic suspicious scenario and then randomly assigned to either a traditional web‑form reporting tool or a chatbot interface. Within each condition, researchers will vary the specific questions asked (e.g., open‑ended narratives vs. structured checkboxes) to assess how different prompts influence the completeness and accuracy of the information provided. The goal is to identify question sets that maximize signal while minimizing noise, thereby informing best‑practice design for future SAR interfaces.
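The two‑factor design described above (interface type crossed with question format) can be sketched as a simple reproducible random‑assignment routine. The condition names and question‑set labels below are hypothetical placeholders for illustration, not the study’s actual materials.

```python
import random

# Hypothetical experimental factors: interface condition x question-set
# variant. Labels are illustrative only, not the study's real materials.
INTERFACES = ["web_form", "chatbot"]
QUESTION_SETS = ["open_narrative", "structured_checkboxes"]

def assign_condition(participant_id: int, seed: int = 42) -> dict:
    """Randomly assign one interface and one question set to a participant.

    Seeding with the participant ID keeps each assignment reproducible,
    which matters for auditing a national-scale experiment.
    """
    rng = random.Random(seed + participant_id)
    return {
        "participant": participant_id,
        "interface": rng.choice(INTERFACES),
        "question_set": rng.choice(QUESTION_SETS),
    }

# Example: assign the first five participants to conditions.
assignments = [assign_condition(pid) for pid in range(5)]
```

Crossing the factors this way lets researchers estimate the effect of the interface and of the question design separately, as well as any interaction between them.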
AI’s Role Across the Full SAR Lifecycle
Elson cautioned that focusing solely on the intake stage undersells AI’s potential. He noted that AI can assist in triage—prioritizing incoming tips by risk level—aid in validation by cross‑referencing tips with existing datasets, and facilitate information sharing among response teams through automated summarization and alert generation. “Digital intake is just the tip of the iceberg,” he said, urging stakeholders to consider how machine‑learning models, natural‑language processing, and predictive analytics could enhance every step from initial receipt to final action.
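As a toy illustration of the triage step Elson describes, prioritizing incoming tips by risk before an analyst reviews them, one could score free‑text tips against a weighted indicator list. The keywords, weights, and scoring logic below are invented for illustration only and bear no relation to any fusion center’s actual criteria; a real system would rely on vetted models, 28 CFR § 23 compliance, and human review.

```python
# Toy triage scorer: ranks tip reports by a naive keyword-weight heuristic.
# All terms and weights are hypothetical illustrations, not real criteria.
RISK_TERMS = {"weapon": 3, "explosive": 3, "threat": 3, "school": 2}

def score_tip(text: str) -> int:
    """Sum the weights of risk terms appearing in the tip text."""
    lowered = text.lower()
    return sum(weight for term, weight in RISK_TERMS.items() if term in lowered)

def triage(tips: list[str]) -> list[tuple[int, str]]:
    """Return (score, tip) pairs sorted highest-risk first for analyst review."""
    return sorted(((score_tip(t), t) for t in tips), reverse=True)

tips = [
    "Someone mentioned bringing a weapon to the school event",
    "A neighbor's dog keeps barking at night",
]
ranked = triage(tips)
```

Even this crude sketch shows the point of machine triage: the queue an analyst sees is ordered by estimated risk rather than arrival time, so the human judgment the panelists insisted on is applied where it matters most first.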
Cautious Adoption Path: Civil Liberties, Legislation, and Education
Adoption of AI‑enhanced SAR tools will be deliberate, according to Asaro and Kukade. Most fusion centers currently lack integrated AI capabilities, and any deployment must comply with civil‑liberty safeguards codified in 28 CFR § 23, which governs criminal intelligence systems. Demonstrating tangible benefits through pilot studies and case analyses will likely be necessary before legislative bodies authorize broader use. Kukade emphasized that success also depends on educating both analysts and policymakers about the technology’s capabilities, limitations, and appropriate safeguards, ensuring that new tools are embedded within existing policies rather than imposed atop them.
Conclusion and Disclaimer
The webinar underscored that while next‑generation technologies hold promise for making the SAR process more efficient and effective, their implementation must be grounded in rigorous research, clear public outreach, and robust protections for privacy and civil liberties. By addressing the chain‑breakpoints in public reporting, reducing channel overload, engaging communities directly, and carefully vetting AI tools across the entire lifecycle, fusion centers can enhance their ability to turn tips into actionable intelligence without eroding public trust.
Disclaimer: The views and conclusions contained in this webinar are those of the authors and should not be interpreted as necessarily representing the official policies or views, either expressed or implied, of the U.S. Department of Homeland Security, the University of Nebraska, or guest‑affiliated institutions.

