Engaging with Security Researchers: Embracing a “See Something, Say Something” Culture

Tod Beardsley, Known Exploited Vulnerability Team Lead & Daniel Larson, Coordinated Vulnerability Disclosure Team Lead

In an age where digital systems have an electronic tendril in nearly every aspect of our lives, the role of cybersecurity researchers is more important than ever. These individuals and groups proactively identify weaknesses in software, networks, and hardware, often before malicious actors get a chance to exploit them. Yet even though we’ve collectively developed a set of norms and standards for coordinated vulnerability disclosure, companies, open-source projects, and government agencies sometimes respond to these unsolicited reports with fear, uncertainty, and doubt, rather than engagement, driving away the very allies we all rely on to keep our systems safe.

“See Something, Say Something”

The Department of Homeland Security’s (DHS) “See Something, Say Something” campaign is a cornerstone of public safety. It encourages people to report suspicious activity to the authorities who are best equipped to assess and handle the potential threat. At CISA, we adhere to the very same principle in the online world. Information security researchers act as the digital equivalents of observant citizens, uncovering flaws in systems that could otherwise be exploited by criminals and foreign threat actors.

Based on our experience, when a researcher finds a vulnerability, more often than not, they’ll do what any reasonable person would do: notify the right people to fix the problem. When it’s a bug in widely used software, the owner or producer of the software would be notified and is usually best positioned to address the issue. For this process to work effectively, researchers should feel they can safely report vulnerabilities without fear of reprisal.

For federal agencies, CISA encourages this practice through Binding Operational Directive 20-01, which requires federal agencies to publish both a security contact for each .gov domain and a Vulnerability Disclosure Policy (VDP) covering all of the agency’s internet-accessible systems and services. Further, the agency-crafted VDP must address numerous critical topics, including a commitment not to recommend or pursue legal action against anyone for security research activities that the agency concludes represent a good-faith effort to follow the VDP, and to deem such activities authorized by the agency.
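One widely used mechanism for publishing a security contact is a `security.txt` file (RFC 9116) served at `/.well-known/security.txt` on the domain. A minimal sketch, with illustrative (hypothetical) addresses and URLs:

```
Contact: mailto:security@agency.gov
Expires: 2026-12-31T23:59:59Z
Policy: https://agency.gov/vulnerability-disclosure-policy
Preferred-Languages: en
```

The `Policy` field can point directly at the agency’s published VDP, so a researcher who finds the contact also finds the rules of engagement.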

A Typical Disclosure

What, exactly, should an organization expect for a typical coordinated vulnerability disclosure (CVD)? The optimal, by-the-book journey of a vulnerability notification would go something like this:

  1. Identification and Reporting: The vulnerability is discovered by the researcher and reported through a designated security contact for the affected organization. Reports may be submitted to any available contact or web form found on the affected organization’s public website. In our role as the Federal Government’s central CVD team, we have found that contacting the right people is often the greatest hurdle for researchers.
  2. Acknowledgment: The affected organization acknowledges the receipt of the vulnerability report and provides the researcher an estimated timeline for future communication. The organization might request more information from the researcher to better understand the report or their disclosure timeline. 
  3. Assessment and Validation: The organization then assesses the validity and severity of the reported vulnerability. This may require a discussion with the researcher and in-depth knowledge of how the vulnerability is exploitable. The Common Vulnerability Scoring System (CVSS) and Stakeholder-Specific Vulnerability Categorization (SSVC) are widely accepted frameworks for determining the severity of vulnerabilities.
  4. Remediation: Once validated, the organization fixes or mitigates the vulnerability and thoroughly tests the fix to ensure the vulnerability is resolved without introducing new problems. The researcher is often able and willing to help test and validate the mitigations.
  5. Disclosure: The organization and the researcher agree on a publication timeline. High-level details of the vulnerability, Common Weakness Enumeration (CWE) specification, mitigations, severity ratings, a Common Vulnerabilities and Exposures (CVE) identification number, and researcher acknowledgement are published by the organization and possibly the researcher. This balances transparency with security by informing users, developers, and security professionals about potential risks and encouraging patches and mitigation strategies.
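The assessment step above leans on standardized scoring. As a minimal sketch of how a CVSS v3.1 base score is derived from a vector string (metric weights, equations, and the roundup rule come from the public CVSS v3.1 specification; the function names here are our own):

```python
import math

# Base-metric weights from the CVSS v3.1 specification.
WEIGHTS = {
    "AV": {"N": 0.85, "A": 0.62, "L": 0.55, "P": 0.2},
    "AC": {"L": 0.77, "H": 0.44},
    "PR": {  # privileges-required weight depends on scope
        "U": {"N": 0.85, "L": 0.62, "H": 0.27},
        "C": {"N": 0.85, "L": 0.68, "H": 0.5},
    },
    "UI": {"N": 0.85, "R": 0.62},
    "CIA": {"H": 0.56, "L": 0.22, "N": 0.0},
}

def roundup(x: float) -> float:
    """Ceiling to one decimal place, as defined in the CVSS v3.1 spec."""
    i = round(x * 100000)
    return i / 100000 if i % 10000 == 0 else (math.floor(i / 10000) + 1) / 10

def base_score(vector: str) -> float:
    """Compute the CVSS v3.1 base score from a vector like 'CVSS:3.1/AV:N/...'."""
    m = dict(part.split(":") for part in vector.split("/")[1:])  # skip "CVSS:3.1"
    scope_changed = m["S"] == "C"
    # Impact sub-score from confidentiality/integrity/availability weights.
    iss = 1 - ((1 - WEIGHTS["CIA"][m["C"]]) * (1 - WEIGHTS["CIA"][m["I"]])
               * (1 - WEIGHTS["CIA"][m["A"]]))
    impact = (7.52 * (iss - 0.029) - 3.25 * (iss - 0.02) ** 15) if scope_changed \
        else 6.42 * iss
    expl = (8.22 * WEIGHTS["AV"][m["AV"]] * WEIGHTS["AC"][m["AC"]]
            * WEIGHTS["PR"]["C" if scope_changed else "U"][m["PR"]]
            * WEIGHTS["UI"][m["UI"]])
    if impact <= 0:
        return 0.0
    total = 1.08 * (impact + expl) if scope_changed else impact + expl
    return roundup(min(total, 10))

def severity(score: float) -> str:
    """Map a score to the CVSS qualitative severity rating scale."""
    if score == 0.0:
        return "None"
    if score <= 3.9:
        return "Low"
    if score <= 6.9:
        return "Medium"
    if score <= 8.9:
        return "High"
    return "Critical"
```

For example, a network-exploitable, no-privileges, no-interaction vulnerability with high impact across the board (`CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H`) scores 9.8, Critical. Note that SSVC, by contrast, is a decision-tree methodology rather than a numeric formula, which is why many organizations use the two together.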

Ensuring Clear Communications During a Crisis

When a vulnerability or breach comes to light, involving legal counsel can be both prudent and entirely routine. Staff attorneys help organizations navigate regulatory obligations, potential liabilities, and any contractual requirements. However, while soliciting legal advice is a critical component of any crisis communications plan, it’s also critical to appreciate the context of the situation. How an organization responds publicly, and how it interacts with the security researcher, can shape public perception and the ultimate resolution of the issue.

A few key crisis communication best practices can help maintain credibility:

  1. Acknowledge the issue: Even if all the details aren’t available, make it clear that you are aware of the issue and working toward a resolution. Avoid the perception of being dismissive, excuse-making, or victim-blaming.
  2. Engage researchers constructively: Without the efforts of security researchers, you would likely remain in the dark about the issue in the first place, placing your users, employees, customers, and constituencies at risk.
  3. Provide timely updates: Transparency and regular communication foster trust, and even bad news provides some level of reassurance.
  4. Don’t shoot the messenger: Punishing researchers with legal threats is not productive, as it may backfire and undo the positive perception of work done in addressing the issue. It not only creates an adversarial relationship with the very person seeing something and saying something, but can also alienate the research community and even the general public. Demonstrating cooperation shows commitment to cybersecurity, accountability, and transparency; these are core values we promote in our Secure by Design campaign.

Organizations should view these incidents as opportunities to improve their security posture, showcase their response capabilities, and demonstrate a willingness to learn. Working collaboratively with researchers is not only productive for affected organizations, but it also strengthens relationships with the security community which will certainly pay dividends in the years ahead. Today’s crisis is tomorrow’s case study.

Organized Security Researcher Support

Forward-thinking organizations are already embracing bug bounty programs and formal vulnerability disclosure processes. These initiatives send a clear message: reasonable disclosure is welcome, and it will be handled professionally. Companies like Google, Microsoft, and Amazon have seen tremendous value in these programs, as have government agencies like DHS. These programs allow organizations to roll out patches quickly, before bad actors can take advantage of issues at scale.

Governments, too, can benefit from engaging security researchers. With critical infrastructure at risk, public entities must encourage vulnerability reporting. Establishing a clear and welcoming process for reasonable disclosure ensures that researchers know how to report vulnerabilities and feel confident that their findings will be handled in good faith.

A Culture of Collaboration

To protect our nation’s digital infrastructure, we must adopt a “See Something, Say Something” mindset in cybersecurity. When researchers report vulnerabilities or evidence of breaches, organizations should engage them as partners rather than adversaries. Involving legal counsel may be a necessary step in handling these situations, but the broader response must focus on resolving the issue and maintaining public trust. Fostering a culture of transparency and awareness is essential to ensuring that your organization emerges stronger and more secure from a potential crisis.

Organizations can also take inspiration from CISA, which has committed to coordinated vulnerability disclosure (CVD) and encourages the public to report incidents and vulnerabilities. More information on reporting can be found at cisa.gov/report.

Incidentally, if you’re interested in having a more direct hand in vulnerability disclosure for your own products and services, consider becoming a CVE Numbering Authority. CISA has worked diligently to grow our community of CNAs. There are over 400 CNAs today; this community is global and is made up of many major technology vendors across IT and OT. It’s a pretty cool community to be a part of, and we here at CISA are happy to help get you started on the path towards joining!

By fostering a culture of collaboration among security researchers, companies, open-source projects, and governments alike, we can all bolster our defenses and create a safer digital culture for everyone.