We are excited to announce the release of VRT 1.14. This release adds a new vulnerability category, Data Bias Vulnerabilities, expanding our commitment to helping customers use human ingenuity to secure AI, and get value from it, quickly and confidently.
In December 2023, we released our first major update to the VRT, giving our customers and hackers a shared understanding of how the most likely emerging LLM-related vulnerabilities are defined and how they should be prioritized for reward and remediation. The new AI Data Bias vuln types expand on that work, focusing on mitigating the risk of AI perpetuating social harm through bias and discrimination. This aligns with government direction, including Executive Order 14110 and the EU Artificial Intelligence Act.
By adding AI Data Bias vulnerability types to the VRT, we empower hackers to hunt for specific vulns and create targeted proofs of concept (PoCs), and we help engagement owners with LLM-related assets design scopes and rewards that produce the best outcomes for AI safety.
With these AI security-related updates to the VRT (and more to come), and with our experience working with AI leaders like OpenAI, Anthropic, Google, the U.S. Department of Defense’s Chief Digital and Artificial Intelligence Office, and the Office of the National Cyber Director, the Bugcrowd Platform is positioned as the leading option for meeting the challenges of AI risk in your organization.
New “AI Data Bias” category:
Additions to existing categories:
Removed from existing category:
Contributions needed!
This update reflects our continued commitment to recognizing these attack vectors within the VRT, and it is far from the last. The VRT and these categories will evolve over time as hackers, Bugcrowd Application Security Engineers (ASEs), and customers actively participate in the process. If you would like to contribute to the VRT, Issues and Pull Requests are most welcome!
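If you want to explore the taxonomy programmatically before opening an Issue or a Pull Request, here is a minimal sketch in Python. It assumes the layout of the public repo (bugcrowd/vulnerability-rating-taxonomy), where vulnerability-rating-taxonomy.json exposes a top-level "content" list of nodes carrying "id", "name", "type", an optional "priority", and "children"; check the repo itself for the authoritative schema.

```python
import json

# Minimal sketch: walk a local clone of the VRT and print every entry.
# Assumption: the JSON layout of the public repo, where
# vulnerability-rating-taxonomy.json holds a top-level "content" list of
# nodes with "id", "name", "type", an optional "priority", and "children".
with open("vulnerability-rating-taxonomy.json") as f:
    vrt = json.load(f)

def walk(nodes, depth=0):
    # Print each node indented by its depth in the taxonomy tree.
    # A null or missing priority means the priority varies by context.
    for node in nodes:
        priority = node.get("priority") or "varies"
        print(f'{"  " * depth}{node["name"]} [{node["id"]}] priority: {priority}')
        walk(node.get("children", []), depth + 1)

walk(vrt["content"])
```

A traversal like this is a quick way to see where a proposed entry would sit in the tree, and what priority its siblings carry, before drafting a Pull Request.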
If you found this useful or have any questions, let’s keep the dialogue going! Tweet me at twitter.com/codingo.