According to Ulf Lindqvist, a Senior Technical Director in the Computer Science Laboratory at SRI International, the issue facing most organizations isn’t a lack of data about CVEs – it’s too much data. That was a concern he expressed during our latest episode of AMA with Brian Contos, where I spoke with Ulf and Bill Crowell, former Deputy Director of the NSA. We discussed the challenges organizations face in running effective vulnerability management programs.
Another issue? The grading system for CVE severity is severely lacking, as Bill Crowell puts it. So when organizations have multiple tools reporting vulnerabilities in independent silos, plus multiple sources of intelligence on whether those vulnerabilities are being exploited, deciding which ones to focus remediation and mitigation efforts on first can be overwhelming. All of this data is useful, but consolidating it, correlating it, and then feeding it into some kind of system for prioritization continues to be a challenge.
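To make that consolidate-correlate-prioritize flow concrete, here is a minimal, hypothetical Python sketch. It is not Sevco’s implementation, and every field name, source, and score weight is an illustrative assumption: it merges duplicate findings reported by different tools for the same asset and CVE, then ranks them using severity plus exploitation intelligence and business context.

```python
from dataclasses import dataclass

# Hypothetical sketch of consolidating scanner output, correlating it to assets,
# and feeding it into a simple prioritization step. Field names, sources, and
# weights are illustrative assumptions, not any product's schema.

@dataclass(frozen=True)
class Finding:
    asset_id: str   # the asset the scanner associates with the finding
    cve_id: str     # e.g. "CVE-2024-0001"
    source: str     # which tool (silo) reported it
    cvss: float     # base severity as reported by that tool

def consolidate(findings):
    """Merge duplicate reports of the same CVE on the same asset across tools."""
    merged = {}
    for f in findings:
        merged.setdefault((f.asset_id, f.cve_id), []).append(f)
    return merged

def prioritize(merged, exploited_cves, business_critical_assets):
    """Rank each (asset, CVE) pair by severity, exploitation intel, and business context."""
    ranked = []
    for (asset_id, cve_id), reports in merged.items():
        score = max(r.cvss for r in reports)        # worst-case severity across tools
        if cve_id in exploited_cves:                # known-exploited intel (e.g. a KEV-style feed)
            score += 5
        if asset_id in business_critical_assets:    # business-impact context
            score += 3
        ranked.append((score, asset_id, cve_id, sorted({r.source for r in reports})))
    return sorted(ranked, reverse=True)

if __name__ == "__main__":
    findings = [
        Finding("web-01", "CVE-2024-0001", "scanner_a", 7.5),
        Finding("web-01", "CVE-2024-0001", "scanner_b", 7.5),  # same issue seen by two tools
        Finding("db-02",  "CVE-2024-0002", "scanner_a", 9.8),
    ]
    for row in prioritize(consolidate(findings), {"CVE-2024-0001"}, {"web-01"}):
        print(row)
```

Even in this toy version, the hard parts the guests describe are visible: the merge step only works if every tool agrees on what an asset is, and the scoring step is only as good as the exploitation and business context you can feed it.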
Hear the best practices Bill and Ulf recommend organizations follow to tackle these challenges in this latest AMA – from applying AI where it can help, to surfacing parseable business context, to processes for remediation validation. And reach out to Sevco to learn how Sevco’s Exposure Management platform can help your organization consolidate these disparate sources of vulnerabilities, correlate them to the assets they affect through the asset inventory that forms the foundation of Sevco’s capabilities, and layer both threat and asset intelligence on top, giving security teams all the data they need on exploitation and business impact for comprehensive prioritization and remediation – all in a single pane of glass.