The National Vulnerability Database (NVD) issued a new status update on March 19, attempting to clarify the current state of its vulnerability processing pipeline. NIST, which maintains the database, says it has resumed processing new CVEs at the same rate it maintained before last year’s slowdown, but with vulnerability volumes surging, that’s no longer enough.
We are currently processing incoming CVEs at roughly the rate we had sustained prior to the processing slowdown in spring and early summer of 2024. However, CVE submissions increased 32 percent in 2024, and that prior processing rate is no longer sufficient to keep up with incoming submissions. As a result, the backlog is still growing.
We anticipate that the rate of submissions will continue to increase in 2025. The fact that vulnerabilities are increasing means that the NVD is more important than ever in protecting our nation’s infrastructure. However, it also points to increasing challenges ahead.
To address these challenges, we are working to increase efficiency by improving our internal processes, and we are exploring the use of machine learning to automate certain processing tasks.
The update hinted at internal modernization efforts, including “improving internal processes” and “exploring the use of machine learning” to automate parts of the CVE processing workflow. While those ambitions are notable, the NVD provided no detail on timelines, tooling, or which specific tasks might be automated.
In the meantime, the backlog is not just growing — it’s also getting harder to measure.
Security researchers tracking the NVD’s output have noticed a shift in how CVEs are categorized. While the community previously tracked the number of CVEs marked as “Awaiting Analysis” as a proxy for backlog, the NVD has recently started moving large numbers of unprocessed CVEs into a different bucket: “Undergoing Analysis.”
Vulnerability historian Brian Martin commented on the trend this week:
You cannot say "NVD backlog" anymore, because NVD shifted their tactic as I have pointed out on my feed. There are two buckets that must be observed: awaiting analysis (the buckets we have been using for a year) and undergoing analysis (the bucket I pointed out a week+ ago they started stacking numbers in).
We're in a new era of NVD and/or ANALYGENCE being deceptive, I think. Time to be more specific on what "backlog" means, and more critically, what each bucket means.
The distinction matters. Many CVEs that are “undergoing analysis” have sat in that state for over a year with no meaningful metadata attached — no CVSS score, no CWE classification, no CPEs. Despite their status, they remain functionally unanalyzed.
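Tracking the backlog accurately now means counting both buckets together. Below is a minimal sketch, using the `vulnStatus` field the NVD 2.0 API exposes on each CVE record and illustrative sample data, that treats every pending state as backlog rather than only “Awaiting Analysis”:

```python
from collections import Counter

# Statuses a CVE record can carry in the NVD 2.0 API's "vulnStatus" field.
# A record in any of these three states has no completed NVD analysis.
PENDING_STATUSES = {"Received", "Awaiting Analysis", "Undergoing Analysis"}

def backlog_summary(records):
    """Tally records by status and report how many remain unanalyzed.

    `records` is a list of dicts shaped like NVD API entries, each
    carrying a "vulnStatus" key.
    """
    counts = Counter(r["vulnStatus"] for r in records)
    pending = sum(counts[s] for s in PENDING_STATUSES)
    return counts, pending

# Illustrative sample only -- real data would be paged from the NVD API
# at https://services.nvd.nist.gov/rest/json/cves/2.0
sample = [
    {"cveId": "CVE-2025-0001", "vulnStatus": "Awaiting Analysis"},
    {"cveId": "CVE-2025-0002", "vulnStatus": "Undergoing Analysis"},
    {"cveId": "CVE-2025-0003", "vulnStatus": "Analyzed"},
    {"cveId": "CVE-2025-0004", "vulnStatus": "Undergoing Analysis"},
]
counts, pending = backlog_summary(sample)
print(counts["Undergoing Analysis"], pending)  # prints: 2 3
```

Counting only “Awaiting Analysis” in this sample would report a backlog of one; counting all pending states reports three.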
Even the most recent vulnerabilities are stalling. As of March 25, security researcher Andrey Lukashenkov pointed out that none of the CVEs published the week of March 17 had been marked as Analyzed or Modified. Every single entry remains in a pending state — either “Received,” “Awaiting Analysis,” or “Undergoing Analysis.”
"Many researchers like Tom Alrich, Brian Martin, and Jeroen Braak have been pointing out last week that NVD is in a deep trouble right now," Lukashenkov commented on LinkedIn. "It is not that it worked perfectly this past year, but it seems like all the meaningful activity there has stopped completely."
He shared a chart that breaks down CVE status by the week each CVE was published, a view he considers a better proxy for overall NVD performance than a chart of raw backlog activity:
The dark blue areas show CVEs that were analyzed and later modified. The thin yellow line represents CVEs “undergoing analysis,” many of which have been in that state for more than a year. The orange and purple areas show the growing backlog of unanalyzed CVEs.
The volume of CVEs published so far in 2025 is a dramatic increase over the 5,371 CVEs published during the same timeframe in 2024 — and it shows just how outpaced the NVD’s current processing rate has become.
Security Teams Must Diversify Vulnerability Data Sources
While some organizations still rely on the NVD as a primary vulnerability data source, security leaders are increasingly urging teams to diversify their sources and workflows.
“Don’t rely on a single source. Start pulling data directly from CVE.org, CISA’s KEV catalog, vendor advisories, and community intel like OSV.dev or ExploitDB,” Rich Osolease commented on the current situation.
"Automate your enrichment workflows and push your vendors to keep you informed. The most innovative organizations already diversify their feeds, prioritize by risk, and don’t wait for NVD to tell them what’s important.”
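The workflow Osolease describes — pull from multiple feeds, then rank by risk signals instead of waiting on NVD enrichment — can be sketched in a few lines. This is a minimal illustration with hypothetical CVE IDs; in practice, the exploited set would be built from CISA’s Known Exploited Vulnerabilities JSON feed, and the advisory hits from queries to the OSV.dev API:

```python
def prioritize(cve_ids, kev_set, osv_hits):
    """Rank CVEs for triage: known-exploited first, then those with
    corroborating advisory data, then everything else."""
    def rank(cve):
        if cve in kev_set:
            return 0  # on CISA's KEV list: actively exploited, patch first
        if cve in osv_hits:
            return 1  # advisory data exists even without NVD enrichment
        return 2      # no corroborating intel yet
    # sorted() is stable, so ties keep their original feed order
    return sorted(cve_ids, key=rank)

# Hypothetical findings and feed contents, for illustration only
findings = ["CVE-2025-1111", "CVE-2025-2222", "CVE-2025-3333"]
kev = {"CVE-2025-3333"}   # pretend this one appears in the KEV catalog
osv = {"CVE-2025-2222"}   # pretend OSV.dev returned an advisory for this one
print(prioritize(findings, kev, osv))
# ['CVE-2025-3333', 'CVE-2025-2222', 'CVE-2025-1111']
```

The point is the ordering logic, not the fetch code: a CVE with no CVSS score from the NVD can still be triaged urgently if an independent feed says it is being exploited.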
Until the NVD regains its footing and provides more transparency around how it’s categorizing and processing CVEs, security teams will need to look elsewhere to stay ahead of emerging threats.
"While the slowdown at NVD is concerning, it's not the end of the road," Osolease said. "It's just a wake-up call to strengthen your vulnerability intelligence game."
The most effective security programs have already adapted to this new reality. As the NVD struggles to keep pace with the rapid growth in vulnerabilities, organizations must diversify their vulnerability intelligence sources, implement risk-based prioritization, and build more automated and resilient vulnerability management processes. The days of relying solely on the NVD for vulnerability intelligence are clearly over.