
CMMC Level 2 Gap Analysis: How to Prepare for Your Assessment
Emily Bonnie
Senior Content Marketing Manager
Anna Fitzgerald
Senior Content Marketing Manager
If you've started preparing for CMMC Level 2 certification, you've probably heard the term "gap analysis" more than once. Although it sometimes gets treated like a checkbox exercise, in practice, it's the most consequential step in your entire path to certification. It's where you stop estimating your compliance posture and actually measure it.
A CMMC gap analysis compares your current environment against all 110 requirements in NIST SP 800-171 Rev 2. Done well, it shows you where you stand, how your SPRS score shapes up, and what remediation effort stands between you and certification. Done poorly, or skipped entirely, it means you're making financial and operational decisions based on guesses. That gets expensive fast.
This guide walks through how to conduct a proper Level 2 gap analysis: the steps involved, how long it realistically takes, what some contractors get wrong, and how to use the results to build a defensible path to certification.
What is a CMMC gap analysis?
A CMMC gap analysis is a structured comparison between your current cybersecurity implementation and the 110 requirements in NIST SP 800-171 Rev 2. For each requirement, you determine whether it is fully implemented, partially implemented, or not implemented at all.
The output isn’t just a scorecard. It should produce four concrete things:
First, a clear view of your compliance posture across all 110 controls. Second, a preliminary SPRS score calculated using DoD scoring methodology. Third, a documented list of remediation actions that will eventually become your Plan of Action and Milestones (POA&M). And fourth, a realistic timeline to assessment readiness.
It’s important to understand what a gap analysis is not. It is not your C3PAO assessment. It’s a readiness exercise you conduct internally. Done correctly, it functions as a technical and operational validation of how CUI is actually protected in your environment before an external assessor evaluates it.

When should you conduct a gap analysis?
There are several points in your CMMC timeline where a gap analysis makes sense.
The most important is before you begin remediation. You need to know what you’re fixing before you start spending time and money implementing controls, adding new tools, and investing in training.
It’s also common to run a second gap analysis after initial remediation to validate progress. And if you’re planning a C3PAO assessment, you should conduct a formal readiness review at least six to twelve months beforehand.
With Phase 2 beginning November 10, 2026, many Level 2 contractors should expect third-party certification to become standard for contract awards. For most mid-sized organizations, the timeline from first gap analysis to assessment-ready ranges from three to nine months. Organizations with unclear CUI boundaries or weak documentation often fall on the longer end of that range.
How long does a CMMC gap analysis take?
In a small, well-documented environment with clear CUI boundaries, a disciplined gap analysis can be completed in two to four weeks. Mid-sized contractors with more complex environments should expect four to eight weeks. Multi-site organizations or those with significant documentation gaps should plan for eight to twelve weeks or more.
The variables that extend the timeline most are documentation maturity, system complexity, and stakeholder availability. If key people are hard to pull into evidence review sessions, the process slows. If policies haven't been updated in several years, reconciling them with current operations takes time.
The more organized your environment is before you start, the faster and cleaner the process runs. Time spent here directly reduces risk during your C3PAO assessment.
If you haven’t conducted one yet, now is the time. Here's how to get started:
Step 1: Assemble the right people
A meaningful gap analysis requires direct visibility into how your environment actually operates, not just how it's documented. That means you need someone who understands system configurations, someone who owns your policies and procedures, and someone at a leadership level who can make decisions about remediation priorities and business risk.
In smaller organizations, those roles often overlap. A single IT lead and a compliance coordinator can conduct a solid gap analysis if both have genuine access to the systems and documentation in scope. What matters is that the people involved are looking at real configurations and real operating practices, not summarizing what they think is true.
If you're working with a managed service provider, they should be part of this process from the beginning, particularly if they manage any systems or services that touch CUI.
Step 2: Define your CUI boundary before you score anything
This is the step that most contractors underestimate, and it's the one that causes the most painful surprises during actual assessments.
Before evaluating a single control, you need to define your CUI boundary clearly and specifically. That means identifying which systems, networks, and users process, store, or transmit Controlled Unclassified Information. It means documenting how CUI enters your environment, how it moves internally, and how it exits. It means listing every in-scope asset and understanding which external service providers have any interaction with CUI.
If you're using an enclave strategy to limit scope, this is where you define that enclave precisely, including what keeps CUI contained and what would cause it to cross the boundary.

A poorly defined boundary creates two distinct risks. You may over-scope your environment and significantly inflate remediation costs. Or you may under-scope and have a C3PAO identify systems during assessment that were never evaluated. Most difficult and expensive assessments trace back to scoping errors rather than misunderstood control language.
Everything downstream of this step depends on getting the boundary right.
Step 3: Evaluate all 110 controls methodically
Once your boundary is defined, the actual evaluation begins. Work through the 110 NIST SP 800-171 Rev 2 requirements systematically, and resist the temptation to score controls based on intuition or general familiarity.
Each control should be validated against real evidence: system configurations, operating procedures, log behavior, policy language. That means reviewing actual settings, interviewing the people responsible for specific functions, and confirming that what your documentation describes reflects what your environment does. If your acceptable use policy says users receive security awareness training annually, someone should be able to produce records showing that training actually happened.
For every requirement, document your finding, the evidence reviewed, and any gaps. If you cannot produce defensible evidence for a control, treat it as not fully implemented regardless of your confidence that it's in place. Assessors evaluate objective evidence, not institutional knowledge.
As you work through controls, calculate your preliminary SPRS score in parallel. This gives you a running view of your posture and helps you identify which gaps are carrying the most scoring weight.
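As a sketch, the running evaluation in Step 3 can be tracked with one record per requirement. The field names and the MFA example below are illustrative, not a prescribed format; assessors care about the evidence itself, not how you file it.

```python
from dataclasses import dataclass, field
from enum import Enum

class Status(Enum):
    FULLY_IMPLEMENTED = "fully implemented"
    PARTIALLY_IMPLEMENTED = "partially implemented"
    NOT_IMPLEMENTED = "not implemented"

@dataclass
class ControlFinding:
    control_id: str   # NIST SP 800-171 requirement number, e.g. "3.5.3"
    status: Status
    evidence: list[str] = field(default_factory=list)  # artifacts actually reviewed
    gaps: str = ""    # what remains before the control is fully implemented

# Hypothetical finding: MFA enforced for remote access but not for
# local administrator logins, so the control is only partially met.
finding = ControlFinding(
    control_id="3.5.3",
    status=Status.PARTIALLY_IMPLEMENTED,
    evidence=["IdP MFA policy export", "VPN configuration screenshot"],
    gaps="MFA not enforced for local administrator accounts",
)
```

The point of the structure is the discipline it forces: no status without named evidence, and no gap left undescribed.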
Understanding your SPRS score
The Supplier Performance Risk System scoring model starts at 110 points. For each requirement that is not fully met, you subtract its assigned value. Controls are worth one, three, or five points depending on their criticality to CUI protection.
Five-point controls relate to core protections and carry the highest risk impact. A single unmet five-point control affects your posture significantly more than several unmet one-point controls, so it's worth identifying those early.
While the standalone DFARS 7019 SPRS self-attestation requirement was eliminated in February 2026, SPRS scoring remains embedded in the CMMC framework. For Level 2 certification, organizations must achieve at least 88 points and fully implement every control that is ineligible for POA&M status. Your gap analysis should make your position against that threshold immediately visible.
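The scoring arithmetic is simple enough to sketch in a few lines. The sample weights here are illustrative; the authoritative per-requirement values are defined in the DoD Assessment Methodology.

```python
# Preliminary SPRS score: start at 110 and subtract the assigned
# weight (1, 3, or 5 points) for each requirement not fully met.
def sprs_score(unmet_weights: list[int]) -> int:
    return 110 - sum(unmet_weights)

# Hypothetical posture: two unmet 5-point controls, one 3-point,
# and four 1-point controls.
score = sprs_score([5, 5, 3, 1, 1, 1, 1])  # 110 - 17 = 93
```

Note how the two five-point gaps account for more lost points than the other five gaps combined, which is why identifying high-weight controls early matters.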

Building a POA&M that's actually useful
Every control that isn't fully implemented should generate a Plan of Action and Milestones entry. A useful POA&M goes beyond a task list. For each gap, it should describe the specific finding, its root cause, the planned remediation approach, the owner responsible for completion, and a realistic target date.
Some controls cannot remain open at the time of your C3PAO assessment. Multi-factor authentication and FIPS-validated encryption are the most common examples. If these are gaps in your current environment, they should be treated as immediate remediation priorities, not items to carry forward on a timeline.
A well-structured POA&M is what turns your gap analysis findings into an operational roadmap. It also demonstrates to assessors that your organization has a clear-eyed understanding of its gaps and a credible plan to address them.
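A minimal POA&M entry capturing those five elements might look like the sketch below. The control, names, and dates are hypothetical examples, not a required schema; the example deliberately uses a gap that is eligible to remain open, unlike MFA or FIPS-validated encryption.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class PoamEntry:
    control_id: str    # NIST SP 800-171 requirement, e.g. "3.6.3"
    finding: str       # the specific gap observed
    root_cause: str    # why the gap exists
    remediation: str   # the planned fix
    owner: str         # person accountable for closure
    target_date: date  # realistic completion date

entry = PoamEntry(
    control_id="3.6.3",
    finding="Incident response plan exists but has never been exercised",
    root_cause="No scheduled tabletop exercises on the security calendar",
    remediation="Run a tabletop exercise and document lessons learned",
    owner="Security Lead",
    target_date=date(2026, 6, 30),
)
```

Requiring a root cause and a named owner for every entry is what separates an operational roadmap from a task list that quietly goes stale.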
Recommended reading
Understanding the Plan of Action and Milestones (POA&M): A Practical Guide for CMMC and FedRAMP Compliance
The most common CMMC gap analysis findings
Across readiness reviews and assessments, the same patterns appear with enough regularity that they're worth naming upfront.
Multi-factor authentication is frequently deployed inconsistently, covering some systems but not all in-scope assets. Encryption is often present but not validated to FIPS standards, which is a different problem than having no encryption at all. CUI boundaries are poorly documented more often than not. Logging is enabled on individual systems but not centrally monitored or reviewed. Incident response plans exist on paper but have never been tested, which means no one actually knows whether they work. And policies frequently describe ideal processes that don't match what people do day to day.
Most of these gaps aren't caused by a lack of tools. They're caused by a lack of alignment between technology, documentation, and daily practice. The gap analysis is what makes that misalignment visible.
The most expensive mistake contractors can make
The most expensive mistake in a CMMC gap analysis is treating it as a documentation exercise instead of a systems validation exercise. Updating policies without validating configurations creates artificial compliance that collapses during assessment.
A credible gap analysis aligns three things: configuration, documentation, and evidence. If those three elements don't match, the gap will surface eventually. It's better to surface it yourself.
What to do after your gap analysis
Once the gap analysis is complete, the work shifts from assessment to remediation and documentation.
You finalize your preliminary SPRS score and use it to prioritize remediation by both risk and certification impact. You update your System Security Plan to accurately reflect implemented controls rather than aspirational ones. You assign POA&M items to owners with clear timelines and begin tracking progress on an ongoing basis.
Most organizations benefit from an external readiness review several months before their scheduled C3PAO assessment. This reduces the risk of late-stage findings that require rescheduling. It also gives you an outside perspective on whether your evidence packages will hold up under assessor scrutiny.
When you're ready to schedule your C3PAO, do it early. As Phase 2 approaches, assessor availability will become a real constraint.
One framing worth keeping in mind: certification is triennial, but compliance is continuous. The gap analysis marks the beginning of an ongoing program, not the completion of a project.
Turn your CMMC gap analysis into a continuous compliance program
Managing a gap analysis across 110 controls in spreadsheets is workable for an initial assessment, but it creates a maintenance problem as your environment changes. New users are added, systems are updated, and policies evolve. A static document doesn't reflect those changes unless someone actively maintains it.
Secureframe Defense turns your gap analysis into a live compliance dashboard. It continuously evaluates your environment against NIST SP 800-171 controls, collects evidence automatically, and generates SSP and POA&M documentation based on your actual configuration rather than manual inputs. The result is a live compliance posture rather than a point-in-time snapshot that goes stale.
FAQs
Can I conduct a CMMC gap analysis internally, or do I need outside help?
You can conduct one internally if you have people with direct technical and compliance visibility into your environment. The risk with a purely internal review is that familiarity with your own systems can create blind spots. Many organizations do an internal first pass and then bring in an outside advisor to validate findings before moving to remediation.
What does a gap analysis typically cost?
It varies significantly based on scope and whether you're using internal staff, a consultant, or automated tooling. A focused internal effort may cost primarily in staff time. Consultant-led assessments for mid-sized contractors typically range from $15,000 to $50,000 or more depending on complexity. Automated platforms can reduce ongoing cost significantly once the initial setup is complete.
What if my preliminary SPRS score is very low?
A low score is diagnostic information, not a verdict. It tells you exactly where the remediation work needs to happen and how much of it there is. Organizations with scores well below the 88-point conditional certification threshold often find that a concentrated remediation effort on high-weight controls moves their posture substantially. The key is having a clear, prioritized plan rather than trying to address everything at once.
Do I need a gap analysis if I've already submitted an SPRS score?
A self-assessed SPRS score and a structured gap analysis are different things. The self-assessment may reflect your best judgment about your posture. A gap analysis validates that judgment with evidence. If you're heading toward a C3PAO assessment, the difference matters.
How often should I repeat the gap analysis?
At minimum, you should conduct a formal review annually or whenever significant changes occur in your environment: new systems, new personnel, changes in how CUI is handled, or updates to NIST guidance. Treating compliance as a continuous program rather than a periodic event is what keeps your posture from drifting between assessments.

Emily Bonnie
Senior Content Marketing Manager
Emily Bonnie is a seasoned digital marketing strategist with over ten years of experience creating content that attracts, engages, and converts for leading SaaS companies. At Secureframe, she helps demystify complex governance, risk, and compliance (GRC) topics, turning technical frameworks and regulations into accessible, actionable guidance. Her work aims to empower organizations of all sizes to strengthen their security posture, streamline compliance, and build lasting trust with customers.

Anna Fitzgerald
Senior Content Marketing Manager
Anna Fitzgerald is a digital and product marketing professional with nearly a decade of experience delivering high-quality content across highly regulated and technical industries, including healthcare, web development, and cybersecurity compliance. At Secureframe, she specializes in translating complex regulatory frameworks—such as CMMC, FedRAMP, NIST, and SOC 2—into practical resources that help organizations of all sizes and maturity levels meet evolving compliance requirements and improve their overall risk management strategy.