
CQC's new assessment approach - what providers need to know

By Gill Weatherill, Corinne Slingo, Tracey Longfield, Anna Hart, Belinda Dix and Bena Brown


Published 09 February 2018


It's been a long time in the planning, but CQC's new approach to assessing health and social care providers is about to become a reality.  

Is your organisation up-to-speed with what's changing and when?

In this briefing, we look at the headline changes and what they might mean in practice for providers and their ratings.

CQC's new approach - what's changing (and what's not)?

What's staying the same?

Importantly, the central pillars of the CQC's assessment approach - i.e. the 5 key questions (are services 'safe', 'effective', 'caring', 'responsive' and 'well-led') and the ratings scale ('outstanding', 'good', 'requires improvement' and 'inadequate') - are here to stay. 

Similarly, the CQC's civil and criminal enforcement powers are staying the same.

It's the mechanics of how CQC assesses and rates against the 5 key questions that are changing.

Ongoing assessment in, snapshot inspections out

What health and social care providers have been used to is a regulatory system which relies heavily on CQC inspections carried out at a fixed point in time and followed by an inspection report which may contain revised ratings.

Under the new system, however, on-site inspections will be just one of multiple ways in which CQC collects evidence about the quality and safety of care - hence the clear shift in emphasis under the new system from 'inspection' to 'assessment'.

Whilst CQC has said that a provider's previous ratings will no longer be the main driver behind the timing and frequency of assessments, it is not yet clear what the triggers/criteria will be for CQC to decide to carry out an assessment against one or more of the quality statements, although it is likely that level of risk will be a key factor.

A central feature of this new system will be CQC's ability to change a provider's ratings at any time according to their assessment of evidence about quality and safety, without those changes being tied to an inspection in the sense providers have been used to.  However, the nuts and bolts of how ratings changes will work in practice (including how providers will be able to challenge such changes) are still unclear and providers may understandably be apprehensive about this aspect of the new system, especially given how business-critical CQC ratings can be.

Quality statements in, KLOEs out

In place of the current system of Key Lines of Enquiry (KLOEs) and prompts, the CQC's new Single Assessment Framework is built around 'quality statements' (also referred to as 'we' statements), which describe what people should expect a good service to look like.  The quality statements relate to a total of 34 topic areas across the 'safe', 'effective', 'caring', 'responsive', 'well-led' key questions.

To assess how services measure up against these quality statements, CQC will use 6 categories of evidence:

  • people's experiences of health and care services;
  • feedback from staff and leaders;
  • feedback from partners;
  • observation;
  • processes; and
  • outcomes.

CQC's website lists examples of the types of evidence that will fall within each of these categories. Importantly, not all 6 evidence categories will be relevant for every quality statement in every type of service. CQC's website also includes detailed guidance on evidence categories for sector groups, setting out the categories of evidence CQC will usually look at for each of the quality statements as they apply to different types of service (e.g. NHS acute hospital services, mental health services, ambulance services and care homes/supported living services).

In terms of reports produced following assessments carried out under the new framework, CQC has said these will be shorter and simpler than the inspection reports providers are used to, but we will need to wait and see what this looks like in practice.

Scoring system in, ratings characteristics out

Ratings are currently decided by reference to the CQC's 'ratings characteristics' which describe in text form what 'outstanding', 'good', 'requires improvement' and 'inadequate' look like for each of the KLOEs. Like the KLOEs, however, 'ratings characteristics' will be a thing of the past under the new system.

In their place, CQC is introducing a scoring system which will involve using a 4-point scale to assign a score of 1 (evidence shows significant shortfalls), 2 (evidence shows some shortfalls), 3 (evidence shows a good standard) or 4 (evidence shows an exceptional standard) for each of the evidence categories which apply to the particular quality statement being assessed.  These evidence category scores will be combined to give a score for each quality statement, and the quality statement scores will in turn be combined to give a total score for the relevant key question. This score will generate a rating for each of the safe, effective, caring, responsive and well-led key questions, and CQC will then aggregate the key question scores to give an overall service level rating.  A worked example on CQC's website provides more detail about how this scoring system will work, including how CQC will convert the scores into ratings.

Crucially, to have a chance of a 'good' or 'outstanding' rating for a particular key question, providers will need to perform well across all the quality statements that apply. This is to 'make sure any areas of poor quality are not hidden'.  For example, if a key question score is within the 'good' range, but there is a score of only 1 for one or more quality statements, the rating for that key question will be limited to 'requires improvement'.  Similarly, if the key question score is in the 'outstanding' range, but there is a score of 1 or 2 for one or more quality statements, the rating will be limited to 'good'.
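To make the capping rules above concrete, here is a minimal sketch in Python. The 4-point scale and the two capping rules follow CQC's published description; the percentage thresholds used to convert a key question score into a rating are purely illustrative assumptions for this sketch - the actual conversion is set out in CQC's own worked example.

```python
def key_question_rating(quality_statement_scores):
    """Sketch of deriving a key question rating from quality statement scores (1-4).

    The banding thresholds below are illustrative assumptions, not CQC's
    published figures; the capping rules reflect CQC's stated approach.
    """
    total = sum(quality_statement_scores)
    maximum = 4 * len(quality_statement_scores)
    percentage = total / maximum * 100

    # Illustrative banding (assumed thresholds).
    if percentage >= 87:
        rating = "outstanding"
    elif percentage >= 62:
        rating = "good"
    elif percentage >= 39:
        rating = "requires improvement"
    else:
        rating = "inadequate"

    # Capping rule 1: a score of 1 on any quality statement limits
    # an otherwise 'good' key question to 'requires improvement'.
    if rating == "good" and min(quality_statement_scores) == 1:
        rating = "requires improvement"

    # Capping rule 2: a score of 1 or 2 on any quality statement limits
    # an otherwise 'outstanding' key question to 'good'.
    if rating == "outstanding" and min(quality_statement_scores) <= 2:
        rating = "good"

    return rating
```

For example, scores of 4, 4, 4 and 1 across four quality statements total 13 out of 16 - within the illustrative 'good' band - but the single score of 1 caps the key question rating at 'requires improvement'.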

CQC says it will initially only publish the ratings themselves, but in future intends to also publish the scores sitting behind those ratings. This should allow for greater transparency in terms of where a provider sits within the ratings bandings - e.g. for a rating of 'good', the score will indicate whether this is in the upper part of the banding, nearing 'outstanding', or in the lower part, nearer to 'requires improvement'.

When will these changes happen?

At the time of writing, CQC says it is still developing further guidance to flesh out some of the detail of how the new approach will work in practice.

However, CQC's plan is to go live with the new Single Assessment Framework from 21 November, starting in its South region (which comprises Berkshire, Buckinghamshire, Cornwall, Devon, Dorset, Gloucestershire, Hampshire, Kent, Oxfordshire, Somerset, Surrey, Sussex and Wiltshire).  Between 21 November and 4 December CQC plans to undertake a small number of planned assessments with 14 early adopter providers, whilst 'continuing to respond to risk'.  CQC says it will then expand the new assessment approach to all providers based on a risk-informed schedule.

CQC says it will be in touch with providers in other areas of the country to confirm when it will start using the Single Assessment Framework for them, having previously expressed the goal of the new approach being fully rolled out across England by the end of March 2024. 

Impact in practice?

We will need to wait and see how CQC's new assessment approach works in practice to really get a feel for the opportunities and challenges it presents.

Here are some thoughts given what we know at this stage:

Potential opportunities:
  • Potential benefits to providers of a more flexible system compared with being 'stuck' with reports and ratings that reflect a fixed moment in time, potentially several years ago. This could be particularly important for providers in the 'requires improvement' banding who have been waiting a long time to be re-inspected.
  • Increased certainty and transparency for providers in terms of the types of evidence CQC will always look at for particular quality statements in particular types of service.
  • Increased transparency via the scoring system in terms of how a particular rating was arrived at and where within a ratings banding the service/provider sits. This will help to inform decisions on whether to challenge a rating (subject to clarification of how such a challenge might happen) and which specific areas to focus on within the challenge.
  • The new quality statements include some new areas of assessment which some providers may view as an opportunity to demonstrate high performance, for example in relation to use of technology and ESG considerations.

Potential challenges and open questions:
  • Does the new assessment framework provide enough information about what CQC is looking for to enable providers to benchmark themselves against the new quality statements given how much less detailed these are compared with the old KLOEs and prompts?
  • Although CQC says that all the evidence categories are equally weighted, there is more focus under the new framework on 'people's experiences' than before and, given that people tend to be more enthusiastic about complaining than complimenting, checking against other evidence sources will be key to a balanced approach by CQC.
  • Current uncertainty about the mechanism for challenging assessment reports and scores/ratings - the existing factual accuracy check process is very much based on analysis of lengthy reports relating to a fixed-point-in-time inspection. The current system does not allow scope for challenging judgements made by inspectors during the inspection process, and there is no indication that this will change under the new system, which means those frustrations may remain for providers who feel that judgements have been disproportionate or inconsistent.
  • Will the new scoring system make it harder for providers to challenge ratings derived from that scoring system?
  • Will services that are improving and therefore want their ratings to go up get a raw deal under the new system given that CQC's resources tend to be focused on areas of risk?
  • Will anything be included within the new system to support consistency of approach across different services and geographic areas? This is a point that can be frustrating for multi-site providers who experience different approaches in different services and areas.
  • Depending on how quickly the new system is implemented, many providers may need to navigate both the new and old systems for an extended period, which creates an administrative and governance burden and may also deliver different outcomes for similar services.

How we can help

Our experienced team of health and social care regulatory lawyers work with organisations across the independent and public sector to provide advice and support on the full range of issues that CQC regulation brings with it, including the impact of changes to the regulatory landscape.