What Commissioners Actually Measure

The KPIs that decide whether your contract gets renewed — and how to report them

Most care providers know that KPIs matter. They appear in every contract, every quarterly review, every performance notice. But when asked which KPIs their commissioner actually weights most heavily, many providers draw a blank.

That gap — between knowing KPIs exist and understanding what commissioners do with them — is where contracts quietly erode. Poor reporting does not just risk a performance notice mid-contract. It signals weakness when re-procurement arrives. Commissioners who have spent three years chasing data from you will not score your next bid generously.

This article breaks down the KPI categories commissioners across NHS trusts, local authorities, and integrated care boards (ICBs) consistently measure. More importantly, it shows how to turn live operational reporting into the evidence that wins your next tender.


The six KPI categories commissioners care about

Commissioner frameworks vary by authority and service type, but nearly all care contracts — whether domiciliary care, supported living, children’s services, or patient transport — track performance across six categories:

  1. Quality and compliance
  2. Safeguarding
  3. Staffing and workforce
  4. Complaints and feedback
  5. Outcomes and service user experience
  6. Responsiveness and operational delivery

Each category carries specific metrics. Let’s walk through them.


Quality and compliance metrics

Commissioners use quality KPIs to verify you are meeting regulatory standards and contractual specifications. These are the baseline — fail here and nothing else matters.

What they track:

  • CQC rating and inspection outcomes — your overall rating, domain-level scores, and any requirement notices or enforcement actions. Commissioners monitor these closely between formal reviews, not just when a new inspection report lands.
  • Internal audit results — whether you run a structured audit programme, what it covers, and crucially whether audits lead to documented improvements. A schedule of audits with no follow-through is worse than no audits at all.
  • Care plan review compliance — the percentage of service users with up-to-date, reviewed care plans. Most contracts specify review intervals (typically every 4-12 weeks depending on service complexity). Commissioners measure the percentage completed on time, not just the number completed.
  • Medication management — error rates, near-miss reporting, and the ratio of medication incidents to total administrations. In domiciliary care, this is one of the highest-risk KPI areas.

What good reporting looks like:

Do not wait for quarterly reviews to compile this data. Maintain a rolling quality dashboard that tracks CQC domain alignment, audit completion rates, and care plan review percentages month by month. When re-tender time comes, you want 36 months of trend data, not a scramble through filing cabinets.


Safeguarding metrics

Safeguarding KPIs sit in a category of their own because commissioners view them as non-negotiable indicators of organisational culture. They are not just measuring incidents — they are measuring your reporting discipline.

What they track:

  • Safeguarding incidents reported — the number of referrals made to the local authority safeguarding team, broken down by category (abuse type, neglect, self-neglect, exploitation).
  • Section 42 enquiries — how many incidents progressed to formal enquiry, and your role in the investigation process.
  • Investigation outcomes — whether investigations were substantiated, what actions you took, and whether similar incidents recurred.
  • Training compliance — the percentage of staff with current safeguarding training at the appropriate level (Level 1, 2, or 3 depending on role). Commissioners expect 95%+ compliance and will flag any downward trend.
  • DBS check currency — percentage of staff with in-date enhanced DBS checks, including update service registrations.

Low reporting is not good reporting

Commissioners know that zero safeguarding incidents in a 200-person service is statistically implausible. Under-reporting raises more concern than a provider who reports appropriately and demonstrates learning. Frame your data around reporting culture and response quality, not raw numbers.


Staffing and workforce metrics

Workforce KPIs tell commissioners whether your service is sustainable. A provider delivering good outcomes today but haemorrhaging staff is a re-procurement risk tomorrow.

What they track:

  • Vacancy rates — the percentage of funded posts that are unfilled, reported by role type. Commissioners benchmark against sector averages (Skills for Care publishes these annually).
  • Agency usage — the percentage of total hours delivered by agency or bank staff versus permanent employees. High agency dependency signals instability, particularly in supported living where relationship continuity matters.
  • Staff continuity — the percentage of service users who see the same core team regularly. In domiciliary care, commissioners increasingly measure this as a named metric (sometimes called “consistency of carer” or “familiar faces” percentage).
  • Supervision and appraisal compliance — the percentage of staff receiving supervision at contracted frequency (typically monthly for frontline, quarterly for managers) and annual appraisals completed on time.
  • Training matrix completion — mandatory training compliance across all required modules, not just safeguarding. This includes moving and handling, medication, Mental Capacity Act, and any service-specific requirements.

What commissioners do with this data:

They trend it. A vacancy rate of 12% in one quarter is unremarkable. A vacancy rate climbing from 8% to 15% over three quarters triggers a conversation. Commissioners compare your trajectory against the sector and against other providers delivering similar services in their area.
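If you automate your dashboard, this trending logic is simple to encode. The sketch below flags a sustained quarter-on-quarter climb; the five-point rise threshold is illustrative, not a commissioner-defined figure, so set it to whatever your contract monitoring experience suggests:

```python
def flag_vacancy_trend(quarterly_rates, rise_threshold=5.0):
    """Flag a sustained rise in vacancy rate across consecutive quarters.

    quarterly_rates: vacancy percentages, oldest first (e.g. [8, 11, 15]).
    rise_threshold: total percentage-point rise that should trigger review.
    """
    if len(quarterly_rates) < 2:
        return False
    # Every quarter at or above the last one, and the overall climb is material.
    rising = all(b >= a for a, b in zip(quarterly_rates, quarterly_rates[1:]))
    total_rise = quarterly_rates[-1] - quarterly_rates[0]
    return rising and total_rise >= rise_threshold

# A steady 12% does not flag; a climb from 8% to 15% does.
print(flag_vacancy_trend([12, 12, 12]))  # False
print(flag_vacancy_trend([8, 11, 15]))   # True
```

Running the same check the commissioner will run, before they run it, means you arrive at the quarterly review with the explanation and the recruitment plan already written.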


Complaints and feedback metrics

Complaints KPIs reveal how your organisation handles dissatisfaction — and whether you learn from it.

What they track:

  • Complaint volumes — total formal complaints received per period, often normalised per service user or per 1,000 hours delivered.
  • Response times — percentage of complaints acknowledged within the contractual timeframe (typically 3 working days) and fully responded to within the resolution window (typically 20 working days).
  • Resolution outcomes — whether complaints were upheld, partially upheld, or not upheld, and what remedial actions were taken.
  • Escalation rates — how many complaints escalated to the Local Government and Social Care Ombudsman (LGSCO) or the Parliamentary and Health Service Ombudsman (PHSO).
  • Trend analysis — whether complaints cluster around specific themes, locations, times, or staff members. Commissioners expect you to present thematic analysis, not just raw numbers.
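The normalisation and thematic analysis above are both easy to automate. A minimal sketch, assuming each complaint record carries a `theme` field (the field name and figures are hypothetical):

```python
from collections import Counter

def complaints_per_1000_hours(complaints, hours_delivered):
    """Normalise complaint volume so quarters of different sizes compare fairly."""
    return round(len(complaints) * 1000 / hours_delivered, 2)

def theme_breakdown(complaints):
    """Count complaints by theme so reports show clusters, not just totals."""
    return Counter(c["theme"] for c in complaints)

# Hypothetical quarter: 14 complaints across 52,000 delivered hours.
quarter = ([{"theme": "missed call"}] * 6
           + [{"theme": "staff attitude"}] * 5
           + [{"theme": "billing"}] * 3)
print(complaints_per_1000_hours(quarter, 52_000))  # 0.27
print(theme_breakdown(quarter).most_common(1))     # [('missed call', 6)]
```

Reporting "0.27 complaints per 1,000 hours, dominated by missed-call themes" gives the commissioner something to compare and act on; reporting "14 complaints" does not.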

Compliments count too

Track and report positive feedback with the same rigour as complaints. Commissioners reviewing your performance want the full picture. A provider reporting 14 complaints and 67 compliments in a quarter tells a different story from one reporting 14 complaints with no compliment data. Include compliments in your quarterly reports and code them by theme — they become tender evidence later.


Outcomes and service user experience

Outcomes KPIs are where commissioners assess whether your service actually makes a difference. These are increasingly weighted in both contract monitoring and tender evaluation.

What they track:

  • Service user satisfaction — results from structured satisfaction surveys, typically annual or six-monthly. Commissioners look at response rates as well as scores — a 95% satisfaction rate from a 20% response rate is less convincing than 88% from a 75% response rate.
  • Progress against personal goals — the percentage of service users with documented personal outcomes and the percentage making measurable progress. In supported living, this might track independence milestones. In children’s services, educational or behavioural goals.
  • Health outcomes — hospital admission rates, A&E attendance, pressure ulcer incidence, falls rates, and weight/nutrition monitoring where relevant.
  • Quality of life measures — some commissioners use validated tools like ASCOT (Adult Social Care Outcomes Toolkit) or EQ-5D. Others use locally designed frameworks. Know which one your commissioner expects.
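The response-rate point can be made concrete. One conservative way to read a survey, treating non-respondents as unknown rather than satisfied, is the share of all service users who definitively reported satisfaction — simply the score multiplied by the response rate. This reading is one illustration, not a commissioner-mandated formula:

```python
def definite_satisfaction_floor(satisfaction_pct, response_rate_pct):
    """Share of ALL service users known to be satisfied: a conservative floor
    that treats non-respondents as unknown rather than satisfied."""
    return satisfaction_pct * response_rate_pct / 100

# 95% satisfaction from a 20% response rate vs 88% from 75%.
print(definite_satisfaction_floor(95, 20))  # 19.0 — only a fifth of users heard from
print(definite_satisfaction_floor(88, 75))  # 66.0 — a far stronger evidence base
```

Seen this way, the "weaker" 88% headline rests on evidence from two-thirds of your service users, while the 95% headline rests on under a fifth — which is exactly how an experienced evaluator will read it.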

Where providers fall short:

Most providers collect some outcomes data but fail to aggregate and trend it. Individual care plans may document progress brilliantly, but if you cannot produce a service-level summary showing that 73% of service users met or exceeded their personal goals this year, that data is invisible to commissioners — and invisible at re-tender.


Responsiveness and operational delivery

These KPIs measure whether you deliver the contracted service reliably, on time, and at the right volume.

What they track:

  • Referral-to-start times — the elapsed time between receiving a new referral and commencing the service. Contract targets vary but 48-72 hours for domiciliary care and 5-10 working days for supported living are typical.
  • Missed and late calls — in domiciliary care, this is often the single most scrutinised KPI. Commissioners track the percentage of calls delivered within the contracted time window (typically ±15 or 30 minutes) and any calls missed entirely. Electronic call monitoring (ECM) data feeds directly into this.
  • Hospital discharge targets — for providers involved in discharge-to-assess or intermediate care pathways, commissioners measure your ability to accept and commence packages within the agreed timeframe (often same-day or next-day).
  • Contract utilisation — the percentage of contracted hours or placements actually delivered. Under-utilisation can signal operational problems; over-utilisation without agreement signals poor boundary management.
  • Reporting timeliness — whether you submit your KPI reports on time. This meta-KPI matters more than providers realise. Late reporting is one of the most common triggers for enhanced monitoring.

Turning operational data into commissioner-ready reporting

Collecting KPI data is necessary but not sufficient. Commissioners want reporting that is structured, contextualised, and honest.

Structure your reports consistently. Use the same template every quarter. Commissioners who monitor 20+ providers appreciate consistency — it lets them compare quickly. Include a summary dashboard on page one, detailed metrics in the body, and an improvement actions section at the end.

Contextualise every number. A missed call rate of 2.3% means nothing without context. Is that against a target of 1%? Is it up or down from last quarter? How does it compare to the previous year? What caused the variance and what are you doing about it? Raw data without narrative is lazy reporting.

Be honest about underperformance. Commissioners respect providers who identify their own weaknesses and present credible improvement plans. They do not respect providers who spin every number as positive. If your agency usage spiked to 22% because three staff left in the same month, say so — and show the recruitment plan.

Automate where possible. If you are still compiling KPI reports from paper records and spreadsheets, you are spending time on data entry that should go into analysis. Care management platforms (e.g. Access, Nourish, CarePlanner, Birdie) can generate most of these metrics automatically. The investment pays for itself in reporting quality.


The evidence loop: how today’s KPIs win tomorrow’s tenders

Here is where KPI reporting connects directly to tender success. Every metric you report to your commissioner today is potential evidence for your next bid.

When a contract comes up for re-procurement, the evaluation panel wants to see that you can deliver the service. The strongest evidence is not what you promise — it is what you have already done. Thirty-six months of KPI data showing consistent performance, honest reporting, and genuine improvement is more persuasive than any amount of bid writing polish.

Build the loop:

  1. Capture KPIs monthly in a structured format, even if the commissioner only requires quarterly reporting.
  2. Store them in your evidence library — tagged by KPI category, service type, and commissioner area.
  3. Track contract milestones in a shared tracker so you know exactly when each contract enters its re-procurement window. Record win/loss reasons and renewal milestones against each opportunity.
  4. Review KPI trends before bid writing starts. When you know a re-tender is 12 months away, pull your KPI history and identify the strongest performance stories. That is your evidence base.
  5. Reference specific data in your bids. Instead of writing “we have low staff turnover,” write “our staff turnover across this contract has averaged 14% annually over three years, against the sector average of approximately 28% (Skills for Care).”

If you want a deeper framework for structuring this, our guide on defending your contract at re-tender covers the strategic approach to incumbent advantage.

Start the evidence loop now, not at re-tender

The most common mistake we see is providers waiting until re-procurement is announced before gathering evidence. By then, it is too late to fill gaps. If you have 18 months of strong KPI data but a 6-month hole where reporting lapsed, that gap will cost you marks. Consistent capture from day one is the only approach that works.


What weak reporting signals at re-tender

Commissioners talk to each other. The contract manager who received your quarterly reports for three years will brief the procurement team evaluating your re-bid. Here is what weak reporting signals:

  • Inconsistent submissions — quarterly reports that arrive late, use different formats, or skip sections suggest an organisation that treats compliance as an afterthought.
  • Data without analysis — tables of numbers with no narrative, no trend commentary, and no improvement actions suggest you collect data because you have to, not because you use it.
  • Defensiveness about underperformance — reports that explain away every negative metric without acknowledging genuine issues suggest a provider that lacks self-awareness.
  • No outcomes data — output metrics (hours delivered, calls completed) without outcomes data (service user progress, satisfaction, health improvements) suggest a provider focused on volume, not quality.
  • No evidence of learning — the same issues appearing quarter after quarter without visible improvement activity suggest a provider that has stopped investing in service development.

At re-tender, evaluators will look at your bid and mentally cross-reference it against what the contract manager experienced. If your bid promises “robust governance and continuous improvement” but your quarterly reports told a different story, the scores will reflect reality, not aspiration.


Making KPIs work for you

KPI reporting is not a compliance burden — it is the mechanism by which you build your re-tender case in real time. Every quarterly report is a draft chapter of your next bid.

Start treating it that way. Structure your data, contextualise your numbers, capture everything in your evidence library, and track your contract renewal milestones so nothing catches you off guard.

The providers who win re-tenders are not always the ones with the best KPIs. They are the ones who reported honestly, improved consistently, and turned three years of operational data into a compelling performance narrative.

Need help turning your KPI data into tender-winning evidence?

We help care providers structure their performance reporting and build evidence libraries that translate directly into stronger bids. If your next re-tender is on the horizon, let’s make sure your data tells the right story.

Book a free call