
Data and intelligence

Our approach to the use of education provider performance data

In our previous education quality assurance model, we did not routinely use structured data (internal / external) or intelligence from other organisations in our decision making. One of the pillars of our current quality assurance model is using data and intelligence to inform our regulatory decision making.

Using data and intelligence allows us to be:

  • Proactive – where data and intelligence identify risks, we can trigger some form of engagement with education providers;
  • Risk-based – we have an evidence-based understanding of risks for education providers; and
  • Proportionate – we use risk profiling to undertake bespoke and right-touch regulatory interventions.


Our approach functions as follows:

  • we proactively source a range of key data points, which cover most HCPC-approved education providers;
  • where data points are not available, education providers can establish a regular supply of these data points (see the section below for further exploration of this area);
  • we use data when assessing education providers or programmes through approvals, focused review, and performance review;
  • within these assessments, data is not used as the final word, but as part of a quality picture – we ask education providers to consider and reflect on data points in their submissions;
  • we supply this information to our professional expert partners, with contextual information such as benchmarks, to help inform their assessment, including any specific areas from data which we need to follow up through quality activities; and
  • outside of assessments, when data points change, we can trigger interventions with education providers where we consider it necessary to inform our view of the quality of an education provider’s provision.


We use the following data areas to consider education provider performance:

  • Numbers of learners
  • Learner non-continuation
  • Outcomes for those who complete programmes
  • Learner satisfaction


The use of education provider performance data has added value to our assessments. We ask education providers to reflect on data points, and our partners to consider data through their assessments, including comparison to benchmarks and trend analysis for each data point.

Data helps us to explore specific areas with education providers through quality activities within our assessments, and to take assurance where performance metrics are at or above benchmarks.

 

Education providers not included in external data supplies

Where risk assessment allows, we will lengthen the period between performance review engagements up to a maximum of five years. To remain confident in education provider performance, we rely on a regular supply of data and intelligence to help us understand how providers are performing outside of the periods when we engage with them directly.

We recognise that not all HCPC-approved education providers are included in the external data returns we have established, linked to the data areas noted above. Where a regular supply of data points has not been established, the maximum length of time we will allow between performance review engagements is two years. This is so we can continue to understand risks on an ongoing basis when data is not available.

We discussed education providers' development of regular data reporting to the HCPC in the detailed findings in appendix 2.

 

Engagement with other bodies

We have become a more active partner in the sector over this two-year period, with the aim of understanding the sector better so we can contextualise our assessments.

We have established a professional body / HCPC education forum group to share information to support and assure high-quality education and training in the HCPC-regulated professions. Twenty-one professional bodies are members of this group, and we have good attendance at regular meetings, with a standard agenda that covers developments and challenges facing education provision for the professions we regulate.

We have shared information with, and received information from, professional bodies and commissioning organisations, which has informed our assessments. Normally, this enables us to contextualise assessments (for example, where a body provides information about shortages of practice-based learning in a nation or region), and to ensure we are informed by evidence about the situation when making judgements against our standards.

We have established formal information sharing arrangements with two professional bodies, and are working with several others, to enable more structured and consistent information sharing through our assessments.

 

Year in registration survey

We run a yearly survey to seek the views of those who have been HCPC-registered for a year. This survey focuses on respondents' education and training programme, how it prepared them to practise, and their first year in employment. We integrate insights from the results into our education quality assurance activities, and use them to inform focus areas for our Policy and Standards, and Professionalism and Upstream Regulation teams. For example, we used findings linked to interprofessional education and service user involvement in the academic setting to inform the questions we asked of education providers through their performance review portfolio submissions.

We most recently undertook this exercise in the summer of 2023. Over 1,200 individuals responded to this survey, across all professions and nations / regions.

We ask a set of questions focused on:

  • preparation for practice;
  • the quality of the education and training undertaken, focused on interprofessional education, programme and staff interactions, academic learning, practice-based learning, and service user involvement in the delivery of education; and
  • preceptorship and in-employment support, focused on availability, length, and quality.


In the most recent survey, 'agree' responses significantly outweighed 'disagree' responses for all questions, which is a positive finding. Results for education and training preparing learners for practice were particularly positive, with 8 per cent or fewer of respondents disagreeing with each statement.

Across the last three years, too many respondents noted they had no interprofessional education within their academic learning (which links to SET 4.9), and that service user involvement was not visible / embedded within their programmes (linking to SET 3.7).

We have developed what we ask of education providers through their performance review portfolios in line with these responses. This links to the issues reported in the performance review section of this report, meaning there is still work to be done with education providers in these two areas.

Respondents were overwhelmingly likely to recommend their programme to a friend or family member, and in all three years the words 'supportive' and 'challenging' were the most commonly used to describe programmes.
