National strategy: analysis of responses to call for evidence

OFFA and the Higher Education Funding Council for England (HEFCE) are working with a range of stakeholders, including higher education providers, students and other organisations, to develop a national strategy for access and student success. As part of this, in December 2012 and March 2013 we called for evidence about effective approaches and practices in widening access and ensuring success across the student lifecycle.

The call for evidence is now closed. The evidence has been analysed with a view to identifying best practice and gaining insight into the work under way in the sector.

Analysis of responses

By Ceri Nursaw, Nursaw Associates, October 2013

Number of submissions received

Eighty-four organisations submitted 249 case studies, representing the full breadth of the sector and its partner organisations (duplicate submissions were counted only once). This included 51 per cent of the 149 universities and colleges that had an access agreement in 2012-13.

Only two of the 25 further education colleges (FECs) and other organisations with access agreements submitted information. Seventy-four of the 84 organisations (88 per cent) that submitted information were higher education institutions (HEIs); 50 HEIs did not submit information.

How submissions were assessed

Case studies were assessed on a range of criteria:

Specific case studies were chosen for inclusion in the national strategy using the criteria above, with the aim of providing a wide range of new examples of organisational activity to the sector.

How evaluation in submissions was assessed

The evaluation included in case studies was rated on the following scale: excellent, acceptable, limited in scope, or none/poor.

Of the 249 case studies, 48 (19 per cent) were of the highest quality and demonstrated a robust approach to evaluation. Of these, 16 were submitted by Russell Group institutions. Over half of all submissions (141) were rated excellent or acceptable.

Twenty case studies (8 per cent) noted that their evaluation systems were in place but had yet to produce results. OFFA and HEFCE’s messages regarding the importance of evaluation seem to have been well received, with many institutions ‘just starting’ the process of introducing robust evaluation systems.

Forty case studies (16 per cent) did not include any evaluation; the majority of these consisted of descriptions of activity and the ways in which participants were targeted. These case studies were submitted by 19 organisations (23 per cent).

Methods of evaluation in submissions

During the analysis of responses, attempts were made to categorise how often organisations reviewed their activity. This was not always made clear in the information submitted. It is difficult to draw definitive conclusions, but it would appear that in some cases the evaluation submitted by organisations was a one-off, without regular review mechanisms in place.

The information submitted varied greatly. There was much variation in the quality and depth of evaluation between institutions, and even between case studies from different parts of the same institution (most often larger institutions). This suggests that greater support might be needed to help institutions standardise their evaluation methodologies.

Many organisations focused solely on statistical data (for example, numbers progressing to higher education) or attitudinal information (for example, responses such as “university could be for me”). Only 28 case studies (11 per cent) combined the two. Attitudinal information is important, but it becomes more powerful when combined with statistical data about behavioural change: for example, recording the number of students who report a positive attitude towards university becomes a more robust evaluation measure when set against the number who actually progress to university.

The questions asked in institutions’ attitudinal surveys and questionnaires were very similar. The use of standardised questions across the sector could be encouraged to help create a national picture.

Most institutions undertook their own evaluation (only seven case studies cited external agencies). This suggests that sharing best practice could have a significant effect across the sector.

Collaborations between local authorities and higher education providers resulted in more robust evaluation, as these case studies involved detailed tracking of individuals and their progression through education. The data recorded in these tracking databases is very rich and provides a relatively straightforward mechanism by which institutions can evaluate the impact of different activities.

Some universities mentioned establishing an institution-wide framework for evaluation.

Activity included in submissions

Institutions provided examples of activity covering the breadth of the student lifecycle, from outreach through to admission and employability. While it is recognised that the information was provided voluntarily, and it would be unwise to extrapolate too much from it, the analysis of institutional responses suggests some interesting areas for further work and consideration.

As the table below shows, most activity focused on outreach work targeted at young people: 37 per cent of examples focused on outreach work targeted at post-16 year olds and 31 per cent focused on outreach targeted at Years 9-11. Eighteen per cent of examples focused on induction and retention, and 17 per cent on recruitment. Limited numbers of examples focused on employability (4 per cent) or raising attainment (5 per cent). No examples relating to postgraduate study were given.

Focus of activity described | Number of submissions | Percentage of submissions received
Post-16 year olds | 91 | 37%
Years 9-11 | 77 | 31%
Years 7-9 | 33 | 13%
Primary schools | 16 | 6%
Collaborative programmes with other HEIs | 16 | 6%
Equality and diversity | 17 | 7%
Young people in care or care leavers (often listed as a target group) | 12 | 5%
Employability | 10 | 4%
Student success | 17 | 7%
Retention and induction | 42 | 18%
Admission programmes (e.g. compact schemes or preparatory programmes) | 28 | 12%
Recruitment | 44 | 17%
Subject-based activities (primarily GCSE mathematics) | 12 | 5%
Adults over 21 | 29 | 11%
Postgraduate study | 0 | 0%

Evaluation of activity included in submissions

Twenty-nine examples described activity with mature students (adults aged 21 or over), yet very few of these demonstrated high-quality evaluation. Instead, their evaluation tended to focus on progression to particular institutions and limited satisfaction surveys.

The majority of the subject-based activities designed to raise achievement in examination grades focused on GCSE mathematics. It was clear that these programmes were, in the main, having a positive effect on grades.

The vast majority of admission-based programmes (typically compact schemes) were cited by Russell Group institutions. However, during the analysis of responses, it became clear that other universities are increasingly adopting compact and admissions programmes to support entry to higher education.

Twenty-four examples of activity (10 per cent) claimed to have a national reach. These examples were cited by the Open University and Russell Group institutions, reflecting their national focus.

Areas for possible future development/investigation

There were some gaps in the coverage of the call for evidence (for example, the evidence around some subject areas and some types of specialist institution). OFFA and HEFCE may wish to target these gaps in future calls for evidence.

A number of themes that may help inform future work in the sector can be drawn from the calls for evidence. These themes suggest that, in their ongoing approach to gathering evidence, HEFCE and OFFA could encourage: