
Programme Evaluation

Ref/ ACSST/OPS/SOP/ProgEval100215

SOP – Programme Evaluation

Issue

Programme Evaluation is a process by which the content, design and delivery of Pamoja Education (PJE) courses are critically examined, both quantitatively and qualitatively, using key data points collected during the yearly educational cycle (from course opening in September through to the following August). The outcome of Programme Evaluation is a report that describes the findings and conclusions and identifies key actions and strategies for programme improvement and enhancement. These actions address issues found to detract from course quality, identify potential enhancements to course content, design and delivery, and promote improvements to operations that will best serve the students and schools enrolled in PJE courses.

 

Rationale

The Programme Evaluation process considers both quantitative and qualitative data. The process of Programme Evaluation has the following steps that will be considered in more detail below:

  1. Data Collection
  2. Data Summary
  3. Data Analysis
  4. Action
  5. Reporting

Data analysis takes place for individual data points and in the context of other data. The goal of the analysis is to give a holistic view of how PJE courses are delivered, the effectiveness of PJE courses in retaining students, the academic performance of students in PJE courses and how PJE courses are perceived by teachers, schools and students. Analysis also uses data collected in previous academic cycles, with the aim of identifying trends over time in student retention and performance, as well as the efficacy of specific changes made to courses and procedures.
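As a simple illustration of the trend analysis described above, the following is a minimal Python sketch, with invented placeholder figures rather than actual PJE data, of how a single data point (cohort retention) can be compared across academic cycles:

```python
# Minimal sketch with invented figures: track one data point
# (cohort retention) across academic cycles to surface a trend.
retention_by_cycle = {"2012-13": 0.86, "2013-14": 0.89, "2014-15": 0.91}

cycles = sorted(retention_by_cycle)
for prev, curr in zip(cycles, cycles[1:]):
    delta = retention_by_cycle[curr] - retention_by_cycle[prev]
    print(f"{curr}: {retention_by_cycle[curr]:.2f} ({delta:+.2f} vs {prev})")
```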

The analysis of the data collected during the academic cycle will lead to a series of conclusions, actions and a final Programme Evaluation report.

 

Data Points

There are two key types of data points in Programme Evaluation. These are quantitative data collected as Business Intelligence during the academic cycle and qualitative data collected by means of reports and surveys from PJE stakeholders.

Quantitative Data

  • Student Profiles
  • Enrolment Data
  • User Behaviours
  • Drop Reasons Data
  • Student Outcomes
  • User Satisfaction

Qualitative Data

  • Student Surveys
  • Teacher Surveys
  • Site Based Coordinator (SBC) Surveys
  • Academic Services Report (including Heads of Department, Head of Academic Services, Curriculum Advisor and Faculty Advisor)
  • Course Development Report
  • School Services Report
  • Business Development Report

 

 

Data Collection

  • Quantitative Data

Business Intelligence (BI) data is collected during the academic cycle, either on a weekly basis or as soon as it becomes available. These data are monitored regularly by PJE staff, with action taken as needed. This monitoring is not directly part of the Programme Evaluation process, although the PJE department in question will record any immediate action taken on BI data and include this in the relevant report for Programme Evaluation. Three weeks after courses close in June, a final data report will be available for each of the data points listed above, apart from final IB grades, which will be available in July. These data reports will record final numbers and, where data was reported on a regular basis, will include a track of the data over the academic cycle.
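For illustration only, the sketch below shows one way the weekly BI extracts could be rolled up into a final data report that keeps the week-by-week track; the file layout and the column names (week, data_point, value) are assumptions, not the actual PJE BI schema:

```python
# Illustrative sketch: roll weekly BI extracts (CSV snapshots) into a
# final end-of-cycle report. File naming and columns are assumptions.
from pathlib import Path
import pandas as pd

def build_final_report(bi_dir: str) -> pd.DataFrame:
    """Concatenate weekly snapshots and report the final value per data point."""
    weekly = pd.concat(
        (pd.read_csv(p) for p in sorted(Path(bi_dir).glob("week_*.csv"))),
        ignore_index=True,
    )
    # Final numbers are the last recorded value for each data point;
    # the full `weekly` frame retains the track over the cycle.
    return weekly.sort_values("week").groupby("data_point").last()

if __name__ == "__main__":
    print(build_final_report("bi_extracts"))
```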

  • Qualitative Data

This data is collected at specific points during the academic cycle. The data collected may include quantitative elements, but there are open questions and the issues considered may require a narrative response. The data takes two forms, surveys and reports:

  • Surveys are conducted with Site Based Coordinators (SBCs), teachers and students, and are aggregated and reported on by the appointed external programme evaluator (at the time of writing, Susan Lowes of Columbia University).
  • Feedback on SBC, student and teacher survey reports will be required from HoDs, FA, CA, HoAS, SST and Dev.
  • Reports are written by members of the different PJE departments responsible for course development and delivery:
    • Academic Services: Heads of Department write a report each semester on the courses they coordinate and their teaching teams. These are shared with the Curriculum Advisor and Head of Academic Services. The Curriculum Advisor aggregates these reports into a final summary HoD report in June. The Academic Services team as a group also writes a report on the academic cycle for the year, submitted in June, which will include commentary on the aggregated HoD reports.
    • Course Development Team: The development team writes a report based on key issues arising during the academic cycle. This will report on course development, course maintenance, the Learning Management System, other key operational systems and any technical issues arising. This report will also be important in setting the course enhancement agenda that may arise from the Programme Evaluation process.
    • School Services Report: The School Services Team (SST) plays a key role in the delivery of PJE courses and follows the life cycle of students from marketing and enrolment through to final outcome. The input of the SST is a key element in the Programme Evaluation process, in particular when considering the most effective and efficient procedures available to PJE.
    • Business Development (BD) Team: The Business Development team has direct experience of potential and actual PJE students, so a report on issues that relate to student retention and academic performance is important.

| Data Point                  | Conducted  | Reporting Deadline | Author                     |
|-----------------------------|------------|--------------------|----------------------------|
| SBC Survey                  | Jan/Feb    | March              | External Evaluator         |
| Student Survey              | April      | June               | External Evaluator         |
| Teacher Survey              | April      | July               | External Evaluator         |
| HoD Report (Sem 1 & Sem 2)  | Dec & June | June               | Curriculum Advisor         |
| Academic Report             | April/May  | July               | HoAS (with CA and FA)      |
| Course Development Report   | April/May  | July               | Head of Course Development |
| School Services Report      | April/May  | July               | HoSS                       |
| Business Development Report | April/May  | July               | HoBD                       |

 

Data Summary

A key part of the Programme Evaluation process is taking the data points and summarising them in a form in which analysis can take place. Each major quantitative data point will have key measures associated with it, and qualitative data will have summary reports as defined by the table in the appendix. Most of the quantitative data points will be selected from the Business Intelligence data collected throughout the year; the IB outcomes will be collected at the release of IB results in July. Note that a separate document deals with the release of IB data (see PJE IB Academic Results Analysis of Outcomes Phase 1 & 2 040215).

For the focus of the summary for each data point, the timing of the summary and the measures to be used, refer to the appendix at the end of this SOP.
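As a concrete example of a key measure, the sketch below computes a per-course retention rate from enrolment records; the column names (course, status) and the figures are hypothetical, not the actual BI schema:

```python
# Hypothetical sketch of one key measure: per-course retention rate.
import pandas as pd

def retention_rate(enrolments: pd.DataFrame) -> pd.Series:
    """Share of enrolled students marked complete, per course."""
    completed = enrolments["status"].eq("complete")
    return completed.groupby(enrolments["course"]).mean().round(2)

enrolments = pd.DataFrame({
    "course": ["Economics", "Economics", "Psychology", "Psychology"],
    "status": ["complete", "dropped", "complete", "complete"],
})
print(retention_rate(enrolments))  # Economics 0.50, Psychology 1.00
```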

 

Data Analysis

Data analysis begins once summary data points are available from Business Intelligence and as surveys are published in June. The process continues with a meeting with Susan Lowes in July, once the IB results for the current cohort have been issued and summarised. Data analysis will include the following actions:

  • Review of key data points from Business Intelligence (June)
  • Review of user behaviour; SBC, student and teacher surveys; and academic, development and BD reports with Susan Lowes (July).
  • Review of IB results and cohort retention rates (July).
  • HoD Feedback on IB results (July)
  • Publication of initial analysis for review by Academic Services, Course and Business Development (August)

The analysis of the data will take the following forms:

  • Comparison of individual data points in a time series from previous academic years (where available).
  • Consideration of the change in a data point during the course of an academic year (e.g. engagement rating)
  • Analysis of data points against each other (e.g. TG7 v IB results, IB results v Retention Rates, Satisfaction Ratings v TER; see the sketch after this list)
  • Any patterns of data that are unexpected or contradict perception/other data.
  • Any patterns of data that confirm perception/other data.
  • Any data points that are not available but would enhance future data collection.
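As a sketch of the third form of analysis listed above, the snippet below compares two data points against each other (a teacher grade at TG7 versus the final IB grade); the per-student figures are invented for illustration:

```python
# Invented figures: does the teacher grade at TG7 anticipate the final
# IB grade? A correlation and a mean signed gap give a first answer.
import pandas as pd

grades = pd.DataFrame({
    "tg7":      [6, 5, 7, 4, 6, 3],   # teacher grade at TG7
    "ib_final": [6, 4, 7, 4, 5, 3],   # final IB grade
})

print(round(grades["tg7"].corr(grades["ib_final"]), 2))            # correlation
print(round((grades["tg7"] - grades["ib_final"]).mean(), 2))       # mean signed gap
```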

The initial analysis of the data will propose a number of conclusions that will be reviewed and approved by Academic Services, Course and Business Development. These conclusions and the analysis that supports them will form the body of the Interim Programme Evaluation report which will be in draft form no later than XX date.

 

Action

Initial Programme Evaluation Analysis will also produce a number of proposed action points for discussion. These action points will be responses to the conclusions identified from data analysis and will focus on each of the key reporting departments: Academic Services, School Services, Course Development and Business Development. The action points are to address urgent issues identified during the Programme Evaluation analysis and to set the agenda for programme improvements and enhancements for the coming year. A formal Programme Evaluation meeting is held in October to finalise the action points to be published in the Interim Programme Evaluation Report v1.0. These action points will have timelines, and their implementation will be evaluated during subsequent Programme Evaluation cycles.
Reporting

The timeline for the production of the Programme Evaluation report is as follows:

 

| Report Stage                              | Deadline       |
|-------------------------------------------|----------------|
| Initial Programme Evaluation Analysis     | August 2015    |
| Interim Programme Evaluation Report v0.1  | September 2015 |
| Interim Programme Evaluation Report v1.0  | October 2015   |
| Programme Evaluation Report v1.1          | November 2015  |
| Programme Evaluation Summary Report v1.1  | November 2015  |

 

The Interim Programme Evaluation report v0.1 will contain the following sections:

  • Introduction

This chapter gives an overview of the aims and methodology of Programme Evaluation:

    • Purpose
    • Data Type
    • Source
    • Collection Method

  • Programme Evaluation Data Sets

This chapter will give an overview of the data from each of the following:

    • Student Profile Data
    • Enrolment Data (including retention rates)
    • User Activity
    • User Outcomes
    • Survey Data
    • Reports

  • Analysis

This section will reflect to a large extent the initial analysis published in August and will have at least the following sections; others may be added as a result of the analysis of the data:

    • Introduction
    • Student Learning
    • Student Retention
    • Course Delivery
    • Interactions
    • Time Management and Study Skills
    • Red Alerts

  • Next Steps

This chapter will focus on the actions to be taken, as described in the Action section above:

    • Business Development and School Services
    • Academic Services
    • Course Development
    • Summary

Appendix

Each data point below is listed with the focus of its summary, the timing, the owner and the measures to be used.

Student Profiles

  • Focus: Global distribution of PJE students; student circumstances (SIS flags). A year-on-year graph/bar chart of percentages for the above is also required.
  • Timing: (1) Course Opening Y2 (30 Sept 14); (2) End of Drop Period Y1 (30 Sept 14); (3) End of Course Y2 (10 June 2015); (4) End of Course Y1 (10 June 2015).
  • Owner: Curriculum Advisor
  • Measures: Raw student numbers; percentages out of total students.

Enrolment Data

  • Focus: Y1 Retention; Y2 Retention; Diaspora; Cohort Retention. Raw numbers will be available for Y1 End of Drop, Y1 Transfer, Y1 Complete, Y2 Roll Over, Y2 Transfer and Y2 Complete. A year-on-year graph/bar chart is required for Y1 End of Drop, Y1 Complete, Y2 Roll Over and Y2 Complete.
  • Timing: Y2 & Cohort Retention, Y1 Retention, Diaspora, raw data and graphs of enrolments – all 10 June 2015.
  • Owner: Curriculum Advisor
  • Measures: All PJE students for the M2015 cohort (Y2 and Cohort Retention); each course for the M2015 cohort (Y2 and Cohort Retention); all PJE students for the M2016 cohort (Y1 Retention); each course for the M2016 cohort (Y1 Retention).

User Behaviour

  • Focus: Access (log-ins); views (e.g. content, announcements, gradebooks); interactions (emails, posts, BBB attendance).
  • Timing: Y2 – 13 May 2015; Y1 – 24 June 2015.
  • Owner: Curriculum Advisor
  • Measures: Raw numbers for each data point, for PJE and by course; average per completed student over the duration of the year (Y1 & Y2) for each data point (see the sketch below); raw numbers by week for each data point, for PJE and each course; graph of the raw numbers over the year, for PJE and each course.
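For illustration, the "average per completed student" measure can be sketched as follows; the activity log and its column names are hypothetical:

```python
# Hypothetical activity log: average log-ins per completed student.
import pandas as pd

activity = pd.DataFrame({
    "student":   ["a", "a", "b", "b", "c"],
    "completed": [True, True, True, True, False],
    "logins":    [4, 6, 2, 3, 1],
})

# Sum each completed student's log-ins, then average across students.
per_student = activity[activity["completed"]].groupby("student")["logins"].sum()
print(per_student.mean())  # 7.5 log-ins per completed student
```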

Drop Data

  • Focus: External reasons; course-specific reasons; style-of-learning reasons; personal reasons.
  • Timing: Y2 – 13 May 2015; Y1 – 24 June 2015.
  • Owner: School Services Manager
  • Measures: A breakdown of the raw numbers into the classified drop reasons; percentages of students both within the set of "drops" and out of all students in PJE and in each course.

Student Outcomes

  • Focus: Attainment in PJE courses (TG7, Final TER); IB outcomes (IA, PG, Final Grade).
  • Timing: PJE attainment and engagement – 13 May 2015; IB scores and results (Phase 1) – 17 July 2015; IB scores and results (Phase 2) – 4 September 2015.
  • Owner: Curriculum Advisor
  • Measures: PJE attainment (1–7) measured as an average for PJE and each course; distribution of attainment and engagement grades as a percentage of total enrolled students in the cohort, for PJE and each course (sketched below); IB results (see PJE IB Academic Results Analysis of Outcomes Phase 1 & 2 040215).
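The grade-distribution measure above can be sketched as follows, with invented grades; it reports the percentage of the enrolled cohort at each attainment grade (1–7):

```python
# Invented grades: percentage of the cohort at each attainment grade.
import pandas as pd

grades = pd.Series([7, 6, 6, 5, 5, 5, 4, 3])
distribution = (grades.value_counts(normalize=True).sort_index() * 100).round(1)
print(distribution)  # e.g. grade 5 -> 37.5 (% of cohort)
```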

User Satisfaction

  • Focus: Course satisfaction; service satisfaction; training satisfaction.
  • Timing: To be confirmed; further information is required on how this data will be collected.
  • Owner: School Services Manager
  • Measures: The different levels of satisfaction, as percentages of total responses.

SBC Survey

  • Focus: SBC circumstances (role, rewards etc.); approaches to monitoring; use of LMS data; use of PJE reports; grading and reporting in client schools.
  • Timing: Report released – 30 March 2015.
  • Owner: External Evaluator
  • Measures: Final report with recommendations from the External Evaluator.

Student Survey

  • Focus: To be discussed.
  • Timing: Report released – 31 May 2015.
  • Owner: External Evaluator
  • Measures: See discussion document "Student and Teacher Surveys 2015" (to be written).

Teacher Survey

  • Focus: To be discussed.
  • Timing: Report released – 20 June 2015.
  • Owner: External Evaluator
  • Measures: See discussion document "Student and Teacher Surveys 2015" (to be written).

Academic Report

  • Focus: To be discussed, but might include: teacher behaviours; HoD performance; overview of departmental issues; teacher training; PD; policy and SOP development and requirements.
  • Timing: 1 July 2015.
  • Owner: HoAS
  • Measures: Relevant KPIs.

Development Report

  • Focus: To be discussed.
  • Timing: 1 July 2015.
  • Owner: Head of Course Development
  • Measures: Relevant KPIs.

School Services Report

  • Focus: To be discussed, but might include: student recruitment; student enrolment; transfers; student engagement; school relationship development.
  • Timing: 1 July 2015.
  • Owner: School Services Manager
  • Measures: Relevant KPIs.

Business Development Report

  • Focus: To be discussed, but might include: new business overview; school relationship development; transfers.
  • Timing: 1 July 2015.
  • Owner: Head of BD
  • Measures: Relevant KPIs.