Design & Delivery of Services

Education services are complex, and their development comprises several core phases, such as design, planning and delivery, including the preparation and delivery of classes, the assessment of learning, and the certification of the competences and qualifications achieved by learners.

Given this complexity, the overall quality of education depends on many factors, and each step of the associated processes plays an important role, as the quality of each process or activity output influences, in different ways, the quality of the service as a whole.

For this reason, both EQAVET:2019 and ISO 21001:2018 contain clear and, at times, highly prescriptive requirements to ensure these different phases are conducted in a way that contributes to the excellence of the final results.

Examples are:

  • The ISO 21001:2018 requirements for the minimum information to be considered when designing educational services, whether at the level of programmes, courses and/or classes;
  • The emphasis both EQAVET:2019 and ISO 21001:2018 place on adopting a learner-centred approach, including consideration of the needs of learners with special needs (on both ends of the spectrum – impaired learners and gifted learners – as all may require adaptations to the learning services being provided);
  • The added value of controlling all phases of the service through a systematic approach, including planning and monitoring, using documented information such as procedures and forms to support operations, and documented information such as records to provide evidence of the degree to which planned results are being achieved;
  • The need to ensure the reliability of the assessment of learning process, based on the fitness for purpose of the instruments used, which must be capable of providing valid results regarding what learners have actually learned.

EQAVET:2019 and ISO 21001:2018 requirements also cover educational services provided in partnership with the labour market, such as apprenticeships, traineeships and internships. However, given the recent tools published by other ERASMUS+ research consortia and by ISO to control the design, delivery and quality control of work-based learning (WBL), we recommend the following sources as complementary information to the VET21001 Toolkit:

For WBL in the healthcare sector:

For cross-sector WBL:

Criteria

2.5. VET providers’ programmes enable learners to meet the expected learning outcomes and become involved in the learning process
2.6. VET providers respond to the learning needs of individuals by using a learner-centred approach which enables learners to achieve the expected learning outcomes
2.7. VET providers promote innovation in teaching and learning methods, in school and in the workplace, supported by the use of digital technologies and online-learning tools
2.8. VET providers use valid, accurate and reliable methods to assess individuals’ learning outcomes
8.3 Design and development of the educational products and services
8.5 Delivery of the educational products and services
8.6 Release of the educational products and services

VET21001 Tools

This tool presents the relevant items for the creation of new qualification profiles, namely:

  • The area of education and training
  • The identification of the qualification code (according to the national qualification catalogue) and of the designation
  • Whether it is a new qualification or a change to an existing one
  • Whether there are regulated jobs connected to the qualification
  • Whether partial certifications can be obtained
  • The entry-into-force date, if applicable
  • The type of structure (short-length training units – SLTU, compulsory and optional)
  • The number of hours of the technological component, divided between compulsory and optional units
  • A general description
  • The activities or tasks
  • The knowledge
  • The know-how (practical skills)
  • The soft skills/interpersonal skills
  • The code, designation and number of hours of the SLTUs, both specific and common to other qualifications
  • Comments

Download this tool in editable format

Download an example of the use of this tool

This tool provides a mechanism that enables users to design and/or update VET programmes to better address current and future student and labour market needs, thereby providing stakeholders with the most recent information on market developments and requirements. It is therefore to be used as a template for proposing a new programme or updating an existing one. Most sections require the user to describe the new programme or update at length, including detailed information on separate sheets attached to this form to compile the report.

The salient reason for this is the integration of the identification of training needs into the design, development and approval process of programmes, as well as into the cyclical review process. As per college policy and procedure, the design, development and approval of new accredited programmes and the review of existing programmes are key processes of any educational institution. Proposers need to submit a form containing the key information required in the proposal, including internal and external stakeholder feedback from, for example, students, alumni, industry, employers and NGOs. This process involves collaboration between various offices/departments within the institution to ensure that the programme being developed is of the best quality and suitable for the needs of the target audience.

The intention of this tool is to create different platforms whereby key stakeholders in education and training work together to develop programmes that are relevant and sustainable and that recognize the input from industry.

Stakeholders, including academics, students and those from industry, NGOs and elsewhere, are asked to provide feedback on the level of the programme, its content, learning outcomes, delivery approach, relevance and other general aspects. This data is consolidated as input for subsequent programme development and review.

Download this tool in editable format

Download an example of the use of this tool

Download another example of the use of this tool

The course curriculum template was designed to allow educational organizations to identify critical aspects of their curricula; it not only supports class planning, but also serves as a tool to communicate with learners and other beneficiaries.

The template is self-explanatory and contains fields to describe:

  • Course identification
    • Course name and when applicable, the programme of study to which it belongs
    • Pre-requirements, when applicable
    • Level, by reference to the European Qualifications Framework (EQF) or equivalent (e.g. ISCED), when applicable
    • Form of delivery (face-to-face, e-learning, blended)
    • Timeline (delivery dates and when applicable, reference to semesters or other timeline identification)
  • Responsibilities
    • Curriculum Authors
    • Teaching team
    • Any applicable Learner Support contacts (Programme, Administrative, IT Support, etc.)
  • Qualification and certification information
    • Form of certification
    • Credit System and number of credits earned, when applicable (e.g. ECVET, ECTS, etc)
    • Contact hours and, when applicable, autonomous study hours expected from learners
  • Pedagogy
    • Learning objectives
    • Pedagogic methods
    • Form of learning assessment and, when applicable, the weight of each learning assessment element
    • Bibliography (documents, articles, standards, books, podcasts, videos, etc.)
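The fields above can also be captured in a structured, machine-readable form. The sketch below is illustrative only: the field names mirror the template but are not mandated by it, and the example course is invented. Modeling the record this way can help organizations validate entries or export them to other systems.

```python
# Illustrative sketch only: the curriculum template's fields modeled as a
# Python dataclass; field names mirror the template but are not mandated by it.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class CourseCurriculum:
    course_name: str
    delivery_form: str                      # "face-to-face", "e-learning", "blended"
    programme: Optional[str] = None         # programme of study, when applicable
    eqf_level: Optional[int] = None         # EQF or equivalent level, when applicable
    prerequisites: list = field(default_factory=list)
    curriculum_authors: list = field(default_factory=list)
    teaching_team: list = field(default_factory=list)
    credits: Optional[float] = None         # e.g. ECTS/ECVET, when applicable
    contact_hours: int = 0
    autonomous_study_hours: int = 0
    learning_objectives: list = field(default_factory=list)
    pedagogic_methods: list = field(default_factory=list)
    bibliography: list = field(default_factory=list)

# Invented example record:
curriculum = CourseCurriculum(
    course_name="Quality Management Fundamentals",
    delivery_form="blended",
    eqf_level=4,
    contact_hours=50,
)
```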

Download this tool in editable format

Download an example of the use of this tool

Download another example of the use of this tool

The most important monitoring and measurement activity in education is the assessment of learning – in other words, determining what learners have learned. The results of these activities provide relevant information to adjust the learning services – in the case of formative assessment – and to decide on the certification of qualifications – in the case of summative assessment. Given the impact of these decisions on the conformity of the learning services, educational organizations need to trust the results provided by the assessment of learning instruments they use. However, there are many threats to their reliability and, consequently, to the validity of their results:

  • Each learner may perform better or worse depending on the type of questions chosen for a given test/exam – e.g. due to individual talents, individual communication styles, personality, stress, etc.;
  • The same learner may perform better or worse depending on the day on which they take the test/exam – e.g. due to mental and physical health conditions;
  • Different graders may grade the same content differently – e.g. due to previous experience, the halo effect, etc.;
  • The same grader may grade the same content differently depending on the moment at which they perform the task – e.g. due to fatigue, health conditions, the halo effect, etc.

Considering the above, assessment of learning instruments should not be used without being validated. Several statistical techniques can be used for this, such as the Kuder-Richardson coefficient (KR-20) for multiple-choice questions and Cronbach's alpha for open questions. Both measure internal consistency, providing results between 0 and 1; the higher the internal consistency, the better the reliability of the test/exam. Additionally, non-statistical methods, such as pre-tests with monitored reactions, can be used.
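As an illustration of these internal-consistency measures, the sketch below computes Cronbach's alpha from a small score matrix; KR-20 is the special case in which every item is scored dichotomously (0/1). The score data is invented for the example.

```python
# Minimal sketch of Cronbach's alpha; KR-20 is the special case where every
# item is scored 0/1. The score matrix below is invented for illustration.

def cronbach_alpha(scores):
    """scores: one row per learner, one column per test item/question."""
    k = len(scores[0])                     # number of items

    def variance(values):                  # population variance
        mean = sum(values) / len(values)
        return sum((v - mean) ** 2 for v in values) / len(values)

    item_vars = [variance([row[i] for row in scores]) for i in range(k)]
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Four learners, three dichotomous items (so this is effectively KR-20):
scores = [[1, 1, 1],
          [1, 1, 0],
          [1, 0, 0],
          [0, 0, 0]]
cronbach_alpha(scores)  # → 0.75
```

Values close to 1 indicate that the items consistently measure the same construct; low values suggest the instrument needs revision before its results can be trusted.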

In some circumstances, although technically possible, it may be financially unviable to validate the assessment of learning instruments with statistical techniques or pre-tests, as re-use of the instrument will not bring a return on investment. After all, there is a significant difference between an exam needed to achieve a Microsoft certification, which will be applied for years all over the world, and a test used once in a single secondary school. In those cases, educational organizations should at least ensure a systematic approach to the process of developing assessment of learning instruments. This can be done by establishing minimum requirements regarding competence, the process and the instruments. Examples are:

  • Competence requirements:
    • Educators understand the threats to the reliability of assessment of learning instruments and, consequently, to the validity of their results; and
    • Educators know how to mitigate these threats while designing and developing formative and summative assessment of learning instruments
  • Process requirements:
    • A diversity of methods is used
    • A diversity of instruments is used
    • Results are used to refine the estimate of the “true score” over time
  • Requirements for instruments:
    • Instruments are designed by multidisciplinary teams
    • The types of questions used are diversified
    • The number of questions per instrument is increased
    • Objective grading criteria are pre-defined
    • Blind and double grading are implemented
    • Etc.

The template for Grading & Feedback can be used to implement a systematic approach to grading coursework, from tests and exams to group work assignments. It supports the definition of objective grading criteria at the design phase, and then the recording of the grade attributed to each learner and the feedback to be provided to them.

The template contains fields to identify the type of coursework and its elements, which criteria apply to each of them, and the maximum grade that can be attributed to each element and in total. It can include formulas to automatically calculate the final grade. It also contains fields to record the feedback to be provided to learners, so they can better understand their grade and what they can do to improve their competences.
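The automatic calculation the template can embed amounts to a weighted sum of the element grades. The sketch below assumes hypothetical element names, weights and a 0-20 grading scale; the actual criteria and scales are defined by each organization in the template itself.

```python
# Hypothetical sketch of the weighted final-grade formula the template can
# automate; element names, grades and weights below are illustrative only.

def final_grade(element_grades, weights):
    """Weighted sum of coursework element grades; weights must total 100%."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must total 100%"
    return sum(element_grades[element] * weight
               for element, weight in weights.items())

grades  = {"written test": 14.0, "group project": 16.0, "presentation": 18.0}
weights = {"written test": 0.5,  "group project": 0.3,  "presentation": 0.2}

final_grade(grades, weights)  # ≈ 15.4 on a 0-20 scale
```

Pre-defining the weights at the design phase, before any coursework is graded, is what keeps the final grade traceable to objective criteria rather than ad hoc judgment.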

Download this tool in editable format

Download an example of the use of this tool