Summary: Regulation workstream
This workstream identified how regulatory mechanisms might enable all providers within the FE system to use Learning Technology (LT) more effectively and support learners. Key themes included support for increased online assessment methods and greater awareness of optimal uses of LT, to build the awarding industry’s confidence in using, assessing and quality-assuring LT. Another key area that emerged was increasing the evaluation of the quality and effectiveness of technologies deployed by providers.
Draft recommendations:
- The regulation system should fully support the adoption of new technologies and learning methods, challenging and adapting its quality and success measures in accordance with the leading edge in the FE sector. It should ensure that this is reflected in measures used by Ofqual, SFA and Ofsted and that the change is communicated in a clear and high-profile way to ensure it is fully understood and adopted in the sector
- Develop and deliver pilot courses of study linked to regulated qualifications that maximise the appropriate use of technology in learning and assessments. Federation of Awarding Bodies (FAB) and Joint Council for Qualifications (JCQ) should work with Ofqual and others to increase awareness of optimal uses of learning technologies and develop confidence to use them in assessment and quality assurance across the awarding industry
- Ensure that the Ofqual review of the regulation of Vocational Qualifications and other associated regulatory developments support the use of technology in teaching, learning and assessment of regulated qualifications, and that they allow awarding organisations working with providers, employers and learners to develop qualifications and assessments that make best use of technology. Develop a definition of Guided Learning Hours that supports, and does not inhibit, the use of learning technology in the delivery of qualifications
- Ofsted should take on an enhanced role in evaluating the quality and effectiveness of technologies deployed by providers. Comprehensive training of Ofsted inspectors will enable them to identify good practice and evaluate the effectiveness of deployed learning technologies. Develop, in collaboration with Ofsted and FE stakeholders, means by which the evaluation of learning technologies can be given more emphasis in inspection; include the development of plans for the use and implementation of technology by providers which inspectors can review
- Local Enterprise Partnerships (LEPs) should set up Learning Technology Innovation Groups, consisting of FE providers, employers and other stakeholders to support and champion local use of learning technologies. Pilot innovation groups to champion the uptake of learning technologies and demonstrate how changes can have a positive impact on outcomes for learners
- Require FE providers, universities and industry to collaboratively develop up-to-date and relevant professional development and initial/early training for managers, teachers and trainers, aimed specifically at improving their knowledge of, and confidence in using, learning technologies
Your views:
- Building on these proposals, what specific changes to regulation are needed to make the biggest difference to you?
- Can you suggest any good mechanisms for achieving these recommendations?
- Are there other issues relating to regulation that should be considered?
Rate the regulation workstream:
In your opinion, how useful are these draft recommendations on regulation (5 = very, 1 = not at all)?
I agree that robust evaluation is critical.
But I disagree strongly with the view expressed here that we need new forms of assessment to enable new forms of learning technology. If you move the goalposts at the same time as developing new ways of scoring goals, then you have no way of knowing whether your new methods are any better than the old ones – or whether you are scoring more goals merely because the goalposts have been moved nearer. And as significant sectors (if not the majority) of the TEL community seek to use technology to change the purpose of education – away from robust representations of learning objectives and towards learner-created knowledge – there is a real danger that TEL will be used to subvert good education, not to improve it.
I agree that Ofsted should be given a formal role in evaluating the application of ed tech – but there is a danger in relying too much on formal evaluations. The experience of Becta’s evaluations in the schools sector was that these evaluations were too often driven by political considerations. Excessively optimistic research findings were repeatedly fabricated to justify Becta’s own existence. I can supply details on request.
It is vital that you maintain diversity in the range of sources of evaluation, and that evaluations are contested. User reviews are therefore a critical part of the mix, although professional, academic and official evaluations can also be used in complementary ways.
Local Enterprise Partnerships and other forms of champions should be avoided at all costs. The primary objective of all bureaucracies is to protect and extend their own remits. They find things to champion, even if nothing really works, and the long-term damage that this process does to the reputation of education technology is crippling.
Collaborative networks, on which such structures are based, are nearly always hostile to the sort of counter-consensual thinking on which genuine innovation depends.
A similar point can be made about training. We don’t yet know what “good” looks like. Creating training programmes in such circumstances merely entrenches unproven and generally ineffective practice, blocking rather than enabling genuine innovation. And in the hands of officially-funded programmes, training almost always becomes a means of proselytizing, rather than responding to genuine demand.
In the FE sector the main assessment goal is competence, whereas in the graded academic field in higher education the emphasis is on differentiating and grading academic performance. For this reason the assessment methods adopted in one sector will not necessarily be appropriate in another. Practical competence is associated with relevance of context, and the closer the assessment is to that context, the better.

With sophisticated flight simulators training pilots on technology that mimics flight, it is tempting to think that all practical and vocational assessment can be done through technological simulation. This might be true in the future, but the economics need to stand up too. Some occupations require face-to-face dealings with people, and people are not easy to simulate. Informal learning in the workplace can often be assessed much more economically by direct observation than by fabricating a simulation of that environment. Technology can help in this process with evidence management, progress tracking and certification. There is plenty of scope to improve implementation at this level without necessarily going into the expense of programming simulations that might or might not be valid reflections of the circumstances.

In the extreme, if we can accurately simulate the working environment well enough for valid assessment, why bother with people doing the job itself? Surely the machines will do it as well or better? The reasons machines don’t do some jobs are many and varied. There is a human emotional and psychological dimension that needs to be taken into account, and sometimes it is simply that people are the most cost-effective resource.
Technology enables human creativity. Learning outcomes and assessment criteria can just as easily relate to competence in creativity as to any other field. Indeed, it is highly desirable for learning outcomes and associated assessment criteria to evolve over time to reflect changes in how difficult it is to achieve an outcome. It is far easier to publish a book now than it was 20 years ago, so what was a Level 5 learning outcome related to publishing in 1993 could well be a Level 2 learning outcome now. Levelling qualifications has never been an exact science and probably never will be. In a relatively narrow range of academic qualifications it might be possible, but across thousands of vocational qualifications delivered by hundreds of awarding organisations it is far more difficult and expensive.

Within an occupational sector the main purpose is for employers to know what the employee is competent to do, and for the employee to be motivated by seeing that they are making progress and acquiring new skills and knowledge relevant to their job. Technology makes this much more straightforward, for example by relating the certificate directly to the evidence the learner provided to gain it. Putting a simple QR code on the certificate enables the certificate to be authenticated; TLM has been doing this for nearly 10 years. Relying on paper certificates is extremely high risk because it is so easy to forge a certificate and not easy to authenticate it. Scanning the QR code also opens up the opportunity to provide samples of the evidence underpinning the certificate, the employer reference and any other relevant background. It doesn’t make much sense to spend a lot of money on expensive assessment technology, e.g. for countering plagiarism, when it is far easier and less expensive to simply forge the certificate than it is to go through a course and cheat.
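By way of illustration only (this is a minimal sketch, not TLM’s actual implementation), a certificate QR code of the kind described above can be generated in a few lines of Python using the widely available qrcode library; the verification domain and certificate ID format shown here are hypothetical placeholders.

```python
# Minimal sketch: embed a verification URL in a QR code to be printed on a
# certificate. The domain and ID scheme are hypothetical placeholders.
import qrcode  # requires the 'qrcode' package (with Pillow for image output)


def make_certificate_qr(certificate_id: str, output_path: str) -> str:
    """Generate a QR code image that links a printed certificate to an
    online verification page showing the underlying evidence."""
    verification_url = f"https://verify.example.org/certificates/{certificate_id}"
    img = qrcode.make(verification_url)  # build the QR code image
    img.save(output_path)                # save it for printing on the certificate
    return verification_url


# Example usage: create the image and show the URL the code resolves to.
print(make_certificate_qr("CERT-2014-000123", "certificate_qr.png"))
```

Scanning the printed code then simply opens the verification page, which is where the evidence samples, employer reference and other background mentioned above could be presented.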
We need to start systematically making the most of simple technologies that exist now and that have demonstrated the potential to significantly reduce the cost of assessment. We do need to keep research and development going but constantly chasing technological rainbows when we are only scratching the surface of what can be done with existing low cost technologies makes no economic sense at all.
OFSTED should change how it inspects training provision so that it is clear that a failure to invest in and use technology effectively will have a significant and material impact on how that organisation is judged. Government should ask OFSTED to:
– incorporate e-delivery as a core part of the Common Inspection Framework
– ensure inspection staff understand what “good looks like” in relation to e-delivery (i.e. every OFSTED team should have at least one expert on e-delivery good practice)
– complete a pilot of an e-inspection by the end of 2015 and make e-inspection 50% of OFSTED’s work by 2020 (i.e. where an OFSTED inspector looks at and evaluates teachers’ and learners’ activity online e.g. by looking into e-portfolios, observing e-delivered classes, reviewing online activity, looking at evidence/outcomes etc.); this will also have the by-product of finally enabling Guided Learning Hours to be removed from the inspection (and funding) systems
Awarding Organisations should also start assuming that their work, including assessment and verification, should default to being done online, with physical visits the minority.
Government should require Training Providers to have all learners’ portfolios available online by 2017 as a condition of funding and for Awarding Bodies to be capable of e-verifying learners’ portfolios by 2017.
All exams and tests should be capable of being completed online (and be as robust as offline exams and tests are now) by 2017 too.
Government should also insist that e-delivery capability forms a core part of the competency assessment in allowing new organisations onto the Register of Training Organisations (ROTO).
I really appreciate this opportunity to get involved in this discussion.
The problem with taking this tack of looking at technologies and how we train teachers in their use is that it approaches the problem from an unreal perspective. Technology is a means to an end, and as such I use technology not because it is good in itself but because it makes my life more interesting, better connected, more efficient and less bureaucratic in some respects. We need to think about using technology in terms of how it affects our behaviour, and describe its value this way, rather than discussing the processes and systems coming from the software itself. To illustrate, I like using Amazon because it gives me quick access to a range of choices, speed of choosing and convenience of delivery, and not because I like the Amazon software.
It is better to think about the higher-level thinking skills teachers need to demonstrate (rather than knowledge and processes) in order to be agile, confident, curious and imaginative in how we use technology. To this we need to add adaptability: changing how we work to reflect the uses of technology chosen by students and their employers. We are not in charge and not always in control of technology. This can be evaluated by asking teachers to account for their effectiveness, including these skills, captured through ‘tagging’. I will try to develop these points elsewhere.
Technology is not one dimensional.
1. We use it generally in college,
2. We use it specifically in teaching and learning and
3. We teach vocational specific technology.
This is Digital Literacy, learning technologies and vocational skills. They are different.
In summary, we need to be careful in how we describe the value of technology generally, and learning technology in particular, to learners, how we evaluate it and who can do this. OfSTED can be seen as a heavy hand of audit that looks only for overt systems and processes, which is a fundamental misunderstanding of what makes technology valuable.
Pearson participated in FELTAG’s Regulation Workstream and supports its recommendations, which will aid all providers in their recognition of the role of learning technology.
It would be challenging but quite possible for OfSTED to evaluate good e-learning through the use of technologies. What they are after is what happens in the heads of learners and is evidenced in their actions and changes of approach to their world. Contrast this with the historical concentration on evaluating the technology that is used in the classroom.
Learning activity is now a continuous process. OfSTED would need to look not at the technology but at what it adds: how learners collaborate better, how they capture their divergent thinking and ideas, how their conversations about learning in social media are listened to, and how the administration of their learning is tracked, all of this beyond the classroom as well as within it. All of this is possible because of personalised technology. Whilst challenging, a concentration by OfSTED on these behaviours, made possible by the efficient deployment of combinations of learning technologies (including those learners bring to the table), would be helpful. Auditing lists of technology without considering the specific learning benefits has no value and will simply push every provider towards a closed shopping-list approach.
If these behaviours were added to the Framework, and evidence were always sought from the use of technology, then we might move away from this unfortunate idea of inspection being an audit of content towards it being a discovery of new learning. The nice thing is that collaborative learning, divergent thinking etc. are all great soft skills needed for the digitally literate world of work.
A good mechanism for achieving the recommendation that initial/early training for managers should be provided is to use an online learning platform – in that way the managers/teachers will be using the technology as well as learning how to use it to support learning. If a manager/teacher doesn’t themselves have basic online skills, they could easily use the free online Learn My Way; it offers relevant and up-to-date e-courses on how to use a computer and the internet.