Action Research: Learner Engagement Intervention

The '+XP Generator' Gamification: A Learner Engagement Intervention

Abstract

This report explores the relationship between level one students with various learning difficulties and a 'gamification' based rewards system that awards XP (experience points) for effort. The system is designed to be exclusively positive for learners, allowing them to attempt an answer and still gain points (rather than being 'wrong'), thus removing the negativity that would previously have put learners off the remainder of a session. The report looks at the action research methodology used, the data generated during the course of the intervention, the professional learning involved and the conclusions that can be drawn. Overall the response from the group was positive, the learners produced more notes than expected, and some interesting side effects came to light as a result of this strategy; for example, some learners became more engaged in other lessons as a result of the intervention in my own.

Introduction

The primary question this research addresses is: "Can learners be influenced to actively develop note taking skills through perceived incentives of a points-based marking system?" The secondary question is: "Can learner engagement be increased through the use of gamification?"


A points-based learning experience and marking system of this kind is commonly known as 'gamification'. Gamification has been used increasingly in schools and post-compulsory education classrooms in recent years due to the changing nature of learners and their preference for instant gratification (over delayed gratification). The theory is that this method leads to deeper involvement, or flow. Csikszentmihalyi (1993) defines flow as 'a state of consciousness that is sometimes experienced by individuals who are deeply involved in an enjoyable activity', and this state is the key to successful gamification. Flow is often hard to attain with groups such as the one this intervention was tested on: a group of ten learners aged between 16 and 21, the eldest of whom is evidently struggling with education.

Section 1: Justification for research

An effective intervention strategy for teachers to use in these situations is to recognise the relationship between the individual and their environment and to change the conditions that influence this dynamic (Nuttin, 1984). This can be framed through the theory of operant conditioning as described by Michie (2008) and Skinner (1948). Operant conditioning means that the subjects, here the learners, display a trait more readily as a result of positive reinforcement through reward. In this case, points are awarded for good work and desirable behaviour. Conversely, instances of undesirable behaviours are reduced through the deduction of points, i.e. punishment. Through my gamification, I aim to increase the frequency of reinforcing responses given through the worksheets and lessons that this group is exposed to. This should move towards eliminating, or at least reducing, instances of 'punishing' responses from teachers that would usually dissuade learners from participating in activities or engaging in lessons. This is similar to how most primary schools implement simple reward systems such as house points, most often awarded for good behaviour. This action research is not about creating a brand new intervention, but about adapting an existing form of incentivising learning, particularly for a further education setting.

Ofsted requires tutors to engage learners throughout lessons; this is of particular interest in further education due to low engagement rates (Ofsted, 2012), and is an area that needs to be addressed. The intervention also incorporates an element of self-assessment, which is another requirement that must be evidenced for Ofsted: learners are encouraged, although it is not mandatory, to keep track of their scores (Ofsted, 2012).

The majority of the group this new method was carried out on have mild learning difficulties (MLD), such as attention deficit hyperactivity disorder (ADHD), autism, Asperger's and dyslexia, amongst other semi-serious and ongoing mental and physical health conditions, speech impediments and issues with their personal circumstances outside of college. This makes them a harder than average group to keep engaged during lessons, and on task when tasks are set. There is normally one teaching assistant in their sessions who supports a number of individuals by helping them complete tasks. They are a friendly group who accept that they need to be in college in order to progress. They are studying information communication technology, and on this course I teach them one unit, 'Using the internet', on a Wednesday afternoon from 2.30pm to 4.30pm. They are a capable group, albeit level one learners who are attempting to access higher levels of learning; however, this late stage of the day means their energy and enthusiasm to learn is often waning. I have found that, this being a level one group, on the whole they are easily distracted, partly due to their various conditions. The core reason for doing this research with this group was to ascertain whether gamification can improve overall results. These learners, despite understanding that they need to be in college to progress and learn, needed their intrinsic motivation to learn to be improved, as the quality of their in-class notes and of their work overall (including assessments) needed to be raised.

This intervention is oriented around creating an element of 'fun' in lessons. In 2011, Gabe Zichermann, a prominent theorist in the area of game design and gamification, said, "…it is the mechanics of a game — not the theme — that make it fun." This is the key to my intervention: the aim is to make the action of learning more fun, regardless of what the topic is. However, Zichermann also warns, "Over time, an excessive dependence on “free stuff” [in the context of my intervention, the 'participation' XP] or discounts habituates players [learners] to constantly expect that as a condition of purchase [participation]".

Section 2: Research intervention

This intervention, grounded in behaviourism, was an alternative reward system that remained in alignment with the curriculum, the unit learning outcomes and the minute-by-minute delivery of the content. I implemented it over a period of eight weeks, this timescale being preferable to a shorter period of six weeks in order to allow a baseline of quality to be established in the initial stages of the unit. The intervention is, in essence, an opportunity for learners to gain points throughout their lessons, a pseudo reward scheme aimed at gaining higher returns of work output.

The intervention is a combination of two things. The first is a worksheet from which learners can only gain points (rather than a task with a deductive scoring scheme). A straightforward example of this could be an acronym question on the worksheet such as 'What does W3C stand for?'. The question would have a total of 20 points available, 5 for each word, noted at the side of the page. The answer is 'World Wide Web Consortium', and each word is worth 5XP, so the total available is 20XP. Approaching the question in this way ideally means that learners who cannot give the complete four-word answer will still be able to answer at least partially, and therefore gain some points for trying. This encourages an attempt by the less confident learners, rather than them simply leaving a question completely blank, which is something they would otherwise do on a regular basis. In turn, this allows the learner to internalise the information in pieces that are comfortable and paced for them.
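For illustration only, the per-word partial-credit rule described above can be sketched in a few lines of Python; this was not part of the classroom materials, and the function name and example values are hypothetical:

```python
# Sketch of the per-word partial-credit rule (hypothetical names and values).
POINTS_PER_WORD = 5

def score_answer(answer: str, expected: str) -> int:
    """Award XP for each expected word that appears in the learner's answer."""
    expected_words = expected.lower().split()
    given_words = set(answer.lower().split())
    correct = sum(1 for word in expected_words if word in given_words)
    return correct * POINTS_PER_WORD

# A partial answer still earns points: 3 of the 4 words gives 15XP of the 20XP available.
print(score_answer("World Wide Consortium", "World Wide Web Consortium"))  # 15
```

A learner who writes nothing scores 0XP but is never penalised, which mirrors the 'no deductions' principle of the worksheet.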

The second, but equally important, aspect of this intervention was the addition of XP point opportunities throughout the main session. Learners had additional incentives to learn, take notes and contribute through the incorporation of XP point rewards. I also used points to encourage punctuality, prompt returns from break time, and partial success in assessments and set tasks. There were various bonus points available for those who completed tasks successfully and accurately, and in particular for those who completely finished set tasks.

My intervention is a minor alteration to the normal lesson format; however, this form of assessment does tie in with Unit 107. I was able to implement it because my interests and the unit outcomes integrate very closely, and it improved the group's outcomes and their rapport with me as their tutor through the fun and entertainment value of the 'games'.

Each lesson was designed to have around 400XP on offer through the various mediums. The majority of points were available through the recap worksheets, and the remainder through the note-taking cues in the rest of the lesson. These worksheets were created using the previous week's content; however, the real focus was, as mentioned, on increasing the learners' note taking. Learners were initially asked to take notes in weeks 3 and 4 of the intervention using Microsoft Word, and over the remainder of the unit they moved over to an online blog, which was worth more points. Appendix A shows examples of the image elements created in order to implement this, in a graphical and 'playful' style that would appeal to the class (Perry et al, 2009). In Appendix A, figure 1, the image elements 1-5 (amongst others) were placed in the lower left corner of slides on which the group were required to make notes on the topics or subjects covered. These image elements were applied to areas of the lessons to indicate where points could be gained, and to the recap worksheets used at the start of the lessons. See Appendix A for examples of these in use within the lessons' actual resources.

The worksheet could easily be converted to generate a generic mark out of ten, a percentage, or a straightforward pass or fail. The choice to combine the whole lesson's score into one total is intended to induce the feeling that actions at the start of the lesson affect scores at the end. This helps students feel some continuity with their course.
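As an illustrative sketch only (assuming the nominal 400XP per lesson mentioned above, and a hypothetical pass threshold), such a conversion might look like this:

```python
# Hypothetical conversion of a lesson's XP total into other marking formats.
XP_AVAILABLE = 400      # nominal XP on offer per lesson (an assumption for this sketch)
PASS_THRESHOLD = 0.5    # hypothetical pass mark

def convert(xp_earned: int) -> dict:
    """Turn an XP total into a mark out of ten, a percentage and a pass/fail grade."""
    fraction = xp_earned / XP_AVAILABLE
    return {
        "mark_out_of_ten": round(fraction * 10),
        "percentage": round(fraction * 100),
        "pass_fail": "pass" if fraction >= PASS_THRESHOLD else "fail",
    }

print(convert(280))  # {'mark_out_of_ten': 7, 'percentage': 70, 'pass_fail': 'pass'}
```

In practice I kept the raw XP total, as described above, precisely because a single cumulative score preserves that sense of continuity.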

The application of this varied; sometimes there would be up to seven topics in a lesson, sometimes as few as one, each being an instance where I wanted the learners to write about a topic. I would award them +5XP for each (complete) note taken, even if it was a piece of information copied and pasted from the internet. I marked the note taking outside of the session rather than during it, as marking in class would have taken time away from the lesson.

Each session had broadly the same format: starting with a register, followed by the '+XP Generator' worksheet activities, then a main session with slideshow and discussion, followed by a plenary that usually prepared the group for the following week. As I began this research with an initial assessment of the group's ability to take notes over the first two weeks of the unit, I gained a good understanding of the abilities of each learner. I assessed their responsiveness to questioning, their learning style preference in terms of audio, visual and kinaesthetic, and their general demeanour. This assessment was recorded using a score (out of ten) for a number of traits, such as 'Individual Feedback' and 'Adjustment Acceptance', amongst others (see Appendix B). These sessions consisted of slides without any 'gamification', a 'normal lesson', so to speak. In the third week, I introduced the XP points reward system. Not only did this immediately capture all learners' attention, it also facilitated additional aims: getting the group willingly involved in discussion, confident and versed in showing their work, and partaking in the worksheet recap activities.

There was one exception: in the fifth week the unit required a mid-way assessment. The format was deliberately altered to see whether any extra work would be carried out in preparation for a presentation, in exchange for points. Week five consisted of a fifteen-minute starter, followed by ten-minute talks by the students, during which each learner took the tutor and the group through the work done on their blog, followed by some general questions and answers on their work. Each article learners had produced for their homework was worth 10XP. This unit will continue for another six to eight weeks, and the XP accumulation will continue until the end of this year, across other units, with this group that I and other tutors will be teaching.

There was a positive response to this idea from the group, and in the week following the discussion I handed out the first worksheet. It was a colourful, playful-looking worksheet, quite unlike anything they had been asked to do before (see Figure 2). Some learners immediately started on the questions, which was encouraging, as I had never seen a group respond so well to a recap quiz. Some expressed their delight at seeing something that was not another 'boring test'. This was a positive start, and I felt that I could repeat the method in future lessons.

Section 3: Methodology, data collection & ethics

Carrying out action research in this manner was the most suitable approach due to the nature of the unit, the group setting and the group dynamics. Jean McNiff and Jack Whitehead (2005) urge educators to use action research methods when testing a new intervention because they generate meaningful results both during testing and afterwards. This allows flexibility and responsiveness in changing teaching methods during the intervention, and a deeper understanding, once the research is complete, of why these alterations were or were not made.

The limitation of action research in this scenario is that the focus is only on level one learners, and the content of their course revolves almost exclusively around computers, so there is minimal written work (despite this intervention being about developing note taking skills). This means that, over time, this method should be tested at other levels of learning to prove its effectiveness with a wider learner audience.

I opted to create the styles of image shown in Appendix A, Figure 1 for the gamification iconography, and to use the term '+XP' rather than just arbitrary 'points', as this suited the group's demographic, age and outlook, reflected the fact that the unit was about computers and their use, and allowed the note taking to be made solely digital. As the unit progressed, I gradually decreased the amount of 'free' XP given out on the worksheets and increased the amount of points that needed to be gained from the remainder of the lesson in order to reach the 'target' points for the week. This was an attempt to engage learners covertly, and I feel it worked.

The group was, however, informed that they were part of a 'mini-experiment'; this was my way of telling an often volatile group that there would be some differences in teaching style in the weeks that followed. During my initial discussion about the intervention, they were told that my 'idea' [intervention] was designed to make the sessions 'more fun, entertaining and memorable'. I used the words 'fun' and 'entertaining' rather than 'engaging' to accommodate their level of understanding, and to make the process feel less prescriptive and formal.

Contingency planning for any learners who did not want to take part in the XP Generator was not to provide a markedly different worksheet or lesson plan; the same worksheets would have been used for the recap starters, only without the XP images added, and I would simply have marked their answers in the same manner. This is possible because the XP points reward system fits around an existing lesson and its resources without weighing it down.

Quantitative, empirical data was collected during each session, with the exception of the initial two weeks, which deliberately did not involve XP points in order to establish an overview of the group's abilities without a rewards system. Each learner generated two scores per week (although they perceived these as one combined score per lesson), giving a total XP score at the end of each lesson: some from the worksheets, the remainder through note taking and partaking in the lesson's activities. Punctuality bonus points were also incorporated into break times to encourage a prompt return. Sometimes I added bonus points for the class restarting early, something that worked well and therefore became a trend that continued throughout the rest of the unit.

These scores were recorded on a per-session basis in an anonymised spreadsheet that accompanies this paper (see Appendix B), under each learner's name and sub-sectioned by week. The data is collected and recorded week by week, along with a weekly average for the whole group, showing the individual's and the class's overall progression. Positive or negative trends across the weeks can be identified and acted upon in the sessions that follow. Through the recording of individuals' XP points, any need to revisit work or explore topics more thoroughly is highlighted.
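Purely as an illustration of this aggregation (the real records are in the spreadsheets in Appendix B; the learner labels and figures below are invented), the weekly group average could be calculated as follows:

```python
# Hypothetical weekly XP totals: {learner: [week 1 total, week 2 total, ...]}
scores = {
    "Learner 1": [180, 220, 250],
    "Learner 2": [150, 210, 240],
    "Learner 3": [200, 230, 260],
}

num_weeks = len(next(iter(scores.values())))

# Weekly group average, used to spot positive or negative trends across the unit.
weekly_averages = [
    sum(totals[week] for totals in scores.values()) / len(scores)
    for week in range(num_weeks)
]

for week, average in enumerate(weekly_averages, start=1):
    print(f"Week {week}: group average {average:.0f}XP")
```

A falling average for an individual or for the class flags the topics that need revisiting in the following session.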

Qualitative data was collected via short focus group discussions during each of the sessions; responses were recorded in my own words in my journal. I asked learners individually how they felt about the new methods whilst I went around helping with the worksheets. I asked them what they thought of the system, and whether they felt it was helping them to learn, take notes and enjoy the lessons more. On the whole, the response was positive.

Doing this whilst the +XP Generator worksheets were underway, I could identify any issues with the worksheet design; for example, one learner (see Appendix C, example 2) used the dashed lines for a question as if each dash were intended for a letter, which was not the case. Regardless of this, he got the question right. This walk-around helped me respond to learner needs, and in turn I gained an insight into their experience and perception of the tasks they were doing. These results were recorded on the worksheet and transferred to a spreadsheet later (see Appendix C).

These data collection methods were fundamentally sound; however, on some occasions (in the third week of the intervention, marked week 5 in the spreadsheet in Appendix C) learners began to show less willingness to participate. This could be addressed by making these parts of the course less specific, perhaps by changing the assessment method at that stage to something more literary based rather than a presentation, or simply by setting a longer test worksheet.

I feel this slowdown happened for two reasons. First, the learners were being asked to do presentations of their blogs, to show how much work they had (or had not) done. This is where the group's learning difficulties were more of a barrier to progression than at any other point in the unit. Four learners (1, 5, 8 and 9) simply could not bring themselves to present their work, and as they had expressed this the week before, all I could do was apply gentle pressure and encouragement to have a try. Encouraging them to share their work more readily was a partial objective noted in these learners' background notes. Second, others were unwilling because they were not seeing a tangible reward for their points, at which point I had to think on my feet (see journal, entry 2) and come up with a motivation that would compel them to push for more points.

I decided to let them use their points to 'buy' items in an auction to be held in the last week; something I do have the resources and time for, but which will nonetheless have to be incorporated into my planning for the end section of the unit.

As for the learners who were too shy to do the presentation, I feel they are the ones who need the confidence boost the most. Their ability to contribute to discussion throughout sessions is increasing, due to the points. If they are to present themselves, their work or their ideas in future, they will have to push themselves in order to make the most of their course; it seems that simply offering points is not enough, and this is a possible limitation of this reward system with learners with these learning disabilities and conditions. I tried to entice them by offering more points (400XP, then 500XP, instead of the original 200XP), unfortunately with no success.

There were also a number of students who struggled to see the worth of simply attempting to be confident, perceiving a stigma attached to showing confidence or doing presentations in front of the group. Despite my attempts and offers of more points, the three learners stuck fast and would not present, to the detriment of their points collection. One learner gained partial points for presenting; his attempt and explanation went a little awry and he finished early, but despite this, I made it clear that his attempt was being rewarded and that he deserved credit for what he did manage.

As this is a predominantly quantitative study, the majority of the data generated shows the progression of each student, and any setbacks are clearly visible. If I were to repeat this intervention in future, depending on the group, I would have a score sheet that the whole group can see at any time, printed and pinned up in the classroom. This would be solely to show progression, not student shortfalls; it would, however, be worth measuring the effect of this alteration on the group's determination levels.

Section 4: Critical Analysis of Data & the process

Looking closer at the charts in Appendix C, generated from the XP points entered into the results spreadsheet, there is a visible upward trend in the total scores between the start of the intervention and the end. Overall averages are lower in the third week because three students did not participate in the presentations, lowering the average by approximately 150 points; this was not reflective of the learners' ability to make notes. The improvement in note taking is only apparent when comparing submitted notes on a weekly basis: Appendix D shows the improvement between weeks 1 and 4. I feel the primary research question was answered positively; the learners were more active in their note taking once I introduced the points. Previously they felt they were making notes just for the sake of making notes, and I had no way of knowing whether they were learning; students have a tendency not to take notes when they ought to. With this points system in place, they want the points, so they take the notes, learn more effectively and have content to refer to when it comes to assessments.

The numerical data in Appendix B shows that the majority of the group were completing work when asked. This was expected, and as noted in the journal: "This had the intended effect; they are generating enough points to make me comfortable that they are doing enough work." It was, however, surprising for some colleagues. What the numerical data does not show is what the qualitative data brought to light, i.e. the learners' preference for having this gamification reward system over not having it at all. This addresses the secondary research question about increasing learner engagement through the use of gamification, something viewed positively by Ofsted.

During the focus groups I asked individual learners, 'Are you enjoying the XP Generator worksheets?'. From their responses it was apparent that they do enjoy them, and that they felt more compelled to do the work given that they got points for doing so (journal entry: "They were expecting to have to take notes, but since they are getting points for doing so, they now intrinsically want to gather points"). The notes were also improving because I was giving a little more time for learners to make them complete; I would often say that 50 words on a topic were needed in order to get the points on offer. This led to a growth in the amount of notes the group made, and an increase in their quality.

There is one important point about the nature of the group that I discovered in my initial two weeks of working with them, and it needs to be emphasised: some of them do not respond at all well to negativity. For example, saying to learner 10, "it looks like you have a word spelt wrong there", would often quickly cause a migraine, headache, anxiety or another reason for the individual to have to take a few minutes outside the classroom. This is why I created the suite of worksheets and notation expectations with no negativity whatsoever involved in them. Even partial answers were rewarded, and the complete answer was reinforced whilst going through the answers on the overhead projector. This helped all students feel engaged and, to varying degrees, a sense of accomplishment.

As I mention a number of times throughout my research journal, the learners engage in the tasks I ask of them far more readily when I offer points for doing so. This is more so than in many other classes I teach, and compared with their own performance in the first two weeks. Through colleague feedback, it became apparent that the group were asking for points in other lessons, displaying a preference for gaining points in return for work outside of my own sessions. This was a strong indicator that they were beginning to understand, firstly, that rewards (and praise) can be gained for doing good work, and secondly, that they have to actually do the work in order to gain the points.

Reflection on professional learning

Undertaking this process of recording my interventions has furthered my understanding of how significantly my pedagogical approach impacts learners, especially when dealing with entry level groups and in particular those with learning difficulties. I feel that my intervention addressed a number of issues that were reducing the effectiveness teachers were having with this group in particular. My journal informed this paper considerably, as there were a few moments that were not fully clear in their direction at the time, but which became clearer after some time had passed and I could reflect on them without the 'clouding' of judgement that sets in at the end of a busy session with the learners.

This intervention also helped my lesson planning, as I had a considerable amount of forward thinking to do in order to incorporate it. Throughout this unit, Kolb's (1984) experiential learning cycle helped me understand how to review and update my approach to the experiential aspect of the intervention, i.e. the worksheets, as did Gibbs' (1988) reflective 'learning by doing' model, particularly in respect of the notation and blog article creation tasks. For example, in week four I felt that the group would benefit from a longer worksheet, as they appeared to find the shorter, one-page ones quite easy. I designed a two-page worksheet to stretch and challenge the whole group, involving a practical task using the internet to find very detailed and specific information (see Appendix A, example 4). This worksheet was twice the length of the previous weeks', with some much harder questions on the first page and regular, straightforward recap questions on the second. The aim was to see how far the group were willing to go with my worksheets in order to gain points. The entire worksheet was worth 460 points if completed in full, yet only one learner made it past 400 points; the majority of the class made a half-hearted attempt at the second page, even though it was easier, shorter and would have gained them more points. From this I concluded that, as a whole, this group could not hold focus for more than ten to fifteen minutes on one task. This has informed my lesson plans and altered the scheme of work, in which I now state that lesson sections should not be longer than fifteen minutes when learners are doing an individual task.

Conclusion

Overall I feel this intervention worked as I intended. Its introduction reduced negativity in the classroom by a large degree, thereby 'allowing' more students to stay involved throughout the sessions, and it allowed the learners to access the information they needed in order to complete the course more readily. Students walking out of the sessions was previously a serious issue, causing slowdowns and stoppages in the flow of the lesson for everyone: I would have to follow the learner out of the room, leaving the learning support assistant to oversee the rest of the class and attempt to progress them with the tasks at hand, whilst I talked to the student one to one and brought them back into the room. This could take up to five minutes out of a session. It has now become a rare occurrence, which is a significant step for the group, and an unexpected and welcome side effect of the intervention.

I feel that using gamification with lower-level learner groups is worthwhile; when carefully applied to the right areas of a unit, or throughout a course, it will enhance and accelerate learning. It is untested in my practice at higher levels, although I feel that weekly use would remain appropriate. It could be tried out in a similar manner up to level three; beyond that, however, there is a risk that learners will feel patronised.

Using gamification in this manner gives teachers a range of opportunities to incorporate and embed English and mathematics development in any section of a lesson through the worksheets, the slides or simply the reward points; this allows the remainder of the session to be focused on the topics and learning outcomes that need the most attention. It also makes the learners more responsible for tracking their own progress: they always want to know what points they have. I expect to see more colleagues using gamification throughout my career in teaching, and I expect to use it more myself in future to incentivise learning. In light of the results of my intervention, and to encourage the use of gamification in future, I will revisit and refine my resources, materials and data collection spreadsheets and make them freely available as templates for others to use. They are simple to incorporate into new or existing schemes of work, lesson plans and slideshows alike, and would likely contribute positively to learners' results.

[5040 words excluding appendices, references & datasheets]

References:

Ackerman, J. M. (1972). Operant Conditioning Techniques for the Classroom Teacher. Illinois: Scott, Foresman and Company.

Ayers, H. et al. (1995). Perspectives on Behaviour: A Practical Guide to Interventions for Teachers (2nd edition). London: David Fulton Publishers.

Bowen, J., Jenson, W. & Clark, E. (2004). School-Based Interventions for Students with Behaviour Problems.

Gibbs, G. (1988). Learning by Doing: A Guide to Teaching and Learning Methods. Oxford: Oxford Further Education Unit.

Kolb, D. A. (1984). Experiential Learning: Experience as the Source of Learning and Development (Vol. 1). Englewood Cliffs, NJ: Prentice-Hall.

McDonnell, K. (2015). Behaviour: Putting Theory into Practice. http://bit.ly/1Of6jq7 (accessed 21 April 2016).

McGrath, N. & Bayerlein, L. (2013). Engaging online students through the gamification of learning materials: the present and the future. In H. Carter, M. Gosper and J. Hedberg (Eds.), Electric Dreams: Proceedings ascilite 2013 Sydney (pp. 573-577).

McLeod, S. A. (2013). Kolb: Learning Styles. Retrieved from www.simplypsychology.org/learning-kolb.html

Michie, S. et al. (2008). From theory to intervention: mapping theoretically derived behaviour determinants to behaviour change techniques. Applied Psychology: An International Review.

Nuttin, J. (1984). Motivation, Planning and Action: A Relational Theory of Behaviour Dynamics. Lawrence Erlbaum Associates.

O'Brien, T. (1998). Promoting Positive Behaviour. David Fulton Publishers.

Ofsted (2012). How Colleges Improve: A Review of Effective Practice, What Makes an Impact and Why.

Perry, D. (2009). David Perry on Game Design. Canada: Charles River Media.

Porter, L. (2006). Behaviour in Schools: Theory and Practice for Teachers (2nd edition). Berkshire: Open University Press.

Walker, M. (2010). DCSF Report: Special Educational Needs, An Analysis. Cheshire: Department for Education.

Zichermann, G. (2011). Gamification by Design. Toronto, Canada: O'Reilly.

Appendix A: Examples of +XP Generator elements in context

Figure 1: Examples of the '+XP Generator' gamification image elements.

Figure 2: Sample worksheet, used in week 3.

Figure 3: Sample slide, showing answers to the worksheet spelling/word recognition test.

Figure 4: Sample slide, showing an example of notation and topics.

Appendix B: Datasets generated from weekly sessions

Spreadsheets, graphs

Appendix C: Worksheet results spreadsheet, charts and learner examples

Example 2: Luke N’s worksheet (dashed lines)

Appendix D: Learner submissions