- Research article
- Open Access
Teacher professional development in the contexts of teaching English pronunciation
© The Author(s) 2017
- Received: 23 January 2017
- Accepted: 16 May 2017
- Published: 17 August 2017
In this study we focus on the effects of an intervention aimed at improving the English pronunciation skills of secondary school students in the Netherlands. To implement a new pedagogy successfully, it is essential to take into account how teachers learn and what motivates them to adapt and change their way of teaching. Teachers need time to test and adapt a teaching design to fit their classroom practice and their students’ needs. In this paper the main focus is on finding evidence of teacher professional development in teaching English pronunciation. Results show that teachers are extrinsically motivated to change their teaching behaviour and classroom practice after using a computer-assisted teaching tool to teach English pronunciation.
- Professional development
- Inquiry learning
- Teacher education
Teacher professional development
Research shows that teacher quality is significantly and positively correlated with student attainment and that it is the most important within-school factor explaining student performance. Its effects are much larger than those of school organisation, leadership or financial conditions (Hattie 2009, 2012; Meiers and Ingvarson 2005; Veen et al. 2010).
Laurillard (2012) and Mor and Mogilevsky (2013) see the teacher as the initiator who defines an educational challenge and conceptualises its solution. This, however, means that certain conditions at a teacher’s workplace must already be met before this first step can be taken. School leaders should already have facilitated teachers in such a way that they can devote time to thinking about an educational challenge they would like to address, without being driven by the school’s curriculum and short-term student achievements. For most secondary school teachers in the Netherlands, the day-to-day practice of teaching (and the curriculum) leaves no room for in-depth research and design initiatives.
In this study we explore the process of teacher professional development and the effect of implementing a new teaching design on the behaviour of teachers. This takes place in the context of teaching English pronunciation to secondary school pupils (whom we refer to as students from now on) and to students at schools for intermediate and higher vocational education (universities of applied sciences) in the Netherlands.
Context of the case study
Error types made by more than half of the students in more than 50% of the cases in which such an error could occur in the tests set to the students (the original table also listed, for each error type, the number of students making the error and the percentage of errors of that kind made by the subject group):
- voiceless th /θ/ pronounced as /d/
- no aspiration of /p, t, k/
- bad is bed: /æ/ pronounced as /e/
- /θ/ pronounced as /t/, /f/ or /s/
- /r/ + no linking /r/
- /əʊ/ pronounced as Dutch /o:/
Can we provide evidence of teacher professional development by involving teachers in an intervention phase, implementing a pre-structured teaching design?
We focus on evidence of teacher motivation, leading to signs of professional development and changes in teacher attitude concerning their classroom practice. We also seek to learn about the teachers’ opinions concerning the intervention tool and their ideas on how they would like to professionalise.
Test phase 1
Based on previous research (Hermans and Sloep 2015) a Computer Assisted Pronunciation Teaching Tool (CAPTT) was designed and implemented in a dedicated website, using Liferay Portal EE (http://www.liferay.com). It provided students with seven chapters: an introduction on the topic of English pronunciation and six chapters covering the six error type categories. The instructions were in Dutch to avoid any possible confusion. The only computer skill students needed was knowing how to click through to the next step. The recording tasks required a mobile phone or a tablet (voice recorder) and Wi-Fi in order to send the recordings to the teacher or the Principal Investigator (PI). Most phones and tablets have a voice recorder installed by default, and plenty of voice recorder apps can be downloaded for free. With a laptop, phone or tablet and an Internet connection the online module could be used.
Teachers were provided with a code and a password giving access to the teacher area of the website. The teacher area provided them with background information on phonetics and pronunciation, test materials, and keys to all of the assignments. Once the link to the website was opened, the teacher only had to instruct the students once by pointing out where to start. The instructions were all self-explanatory and the teacher’s role, once the module was running, was more that of a guide than of a teacher. With the background information on phonetics and pronunciation and the recording tasks, a teacher could give feedback to students individually or in group sessions. No further ICT skills were required.
The website used a straightforward format with text information and embedded videos. The contents focused only on the information needed to improve students’ English pronunciation in the six error type categories. Since the website required only limited and basic technical options, there was no need to charge schools for using the CAPTT. In the Netherlands, students without a mobile phone are the exception. Using the mobile phone or a tablet as a recording device is a cost-effective way to gather sound data. A student without a phone or tablet could always borrow a fellow student’s phone to record a task in class. As most schools in the Netherlands (and all the schools involved in this research) have free Wi-Fi, students incurred no costs in gathering their sound data, which were sent to the PI via email.
The website offered only the information necessary to address the six error types. The module provided seven lessons covering the error types most Dutch-speaking students of English struggled with. Every error type category was introduced with textual information supported by video files, followed by recording tasks and practice materials. There were no side tracks, and no unnecessary extra information or technical options were added beyond what had to be learnt about the selected pronunciation difficulties. All of the lessons set achievable goals for students.
Teachers were offered the basic background information on phonetics necessary to provide students with extra information and specific articulatory information concerning the six error type categories. Students were also provided with some background information on some of the phonetic symbols used, only to point out the difference between certain phonemes that are difficult to distinguish for Dutch speakers of English (e.g., the difference between voiceless th /θ/ and voiced th /ð/).
The students were asked to shadow (and record) the words and sentences after hearing them pronounced in the videos in class. The auditory speech was enhanced by visual aspects of articulation (Dias and Rosenblum 2016) to enhance phonetic convergence.
The recording tasks allowed teachers to give more individual student feedback. Students were able to analyse their own pronunciation by listening to their own recordings following a given evaluation format. Teachers were able to analyse the pre- and post-intervention test according to the same format the researchers were using and discuss this with their students.
The pre- and post-intervention tests provided data on student results before and after working with the CAPTT. Specific texts were designed in which all error type categories were represented with an equal frequency of occurrence. A strict protocol was handed out to three analysts, who analysed the sound files individually and then compared their results.
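The pre/post comparison described above can be sketched as follows. This is a minimal illustration only: the paper does not publish raw error counts or name the test statistic behind its significance claims, so the function, category names, and data below are hypothetical.

```python
# Hedged sketch of the pre/post comparison: each recording is scored per
# error type category, and pre-intervention error counts are compared with
# post-intervention counts for the same students (matched order).
# All names and numbers below are illustrative, not the study's data.

def mean_paired_difference(pre, post):
    """pre, post: {category: [error count per student]} with matched student order.
    Returns {category: mean(pre - post)}; positive means fewer errors afterwards."""
    return {
        cat: sum(p - q for p, q in zip(pre[cat], post[cat])) / len(pre[cat])
        for cat in pre
    }

# hypothetical counts for two of the six error type categories, four students
pre  = {"th_voiceless": [4, 3, 5, 2], "linking_r": [2, 2, 3, 1]}
post = {"th_voiceless": [1, 2, 2, 1], "linking_r": [2, 1, 3, 1]}
improvement = mean_paired_difference(pre, post)
# improvement["th_voiceless"] == 2.0 -> on average two fewer errors per student
```

A real analysis would follow this per-category scoring with a paired significance test; the sketch only shows the data shape such a test would consume.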
Teachers received a strict procedure on how to present the CAPTT. It prescribed the order in which the chapters on the six error type categories were to be presented. Every teacher received individual instructions for the order of dealing with the various chapters, to minimize the effect on the sampled group’s end result of particular error type categories being dealt with first or last. The teachers were also instructed to use only the CAPTT’s introduction to introduce the topic (English pronunciation) and to spend no more than five minutes introducing each chapter, using the background information on phonetics provided in the CAPTT’s teacher section. There was also a strict procedure for the pre- and post-intervention tests. The teacher’s role was one of facilitating (essentially running the CAPTT) and providing guidance only if necessary, not one of traditional instructing and teaching.
All of the CAPTT’s tasks were self-explanatory, so students could work at their own pace. These strict procedures aimed to minimize the effect of the teachers (i.e., their personal interests, attitudes towards the teaching topic, backgrounds, skills, etc.) on the results of the post-intervention test. Minimizing the teacher’s input enhances the validity of the CAPTT’s effect on the post-intervention results.
Subject group test phase 1
Teachers were invited via email to take part in a test phase of a newly developed approach for teaching English pronunciation. Initially, 17 teachers spread over five schools agreed to take part in an introductory meeting. Five teachers at a school for higher vocational training were aligned with the teacher training college, and taking part in the test phase was part of their teaching task. One of the teachers, who worked at the PI’s home institution, also taught at a school for intermediate vocational education and decided to use the new design there too.
During the introductory meeting the educational challenge was explained and teachers were informed about the newly designed teaching approach dealing with the six most frequent pronunciation errors made by Dutch speakers of English. Teachers were told that the first test phase would take place from September 2014 until December 2014.
In July 2014, 23 teachers received a letter with information about the setup of the test phase. The letter included the necessary access codes and information to use the online module (CAPTT) Do your pupils sound English? They also received a strict protocol explaining how to work with the module, to make sure all teachers followed the same procedures. The protocol included a procedure for a pre-intervention test, to establish the students’ skills before working with the module, and a post-intervention test after working with the module, to measure possible student improvements. Teachers had to schedule seven lessons of 50 min themselves in order to teach all the topics of the module. September 1st 2014 was indicated as the starting point of the test phase (2 weeks after the summer holidays) and December 30th 2014 as the end point.
On September 1st 2014 all teachers received an email to remind them of the start of the test phase and to wish them good luck with the online module. Twelve teachers either failed to respond to any further emails, did not use the module due to personal circumstances or a lack of teaching time, or failed to hand in the final post-intervention test results. In the end, eleven teachers taught the seven lessons of the CAPTT and handed in the pre- and post-intervention test results (a similar test covering the six error type categories before and after working with the CAPTT in class).
Details of the subject group teachers in test phase 1: qualifications ranged from studying for a bachelor to bachelor (studying for a master); teaching levels included higher vocational education and intermediate vocational education.
Data collection test phase 1
All the parents of students aged under 18 who were recorded in a pre- and post-intervention test (in test phases 1 and 2) received an email from their teachers or headmaster explaining the purpose of the research. The procedures were explained and it was pointed out that participation was voluntary. Participants older than 17 received a similar email from the PI, explaining the purpose of the research and the procedures. The results of the pre-intervention test of 70 students (70 recordings randomly chosen out of 162 available recordings) were set against the results of the post-intervention test. The analysis of the sound data was performed by the PI and two assistants. The inter-rater reliability was high at .871.
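The paper reports an inter-rater reliability of .871 without naming the statistic. As an illustration only, the sketch below computes one common choice for several raters giving numeric scores to the same recordings, Cronbach's alpha with raters as "items"; the rater scores are made-up data, and the actual statistic and data used in the study may differ.

```python
# Hedged sketch: Cronbach's alpha across three analysts' scores for the same
# recordings, as one plausible inter-rater reliability measure. Illustrative
# data only; not the study's recordings or its (unnamed) statistic.

def cronbach_alpha(ratings):
    """ratings: list of per-rater score lists, one score per recording."""
    k = len(ratings)            # number of raters
    n = len(ratings[0])         # number of recordings

    def var(xs):                # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = sum(var(r) for r in ratings)           # sum of per-rater variances
    totals = [sum(r[i] for r in ratings) for i in range(n)]  # total score per recording
    return k / (k - 1) * (1 - item_vars / var(totals))

# hypothetical error counts from three analysts for five recordings
rater_a = [3, 5, 2, 8, 4]
rater_b = [4, 5, 2, 7, 4]
rater_c = [3, 6, 3, 8, 5]
alpha = cronbach_alpha([rater_a, rater_b, rater_c])    # close to 1 when raters agree
```

In practice an intraclass correlation or a kappa statistic might equally have been used; the point of the sketch is only the mechanics of turning three analysts' parallel scores into a single agreement coefficient.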
Were you teaching pronunciation in your EFL-lessons before working with the CAPTT? (If so, how were you teaching pronunciation?)
Has your approach and attitude towards teaching pronunciation changed after working with the CAPTT?
How did you adjust the approach to fit your classroom practice?
What are your suggestions for improving the CAPTT so it would better meet your and your students’ needs?
What would be the best teacher professional development method for you as an EFL teacher?
The aim after test phase 1 was to find out about the teacher’s experience working with the CAPTT and to gain input on how to adapt and improve the intervention tool (CAPTT) so it would better meet the teachers’ and students’ needs in a classroom situation.
Data were analysed in two stages. In the first stage one researcher analysed the transcriptions. An analysis (Patton 2002) was performed, identifying interview fragments on the basis of categories derived from the research questions as sensitizing concepts (Strauss and Corbin 1990). Interviews were analysed for the teachers’ feedback on the teaching design, their personal standpoint on the necessity of the intervention, changes in their attitude towards teaching pronunciation (signs of professional development), and their willingness to take part in design inquiry activities themselves.
Test phase 2
Based on the outcomes (students’ results), feedback and teaching experiences gathered from teachers (and students) after the first test phase, the teaching design was reviewed and slightly adapted. It was then tested again to find out whether the teachers’ and students’/pupils’ feedback and ideas had improved the design, led to better (student) results, and showed proof of significant teacher professional development.
Subject group (teachers) test phase 2
In the second test phase the same procedure was followed as in test phase 1. All the teachers involved in the first test phase agreed to take part in the second test phase as well. Additionally, four new teachers teaching bachelor students took part and three left because their teaching contracts ended, leaving a total of six teachers. Nine additional secondary school teachers from three additional schools agreed to take part after an introductory presentation. September 1st 2015 was indicated as the starting point of the test phase and February 15th 2016 as the rounding-off date.
Details of the subject group teachers in test phase 2: qualifications included bachelor (studying for a master); teaching levels included higher vocational education.
Data collection test phase 2
The results of the pre-intervention test of 60 students (60 recordings randomly chosen out of 222 available recordings) were set against the results of the post-intervention test. For test phase 2 we followed the same procedures as for test phase 1 (see Section 2.3). Additionally, we adopted Fishbein’s Integrative Model of Behaviour Prediction (Fishbein and Yzer 2003; Kreijns et al. 2013) to gain data on dispositional variables, including attitude, self-efficacy and subjective norm, influencing teachers’ motivation to take part in an experimental teaching design using a CAPTT.
Test phase 1: interview
After the interview analysis, quotes were categorised. All teachers considered the module useful and were willing to use the teaching design again in the second test phase. One teacher, however, asked for a less strict procedure for working with the module, as she thought some parts of the teaching design were too easy for her students and she wanted to be able to skip tasks to speed up the teaching process. Another teacher disagreed with the statement that good pronunciation increases the credibility of the speaker; he was only interested in the module because of the intelligibility aspect. Almost all the teachers wanted the ability to evaluate their own students and missed test materials and repetition tasks in the module.
Categories as sensitizing concepts based on the research questions in test phase one
examples of quotes by teacher
feedback intervention design
The module fits my students’ needs. (Deejay)
Students love to sound British. (Lidy)
75% of my students showed improvement. (Jacky)
I do believe that with the right approach Dutch students can sound more native-like. (Jacky)
It is nice to have a limited set of pronunciation error types to start with. (Katrien)
I do consider the presented error types in the module to be the most important ones and see no need to put in more. (Boy)
This module is a good starting point for me to teach pronunciation. (Any)
I particularly like the structure of the teaching approach which makes it easy to teach. (Deejay)
need for intervention
I do believe accents influence the credibility of the speaker. (Jacky)
Good pronunciation is important to enhance a learners’ intelligibility. (Andrew)
The only time I was involved in aspects dealing with pronunciation, was when I was studying phonetics. (Lucas)
I never taught pronunciation as the course books do not offer any materials to do so. (Katrien)
My pronunciation deteriorated once I left the teacher training college and started teaching secondary school students. (Katrien)
Pronunciation and credibility go hand in hand. (Linda)
In secondary school I never taught pronunciation, though it’s obvious that it’s important. (Lucas)
change in attitude (TPD)
I started using the recording device for other EFL tasks too. (Deejay)
Working with the module was like a wake-up call for myself as a teacher. (Katrien)
Teaching pronunciation was never on my mind as I did not know how. Now I know where to
This module gives me a teaching approach for something I did not know how to teach before. (Jacky)
I’m more aware of my own pronunciation now because of working with the
I needed the module to get me started with teaching pronunciation, which I’ve always considered to be important. (Boy)
I need to invest more time in teaching pronunciation. (Deejay)
attitude towards professional development
I am only interested in teacher professional development projects if I can anticipate better results for my students. (Lucas)
I would not like to do research myself, even given time. (Suus)
I have no time for research myself. (Katrien)
I’d rather create lesson materials than being involved in research activities. (Jacky)
I’d rather professionalise on my own by reading research papers and books. (Andrew)
I’m only interested in professional development projects if what I learn can be put to practice immediately. (Boy)
I need to be convinced that my students will benefit from what I learn before getting involved. (Lidy)
All of the teachers were very eager to advise the researchers on how to improve and adjust the intervention tool to make it more suitable for their own classroom usage. One teacher did not have a good Internet connection in the classroom and wished for the online materials to be available on paper too. One teacher disliked the fact that the instruction language was Dutch, which made the module less suitable for her students following the bilingual course (most subjects taught in English); she wanted an English version of the module. Almost all the teachers missed the chance to evaluate their students themselves and give personal feedback on their students’ performances. They also wanted more repetition in the materials and more freedom in how to work with the materials; they wanted to break free from the strict procedures. They also wanted to inform all of their students about the progress in their achievements.
Test phase 2: interview
Categories as sensitizing concepts based on research questions in test phase two
examples of quotes
feedback intervention design
The six error type categories were very clear and useful. (Simon)
Working with the module was easy and did not take a lot of time. (Danny)
The CAPTT is well structured. (Deejay)
The videos were very enlightening. (Danny)
Well-chosen lesson materials. (Deejay)
need for intervention
I do think that a better pronunciation enhances the credibility of the speaker. (Bo)
I thought less of my English teachers when their English pronunciation wasn’t near native like. (Ella)
I do consider a good pronunciation very important for my pupils, especially for their follow-up studies at university. (Ella)
A good pronunciation leads to more credibility. (Sanna)
A good pronunciation is important and the speaker will be taken more seriously. (Simon)
I am well aware of the importance of pronunciation, but I realise I should have started working on it sooner. (Simon)
It’s also still important because of intelligibility. (Deejay)
I do consider teaching pronunciation to be very important and we need to devote more time to it. (Sergio)
A good pronunciation immediately determines what you think of the speaker’s intelligence. (Jacky)
The listener will automatically judge a speaker on his pronunciation or accent. (Andrew)
I think that near-native pronunciation is only affecting the speaker’s credibility when he’s speaking to a native speaker. (Lizz)
A good pronunciation comes with a good first impression. (Sanna)
change in attitude (TPD)
I pay more attention to the pronunciation mistakes students make in class. (Bo)
I correct students more when I notice them making the mistakes we discussed in the CAPTT. (Ella)
I teach more pronunciation than before working with the CAPTT. (Jacky)
I am more interested in the pronunciation mistakes my students make. (Sanna)
Even when I teach grammar or other things, when I speak with friends or watch TV I focus more on pronunciation. (Sergio)
Pronunciation is on my mind even if it’s not the focus of the EFL lesson. (Deejay)
I pay more attention to my own pronunciation now. (Andrew)
I’m more aware of my own pronunciation after working with the CAPTT (Bo)
I didn’t do a lot of pronunciation teaching, but now that I have these materials I feel I can actually teach pronunciation. (Simon)
Teaching English at this level doesn’t improve your own pronunciation. This CAPTT made me focus more on my own pronunciation. (Simon)
I learned a lot myself, using the CAPTT. (Sergio)
I became less accurate with my own pronunciation through the years, but teaching pronunciation helped me to focus on my own performance again. (Katrien)
Nine out of thirteen (n = 13) teachers indicated they had altered their classroom practice, devoting more time to teaching pronunciation and feeling better equipped to do so than before taking part in the experiment. Ten out of 13 teachers commented on the intervention being important. Eight teachers considered a good pronunciation to be related to greater credibility of the speaker. Only three teachers commented on the quality of the CAPTT. There were two teachers, Andrew and Deejay, with a Dutch/Moroccan background. Andrew was born in the Netherlands; Deejay was born in Morocco but has been living in the Netherlands for 20 years. Andrew spoke negatively about the link between a good pronunciation and credibility, and argued that native speakers should not judge on the basis of accent but on the basis of content. He was only interested in improving students’ intelligibility and did not want his students to lose part of their identity while striving for a near-native English pronunciation. Deejay considered both credibility and intelligibility to be important, but was also more interested in the intelligibility part. There were two native English-speaking teachers, who both considered good pronunciation to positively influence the speaker’s credibility.
Test phase 2: Integrative Model of Behaviour Prediction
In order to measure teachers’ motivation to use the CAPTT we used a measure derived from the Perceived Locus of Causality measure (PLOC) of Ryan and Connell (1989); we refer to this as the adapted PLOC measure, or a-PLOC for short. This measure assesses the different types of motivation that regulate behaviour as defined by the Self-Determination Theory (SDT) of Ryan et al. (2000). In short, SDT distinguishes between intrinsic and extrinsic motivation. Intrinsic motivation concerns the enjoyment and challenge that engaging with an activity or object (i.e., the CAPTT) provides. Extrinsic motivation encompasses four types of motivation that vary in their degree of autonomy. From highest to lowest autonomy these four types are: integrated, identified, introjected, and external motivation. Integrated motivation means that the engagement is in complete harmony with the self. Identified motivation means that the engagement is seen as important or beneficial. Introjected motivation refers to engagement driven by the feelings of guilt or shame that would arise from not engaging. Finally, external motivation means that rewarding and/or coercive powers force someone to engage. Next to intrinsic and extrinsic motivation there is amotivation, which in essence means that one engages with an activity or object but without any intention.
intrinsic - affective; Cronbach’s alpha = .95
intrinsic - potency; Cronbach’s alpha = .76
integrated; Cronbach’s alpha = .95
suits my preferred way to learn new skills
is compatible with how I usually want to improve my skills
fits my teaching style
fits my beliefs about how students learn
identified; Cronbach’s alpha = .85
is useful to me
is meaningful to me
makes me confident
is productive to me
will help me
introjected; Cronbach’s alpha = .93
I want others (e.g., my colleagues, my students) to think I am a smart person
I want others (e.g., my colleagues, my students) to be satisfied with me
I want to give the impression that I am doing the sensible thing
otherwise I would feel ashamed of myself
otherwise I would feel guilty of not doing so
otherwise I would feel bad about myself
extrinsic; Cronbach’s alpha = .78
will give me credits for my professional development targets
will cause me not to look bad in my professional environment
will cause some people not to become angry with me
will avoid poor future perspectives
amotivation; Cronbach’s alpha = .88
but I really have no desire to do so
although I think it is a waste of time
but personally I give no priority to that
although I would prefer to do other things
but actually I don’t see the point of it
but in fact I have no energy for it
Behavioural intention measure
intention - willingness; Cronbach’s alpha = .94
I intend to use the CAPTT in my teaching in the future
I am pretty sure I’m going to use the CAPTT in my teaching
I think I’m going to use the CAPTT in my teaching
I am willing to use the CAPTT in my teaching
I’m definitely going to apply the CAPTT in my teaching
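Scoring subscales like the ones above is usually a matter of averaging each subscale's item responses. The sketch below shows that mechanic; the item keys, the 1–5 Likert range, and the answers are all hypothetical, since the paper lists the item wordings but not the response format or raw data.

```python
# Hedged sketch of one plausible a-PLOC scoring scheme: each subscale score is
# the mean of its item responses on an assumed 1-5 Likert scale. Item keys and
# the example answers are illustrative, not the study's instrument or data.

SUBSCALES = {
    "identified": ["useful", "meaningful", "confident", "productive", "helpful"],
    "intention":  ["intend", "pretty_sure", "think_going", "willing", "definitely"],
}

def subscale_scores(responses, subscales=SUBSCALES):
    """responses: dict mapping item key -> Likert rating (e.g., 1-5).
    Returns {subscale name: mean of its item ratings}."""
    return {
        name: sum(responses[item] for item in items) / len(items)
        for name, items in subscales.items()
    }

# one hypothetical teacher's answers
answers = {
    "useful": 4, "meaningful": 5, "confident": 3, "productive": 4, "helpful": 4,
    "intend": 5, "pretty_sure": 4, "think_going": 4, "willing": 5, "definitely": 4,
}
scores = subscale_scores(answers)
# scores == {"identified": 4.0, "intention": 4.4}
```

The reported Cronbach's alpha values (.76 to .95) would then be computed per subscale over all respondents' item responses, checking that the items within each subscale hang together before the subscale means are interpreted.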
Discussion and further research
The literature shows that there is limited evidence for the link between teacher professional development and student learning outcomes (McRae et al. 2000; Desimone and Le Floch 2004; Supovitz 2001; Cohen and Hill 2000; Thompson 2003). In many studies on professional development programmes, teacher learning, the teacher’s active role in the learning process (Little 2006; Verloop and Kessels 2006) and the conditions for professional development (Supovitz 2001; Guskey and Sparks 2002; Cohen and Hill 2000) are described as key elements for professional development programmes to be successful.
In our study we were more interested in the simplicity of the intervention tool and its effect on teacher behaviour and student outcomes, paying less attention to the teachers’ initial beliefs, expectations, content knowledge, experience or learning conditions. For us, the teachers’ motivation to use the tool and follow our protocol was the starting point, in the hope that teachers would be motivated to copy the strategies provided by the tool even when teaching other aspects of the English language.
Ultimately, the goal was not the use of the intervention tool itself but increasing the teachers’ motivation and improving their skills to teach English pronunciation following certain strategies. The initial, improved student outcomes were supposed to work as an incentive for a teacher to change or adapt his or her teaching behaviour. So the initial evidence of better student outcomes does not show the link between teacher professional development and student learning outcomes, but rather the link between the use of the intervention tool (CAPTT) and better student outcomes, with the teacher having had more of a facilitating role in test phase 1. However, teachers were asked to follow a strict procedure, asking them to briefly introduce each topic. For that purpose the intervention tool provided the teacher with a teacher’s guide, aiming to build on the teacher’s content knowledge (what to teach) but also on their pedagogical content knowledge (how do students learn this best), which in many studies is considered essential for improving and changing teachers’ teaching practice and improving student learning outcomes (Van Driel and Berry 2012; Yoon et al. 2007; Borko 2004).
About 70% of the teachers claimed to have adjusted their pedagogy concerning teaching pronunciation, to feel more competent to teach pronunciation, or to focus more on their own pronunciation. For five of the six error type categories the post-intervention test results showed a significant improvement in student achievement. With the teachers’ statements about personal change and the improved student achievement, we argue that we have provided evidence of teacher professional development by involving teachers in an intervention phase in which a pre-structured teaching design is implemented and tested.
A crucial element of teacher participation in a cycle of evidence-based research is their belief in the beneficial outcome of the new approach for their students. The fact that they do not define the educational challenge and design the intervention tool themselves from the start does not fit the ideal situation for inquiry-based learning (Mor 2010; Anastopoulou et al. 2012). But it does involve teachers, who state they do not have the time to invest in full research activities, in the process of implementing a new design and testing it, and consequently in the second cycle of “devising new practices, plans of activity, resources and tools aimed at achieving particular educational aims in a given situation” (Mor and Craft 2012).
In our case study teachers showed various signs of professional development. For one, all teachers claimed they were able to teach pronunciation in a structured way for the first time. For some teachers it meant teaching pronunciation differently from how they did it in the past (incidentally correcting individual students on the spot versus teaching pronunciation in a structured way, reaching more students with the same difficulties in a classroom setting). For other teachers it meant teaching English pronunciation for the first time and feeling safe doing so because of the set-up of the teaching design. Four teachers claimed the intervention tool refreshed their own theoretical knowledge. Five teachers stated that they were more aware of their own pronunciation after rounding off the test phase.
Another aspect of the teachers’ professional development in this case study was the teachers’ ability to reflect on the practical implementation of the new teaching design. Teachers either adapted the materials to their own classroom needs or advised the designers on how to perfect the materials for classroom usage. The combination of initial research, designing the intervention tool, and testing the tool with the help of practitioners in the field, who then provide the necessary feedback based on practical experience to improve the teaching approach, allows for a solid start of the second cycle of implementing and testing. Teachers were already aware of the students’ progress, which increased the motivation for implementing and testing the new design even further.
In this study student achievements did not significantly improve after the CAPTT was adapted for test phase two on the basis of teacher input. The adaptations mostly concerned improving time-efficiency for teachers: classroom tasks were flipped so that students could do most of the work at home and fewer classroom teaching hours were needed. Nevertheless, the teachers who took part in both test phases indicated that they would rather work with the adapted CAPTT, as it was better structured, offered more practice materials and was more time-efficient. So although student results in test phases one and two did not differ significantly, teacher satisfaction and motivation to use the CAPTT increased.
Involving teachers in a model of design inquiry (Mor and Mogilevsky 2013), or expecting them to adopt a design-science attitude towards their practice (Laurillard 2012) and to take responsibility for identifying an educational challenge (Mor and Craft 2012), can only succeed if the teachers involved have the right motivation, a belief in the need to change, a positive attitude towards research activities, the support of their superiors, enough time to invest, research skills and peers to consult. A situation that meets all of these requirements is hard to find in the Dutch educational system: teachers in secondary education teach an average of 25 h and are bound to strict protocols leading to final exams. The claim that there is not enough time for research activities is valid for most teachers. Moreover, researchers and designers of a new pedagogy or teaching tool often picture a perfectly motivated teacher whose only goal is to improve his students’ achievements. In our study we were amazed at the number of teachers who initially indicated they were interested in taking part and received all the information and monthly mails, but in the end failed to respond to any of the PI’s mails asking for their results, even though the first test phase had successfully shown better student achievements.
We believe that when an educational challenge exceeds the individual teacher’s classroom practice, it is sometimes wiser to define the educational challenge top-down (What does the school need? Is there a nation-wide priority?), leave the design to researchers and educational experts, involve teachers from the testing phase of the experimental intervention onwards, and use their expertise and classroom experience to adjust the intervention tool or pedagogy.
Researchers, and those responsible for education in general, who sometimes have a better overview of existing educational challenges, should always take into account that perfect teaching conditions are never met and that a significant number of teachers are unable or unwilling to define educational challenges, design a new pedagogy and get involved in a full cycle of design inquiry. Involving smaller groups of teachers who are able to test and adjust a new pedagogy might lead to a well-tested, classroom-adjusted pedagogy, which could influence a broader network of teachers and even reach those who lack the motivation to get involved in a cycle of design inquiry. The stronger the new pedagogy and the easier it is to implement, the greater the chance of reaching teachers who find it hard to change their classroom practice.
Although the outcome of this intervention demonstrates improved teacher skills and student achievement, we realise that for further studies it is equally important to provide information on the sustainability of the teacher professional development (TPD) intervention. What happens when active participation in the intervention programme is no longer required, the researchers and programme leaders no longer visit the workplace, and pre- and post-intervention data on student achievement and teacher skills are no longer requested? It is important to find out how the effects of TPD interventions become embedded in a teacher’s day-to-day classroom practice or school organisation. We need evidence of the sustainability of any professional development programme and should focus on which contexts for promoting professional development influence the sustainability of a TPD project the most. For this purpose we plan to revisit all the teachers who were involved in the TPD intervention programme one year after the final post-intervention test in test phase two, in order to investigate which elements of the intervention are still present in their day-to-day teaching practice. For this we will not only depend on the teachers’ narratives, gathered through interviews and questionnaires, but also on students’ classroom experience of practising English pronunciation.
FH carried out the studies and drafted the manuscript. PS participated in its design and performed statistical analysis. KK participated in the design and performed statistical analysis on teacher motivation. All authors read and approved the final manuscript.
The authors declare that they have no competing interests.
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
- Anastopoulou, S., Sharples, M., Ainsworth, S., Crook, C., O’Malley, C., & Wright, M. (2012). Creating personal meaning through technology-supported science inquiry learning across formal and informal settings. International Journal of Science Education, 34(2), 251–273.
- Borko, H. (2004). Professional development and teacher learning: mapping the terrain. Educational Researcher, 33, 3–15.
- Cohen, D. K., & Hill, H. C. (2000). Instructional policy and classroom performance: The mathematics reform in California. Teachers College Record, 102(2), 294–343.
- Desimone, L. M., & Le Floch, K. C. (2004). Are we asking the right questions? Using cognitive interviews to improve surveys in education research. Educational Evaluation and Policy Analysis, 26(1), 1–22.
- Dias, J. W., & Rosenblum, L. D. (2016). Visibility of speech articulation enhances auditory phonetic convergence. Attention, Perception & Psychophysics, 78, 317–333.
- Fishbein, M., & Yzer, M. C. (2003). Using theory to design effective health behavior interventions. Communication Theory, 13(2), 164–183.
- Fishbein, M., & Ajzen, I. (2010). Predicting and changing behaviour: The reasoned action approach. New York: Psychology Press.
- Guskey, T., & Sparks, D. (2002). Linking professional development to improvements in student learning. Paper presented at the Annual Meeting of the American Educational Research Association, New Orleans, LA, April 1–5.
- Hattie, J. (2009). The black box of tertiary assessment: An impending revolution. In Tertiary assessment & higher education student outcomes: Policy, practice & research (pp. 259–275).
- Hattie, J. (2012). Visible learning for teachers. British Journal of Educational Technology, 43(4), E134–E136.
- Hermans, F., & Sloep, P. B. (2015). Teaching the Dutch how to pronounce English. International Journal of Language Studies, 9(4), 55–80.
- Kreijns, K., Vermeulen, M., Kirschner, P. A., van Buuren, H., & van Acker, F. (2013). Adopting the Integrative Model of Behaviour Prediction to explain teachers’ willingness to use ICT: A perspective for research on teachers’ ICT usage in pedagogical practices. Technology, Pedagogy and Education, 22(1), 55–71.
- Laurillard, D. (2012). Teaching as a design science: Building pedagogical patterns for learning and technology. New York: Routledge.
- Little, J. W. (2006). Professional community and professional development in the learning-centered school. Washington, DC: NEA.
- McRae, D., Ainsworth, G., Groves, R., Rowland, M., & Zbar, V. (2000). PD 2000 Australia: A national mapping of school teacher professional development. Canberra: Commonwealth Department of Education, Training and Youth Affairs.
- Meiers, M., & Ingvarson, L. (2005). Investigating the links between teacher professional development and student learning outcomes. Volumes 1 and 2. Science, 1.
- Mor, Y., & Craft, B. (2012). Learning design: reflections on a snapshot of the current landscape. Research in Learning Technology.
- Mor, Y. (2010). Embedding design patterns in a methodology for a design science of e-Learning. In Problems investigations of e-Learning patterns: Context factors problems and solutions (pp. 107–134).
- Mor, Y., & Mogilevsky, O. (2013). The learning design studio: Collaborative design inquiry as teachers’ professional development. Research in Learning Technology, 21.
- Patton, M. Q. (2002). Two decades of developments in qualitative inquiry: a personal, experiential perspective. Qualitative Social Work, 1(3), 261–283.
- Ryan, R. M., & Connell, J. P. (1989). Perceived locus of causality and internalization: Examining reasons for acting in two domains. Journal of Personality and Social Psychology, 57(5), 749–761.
- Ryan, R. M., & Deci, E. L. (2000). Intrinsic and extrinsic motivations: Classic definitions and new directions. Contemporary Educational Psychology, 25(1), 54–67.
- Strauss, A., & Corbin, J. (1990). Basics of qualitative research: Grounded theory procedures and techniques (pp. 32–42). Newbury Park, CA: Sage.
- Supovitz, J. A. (2001). Translating teaching practice into improved student achievement. Yearbook (National Society for the Study of Education), 2, 81–98.
- Thompson, C. L. (2003). Improving student performance through professional development for teachers. NC: Education Research Council.
- Van Driel, J. H., & Berry, A. (2012). Teacher professional development focusing on pedagogical content knowledge. Educational Researcher, 41(1), 26–28.
- Veen, K. van, Zwart, R., Meirink, J., & Verloop, N. (2010). Professionele ontwikkeling van leraren: een reviewstudie naar effectieve kenmerken van professionaliseringsinterventies van leraren [Professional development of teachers: a review study of effective characteristics of teacher professionalisation interventions]. Reviewstudie, ICLON (December), 2/150.
- Verloop, N., & Kessels, J. W. M. (2006). Opleidingskunde: Ontwikkelingen rond het opleiden en leren van professionals in onderwijs en bedrijfsleven [Educational science: developments in the training and learning of professionals in education and business]. Pedagogische Studiën, 83, 301–321.
- Yoon, K. S., Garet, M., Birman, B., & Jacobson, R. (2007). Examining the effects of mathematics and science professional development on teachers’ instructional practice: Using professional development activity log. Washington, DC: Council of …, 1–59.