
Teacher professional development in the contexts of teaching English pronunciation

Abstract

In this study we focus on the effects of an intervention aimed at improving the English pronunciation skills of secondary school students in the Netherlands. To implement a new pedagogy successfully, it is essential to take into account how teachers learn and what motivates them to adapt and change their way of teaching. Teachers need time to test and adapt a teaching design so that it fits their classroom practice and their students' needs. In this paper the main focus is on finding evidence of teacher professional development in teaching English pronunciation. Results show that teachers are extrinsically motivated to change their teaching behaviour and classroom practice after using a computer-assisted teaching tool to teach English pronunciation.

Introduction

Teacher professional development

Research shows that teacher quality is significantly and positively correlated with student attainment and that it is the most important within-school factor explaining student performance. Its effects are much larger than those of school organisation, leadership or financial conditions (Hattie 2009, 2012; Meiers and Ingvarson 2005; Veen et al. 2010).

Hattie (ibid.) indicates that six sources influence a student's achievement: 50% is what the student brings to the table himself. Home situations, schools, peer influences and principals together account for about 20%, leaving a substantial 30% to teachers. Investing in teachers is therefore the most important external key to influencing students' achievements (Fig. 1).

Fig. 1 Percentage of achievement variance

Laurillard (2012) and Mor and Mogilevsky (2013) see the teacher as the initiator who defines an educational challenge and conceptualises its solution. This, however, means that certain conditions at a teacher's workplace must already be met before this first step can be taken. School leaders should already have facilitated teachers in such a way that they can devote time to thinking about an educational challenge they would like to address, without being driven by the school's curriculum and short-term student achievement targets. For most secondary school teachers in the Netherlands, the day-to-day practice of teaching (and the curriculum) leaves no room for in-depth research and design initiatives.

In this study we explore the process of teacher professional development and the effect of implementing a new teaching design on the behaviour of teachers. This takes place in the context of teaching English pronunciation to secondary school pupils (whom we refer to as students from now on) and to students at schools for intermediate and higher vocational education (universities of applied sciences) in the Netherlands.

Context of the case study

In a previously published article (Hermans & Sloep, 2015), sound data provided by secondary school students and bachelor students at a university of applied sciences were analysed to determine the error types that occur most frequently in the English pronunciation of Dutch pupils and students. Six error types (Table 1) were present in the pronunciation of more than 50% of the target group in more than 50% of the cases in which a mistake in those error type categories could be made.

Table 1 Error types made by more than half of the students in more than 50% of the cases in which such a mistake could be made in the tests set to the students

In this study we aim to gather data on how teachers learn and change their classroom practice, how they implement a new pedagogy, and what motivates them to take part in experimenting with and implementing a new teaching tool. The context of the experiment is English pronunciation teaching in EFL lessons in the Netherlands. A pedagogical approach was designed and EFL teachers were asked to take part in the test phase of the design. They were asked to test the design in their classrooms and to suggest alterations that would adjust the materials to their classroom situation. Our research question is:

Can we provide evidence of teacher professional development by involving teachers in an intervention phase, implementing a pre-structured teaching design?

We focus on evidence of teacher motivation, leading to signs of professional development and changes in teacher attitude concerning their classroom practice. We also seek to learn about the teachers’ opinions concerning the intervention tool and their ideas on how they would like to professionalise.

Method

Test phase 1

Based on previous research (Hermans & Sloep, 2015), a Computer Assisted Pronunciation Teaching Tool (CAPTT) was designed and implemented in a dedicated website, using Liferay Portal EE (http://www.liferay.com). It provided students with seven chapters: one introduction to the topic of English pronunciation and six chapters covering the six error type categories. The instructions were in Dutch to avoid any possible confusion. The only computer skill students needed was knowing how to click through to the next step. The recording tasks required a mobile phone or a tablet (voice recorder) and Wi-Fi in order to send the recordings to the teacher or the Principal Investigator (PI). Most phones and tablets have a voice recorder installed as standard, and plenty of voice recorder apps can be downloaded for free. With a laptop, phone or tablet and an Internet connection the online module could be used.

Teachers were provided with a code and a password to access the teacher area of the website. The teacher area provided them with background information on phonetics and pronunciation, test materials and keys to all of the assignments. Once the link to the website was opened, the teacher only had to instruct the students once by pointing out where to start. The instructions were all self-explanatory and the teacher's role, once the module was running, was more that of a guide than of an instructor. With the background information on phonetics and pronunciation and the recording tasks, a teacher could give feedback to students individually or in group sessions. No further ICT skills were required.

The website made use of a straightforward format with text information and embedded videos. The contents focused only on the information needed to improve students' English pronunciation in the six error type categories. Since the website required only limited and basic technical options, there was no need to charge schools for using the CAPTT. In the Netherlands, students without a mobile phone are the exception. Using a mobile phone or a tablet as a recording device is a cost-effective way to gather sound data. A student without a phone or tablet could always borrow a fellow student's phone to record a task in class. As most schools in the Netherlands (and all the schools involved in this research) have free Wi-Fi, students incurred no costs in gathering their sound data, which were sent to the PI via email.

The website offered only the information necessary to address the six error types. The module provided seven lessons covering the error types most Dutch speakers of English struggle with. Every error type category was introduced with textual information supported by video files, followed by recording tasks and practice materials. There were no side tracks, and no unnecessary extra information or technical options were added to what had to be learnt about the selected pronunciation difficulties. All of the lessons set achievable goals for students.

Treatment procedure

Teachers were offered the basic background information on phonetics necessary to provide students with extra information and specific articulatory information concerning the six error type categories. Students were also provided with some background information on some of the phonetic symbols used, only to point out the difference between certain phonemes that are difficult for Dutch speakers of English to distinguish (e.g., the difference between voiceless th /θ/ and voiced th /ð/).

The students were asked to shadow (and record) the words and sentences after hearing them pronounced in the videos in class. The auditory speech was supplemented with visual aspects of articulation (Dias and Rosenblum 2016) to enhance phonetic convergence.

The recording tasks allowed teachers to give students more individual feedback. Students were able to analyse their own pronunciation by listening to their own recordings, following a given evaluation format. Teachers were able to analyse the pre- and post-intervention tests according to the same format the researchers were using and discuss the results with their students.

The pre- and post-intervention tests provided data on student results before and after working with the CAPTT. Specific texts were designed in which all error type categories were represented with an equal frequency of occurrence. A strict protocol was handed out to three analysts, who analysed the sound files independently and then compared their results.

Teachers received a strict procedure on how to present the CAPTT. The procedure prescribed the order in which to present the chapters concerning the six error type categories. Each teacher received an individual instruction specifying the order in which to deal with the various chapters, in order to minimize the effect of a given error type category being dealt with first or last on the end result of the sampled group. The teachers were also instructed to use only the CAPTT's introduction to introduce the topic (English pronunciation) and to spend no more than five minutes introducing each chapter, using the background information on phonetics provided in the CAPTT's teacher section. For the pre- and post-intervention tests there was a strict procedure as well. The teacher's role was one of facilitating (essentially running the CAPTT) and providing guidance only if necessary, rather than one of traditional instructing and teaching.
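The article does not specify how the individual chapter orders were generated; the sketch below illustrates one common way to counterbalance them, by giving each teacher a cyclic rotation of the chapter list (the rows of a Latin square) so that each error type category appears equally often in the first and last position. The chapter labels and the helper `latin_square_orders` are hypothetical placeholders, not part of the CAPTT.

```python
# A minimal counterbalancing sketch (not the authors' documented procedure):
# each teacher gets a cyclic rotation of the chapter list, so every error type
# category appears equally often in the first and last position across teachers.

ERROR_TYPE_CHAPTERS = [           # hypothetical labels for the six categories
    "chapter_1", "chapter_2", "chapter_3",
    "chapter_4", "chapter_5", "chapter_6",
]

def latin_square_orders(chapters, n_teachers):
    """Yield one chapter order per teacher by cyclic rotation (Latin-square rows)."""
    k = len(chapters)
    for t in range(n_teachers):
        shift = t % k
        yield chapters[shift:] + chapters[:shift]

for teacher, order in enumerate(latin_square_orders(ERROR_TYPE_CHAPTERS, 11), start=1):
    print(f"Teacher {teacher:2d}: " + " -> ".join(order))
```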

All of the CAPTT's tasks were self-explanatory, so students could work at their own pace. With these strict procedures the aim was to minimize the effect of the teachers (i.e., their personal interests, attitudes towards the teaching topic, backgrounds, skills, etc.) on the results of the post-intervention test. Minimizing the teacher's input enhances the validity of the CAPTT's effect on the post-intervention results.

Subject group test phase 1

Teachers were invited to participate via email and were asked to take part in a test phase of a newly developed approach for teaching English pronunciation. Initially 17 teachers, spread over five schools, agreed to take part in an introductory meeting. Five teachers at a school for higher vocational training were affiliated with the teacher training college, and taking part in the test phase was part of their teaching duties. One of the teachers, working at the PI's home institution, was also teaching at a school for intermediate vocational education and decided to use the new design there too.

During the introductory meeting the educational challenge was explained and teachers were informed about the newly designed teaching approach dealing with the six most frequently occurring pronunciation mistakes made by Dutch speakers of English. Teachers were told that the first test phase would take place from September 2014 until December 2015.

In July 2014, 23 teachers received a letter with information about the setup of the test phase. The mail included the access codes and information needed to use the online module (CAPTT) Do your pupils sound English? They also received a strict protocol explaining how to work with the module, in order to make sure all teachers followed the same procedures. The protocol included a procedure for a pre-intervention test, to establish the students' skills before working with the module, and a post-intervention test after working with the module, in order to measure possible student improvement. Teachers had to schedule seven lessons of 50 min themselves in order to teach all the topics of the module. September 1st 2014 was indicated as the starting point of the test phase (2 weeks after the summer holidays) and December 30th 2015 as the end point.

On September 1st 2014 all teachers received an email reminding them of the start of the test phase and wishing them good luck with the online module. Twelve teachers either failed to respond to any further emails, did not use the module due to personal circumstances or a lack of teaching time, or failed to hand in the final post-intervention test results. In the end eleven teachers taught the seven lessons of the CAPTT and handed in the pre- and post-intervention test results (similar tests covering the six error type categories, taken before and after working with the CAPTT in class).

The EFL teachers (n = 11) followed in this case study were not actively involved in teacher professional development activities in this field, and there was no ongoing collaboration of teachers working on an educational challenge to improve students' achievements concerning English pronunciation. None of the teachers were actively involved in research activities at the time. Table 2 provides information on each teacher's age, teaching degree, years of experience, the school type he or she teaches at, and first language background in test phase one. To protect their privacy, names are fictional.

Table 2 Details of the subject group teachers in test phase 1

Data collection test phase 1

All parents of students aged under 18 who were recorded in a pre- and post-intervention test (in test phases 1 and 2) received an email from their teachers or headmaster explaining the purpose of the research. The procedures were explained and it was pointed out that participation was voluntary. Participants older than 17 received a similar email from the PI, explaining the purpose of the research and the procedures. The results of the pre-intervention test of 70 students (70 recordings randomly chosen out of 162 available recordings) were set against the results of the post-intervention test. The analysis of the sound data was done by the PI and two assistants. The inter-rater reliability was high at .871.
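The reported reliability of .871 is given without naming the statistic, so the sketch below is only illustrative: it computes the mean pairwise Pearson correlation among the three analysts' scores on invented data. The actual study may well have used a different agreement measure (e.g., an intraclass correlation coefficient).

```python
# A minimal sketch of one way to quantify agreement among three analysts.
# The ratings matrix is hypothetical; the paper does not publish the raw scores
# or the exact reliability statistic used.
import numpy as np

# ratings[i, j]: score that rater i assigned to recording j (invented data)
ratings = np.array([
    [3, 5, 2, 4, 4, 1],
    [3, 4, 2, 5, 4, 1],
    [2, 5, 3, 4, 5, 1],
], dtype=float)

pairs = [(0, 1), (0, 2), (1, 2)]
pairwise_r = [np.corrcoef(ratings[a], ratings[b])[0, 1] for a, b in pairs]
print(f"mean pairwise correlation: {np.mean(pairwise_r):.3f}")
```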

All the teachers who took part in the first test phase were invited to take part in a semi-structured interview. One teacher could not find the time for an interview. Nine teachers were interviewed (individually, in a semi-structured interview) at their school and one teacher agreed to be interviewed at his home. The interview questions allowed the teachers to narrate their thoughts on the teaching topic (teaching English pronunciation), their former teaching activities concerning English pronunciation, their use of the new teaching design and their own professional development. For this paper we focussed on the topic of teacher professional development. The interview questions directly related to Teacher Professional Development (TPD) were:

  1. Were you teaching pronunciation in your EFL lessons before working with the CAPTT? (If so, how were you teaching pronunciation?)

  2. Has your approach and attitude towards teaching pronunciation changed after working with the CAPTT?

  3. How did you adjust the approach to fit your classroom practice?

  4. What are your suggestions for improving the CAPTT so it would better meet your and your students' needs?

  5. What would be the best teacher professional development method for you as an EFL teacher?

The aim after test phase 1 was to find out about the teachers' experiences of working with the CAPTT and to gain input on how to adapt and improve the intervention tool (CAPTT) so that it would better meet the teachers' and students' needs in a classroom situation.

Data were analysed in two stages. In the first stage one researcher analysed the transcriptions. An analysis (Patton, 2002) was performed identifying interview fragments on the basis of categories derived from the research questions as sensitizing concepts (Strauss & Corbin, 1990). Interviews were analysed for the teachers' feedback on the teaching design, their personal standpoint towards the necessity of the intervention, changes in their attitude towards teaching pronunciation (signs of professional development), and their willingness to take part in design inquiry activities themselves.

Test phase 2

Based on the outcomes (students' results), feedback and teaching experiences gathered from teachers (and students) after the first test phase, the teaching design was reviewed and slightly adapted. It was then tested again to find out whether the teachers' and students' feedback and ideas had improved the design, led to better student results, and showed evidence of significant teacher professional development.

Subject group (teachers) test phase 2

In the second test phase the same procedure was followed as in test phase 1. All the teachers involved in the first test phase agreed to take part in the second test phase as well. Among the teachers teaching bachelor students, four new teachers took part and three left because their teaching contracts ended, leaving a total of six. Nine additional secondary school teachers from three additional schools agreed to take part after an introductory presentation. September 1st 2015 was indicated as the starting point of the test phase and February 15th 2016 as the rounding-off date.

All six teachers teaching bachelor students sent in the post-intervention test before the deadline. Of the 15 secondary school teachers, seven met the deadline. Two teachers never responded to any of the PI's mails before or after the deadline. The remaining teachers all answered the PI's mail asking for the post-intervention recordings, stating that due to circumstances they had not found the opportunity to round off, or even present, the CAPTT. In the end, four new teachers teaching bachelor students and four new secondary school teachers taught the seven lessons and handed in the pre- and post-intervention test results of the second test phase. Two teachers teaching bachelor students and three secondary school teachers who had all taken part in the first test phase also taught the seven lessons and handed in the pre- and post-intervention test results of the second test phase. In total, 13 teachers completed working with the module as planned. Table 3 provides information on each teacher's age, teaching degree, years of experience, the school type he or she teaches at, and first language (L1) background in test phase two. To protect their privacy, names are fictional.

Table 3 Details of the subject group teachers in test phase 2

Data collection test phase 2

The results of the pre-intervention test of 60 students (60 recordings randomly chosen out of 222 available recordings) were set against the results of the post-intervention test. For test phase 2 we followed the same procedures as for test phase 1 (see 'Data collection test phase 1'). Additionally, we adopted Fishbein's Integrative Model of Behaviour Prediction (Fishbein and Yzer 2003; Kreijns et al. 2013) in order to gain data on dispositional variables, including attitude, self-efficacy and subjective norm, that influence teachers' motivation to take part in an experimental teaching design using a CAPTT.
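As an illustration of this sampling and comparison step, the sketch below draws a random subsample of recordings and runs a paired test on pre- versus post-intervention error counts. The data structures, the error counts and the choice of the Wilcoxon signed-rank test are all assumptions; the article does not state which test was used.

```python
# A minimal sketch, under assumed data structures, of drawing a random
# subsample of recordings and comparing pre- vs post-intervention scores.
# All data below are invented for illustration.
import random
from scipy.stats import wilcoxon

random.seed(42)

# Hypothetical per-student error counts (lower = better), keyed by student id.
pre_scores = {f"s{i:03d}": random.randint(5, 20) for i in range(222)}
post_scores = {sid: max(0, s - random.randint(0, 6)) for sid, s in pre_scores.items()}

sample_ids = random.sample(sorted(pre_scores), k=60)   # 60 of the 222 recordings
pre = [pre_scores[sid] for sid in sample_ids]
post = [post_scores[sid] for sid in sample_ids]

stat, p = wilcoxon(pre, post)                          # paired, non-parametric
print(f"Wilcoxon W = {stat:.1f}, p = {p:.4f}")
```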

Results

Test phase 1: interview

After the interview analysis, quotes were categorised. All teachers considered the module to be useful and were willing to use the teaching design again in the second test phase. One teacher, however, demanded a less strict procedure for working with the module, as she thought some parts of the teaching design were too easy for her students and she wanted to be able to skip tasks to speed up the teaching process. Another teacher disagreed with the statement that good pronunciation increases the credibility of the speaker; he was only interested in the module because of the intelligibility aspect. Almost all the teachers wanted the ability to evaluate their own students and missed test materials and repetition tasks in the module.

All the teachers had struggled with teaching pronunciation in the past. One teacher claimed that she had always focussed on pronunciation and corrected students on the spot, but that she had never used a structured approach before. Eight teachers admitted they had never really paid attention to teaching pronunciation (except for the occasional attention given to the th-sounds) before working with the module. The teachers teaching bachelor students were used to dealing with pronunciation in their lessons, as their subject was mainly English speaking skills. However, none of them had a specific teaching approach for this. They corrected students when they took turns in speaking activities and focussed on a student's individual pronunciation mistakes only. The speaking activities allowed students to speak a lot amongst each other, but the opportunities for individual feedback were limited. The teachers welcomed the structured approach. For them a very important positive aspect was the introduction of cell phones and tablets to record students' pronunciation performances. Teachers were now able to listen to a student's performance more often, which allowed them to give accurate feedback and analyse the performance together with the student. Two teachers indicated they were now using the recording devices for other tasks as well. One teacher, who was rehearsing a play (in English) with his students, started to record sessions to work on the students' use of intonation (Table 4).

Table 4 Categories as sensitizing concepts based on the research questions in test phase one

All of the teachers were very eager to advise the researchers on how to improve and adjust the intervention tool to make it more suitable for their own classroom use. One teacher did not have a good Internet connection in the classroom and wished for the online materials to be available on paper too. One teacher disliked the fact that the instruction language was Dutch, which made the module less suitable for her students following the bilingual course (most subjects taught in English); she wanted an English version of the module. Almost all the teachers missed the chance to evaluate their students themselves and give personal feedback on their students' performances. They also wanted more repetition in the materials and more freedom in how to work with the materials; they wanted to break free from the strict procedures. They also felt a desire to inform all of their students about the progress in their achievements.

Test phase 2: interview

After the interview analysis, quotes were categorised. We were most interested in signs of change in teacher behaviour and professional development. We were also interested in signs indicating that teachers might have had reservations about working with the CAPTT or about the need for the intervention. As we did not receive any response to three mails asking teachers to get involved using the online feedback tool (sharing experiences with peers), and given the experience of test phase 1, we assumed there was no interest in using the CAPTT option for sharing information with peers; no further questions were asked about this topic during the interview. As test phase 1 had already provided us with data on how teachers prefer to be professionalised, as most of the teachers also took part in test phase two, and as we had another, longer set of questions for teachers to fill in concerning teacher motivation, we did not ask questions about preferred ways of professionalising in test phase 2 (Table 5).

Table 5 Categories as sensitizing concepts based on the research questions in test phase two

Nine out of thirteen (n = 13) teachers indicated they had altered their classroom practice, devoting more time to teaching pronunciation and feeling better equipped to do so than before taking part in the experiment. Ten out of 13 teachers commented on the intervention being important. Eight teachers considered good pronunciation to be related to greater credibility of the speaker. Only three teachers commented on the quality of the CAPTT. There were two teachers, Andrew and Deejay, with a Dutch/Moroccan background. Andrew was born in the Netherlands and Deejay was born in Morocco but has been living in the Netherlands for 20 years. Andrew spoke negatively about the link between good pronunciation and credibility, and argued that native speakers should not judge on the basis of accent but on the basis of content. He was only interested in improving students' intelligibility and did not want his students to lose part of their identity while striving for a near-native English pronunciation. Deejay considered both credibility and intelligibility to be important, but was also more interested in the intelligibility part. The two native English speaking teachers both considered good pronunciation to positively influence the speaker's credibility.

Test phase 2: Integrative Model of Behaviour Prediction

In order to measure teachers' motivation to use the CAPTT we used a measure derived from the Perceived Locus of Causality (PLOC) measure of Ryan and Connell (1989); we refer to this as the adapted PLOC measure, or a-PLOC for short. This measure assesses the different types of motivation that regulate behaviour as defined by the Self-Determination Theory (SDT) of Ryan and Deci (2000). In short, SDT distinguishes between intrinsic and extrinsic motivation. Intrinsic motivation is concerned with the enjoyment and challenge that engagement with an activity or object (i.e., the CAPTT) provides. Extrinsic motivation encompasses four types of motivation that vary in their degree of autonomy. From highest to lowest autonomy these four types are: integrated, identified, introjected, and external motivation. Integrated motivation means that the engagement is in complete harmony with the self. Identified motivation means that the engagement is seen as important or beneficial. Introjected motivation refers to engagement driven by feelings of guilt or shame that would arise from not engaging. Finally, external motivation means that rewards and/or coercive pressures force someone to engage. Next to intrinsic and extrinsic motivation there is amotivation, which in essence means that one engages with an activity or object but without any intention.

The a-PLOC measure assesses each type of motivation to use the CAPTT. Intrinsic motivation has two dimensions, namely affect and potency: affect refers to fun and enjoyment, whereas potency refers to the challenging and stimulating aspects of the engagement. All a-PLOC items were rated on a 7-point Likert scale with endpoints 'always false' (1) and 'always true' (7). The results of the administration are shown in Table 6; as can be seen, the Cronbach's alphas were all satisfactory.

Table 6 The a-PLOC measure
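Cronbach's alpha for a block of Likert items can be computed directly from the item variances and the variance of the summed scale. The sketch below shows this for invented 7-point responses of 13 teachers on one hypothetical a-PLOC subscale; it is not the study's data or analysis script.

```python
# A minimal sketch of Cronbach's alpha for one Likert subscale.
# alpha = k / (k - 1) * (1 - sum(item variances) / variance(total score))
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = Likert items (1-7)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
base = rng.integers(2, 7, size=(13, 1))                            # 13 teachers
likert = np.clip(base + rng.integers(-1, 2, size=(13, 4)), 1, 7)   # 4 correlated items
print(f"Cronbach's alpha = {cronbach_alpha(likert):.2f}")
```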

We also measured teachers' intention to use the CAPTT. Behavioural intention is defined as "an indication of a person's readiness to perform a behaviour" (Fishbein and Ajzen, 2010). Intention is seen as a proxy for actual behaviour, that is, actually using the CAPTT; however, it should be noted that the relationship is not perfect. The instrument to measure behavioural intention was constructed according to the guidelines given by Fishbein and Ajzen (2010). All intention items were rated on a 7-point Likert scale with endpoints 'always false' (1) and 'always true' (7). The results of the administration are shown in Table 7; the Cronbach's alpha was very satisfactory.

Table 7 Behavioural intention measure

To investigate the relationships between the different types of motivation and behavioural intention, Spearman correlations were calculated. What can be concluded from Table 8 is that teachers apparently intend to use the CAPTT because they find it interesting and challenging to use (potency), because they believe its use is connected with their identity as a teacher (integrated), and because they find the CAPTT a useful tool. However, there is also an external force that pressures them to use the CAPTT (extrinsic).

Table 8 Spearman correlations
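The kind of analysis summarised in Table 8 can be reproduced in outline with scipy's Spearman rank correlation. The subscale names follow the text, but every value below is invented, so the output is illustrative only.

```python
# A minimal sketch of Spearman rank correlations between motivation subscale
# scores and behavioural intention; all scores are hypothetical.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n_teachers = 13
intention = rng.uniform(3, 7, n_teachers)
subscales = {
    "potency":     intention + rng.normal(0, 0.8, n_teachers),
    "affect":      rng.uniform(2, 7, n_teachers),
    "integrated":  intention + rng.normal(0, 1.0, n_teachers),
    "identified":  intention + rng.normal(0, 1.0, n_teachers),
    "introjected": rng.uniform(1, 5, n_teachers),
    "external":    intention + rng.normal(0, 1.5, n_teachers),
}

for name, scores in subscales.items():
    rho, p = spearmanr(scores, intention)
    print(f"{name:11s}  rho = {rho:+.2f}  p = {p:.3f}")
```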

Discussion and further research

The literature shows that there is limited evidence for the link between teacher professional development and student learning outcomes (McRae et al., 2000; Desimone and Le Floch, 2004; Supovitz, 2001; Cohen and Hill, 2000; Thompson, 2003). In many studies on professional development programmes, teacher learning, the teacher's active role in the learning process (Little, 2006; Verloop & Kessels, 2006) and the conditions for professional development (Supovitz, 2001; Guskey and Sparks, 2002; Cohen and Hill, 2000) are described as key elements for professional development programmes to be successful.

In our study we were more interested in the simplicity of the intervention tool and its effect on teacher behaviour and student outcomes, paying less attention to the teachers' initial beliefs, expectations, content knowledge, experience or learning conditions. For us the teachers' motivation to use the tool and follow our protocol was the starting point, in the hope that teachers would be motivated to copy the strategies provided by the tool even when teaching other aspects of the English language.

Ultimately, the goal was not the use of the intervention tool itself but increasing the teachers' motivation and improving their skills to teach English pronunciation following certain strategies. The initial, improved student outcome was supposed to work as an incentive for a teacher to change or adapt his or her teaching behaviour. So the initial evidence of better student outcomes does not demonstrate the link between teacher professional development and student learning outcome, but rather the link between the use of the intervention tool (CAPTT) and better student outcomes, with the teacher having had a largely facilitating role in test phase 1. However, teachers were asked to follow a strict procedure, which required them to briefly introduce each topic. For that purpose the intervention tool provided the teacher with a teacher's guide, aiming to build on the teacher's content knowledge (what to teach) but also on the pedagogical content knowledge (how do students learn this best), which many studies consider essential for improving and changing teachers' teaching practice and improving student learning outcomes (Van Driel and Berry, 2012; Yoon et al., 2007; Borko, 2004).

About 70% of the teachers claimed to have adjusted their pedagogy concerning teaching pronunciation, to feel more competent to teach pronunciation, or to focus more on their own pronunciation. For five of the six error type categories the post-intervention test results showed a significant improvement in student achievement. With the teachers' statements about personal change and the improved student achievement, we surmise that we have provided evidence of teacher professional development by involving teachers in an intervention phase in which a pre-structured teaching design was implemented and tested.

A crucial element of teacher participation in a cycle of evidence-based research is their belief in the beneficial outcome of the new approach for their students. The fact that they do not define the educational challenge and design the intervention tool themselves from the start does not fit the ideal situation for inquiry-based learning (Mor 2010; Anastopoulou et al. 2012). It does, however, involve teachers who state they do not have the time to invest in full research activities in the process of implementing and testing a new design, and consequently in the second cycle of "devising new practices, plans of activity, resources and tools aimed at achieving particular educational aims in a given situation" (Mor and Craft 2012).

In our case study, teachers showed various signs of professional development. For one, all teachers claimed they were able to teach pronunciation in a structured way for the first time. For some teachers this meant teaching pronunciation differently from how they had done it in the past (incidentally correcting individual students on the spot versus teaching pronunciation in a structured way, reaching more students with the same difficulties in a classroom setting). For other teachers it meant teaching English pronunciation for the first time and feeling safe doing so because of the set-up of the teaching design. Four teachers claimed the intervention tool refreshed their own theoretical knowledge. Five teachers stated that they were more aware of their own pronunciation after rounding off the test phase.

Another aspect of the teachers' professional development in this case study was their ability to reflect on the practical implementation of the new teaching design. Teachers either adapted the materials to their own classroom needs or advised the designers on how to perfect the materials for classroom use. The combination of initial research, designing the intervention tool, and testing the tool with the help of practitioners in the field, who then provide the necessary feedback based on practical experience in order to improve the teaching approach, allows for a solid start of the second cycle of implementing and testing. Teachers were already aware of the students' progress, which further increased their motivation for implementing and testing the new design.

In this study the student achievements did not improve significantly after the CAPTT was adapted in test phase two on the basis of teacher input. The adaptations to the CAPTT mostly concerned improving time efficiency for teachers (flipping the classroom tasks so that students could do most of the work at home and the module would not take up too many classroom teaching hours). However, the teachers who took part in both test phases indicated that they would rather work with the adapted CAPTT, as it was better structured, had more practice materials and was more time-efficient. So although student results in test phases one and two did not differ significantly, teacher satisfaction and motivation to use the CAPTT increased.

Involving teachers in a model of design inquiry (Mor and Mogilevsky 2013) or expecting teachers to adopt a design science attitude towards their practice (Laurillard 2012) and making them responsible for identifying an educational challenge (Mor and Craft 2012) can only be successful if the teachers involved have the right motivation, a belief in the need to change, a positive attitude towards research activities, the support of their superiors, enough time to invest, research skills, and peers to consult. A situation that meets all of these requirements is hard to find in the Dutch educational system, as, for example, teachers in secondary education teach an average of 25 h and are bound to strict protocols leading to final exams. The claim that there is not enough time for research activities is valid for most teachers. In addition, researchers and designers, when designing a new pedagogy or teaching tool, often have in mind a perfect picture of a motivated teacher whose only goal in life is to improve his students' achievements. In our study we were amazed at the number of teachers who initially indicated that they were interested in taking part and received all the information and monthly mails, but in the end failed to respond to any of the PI's mails asking for the results. All this in the light of a first, successful test phase showing better student achievements.

We believe that when an educational challenge exceeds the individual teacher's classroom practice, it is sometimes wiser to define the educational challenge top-down (What does the school need? Is there a nation-wide priority?), leave the design to researchers and educational experts, involve teachers in the testing phase of the experimental intervention, and use their expertise and classroom experience to adjust the intervention tool or pedagogy.

Researchers and those responsible for education in general, who sometimes have a better overview of existing educational challenges, should always take into account that perfect teaching conditions are never met and that a significant number of teachers are not able or willing to define educational challenges, design a new pedagogy and get involved in a full cycle of design inquiry. Involving smaller groups of teachers who are able to test and adjust a new pedagogy might lead to a well-tested and classroom-adjusted pedagogy, which could influence a broader network of teachers and could even reach those teachers who lack the motivation to get involved in a cycle of design inquiry. The stronger the new pedagogy and the easier it is to implement, the greater the chance of reaching teachers who find it hard to change their classroom practice.

Although the outcome of this intervention demonstrates improved teacher skills and student achievement, we realise that for further studies it is equally important to provide information on the sustainability of the TPD intervention. What happens when the necessity for active participation in the intervention programme is absent, the researchers and programme leaders no longer visit the workplace, and there is no longer any request for pre- and post-intervention data on student achievement and teacher skills? It is important to find out how the effects of TPD interventions become embedded in a teacher's day-to-day classroom practice or school organisation. We need to detect proof of the sustainability of any professional development programme and to establish which contexts for promoting professional development influence the sustainability of a TPD project the most. For this purpose we plan to revisit all the teachers who were involved in the TPD intervention programme one year after the final post-intervention test in test phase 2, in order to investigate which elements of the TPD intervention are still present in the teachers' day-to-day teaching practice. For this we will rely not only on the teachers' narratives by means of interviews and questionnaires, but also on students' classroom experience concerning practising English pronunciation.

References

  • Anastopoulou, S., Sharples, M., Ainsworth, S., Crook, C., O'Malley, C., & Wright, M. (2012). Creating personal meaning through technology-supported science inquiry learning across formal and informal settings. International Journal of Science Education, 34(2), 251–273.

  • Borko, H. (2004). Professional development and teacher learning: Mapping the terrain. Educational Researcher, 33, 3–15.

  • Cohen, D. K., & Hill, H. C. (2000). Instructional policy and classroom performance: The mathematics reform in California. Teachers College Record, 102(2), 294–343.

  • Desimone, L. M., & Le Floch, K. C. (2004). Are we asking the right questions? Using cognitive interviews to improve surveys in education research. Educational Evaluation and Policy Analysis, 26(1), 1–22.

  • Dias, J. W., & Rosenblum, L. D. (2016). Visibility of speech articulation enhances auditory phonetic convergence. Attention, Perception & Psychophysics, 78, 317–333.

  • Fishbein, M., & Ajzen, I. (2010). Predicting and changing behaviour: The reasoned action approach. New York: Psychology Press.

  • Fishbein, M., & Yzer, M. C. (2003). Using theory to design effective health behavior interventions. Communication Theory, 13(2), 164–183.

  • Guskey, T., & Sparks, D. (2002). Linking professional development to improvements in student learning. Paper presented at the Annual Meeting of the American Educational Research Association, New Orleans, LA, April 1–5.

  • Hattie, J. (2009). The black box of tertiary assessment: An impending revolution. In Tertiary assessment & higher education student outcomes: Policy, practice & research (pp. 259–275).

  • Hattie, J. (2012). Visible learning for teachers. British Journal of Educational Technology, 43(4), E134–E136.

  • Hermans, F., & Sloep, P. B. (2015). Teaching the Dutch how to pronounce English. International Journal of Language Studies, 9(4), 55–80.

  • Kreijns, K., Vermeulen, M., Kirschner, P. A., van Buuren, H., & van Acker, F. (2013). Adopting the Integrative Model of Behaviour Prediction to explain teachers' willingness to use ICT: A perspective for research on teachers' ICT usage in pedagogical practices. Technology, Pedagogy and Education, 22(1), 55–71.

  • Laurillard, D. (2012). Teaching as a design science: Building pedagogical patterns for learning and technology. New York: Routledge.

  • Little, J. W. (2006). Professional community and professional development in the learning-centered school. Washington, DC: NEA.

  • McRae, D., Ainsworth, G., Groves, R., Rowland, M., & Zbar, V. (2000). PD 2000 Australia: A national mapping of school teacher professional development. Canberra: Commonwealth Department of Education, Training and Youth Affairs.

  • Meiers, M., & Ingvarson, L. (2005). Investigating the links between teacher professional development and student learning outcomes (Vols. 1 and 2). Canberra: Australian Government Department of Education, Science and Training.

  • Mor, Y. (2010). Embedding design patterns in a methodology for a design science of e-learning. In Investigations of e-learning patterns: Context factors, problems and solutions (pp. 107–134).

  • Mor, Y., & Craft, B. (2012). Learning design: Reflections on a snapshot of the current landscape. Research in Learning Technology.

  • Mor, Y., & Mogilevsky, O. (2013). The learning design studio: Collaborative design inquiry as teachers' professional development. Research in Learning Technology, 21.

  • Patton, M. Q. (2002). Two decades of developments in qualitative inquiry: A personal, experiential perspective. Qualitative Social Work, 1(3), 261–283.

  • Ryan, R. M., & Connell, J. P. (1989). Perceived locus of causality and internalization: Examining reasons for acting in two domains. Journal of Personality and Social Psychology, 57(5), 749–761.

  • Ryan, R. M., & Deci, E. L. (2000). Intrinsic and extrinsic motivations: Classic definitions and new directions. Contemporary Educational Psychology, 25(1), 54–67.

  • Strauss, A., & Corbin, J. (1990). Basics of qualitative research: Grounded theory procedures and techniques. Newbury Park, CA: Sage.

  • Supovitz, J. A. (2001). Translating teaching practice into improved student achievement. Yearbook (National Society for the Study of Education), 2, 81–98.

  • Thompson, C. L. (2003). Improving student performance through professional development for teachers. NC: Education Research Council.

  • Van Driel, J. H., & Berry, A. (2012). Teacher professional development focusing on pedagogical content knowledge. Educational Researcher, 41(1), 26–28.

  • Van Veen, K., Zwart, R., Meirink, J., & Verloop, N. (2010). Professionele ontwikkeling van leraren: Een reviewstudie naar effectieve kenmerken van professionaliseringsinterventies van leraren [Professional development of teachers: A review study of effective characteristics of teacher professionalisation interventions]. Leiden: ICLON.

  • Verloop, N., & Kessels, J. W. M. (2006). Opleidingskunde: Ontwikkelingen rond het opleiden en leren van professionals in onderwijs en bedrijfsleven [Training science: Developments in the training and learning of professionals in education and industry]. Pedagogische Studiën, 83, 301–321.

  • Yoon, K. S., Garet, M., Birman, B., & Jacobson, R. (2007). Examining the effects of mathematics and science professional development on teachers' instructional practice: Using professional development activity log. Washington, DC: Council of …, 1–59.


Authors’ contributions

FH carried out the studies and drafted the manuscript. PS participated in its design and performed statistical analysis. KK participated in the design and performed statistical analysis on teacher motivation. All authors read and approved the final manuscript.

Competing interests

The authors declare that they have no competing interests.

Author information

Corresponding author

Correspondence to Frans Hermans.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.


About this article


Cite this article

Hermans, F., Sloep, P. & Kreijns, K. Teacher professional development in the contexts of teaching English pronunciation. Int J Educ Technol High Educ 14, 23 (2017). https://doi.org/10.1186/s41239-017-0059-9
