Curriculum design for social, cognitive and emotional engagement in Knowledge Building

Abstract

Knowledge Building has been advanced as a pedagogy of engaged learning in which students identify as a community whose purpose is to advance their shared ideas. This approach, which has been studied for three decades (Scardamalia & Bereiter, in: K. Sawyer (ed) Cambridge handbook of the learning sciences, Cambridge University Press, 2014), includes cognitive, social constructivist, and emotional elements (Zhu et al. in User Modeling and User-Adapted Interaction, 29: 789–820, 2019b). This paper investigates how refining Knowledge Building activities based on students’ feedback impacts their social, cognitive, and emotional engagement. Using a design-based research method, we refined successive course activities based on feedback from 23 Master of Education students. With successive iterations, we found that the density of students’ reading networks increased; they theorized more deeply, introduced more authoritative resources, and made greater efforts to integrate ideas within the community knowledge base. As well, their level of negative affect decreased. These findings suggest that soliciting students’ input into course design can benefit their engagement and disposition toward learning, with implications for curriculum design.

Introduction

Engaged and positive learners are a priority for any learning design. Learner engagement is a multi-faceted, dynamic, and highly contextualized construct that includes behavioral, social, and cognitive elements (Sinha et al., 2015). Prior research has emphasized the importance of engagement in studies of social presence (e.g., Garrison & Arbaugh, 2007; Kreijns et al., 2013) as well as collaboration (e.g., Rummel et al., 2012; Xing et al., 2020). Students who are more engaged in collaboration were found to exhibit greater levels of elaboration and fewer ineffective learning strategies (Mullins et al., 2011). Greeno (2006) observed that situational factors such as curriculum materials, tasks, pedagogical approaches, and learning environments may influence engagement.

Trowler (2010) found that student-centered pedagogical approaches (i.e., those in which students actively construct knowledge through inquiry and reflective activities) may promote engagement. From that view, Knowledge Building may be described as a student-centered, as well as “idea-centered,” pedagogy, emphasizing student voice and agency in determining what to work on and how to take collective responsibility for improving community knowledge (Scardamalia & Bereiter, 2006, 2014). We hypothesize that Knowledge Building contributes to students’ engagement as a consequence of valuing student voice and agency. Moreover, by involving students in curriculum design, we may improve their motivation, sense of relevance, and shared responsibility for learning (Bovill et al., 2011). Finally, digital media and technology environments (e.g., Knowledge Forum, Google Docs) can scaffold learners in Knowledge Building activities, further supporting engagement. To that end, we are studying how the inclusion of students in the design of Knowledge Building activities might influence their level of engagement with the curriculum in a technology-enhanced learning environment.

Social, cognitive, and emotional engagement

Student engagement occurs when students are interested in, take active roles in, and commit to their own learning (Kuh et al., 2005; Wolf-Wendel et al., 2009). Sinha et al. (2015) conceptualized engagement as including behavioral, social, cognitive, and conceptual-to-consequential forms. Fredricks et al. (2004) identified three dimensions of student engagement: behavioral, social, and emotional, which were then investigated by Jung and Lee (2018) in their study of student engagement in MOOCs. This study integrated these two classifications and investigated social, cognitive, and emotional forms of engagement. Social engagement concerns the quality of group interactions when completing tasks; cognitive engagement refers to students’ cognitive effort to construct knowledge and solve tasks using domain-specific knowledge; and emotional engagement describes students’ affective reactions (e.g., interest, enjoyment, sense of belonging) to their learning experiences (Fredricks et al., 2004; Sinha et al., 2015).

Several studies have explored the relationships between engagement and students’ learning outcomes in collaborative learning. For example, Sinha et al. (2015) found that students’ behavioral, social and cognitive engagement has an impact on the quality of their designs, in terms of making connections to broader questions, constructing evidence and rationales, and connecting to prior units. Arguedas et al. (2016) measured the engagement, motivation, self-regulation, and learning outcomes of high school students who were provided with an affective analysis of their discourse. They found that students who were aware of their emotions improved in engagement, motivation, and self-regulation. Jung and Lee’s (2018) study of MOOC learners indicated that their learning engagement is significantly influenced by academic self-efficacy, teaching presence, and perceived usefulness or relevancy of the MOOC.

Students’ active participation in learning is critical to their engagement (Bovill et al., 2011). Students have unique perspectives on their learning and should be invited to share their insights regarding how to revise curricula (e.g., Brooker & Macdonald, 1999; Fielding, 2004; Rudduck & McIntyre, 2007). Students’ participation in curriculum design improves its relevance, changes power relations, enables the marginalized to speak and be heard, and contributes to students’ persistence and achievement (Bron & Veugelers, 2014; Hattie, 2009; Oliver & Oesterreich, 2013).

Knowledge Building

Knowledge Building is a socio-constructivist approach that emphasizes students’ collective responsibility in their learning and places students’ ideas at the center (Scardamalia & Bereiter, 2006, 2014). Knowledge Building empowers students to work on ideas that they care about and continuously advance community knowledge. Ideas are considered as immaterial knowledge objects (e.g., languages, tales, scientific conjectures) that can be tested, criticized, questioned, and improved (Scardamalia et al., 1994). Students engage in discourse moves such as asking questions, working with information, theorizing, integrating diverse ideas, identifying knowledge gaps, and improving explanations (Chen et al., 2017).

Knowledge Building involves socio-emotional and cognitive interactions amongst students (Dillenbourg, 1999). Cognitive interactions cannot be separated from social, motivational, emotional, and identity processes (Palincsar, 1998). In the cognitive process of seeking explanatory coherence (Thagard, 2007), students must feel comfortable and motivated to identify the weaknesses or gaps in the community knowledge—suggesting the need for a positive social-emotional environment. Social-emotional interactions have been shown to impact how students perceive their community climate and how they express their emotions (Bakhtiar et al., 2017; Järvelä et al., 2016). Respectful and cohesive environments have been shown to enhance cognitive interactions while unsupportive and disorganized environments may hinder collaboration (Isohätälä et al., 2019).

Interactive digital media are changing the way people learn and build knowledge. Resta and Laferrière (2007) identified four instructional advantages of using technologies to support collaborative learning and knowledge building. The first is preparing students with knowledge creation and collaboration skills, enabling them to formulate different ideas, views, and opinions within a shared social space. The second is that social interactions can serve as a source of cognitive advancement, which may foster deep understanding amongst students. Third, technologies add flexibility of time and space for learners to engage in collaborative learning. Fourth, technologies make it easier to keep track of students’ collaborative work (e.g., online activities, behaviors, written discourse). Thus, by including such elements in our designs, we may promote student engagement in technology-enhanced Knowledge Building activities.

A variety of technologies and approaches can serve the goals of Knowledge Building. The Knowledge Forum (KF, Scardamalia, 2004) encourages students to advance community knowledge because they all contribute ideas to a shared space, read the ideas presented by others and work together to engage in the Knowledge Building discourse (i.e., “building on” and “rising above” ideas from the community). More recently, technologies such as wikis and Google Drive (e.g., collaborative editing of docs, and comments) have been shown to support communities of learners in co-creating knowledge resources and building on one another’s ideas (Peters & Slotta, 2010). Within Google Docs, for example, the shared access and permissions allow for social annotation of documents, such that one person can read another’s embedded comments (i.e., about a specific passage within a PDF text), and then reply in a threaded fashion. Technology environments can thereby support collective reading and discussion, with the potential to deepen students’ understanding of the materials and support their learning.

Adopting the Knowledge Building approach involves respecting student voice regarding what students want to inquire about and how they inquire. However, student voice is rarely included in the overall architecture of the curriculum (e.g., which technology environments to use, what forms of discourse to emphasize, how long to engage in certain topics, and how to connect to the broader curriculum). In previous studies in higher education, instructors have typically created a context in which students are engaged in Knowledge Building—using Knowledge Forum, Google Docs, Moodle, or other technology environments—but have not included students in the articulation of those designs (Chai & Zhu, 2021; Hong et al., 2019a). This study investigated how highlighting student voice in the design of Knowledge Building activities may influence the level of social, cognitive, and emotional engagement with the curriculum in technology-enhanced learning environments.

Method

Participants and course design

Participants were 23 Master of Education students enrolled in a course titled, Introduction to Computers in Education, within a large public university in Canada. The course was built around weekly themes such as online learning, equity and social justice, and wellness and whole-child learning. The course lasted 12 weeks, with each week including 3-h meetings held within an active learning classroom environment, as well as supportive homework activities designed to encourage Knowledge Building and feed into ensuing classroom activities.

The course adopted an overarching perspective of a learning community and drew upon a variety of technology environments including Google Drive, Padlet, Nearpod, and KF. In the homework preceding each class session, students collectively read, annotated, and commented on articles related to the weekly theme, using the native affordances of Google Drive (i.e., threaded comments on PDF files). They also shared and built knowledge about relevant issues and applications of weekly topics using the KF environment. During class, activities were progressively designed to deepen students’ understanding and Knowledge Building about the themes. Activities included lectures, student group presentations, and small group activities that built on homework to consolidate ideas and experiences, discuss issues, and advance collective knowledge.

Design-based research to highlight student voice

This study employed a design-based methodology (Collins, 1992) to take advantage of its iterative design, continuous improvement, and close teacher-researcher collaboration (Anderson & Shattuck, 2012; Barab & Squire, 2004). Learning researchers introduced design-based research to achieve desirable results in natural settings (Brown, 1992; Collins, 1992), as it is difficult, if not impossible, to implement experimental controls in real and complex classroom contexts. Design-based research examines an intervention (such as an instructional approach, a learning activity, or a technological tool) through continuous iteration of design, enactment, analysis, and redesign (Brown, 1992; Cobb et al., 2003; Collins, 1992). It aims not only to meet local needs but also to explore, advance, or confirm theoretical relationships (Barab & Squire, 2004). Through cycles of design, enactment, detailed study, and revision (Bell et al., 2004; Cobb et al., 2003), teachers and researchers continually refine theoretical claims to produce “ontological innovations” (DiSessa & Cobb, 2004) and sustained innovations in education (Bereiter, 2002).

At the beginning of the course, students were engaged in the development of course themes that were interesting to them, such that the final selected course themes represented the authentic interests of the particular cohort of students. We selected three focus weeks (weeks 3, 6 and 9), in which we would study student contributions, to examine the impacts of the pedagogical approach, Knowledge Building, on students’ multi-faceted engagement. The themes for these weeks were: learning communities (week 3); technology supports for equity and social justice (week 6); and wellness and whole child learning (week 9). In each of the 3 focus weeks, students provided feedback regarding their emotions and engagement, and suggestions about activities in which their peers and instructor could better support Knowledge Building. Building on the feedback received, we designed and refined the activities for the ensuing weeks, as outlined next.

The theme of learning communities (week 3) was salient, given the stated pedagogy for the course, and that most students had never encountered such a perspective. At the outset of the course, students were told that we would be engaging as a learning community, which was one possible approach to the use of computers in education (the course topic). Hence, in their brainstorm and selection of course themes, they chose to further examine learning communities as one theme. For the week, students read and collectively annotated papers (using Google Drive comments on PDFs of the papers) from various learning community researchers, including Scardamalia (2002) and Slotta et al. (2018). During the class session, their collective annotations were then re-introduced as a resource for a critical reflection activity and a Knowledge Building activity in which students used the KF. During class, one researcher demonstrated how to use the KF, including how to create “rise-above” and “build on” notes. At the end of the week, student feedback was collected through surveys that included questions about how the Knowledge Building activities could be more engaging and effective. This input was discussed with students and used as a basis for our design of week 6 activities.

Week 6 explored the topic of technology supports for equity and social justice. Students had suggested the need for increased sharing of ideas before class, to allow more time for consolidation of ideas during class. Therefore, we included homework where students shared, read, and built on each other’s experience, knowledge, and questions about the themes. Then, in class, students identified knowledge gaps, discussed relevant ideas, and synthesized ideas in small groups. Small groups of students shared their ideas with the class and used KF to add syntheses or new questions.

The theme for Week 9 was wellness and whole-child learning. Similar to week 6, students contributed resources and ideas during homework activities and then consolidated ideas in class. After week 6, students had suggested that their KF views were “messy”, due to the sheer number of notes and their unorganized structure (see Fig. 1a). Therefore, in week 9, we added a whole-child learning framework as an organizational background to the discussion space within KF, to help students organize their ideas (see Fig. 1b).

Fig. 1

Knowledge Forum views

Data collection

Data were collected from several sources, including 259 written notes that the students posted in KF, as well as surveys of students in weeks 3 and 9 about their feelings regarding the Knowledge Building activities and their suggestions for curriculum revisions. For a small number (n = 3) of students, we also collected facial expressions and computer screen content during some KF sessions. Finally, after the course was completed, we conducted nine semi-structured interviews concerning how students felt about knowledge advancement and interactions amongst peers, and how their emotions were influenced by Knowledge Building activities.

Figure 1 illustrates how KF was used in week 9, including our use of an organizational frame (the background image of Whole Child learning) to guide placement of notes. The square icons represent notes, and the blue lines between notes show building-on relationships. Red square icons indicate notes that the user has read, while blue icons indicate unread notes. We collected the KF notes (including author information and the relationships between notes), records of which notes each student read, and any build-on or rise-above information.

The survey consisted of ten Likert scale (1–5) items concerning student emotions, and four open-ended questions about their Knowledge Building experiences. Students were surveyed on the affective dimensions of Joy, Anger, Surprise, Fear, Contempt, Confusion, Sadness, Frustration, Anxiety, and Boredom. We chose these emotions because they are the ones that can be identified by the iMotions™ Emotient software (Moreno et al., 2019), which was used in this study to analyze students’ facial expressions. Sample open-ended questions were as follows: In today’s Knowledge Building activities, what were the most enjoyable or rewarding features? Please say why these were important. What are some ways my classmates could better support me in Knowledge Building? In week 3, eight students filled in the survey; in week 9, 21 students did so because they were given time at the end of the class. Only the responses of the eight students who completed surveys in both weeks 3 and 9 were included in the analysis, allowing for a direct (within-subject) comparison.

In class, when students worked in KF after small group discussions, those who consented to participate in this study could choose to record their screen and/or camera. Three students completed video and screen recordings in weeks 3 and 6, and their videos were analyzed; incomplete videos were not included in this study.

The semi-structured interview was focused on how students thought the pedagogical and technological design influenced their community interactions and knowledge advancement, as well as their emotional experiences. A sample interview question is “In what ways, if any, did Knowledge Building discussions (both offline and in KF) influence the community progress of ideas and community interactions?”.

Data analysis

To analyze social interactions amongst students, in terms of their reading connections in KF, we conducted social network analysis using Gephi—open-source software that can display large networks in real time (Bastian et al., 2009). We used directed graphs that permit multiple edges, in which each student is represented as a node and each edge has a direction: in the reading network, an edge starts from the reader of a note and points to its author. Furthermore, we calculated the density of each reading network and the degree centrality of each node. For directed graphs, density is the number of edges divided by the maximum number of possible edges between nodes (Hanneman, 2001). The degree centrality of a node is the number of nodes it connects to divided by the maximum number of nodes it could connect to.
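The two network measures can be sketched in a few lines of Python. This is only an illustrative sketch, not the Gephi computation itself, and the small edge list below is hypothetical rather than the study’s actual data:

```python
from collections import defaultdict

# Each edge (reader, author) records one note-reading event.
# Parallel edges are allowed (one student reading several notes by the
# same author), which is why a reading network's density can exceed 1.

def density(edges, n_nodes):
    """Directed-graph density: number of edges divided by n*(n-1),
    the maximum number of ordered node pairs."""
    return len(edges) / (n_nodes * (n_nodes - 1))

def degree_centrality(edges, nodes):
    """Share of the other nodes that each node is connected to
    (counting in- and out-connections, distinct neighbours only)."""
    neighbours = defaultdict(set)
    for reader, author in edges:
        neighbours[reader].add(author)
        neighbours[author].add(reader)
    return {v: len(neighbours[v]) / (len(nodes) - 1) for v in nodes}

nodes = ["S1", "S2", "S3", "S4"]
edges = [("S1", "S2"), ("S2", "S1"), ("S3", "S1"),
         ("S3", "S2"), ("S1", "S2")]  # S1 read two notes by S2
print(density(edges, len(nodes)))       # 5/12 ≈ 0.42
print(degree_centrality(edges, nodes))  # S4 contributed nothing: 0.0
```

Note that with parallel edges counted, density is a measure of reading volume relative to community size, not merely of who is connected to whom.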

To examine how students participated in collective Knowledge Building discourse, we conducted a content analysis of the 259 KF notes. The coding scheme (Table 1) was refined based on the “categories of statements” of Hmelo-Silver and Barrows (2008) and a content-analysis framework for discourse (Yang et al., 2016; Zhu et al., 2019a). The revised version consists of five categorical dimensions: appraisal, questioning, theorizing, referencing, and integrating. Questioning includes the subcategories of factual questions, explanatory questions, and idea-deepening/elaborating questions, as a measure of the extent to which students tend to deepen explanations, take initiative, and sustain discourse. Appraisal, theorizing, referencing, and integrating each have two sub-classifications based on the complexity of the note (i.e., simple and elaborated). A note can fall into more than one category. Two researchers discussed the categories and coded 30 notes together to establish coding agreement. Then both researchers independently coded 30% (79) of the notes and compared their ratings; the inter-rater agreement was 78.48%. Any discrepancies were discussed until agreement was reached, and one researcher coded the rest of the notes. Furthermore, we open-coded the interview transcripts to understand how students’ understanding of the course themes changed.
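The percent-agreement statistic reported here (78.48%) is simply the share of double-coded notes on which the two coders matched. A minimal sketch, with hypothetical coder labels:

```python
def percent_agreement(coder_a, coder_b):
    """Percentage of items on which two coders assigned the same
    category (simple agreement, not chance-corrected)."""
    assert len(coder_a) == len(coder_b), "coders must rate the same items"
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return 100 * matches / len(coder_a)

# Hypothetical labels for five double-coded notes (not the study's data).
a = ["theorizing", "appraisal", "questioning", "integrating", "referencing"]
b = ["theorizing", "appraisal", "referencing", "integrating", "referencing"]
print(percent_agreement(a, b))  # 80.0
```

Percent agreement does not correct for chance agreement; a chance-corrected statistic such as Cohen’s kappa is a common alternative when category base rates are uneven.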

Table 1 Content analysis of student notes

To examine students’ emotional changes over time, we compared the positive and negative emotions of the eight students who filled in both the week 3 and week 9 surveys. We coded the interview transcripts of nine students for their emotional experiences over time and synthesized the coding. Regarding the videos recorded by three participants in weeks 3 and 6, we analyzed students’ facial-muscular emotion data extracted from the videos using iMotions™ Emotient software (Moreno et al., 2019). Using the Computer Expression Recognition Toolbox (Littlewort et al., 2011), the Emotient software first automatically detects the face and facial features in the videos. It then registers the face and extracts features using Gabor filters. Finally, it recognizes Action Units and calculates expression intensity and dynamics over time. Emotient adopts the Facial Action Coding System (FACS, Ekman & Friesen, 1978) to categorize 19 different Action Units into the basic emotions mentioned above. For each emotion, observations with an evidence value greater than one were further analyzed by examining the relevant screen recordings, to investigate what the participants were doing in KF. Taking Joy as an example, an evidence value greater than 1 indicates that the expression is ten times more likely to be categorized as Joy than not Joy by experts. The duration of each emotion was compared between weeks 3 and 6 for each participant.
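The “ten times more likely” gloss above corresponds to reading the evidence value as base-10 log odds. Assuming that interpretation (inferred from the description here, not from iMotions documentation), the conversion to a probability can be sketched as:

```python
def evidence_to_probability(evidence):
    """Treat an evidence value as base-10 log odds: evidence e means
    the expression is 10**e times more likely to be the target emotion
    than not. Convert those odds to a probability: odds / (1 + odds).
    (Assumed interpretation for illustration, not the vendor's spec.)"""
    odds = 10.0 ** evidence
    return odds / (1.0 + odds)

print(evidence_to_probability(0))  # 0.5 -> equally likely either way
print(evidence_to_probability(1))  # ~0.91 -> the threshold used here
```

Under this reading, the evidence > 1 threshold retains only expressions the classifier judges at least about 91% likely to be the target emotion.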

Findings

We first report a social network analysis of note reading within KF. Next, we report a content analysis of students’ KF discourse, including how they thought their understanding of computers in education had changed as a result of the course. Finally, we describe students’ self-reported emotional experiences in two selected weeks (3 and 9), and interviews regarding how their emotions may have shifted as a result of the Knowledge Building approach.

Social engagement: reading networks

Figure 2 shows the reading network of students (represented by S1 to S23), instructor (represented by “I”), and one researcher (represented by “R”) in weeks 3, 6, and 9. As shown in the figure, the reading network density improved over the successive weeks, from 0.57 to 0.74 to 1.09 (because multiple edges are permitted, density can exceed 1). This indicates that students read more of each other’s notes with each successive improvement of our design. In week 3, the researcher’s notes were the ones most commonly read within the network, whereas in weeks 6 and 9 student-contributed notes became increasingly influential within the reading network.

Fig. 2

The reading network of students in weeks 3, 6 and 9

Cognitive advancement

Content analysis of KF notes

Figures 3 and 4 show students’ different forms of cognitive contribution to the KF discourse; they are presented separately to avoid a single busy figure. Figure 3 shows the percentage of different types of appraisals and questions in students’ KF discourse over the three weeks. In weeks 6 and 9, the students had greater percentages of Simple Appraisal and Elaborated Appraisal than they did in week 3, indicating that they more often built on peers’ notes with opinions or suggestions. As Fig. 3 also shows, the percentages of Factual Questions, Explanatory Questions, and Idea-deepening Questions all decreased over the weeks, suggesting that students asked proportionally fewer questions as the course progressed.

Fig. 3

The percentage of different types of appraisals and questions of the three weeks

Fig. 4

The percentage of different types of theorizing, referencing and integrating of the three weeks

Figure 4 shows the percentage of students’ contributions in terms of theorizing, referencing, and integrating across the three weeks. Compared to week 3, students contributed fewer Intuitive Theorizing notes but more Elaborated Theorizing notes in weeks 6 and 9, indicating that they were more likely to explain the rationales of their theorizing and add details to support their ideas. Over the three design iterations, there was an increasing trend of contributing Simple Referencing and Elaborated Referencing, suggesting the students gradually introduced more authoritative resources to support their community Knowledge Building. Also, the decrease of Simple Integrating and increase of Elaborated Integrating over the three weeks indicates that the students were more likely to provide supportive details when putting their knowledge together.

Student learning about course themes

Most students who were interviewed indicated that their understanding of computers in education had improved. For instance, they had more awareness of the tools used by teachers and understood the weekly themes better. They recognized that even simple tools like Google Docs could be used to promote deep learning and planned to use such tools in their teaching. One student mentioned that she was “convinced that learning community, both online and offline, are crucial to an individual’s learning and personal growth. It provides not only models of best practices but also supports an individual’s learning in a social and emotional way.” One student indicated that she previously used “collaborative learning for idea generation” but would go beyond that in the future to help “students build upon each other’s ideas.” One student suggested that the course helped her to recognize that educators do not necessarily understand the affordances of different technologies. Another observed that Knowledge Building should go beyond sharing experiences or opinions to help students “intentionally construct theories, principles, or solutions (i.e., ideas as objects of construction).”

Emotional engagement

Survey of student emotions

Figure 5 shows the comparison of eight students’ self-reported emotions in weeks 3 and 9, categorized according to positive (i.e., joy) and negative dimensions (i.e., anger, fear, contempt, sadness, confusion, frustration, anxiety, boredom). Students reported lower levels of negative emotions across this time span, while their reports of positive emotions remained high.

Fig. 5

Students’ self-reported feelings, categorized as positive (i.e., joy) and negative emotions (i.e., anger, fear, contempt, sadness, confusion, frustration, anxiety, boredom)

Interview on student emotions

Most of the students interviewed felt confused, nervous, or frustrated when the KF was first introduced in week 3, because they were not sure they could use it properly or whether it would be used frequently in the future. They also expressed some discouragement about the overall “messy” interface of views in the KF once many notes had been added. In addition, when students added rise-above notes, the lower-level notes they drew on would disappear from the view, which frustrated other students who had planned to work on those notes. Overall, learning to use a new tool, together with the messy views, left the students with some negative feelings in week 3.

However, in week 9, as students became more familiar with the KF, they became emotionally engaged, excited, comfortable, and curious when working on their ideas. They felt more emotionally engaged when sharing stories with others face-to-face and reading other students’ ideas online, because “the idea flow became visible so that we all wanted to build more on them” and “it is a pleasure when I know exactly how to use it and watch the ideas grow.” The use of KF also enabled students to hear from those who did not participate in class discussions, as one student noted: “later I felt more comfortable using it and more engaged as I was able to read ideas of students’, who never participated in-class discussion.” It seems that as students became more familiar with the features of the KF, they could focus more on their Knowledge Building and interactions with peers and thus experienced more positive emotions, as another remarked: “by the end of this experience, I start to feel familiar with the interface, and know how to use it to interact with my classmates. I feel accomplished and resolved.”

iMotions analysis

Three students (i.e., S1, S15, and S23) had more complete recordings while working in KF, which enabled us to examine how their behaviors co-occurred with emotions over time. The iMotions analysis shows that S1 displayed joy for a longer duration in week 6 than in week 3. S1’s facial expressions and screen recordings suggested that in week 3, S1 appeared joyful when talking to the instructor or peers, reading certain notes, beginning to respond to notes, writing his notes, and after contributing notes. Furthermore, S1 appeared surprised when reading the note “only technology-based”; showed contempt after adding a note and looking for notes to read, when writing the title of a rise-above, and after talking to the instructor; and appeared confused when starting to write the note “beyond personal knowledge.” In week 6, S1 displayed joyful facial expressions after talking with people, when writing about his personal experience, when finishing a note, and when opening his own newly contributed note.

Unlike S1, S15’s positive emotions did not increase and her negative emotions did not disappear, as suggested by her self-reports and the iMotions analysis. In week 6, the computer mouse S15 used in the lab did not function well, which significantly affected her operations in KF. The iMotions analysis showed that S15 displayed joy when talking to the researcher and when opening the KF login page. However, when the mouse did not allow her to select or drag notes to a rise-above view, her facial expression was identified as disgust. She also experienced frustration in the process of creating a rise-above note, possibly because of the malfunctioning mouse.

The iMotions analysis confirmed that S23 experienced a longer duration of joy and a shorter duration of anger, confusion, and frustration in week 6 compared to week 3. In week 3, S23 displayed joyful facial expressions when reading the researcher’s notes, reading and building on peers’ notes, successfully contributing his note and revising its title, and organizing the community view by dragging notes around. S23 also experienced confusion and frustration, mainly because the mouse did not function well when he tried to open notes, drag notes, or navigate. In week 6, S23 displayed joy when someone was talking; because the screen recording was not available, we could not match S23’s emotions to his actions in week 6.

Discussion

This design-based research offers some insight into how students’ emotions during learning can interact with the specific activities employed, and into the potential benefits of including student voice as a formative input into curriculum design. We investigated how such a pedagogical design impacted students’ social, cognitive, and emotional engagement. We engaged 23 graduate students in three Knowledge Building sessions. At the beginning of the course, we surveyed the students on the weekly themes they were interested in. Throughout the course, we collected feedback regarding how they thought they could be better supported by their peers and instructor, and revised the subsequent curriculum based on that feedback. As a result of this approach, students’ reading networks became denser, they were better able to advance community knowledge, and their perceived negative emotions decreased.

The increasingly dense reading networks and increasing frequency of Simple and Elaborated Appraisal notes suggest that students were more likely to read and respond to community notes as a result of our design improvements. In response to their feedback about needing more time to reflect on peers’ notes, we had encouraged students, in weeks 6 and 9, to contribute their experiences and ideas to weekly themes before class. This gave them more time to read their peers’ contributed notes. We would expect some degree of increasing reading network density as a simple result of the course progression (i.e., even with no changes to the design) as the participants became more familiar with their peers and more comfortable using KF. However, the striking level of this shift—clearly visible in the graphs of Fig. 2—suggests that it derives to some extent from our intentional efforts to improve the scaffolding and integration of KF discussions within the broader class session design. Similarly, Hong et al. (2019b) found that Knowledge Building activities such as assuming agency, fostering community, and working with ideas helped students become more engaged in the design of their STEM projects in a higher education context. Another reason may be that in week 3 the researchers had posted two questions to help focus students’ discussions, which led students mainly to read and respond to those two questions. In weeks 6 and 9, the researchers did not constrain the Knowledge Building with predefined questions but rather encouraged students to share their experiences (i.e., with social justice and whole-child learning) as a starting point.
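Reading-network density here follows the standard social-network definition (Hanneman, 2001): the proportion of possible directed ties—here, “student A read a note by student B”—that are actually observed. A minimal sketch with hypothetical read events (the variable names and example data are illustrative, not taken from the study):

```python
def network_density(edges, n_nodes):
    """Density of a directed network: observed ties / possible ties.

    `edges` is a set of (reader, author) pairs; self-loops are excluded,
    so a network of n nodes has n * (n - 1) possible directed ties.
    """
    ties = {(reader, author) for reader, author in edges if reader != author}
    possible = n_nodes * (n_nodes - 1)
    return len(ties) / possible

# Hypothetical: 4 students, 5 distinct "reader -> author" ties
edges = {(1, 2), (1, 3), (2, 1), (3, 4), (4, 2)}
print(network_density(edges, 4))  # 5 / 12 ≈ 0.417
```

Computing this value per sample week would show the densification described above as an increasing sequence.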

Overall, the depth of students’ notes increased over the three sample weeks—a change that may have derived from our giving students more time to consolidate ideas during class. Creating resources and ideas during homework activities and then reading those notes as a Knowledge Building activity allowed students more time and flexibility to contribute their ideas, experiences, and knowledge to weekly themes, as well as to read and build on peers’ contributions. These preparations before class also made the students more capable of engaging in activities that require greater cognitive effort, such as deeper theorizing and integrating diverse ideas during class time. These improvements are reminiscent of the “flipped classroom” approach, in which students do preparation work such as watching lecture videos before class, then engage in more active, collaborative forms of activity during class time to apply the ideas from lectures (Chen et al., 2019; Gilboy et al., 2015; Krathwohl, 2002; Seaboyer, 2013). Studies of the flipped classroom approach suggest its positive effects on students’ academic performance, perception of engagement, and satisfaction (e.g., Bergmann & Sams, 2012; Elmaadaway, 2018; Sergis et al., 2018). Similarly, this study shows the effectiveness of requiring students to contribute ideas and experiences before class and then consolidate and extend those ideas during class. More importantly, this design was suggested by the students themselves, which indicates they sought higher levels of cognitive work during class time and preferred to do preparation work as homework.

Our findings indicated that students’ negative emotions tended to decrease, while their positive emotions remained at a relatively high level. This was indeed the goal of making real-time improvements to the Knowledge Building design based on students’ feedback. Gros and López (2016) similarly found that all the students who participated in co-designing technology-rich learning activities reported that they would be willing to co-design again if they had the opportunity. In a study conducted with elementary students, Zhu et al. (2020) found that positive emotions such as confidence and enjoyment increased when students were involved in deciding what to learn and how to learn it. Hence, highlighting student voice in curriculum design is a way to support students’ responsibility, autonomy, and epistemic agency, which tends to enhance the relevance of curricula. But how does this impact our measures of student emotions? According to Dewey (1938), learning starts with the experiences of students and builds towards the growth of students’ knowledge and insights. The relevance of curricula to students may influence their perceived value of learning. According to control-value theory, students’ subjective appraisals of control and value play an essential role in the arousal of achievement emotions (Pekrun, 2006). Subjective value refers to students’ perceived importance of activities, outcomes, and success (Pekrun, 2006). Students’ decreased negative emotions in this study may therefore have resulted from their perceptions that the revised Knowledge Building design and activities had become more relevant and valuable.

In weeks 6 and 9, we emphasized student creation of resources and ideas as homework activities, with Knowledge Building activities (reading peers’ notes, rising above, and building new ideas) addressed in the ensuing class. This allowed students to feel they had time to first exercise their ideas and explore resources, and then work on the wealth of ideas from the community during class time, when everyone was together (i.e., allowing for whole-class discussions as well as individual and small-group Knowledge Building). We employed technologies like Google Docs to enable students to collectively annotate readings and build knowledge before class. Such technologies support a learning community epistemology and related pedagogies, making students’ social interactions a source of cognitive advancement and academic achievement (Resta & Laferrière, 2007). This allows the knowledge and skills that students bring with them to instruction to be mobilized and consolidated by students and the teacher during class time (Bron & Veugelers, 2014).

It is important to note that our students’ improved social, cognitive, and emotional engagement cannot be entirely attributed to highlighting student voice in Knowledge Building activities. Indeed, establishing the precise extent of any causal links is not generally attainable through design-based research. Other factors, such as the students becoming more familiar with the KF platform and gradually forming a more coherent learning community over the weeks, also likely influenced their engagement. In the interviews, several students mentioned that becoming more familiar with KF enabled them to focus on improving ideas. Future research could consider including a control class (e.g., the same course taught by the same instructor) and extending the design duration. Future research could also consider addressing technical issues to reduce their potentially distracting impact on engagement. Below, we discuss our findings with an awareness of these limitations and in light of the literature, then summarize how our design iterations supported students’ social, cognitive, and emotional engagement.

This study contributes to and extends the literature on engaging students in co-designing and co-producing their learning, and further confirms the positive impacts of doing so. Overall, the results of this study are consistent with previous research. For instance, in Oliver and Oesterreich’s (2013) study, the instructor and pre-service teachers adopted a model of student-centered inquiry as curriculum. The model includes “a cyclical process of building the foundation, planning, responding to students, listening to respond and analyzing the responses” (Oliver & Oesterreich, 2013, p. 394). However, the impacts of this model were not evaluated. Brooman et al. (2015) described how 44 students participated in focus group interviews on their learning experiences in a seminar course, which informed the redesign of lectures. Such a design improved students’ perception of the learning modules and their interest in postgraduate study.

Our findings have implications for the design of curricula that aim to enhance students’ learning and positive feelings. First, it is important to solicit students’ input into course design to address, in real time, any potential issues that may hinder their learning. Second, class time needs to be assigned for group discussions to help students consolidate and synthesize diverse ideas. Third, future research can consider helping students become familiar with technologies first and addressing potential technical issues so that technology does not become disruptive. Finally, it must be noted that our three sample weeks each used a different theme (e.g., social justice or whole-child learning), so that Knowledge Building re-started each week. Future research could investigate more coherent Knowledge Building goals, where students continue to improve their community ideas over a more protracted period.

Availability of data and materials

These data will be made available to other researchers on a case-by-case basis.

References

  1. Anderson, T., & Shattuck, J. (2012). Design-based research: A decade of progress in education research? Educational Researcher, 41(1), 16–25. https://doi.org/10.3102/0013189X11428813

  2. Arguedas, M., Daradoumis, T., & Xhafa, F. (2016). Analyzing how emotion awareness influences students’ motivation, engagement, self-regulation and learning outcome. Educational Technology & Society, 19(2), 87–103.

  3. Bakhtiar, A., Webster, E. A., & Hadwin, A. F. (2017). Regulation and socio-emotional interactions in a positive and a negative group climate. Metacognition and Learning, 13(1), 57–90. https://doi.org/10.1007/s11409-017-9178-x

  4. Barab, S., & Squire, K. (2004). Design-based research: Putting a stake in the ground. The Journal of the Learning Sciences, 13(1), 1–14. https://doi.org/10.1207/s15327809jls1301_1

  5. Bastian, M., Heymann, S., & Jacomy, M. (2009). Gephi: An open source software for exploring and manipulating networks. Proceedings of the International AAAI Conference on Web and Social Media, 3(1). https://ojs.aaai.org/index.php/ICWSM/article/view/13937

  6. Bell, P., Hoadley, C. M., & Linn, M. C. (2004). Design-based research in education. In M. C. Linn, E. A. Davis, & P. Bell (Eds.), Internet environments for science education (pp. 73–84). Lawrence Erlbaum Associates. https://doi.org/10.1007/BF02504682

  7. Bereiter, C. (2002). Design research for sustained innovation. Cognitive Studies: Bulletin of the Japanese Cognitive Science Society, 9(3), 321–327. https://doi.org/10.11225/jcss.9.321

  8. Bergmann, J., & Sams, A. (2012). Flip your classroom: Reach every student in every class every day. International Society for Technology in Education.

  9. Bovill, C., Cook-Sather, A., & Felten, P. (2011). Students as co-creators of teaching approaches, course design, and curricula: Implications for academic developers. International Journal for Academic Development, 16(2), 133–145. https://doi.org/10.1080/1360144X.2011.568690

  10. Bron, J., & Veugelers, W. (2014). Why we need to involve our students in curriculum design: Five arguments for student voice. Curriculum and Teaching Dialogue, 16(1/2), 125–139.

  11. Brooker, R., & Macdonald, D. (1999). Did we hear you?: Issues of student voice in a curriculum innovation. Journal of Curriculum Studies, 31(1), 83–97. https://doi.org/10.1080/002202799183313

  12. Brooman, S., Darwent, S., & Pimor, A. (2015). The student voice in higher education curriculum design: Is there value in listening? Innovations in Education and Teaching International, 52(6), 663–674. https://doi.org/10.1080/14703297.2014.910128

  13. Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. Journal of the Learning Sciences, 2(2), 141–178. https://doi.org/10.1207/s15327809jls0202_2

  14. Chai, S., & Zhu, G. (2021). The relationship between groups’ adoption of Knowledge Building Principles and their performance in creating artifacts. Educational Technology Research and Development, 69, 787–808. https://doi.org/10.1007/s11423-021-09986-3

  15. Chen, B., Resendes, M., Chai, C. S., & Hong, H. Y. (2017). Two tales of time: Uncovering the significance of sequential patterns among contribution types in knowledge-building discourse. Interactive Learning Environments, 25(2), 162–175. https://doi.org/10.1080/10494820.2016.1276081

  16. Chen, M. R. A., Hwang, G. J., & Chang, Y. Y. (2019). A reflective thinking-promoting approach to enhancing graduate students’ flipped learning engagement, participation behaviors, reflective thinking and project learning outcomes. British Journal of Educational Technology, 50(5), 2288–2307. https://doi.org/10.1111/bjet.12823

  17. Cobb, P., Confrey, J., Lehrer, R., & Schauble, L. (2003). Design experiments in educational research. Educational Researcher, 32(1), 9–13. https://doi.org/10.3102/0013189X032001009

  18. Collins, A. (1992). Toward a design science of education. In E. Scanlon & T. O'Shea (Eds.), New directions in educational technology (pp. 15–22). Springer-Verlag. https://doi.org/10.1007/978-3-642-77750-9_2

  19. Dewey, J. (1938). Experience and education. Macmillan.

  20. Dillenbourg, P. (1999). Introduction: What do you mean by ‘collaborative learning?’ In P. Dillenbourg (Ed.), Collaborative learning: Cognitive and computational approaches (pp. 1–19). Pergamon Elsevier Science.

  21. DiSessa, A. A., & Cobb, P. (2004). Ontological innovation and the role of theory in design experiments. The Journal of the Learning Sciences, 13(1), 77–103. https://doi.org/10.1207/s15327809jls1301_4

  22. Ekman, P., & Friesen, W. V. (1978). Manual for the facial action coding system. Consulting Psychologists Press.

  23. Elmaadaway, M. A. N. (2018). The effects of a flipped classroom approach on class engagement and skill performance in a Blackboard course. British Journal of Educational Technology, 49(3), 479–491. https://doi.org/10.1111/bjet.12553

  24. Fielding, M. (2004). “New wave” student voice and the renewal of civic society. London Review of Education, 2(3), 197–217. https://doi.org/10.1080/1474846042000302834

  25. Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research, 74(1), 59–109. https://doi.org/10.3102/00346543074001059

  26. Garrison, D. R., & Arbaugh, J. B. (2007). Researching the community of inquiry framework: Review, issues, and future directions. The Internet and Higher Education, 10(3), 157–172. https://doi.org/10.1016/j.iheduc.2007.04.001

  27. Gilboy, M. B., Heinerichs, S., & Pazzaglia, G. (2015). Enhancing student engagement using the flipped classroom. Journal of Nutrition Education and Behavior, 47(1), 109–114. https://doi.org/10.1016/j.jneb.2014.08.008

  28. Greeno, J. G. (2006). Learning in activity. In R. K. Sawyer (Ed.), Cambridge handbook of the learning sciences (pp. 79–96). Cambridge University Press.

  29. Gros, B., & López, M. (2016). Students as co-creators of technology-rich learning activities in higher education. International Journal of Educational Technology in Higher Education, 13(1), 1–13. https://doi.org/10.1186/s41239-016-0026-x

  30. Hanneman, R. A. (2001). Introduction to social network methods. University of California, Riverside, Department of Sociology. https://faculty.ucr.edu/~hanneman/nettext/Introduction_to_Social_Network_Methods.pdf

  31. Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. Routledge.

  32. Hmelo-Silver, C. E., & Barrows, H. S. (2008). Facilitating collaborative knowledge building. Cognition and Instruction, 26(1), 48–94. https://doi.org/10.1080/07370000701798495

  33. Hong, H. Y., Lin, P. Y., Chai, C. S., Hung, G. T., & Zhang, Y. (2019a). Fostering design-oriented collective reflection among preservice teachers through principle-based knowledge building activities. Computers & Education, 130, 105–120. https://doi.org/10.1016/j.compedu.2018.12.001

  34. Hong, H. Y., Lin, P. Y., Chen, B., & Chen, N. (2019b). Integrated STEM learning in an idea-centered knowledge-building environment. The Asia-Pacific Education Researcher, 28(1), 63–76. https://doi.org/10.1007/s40299-018-0409-y

  35. Isohätälä, J., Näykki, P., & Järvelä, S. (2019). Cognitive and socio-emotional interaction in collaborative learning: Exploring fluctuations in students’ participation. Scandinavian Journal of Educational Research, 64(6), 831–851. https://doi.org/10.1080/00313831.2019.1623310

  36. Järvelä, S., Järvenoja, H., Malmberg, J., Isohätälä, J., & Sobocinski, M. (2016). How do types of interaction and phases of self-regulated learning set a stage for collaborative engagement? Learning and Instruction, 43, 39–51. https://doi.org/10.1016/j.learninstruc.2016.01.005

  37. Jung, Y., & Lee, J. (2018). Learning engagement and persistence in massive open online courses (MOOCs). Computers & Education, 122, 9–22. https://doi.org/10.1016/j.compedu.2018.02.013

  38. Krathwohl, D. R. (2002). A revision of Bloom’s taxonomy: An overview. Theory into Practice, 41(4), 212–218. https://doi.org/10.1207/s15430421tip4104_2

  39. Kreijns, K., Kirschner, P. A., & Vermeulen, M. (2013). Social aspects of CSCL environments: A research framework. Educational Psychologist, 48(4), 229–242. https://doi.org/10.1080/00461520.2012.750225

  40. Kuh, G. D., Kinzie, J., Schuh, J. H., & Whitt, E. J. (2005). Student success in college: Creating conditions that matter. Jossey-Bass.

  41. Littlewort, G., Whitehill, J., Wu, T., Fasel, I., Frank, M., Movellan, J., & Bartlett, M. (2011). The computer expression recognition toolbox (CERT). In Face and gesture 2011 (pp. 298–305). IEEE. https://doi.org/10.1109/FG.2011.5771414

  42. Moreno, M., Schnabel, R., Lancia, G., & Woodruff, E. (2019). Between text and platforms: A case study on the real-time emotions and psychophysiological indicators of video gaming and academic engagement. Education and Information Technologies. https://doi.org/10.1007/s10639-019-10031-3

  43. Mullins, D., Rummel, N., & Spada, H. (2011). Are two heads always better than one? Differential effects of collaboration on students’ computer-supported learning in mathematics. International Journal of Computer-Supported Collaborative Learning, 6(3), 421–443. https://doi.org/10.1007/s11412-011-9122-z

  44. Oliver, K. L., & Oesterreich, H. A. (2013). Student-centred inquiry as curriculum as a model for field-based teacher education. Journal of Curriculum Studies, 45(3), 394–417. https://doi.org/10.1080/00220272.2012.719550

  45. Palincsar, A. S. (1998). Social constructivist perspectives on teaching and learning. Annual Review of Psychology, 49(1), 345–375. https://doi.org/10.1146/annurev.psych.49.1.345

  46. Pekrun, R. (2006). The control-value theory of achievement emotions: Assumptions, corollaries, and implications for educational research and practice. Educational Psychology Review, 18(4), 315–341. https://doi.org/10.1007/s10648-006-9029-9

  47. Peters, V. L., & Slotta, J. D. (2010). Scaffolding knowledge communities in the classroom: New opportunities in the Web 2.0 era. In M. J. Jacobson & P. Reimann (Eds.), Designs for learning environments of the future: International perspectives from the learning sciences (pp. 205–232). Springer. https://doi.org/10.1007/978-0-387-88279-6_8

  48. Resta, P., & Laferrière, T. (2007). Technology in support of collaborative learning. Educational Psychology Review, 19(1), 65–83. https://doi.org/10.1007/s10648-007-9042-7

  49. Rudduck, J., & McIntyre, D. (2007). Improving learning through consulting pupils. Routledge.

  50. Rummel, N., Mullins, D., & Spada, H. (2012). Scripted collaborative learning with the cognitive tutor algebra. International Journal of Computer-Supported Collaborative Learning, 7(2), 307–339. https://doi.org/10.1007/s11412-012-9146-z

  51. Scardamalia, M. (2002). Collective cognitive responsibility for the advancement of knowledge. Liberal Education in a Knowledge Society, 97, 67–98.

  52. Scardamalia, M. (2004). CSILE/Knowledge Forum®. In Education and technology: An encyclopedia (pp. 183–192).

  53. Scardamalia, M., & Bereiter, C. (2006). Knowledge building: Theory, pedagogy, and technology. In K. Sawyer (Ed.), Cambridge handbook of the learning sciences (pp. 97–118). Cambridge University Press.

  54. Scardamalia, M., & Bereiter, C. (2014). Knowledge building and knowledge creation: Theory, pedagogy, and technology. In K. Sawyer (Ed.), Cambridge handbook of the learning sciences (2nd ed., pp. 397–417). Cambridge University Press. https://doi.org/10.1017/CBO9781139519526.025

  55. Scardamalia, M., Bereiter, C., & Lamon, M. (1994). The CSILE project: Trying to bring the classroom into World 3. In K. McGilly (Ed.), Classroom lessons: Integrating cognitive theory and classroom practice (pp. 201–228). The MIT Press.

  56. Seaboyer, J. (2013). The role of technology-assisted assessment in fostering critical reading in undergraduate literary studies. International Computer Assisted Assessment Conference, UK.

  57. Sergis, S., Sampson, D. G., & Pelliccione, L. (2018). Investigating the impact of flipped classroom on students’ learning experiences: A self-determination theory approach. Computers in Human Behavior, 78, 368–378. https://doi.org/10.1016/j.chb.2017.08.011

  58. Sinha, S., Rogat, T. K., Adams-Wiggins, K. R., & Hmelo-Silver, C. E. (2015). Collaborative group engagement in a computer-supported inquiry learning environment. International Journal of Computer-Supported Collaborative Learning, 10(3), 273–307. https://doi.org/10.1007/s11412-015-9218-y

  59. Slotta, J., Quintana, R., & Moher, T. (2018). Collective inquiry in communities of learners. In F. Fischer, C. Hmelo-Silver, P. Reimann, & S. Goldman (Eds.), The international handbook of the learning sciences. Routledge.

  60. Thagard, P. (2007). Coherence, truth and the development of scientific knowledge. Philosophy of Science, 74, 28–47. https://doi.org/10.1086/520941

  61. Trowler, V. (2010). Student engagement literature review. The Higher Education Academy, 11(1), 1–15.

  62. Wolf-Wendel, L., Ward, K., & Kinzie, J. (2009). A tangled web of terms: The overlap and unique contribution of involvement, engagement, and integration to understanding college student success. Journal of College Student Development, 50(4), 407–428. https://doi.org/10.1353/csd.0.0077

  63. Xing, W., Zhu, G., Arslan, O., Popov, V., & Shim, J. (2020). Using learning analytics to understand the multifaceted engagement in collaborative learning. Manuscript submitted for publication.

  64. Yang, Y., van Aalst, J., Chan, C. K., & Tian, W. (2016). Reflective assessment in knowledge building by students with low academic achievement. International Journal of Computer-Supported Collaborative Learning, 11(3), 281–311. https://doi.org/10.1007/s11412-016-9239-1

  65. Zhu, G., Moreno, M., Mafla, A., & Scardamalia, M. (2019a). Idea improvement patterns in Knowledge Building: Undergraduate and graduate levels [Paper session]. AERA Annual Meeting, Toronto, ON.

  66. Zhu, G., Xing, W., Costa, S., Scardamalia, M., & Pei, B. (2019b). Exploring emotional and cognitive dynamics of knowledge building in grades 1 and 2. User Modeling and User-Adapted Interaction, 29(4), 789–820. https://doi.org/10.1007/s11257-019-09241-8

  67. Zhu, G., Scardamalia, M., Nazeem, R., Donoahue, Z., Leanne, M., & Lai, Z. (2020). Collective reflections, knowledge advancement, and emotional well-being of young students. Manuscript submitted for publication.


Acknowledgements

The authors are indebted to the participating students.

Funding

This study is not supported by any funding sources.

Author information

Affiliations

Authors

Contributions

GZ designed the study, collected and analyzed the data, and wrote the paper. PR assisted in collecting and analyzing the data and edited the writing. WX transcribed the interview data and conducted the social network analysis. JS co-designed the study, edited the paper, and provided guidelines through the process. All authors read and approved the manuscript.

Corresponding author

Correspondence to Gaoxia Zhu.

Ethics declarations

Competing interests

The authors declare that there is no potential conflict of interest in the work.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.


About this article


Cite this article

Zhu, G., Raman, P., Xing, W. et al. Curriculum design for social, cognitive and emotional engagement in Knowledge Building. Int J Educ Technol High Educ 18, 37 (2021). https://doi.org/10.1186/s41239-021-00276-9


Keywords

  • Student engagement
  • Student voice
  • Socio-emotional interaction
  • Knowledge Building
  • Design-based research