
Table 2 Audience response systems joint display

From: The contributions of mixed insights to advancing technology-enhanced formative assessments within higher education learning environments: an illustrative example

| Mixed insights | Thematic categories | Fall 2013 (n = 117) | Winter 2014 (n = 81) | Fall 2014 (n = 76) |
|---|---|---|---|---|
| Influences on involvement | Participation trends (use/purchase rates) | 72.6% / 61.5%: high yet decreasing frequency | 67.9% / 61.7%: lower and inconsistent use | 75.0% / 65.8%: higher and consistent use |
| | Authentic preparation | 85.6%: “the questions were hard but good practice” | 83.3%: “made me think about exam items” | 86.4%: “I learned what questions to expect” |
| | Engaging interactions* | 84.4%: “It created more interactions” | 66.7%: “I wish I had more chances” | 80.3%: “I enjoyed these activities” |
| Effects on learning | Understandings check | 94.4%: “fun way to confirm understanding” | 90.0%: “I thought I knew more” | 97.0%: “I was surprised at what I knew” |
| | Weakness identification for remediation* | 92.2%: “now I know what I did not know” | 76.7%: “I need to study more” | 92.4%: “I know what to study” |
| Accessibility of feedback | Personal use | 94.4%: “I like that no one knows my answer” | 90.0%: “I know quickly if I am right” | 97.0%: “telling me right away” |
| | Peer comparisons* | 85.6%: “want to know how class is doing” | 68.3%: “I knew as much as others” | 86.4%: “I was lagging behind others” |
| Impacts on instruction | Emerging awareness | Cost barrier yet participation independent: “I learned as much from watching as if I had used one.” | Ongoing resistance to purchase yet want more often: “I will not pay to participate” | Multiple platforms challenging: “more expertise needed” |
| | Responsive actions | Sought lower-cost options and revised lecture content | Offered cost-effective alternatives; increased frequency of classroom use | Continued to explore options and revisit content |

Note. * Denotes a statistically significant difference across terms (p < 0.05)
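The asterisked rows flag agreement rates that differed significantly across the three terms. As a rough illustration of how such a term-to-term comparison of proportions can be checked (the table does not state which test the authors used, and the exact response counts are not reported), the sketch below applies a chi-square test of independence to approximate agree/disagree counts reconstructed by rounding the reported percentages against the term sizes (n = 117, 81, 76); treat the counts, and the choice of test, as illustrative assumptions only.

```python
# Illustrative sketch only: a chi-square test of independence across terms,
# using approximate agree/disagree counts reconstructed from the reported
# percentages and term sizes (n = 117, 81, 76). The article's actual response
# counts and statistical test are not shown in Table 2, so these are assumptions.
from scipy.stats import chi2_contingency

terms = ["Fall 2013", "Winter 2014", "Fall 2014"]
ns = [117, 81, 76]               # students enrolled each term
pct_agree = [84.4, 66.7, 80.3]   # "Engaging interactions" agreement rates (%)

# Rebuild a 3x2 contingency table of [agree, disagree] counts per term.
table = []
for n, p in zip(ns, pct_agree):
    agree = round(n * p / 100)
    table.append([agree, n - agree])

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.3f}")
# A p-value below 0.05 would be consistent with the asterisk on this row.
```

The Winter 2014 drop to 66.7% agreement, against roughly 84% and 80% in the fall terms, is the kind of gap such a test is meant to detect.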