
Development guidelines for individual digital study assistants in higher education

Abstract

Increasing student numbers, heterogeneity, and individual biographies lead to a growing need for personalized support. To meet these challenges, an Individual Digital Study Assistant (IDSA) provides features that help students improve their self-regulation and organizational skills to achieve individual study goals. Based on qualitative expert interviews, a quantitative student survey, and current literature, we derived requirements for an IDSA. Building on these, we designed, developed, and implemented a first IDSA prototype for higher education institutions (HEI). We continuously evaluated the prototype in different workshops and analyzed usage data, refining it across three enhanced prototypes. From this iterative process, we derived guidelines for IDSA design and development. Accordingly, the framework, project management, content, team selection, team development, team communication, marketing, and student habits are important to consider. The guidelines advance the knowledge base on IDSA in HEI and guide and support practitioners in the design, development, and implementation of IDSA in HEI.

Motivation and research needs

Previous reforms in the educational context, such as the Bologna Process in Europe or the Bradley Report in Australia, enable more students to start studying independently of their social and educational background. Thus, student numbers continuously rise, students and their needs become more heterogeneous (Clarke et al., 2013; OECD, 2023; Van der Wende, 2000), and the importance of individual counseling and recommendations increases (Wong & Li, 2019). Due to a relatively constant number of lecturers and university employees (Hornsby & Osman, 2014), providing personal and individualized support is a challenge for higher education institutions (HEI) (Wambsganss et al., 2021a). In recent years, there has been a rise in online lectures, distance courses, and massive open online courses (MOOCs), which allow HEI to respond to the increasing number of students (Winkler et al., 2020). However, online lectures often have low interaction rates (Lehmann et al., 2022). Eom et al. (2006) and Hone and El Said (2016) found that low interaction rates decrease learning success, increase dropout rates, and decrease satisfaction with the overall learning experience. Accordingly, self-organization, motivation, and regulation are particularly important (Eom et al., 2016; Ritz et al., 2022).

These changed conditions lead to a need for more personalized and individualized student support. One opportunity to respond to heterogeneous needs and support students individually is digital assistants combined with HEI information systems (IS), such as a learning management system (LMS) (Karrenbauer et al., 2021; Winkler et al., 2020). Here, it is important to consider not only functional and non-functional requirements but also the didactical perspective (Busse et al., 2020). In recent years, there has already been much research on conversational agents (CA) and pedagogical conversational agents (PCA) to improve learning outcomes (Wollny et al., 2021). For instance, Ruan et al.'s (2019) mobile QuizBot helps students learn factual knowledge, Wambsganss et al. (2021b) developed ArgueTutor to improve students' writing skills, and Hobert (2019a) introduced a chatbot for learning to program.

Existing research focuses on (P)CA, different requirements, design principles, and their development and evaluation (e.g., Hobert, 2019a; Wambsganss et al., 2021b). However, (P)CA focus primarily on supporting and improving direct learning outcomes (Hobert, 2019a; Wambsganss et al., 2021b). There is also a need to support general study-related competencies, such as self-regulation and self-organization, and to provide general support for study decisions, e.g., selecting suitable courses, majors, and minors. IDSA provide this support, guiding students with situational, individual recommendations and reminders and enabling first-level student support. However, existing literature on IDSA is limited. Apart from a few studies on IDSA in general (e.g., Tenspolde et al., 2019; König et al., 2020; Karrenbauer et al., 2021), current literature sparsely investigates the design, development, and evaluation of an IDSA. To address this gap and react to HEI's changing conditions, we designed, developed, and evaluated an IDSA in several cycles. We identified design and development requirements based on a student survey (n = 570), 28 HEI expert interviews, and a literature review. Based on these, we developed an IDSA prototype and evaluated and adapted it in several iterations. The resulting IDSA supports students in strengthening their self-organization and self-regulation skills and their goal achievement. While existing research on digital assistants in the learning context focuses largely on (P)CA and their requirements (Meyer von Wolff et al., 2019), design principles (Hobert, 2019a), and classification and archetypes (Knote et al., 2019; Weber et al., 2021), we specifically focus on IDSA, their requirements, implementation, and evaluation. Based on these results and findings, we then deduce general guidelines for HEI decision makers. Further, unlike many educational studies, our IDSA is not only tested and evaluated under laboratory conditions but also in the field (Hobert, 2019b). We extend existing design knowledge, provide an implementation, development, and adaptation process for IDSA, and offer guidelines for practitioners and researchers. We address the following research questions (RQ):

RQ1: How can an IDSA in HEI be designed, developed, and evaluated?

RQ2: What guidelines constitute the design, development, and evaluation of IDSA in HEI?

We review the theoretical foundations before presenting our research design and methods. We applied an ADR process to iteratively interact with researchers, students, lecturers, and other HEI stakeholders (enrollment office, study administration, psychological-therapeutic counseling center, study finance, etc.) to enable theoretical and practical discussions and contributions. We discuss our results and findings, encompassing IDSA requirements, the different prototypes, their evaluation, and the adaptation iterations. We then deduce development guidelines as well as theoretical and practical implications and recommendations. We conclude with limitations, further research topics, and conclusions.

Background and status quo

Status quo of IDSA in HEI

In general, digital assistants interact with their users with the goal of providing support, relying on a certain extent of intelligence. Well-known examples are Amazon’s Alexa and Apple’s Siri (voice-based assistants) or Facebook’s Messenger (text-based assistants) (Kumar et al., 2016; Maedche et al., 2019). The latter are also known as chatbots or conversational agents (CA). CA are software programs designed to communicate and interact with their users based on a natural language interface and a knowledge base (Wambsganss et al., 2021b). PCA are CA used specifically in the educational context that interact with and support students individually (Wellnhammer et al., 2020). Winkler et al. (2020) confirm that they can positively impact learning success. Further, Lee et al. (2022) and Daradoumis et al. (2021) showed that digital assistants can positively influence students’ self-regulation, study performance, and soft skills. PCA can interact directly with students, discuss with them, or give individual support and recommendations. Thus, they offer similar support options as personal counseling and can counteract the limited capacity of lecturers and counselors (Weber et al., 2021). Several PCA have already been developed, e.g., to help students learn to program (Hobert, 2019a), strengthen argumentation skills (Wambsganss et al., 2021b), improve mathematical skills (Cai et al., 2019), recommend learning steps (Iatrellis et al., 2023), or provide quizzes for learning (Ruan et al., 2019). Smart personal assistants (SPA) are systems "that interact with the user via natural language and offer many opportunities of service and information provision to reduce effort and complexity of users' everyday tasks" (Knote et al., 2019, p. 2025). Previous research classifies SPA into five archetypes: chatbots, adaptive voice (vision) assistants, embodied virtual assistants, passive pervasive assistants, and natural conversation assistants (Knote et al., 2019). Existing research on PCA and SPA focuses on requirements for these systems, develops and evaluates prototypes, and deduces design principles (e.g., Hobert, 2019a; Wambsganss et al., 2021b).

Besides SPA and PCA, IDSA are another type of software in the educational context. Depending on an IDSA's design, architecture, and functionalities, it can be classified into one of the five archetypes according to Knote et al. (2019). In line with the previous definitions and Karrenbauer et al. (2021), we define an IDSA as a software program that provides suitable functionalities to strengthen students' study organization, self-regulation, and goal achievement, considering individual interests, goals, and competencies. Thus, an IDSA, for instance, suggests open educational resources (OER) and teaching networks based on a student's own interests, supports major and course selection based on a self-assessment, or provides individual learning strategies (Karrenbauer et al., 2021). Data for these functionalities and recommendations are provided by various sources, such as learning or campus management systems (CMS) (e.g., performance reports, completed/open modules), external sources (e.g., OER repositories), or direct IDSA-student interaction (Granić, 2022). While PCA support students in learning content to improve learning outcomes (Winkler et al., 2020), IDSA improve students' self-regulation and organization skills. They strengthen students' abilities to manage and organize their studies individually and provide situation-specific guidance and recommendations. IDSA thus address learning on a more reflective level. To generate these added values, it is necessary to analyze the process of IDSA design, development, and evaluation in more detail.
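
To make this data flow concrete, the following minimal Python sketch shows how an IDSA might aggregate the three source types named above (CMS records, direct student input, and OER repositories) into one student profile that recommendation functionalities can consume. All names and structures are hypothetical illustrations, not the project's actual implementation.

```python
from dataclasses import dataclass, field


@dataclass
class StudentProfile:
    """One consolidated view of a student, fed by several systems."""
    student_id: str
    completed_modules: list[str] = field(default_factory=list)  # from the CMS
    interests: list[str] = field(default_factory=list)          # from direct IDSA input
    oer_suggestions: list[str] = field(default_factory=list)    # from OER repositories


def build_profile(student_id: str,
                  cms_records: dict[str, list[str]],
                  stated_interests: list[str],
                  oer_index: dict[str, list[str]]) -> StudentProfile:
    """Merge CMS data, student input, and an OER index into one profile."""
    profile = StudentProfile(student_id=student_id)
    profile.completed_modules = cms_records.get(student_id, [])
    profile.interests = stated_interests
    for interest in stated_interests:  # look up OER matching stated interests
        profile.oer_suggestions.extend(oer_index.get(interest, []))
    return profile
```

Keeping the consolidated profile as one explicit data structure is one way to let individual functionalities remain modular while sharing the same integrated data.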

Self-regulation theory

The aim of an IDSA is to support students in organizing their studies, finding their own interests, and developing their own goals, while also promoting study skills through functionalities that foster the important abilities of working in a self-regulated, self-structured, self-organized, self-motivated, and collaborative way (Payan-Carreira et al., 2023). These are the key skills a student needs to successfully perform and complete their studies. Of secondary importance are future skills such as agile working, responding flexibly to changes and uncertainties, dealing with information overload, etc. (Foelsing & Schmitz, 2021).

Therefore, in our project we decided to focus on key competencies. When we talk about self-regulated work, we refer to Zimmerman (2013), who defines self-regulation as a meta-cognitive, motivational, and behavioral process. Students need to use specific strategies to achieve their goals, and this must be based on perceptions of self-efficacy (Maddux & Gosselin, 2012). Three elements are important in this context: students' self-regulated learning strategies, their self-efficacy perceptions of their performance capabilities, and their commitment to realistically set goals (Locke & Latham, 2019). The relationship between attributes (e.g., passion and persistence), visions, goals, and self-efficacy is critical to student success. This process requires students to prepare, monitor, and evaluate their own learning. Digital technologies provide an open gateway to new learning alternatives and options that support the acquisition of self-regulation skills (Bernacki et al., 2011). However, Yot-Domínguez and Marcelo (2017) note that students' use of technology to support self-regulated learning skills is limited. It therefore makes sense to develop the content of the IDSA together with students, teachers, and university administrators, who also evaluate the functionalities in the next step. An IDSA can include functionalities that balance intrinsic and extrinsic factors, implemented, for example, through the Interactive, Constructive, Active, Passive (ICAP) framework (Michelene et al., 2014).

Student life cycle

Each phase of study has different requirements and presents students with new opportunities and challenges. The Student Life Cycle (SLC) encompasses all relevant tasks and areas of activity of students, faculty, and university administrators related to study (Sprenger et al., 2010). In general, the following phases can be distinguished (Lizzio et al., 2012): (1) orientation, (2) application for admission and matriculation, (3) participation in courses and examinations, (4) graduation and exmatriculation, and (5) alumni activities. IDSA can support these phases with specific functionalities. The decision in which phase to offer an IDSA depends on several factors. The structure and focus of the SLC vary in terms of instruction (Harlan, 1994), quality management (Manarbek et al., 2020), and the cost of a CMS (Sprenger et al., 2010). Bates and Hayes (2017) point out that students in transition, such as those moving from a bachelor's to a master's program, need more intensive and targeted support to make sustainable decisions. Wymbs (2016) notes that a lot of electronic data is collected at the point of enrollment. This data can be used, for example, to match it against self-assessment data with the help of artificial intelligence (AI) to provide tailored support in the decision-making process of finding a suitable course. Overall, the focus is on user-centered action within the study phases.

The progressive digital transformation of universities enables a wide range of study programs, seminars, and lectures and addresses individual needs, such as study financing, psychological support, and media-supported designs. In this context, an SLC as an organizational structure provides a binding, market-oriented set of rules for students, faculty, and university administration, thus ensuring stability in diversity (Harlan, 1994). We use the differentiated SLC of Sprenger et al. (2010) to design support services and functionalities. The three phases in Table 1 include structured sub-dimensions that provide guidance for the conceptualization, development, implementation, and evaluation of IDSA functionalities.

Table 1 Student Life Cycle (Sprenger et al., 2010)

Research design and methods

Our research is part of a long-term project to develop and evaluate an IDSA in HEI. A user-centered IDSA considers the perspectives of students, faculty, and administrators, as well as models and theories that provide theoretical support for various aspects. We chose the ADR approach because it directly involves the client and stakeholders in the process. The ADR method is iterative, building on participatory inquiry between stakeholders and researchers. Sein et al. (2011) lend rigor to the ADR process through a structured approach that comprises four phases (see Fig. 1). In Table 2, we briefly summarize each phase of our research as well as the activities undertaken and their outcomes. Because the phases are iterated, there are some overlaps and additions between them.

Fig. 1 ADR research cycles

Table 2 ADR: phases, tasks, and outcomes

Phase 1: problem formulation

We set out to explore the impact of HEI reforms and the challenges arising from them (see above), and how they affected our own HEI. Drawing on our own experiences and a literature review, we held a kick-off meeting with all the researchers at the beginning of the project to formulate individual research sub-questions.

Phase 2: building, intervention, and evaluation

Our research process is divided into three cycles, as shown in Fig. 1. In a first iteration, we defined the research questions and planned the entire research process. We conducted an online survey with students from three German HEIs to identify initial requirements for an IDSA. Students were asked about successful IDSA functionalities, required features, barriers to use, and other important IDSA topics. A total of 570 students from different disciplines participated. In the next iteration, guided expert interviews were conducted with nine lecturers (INT.L.) and 19 staff members from different organizational units (INT.U.). A list of all expert profiles can be found in Additional file 1: Appendix S1. The interview guide covered critical success factors (CSF) of an IDSA, incentives to make use attractive, barriers to use, and challenges of use. All interviews lasted between 35 and 65 min. They were recorded, transcribed, and qualitatively analyzed with MAXQDA19 according to Corbin and Strauss (2014). Based on the interview results, we identified further requirements. We then conducted an extensive literature review using Cooper's (1988) taxonomy and following vom Brocke et al. (2009, 2015) and Watson and Webster (2020). The findings guided the development of the IDSA alpha prototype, translating the initial requirements into a technical implementation. During the initial implementation, only three requirements were implemented in a rudimentary way, as the focus was more on decisions about how to technically implement the IDSA in future design cycles. All development followed a hybrid software development process combining a waterfall model (Thesing et al., 2021) with Scrum (Schwaber, 1997). The practitioners conducted four design thinking workshops (Plattner et al., 2011) with a total of 24 students to obtain initial feedback. The six steps of Plattner et al. (2011) were carried out, including testing our alpha prototype and generating ideas for further features using pre-defined personas (Lübcke et al., 2020).

Phase 3: reflection and learning

After the evaluation iterations of our alpha prototype, we systematically evaluated all feedback. Together with the identified requirements, we adapted our prototype in a hybrid software development process. For our IDSA beta prototype, we changed the three existing features and added seven new ones. We integrated our prototype into the local LMS of three German HEIs. Parallel to the field test, the practitioners conducted a second round of design thinking workshops (four workshops with 31 students in total) to evaluate our prototype. These were similar to the previous design thinking workshops, but without the use of personas (Lübcke et al., 2020). Again, we used the feedback and new insights along with the usage data to adjust our beta prototype. After technical adjustments, the gamma prototype was made available to students for three months. A more detailed evaluation took place in another workshop with four students. Based on the student survey, we reflected on the requirements of an IDSA from a student-centered perspective in a first inductive-explorative step. This resulted in the need to conduct further interviews with experts from different organizational units of the university and faculty from different subject areas in order to explore additional perspectives and identify needs. Following these studies, we conducted a thematic and methodological literature review. In parallel, we developed our first IDSA alpha prototype. In three-and-a-half-hour design thinking workshops with small groups of students, we analyzed further requirements. We also had a large number of testers, but hardly anyone gave feedback or generated ideas. This finding was confirmed during the subsequent prototype tests: there were more and more testers, but the feedback remained comparatively weak. While developing the prototype and testing the functionalities, we took a closer look at the SLC: not an IDSA for all students, but a targeted one, e.g., for first-year students, who are getting to know the HEI and have different questions and needs than students at the end of their bachelor's degree.

Phase 4: learning formalization

We identified a first set of general IDSA requirements. Based on these requirements, we developed our first IDSA alpha prototype and evaluated it in three iterations with multiple students and usage data. This allowed us to further refine our prototype and develop an integrated system. Our results and findings from all iterations provide a guide for the design, development, implementation, and evaluation of an IDSA in HEI. Our third prototype was perceived as an improvement by students because we stopped trying to reach all students and focused on essential areas, such as supporting first-year students in the transition from school to university. There was a significant learning curve in designing an IDSA for specific target groups, e.g., new or more advanced students or those transitioning from a bachelor's to a master's program.

First results, findings, and recommendations

Kickoff and project planning

We developed the research agenda and organized the research steps in a kick-off meeting with all project partners. Subsequently, the project partners and the individual universities got to know each other. Work packages with concrete responsibilities were assigned in a task-oriented manner, and basic tools for collaborative work were compiled on different platforms. Eight important research questions were to be addressed, two of which were discussed in detail: "How to achieve high actual usage" and "How to share teaching services and cross data". At the end of the meeting, the requirements analysis and the development of a back-end system were put on the agenda. As a first step, the needs and requirements of the different stakeholders for an IDSA in relation to the above-mentioned research questions were collected.

Requirements analysis for an IDSA

As part of the requirements analysis, surveys were conducted on the needs and requirements of the various stakeholders: students, university management, and faculty. Initial requirement profiles were developed; the questions and answers can be found in Additional file 1: Appendix S2. We then integrated the perspectives of HEI organizational units through qualitative interviews and a literature review. This provided an initial overview of the needs and requirements of these groups. From all the data, we derived eight IDSA requirements. We considered requirements critical when they were mentioned by several participants and/or in the literature.

According to these, an IDSA must have functionalities (Requirement (R) 1) that provide value to students. In particular, these include functionalities that support self-regulation (R1.1), course recommendations (R1.2), OER and exchange networks (R1.3), learning organization (R1.4), study abroad (R1.5), and the achievement of learning goals (R1.6). Students provided examples: “References to open source materials for learning would also be great” or “I think it is important to reflect on one's own learning progress”. Students and staff also emphasized opportunities for sharing and networking (R1.7). “I find it particularly attractive when students have contact with groups with whom they can exchange ideas directly” (INT.L.1). This includes sharing experiences from previous courses as well as forming and finding communities or small groups for learning. All functionalities must support students at the first level and facilitate daily study in order to add value (R1.8). In addition, an IDSA must provide contact options (R1.9) for both technical support and content-related questions and provide timely responses (Bani-Salameh & Abu Fakher, 2015; La Rotta et al., 2020; McPherson & Nunes, 2006; Naveh et al., 2010; Zengh, 2013; student survey). Users must also be able to provide feedback on the IDSA (R1.10).

Another identified requirement is the usability and accessibility (R2) of an IDSA. It must be simple, intuitive, and easy to use (R2.1) and provide the flexibility to hide or discard irrelevant or uninteresting features through a modular design (R2.2) (Alla et al., 2013; Freeman & Urbaczewski, 2019; Lu & Dzikria, 2019; Uppal et al., 2017; student survey; expert interviews). "I see barriers in usability. It needs to be self-explanatory" (INT.U.1). Additionally, students request that an IDSA be easily accessible (R2.3); complex and lengthy registration and login processes are major barriers to use (Alhabeeb & Rowley, 2018; Freeman & Urbaczewski, 2019). An IDSA needs to be platform agnostic (R2.4), running, for example, in browsers and on mobile devices (student survey). It must also integrate with existing university platforms (R2.5). “Linking possibilities with already existing online platforms […] are very important, otherwise there will be redundancies and overlaps” (student survey).

We identify the quality and availability of information as another IDSA requirement (R3). The content, information, and recommendations provided by an IDSA must be credible, current, complete, and relevant (R3.1) (Holsapple & Lee-Post, 2006; Isaac et al., 2017; Naveh et al., 2012; Odunaike et al., 2013; Raspopovic & Jankulovic, 2014). Unreliable and outdated information is a critical barrier to the use of an IDSA. Redundancies must be eliminated (R3.2), and data and content must be limited to avoid information overload (R3.3) (Holsapple & Lee-Post, 2006; student survey). One student noted: “It would bother me if I got the same suggestions from [an IDSA] more than once” (student survey).
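
To illustrate R3.2 and R3.3, the following small Python sketch (a hypothetical helper, not the project's code) filters out suggestions a student has already seen and caps how many are shown at once:

```python
def fresh_recommendations(candidates: list[str], already_shown: set[str],
                          limit: int = 5) -> list[str]:
    """Return at most `limit` suggestions the student has not seen before."""
    fresh: list[str] = []
    for item in candidates:
        if item not in already_shown and item not in fresh:
            fresh.append(item)   # R3.2: drop repeats and redundancies
        if len(fresh) == limit:
            break                # R3.3: cap output to avoid information overload
    already_shown.update(fresh)  # remember what is about to be shown
    return fresh


shown: set[str] = set()
print(fresh_recommendations(["OER-1", "MOOC-2", "OER-1"], shown))  # ['OER-1', 'MOOC-2']
print(fresh_recommendations(["OER-1", "Course-3"], shown))         # ['Course-3']
```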

Data integration (R4) is another identified need. An IDSA needs to integrate data (R4.1) from multiple sources, such as an LMS or CMS, to expand the database for recommendations. Integration also reduces the manual effort required to enter all the data and lowers conversion costs (R4.2). Students request “linkage to content in the examination regulations” and “portability of previous data”. Linking data (R4.3) allows more information to be considered and more reliable recommendations to be made. For example, recommendations must always be based on individual strengths, interests, competencies, and courses already taken (R4.4). Self-reliance (R4.5) must be maintained: recommendations must remain recommendations and should not be implemented automatically. "No changes [should] be made without my manual approval […]. Recommendations tailored to me are great, but of course I want to make the decisions myself" (student survey).
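
A minimal sketch of how R4.4 and R4.5 could be realized, assuming a simple course catalogue keyed by topics; all names are illustrative, and the point is that a recommendation is derived from individual data but never takes effect without an explicit user action:

```python
from dataclasses import dataclass


@dataclass
class Recommendation:
    course_id: str
    rationale: str
    approved: bool = False  # R4.5: stays False until the student confirms


def recommend_courses(interests: set[str], completed: set[str],
                      catalogue: dict[str, set[str]]) -> list[Recommendation]:
    """Suggest courses whose topics overlap the student's interests (R4.4)."""
    suggestions = []
    for course_id, topics in catalogue.items():
        if course_id in completed:
            continue  # never recommend a course already taken
        overlap = topics & interests
        if overlap:
            suggestions.append(Recommendation(
                course_id,
                rationale="matches your interests: " + ", ".join(sorted(overlap)),
            ))
    return suggestions


recs = recommend_courses(
    interests={"machine learning"},
    completed={"intro-programming"},
    catalogue={"ml-basics": {"machine learning"}, "intro-programming": {"python"}},
)
for rec in recs:
    rec.approved = True  # R4.5: only an explicit user action changes state
```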

An IDSA must also offer the possibility of individualization (R5). Students want to individualize the IDSA according to their needs (R5.1) (Raspopovic & Jankulovic, 2014). This includes "disabling some features when possible, but enabling other features to customize [the IDSA]" (student survey). In addition, based on integrated data (R4), an IDSA must have knowledge of a user's studies, progress, outcomes, and regulations (R5.2) and provide individualized recommendations and faculty-specific information on study, testing, and learning organization (R5.3).
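
The modular design from R2.2 and the individualization from R5.1 could be organized, for example, around a simple feature registry; the following Python sketch is a hypothetical illustration, not the project's implementation:

```python
class FeatureRegistry:
    """Registry of IDSA functionalities a student can switch on or off."""

    def __init__(self) -> None:
        self._features: dict[str, bool] = {}  # feature name -> enabled?

    def register(self, name: str, enabled_by_default: bool = False) -> None:
        self._features[name] = enabled_by_default

    def set_enabled(self, name: str, enabled: bool) -> None:
        self._features[name] = enabled  # student-controlled toggle (R5.1)

    def active_features(self) -> list[str]:
        return [name for name, on in self._features.items() if on]


registry = FeatureRegistry()
for feature in ["Interests", "Learning Organization", "Get Together", "Study Abroad"]:
    registry.register(feature)  # hidden until the student opts in (R2.2)
registry.set_enabled("Interests", True)
print(registry.active_features())  # ['Interests']
```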

We also identified marketing strategies and efforts for the IDSA as a requirement (R6). Early development of operational and strategic marketing activities and concepts (R6.1) is imperative. "It has to be made known that there is such a thing and that it is very much appreciated" (INT.U.14). This increases awareness of the IDSA (R6.2) and encourages a greater number of users (R6.3). The wording and communication of these marketing activities, as well as of the IDSA itself, needs to be appropriate for the target audience (R6.4), e.g., rather simple and short (Machado-Da-Silva et al., 2014; Raspopovic & Jankulovic, 2014). Students also showed low willingness to use a paid IDSA; there should be “no additional cost to students for the service” (student survey). Therefore, the pricing of the IDSA (R6.5) is another critical barrier.

Our results show that privacy and data security (R7) need to be considered. According to Alsabawy et al. (2011), robust security and appropriate data communication and transmission lead to higher user trust. For students and university staff, transparent handling (R7.1), no misuse (R7.2), and anonymized data collection (R7.3) are essential: “Anonymized data collection and no disclosure to third parties or use of data for purposes outside known functions” (student survey). It must be clear to students what personal data is used for and how this may affect functionality. Therefore, detailed privacy settings (R7.4) are needed that describe the use of the data and the purpose of the processing and allow for customization. As one student put it: “complete transparency in the handling of personal data, possibility to refuse individual aspects of use if necessary” (student survey).
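
A hedged sketch of the detailed privacy settings described in R7.4, assuming per-purpose opt-in consent (all purpose names are invented for illustration):

```python
from dataclasses import dataclass, field


@dataclass
class PrivacySettings:
    """Per-purpose consent; unset purposes default to no consent (opt-in)."""
    consents: dict[str, bool] = field(default_factory=dict)

    def allow(self, purpose: str) -> None:
        self.consents[purpose] = True

    def refuse(self, purpose: str) -> None:
        self.consents[purpose] = False

    def permits(self, purpose: str) -> bool:
        return self.consents.get(purpose, False)


settings = PrivacySettings()
settings.allow("match_interests_for_get_together")  # R7.4: purpose-specific consent
settings.refuse("share_usage_data_for_research")    # R7.1: transparent and revocable
print(settings.permits("match_interests_for_get_together"))  # True
print(settings.permits("share_usage_data_for_research"))     # False
```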

An IDSA must not be prone to error (R8). Continuous testing and adaptation during development (R8.1) is important (student survey; expert interviews). Even during field testing and after implementation, an IDSA must be continuously developed, bugs must be fixed (R8.2), and new updates must be provided (R8.3). “Extensive testing before implementation [of the IDSA] for everyone to avoid as many problems as possible later” (student survey). In all of these processes, it is recommended that students be included as the primary end-user audience (R8.4). "It is best to have students test the application to identify weaknesses" (student interview). All identified requirements and their implementation during IDSA development are shown in Additional file 1: Appendix S3.

Prototypes’ development, usage analysis, and evaluation

Alpha prototype

From the student survey and interviews, we collected many requirements, some of which we implemented in our alpha prototype. The first obstacle was that, while the system worked almost flawlessly, only a few requirements could be met. Based on the interview results, three functionalities were implemented: Study Abroad (information and guidance for planning a semester abroad) (R1.5), Interests (recommendation of learning resources, including events at the students' own or partner universities, OER, and MOOCs) (R1.2, R1.3), and Learning Organization (education about learning organization techniques) (R1.4).

The alpha prototype was tested with 732 students, mainly against the criteria of wording, system errors, and usability. Improvements were explored intensively in focus groups during design thinking workshops with different personas (Lübcke et al., 2020). The goal was to get creative feedback and new ideas for improvements. Four workshops with five to seven participants each were conducted at three different universities; the criteria for selecting participants were based on the SLC. Each workshop lasted three and a half hours and was used to identify additional needs. Students put themselves in the shoes of the described personas to identify needs from their perspective, such as organizational and social integration.

Beta prototype

Improvements

Our evaluation showed that students did not want to study alone; they wanted to be socially integrated. This requirement was realized in the functionality Get Together (R1.7). Further, one focus group concentrated on supporting the transition phase from school to studies; the corresponding requirements were addressed in the functionality Study Orientation. Personal interests (R1.2) were supported at the first level by the functionality Interests. In addition, the functionalities Learning Organization (R1.4, R1.6), Memory & Attention (R1.1), Data Ethics (R7), Evaluation (R1.9, R1.10), Study Finance (R1.1), OER (R1.3), Scientific Career (R1.1), To-do (R1.1, R1.4, R1.6), and Personality (R1.1) were developed. Table 3 gives an overview of all implemented functionalities with a short description.

Table 3 Developed functionalities during our ADR cycles

First field study usage data analysis

We integrated our beta prototype into the LMS of the three German universities for three months (R2.5; R3; R4; R6.2; R6.3; R6.5) to get feedback from our target users (R8.4). It was accessible via web browser or mobile phone (R2.4). Use and testing were voluntary. All features were presented with a brief description, and users had the option to enable or disable features based on their interests. This allowed for self-reliance (R4.5) and individualization of the IDSA (R2.2; R5). A total of 1036 students used the prototype. In accordance with the individually adjustable privacy settings and opt-out options (R7), 634 students released their data for research purposes. Of these, 145 did not submit data, leaving 489 for further analysis. Of these, 72% were in a bachelor's program and 23% in a master's program. Most bachelor's students were in their first semester (30%), with a further 39% in their second to seventh semester, while 19% of the master's students were in their first three semesters. The heterogeneity of the students resulted in a variety of majors, such as science, education, and business. Overall, the functionalities Learning Organization, Interests, OER, Personality, and To-Do (R1.1; R1.4; R1.6) had the highest usage, with 480 to 555 uses. Get Together, Data Ethics, Scientific Career, Study Abroad, and Study Finance were activated between 310 and 387 times. Students also had the opportunity to provide further feedback on the added value (R1.8; R1.9; R1.10) and the wording (R6.4) via the evaluation functionality (R1.9; R8). Additional questions were included in the evaluation to assess the usability and user interface (R2) in more detail. Students rated the usability and design and whether they found the IDSA easy and intuitive to use. Feedback was received from 271 students. With the exception of one free-text option, all questions in the evaluation were answered on a five-point Likert scale (1 “strongly disagree” to 5 “strongly agree”). Our evaluation shows that students did not always see added value in the functionalities (R1.8) (M = 2.79) and could only partially imagine using the IDSA on a regular basis (M = 3.10). The IDSA provided students with functionalities that supported reflection and thinking about their study goals (R1.1; R1.6) (M = 3.58). Students found the wording used in the texts and content understandable (R6.4) (M = 3.52). They rated the use of the IDSA as pleasant (R2) (M = 3.75) and moderately intuitive (R2.1; R2.3) (M = 3.44). Design (M = 3.08) and implementation (M = 2.93) were rated as weak and in need of improvement.
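
For illustration, the reported means are simple arithmetic means over five-point Likert ratings; the following snippet shows the computation with invented example ratings, not the study's raw data:

```python
from statistics import mean

# Ratings per evaluation item on a five-point Likert scale
# (1 = "strongly disagree" ... 5 = "strongly agree").
feedback = {
    "perceived_added_value": [2, 3, 3, 4, 2],
    "supports_goal_reflection": [4, 3, 4, 4, 3],
}

for item, ratings in feedback.items():
    print(f"{item}: M = {mean(ratings):.2f}")
```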

First student evaluation workshop

In addition to the usage data, four three-and-a-half-hour design thinking workshops (Plattner et al., 2010) were conducted with seven to nine students each (n = 31) from the three German universities (R8) (Lübcke et al., 2020, 2021). The students were divided into groups according to their level of study. The aim of the workshops was to evaluate the prototype in order to improve it. The study assistant and its functionalities were presented, and the further procedure was explained. The students tested two or three functionalities in small groups and used a function-specific questionnaire to provide feedback. The individual functionalities were then placed in a matrix spanned by two dimensions: very/not helpful and very/not understandable. Based on this, the students made suggestions for improvement and moved features from unhelpful/not understandable toward helpful/understandable. Overall, the Personality and Get Together features were tested most frequently (seven tests each), followed by Interests (n = 4), Learning Organization (n = 4), Study Abroad (n = 2), To-do (n = 2), Scientific Career (n = 2), and Data Ethics (n = 1). The Personality, Get Together, and Interests features were rated as the most helpful in the matrix. Personality and Get Together were described as sometimes difficult to understand. Students saw a lot of potential in the Personality feature, especially for first-year orientation (new requirement R5.4: audience specific). They recommended that the feature be more personalized and include more individual input (R4; R5). For the Get Together feature, students wanted more information about their matches. Students also saw a lot of potential in the Interests feature, but it was unclear to them how to describe their interests; they suggested that students be given more guidance on formulating their interests, e.g., with reflection questions about their studies and suitable example formulations from other students (new requirement R1.11: reflection questions). Students used the feedback option and the matrix to critically question functionalities, e.g., because they do not offer significant added value in their current state (R1.8). This was especially true for the To-Do, Learning Organization, Scientific Career, and Data Ethics functionalities. In general, the importance of customization (R5), support (R1), and usability (R2) was emphasized in these workshops.

Gamma prototype

Adaptations

Based on the usage data and the design thinking workshops, we adapted and improved our prototype and increased its added value (R1.8; R8). We merged the functionalities To-do and Learning Organization to prevent redundancies and overlaps between them, and we integrated the OER functionality into Interests. In addition to matching events from the students' own HEI, the IDSA now also recommends relevant OER and MOOCs (R1.2; R1.3). We also introduced reflection questions to help students formulate their interests (R1.11). Further, we removed the Scientific Career functionality due to missing added value (R1.8). Although the Data Ethics functionality was also rated as not helpful in the design thinking workshops, we decided to keep it in the system to sensitize students to data protection and strengthen their data protection awareness (R7). We adapted and improved the scope of the Study Abroad functionality: it now consists of a checklist of tasks and things to do before, during, or after the application phase to support self-organization abilities (R1.1). The Get Together functionality now allows students to create a summary with personal information, such as name, picture, a short description, and information from the Study Abroad and Interests functionalities. This information is either entered manually or, with the users' consent (R7), taken from other functionalities (R4). After a match, information about the respective person is displayed; again, the users decide which information is shown (R7). Due to the high number of users (R6.2; R6.3), especially in the early study phase, we introduced a new functionality, Study Orientation, to support them (R5.4). It contains information about studying, such as the library, exam preparation, and self- and time management; the previously stand-alone Study Finance functionality was integrated here. We further added a functionality Study Goals to support students' study goal achievement (R1.6). It allows students to map individual goals and derive strategies to reach them. The functionality Personality was restructured and changed to Memory & Attention. We did not want to offer a psychological personality test, because this would have necessitated a further offer: individual evaluation of, and dialogue about, the results with the student. Memory & Attention fits our overall idea of the IDSA better: it tests long- and short-term memory and, depending on the result, provides tips and methods to train these skills in a self-directed way. The Learning Organization functionality was extended by a self-test that likewise offers methods and tips depending on the result. All realized functionalities of all prototypes are shown in Table 3. In addition to the functional changes, we also adjusted and specified the privacy settings (R7).

Second field study data analysis

We deployed the next prototype in the local LMS of the three German universities for three months to get feedback from the target users. Usage was voluntary, and students had the opportunity to select functionalities according to their interests and competencies and to provide feedback via the evaluation function. A total of 1027 students used the prototype, and 274 provided their usage data for research purposes (R7). Of these, 106 provided their student data. Most users were bachelor's students (78%) in their first (40%), third (16%), or fifth (22%) semester; 22% were master's students, most of them (57%) in their first two semesters. Again, students from a variety of disciplines used the prototype. The Study Goals and Memory & Attention functionalities had the most interactions, with more than 180 users each. Interests, Study Abroad, and Learning Organization each had between 130 and 140 users, and the remaining functionalities, Data Ethics, Get Together, and Study Orientation, had between 100 and 125 users. Our gamma prototype also included the rating feature to collect feedback. However, compared to 271 evaluation responses for our beta prototype, only four students provided feedback on the gamma prototype. This feedback was valuable in terms of quality, but it limited the usage-data-based comparison with our beta prototype regarding the impact of our adjustments on added value, wording, and usability.

Second student evaluation workshops

The practitioners also conducted evaluations of our gamma prototype, this time in an evaluation workshop with four students. After the current state of our prototype and its functionalities were explained, the students evaluated the outcome, impact, and output of selected functionalities using a program tree according to Beywel and Widmer (2009). For Learning Organization (R1.4), the feedback was that the functionality allows for more efficient learning, makes the learning process less stressful, and helps students acquire general learning organization skills and develop learning strategies. Study Orientation (R1.12) provides added value to students through a needs-based presentation of course offerings and contributes to personal development; the Memory & Attention functionality (R1.13) supports learning and enables individual suggestions to increase attention. Interests (R1.2) is seen as particularly useful at the beginning of the semester, as it supports students in planning the course of the semester. Another result of the workshop is that the Study Abroad functionality (R1.14) helps with self-organization before departure. Overall, our gamma prototype was perceived as a useful support, especially for better and more structured study planning (R1.1). After this third evaluation iteration, we finished the design and development process and transferred the prototype into an active system.

Discussion, implications, and generalized recommendations

ADR allows for a diverse choice of data collection tools. These served to capture the stakeholders' different perspectives and thus solidified both our theoretical basis and our understanding of students' needs, such as decision support (e.g., in choosing a master's program), setting goals and pursuing them in a focused way, or orientation in the university system as a first-year student. According to Hone and El Said (2016), the dropout rate has also increased in recent years. In line with Eom et al. (2016) and Ritz et al. (2022), lecturers and the management of university units, e.g., central student advisory services, are also convinced that self-organization, motivation, and regulation are key future skills. While learning assistants have been well researched, their combination with self-organization and first-level support has received less attention. This is particularly relevant because framework conditions, such as independence and stability in decision making, are a prerequisite for a system's acceptance. With the development of the IDSA, we moved in the direction of strengthening key future skills by designing functionalities with regular feedback from students. It should be noted that, although the research need has been identified and the first IDSA prototypes have been created, it remains to be seen whether the IDSA will have the desired effect, i.e., whether it can meet the need for individual guidance that has grown in the course of the university reforms, and whether it will actually foster self-organization.

The IDSA was originally designed for different target groups, but over the years it has become clear that the student body is too heterogeneous and the needs too diverse. At the same time, it has become clear that the SLC orientation is a good way to focus on each phase of study with its specificities: What are the needs of first-year students, students in the middle of their studies, students transferring from a bachelor's to a master's program, or students who have already graduated? Another long-term observation is that students demonstrate little self-control competence; we therefore also saw a pedagogical task in guiding students to formulate and realize their own educational goals step by step, focusing on each phase of a goal with its specific needs. Developing focused learning phases (cf. SLC) that foster self-control competence is a great challenge for students; at the same time, it is always a balancing act between “I, the lecturer, will show you how to do it” and the provocation of resistance in the sense of “don’t patronize me”. The stakeholders were unanimous in warning against this “paternalism”.

Further, students already use many of the applications on the market to support themselves and their studies. At best, our approach can be inviting, in the sense of: “we’ll show you an approach that numerous studies show contributes to student success” (e.g., Michelene et al., 2014; Payan-Carreira et al., 2023). We can generate interest by offering an “all-in-one solution”, in the sense that when a student enrolls, he or she already has access to all systems: the library and the course catalogue, the IDSA, and all units such as the central student advisory service, the international office, the financial advisory service, and psychological counseling.

Our alpha prototype was very rudimentary in its functionality, the beta prototype was one students could already work with, and our gamma prototype offered useful functionalities. Qualitative feedback was sparse: many students accepted the invitation to log into the system and tried the prototype, but only a small percentage were willing to share their data and experiences. During the development of the first two prototypes, focus groups provided qualitative feedback on the functionalities of the system and its technical functioning, as well as suggestions for further development of the features. The experts (Lübcke et al., 2020) indicated that they were able to discuss the results of the prototype in small workshop groups. After the development of our gamma prototype, it was transferred to an active LMS. The theory-driven prototype development supported the goal of user-centered development. In addition to these criteria, there are other project management tasks that are important for decision makers who want to implement an IDSA. An interdisciplinary team has the advantage of looking at the product to be developed from many perspectives, which raises questions that need to be answered, such as whether or not to include a chatbot in the IDSA: one discipline sees only the educational effect, another is concerned with training the AI software, yet another is critical of the system's maturity. Installing a team development process from the beginning, with everyone involved in the project, and on an ongoing basis (every six months or so) contributes significantly to the success of the project. When a team is distributed, it is tempting to hold virtual meetings all the time; however, a mix of face-to-face and virtual meetings is beneficial for team collaboration.

It is invaluable to attract and involve top management as multipliers and to publicize the project. Marketing efforts must begin early, and an IDSA must be publicized beyond the boundaries of the institution. In developing the guidelines for IDSA decision makers (see Table 4), we considered the dimensions of IS success (DeLone & McLean, 2016), and our results and findings align with these dimensions. Consistent, clearly articulated, relevant, and timely information provides a high level of information quality, while qualified personnel providing professional and efficient technical support and maintenance influence the service quality dimension. Table 4 summarizes our findings and the practical benefits of the IDSA. Field testing by numerous students, who test the functionalities in their everyday studies and provide feedback, reveals the need for further research. In particular, research on PCA and IDSA should continue to be combined.

Table 4 IDSA guidelines for HEI decision makers

Limitations, further research, and conclusions

With our identified requirements and prototype development guidelines, we focused on the design, development, implementation, and evaluation of an IDSA in HEI. However, we cannot say anything about the long-term effects of our IDSA. Further research can analyze the IDSA's long-term influence on students' self-organization abilities and on study support and simplification, using, e.g., quantitative and qualitative analyses or field studies. During our ADR iterations, we realized that there are many hurdles for an IDSA, not all of which are addressed here. Further research can systematically analyze the changes and adjustments necessary within an HEI to enable IDSA design and development. Once introduced, the usage rate critically influences the success of an IDSA. Further research can focus on which factors influence IDSA usage and how to influence adoption, trust, and acceptance. If an IDSA is not widely used, failure reasons can also be determined. During our IDSA implementation, we were restricted by the LMS. Despite much research on voice-based assistants, our IDSA does not include voice interaction. Further research can analyze how to transform our IDSA into a voice-based assistant by adapting the functionalities or introducing new ones. Our literature review to derive IDSA requirements was international in scope, but the findings from our qualitative and quantitative research and evaluations came from German participants only. This weakens generalizability, because HEIs and study programs in, e.g., Germany, America, Australia, and Asia differ considerably. Thus, further research can build on our results and findings to identify international similarities and differences in IDSA requirements and guidelines. Further, some of our evaluations were limited in scope; in particular, we did not receive much usage-data feedback for our gamma prototype, and the corresponding workshop was limited to four participants. In summary, however, we contribute to theory and practice and provide insights, knowledge, expertise, and guidance on IDSA in HEI.

Availability of data and materials

The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.

Abbreviations

ADR: Action Design Research
IDSA: Individual Digital Study Assistants
HEI: Higher education institutions
COVID19: Coronavirus disease 2019
MOOC: Massive open online courses
LMS: Learning management system
IS: Information systems
CA: Conversational agents
PCA: Pedagogical conversational agents
SPA: Smart personal assistants
RQ: Research question
OER: Open educational resources
CMS: Campus management systems
ICAP: Interactive, Constructive, Active, Passive
SLC: Student life cycle
INT.U.: Interview, organizational units
CSF: Critical success factors
INT.L.: Interview, lecturers
R: Requirement
IT: Information Technology

References

  • Alhabeeb, A., & Rowley, J. (2018). E-learning success factors: Comparing perspectives from academic staff and students. Computers and Education, 127, 1–12. https://doi.org/10.1016/j.compedu.2018.08.007

  • Alla, M. M. S. O., Faryadi, Q., & Fabil, N. B. (2013). The impact of system quality in e-learning system. Journal of Computer Science and Information Technology, 1(2), 14–23.

  • Alsabawy, A. Y., Cater-Steel, A., & Soar, J. (2011). Measuring e-learning system success. Proceedings of the Pacific Conference on Information Systems. https://doi.org/10.4018/978-1-4666-0170-3.ch015

  • Bani-Salameh, H., & Abu Fakher, S. (2015). E-learning critical success factors model: Empirical investigation. Proceedings of the International Conference on Intelligent Information Processing, Security and Advanced Communication. https://doi.org/10.1145/2816839.2816870

  • Bates, L., & Hayes, H. (2017). Using the student lifecycle approach to enhance employability: An example from criminology and criminal justice. Asia-Pacific Journal of Cooperative Education, Special Issue, 18, 141–151.

  • Bernacki, M. L., Aguilar, A. C., & Byrnes, J. P. (2011). Self-regulated learning and technology-enhanced learning environments: An opportunity-propensity analysis. In G. Dettori & D. Persico (Eds.), Fostering self-regulated learning through ICT (pp. 1–26). IGI Global. https://doi.org/10.4018/978-1-61692-901-5.ch001

  • Beywel, W., & Widmer, T. (2009). Stand und Perspektiven der Evaluation. Zeitschrift für Politikberatung, 2, 499–506. https://doi.org/10.1007/s12392-009-0210-7

  • Busse, J., Lange, A., Hobert, S., & Schumann, M. (2020). How to design learning applications that support learners in their moment of need—Didactic requirements of micro learning. Proceedings of the Americas Conference on Information Systems.

  • Cai, W., Grossman, J., Lin, Z., Sheng, H., Tian, J., Wei, Z., Williams, J. J., & Goel, S. (2019). MathBot: a personalized conversational agent for learning math. Association for Computing Machinery (ACM).

  • Clarke, J., Nelson, K., & Stoodley, I. (2013). The place of higher education institutions in assessing student engagement, success and retention: A maturity model to guide practice. In S. Frielick, N. Buissink-Smith, P. Wyse, J. Billot, J. Hallas, & E. Whitehead (Eds.), Research and development in higher education: The place of learning and teaching (pp. 91–101). Higher Education Research and Development Society of Australasia.

  • Cooper, H. M. (1988). Organizing knowledge syntheses: A taxonomy of literature reviews. Knowledge in Society, 1, 104–126. https://doi.org/10.1007/BF03177550

  • Corbin, J., & Strauss, A. C. (2014). Basics of qualitative research: Techniques and procedures for developing grounded theory. Sage. https://doi.org/10.4135/9781452230153

  • Daradoumis, T., Marquès Puig, J. M., Arguedas, M., & Liñan, L. C. (2021). A distributed systems laboratory that helps students accomplish their assignments through self-regulation of behavior. Educational Technology Research and Development, 69, 1077–1099. https://doi.org/10.1007/s11423-021-09975-6

  • DeLone, W. H., & McLean, E. R. (2016). Information systems success measurement. Foundations and Trends in Information Systems. https://doi.org/10.1561/2900000005

  • Eom, S. B., Wen, H. J., & Ashill, N. (2006). The determinants of students’ perceived learning outcomes and satisfaction in university online education: An empirical investigation. Decision Sciences Journal of Innovative Education, 4(2), 215–235.

  • Eom, S.B., Wen, H.J., Ashill, N. (2016). The determinants of students’ perceived learning outcomes and satisfaction in University online education: An empirical investigation. Decision Sci J Innov Educ. https://doi.org/10.1111/dsji.12097

  • Foelsing, J., & Schmitz, A. (2021). New Work braucht New Learning [New work needs new learning: A perspective journey through the transformation of our organizational and learning worlds]. Springer Gabler. https://doi.org/10.1007/978-3-658-32758-3

  • Freeman, L., & Urbaczewski, A. (2019). Critical success factors for online education: Longitudinal results on program satisfaction. Communications of the Association for Information Systems, 44, 630–645. https://doi.org/10.17705/1CAIS.04430

  • Granić, A. (2022). Enhancing online participation in education: Quarter century of research. Journal of Computers in Education. https://doi.org/10.1007/s40692-022-00238-8

  • Harlan, R. (1994). The automated student advisor: A large project for expert systems courses. ACM SIGCSE Bulletin, 26, 31–35.

  • Hobert, S. (2019a). Say hello to ‘coding tutor’! Design and evaluation of a chatbot-based learning system supporting students to learn to program, Proceedings of the International Conference on Information Systems.

  • Hobert, S. (2019b). How are you chatbot? Evaluating chatbots in educational settings—results of a literature review. Proceedings of the DELFI Conference. https://doi.org/10.18420/delfi2019_289

  • Holsapple, C. W., & Lee-Post, A. (2006). Defining, assessing, and promoting e-learning success: An information systems perspective. Decision Sciences Journal of Innovative Education, 4(1), 67–85. https://doi.org/10.1111/j.1540-4609.2006.00102.x

  • Hone, K. S., & El Said, G. R. (2016). Exploring the factors affecting MOOC retention: A survey study. Computers and Education, 98, 157–168. https://doi.org/10.1016/j.compedu.2016.03.016

  • Hornsby, D. J., & Osman, R. (2014). Massification in higher education: Large classes and student learning. Higher Education, 67(6), 711–719. https://doi.org/10.1007/s10734-014-9733-1

  • Iatrellis, O., Stamatiadis, E., Samaras, N., Panagiotakopoulos, T., & Fitsilis, P. (2023). An intelligent expert system for academic advising utilizing fuzzy logic and semantic web technologies for smart cities education. Journal of Computers in Education, 10, 293–323. https://doi.org/10.1007/s40692-022-00232-0

  • Isaac, O., Abdullah, Z., Ramayah, T., & Mutahar, A. M. (2017). Internet usage, user satisfaction, task-technology fit, and performance impact among public sector employees in Yemen. International Journal of Information and Learning Technology, 34(3), 210–224. https://doi.org/10.1108/IJILT-11-2016-0051

  • Karrenbauer, C., König, C. M., & Breitner, M. H. (2021). Individual digital study assistant for higher education institutions: Status quo analysis and further research agenda, Proceedings of the International Conference on Wirtschaftsinformatik.

  • Knote, R., Janson, A., Söllner, M., & Leimeister, J. M. (2019). Classifying smart personal assistants: An empirical cluster analysis. Proceedings of the Hawaii International Conference on System Sciences. https://doi.org/10.24251/HICSS.2019.245

  • König, C. M., Karrenbauer, C., Guhr, N., & Breitner, M. H. (2020). Dialogue-driven digital study assistants for higher education – A morphological analysis, Proceedings of the International Conference of Education, Research and Innovation.

  • Kumar, V., Dixit, A., Javalgi, R. G., & Dass, M. (2016). Research framework, strategies, and applications of intelligent agent technologies (IATs) in marketing. Journal of the Academy of Marketing Science, 44, 24–45. https://doi.org/10.1007/s11747-015-0426-9

  • La Rotta, D., Cecilia Usuga, O., & Clavijo, V. (2020). Perceived service quality factors in online higher education. Learning Environments Research, 23, 251–267. https://doi.org/10.1007/s10984-019-09299-6

  • Lehmann, T., Blumschein, P., & Seel, N. M. (2022). Accept it or forget it: Mandatory digital learning and technology acceptance in higher education. Journal of Computers in Education. https://doi.org/10.1007/s40692-022-00244-w

  • Lee, Y. F., Hwang, G. J., & Chen, P. Y. (2022). Impacts of an AI-based chatbot on college students’ after-class review, academic performance, self-efficacy, learning attitude, and motivation. Educational Technology Research and Development, 70, 1843–1865. https://doi.org/10.1007/s11423-022-10142-8

  • Lizzio, A., & Wilson, K. (2012). Student lifecycle, transition and orientation. In Facilitating commencing student success across the lifecycle: Strategic student orientation. Griffith University.

  • Locke, E. A., & Latham, G. P. (2006). New directions in goal-setting theory. Current Directions in Psychological Science, 15(5), 265–268. https://doi.org/10.1111/j.1467-8721.2006.00449.x

  • Lu, H.-P., & Dzikria, I. (2019). Critical success factors (CSFs) of distance learning systems: A literature assessment. Proceedings of the International Joint Conference on Information, Media and Engineering. https://doi.org/10.1109/IJCIME49369.2019.00044

  • Lübcke, M., Seyfeli, F., & Wannemacher, K. (2020). Taking the role of the other: How personas in a design-thinking based workshop help to develop requirements for a data-based student assistant. Proceedings of the 13th International Conference of Education, Research and Innovation (ICERI), Seville, Spain.

  • Lübcke, M., Seyfeli-Özhizalan, F., & Wannemacher, K. (2021). Getting peer-matched by study interests: Students’ expectations and initial experience with a digital study assistant. Proceedings of the International Conference on Cognition and Exploratory Learning in Digital Age.

  • Machado-da-Silva, F., Meirelles, F. S., Filenga, D., & Filho, M. B. (2014). Student satisfaction process in virtual learning system: Considerations based in information and service quality from Brazil’s experience. Turkish Online Journal of Distance Education, 15(3), 122–142. https://doi.org/10.17718/tojde.52605

  • Maddux, J. E., & Gosselin, J. T. (2012). Self-efficacy. In M. R. Leary & J. P. Tangney (Eds.), Handbook of self and identity (pp. 198–224). New York: The Guilford Press.

  • Maedche, A., Legner, C., Benlian, A., Berger, B., Gimpel, H., Hess, T., Hinz, O., Morana, S., & Söllner, M. (2019). AI-based digital assistants: Opportunities, threats, and research perspectives. Business & Information Systems Engineering, 61, 535–544. https://doi.org/10.1007/s12599-019-00600-8

  • Manarbek, G., Kondybayeva, S., Doszhan, R., Turarov, D., Abylay, A., & Pantea, F. (2020). Quality management of higher education: Innovation approach from perspectives of institutionalism. An exploratory literature review. Cogent Business & Management. https://doi.org/10.1080/23311975.2020.1749217

  • McPherson, M., & Nunes, M. B. (2006). Organisational issues for e-learning: Critical success factors as identified by HE practitioners. International Journal of Educational Management, 20(7), 542–558. https://doi.org/10.1108/09513540610704645

  • Meyer von Wolff, R., Nörtemann, J., Hobert, S., & Schumann, M. (2019). Chatbots for the information acquisition at universities—a student’s view on the application area. In A. Følstad, T. Araujo, S. Papadopoulos, E. Lai-Chong Law, O.-C. Granmo, E. Luger, & P. Bae Brandtzaeg (Eds.), Chatbot Research and Design. New York: Springer.

  • Chi, M. T. H., & Wylie, R. (2014). The ICAP framework: Linking cognitive engagement to active learning outcomes. Educational Psychologist, 49(4), 219–243. https://doi.org/10.1080/00461520.2014.965823

  • Naveh, G., Tubin, D., & Pliskin, N. (2010). Student LMS use and satisfaction in academic institutions: The organizational perspective. The Internet and Higher Education, 13(3), 127–133. https://doi.org/10.1016/j.iheduc.2010.02.004

  • Odunaike, S. A., Olugbara, O. O., & Ojo, S. O. (2013). E-learning implementation critical success factors, Proceedings of the International Multi-Conference of Engineers and Computer Scientists.

  • OECD (2023). Number of students (indicator). https://doi.org/10.1787/efa0dd43-en. Accessed 5 July 2023.

  • Payan-Carreira, R., Sebastião, L., Cristóvão, A., & Rebelo, H. (2023). How to enhance students’ self-regulation. In J. Dutton (Ed.), The psychology of self-regulation. New York: Nova Science Publishers, Inc.

  • Plattner, H., Meinel, C., & Leifer, L. (2011). Design thinking. Understand—improve—apply. New York: Springer. https://doi.org/10.1007/978-3-642-13757-0

  • Raspopovic, M., & Jankulovic, A. (2014). Performance measurement of e-learning using student satisfaction analysis. Information Systems Frontiers, 19, 869–880. https://doi.org/10.1007/s10796-016-9636-z

  • Ritz, E., Wambsganss, T., Rietsche, R., Schmitt, A., Oeste-Reiss, S., & Leimeister, J. M. (2022). Unleashing process mining for education: Designing an IT-tool for students to self-monitor their personal learning paths, Proceedings of the International Conference on Wirtschaftsinformatik.

  • Ruan, S., Jian, L., Xu, J., Joe-Kun Tham, B., Qiu, Z., Zhu, Y., Murnane, E. L., Brunskill, E., & Landay, J. A. (2019). QuizBot: A dialogue-based adaptive learning system for factual knowledge. Proceedings of the CHI Conference on Human Factors in Computing Systems. https://doi.org/10.1145/3290605.3300587

  • Schwaber, K. (1997). SCRUM development process. In J. Sutherland, C. Casanave, J. Miller, P. Patel, & G. Hollowell (Eds.), Business object design and implementation (pp. 117–134). New York: Springer. https://doi.org/10.1007/978-1-4471-0947-1_11

  • Sein, M., Henfridsson, O., Purao, S., Rossi, M., & Lindgren, R. (2011). Action design research. Management Information Systems Quarterly, 35(1), 37–56. https://doi.org/10.2307/23043488

  • Sprenger, J., Klages, M., & Breitner, M. H. (2010). Cost-benefit analysis for the selection, migration, and operation of a campus management system. Business & Information Systems Engineering, 2, 219–231. https://doi.org/10.1007/s12599-010-0110-z

  • Tenspolde, C., Greiff, P., König, C. M., Guhr, N., & Hoppe, U. (2019). Barriers of a digital study assistant – Classification within a digital transformation taxonomy, Proceedings of E-Learn: World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education 2019.

  • Thesing, T., Feldmann, C., & Burchardt, M. (2021). Agile versus waterfall project management: Decision model for selecting the appropriate approach to a project. Procedia Computer Science, 181, 746–756. https://doi.org/10.1016/j.procs.2021.01.227

  • Uppal, M. A., Ali, S., & Gulliver, S. R. (2017). Factors determining e-learning service quality. British Journal of Educational Technology, 49(3), 412–426. https://doi.org/10.1111/bjet.12552

  • Van der Wende, M. C. (2000). The Bologna Declaration: Enhancing the transparency and competitiveness of European higher education. Journal of Studies in International Education, 4(2), 3–10. https://doi.org/10.1177/102831530000400202

  • Vom Brocke, J., Simons, A., Niehaves, B., Riemer, K., Plattfaut, R., & Cleven, A. (2009). Reconstructing the giant: On the importance of rigour in documenting the literature search process, Proceedings of the European Conference on Information Systems.

  • Vom Brocke, J., Simons, A., Riemer, K., Niehaves, B., Plattfaut, R., & Cleven, A. (2015). Standing on the shoulders of giants: Challenges and recommendations of literature search in information systems research. Communications of the Association for Information Systems, 37(1), 205–224. https://doi.org/10.17705/1CAIS.03709

  • Wambsganss, T., Schmitt, A., Mahnig, T., Ott, A., Soellner, S., Ngo, N. A., Geyer-Klingeberg, J., Nakladal, J., & Leimeister, J. M. (2021a). The potential of technology-mediated learning processes: A taxonomy and research agenda for educational process mining, Proceedings of the International Conference on Information Systems.

  • Wambsganss, T., Küng, T., Söllner, M., & Leimeister, J. M. (2021b). ArgueTutor: An adaptive dialog-based learning system for argumentation skills. Proceedings of the CHI Conference on Human Factors in Computing Systems. https://doi.org/10.1145/3411764.3445781

  • Watson, R. T., & Webster, J. (2020). Analysing the past to prepare for the future: Writing a literature review, a roadmap for release 2.0. Journal of Decision Systems, 29, 129–147. https://doi.org/10.1080/12460125.2020.1798591

  • Weber, F., Wambsganss, T., Rüttimann, D., & Söllner, M. (2021). Pedagogical agents for interactive learning: A taxonomy of conversational agents in education, Proceedings of the International Conference on Information Systems.

  • Wellhammer, N., Dolata, M., Steigler, S., & Schwabe, G. (2020). Studying with the help of digital tutors: Design aspects of conversational agents that influence the learning process. Proceedings of the Hawaii International Conference on System Sciences. https://doi.org/10.24251/HICSS.2020.019

  • Winkler, R., Hobert, S., Salovaara, A., Söllner, M., & Leimeister, J. M. (2020). Sara, the lecturer: Improving learning in online education with a scaffolding-based conversational agent. Proceedings of the CHI Conference on Human Factors in Computing Systems. https://doi.org/10.1145/3313831.3376781

  • Wollny, S., Schneider, J., Di Mitri, D., Weidlich, J., Rittberger, M., & Drachsler, H. (2021). Are we there yet?—A systematic literature review on chatbots in education. Frontiers in Artificial Intelligence, 4, 654924. https://doi.org/10.3389/frai.2021.654924

  • Wong, B. T. M., & Li, K. C. (2019). Using open educational resources for teaching in higher education: A review of case studies. Proceedings of the International Symposium on Educational Technology. https://doi.org/10.1109/ISET.2019.00046

  • Wymbs, C. (2016). Make better use of data across the student life cycle. Enrollment Management Report, 20, 8.

  • Yot-Domínguez, C., & Marcelo, C. (2017). University students’ self-regulated learning using digital technologies. International Journal of Educational Technology in Higher Education. https://doi.org/10.1186/s41239-017-0076-8

  • Zheng, Y., Zhao, K., & Stylianou, A. (2013). The impacts of information quality and system quality on users’ continuance intention in information-exchange virtual communities: An empirical investigation. Decision Support Systems, 56, 513–524. https://doi.org/10.1016/j.dss.2012.11.008

  • Zimmerman, B. J. (2011). Motivational sources and outcomes of self-regulated learning and performance. In B. J. Zimmerman & D. H. Schunk (Eds.), Handbook of self-regulation of learning and performance (pp. 49–64). Routledge.

Funding

This work was supported by the German Federal Ministry of Education and Research, Bonn (Grant number 16DHB2123).

Author information

Contributions

Conceptualization: C.M.K., C.K., M.H.B.; Data curation: C.M.K., C.K.; Formal analysis: C.M.K., C.K.; Methodology: C.M.K., C.K.; Project administration: C.M.K.; Writing – original draft: C.M.K., C.K.; Writing – review and editing: C.M.K., C.K., M.H.B.; Funding acquisition: M.H.B.; Supervision: M.H.B.

Corresponding author

Correspondence to Claudia M. König.

Ethics declarations

Ethics approval and consent to participate

Ethics approval was not required for this study.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1:

Appendix 1. Interviewee Profiles. Appendix 2. Quantitative Student Survey: Questions and Response Behavior. Appendix 3. Identified Requirements and their Realization.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

König, C.M., Karrenbauer, C. & Breitner, M.H. Development guidelines for individual digital study assistants in higher education. Int J Educ Technol High Educ 21, 9 (2024). https://doi.org/10.1186/s41239-024-00439-4

