  • Research article
  • Open Access

Profiling the digital readiness of higher education students for transformative online learning in the post-soviet nations of Georgia and Ukraine

International Journal of Educational Technology in Higher Education 2018, 15:37

https://doi.org/10.1186/s41239-018-0119-9

  • Received: 30 November 2017
  • Accepted: 5 July 2018
  • Published:

Abstract

This study profiles the digital readiness of university students in Georgia and Ukraine for fully online collaborative learning, theorized as an educational pathway to democratic transformation. The Digital Competency Profiler was used to gather data from 150 students in Georgia and 129 in Ukraine about their digital competences. The analysis grouped students into high-, medium- and low-readiness segments for 52 actions in technical, communicational, informational and computational dimensions. Findings show that large percentages of Georgian and Ukrainian students are ill-prepared for many online-learning activities, and there is generally greater readiness on mobile devices than desktops/laptops. However, large percentages of Ukrainian students appear in high-readiness segments for communicating online and using social networks. In Georgia, many students report high-readiness for technical and computational interactions. Therefore, the researchers recommend using the digital-readiness data in tandem with a well-chosen, online-learning framework to align these patterns of strengths with future educational innovation.

Keywords

  • ICT
  • Social constructivist
  • Online learning
  • Ukraine
  • Georgia
  • Higher education

Introduction

Purpose

The post-Soviet nations of Georgia and Ukraine seek to align higher education with democratic development and social progress. Having theorized the potential of fully online collaborative learning for democratization (Blayone, vanOostveen, Barber, DiGiuseppe, & Childs, 2017) and facilitated a pilot course for students in Ukraine (Mykhailenko, Blayone, & vanOostveen, 2016), the researchers launched a broader program of educational-transformation research with partners in several post-Soviet countries. Conducted from socio-cultural (Langemeyer, 2011; Somekh & Nissen, 2011) and human-computer-interaction (Jonassen & Rohrer-Murphy, 1999; Kuutti, 1995) perspectives, this program began with an initial probe of student and professor digital competencies in Ukraine (Blayone, Mykhailenko, VanOostveen, Grebeshkov, Hrebeshkova, et al., 2017). Next, a lab-based study comparing self-reported digital competences with recorded digital-learning activities produced an observationally grounded approach to readiness assessment (Blayone, vanOostveen, Mykhailenko, & Barber, 2017, 2018). Using this approach, the present study profiles the digital readiness of higher-education students in Georgia and Ukraine for fully online collaborative learning. The driving purposes are to contribute to ongoing educational transformation in the post-Soviet world and to offer online-learning researchers and practitioners an effective readiness-assessment toolkit.

Post-soviet educational transformation

Ukraine and Georgia share a 70-year Soviet experience that shaped their institutions, psychologies and social values (Raikhel & Bemme, 2016). Since achieving independence in 1991, both nations have pursued multi-level transformations accelerated by peoples’ revolutions (Börzel, 2015; Delcour & Wolczuk, 2015). The resulting experience has included economic distress, loss of security and social benefits (Haerpfer & Kizilova, 2014; Roztocki & Weistroffer, 2015), and socio-psychological “fallout,” such as loss of trust and dissatisfaction with life (Sapsford, Abbott, Haerpfer, & Wallace, 2015). Within this challenging context, Ukraine and Georgia have both taken significant strides towards transforming higher education, joining the Bologna process in 2005 to realign their Soviet-era institutions with the goals of the European Higher Education Area (Powell, Kuzmina, Yamchynska, Shestopalyuk, & Kuzmin, 2015). These efforts have produced positive results despite some bureaucratic resistance (Raver, 2007) and ongoing practices of corruption (Habibov, 2016).

Importantly, prospects for digital learning are well-supported by developing national ICT infrastructures (Ianishevska, 2017) with both Ukraine and Georgia achieving a top-60 ranking in the 2017 Social Progress Index’s information and communication category (Social Progress Imperative, 2017a, 2017b; Stern, Wares, & Epner, 2017). Moreover, government support for distance learning is increasing (Powell et al., 2015), MOOC providers are making inroads into formal education (Ed-Era, 2017; Prometheus, 2017), and online-learning pilot projects are appearing in the English-language literature (Gravel & Dubko, 2013; Mykhailenko, Blayone, & vanOostveen, 2016; Powell, Kuzmina, Kuzmin, Yamchynska, & Shestopalyuk, 2014). Despite these positive developments, however, limited financial resources and signs of low digital-readiness among students, teachers and administrators remain (Blayone et al., 2017; Synytsya & Manako, 2010).

Conceptual framework

Online learning in higher education

Online learning, like distance learning (Anderson & Dron, 2010), blended learning (Halverson, Graham, Spring, Drysdale, & Henrie, 2014; Palalas, Berezin, Gunawardena, & Kramer, 2015) and mobile learning (Alhassan, 2016; Crompton, Burke, Gregory, & Gräbe, 2016), is a form of digital learning (Siemens, Gašević, & Dawson, 2015)—a melding of learning activities, digital devices and global networks to achieve educational objectives. The practices of online learning are diverse, incorporating many technologies, pedagogies and guiding values (Aparicio, Bacao, & Oliveira, 2016). Some forms of online learning, such as MOOCs (Massive Open Online Courses), focus on making premium educational content globally accessible (De Corte, Engwall, & Teichler, 2016). Others seek to implement scalable learning-management systems that maximize individual flexibility while supporting optional forms of cooperation (Dalsgaard & Paulsen, 2009; Paulsen, 2003, 2008). Still others, such as those developed within the transactional tradition (Garrison & Archer, 2000), emphasize collaborative learning, targeting both the social and cognitive development of participants (Blayone et al., 2017; Garrison, 2017; Swan, 2010; vanOostveen, DiGiuseppe, Barber, Blayone, & Childs, 2016). By integrating the individual and the social dimensions of learning, and foregrounding active participation, open expression, democratic deliberation and collective inquiry, this orientation appears especially well-aligned with the goal of modelling participatory democratic functioning. However, to realize meaningful results from any implementation of digitally-mediated learning, the host environment, digital infrastructure and human participants must achieve a degree of readiness.

Readiness for online learning

Readiness for online learning is an international research domain conceptualizing and measuring various success factors and enabling conditions. There are numerous readiness models (Alaaraj & Ibrahim, 2014; Darab & Montazer, 2011), instruments (Dray, Lowenthal, Miszkiewicz, Ruiz-Primo, & Marczynski, 2011; Hung, 2016; Hung, Chou, & Chen, 2010; Lin, Lin, Yeh, Wang, & Jansen, 2015), and empirical studies, set in a variety of national contexts (Aldhafeeri & Khan, 2016; Chipembele, Chipembele, Bwalya, & Bwalya, 2016; Gay, 2016; Parkes, Stein, & Reading, 2015; van Rooij & Zirkle, 2016). Researchers generally adopt either a macro-level perspective, addressing the readiness of organizations, regions and countries (Beetham & Sharpe, 2007; Bui, Sankaran, & Sebastian, 2003), or a micro-level perspective, focused primarily on students (Dray et al., 2011; Parkes et al., 2015) or teachers (Gay, 2016; Hung, 2016). At the micro level, digital competencies, defined as knowledge, skills and attitudes supporting purposeful and effective use of technology (Ala-Mutka, 2011), figure as the most prominent set of readiness factors within frameworks (Al-Araibi, Mahrin, & Mohd, 2016; Demir & Yurdugül, 2015) and instruments (Dray et al., 2011; Hung et al., 2010; Lin et al., 2015; Parasuraman, 2000; Pillay, Irving, & Tones, 2007; Watkins, Leigh, & Triner, 2004). However, existing operationalizations tend to be unidimensional and inconsistent, showing little awareness of current, multidimensional digital-competency frameworks (Blayone et al., 2018). To address these shortcomings, researchers at the EILAB, University of Ontario Institute of Technology, Canada, are leveraging the General Technology Competency and Use (GTCU) framework (Desjardins, 2005; Desjardins, Lacasse, & Belair, 2001) and the accompanying Digital Competency Profiler (DCP) for measuring digital readiness for online learning (EILAB, 2017).

A digital readiness framework and profiler

As shown in Fig. 1, the GTCU is a multi-contextual (i.e., applicable to education, work, home, etc.) and multi-dimensional framework for conceptualizing digital-technology uses and related competences. In short, Desjardins identified four human-computer-object interaction types: computational, informational, communicational and technical. The first three were derived directly from the core capabilities of computer hardware (i.e., process, store and transmit) (IEEE, 1990). To account for operational skills and those instances when individuals focus on technology itself (e.g., when a device fails), a technical order of interaction was also introduced. Avoiding the complex competence descriptions found in some other frameworks (Ferrari, 2013; Vuorikari, Punie, Gomez, & Van Den Brande, 2016), the GTCU conceptualizes effective use by matching interaction types to corresponding sets of knowledge and skills typically developed through frequent and confident computer-mediated activity.
Fig. 1

Conceptual overview of GTCU framework, authored by Desjardins (2005)

For the purpose of assessing digital readiness for online learning, the GTCU framework offers five key features. First, by using the core capabilities of computer hardware to conceptualize digital uses and competencies, the GTCU insulates itself from the changing designs of hardware and software platforms, and environmental factors affecting technology use in particular contexts. Second, three of its four dimensions (technical, informational and social) represent a common core among major frameworks (Iordache, Mariën, & Baelden, 2017), and its computational dimension addresses competencies that are achieving prominence in the educational literature (Bocconi, Chioccariello, Dettori, Ferrari, Engelhardt, et al., 2016; Jun, Han, Kim, & Lee, 2014). Third, the GTCU’s online data-collection application—the DCP—has been used repeatedly to profile the technology uses of both students and professors in higher education (Barber, DiGiuseppe, vanOostveen, Blayone, & Koroluk, 2016; Desjardins & vanOostveen, 2015; Desjardins, vanOostveen, Bullock, DiGiuseppe, & Robertson, 2010). Fourth, by incorporating behavioural and attitudinal indicators, and associating items with specific types of devices, the DCP provides a tremendously rich set of data points unmatched by other readiness instruments. Finally, owing to growing international adoption, the DCP has been translated into several languages, and has been used previously in non-Western contexts (Blayone et al., 2017).

Research question

The following research question guided the methodology and analysis: Across four foundational orders of technology use, what is the state of digital-readiness of the Georgian and Ukrainian student cohorts for online learning?

Method

Having obtained approval from the Academic Research Councils of the participating universities and UOIT’s Research Ethics Board, participants were recruited on a volunteer basis from the student population by local officials at Batumi State Maritime Academy (BSMA), Georgia, and Ivan Franko National University (IFNU) of Lviv, Ukraine. Data were collected using the online DCP application during the period of May–July 2017.

Instrument: Digital competency profiler

As shown in Fig. 2, the DCP facilitates data collection, profile visualization and the extraction of raw data. For this study, the DCP data set consisted of: (a) socio-demographic and device-usage items, and (b) 26 indicator groups—five for technical, and seven for each of the communicational, informational and computational dimensions of use. Each group included six action-device items (a single action coupled with different device types), following a common structure: “To perform a software-level action, I use a specific hardware device type.” (A full list of actions is provided in the Appendix.) The DCP includes six device types: computer/laptops (as a single type), smartphones, tablets, gaming systems, computer appliances, and wearable devices.
Fig. 2

Digital Competency Profiler, action-device groups and visualizations

The DCP attaches two measures to each action-device item, using 5-point Likert scales. The frequency with which an individual performs a device-specific action is measured using: (1) never, (2) a few times a year, (3) a few times a month, (4) a few times a week, and (5) daily. Frequency of action is an important indicator of digital competency because transferable procedural knowledge is reinforced through repeated activity. The confidence with which an individual performs a device-specific action is measured using: (1) do not know how to use, (2) not confident—require assistance, (3) confident—can solve some problems, (4) fairly confident—can use with no assistance, and (5) very confident—can teach others. Device-action confidence addresses an individual’s motivation to explore novel situations and problems (Bandura, 1993) with a particular tool. These twin indicators of competency replaced direct claims (“I am able to do x”) in the instrument’s early development. It is expected that individuals are able to differentiate and reliably report the frequencies with which they perform certain actions and their relative levels of comfort performing an action with a particular type of device (Desjardins et al., 2010; DiGiuseppe, Partosoedarso, vanOostveen, & Desjardins, 2013). All told, the DCP action-device groups provide researchers with 312 points of data.
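The arithmetic behind the 312 data points can be sketched as follows (a minimal illustration; the constant names are ours, not part of the DCP schema):

```python
# 26 action groups, each paired with 6 device types, each rated on two
# 5-point Likert scales (frequency and confidence), as described above.
ACTION_GROUPS = 26
DEVICE_TYPES = ["computer/laptop", "smartphone", "tablet",
                "gaming system", "computer appliance", "wearable"]
MEASURES = ["frequency", "confidence"]  # both rated 1..5

data_points = ACTION_GROUPS * len(DEVICE_TYPES) * len(MEASURES)
print(data_points)  # prints 312, matching the total reported above
```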

Validity and reliability

The original DCP survey instrument underwent content validation through the participation of 10 Canadian teachers and parents (Desjardins et al., 2001). Subsequently, six experts joined Desjardins et al. (2001) in a process of construct validation, which included statistical investigation of correlation matrices. All retained items related well to their conceptualized dimension (Desjardins et al., 2001). The current DCP application houses an expanding database populated from ongoing data collection. The aggregate data set has been checked for reliability, with Cronbach’s alpha values ranging from .76 to .94 on the sub-scales. The alpha values for the subset collected for this study ranged from .78 to .88 for the computer/laptop composite (frequency and confidence) scores, and from .80 to .91 for the mobile composite scores.
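For readers wishing to reproduce this kind of reliability check on their own data, Cronbach's alpha can be computed from a respondents-by-items score matrix. This is a standard textbook implementation, not the EILAB's own code:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of scale scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items in the sub-scale
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```

Applied to each DCP sub-scale in turn, this yields the per-dimension values reported above; perfectly correlated items give an alpha of 1.0.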

Although the DCP consists of four sub-scales, with items mapped to a foundational order of human-computer interaction (described above), actual digitally-mediated activity most often possesses characteristics of more than one order (Desjardins, 2005). Consequently, ongoing validation of the DCP has not focused on statistical procedures such as factor analysis (F. Desjardins, personal communication, April 17, 2018). Rather, validation is being pursued by assessing the usefulness and predictive value of the DCP data in specific contexts of application. Recently, this author found strong-positive correlations between learners’ reported DCP competences and their performance levels conducting authentic online-learning activities (Blayone et al., 2017).

Localization

Several localizations of the DCP application were implemented over time. For this study, a Ukrainian localization was prepared, reviewed and tested by two trilingual (Russian, Ukrainian, English) researchers familiar with the field of digital-learning research, in consultation with a native Ukrainian- and a native English-speaking researcher (Blayone et al., 2017). Owing to time constraints, the English version was used in Georgia. As part of the recruitment process, Georgian participants were advised that participation would require reading skills in English. Although this requirement reduced the participant pool, it also generated enthusiasm by highlighting the international scope of the research.

Sample

Students were recruited from the Faculty of Business Management at BSMA and the Department of Management Economics at the IFNU. As shown in Table 1, 150 students (24% of the participating faculty’s student body) from BSMA and 129 students (38% of the participating department’s student body) from IFNU volunteered to complete an online profile. Overall, both undergraduates and graduates participated, primarily between the ages of 17 and 24. More graduates than undergraduates participated from Georgia, which is reflected in the age groupings. In Georgia, 79% were female, and in Ukraine, 69%. This aligns with a reported demographic trend in Ukrainian higher education in which students are over 60% female in the social sciences, business and law (Kogyt, 2016).
Table 1

Socio-demographic characteristics of participants

Variables            Values            N (BSMA, GA)   % (BSMA, GA)   N (IFNU, UA)   % (IFNU, UA)
Participants         Total by School   150            100            129            100
Gender               Female            118            79             89             69
                     Male              32             21             40             31
Age group            17–19             2              1              70             54
                     20–24             134            89             54             42
                     25–29             4              3              2              2
                     30+               10             7              3              2
Educational Status   Undergraduate     69             46             100            78
                     Graduate          81             54             29             22
Educational Domain   Business          110            73             91             71
                     Economics        3              2              14             11
                     Tourism           20             13             0              0
                     Science           3              2              3              2
                     Other             14             9              21             16

Analysis

This study adopted a three-step analytical procedure (Fig. 3) derived from recent observational research (Blayone et al., 2017). As a first step, the full DCP data set was reduced to the most relevant indicators for assessing online-learning readiness. The device-ownership data indicated that 57% of Georgian and 75% of Ukrainian participants owned a laptop/desktop. Similarly, 58% of Georgian and 78% of Ukrainian participants owned a smartphone. Only 29% of Georgians and 20% of Ukrainians owned a tablet. Because these are the most relevant devices for online learning, action indicators using other devices were ignored. Therefore, the 26 desktop/laptop items were selected to produce one set of competency scores, and a second set of 26 mobile scores was constructed primarily from smartphone items. In eight cases (four in Georgia and four in Ukraine) in which a participant’s tablet values exceeded their smartphone values, tablet data were substituted.
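The device-selection logic of this first step might be sketched as follows. The paper does not specify whether "exceeded" was judged item-by-item or overall, so this sketch assumes an overall comparison; the function name is illustrative:

```python
def select_mobile_scores(smartphone: list[float], tablet: list[float]) -> list[float]:
    """Return a participant's 26 mobile item scores: smartphone data by default,
    with tablet data substituted when the participant's tablet values are
    higher overall (an assumption; the study reports 8 such substitutions)."""
    if sum(tablet) > sum(smartphone):
        return list(tablet)
    return list(smartphone)
```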
Fig. 3

DCP data-analysis methodology

As a second step, the five-point frequency and confidence measures were summed to create item-competency scores for each of the device-action items, with 10 as the maximum value (indicating daily use and high confidence). The rationale for summing frequency and confidence measures is rooted in the operational logic of the GTCU framework: the frequency with which an individual performs an activity and the related level of confidence are mutually reinforcing indicators of digital competence (Blayone et al., 2017). This step resulted in 26 desktop/laptop and 26 mobile scores for each participant.

The third step built directly on the strength of the DCP to predict performance most reliably when self-reported competency scores are high or low (Blayone et al., 2017). Adapting this finding, participants were assigned to one of three segments for each action-device item. Participants with scores greater than 6 (of 10) for an action-device item were placed in a high-readiness segment, the members of which would be expected to perform aligned tasks effectively. Those with scores less than 4 were placed in a low-readiness segment, the members of which would be expected to struggle without formal support. The middle segment included those individuals for whom the DCP less reliably predicts performance (Blayone et al., 2017). Therefore, although their performance levels may prove adequate, inferences regarding the expected functioning of these individuals are not drawn. Importantly, these observationally informed thresholds are consistent with the logic of the twin 5-point measures (presented in the Appendix).
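Steps two and three can be summarized in a short sketch (the function names are ours; the summing rule and thresholds are those stated above):

```python
def item_score(frequency: int, confidence: int) -> int:
    """Sum the two 5-point Likert measures into an item-competency score (2..10)."""
    assert 1 <= frequency <= 5 and 1 <= confidence <= 5
    return frequency + confidence  # 10 = daily use, can teach others

def readiness_segment(score: int) -> str:
    """Place a score into one of the three readiness segments."""
    if score > 6:
        return "high"    # expected to perform aligned tasks effectively
    if score < 4:
        return "low"     # expected to require formal support
    return "middle"      # DCP predicts performance less reliably here
```

For example, a participant reporting daily use (5) and full confidence (5) scores 10 and lands in the high-readiness segment, while a score of 5 or 6 falls in the middle segment where no performance inference is drawn.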

Findings

Findings are organized by the four GTCU dimensions of use. For each dimension, the constituent action-device items are defended as relevant to online learning, and data for each item are presented in a single, tabular format for the BSMA, Georgia and IFNU, Ukraine cohorts. The size of each readiness segment is given as a percent of the host cohort. The high-readiness and low-readiness percentages are bolded in the tables because these are the values from which we draw inferences regarding expected performance. The middle-segment values receive little comment because the DCP less reliably predicts performance within this range (Blayone et al., 2017).

Based on our research and praxis, a suggested guideline for interpreting findings at the group level may be offered. Namely, one might expect good levels of group communication and collaborative-research performance in an online-learning environment when a majority of students in a cohort are positioned in the high-readiness segment, and a small minority (e.g., less than 20%) in the low-readiness segment for actions aligned with course activities. Where a high percentage of students are positioned in a low-readiness segment, the need for substantial support would be expected. For each dimension, the following analytical summaries highlight: (a) the relative strength of desktop/laptop versus mobile usage, (b) the relative sizes of the high- and low-readiness segments within a cohort, and (c) selected patterns of general difference between cohorts—providing a comparative lens for contextualizing the results. The overall aim was to present data in an accessible format and encourage participating institutions to draw further inferences in relation to their own learning goals, activity types and selected technologies.
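The group-level guideline above can be expressed as a simple decision rule (the function name is ours, and treating the suggested heuristic as strict cut-offs is our illustration):

```python
def group_readiness_ok(high_pct: float, low_pct: float) -> bool:
    """True when a cohort meets the suggested guideline for an action:
    a majority in the high-readiness segment and under 20% in the
    low-readiness segment."""
    return high_pct > 50 and low_pct < 20
```

For instance, a cohort with 69% of students in the high-readiness segment and 19% in the low-readiness segment for an action would meet the guideline, whereas 46% high and 16% low would not.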

Digital readiness for technical actions

Technical actions include a foundational academic activity (T1: creating/editing a document) and four other items related to successful functioning in online-learning environments. Operational abilities, included in this dimension, are prerequisite to effective functioning in other GTCU dimensions, and can often be acquired quickly when one has sufficient technology access and motivational resources. As shown in Table 2, for all the technical actions (with the exception of creating/editing documents among the IFNU cohort), there are generally more members in the high-readiness segment using mobile devices than using desktop/laptops in both cohorts. This finding highlights the relative strength of mobile-device usage.
Table 2

Digital readiness segments for technical actions

Technical Actions                     Segments   BSMA D/L*   BSMA M*   IFNU D/L*   IFNU M*
T1. Create/edit documents             High       20%         33%       46%         43%
                                      Middle     30%         27%       38%         27%
                                      Low        50%         40%       16%         29%
T2. Create/edit audio                 High       14%         27%       8%          22%
                                      Middle     23%         33%       34%         29%
                                      Low        63%         40%       58%         49%
T3. Create/edit multimedia            High       16%         31%       16%         26%
                                      Middle     31%         29%       44%         37%
                                      Low        53%         41%       40%         36%
T4. Manage accounts                   High       23%         31%       33%         47%
                                      Middle     29%         29%       43%         28%
                                      Low        48%         40%       24%         25%
T5. Manage or operate other devices   High       14%         20%       5%          6%
                                      Middle     23%         29%       14%         14%
                                      Low        63%         51%       81%         80%

*Note. The percentage of individuals from BSMA and IFNU student cohorts in each readiness segment based on item competency scores of technical actions using computer/laptop (D/L) and mobile (M) devices (BSMA, GA: N = 150; IFNU, UA: N = 129). Bolded items are the highest and lowest percentages for an action-device item from both cohorts

Within the BSMA cohort, 40–63% of students appear in the low-readiness segment across all action-device items. This includes 50% in the low-readiness segment for creating/editing documents (T1) with a desktop/laptop—an essential academic procedure—and 40% when using a mobile device. The high-readiness segment includes 27–33% of the cohort on items T1-T4 using a mobile device. Within the IFNU cohort, large high-readiness segments are found for creating/editing documents (T1: 46% with desktop/laptop, and 43% with mobile) and managing accounts (T4: 47% with mobile, and 33% on a desktop/laptop). Looking across cohorts, despite positioning a consistently high number of students in low-readiness segments, BSMA achieves greater numbers in the high-readiness segment than IFNU in three of five items (T2, T3 and T5). Ukraine, however, achieves the highest readiness numbers in the dimension for creating and editing documents (T1: 46% with a desktop/laptop).

Digital readiness for communicational actions

In online-learning contexts, communicational actions support sharing ideas, building trusting relationships, exploring perspectives, and collaborating towards common objectives. Many of the DCP communicational actions that once defined specific genres of software (e.g., S6, S7, S8, S11 and S12) now appear within multi-purpose applications. Hybrid collaboration platforms such as Slack, for example, support communication, file-sharing, and content publishing. Similarly, social-network environments (S10) continue to gain momentum as multi-purpose platforms in educational contexts (Correa, 2015; Dickie & Meier, 2015; Ellefsen, 2015; Halpern & Gibbs, 2013; Kosinski, Matz, Gosling, Popov, & Stillwell, 2015). Facebook is noteworthy owing not only to its diverse functionality but also to its status as the most popular social platform, with over two billion active monthly users (Statista, 2017). Taken together, the communicational actions defined by the DCP—and accompanying competencies related to socio-emotional and cultural intelligence, privacy and security, and identity representation—are critical for effective participation in increasingly global, online-learning environments. As shown in Table 3, there are again generally more members from both cohorts in the top readiness segment using mobile devices.
Table 3

Digital readiness segments for communicational actions

Communicational Actions               Segments   BSMA D/L*   BSMA M*   IFNU D/L*   IFNU M*
S6. Communicate using text messages   High       19%         31%       49%         64%
                                      Middle     20%         25%       11%         6%
                                      Low        61%         43%       40%         29%
S7. Communicate using audio           High       17%         29%       35%         71%
                                      Middle     26%         29%       37%         12%
                                      Low        57%         42%       28%         18%
S8. Communicate using video           High       17%         25%       23%         39%
                                      Middle     21%         28%       43%         26%
                                      Low        61%         47%       34%         35%
S9. Communicate using email           High       19%         26%       35%         36%
                                      Middle     23%         25%       36%         33%
                                      Low        57%         49%       29%         32%
S10. Use social networks              High       23%         32%       69%         78%
                                      Middle     25%         25%       12%         11%
                                      Low        52%         43%       19%         12%
S11. Use collaboration tools          High       15%         19%       19%         18%
                                      Middle     23%         29%       40%         27%
                                      Low        62%         53%       40%         55%
S12. Share works and ideas online     High       5%          13%       6%          15%
                                      Middle     30%         35%       24%         15%
                                      Low        65%         51%       70%         71%

*Note. The percentage of individuals from BSMA and IFNU student cohorts in each readiness segment based on item competency scores of communicational actions using computer/laptop (D/L) and mobile (M) devices (BSMA, GA: N = 150; IFNU, UA: N = 129). Bolded items are the highest and lowest percentages for an action-device item from both cohorts

Within the BSMA cohort, there are consistently large percentages of students (52–65%) in the low-readiness segment across all seven communicational actions with a desktop/laptop. (With mobile devices, the range improves to 42–53%.) Within the IFNU cohort, three mobile-action items (S6: text messaging; S7: audio messaging; and, S10: using social networks) have over 60% of students in the high-readiness segment. For social-network usage alone, 69% appear in the high-readiness segment for desktop/laptops, and 78% for mobile. These findings suggest a strong foundation for ongoing digital-competency development (Correa, 2015), and they highlight the communicational strengths of Ukrainian students noted in a previous study (Blayone et al., 2017). However, the IFNU cohort has at least 70% in the low-readiness segments for sharing one’s works and ideas online (S12)—an important item focused on self-expression. This finding also aligns with results from the previous study (Blayone et al., 2017). Within the BSMA cohort, the consistently large percentages (42–65%) in low-readiness segments across the entire range of communicational items present a development opportunity, especially given that 25–32% of students show high readiness for five items (S6 to S10).

In both cohorts, using collaboration tools (S11) produced large low-readiness segments (BSMA: 62% using desktop/laptop, and 53% mobile; IFNU: 40% using desktop/laptop, and 55% mobile). This finding is coupled with even higher low-readiness segments for sharing one’s works or ideas online (S12) (BSMA: 65% using desktop/laptop, and 51% mobile; IFNU: 70% using desktop/laptop, and 71% mobile), suggesting that frequent use of social networks, which offer affordances for collaboration and content publishing, has not yet been associated with these “serious” activities, or leveraged for such purposes. In the end, compared to IFNU, and given the general popularity of social-networking, BSMA’s low readiness for using social networks stands out in this dimension.

Digital readiness for informational actions

Informational items target interactions between a subject and knowledge artifacts. Searching and accessing journal articles (I14), electronic books (I18) and short videos (I15) are essential research skills. The ability to find quality films (I16) and music (I17)—particularly those available for educational repurposing—is critical when building multimedia objects. Using digital maps (I13) becomes a survival skill when navigating in unfamiliar places, a situation, for example, in which international students frequently find themselves. Finally, content-aggregation tools can dramatically increase the efficiency and effectiveness of online research, especially when coupled with a reference-management application (I19). Therefore, as a group, these seven items address vital informational practices in higher education. As shown in Table 4, once again, there are generally greater numbers of high-readiness users for mobile actions within each cohort. Only for I16 (searching or downloading movies) do we see greater numbers of students in the top segment using desktop/laptops.
Table 4

Digital readiness segments for informational actions

Informational Actions                 Segments   BSMA D/L*   BSMA M*   IFNU D/L*   IFNU M*
I13. Access maps or GPS               High       4%          13%       10%         28%
                                      Middle     22%         32%       26%         36%
                                      Low        74%         55%       64%         36%
I14. Search for journal articles      High       12%         18%       26%         33%
                                      Middle     28%         29%       36%         28%
                                      Low        60%         53%       37%         40%
I15. Search for short videos          High       16%         29%       60%         61%
                                      Middle     25%         29%       19%         17%
                                      Low        59%         42%       22%         22%
I16. Search or download movies        High       14%         9%        36%         18%
                                      Middle     24%         33%       36%         29%
                                      Low        62%         57%       27%         53%
I17. Search or download music         High       15%         20%       36%         44%
                                      Middle     27%         33%       29%         26%
                                      Low        58%         47%       35%         29%
I18. Read or download digital books   High       7%          11%       22%         24%
                                      Middle     27%         31%       30%         35%
                                      Low        66%         57%       48%         41%
I19. Automate information sources     High       7%          10%       5%          6%
                                      Middle     19%         23%       15%         6%
                                      Low        73%         67%       81%         88%

*Note. The percentage of individuals from BSMA and LFNU student cohorts in each readiness segment based on item competency scores of informational actions using computer/laptop (D/L) and mobile (M) devices (BSMA, GA: N = 150; IFNU, UA: N = 129). Bolded items are the highest and lowest percentages for an action-device item from both cohorts

Within the IFNU cohort, there are large numbers of students in the high-readiness segments for searching short videos (I15: 60% with desktop/laptop, and 61% for mobile). The IFNU cohort also shows substantial high-readiness segments for searching journal articles (I14: 33% using mobile), searching movies (I16: 36% using a desktop/laptop) and downloading music (I17: 36% using a desktop/laptop and 44% on mobile). However, there are also large numbers in the low-readiness segment for automating information sources (I19: 81% using a desktop/laptop and 88% on mobile). Within the BSMA cohort, a key finding relates to very large low-readiness segments across all informational items, ranging from 42 to 67% for mobile use, and from 58 to 74% for desktop/laptop use.

Looking across cohorts, the large percentages of students from both cohorts in the low-readiness segment for searching online journal articles (BSMA: 60%, desktop/laptop and 53%, mobile; IFNU: 37%, desktop/laptop and 40%, mobile) and electronic books (BSMA: 66%, desktop/laptop and 57%, mobile; IFNU: 48%, desktop/laptop and 41%, mobile) are noteworthy. Effectively accessing articles and books is a starting point for university-level research. Overall, where the IFNU cohort shows some moderate high-readiness segments in this dimension, the BSMA cohort has significant majorities of students in the low-readiness segment for all desktop/laptop items and most mobile items.
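The segmentation logic behind these cohort percentages can be illustrated in a few lines of code. The sketch below assumes a simple 0–5 competency-score scale and arbitrary cutoffs; the DCP's actual scoring rules and thresholds differ, so this is purely an illustration of how the per-segment percentages reported in the tables are derived from individual item scores.

```python
from collections import Counter

def segment(score, low_cut=2.0, high_cut=4.0):
    """Assign a readiness segment from an item competency score.

    The 0-5 scale and the cutoffs here are hypothetical illustrations,
    not the DCP's published scoring rules.
    """
    if score < low_cut:
        return "Low"
    elif score < high_cut:
        return "Middle"
    return "High"

def segment_percentages(scores):
    """Percentage of a cohort in each readiness segment for one action-device item."""
    counts = Counter(segment(s) for s in scores)
    n = len(scores)
    return {seg: round(100 * counts[seg] / n) for seg in ("High", "Middle", "Low")}

# Example: simulated item scores for one cohort on one action-device pair
scores = [0.5, 1.0, 1.5, 2.5, 3.0, 3.5, 4.5, 5.0]
print(segment_percentages(scores))  # {'High': 25, 'Middle': 38, 'Low': 38}
```

Applied to each action-device item, this yields one High/Middle/Low distribution per column of Tables 3–5.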

Digital readiness for computational actions

Computational actions leverage the processing power of digital hardware and software to organize, transform and visualize numerical and non-numerical data to address complex problems. Functioning effectively in this dimension requires substantial domain knowledge and the ability to assign "cognitive processes" to the computer, either through a software application or a programming interface. This includes interacting with online calendar systems (E20); data-visualization tools, such as concept-mapping, diagramming and graphing applications (E21, E22 and E24); numerical and statistical-analysis packages (E23 and E25); and scripting/programming environments (E26). Indeed, it is difficult to imagine conducting research today without significant experience with some of these competencies, particularly in an age of "big data" (Bocconi et al., 2016).

As shown in Table 5, and consistent with other studies set in Eastern Europe (Blayone et al., 2017) and Canada (Barber et al., 2016), activities in this dimension, which are usually performed on desktops and laptops, continue to challenge students. For all seven action items in this dimension, a very large percentage of students are positioned in the low-readiness segments in both cohorts (BSMA: 72–79% using desktop/laptops and 55–71% with mobile devices; IFNU: 61–87% using desktop/laptops and 59–96% with mobile devices). Looking across cohorts, BSMA places a somewhat greater percentage of its students in the high-readiness segments than IFNU for most items on both device types.
Table 5

Digital readiness segments for computational actions

| Computational Actions | Segment | BSMA D/L* | BSMA M* | IFNU D/L* | IFNU M* |
| --- | --- | --- | --- | --- | --- |
| E20. Use/share calendar or organizer | High | 8% | 19% | 8% | 21% |
| | Middle | 14% | 26% | 15% | 20% |
| | Low | 78% | 55% | 78% | 59% |
| E21. Create concept maps or flow charts | High | 8% | 11% | 5% | 5% |
| | Middle | 13% | 21% | 16% | 9% |
| | Low | 79% | 68% | 80% | 87% |
| E22. Create/modify figures and diagrams | High | 6% | 10% | 5% | 2% |
| | Middle | 22% | 19% | 32% | 6% |
| | Low | 72% | 71% | 63% | 91% |
| E23. Sort large amounts of data | High | 8% | 10% | 10% | 6% |
| | Middle | 19% | 21% | 29% | 6% |
| | Low | 73% | 69% | 61% | 88% |
| E24. Generate graphs from numbers | High | 9% | 8% | 4% | 3% |
| | Middle | 16% | 21% | 33% | 2% |
| | Low | 75% | 71% | 64% | 95% |
| E25. Do complex calculations | High | 11% | 11% | 6% | 9% |
| | Middle | 15% | 28% | 17% | 16% |
| | Low | 74% | 61% | 77% | 74% |
| E26. Program or automate procedures | High | 7% | 9% | 3% | 2% |
| | Middle | 15% | 21% | 10% | 2% |
| | Low | 78% | 69% | 87% | 96% |

*Note. The percentage of individuals from the BSMA and IFNU student cohorts in each readiness segment, based on item competency scores for computational actions using desktop/laptop (D/L) and mobile (M) devices (BSMA, GA: N = 150; IFNU, UA: N = 129). Bolded items are the highest and lowest percentages for an action-device item across both cohorts

Discussion

Collaborative forms of fully online learning appear well-aligned with aspirations for educational transformation and democratic development in Ukraine and Georgia. Assessing and building the technology-readiness of learners in these contexts, however, is challenging. Profiling digital competencies with the DCP, and positioning students within high-, medium- and low-readiness segments for a variety of digital interactions, can help guide faculty and administrators during the preparation and implementation phases of online programs.

Large numbers of students in low-readiness segments, like those found in this study, suggest immediate opportunities for skill-development interventions. For example, faculty might introduce greater use of digital devices and activities (e.g., web quests, blogging, social-media posting, etc.) into the current curriculum, and pursue digital “maker” activities (Blikstein, Kabayadondo, Martin, & Fields, 2017; Pangrazio, 2014). Those in the middle segment can be helped to diagnose their readiness level further by attempting a few (instructor-designed) digital-learning scenarios made available online prior to course launch (Blayone et al., 2017). Once a collaborative online course starts, students with high readiness can serve a critical community function: to model best practices and support those who are less comfortable leveraging the technology affordances.

When implementing a fully online or blended course/program, DCP findings should be used in tandem with a digital-learning model well aligned with the context and desired outcomes. Two recommended options are the Community of Inquiry (CoI) theoretical framework (Garrison, 2017; Richardson, Arbaugh, Cleveland-Innes, Ice, Swan, et al., 2012) and the Fully Online Learning Community (FOLC) model (Blayone et al., 2017; vanOostveen, 2016; vanOostveen, DiGiuseppe, Barber, & Blayone, 2016). These collaborative models emphasize: (a) active participation, freedom of expression, and critical deliberation (Garrison, 2016); (b) the empowering, connecting and cognitive-partnering qualities of digital-learning tools (Blayone et al., 2017; vanOostveen et al., 2016); (c) "deep learning" instead of rote learning, fostering reflective thinking and cognitive agility (Akyol & Garrison, 2011; Garrison, Anderson, & Archer, 2001); and (d) culture and experience as contextual foundations for building meaningful knowledge (Dewey, 1897).

With a specific model selected, digital-readiness findings can be mapped to target learning processes. For example, the CoI has theorized and validated three key dimensions of online learning—social presence, cognitive presence and teaching presence—which have been operationalized through well-defined elements, categories and indicators (Garrison, 2017; Swan, Garrison, & Richardson, 2009). By using this CoI apparatus in tandem with DCP readiness data, one can anticipate the degree to which learning activities are aligned with the technology strengths of a cohort. For example, the strength of Ukrainian students in using social networks points toward Facebook as a potential environment for building both social presence (SP) and cognitive presence (CP). (Within the CoI, SP relates to building interpersonal trust and open expression, and CP relates to dynamics of collaborative thinking and knowledge building.) It should be noted, however, that the technology-readiness of students remains a necessary but insufficient condition for building successful online-learning experiences. High-quality activity design, strong environmental supports for nurturing student motivation (Deci & Ryan, 2000; Nakamura & Csikszentmihalyi, 2002), and competent online facilitators are also vital.

Limitations

There are four limitations to note. First, the sample was recruited from the departments with which the contributing authors from BSMA and IFNU were affiliated, which resulted in heavy concentrations of Business majors. Moreover, data were collected via an online application, in Ukrainian at IFNU and in English at BSMA, which limited access to those without the requisite language skills and Internet connectivity. More generally, given the constraints of the international research partnerships involved, representative samples of the full student bodies at each university were not sought; therefore, the results obtained are not readily generalizable.

Second, the examples attached to some DCP action indicators (e.g., S10 refers to Facebook, Google+, LinkedIn and Twitter as examples of social-networking systems) are biased towards Western contexts. In much of Eastern Europe, Russian networks such as ВКонтакте (VKontakte) and Одноклассники (Odnoklassniki) are popular. Importantly, in 2017, Ukraine blocked Russian social networks (Luhn, 2017), which encouraged use of Western platforms. This may partly account for the high-readiness counts among Ukrainians for using social networks, and for the differences between Ukrainian and Georgian usage. That is, the examples given may have been less familiar to Georgian students and may have influenced their responses.

Third, drawing inferences from self-reported digital competencies in relation to expected patterns of performance is always difficult. The literature reports misalignments between perceived abilities and observed performance using other instruments (Bradlow, Hoch, & Hutchinson, 2002; Hargittai & Shafer, 2006; Litt, 2013). Some also report instrumentation issues related to conceptual ambiguity, incompleteness and over-simplification (van Deursen, Helsper, & Eynon, 2016). We acknowledge these challenges, and in this study, we recognized the inability of the DCP to predict performance levels reliably when moderate digital-competency scores are reported.

Finally, although human capacities to use digital tools effectively are widely considered the most significant set of micro-level readiness factors for successful online learning (Blayone et al., 2018), other micro- and macro-level factors are also important. For example, in post-Soviet contexts, levels of corruption among institutional leaders may limit physical and motivational resources for digital-learning innovation (Habibov, 2016). Moreover, national and regional cultural-values invariably shape student and instructor willingness to function in virtual spaces (Gunawardena, 2014; Mittelmeier, Heliot, Rienties, & Whitelock, 2015; Parrish & Linder-VanBerschot, 2010) and engage in less-structured forms of active learning (Blayone et al., 2017).

Conclusion

Within the frames of a multifaceted, international research program addressing post-Soviet educational transformation (Blayone et al., 2017; Mykhailenko et al., 2016), this study assessed the digital readiness of students for fully online collaborative learning in Ukraine and Georgia. Although large percentages of students in both cohorts appeared ill-prepared for many types of online-learning activity, there were hopeful findings. Among students from the IFNU, Ukraine cohort, large numbers reported high-readiness for communicating via social networks and finding information via social-media sites. Within the BSMA, Georgia cohort, greater percentages of students were found in high-readiness segments for most technical and computational actions than at IFNU, Ukraine. A target-learning-model approach to rendering the data actionable was proposed. In addition, the researchers suggested taking immediate action to encourage greater use of digital technologies in current classroom praxis to develop digital-learning competencies.

We believe this study makes several positive contributions. First, it extends online-learning readiness and digital-competence research to the post-Soviet sphere and introduces a readiness methodology tied to performance analysis. Second, although identifying deep pockets of low digital readiness, it presents several positive findings on which the participating Georgian and Ukrainian institutions might build. Finally, it demonstrates the use of a multi-contextual DCP research apparatus that can be made available to other researchers and practitioners.

Declarations

Acknowledgments

The authors acknowledge the participation of students and professors from Batumi State Maritime Academy, Georgia and Ivan Franko National University, Ukraine. They also acknowledge the infrastructural support of the EILAB, UOIT, Canada.

Funding

This research study was not funded.

Availability of data and materials

The data set on which this study is based is stored as part of the Digital Competency Profiler’s aggregate database on a secure server at the EILAB, University of Ontario Institute of Technology (UOIT), Canada following the approved guidelines of the Research Ethics Board of UOIT. To inquire about access to raw data, please contact the corresponding author.

Authors’ contributions

TJBB was the primary researcher and sole author. He designed the study, conducted the analysis, authored the manuscript and completed revisions. OM was the International Research Director, translator and an editor for this study. She developed the research partnerships, consulted with the author on cross-cultural communication, supervised data-collection processes, translated instruments and materials, and facilitated communication between participants. She also provided detailed, conceptual and style-related feedback on the draft manuscript. MK1 and MK2 participated in on-boarding and data-collection processes. Each directed and supervised data collection at their respective universities. Each also facilitated ethics approvals with their scientific-research committees and participated in instrument-translation processes. RVO was responsible for submissions to the UOIT’s Research Ethics Board, and for configuring and providing access to the EILAB’s Digital Competency Profiler application. He also facilitated the application-localization process and provided feedback. WB contributed to the project design and provided feedback on the analysis methodology and the draft manuscript.

Authors’ information

Todd Blayone is an international researcher, sessional instructor and technologist addressing digital readiness and post-industrial learning in post-Soviet contexts for educational transformation and social progress. Living in Kyiv, Ukraine and Toronto, Canada, he is co-founder of Collaboritsi.com, a cultural-intelligence and collaborative-learning consultancy, and an Associate Researcher at the EILAB, UOIT, Canada.

Dr. Olena Mykhailenko is a trilingual educator, economist, former government advisor and consultant living in Kyiv, Ukraine and Toronto, Canada. As co-founder of Collaboritsi.com and an Associate Researcher of the EILAB, UOIT, Canada, she facilitates workshops for educators and business professionals on cultural intelligence and collaborative thinking. Her publications span development economics, cross-cultural analysis and post-industrial, educational transformation.

Dr. Medea Kavtaradze is Associate Professor at the Faculty of Business and Management, Batumi State Maritime Academy, Georgia. Holding a PhD in Engineering Sciences, she heads the undergraduate program in Port Management and the graduate program in International Business Management.

Dr. Marianna Kokhan is Associate Professor at the Faculty of Economics, Ivan Franko National University of Lviv, Ukraine. She has over 10 years' experience as a teacher in higher education. Her research interests include digital management and communications, business coaching and management psychology. She is co-founder of the student employment platform, studlava.com.

Dr. Roland vanOostveen is director of the Educational Informatics Laboratory (EILAB) and the Educational Studies and Digital Technology programs at the Faculty of Education, UOIT. He holds a PhD in Curriculum, Teaching and Learning from the Ontario Institute for Studies in Education at the University of Toronto. His research explores digital competency and technology use in learning, and the development of fully online learning environments.

Dr. Wendy Barber is the Past Director of the Bachelor of Education program at the University of Ontario Institute of Technology in Oshawa, Canada. Her research interests lie in health and physical education, and creating online communities. She is a passionate advocate for teacher education, and provides instruction in authentic assessment, adult education, and psychological perspectives on digital technology.

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Open AccessThis article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Authors’ Affiliations

(1)
EILAB, Oshawa, Canada
(2)
Batumi State Maritime Academy, Batumi, Georgia
(3)
Ivan Franko National University of Lviv, Lviv, Ukraine
(4)
University of Ontario Institute of Technology, Oshawa, Canada

References

  1. Akyol, Z., & Garrison, D. R. (2011). Understanding cognitive presence in an online and blended community of inquiry: Assessing outcomes and processes for deep approaches to learning. British Journal of Educational Technology, 42(2), 233–250. https://doi.org/10.1111/j.1467-8535.2009.01029.x.View ArticleGoogle Scholar
  2. Alaaraj, H., & Ibrahim, F. W. (2014). An overview and classification of e-readiness assessment models. International Journal of Scientific and Research Publications, 4(12), 1–5.Google Scholar
  3. Ala-Mutka, K. (2011). Mapping digital competence: Towards a conceptual understanding. Seville: Institute for Prospective Technological Studies (IPTS), European Commission, Joint Research Centre Retrieved from http://ftp.jrc.es/EURdoc/JRC67075_TN.pdf.Google Scholar
  4. Al-Araibi, A. A. M., Mahrin, M., & Mohd, R. C. (2016). A systematic literature review of technological factors for e-learning readiness in higher education. Journal of Theoretical and Applied Information Technology, 93(2), 500–521.Google Scholar
  5. Aldhafeeri, F. M., & Khan, B. H. (2016). Teachers’ and students’ views on e-learning readiness in kuwait’s secondary public schools. Journal of Educational Technology Systems, 45(2), 202–235. https://doi.org/10.1177/0047239516646747.View ArticleGoogle Scholar
  6. Alhassan, R. (2016). Mobile learning as a method of ubiquitous learning: Students’ attitudes, readiness, and possible barriers to implementation in higher education. Journal of Education and Learning, 5(1), 176. https://doi.org/10.5539/jel.v5n1p176.View ArticleGoogle Scholar
  7. Anderson, T., & Dron, J. (2010). Three generations of distance education pedagogy. The International Review of Research in Open and Distributed Learning, 12(3), 80–97.View ArticleGoogle Scholar
  8. Aparicio, M., Bacao, F., & Oliveira, T. (2016). An e-learning theoretical framework. Journal of Educational Technology & Society, 19(1), 292–307.Google Scholar
  9. Bandura, A. (1993). Perceived self-efficacy in cognitive development and functioning. Educational Psychologist, 28(2), 117–148. https://doi.org/10.1207/s15326985ep2802_3.View ArticleGoogle Scholar
  10. Barber, W., DiGiuseppe, M., vanOostveen, R., Blayone, T., & Koroluk, J. (2016). Examining student and educator use of digital technology in an online world. Paper presented at the second International Symposium on Higher Education in Transformation, UOIT, Oshawa.Google Scholar
  11. Beetham, H., & Sharpe, R. (2007). Rethinking pedagogy for a digital age: Designing for 21st century learning. New York: Routledge.Google Scholar
  12. Blayone, T., Mykhailenko, O., VanOostveen, R., Grebeshkov, O., Hrebeshkova, O., & Vostryakov, O. (2017). Surveying digital competencies of university students and professors in Ukraine for fully online collaborative learning. Technology, Pedagogy and Education, 1–18. https://doi.org/10.1080/1475939X.2017.1391871.
  13. Blayone, T., vanOostveen, R., Barber, W., DiGiuseppe, M., & Childs, E. (2017). Democratizing digital learning: Theorizing the fully online learning community model. International Journal of Educational Technology in Higher Education, 14(13), 1–16. https://doi.org/10.1186/s41239-017-0051-4.Google Scholar
  14. Blayone, T., vanOostveen, R., Mykhailenko, O., & Barber, W. (2017). Ready for digital learning? A mixed-methods exploration of surveyed technology competencies and authentic performance activity. Education and Information Technologies, 23(3), 1377–1402. https://doi.org/10.1007/s10639-017-9662-6.View ArticleGoogle Scholar
  15. Blayone, T., vanOostveen, R., Mykhailenko, O., & Barber, W. (2018). Reexamining digital-learning readiness in higher education: Positioning digital competencies as key factors and a profile application as a readiness tool. Accepted for publication in International Journal on e-Learning.Google Scholar
  16. Blikstein, P., Kabayadondo, Z., Martin, A., & Fields, D. (2017). An assessment instrument of technological literacies in makerspaces and FabLabs. Journal of Engineering Education, 106(1), 149–175. https://doi.org/10.1002/jee.20156.View ArticleGoogle Scholar
  17. Bocconi, S., Chioccariello, A., Dettori, G., Ferrari, A., Engelhardt, K., Kampylis, P., & Punie, Y. (2016). Exploring the field of computational thinking as a 21st century skill. Paper presented at the EDULearn 2016 8th Annual International Conference on Education and New Learning Technologies, Barcelona.Google Scholar
  18. Börzel, T. A. (2015). The noble west and the dirty rest? Western democracy promoters and illiberal regional powers. Democratization, 22(3), 519–535. https://doi.org/10.1080/13510347.2014.1000312.View ArticleGoogle Scholar
  19. Bradlow, E. T., Hoch, S. J., & Hutchinson, J. W. (2002). An assessment of basic computer proficiency among active internet users: Test construction, calibration, antecedents and consequences. Journal of Educational and Behavioral Statistics, 27(3), 237–253. https://doi.org/10.3102/10769986027003237.View ArticleGoogle Scholar
  20. Bui, T. X., Sankaran, S., & Sebastian, I. M. (2003). A framework for measuring national e-readiness. International Journal of Electronic Business, 1(1), 3–22. https://doi.org/10.1504/ijeb.2003.002162.View ArticleGoogle Scholar
  21. Chipembele, M., Chipembele, M., Bwalya, K. J., & Bwalya, K. J. (2016). Assessing e-readiness of the Copperbelt University, Zambia: Case study. The International Journal of Information and Learning Technology, 33(5), 315–332. https://doi.org/10.1108/IJILT-01-2016-0005.View ArticleGoogle Scholar
  22. Correa, T. (2015). Digital skills and social media use: How internet skills are related to different types of Facebook use among ‘digital natives’. Information, Communication & Society, 19(8), 1095–1107. https://doi.org/10.1080/1369118x.2015.1084023.View ArticleGoogle Scholar
  23. Crompton, H., Burke, D., Gregory, K. H., & Gräbe, C. (2016). The use of mobile learning in science: A systematic review. Journal of Science Education and Technology, 25, 149–160. https://doi.org/10.1007/s10956-015-9597-x.View ArticleGoogle Scholar
  24. Dalsgaard, C., & Paulsen, M. F. (2009). Transparency in cooperative online education. The International Review of Research in Open and Distributed Learning, 10(3), 1–13. https://doi.org/10.19173/irrodl.v10i3.671.View ArticleGoogle Scholar
  25. Darab, B., & Montazer, G. A. (2011). An eclectic model for assessing e-learning readiness in the Iranian universities. Computers & Education, 56(3), 900–910. https://doi.org/10.1016/j.compedu.2010.11.002.View ArticleGoogle Scholar
  26. De Corte, E., Engwall, L., & Teichler, U. (2016). The hype of MOOCs. In E. De Corte, L. Engwall, & U. Teichler (Eds.), From books to MOOCS? Emerging models of learning and teaching in higher education, (vol. 88, pp. xv–xxv). London: Portland Press.Google Scholar
  27. Deci, E. L., & Ryan, R. M. (2000). The “what” and “why” of goal pursuits: Human needs and the self-determination of behavior. Psychological Inquiry, 11(4), 227–268. https://doi.org/10.1207/s15327965pli1104_01.View ArticleGoogle Scholar
  28. Delcour, L., & Wolczuk, K. (2015). Spoiler or facilitator of democratization?: Russia's role in Georgia and Ukraine. Democratization, 22(3), 459–478. https://doi.org/10.1080/13510347.2014.996135.View ArticleGoogle Scholar
  29. Demir, Ö., & Yurdugül, H. (2015). The exploration of models regarding e-learning readiness: Reference model suggestions. International Journal of Progressive Education, 11(1), 173–194.Google Scholar
  30. Desjardins, F. J. (2005). Teachers’ representations of their computer related competencies profile: Toward a theory of ICT. Canadian Journal of Learning and Technology/La revue canadienne de l’apprentissage et de la technologie, 31(1), 1–14. https://doi.org/10.21432/t2f603.MathSciNetGoogle Scholar
  31. Desjardins, F. J., Lacasse, R., & Belair, L. M. (2001). Toward a definition of four orders of competency for the use of information and communication technology (ICT) in education. In Proceedings of the IASTED International Conference. Computers and Advanced Technology in Education, (pp. 213–217). Banff: ACTA Press.Google Scholar
  32. Desjardins, F. J., & vanOostveen, R. (2015). Faculty and student use of digital technology in a “laptop” university. In S. Carliner, C. Fulford, & N. Ostashewski (Eds.), EdMedia: World conference on educational media and technology 2015, (pp. 990–996). Montreal: Association for the Advancement of Computing in Education (AACE).Google Scholar
  33. Desjardins, F. J., vanOostveen, R., Bullock, S., DiGiuseppe, M., & Robertson, L. (2010). Exploring graduate student’s use of computer-based technologies for online learning. In J. Herrington, & C. Montgomerie (Eds.), EdMedia: World Conference on Educational Media and Technology 2010, (pp. 440–444). Association for the Advancement of Computing in Education (AACE).Google Scholar
  34. Dewey, J. (1897). My pedagogical creed. School Journal, 54(3), 77–80.Google Scholar
  35. Dickie, V. A., & Meier, H. (2015). The Facebook tutor: Networking education. Ubiquitous Learning: An International Journal, 8(2), 15–20.Google Scholar
  36. DiGiuseppe, M., Partosoedarso, E., vanOostveen, R., & Desjardins, F. J. (2013). Exploring competency development with mobile devices. In M. B. Nunes, & M. McPherson (Eds.), International Association for Development of the Information Society (IADIS) International Conference on e-Learning, (pp. 384–388). Prague: International Association for Development of the Information Society.Google Scholar
  37. Dray, B. J., Lowenthal, P. R., Miszkiewicz, M. J., Ruiz-Primo, M. A., & Marczynski, K. (2011). Developing an instrument to assess student readiness for online learning: A validation study. Distance Education, 32(1), 29–47. https://doi.org/10.1080/01587919.2011.565496.View ArticleGoogle Scholar
  38. Ed-Era. (2017). Retrieved from https://www.ed-era.com/
  39. EILAB. (2017). Digital Competency Profiler. Retrieved from https://dcp.eilab.ca/
  40. Ellefsen, L. (2015). An investigation into perceptions of Facebook-use in higher education. International Journal of Higher Education, 5(1), 160–172. https://doi.org/10.5430/ijhe.v5n1p160.View ArticleGoogle Scholar
  41. Ferrari, A. (2013). DIGCOMP: A framework for developing and understanding digital competence in Europe. Seville: Institute for Prospective Technological Studies (IPTS), European Commission, Joint Research Centre Retrieved from http://publications.jrc.ec.europa.eu/repository/bitstream/JRC83167/lb-na-26035-enn.pdf.Google Scholar
  42. Garrison, D. R. (2016). Thinking collaboratively: Learning in a community of inquiry. New York: Routledge.Google Scholar
  43. Garrison, D. R. (2017). E-learning in the 21st century: A community of inquiry framework for research and practice, (3rd ed., ). New York: Routledge.Google Scholar
  44. Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive presence, and computer conferencing in distance education. American Journal of Distance Education, 15(1), 7–23. https://doi.org/10.1080/08923640109527071.View ArticleGoogle Scholar
  45. Garrison, D. R., & Archer, W. (2000). A transactional perspective on teaching and learning: A framework for adult and higher education. New York: Pergamon.Google Scholar
  46. Gay, G. (2016). An assessment of online instructor e-learning readiness before, during, and after course delivery. Journal of Computing in Higher Education, 28(2), 199–220. https://doi.org/10.1007/s12528-016-9115-z.View ArticleGoogle Scholar
  47. Gravel, C. A., & Dubko, L. (2013). Delivering an online MBA program for future business leaders in Ukraine: A success story. Distance Learning, 10(2), 25–28.Google Scholar
  48. Gunawardena, C. N. (2014). Globalization, culture, and online distance learning. In O. Zawacki-Richter, & T. Anderson (Eds.), Online distance education: Towards a research agenda, (pp. 75–107). Edmonton: AU Press, Athabasca University.Google Scholar
  49. Habibov, N. (2016). Effect of corruption on healthcare satisfaction in post-soviet nations: A cross-country instrumental variable analysis of twelve countries. Social Science & Medicine, 152, 119–124. https://doi.org/10.1016/j.socscimed.2016.01.044.View ArticleGoogle Scholar
  50. Haerpfer, C. W., & Kizilova, K. (2014). Support for democracy in postcommunist Europe and post-soviet Eurasia. In R. J. Dalton, & C. Welzel (Eds.), The civic culture transformed: From allegiant to assertive citizens, (pp. 158–189). New York: Cambridge University Press.View ArticleGoogle Scholar
  51. Halpern, D., & Gibbs, J. (2013). Social media as a catalyst for online deliberation? Exploring the affordances of Facebook and YouTube for political expression. Computers in Human Behavior, 29(3), 1159–1168. https://doi.org/10.1016/j.chb.2012.10.008.View ArticleGoogle Scholar
  52. Halverson, L. R., Graham, C. R., Spring, K. J., Drysdale, J. S., & Henrie, C. R. (2014). A thematic analysis of the most highly cited scholarship in the first decade of blended learning research. The Internet and Higher Education, 20, 20–34. https://doi.org/10.1016/j.iheduc.2013.09.004.View ArticleGoogle Scholar
  53. Hargittai, E., & Shafer, S. (2006). Differences in actual and perceived online skills: The role of gender. Social Science Quarterly, 87(2), 432–448. https://doi.org/10.1111/j.1540-6237.2006.00389.x.View ArticleGoogle Scholar
  54. Hung, M.-L. (2016). Teacher readiness for online learning: Scale development and teacher perceptions. Computers & Education, 94, 120–133. https://doi.org/10.1016/j.compedu.2015.11.012.View ArticleGoogle Scholar
  55. Hung, M.-L., Chou, C., & Chen, C.-H. (2010). Learner readiness for online learning: Scale development and student perceptions. Computers & Education, 55(3), 1080–1090. https://doi.org/10.1016/j.compedu.2010.05.004.View ArticleGoogle Scholar
  56. Ianishevska, K. (2017). Ukrainian ICT trends–moderate moves and fresh optimism. Ukraine Democracy Initiative Retrieved from http://ukrainedemocracy.org/?articles=article-ukrainian-ict-trends-moderate-moves-fresh-optimism.
  57. IEEE (1990). IEEE standard computer dictionary: A compilation of IEEE standard computer glossaries. New York: The Institute of Electrical and Electronics Engineers (IEEE).
  58. Iordache, E., Mariën, I., & Baelden, D. (2017). Developing digital skills and competences: A quick-scan analysis of 13 digital literacy models. Italian Journal of Sociology of Education, 9(1), 6–30. https://doi.org/10.14658/pupj-ijse-2017-1-2.
  59. Jonassen, D. H., & Rohner-Murphy, L. (1999). Activity as a framework for designing constructivist learning environments. Educational Technology Research & Development, 47(1), 61–79.
  60. Jun, S., Han, S., Kim, H., & Lee, W. (2014). Assessing the computational literacy of elementary students on a national level in Korea. Educational Assessment, Evaluation and Accountability, 26(4), 319–332. https://doi.org/10.1007/s11092-013-9185-7.
  61. Kogyt, I. (2016). Хлопчики – направо, дівчатка – наліво. Як виглядає гендерна рівність в освіті [Boys to the right, girls to the left. How gender equality looks in education]. Ukrainian Pravda. Retrieved from https://life.pravda.com.ua/society/2016/07/6/214737/.
  62. Kosinski, M., Matz, S. C., Gosling, S. D., Popov, V., & Stillwell, D. (2015). Facebook as a research tool for the social sciences: Opportunities, challenges, ethical considerations, and practical guidelines. American Psychologist, 70(6), 543–556. https://doi.org/10.1037/a0039210.
  63. Kuuti, K. (1995). Activity theory as a potential framework for human-computer interaction research. In B. A. Nardi (Ed.), Context and consciousness: Activity theory and human computer interaction, (pp. 17–44). Cambridge: The MIT Press.
  64. Langemeyer, I. (2011). Science and social practice: Action research and activity theory as socio-critical approaches. Mind, Culture, and Activity, 18(2), 148–160. https://doi.org/10.1080/10749039.2010.497983.
  65. Lin, H.-H., Lin, S., Yeh, C.-H., Wang, Y.-S., & Jansen, J. (2015). Measuring mobile learning readiness: Scale development and validation. Internet Research, 26(1), 265–287. https://doi.org/10.1108/IntR-10-2014-0241.
  66. Litt, E. (2013). Measuring users’ internet skills: A review of past assessments and a look toward the future. New Media & Society, 15(4), 612–630. https://doi.org/10.1177/1461444813475424.
  67. Luhn, A. (2017). Ukraine blocks popular social networks as part of sanctions on Russia. The Guardian. Retrieved from https://www.theguardian.com/world/2017/may/16/ukraine-blocks-popular-russian-websites-kremlin-role-war.
  68. Mittelmeier, J., Heliot, Y., Rienties, B., & Whitelock, D. (2015). The role culture and personality play in an authentic online group learning experience. Paper presented at EDiNEB 22, Brighton Business School, Brighton.
  69. Mykhailenko, O., Blayone, T., & vanOostveen, R. (2016). Exploring democratized learning and dimensions of culture for educational transformation in Ukraine. RIDRU Conference: Higher Education Reforms in Post-Maidan Ukraine, October 25, 2016. [Video presentation]. Retrieved from https://www.youtube.com/watch?v=6_xOb3LpLfY#t=3h4m56s
  70. Nakamura, J., & Csikszentmihalyi, M. (2002). The concept of flow. In C. R. Snyder, & S. J. Lopez (Eds.), Handbook of positive psychology, (pp. 89–105). Oxford: Oxford University Press.
  71. Palalas, A., Berezin, N., Gunawardena, C. N., & Kramer, G. (2015). A design based research framework for implementing a transnational mobile and blended learning solution. International Journal of Mobile and Blended Learning, 7(4), 57–74. https://doi.org/10.4018/IJMBL.2015100104.
  72. Pangrazio, L. (2014). Reconceptualising critical digital literacy. Discourse: Studies in the Cultural Politics of Education, 37(2), 163–174. https://doi.org/10.1080/01596306.2014.942836.
  73. Parasuraman, A. (2000). Technology readiness index (TRI): A multiple-item scale to measure readiness to embrace new technologies. Journal of Service Research, 2(4), 307–320. https://doi.org/10.1177/109467050024001.
  74. Parkes, M., Stein, S., & Reading, C. (2015). Student preparedness for university e-learning environments. The Internet and Higher Education, 25, 1–10. https://doi.org/10.1016/j.iheduc.2014.10.002.
  75. Parrish, P., & Linder-VanBerschot, J. (2010). Cultural dimensions of learning: Addressing the challenges of multicultural instruction. The International Review of Research in Open and Distributed Learning, 11(2), 1–19.
  76. Paulsen, M. F. (2003). Online education and learning management systems: Global e-Learning in a Scandinavian perspective (1st ed.). Bekkestua: NKI Forlaget.
  77. Paulsen, M. F. (2008). Cooperative online education. Seminar.net: International Journal of Media, Technology and Lifelong Learning, 4(2), 1–20.
  78. Pillay, H., Irving, K., & Tones, M. (2007). Validation of the diagnostic tool for assessing tertiary students’ readiness for online learning. Higher Education Research & Development, 26(2), 217–234. https://doi.org/10.1080/07294360701310821.
  79. Powell, D. V., Kuzmina, S., Kuzmin, Y., Yamchynska, T., & Shestopalyuk, O. (2014). Using web-blended learning in Ukraine to facilitate engagement and globalize horizons: A pilot study. The Online Journal of Distance Education and e-Learning, 2(2), 34–41.
  80. Powell, D. V., Kuzmina, S., Yamchynska, T., Shestopalyuk, O. V., & Kuzmin, Y. (2015). Educational technologies for maturing democratic approaches to educational practices in Ukraine. Procedia-Social and Behavioral Sciences, 176, 378–385. https://doi.org/10.1016/j.sbspro.2015.01.485.
  81. Prometheus. (2017). Retrieved from https://prometheus.org.ua/. Accessed 9 July 2018.
  82. Raikhel, E., & Bemme, D. (2016). Postsocialism, the psy-ences and mental health. Transcultural Psychiatry, 53(2), 151–175. https://doi.org/10.1177/1363461516635534.
  83. Raver, S. A. (2007). The emergence of inclusion for students with disabilities in Ukraine. International Journal of Special Education, 22(1), 32–38.
  84. Richardson, J. C., Arbaugh, J. B., Cleveland-Innes, M., Ice, P., Swan, K. P., & Garrison, D. R. (2012). Using the community of inquiry framework to inform effective instructional design. In L. Moller, & J. B. Huett (Eds.), The next generation of distance education: Unconstrained learning, (pp. 97–125). New York: Springer-Verlag.
  85. Roztocki, N., & Weistroffer, H. R. (2015). Information and communication technology in transition economies: An assessment of research trends. Information Technology for Development, 21(3), 330–364. https://doi.org/10.1080/02681102.2014.891498.
  86. Sapsford, R., Abbott, P., Haerpfer, C., & Wallace, C. (2015). Trust in Post-Soviet Countries, ten years on. European Politics and Society, 16(4), 523–539. https://doi.org/10.1080/23745118.2015.1039286.
  87. Siemens, G., Gašević, D., & Dawson, S. (2015). Preparing for the digital university: A review of the history and current state of distance, blended, and online learning. Retrieved from http://linkresearchlab.org/PreparingDigitalUniversity.pdf
  88. Social Progress Imperative. (2017a). Social Index Scorecard: Georgia. Retrieved from https://www.socialprogressindex.com/?tab=2&code=GEO. Accessed 9 July 2018.
  89. Social Progress Imperative. (2017b). Social Index Scorecard: Ukraine. Retrieved from https://www.socialprogressindex.com/?tab=2&code=UKR. Accessed 9 July 2018.
  90. Somekh, B., & Nissen, M. (2011). Cultural-historical activity theory and action research. Mind, Culture, and Activity, 18(2), 93–97. https://doi.org/10.1080/10749039.2010.523102.
  91. Statista. (2017). Global social media ranking 2017. Retrieved from https://www.statista.com/statistics/272014/global-social-networks-ranked-by-number-of-users/. Accessed 9 July 2018.
  92. Stern, S., Wares, A., & Epner, T. (2017). Social Progress Index 2017 Methodology Report. Retrieved from https://www.socialprogressindex.com/assets/downloads/resources/en/English-2017-Social-Progress-Index-Methodology-Report_embargo-until-June-21-2017.pdf. Accessed 9 July 2018.
  93. Swan, K. (2010). Teaching and learning in post-industrial distance education. In M. F. Cleveland-Innes, & D. R. Garrison (Eds.), An introduction to distance education: Understanding teaching and learning in a new era, (pp. 108–134). New York: Routledge.
  94. Swan, K., Garrison, D. R., & Richardson, J. (2009). A constructivist approach to online learning: The Community of Inquiry framework. In C. R. Payne (Ed.), Information technology and constructivism in higher education: Progressive learning frameworks, (pp. 43–57). Hershey: IGI Global.
  95. Synytsya, K., & Manako, A. (2010). eLearning in Ukraine. In U. Demiray, L. Vainio, M. C. Sahin, G. Kurubacak, P. T. Lounaskorpi, S. R. Rao, & C. Machado (Eds.), Cases on challenges facing e-Learning and national development: Institutional studies and practices, (pp. 989–1007). Eskisehir: Anadolu University.
  96. van Deursen, A. J. A. M., Helsper, E. J., & Eynon, R. (2016). Development and validation of the Internet Skills Scale (ISS). Information, Communication & Society, 19(6), 804–823. https://doi.org/10.1080/1369118X.2015.1078834.
  97. van Rooij, S. W., & Zirkle, K. (2016). Balancing pedagogy, student readiness and accessibility: A case study in collaborative online course development. The Internet and Higher Education, 28, 1–7. https://doi.org/10.1016/j.iheduc.2015.08.001.
  98. vanOostveen, R. (2016). Bachelor of Arts in Educational Studies and Digital Technology handbook [Internal program guide]. Oshawa: Faculty of Education, UOIT.
  99. vanOostveen, R., DiGiuseppe, M., Barber, W., & Blayone, T. (2016). Developing learning communities in fully online spaces. Paper presented at the Second International Symposium on Higher Education in Transformation, Oshawa.
  100. vanOostveen, R., DiGiuseppe, M., Barber, W., Blayone, T., & Childs, E. (2016). New conceptions for digital technology sandboxes: Developing a fully online learning communities (FOLC) model. In G. Veletsianos (Ed.), EdMedia 2016: World Conference on Educational Media and Technology, (pp. 665–673). Vancouver: Association for the Advancement of Computing in Education (AACE).
  101. Vuorikari, R., Punie, Y., Gomez, S. C., & Van Den Brande, G. (2016). DigComp 2.0: The Digital Competence Framework for Citizens. Update Phase 1: The Conceptual Reference Model (EUR 27948 EN). Luxembourg: Publications Office of the European Union. Retrieved from http://publications.jrc.ec.europa.eu/repository/handle/JRC101254.
  102. Watkins, R., Leigh, D., & Triner, D. (2004). Assessing readiness for e-learning. Performance Improvement Quarterly, 17(4), 66–79. https://doi.org/10.1111/j.1937-8327.2004.tb00321.x.

Copyright

© The Author(s) 2018