ORIGINAL RESEARCH ARTICLE

The Development of an Integrated Scale of Technology Use in Physics

Fikret Korura*, Sevda Yerdelen-Damarb and Havva Sağlamc

aDepartment of Mathematics and Science Education, Burdur Mehmet Akif Ersoy University, Burdur, Turkey; bDepartment of Mathematics and Science Education, Boğaziçi University, İstanbul, Turkey; cDepartment of Learning Sciences, Boğaziçi University, İstanbul, Turkey

Received: 6 April 2020; Revised: 8 November 2020; Accepted: 24 February 2021; Published: 27 April 2021

In this study, the Integrated Scale of Technology Use in Physics (ISTUP) was developed to determine students’ frequency of technology use, their perceptions about the effects of technology use on physics interest and achievement, and their preferences of technological tools and applications in learning physics. The scale was administered on two occasions to a total of 670 high school students taking physics courses. Exploratory and confirmatory factor analyses were conducted to validate the scale. The results of the study suggest that the ISTUP is a valid and reliable scale. Students’ frequency of technology use in learning physics corresponded to ‘sometimes’. Students perceived that technology use had a slightly positive effect on their interest and achievement. Findings regarding the interrelations between students’ preferences for technological tools and applications are also discussed.

Keywords: scale development; technology use; physics interest; technological tools preferences; confirmatory factor analysis

To access the supplementary material, please visit the article landing page.

*Corresponding author. Email: fikretkorur@mehmetakif.edu.tr

Research in Learning Technology 2021. © 2021 F. Korur et al. Research in Learning Technology is the journal of the Association for Learning Technology (ALT), a UK-based professional and scholarly society and membership organisation. ALT is registered charity number 1063519. http://www.alt.ac.uk/. This is an Open Access article distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), allowing third parties to copy and redistribute the material in any medium or format and to remix, transform, and build upon the material for any purpose, even commercially, provided the original work is properly cited and states its license.

Citation: Research in Learning Technology 2021, 29: 2432 - http://dx.doi.org/10.25304/rlt.v29.2432

Introduction

With the changing paradigm in education, different roles have been assigned to schools, teachers and learners (Reigeluth 2012). Learners are viewed as active constructors of their knowledge by using appropriate technological tools under the skillful guidance of their teachers. This shift towards interactivity between students, teachers and educational technology requires deep investigations of students’ perspectives about technological tools (Landry, Griffeth, and Hartman 2006). Students’ acceptance and use of educational technology tools are ultimately determined by their perceptions of the functions of these tools (Venkatesh and Bala 2008). Specifically, if students tend to believe that certain technologies are useful for their performance and easy to adopt, they possess positive intentions to use these technologies (Landry, Griffeth, and Hartman 2006). In this regard, investigating students’ perceptions of the effects of educational technology tools is of importance. Another important variable that is influential on students’ adoption of technological tools is their preferences (Mirriahi and Alonzo 2015). Students’ information technology use features such as their preferences and skills underwent a significant change with the rapid technological developments (Oblinger 2003). Since this transformation might have changed their expectations from their learning settings, it is important to consider their preferences while integrating technology into classrooms (Mirriahi and Alonzo 2015). The study of Gunuc and Kuzu (2014), which examines the involvement of both lecturers and students in technology use, and the study of Yurdugül and Aşkar (2008), which examines students’ attitudes towards technology use, reveal that the integration of the correct technologies in a specific course enhances students’ interest in the course. Accordingly, determining the technologies to be integrated into the physics class and the perceptions of students towards them is crucial. 
In this regard, this study aimed to develop ‘The Integrated Scale of Technology Use in Physics’ (ISTUP) that has a holistic structure with three main parts. These parts probe students’ (1) perceptions regarding the effects of technology use on their physics interest and academic achievement, (2) class-related technology use frequencies, and (3) preferences of technological tools and applications to be used in physics classes. Therefore, the two main purposes of this study are as follows:

  1. To develop and validate ‘The Integrated Scale of Technology Use in Physics’ (ISTUP) with data collected from high school students who take physics courses.
  2. To determine students’ perceptions about the effects of technology use on their physics interest and achievement, frequency of technology use, and preferences of technological tools and applications.

For the first purpose of the study, the research question ‘Is the ISTUP valid and reliable according to the results of the exploratory factor analysis (EFA) and confirmatory factor analysis (CFA)?’ was answered. In addition, based on the second purpose, the following research questions were answered: ‘What are the students’ perceptions about the effects of technology use on their interest and achievement in physics?’, ‘How frequently do students use technology when they learn physics?’ and ‘What are students’ preferences of technological tools and applications when they learn physics?’.

Background

Technology use in education is associated with cognitive and social engagement of students (Chen, Lambert, and Guidry 2010). New technologies may enable students to actively explore and collaborate in constructivist learning environments adopting contemporary teaching and learning models (O’Neil 1995). Technologies such as interactive notebooks, computer-based laboratory tools, internet-based drawings, texts, graphs or spreadsheets and multimedia or hypermedia systems are widely used in inquiry-based learning environments (Ebert et al. 2009; Hamilton 2007; Jonassen 2006). Innovative computer and internet-based technologies (Akpınar 2014; Çalik 2013; Çalik et al. 2014; Ebenezer et al. 2012), virtual science laboratories (Kozhevnikov, Gurlitt, and Kozhevnikov 2013), dynamic data visualisations and interactive learning platforms (Levy 2013), and animations in hypermedia (Ebenezer 2001) are other tools that support scientific inquiry activities in classrooms. Concept teaching materials supported by technology such as the internet and multimedia decrease students’ cognitive load (Alpert and Grueneberg 2001). In this way, the planned integration of technology with course content contributes to students’ conceptual understanding (Çalik et al. 2014; Linn and Eylon 2011). Furthermore, online concept maps enriched with digital content are used to manage students’ learning processes, which may positively affect their academic performance and attitudes (Korur, Toker, and Eryılmaz 2016).

Students’ perceptions of technology use

It is prevalently accepted that appropriate integration of technology into learning environments results in desired student outcomes. However, there are several complex processes that ultimately influence the adoption of technology in classrooms. Students’ own perceptions and beliefs regarding technological tools, and their consequent technology use behaviours, constitute one such complex process. Furthermore, technology integration into classrooms necessitates the investment of time and financial resources. Thus, investigating the potential impacts of these technologies, especially from the perspectives of students, might be important for the sustainability of technology use in classrooms (Landry, Griffeth, and Hartman 2006).

In this regard, the Technology Acceptance Model (TAM) explains the factors that influence individuals’ technology acceptance and technology use behaviours. The TAM was first developed by Davis, Bagozzi, and Warschaw (1989) to explain individuals’ use behaviours of information systems. Afterwards, this model was applied in educational research to investigate students’ reactions to technology use in classroom settings (Landry, Griffeth, and Hartman 2006). According to this model, individuals’ acceptance and usage of certain technologies ultimately depend on their perceptions of the functions of these technologies. Venkatesh and Bala (2008) indicated that two types of perceptions influence individuals’ intentions and behaviours of technology use: perceived usefulness and perceived ease of use. Perceived usefulness is defined in terms of individuals’ beliefs about whether a particular technology positively contributes to their performance (Davis, Bagozzi, and Warschaw 1989). Perceived ease of use, on the other hand, refers to individuals’ beliefs about whether using certain technologies requires little effort (Davis, Bagozzi, and Warschaw 1989). These two perceptions together influence individuals’ intentions to use certain technologies, which ultimately shape their technology use behaviours (Venkatesh and Bala 2008). That is, when individuals perceive certain technologies as useful and easy to use, they develop positive intentions towards using those technologies. Consequently, these positive intentions result in individuals’ adoption of the particular technology in practice (Venkatesh and Bala 2008).

Several research studies have investigated individuals’ technology use behaviours within the theoretical framework of the TAM. Ma, Andersson, and Streith (2005) investigated preservice teachers’ acceptance of computer technologies and indicated that perceived usefulness is a strong predictor of preservice teachers’ intentions to use these technologies in classrooms. Önal (2017) indicated that middle school students accepted the use of interactive whiteboards in math classes because they perceived the technology as useful. Specifically, students believed that the whiteboards positively contributed to their learning process and enhanced their interest in the course content (Önal 2017).

Students’ preferences of technology use

Oblinger (2003) indicated that features of students’ information technology use such as ownership, preference and skills have undergone a change with the rapid technological developments. Thus, the researcher argues that students’ expectations of their learning settings have changed, and this change should be considered by educators. In this regard, designing learning environments by considering students’ preferences of technology use might establish positive perceptions among students as well as enhance their acceptance levels of technologies (Mirriahi and Alonzo 2015). For example, Saeed, Yang, and Sinnappan (2009) claimed that individuals have different types of learning styles which ultimately influence their technology preferences during learning. Students learn in various ways; some of them tend to focus on facts, data, and algorithms, while others prefer to learn by focusing on visual information such as images, modelling, diagrams and simulations (Felder 1996). Therefore, it is important to identify the technologies that students want to use during their learning processes. Merely using a computer or the web to show a video or picture in an otherwise traditional lesson does not encourage learning, regardless of the teacher’s skills, the design of the chosen presentation program or the dynamism of the web environment (Felder and Brent 2017). Technology alone is not sufficient; it will only support learning when students interact with each other as well as with the digital environment (Felder and Brent 2017). This interaction will enable students to use technology for learning and consequently to prefer technologies that they benefit from. Saeed, Yang, and Sinnappan (2009) showed that students are eager to experience new technologies within their learning routines. Furthermore, students are flexible in expanding their learning styles while using different types of web-based technologies. The researchers also indicated that students’ preferences of technology use in classrooms are not restricted to a single tool, and that preferred and integrated technology use results in enhanced academic performance (Saeed, Yang, and Sinnappan 2009).

Investigating the varying demands, preferences and interests of students regarding technology use in classrooms might contribute to successful technology integration processes. However, there are limited studies investigating students’ preferences of technology use, and most of them were conducted at the university level. Conole et al. (2008) investigated university students’ preferences of technology use and the underlying reasons behind these preferences. The researchers found that university students use Wikipedia and Google Scholar to search for information, mobile phones and MSN Messenger to communicate with their friends about learning materials, and Office programs to do their homework. Skype is another application that is widely used by students since it enables face-to-face communication and instant feedback. The improvement of the interaction between the instructor and students, and even among students, as a result of using Facebook (Albayrak and Yildirim 2015) and Twitter (Liu 2018) can be mentioned as another example of the positive effects of social media tools.

Students’ preferences of technology use in learning physics might influence their acceptance of the technologies as well as their interaction time with these technologies. Since there is no scale measuring high school students’ perceptions of and preferences for technology use specifically for physics, there is a need to develop such a scale. In this regard, the ISTUP was developed within the scope of this study. For this purpose, the validity and reliability of the scale was investigated. In addition, students’ perceptions of the effects of technology use on their interest and achievement, their frequency of technology use, and their preferences of technological tools and applications in physics were examined.

Method

This study has a cross-sectional survey design and aims to develop and cross validate the ISTUP for students at the high school level (Fraenkel, Wallen, and Hyun 2012). The development process and content of the scale are briefly described in the following sections.

The participants of the study

To investigate the reliability and construct validity, the ISTUP was administered to 401 students (186 ninth graders, 113 tenth graders and 102 eleventh graders) taking physics courses in a public school. Outliers as well as incorrect and incomplete markings were excluded from the dataset, and the analysis was conducted with 356 cases. After making the necessary changes based on the EFA results, the second draft of the scale was administered to 324 students of another public school for the cross-validation of the results. Incomplete markings, missing data and outliers were removed, and the analysis was carried out with 314 cases.

Procedure of the scale development

Following a systematic approach throughout a scale development process is critical to ensure the content and construct validity of the scale (Şahin and Boztunç Öztürk 2018). In the literature, different scale development procedures are adopted by different researchers. For example, Morgado et al. (2017) analysed 105 ‘best practices’ in scale development studies and recommended three major steps: (1) item generation, (2) theoretical analysis and (3) psychometric analysis. These phases may be regarded as the most basic steps of a scale development process. Hinkin (1998) offered a six-stage scale development process including ‘item generation, scale management, initial item reduction, CFA, scale evaluation and replication on an independent sample’. Robertson (2017, pp. 8–17) offered a very detailed 13-step procedure for scale development: ‘decide what you want to measure, develop the theoretical foundations, generate item pool, write items, decide on number of items, establish content and face validity, decide on scaling/response options, conduct pre-test/pilot study of the scale, collect data, reduce item pool, establish dimensionality and validity, establish reliability, reproduce results’. Although there is a variety of scale development procedures in the literature, the ISTUP was developed by considering the eight phases emphasised by Şahin and Boztunç Öztürk (2018): (1) clearly decide what you want to measure and develop the theoretical foundations of related variables, (2) generate item pools, (3) decide the format of the scale, (4) expert review of the items, (5) ensure item validity, (6) administer the scale and conduct confirmatory analysis, (7) evaluate the items and (8) give the final form to the scale.

The phases adopted in the present study are described as follows:

  1. Clearly decide what you want to measure and develop the theoretical foundations of related variables: The scale was developed to determine students’ perceptions of the effects of technology use on their interest and achievement, frequency of technology use and their preferences of technology use in their physics learning. In this regard, a detailed theoretical framework is presented in the Background section.
  2. Generate item pools: At the beginning of this phase, the related scales in the literature were examined. Since there is no comprehensive scale that has been developed for the unique purposes of this study, the item pool was generated by compiling items from different resources. Specifically, items about the use of social media tools were obtained from the study of Afzal and Fardous (2016), items about web-based technology were taken from the study of Saeed, Yang, and Sinnappan (2009) and items about hardware use were compiled from the study of Başer et al. (2012). Further items were obtained from the scales developed by Öksüz, Ak and Uca Tabak (2009) investigating students’ perceptions regarding technology use in math class and by Özel (2009) and Frantom, Green, and Hoffman (2002) examining the effects of technology use.
  3. Decide the format of the scale: Since the scale was developed to measure students’ perceptions about the effects of technology use on their interest and achievement, as well as their frequency of technology use, students have to make three different markings for the same technological tool, equipment and software. In this regard, students are expected to read the same items and make separate markings three times. To enhance the validity of the implementation, three different five-point Likert-type columns were inserted next to the items to reduce the time required to make the markings. Thus, students make three different markings: for their perceptions about the effects of technology use on interest, for their perceptions about the effects of technology use on achievement and for their frequency of technology use. Furthermore, two open-ended questions were included to probe students’ preferences of technological tools to be used in physics teaching and learning. In sum, 16 Likert-type items and two open-ended items were included in the scale (see the ISTUP in the Appendix for the format of the items).
  4. Expert review of the items: Following the development of the item pool, opinions of four experts (one from the department of physics education and three from the department of computer education and educational technology) were taken. After the experts’ feedback was reviewed by the researchers, the initial version of the scale was formed. In this version, the first section includes 16 items. In this section, students’ perceptions about the effects of technology use on their interest and achievement (with a Likert-type response format: 1 for Absolutely decrease, 2 for Decrease, 3 for No effect, 4 for Increase and 5 for Absolutely increase) are measured in the first two columns. In the third column, frequency of technology use [with options 1 for never (0–1 h), 2 for sometimes (2–10 h), 3 for occasionally (11–24 h), 4 for often (1–2 days) and 5 for always (more than 2 days)] is measured. Students make three different markings for each item considering their perceptions about interest and achievement as well as frequency of technology use. Thus, the possible scores in each column of this section range from 16 to 80. The second section of the scale consists of two questions and requires students to place a check mark on a box for their preferred technological tools and applications during physics learning. Students are free to choose more than one option in each item, and an ‘others’ option is added at the end of each list.
  5. Ensure item validity: The initial version of the scale was administered to 401 students of a public high school. The EFA was conducted to ensure the validity of the scale. According to the results of the principal component analysis conducted with varimax rotation, one item [Item no 4: Communicating with my friends through social media (including e-mail)] was excluded from the scale since its factor loading was below the accepted value. Thus, the scale consisted of 15 items in the following implementations. No change (including format and item number) was required in the second section of the scale.
  6. Administer the scale and conduct confirmatory analysis: After the analyses were made for the data obtained from the first implementation, the scale was administered to 324 high school students from a different public school. The CFA was conducted with the data of 314 students to confirm the one-factor structure of the scale informed by the theory and the results of the EFA. Cronbach’s alpha was estimated for each section to determine the reliability of the results.
  7. Evaluate the items: In this phase, loadings of each item to the one-factor structure of the scale were examined.
  8. Give the final form to the scale: The finalised form of the scale is valid and reliable and consists of two sections. In the first section, there are 15 items measuring students’ perceptions about the effects of technology on their interest and achievement as well as frequency of technology use. In the second section of the scale, students’ preferences of technology use for physics learning are determined with two items, one of which probes students’ preferences of technological tools and the other one probes their preferences of technological programs and applications while learning physics.
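
The column scoring described in the phases above (16 Likert-type items, each marked 1–5, summed to a section score between 16 and 80) can be sketched as follows. This is a minimal illustration with a hypothetical function name and data, not the authors’ scoring software:

```python
def section_score(responses):
    """Sum one student's markings for a single column of the first section
    (interest, achievement or frequency).

    responses: 16 integers, each from the five-point scale (1-5),
    so the resulting section score ranges from 16 to 80.
    """
    if len(responses) != 16 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("expected 16 responses coded 1-5")
    return sum(responses)
```

For example, a student marking ‘No effect’ (3) on every item would receive a section score of 48, midway between the minimum of 16 and the maximum of 80.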

Data analysis

In the first part of the study, the construct validity of the ISTUP was determined by carrying out the EFA, using principal component analysis as the extraction method and varimax with Kaiser normalisation as the rotation method. Bartlett’s test of sphericity was used to test the adequacy of the correlation matrix. That is, it tests whether there are significant correlations among the scores of at least some of the items. The Kaiser–Meyer–Olkin (KMO) coefficient was used to test the sufficiency of the sample size (Ho 2013). The analyses of the first part were carried out with the Statistical Package for the Social Sciences (SPSS).
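
Both adequacy checks can be computed directly from the item correlation matrix. The sketch below (NumPy, not the SPSS implementation) uses the standard formulas: Bartlett’s chi-square is -(n - 1 - (2p + 5)/6) ln|R| with p(p - 1)/2 degrees of freedom, and the KMO coefficient compares observed correlations with partial correlations derived from the inverse of R:

```python
import numpy as np

def bartlett_sphericity(X):
    """Bartlett's test statistic and degrees of freedom for an n x p data matrix."""
    n, p = X.shape
    R = np.corrcoef(X, rowvar=False)
    chi2 = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) / 2
    return chi2, df

def kmo(X):
    """Overall Kaiser-Meyer-Olkin measure of sampling adequacy (0 to 1)."""
    R = np.corrcoef(X, rowvar=False)
    S = np.linalg.inv(R)
    # Partial correlations: a_ij = -S_ij / sqrt(S_ii * S_jj), diagonal zeroed
    A = -S / np.sqrt(np.outer(np.diag(S), np.diag(S)))
    np.fill_diagonal(A, 0.0)
    R_off = R - np.eye(R.shape[0])  # off-diagonal observed correlations
    return (R_off ** 2).sum() / ((R_off ** 2).sum() + (A ** 2).sum())
```

A KMO value near 1 (such as the 0.862–0.926 reported below) indicates that partial correlations are small relative to observed correlations, so the data are suitable for factor analysis.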

The CFA was carried out with LISREL 8.8 to test the one-factor structure of the scale. The maximum likelihood estimation method, based on the covariance matrix, was employed for the CFA. Multiple fit indices including the chi-square/degrees of freedom ratio (χ2/df), comparative fit index (CFI), normed fit index (NFI), root mean square error of approximation (RMSEA) and standardised root mean square residual (SRMR) were used to test whether the measurement model fits the data. The cutoff criteria of Schermelleh-Engel, Moosbrugger, and Muller (2003) for these fit indices were considered for an acceptable fit (2 < χ2/df ≤ 3; 0.05 < RMSEA ≤ 0.08; 0.05 < SRMR ≤ 0.10; 0.90 ≤ NFI < 0.95; 0.90 ≤ CFI < 0.95).

Prior to the EFA and CFA, data screening was conducted to check the data for accuracy, missing data, outliers, normality, and multicollinearity and singularity. Missing values on all items were less than 5%. According to Tabachnick and Fidell (2013), when the percentage of missing values is 5% or less, any method for dealing with missing values produces similar results. Missing values on the items were replaced with series means. No univariate outliers were observed; 10 cases were identified as multivariate outliers through Mahalanobis distance at p < 0.001 and were excluded from the CFA. No influential outliers (evaluated through Cook’s distance) or multicollinearity among items (inspected with tolerance and VIF values) were observed.
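
The Mahalanobis screening step can be sketched as follows. This is an illustrative NumPy implementation, not the authors’ code; the critical value of approximately 37.70 is the chi-square cutoff at p = 0.001 for 15 items (df = 15) and should be recomputed for a different item count:

```python
import numpy as np

def mahalanobis_d2(X):
    """Squared Mahalanobis distance of each case from the variable means."""
    X = np.asarray(X, dtype=float)
    diff = X - X.mean(axis=0)
    inv_cov = np.linalg.inv(np.cov(X, rowvar=False))
    # d2_i = (x_i - mean) @ inv_cov @ (x_i - mean)
    return np.einsum('ij,jk,ik->i', diff, inv_cov, diff)

# Chi-square critical value at p = 0.001 with df = 15 (assumed item count)
CRITICAL_D2 = 37.70

def multivariate_outliers(X, cutoff=CRITICAL_D2):
    """Boolean mask flagging cases whose distance exceeds the cutoff."""
    return mahalanobis_d2(X) > cutoff
```

Cases flagged by the mask would then be dropped before fitting the CFA, mirroring the exclusion of the 10 outlying cases described above.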

Results

The results of the study are presented in three sections. First, the results of the validation analysis of the scale are given. Then, the descriptive statistics for students’ perceptions about the effects of technology on their interest and achievement as well as their technology use frequencies are presented. Finally, the results of the correlations among students’ preferences of technological tools and applications are given.

The results for the validation of the scale

To answer the first research question, the EFA was conducted. The results indicated that the data displayed a similar structure for perceptions about interest and achievement. However, it should be noted that although a single scale is used, different perceptions regarding the effects of technology use are measured. All values were within the acceptable ranges (KMO value for the interest sub-scale, 0.862; Bartlett’s test of sphericity, χ² = 1439.143, p < 0.001; KMO value for the achievement sub-scale, 0.879; Bartlett’s test of sphericity, χ² = 1584.229, p < 0.001; KMO value for the frequency sub-scale, 0.926; Bartlett’s test of sphericity, χ² = 2707.483, p < 0.001).

One factor with an eigenvalue of 1.0 or higher was extracted by SPSS. This factor accounted for 31.98% of the variance in the interest section, 34.04% of the variance in the achievement section and 47.10% of the variance in the frequency section, values accepted as tolerable (Pituch and Stevens 2016). The items loaded on one factor with loadings between 0.417 and 0.662 for the interest section, between 0.361 and 0.725 for the achievement section, and between 0.588 and 0.774 for the frequency section, all above the critical limit of 0.32 recommended by Pituch and Stevens (2016). Similarly, the one-dimensional structure of the scale is supported based on the three criteria suggested by Büyüköztürk (2014): (1) high factor loadings, (2) significant variance explained by the factor, and (3) the eigenvalue of the first factor being more than three times the eigenvalue of the second factor.
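
The eigenvalue-based decisions in this paragraph (the Kaiser criterion of retaining components with eigenvalues of at least 1.0, and the proportion of variance each component explains) can be sketched from the correlation matrix. This is an illustrative NumPy computation, not the SPSS output:

```python
import numpy as np

def kaiser_summary(X):
    """Eigenvalues of the correlation matrix (descending), the proportion of
    variance each component explains, and the number retained under the
    Kaiser (eigenvalue >= 1.0) criterion."""
    R = np.corrcoef(X, rowvar=False)
    eigenvalues = np.sort(np.linalg.eigvalsh(R))[::-1]
    explained = eigenvalues / eigenvalues.sum()  # sums to 1 (trace of R = p)
    retained = int((eigenvalues >= 1.0).sum())
    return eigenvalues, explained, retained
```

Because the trace of a correlation matrix equals the number of items, `explained[0]` directly gives the share of total variance carried by the first component, the quantity reported as 31.98%, 34.04% and 47.10% above.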

Cronbach’s alpha, which is a measure of internal consistency among scores of items in the same dimension, was estimated as 0.85 for the interest section, as 0.86 for the achievement section and as 0.92 for the frequency section.
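
Cronbach’s alpha follows the standard formula α = k/(k - 1) · (1 - Σσ²ᵢ/σ²ₜ), where k is the number of items, σ²ᵢ the variance of each item and σ²ₜ the variance of the total scores. A minimal sketch (not the SPSS routine used in the study):

```python
import numpy as np

def cronbach_alpha(X):
    """Cronbach's alpha for an n x k matrix of item scores
    (rows = respondents, columns = items)."""
    X = np.asarray(X, dtype=float)
    k = X.shape[1]
    item_variances = X.var(axis=0, ddof=1).sum()
    total_variance = X.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)
```

When items covary strongly, the total-score variance dominates the summed item variances and alpha approaches 1, which is why the highly consistent frequency section reaches 0.92.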

The scale was re-administered to a different sample for the purpose of making the CFA and descriptive analyses. Cronbach’s alpha was 0.86 for the interest section, 0.83 for the achievement section and 0.91 for the frequency section. Considering the accepted threshold value for Cronbach’s alpha, at least 0.70 (Pallant 2007), it can be said that the results of the scale are reliable.

The results of the CFA for each sub-scale supported the one-factor structure of the sub-scales after allowing correlations among the error terms of some items. The decisions for these correlations were made based on the modification indices in the LISREL output (see Figure 1). Goodness-of-fit indices were in the acceptable range: χ2(78, N = 314) = 215.794, χ2/df = 2.77, NFI = 0.94, CFI = 0.96, RMSEA = 0.07, SRMR = 0.06 for the interest section; χ2(79, N = 314) = 196.832, χ2/df = 2.49, NFI = 0.92, CFI = 0.95, RMSEA = 0.07, SRMR = 0.06 for the achievement section; and χ2(81, N = 314) = 231.605, χ2/df = 2.85, NFI = 0.96, CFI = 0.98, RMSEA = 0.08, SRMR = 0.06 for the frequency section. The factor loadings calculated for the EFA and CFA data are presented in Table 1. All items have significant factor loadings (p < 0.05). As mentioned above, Pituch and Stevens (2016) suggested a threshold value of 0.32 for the magnitude of factor loadings. Only the factor loading of Item 2 for the CFA of the achievement section is smaller than this value (see Table 1). Item 2, ‘Using spreadsheet (Excel) programs’, is retained in the scale; however, future studies should take into account the low factor loading of this item. R² values, which indicate the variance in each item explained by the factor, range from 0.12 to 0.59, corresponding to medium to large effect sizes according to the thresholds of Cohen and Cohen (1983). The path diagrams for the three sections of the first part of the scale are given separately in Figure 1a, b and c.
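
Two of the fit statistics reported above can be reproduced (up to rounding) directly from χ², df and N with the standard formulas; this sketch assumes the usual sample-size-adjusted RMSEA definition and is illustrative, not the LISREL source:

```python
import math

def chi2_per_df(chi2, df):
    """Normed chi-square: values of 3 or below indicate acceptable fit here."""
    return chi2 / df

def rmsea(chi2, df, n):
    """Root mean square error of approximation:
    sqrt(max(chi2 - df, 0) / (df * (n - 1)))."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# Interest section, chi2(78, N = 314) = 215.794:
#   chi2/df is about 2.77 and RMSEA is about 0.075,
#   consistent with the reported values of 2.77 and 0.07-0.08.
```

Applying the same functions to the achievement and frequency sections yields about 2.49 and 0.069, and about 2.86 and 0.077, again matching the reported indices within rounding.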

Figure 1. (a) The path diagram of the factor structure of the interest section. (b) The path diagram of the factor structure of the achievement section. (c) The path diagram of the factor structure of the frequency section.

Table 1. Factor loading values of each item for the three sub-scales.
Items  Interest EFA  Interest CFA  Achievement EFA  Achievement CFA  Frequency EFA  Frequency CFA
1. Using word processor (Word) 0.434 0.38 0.361 0.35 0.641 0.55
2. Using spreadsheet (Excel) programs 0.417 0.35 0.370 0.22 0.652 0.57
3. Using presentation (PowerPoint, Prezi, etc.) programs 0.444 0.37 0.472 0.35 0.655 0.59
4. Communicating with my teacher through social media (including e-mail) 0.483 0.55 0.387 0.42 0.631 0.60
5. Following the channels or related videos of teachers who are teaching physics lessons on YouTube 0.600 0.53 0.548 0.43 0.648 0.56
6. Benefiting from pages related to the course I follow on social media 0.578 0.68 0.555 0.59 0.611 0.61
7. Using educational software (The Digital Educational Platform of Turkey-EBA, Vitamin, etc.) 0.545 0.68 0.611 0.55 0.705 0.64
8. Watching animations and simulations about the lesson 0.613 0.58 0.594 0.64 0.728 0.73
9. Using technology / computer-aided laboratory equipment 0.662 0.51 0.725 0.56 0.774 0.72
10. Using the smart board 0.600 0.59 0.580 0.52 0.588 0.44
11. Using robot sets (Makey Makey, Arduino, LEGO Mindstorms, etc.) 0.643 0.41 0.708 0.47 0.680 0.67
12. Following / using online Web 2.0 tools (PowToon, flip quiz, cartoon maker, story jumper, etc.) 0.616 0.45 0.664 0.45 0.724 0.71
13. Using the websites that our teacher uses and recommends in the course 0.603 0.64 0.653 0.48 0.748 0.77
14. Using applications related to the course I downloaded on my mobile phone 0.603 0.62 0.665 0.60 0.740 0.58
15. Doing technology supported projects (eTwinning etc.) 0.564 0.42 0.678 0.44 0.738 0.62
N(EFA) = 356 and N(CFA) = 314.

The results for students’ perceptions of the effects of technology use and their frequency of technology use

In this section, the results regarding the second research question are presented. Table 2 presents the mean scores and standard deviations related to students’ perceptions about the effects of technology use on their interest and achievement in physics as well as their frequency of use of each technology.

Table 2. Descriptive statistics for the interest, achievement and frequency sections of the Integrated Scale of Technology Use in Physics.
Items  Interest Mean  Interest S.d.  Achievement Mean  Achievement S.d.  Frequency Mean  Frequency S.d.
1. Using word processor (Word) 3.20 0.64 3.27 0.65 1.60 0.81
2. Using spreadsheet (Excel) programs 3.20 0.65 3.23 0.61 1.51 0.78
3. Using presentation (PowerPoint, Prezi, etc.) programs 3.37 0.72 3.36 0.67 1.71 0.90
4. Communicating with my teacher through social media (including e-mail) 3.62 0.77 3.66 0.75 1.77 1.06
5. Following the channels or related videos of teachers who are teaching physics lessons on YouTube 4.07 0.81 4.28 0.65 2.67 1.13
6. Benefiting from pages related to the course I follow on social media 3.80 0.76 3.92 0.69 2.18 1.12
7. Using educational software (The Digital Educational Platform of Turkey-EBA, Vitamin, etc.) 3.74 0.77 3.89 0.78 2.10 1.05
8. Watching animations and simulations about the lesson 4.09 0.76 4.10 0.74 2.12 1.11
9. Using technology / computer-aided laboratory equipment 4.12 0.79 4.12 0.73 2.04 1.21
10. Using the smart board 3.66 0.75 3.69 0.74 2.44 1.32
11. Using robot sets (Makey Makey, Arduino, Lego Mindstorms, etc.) 3.86 0.87 3.80 0.84 1.72 1.15
12. Following / using online Web 2.0 tools (PowToon, flip quiz, cartoon maker, story jumper, etc.) 3.52 0.84 3.57 0.78 1.66 1.01
13. Using the websites that our teacher uses and recommends in the course 3.72 0.82 3.81 0.77 1.89 1.01
14. Using applications related to the course I downloaded on my mobile phone 3.86 0.77 3.97 0.70 2.31 1.18
15. Doing technology supported projects (eTwinning, etc.) 3.54 1.00 3.55 1.01 1.76 1.12
N = 314.

The means of the items on students’ perceptions of how technology use in physics instruction affects their interest and achievement in physics ranged from 3.20 to 4.28 (Table 2). These mean values were interpreted using the threshold bands for five-point Likert-type scales: absolutely decreases (1.00–1.80), decreases (1.81–2.60), no effect (2.61–3.40), increases (3.41–4.20) and absolutely increases (4.21–5.00) (Tekin 2002). Based on mean scores, the use of the specified technologies ‘increases’ students’ interest in physics (X = 3.69). Similarly, students indicated that their achievement in physics increases with the use of the specified technologies (X = 3.75). Students also reported that ‘following the channels or related videos of teachers who are teaching physics lessons on YouTube’, ‘using technology/computer-aided laboratory equipment’ and ‘watching animations and simulations about the lesson’ were the three technology-related activities that enhanced their interest and achievement the most. Compared to other technological tools and programs, ‘using word processor (Word)’ and ‘using spreadsheet (Excel) programs’ had lower mean scores; students’ perceptions of the effect of these technologies can thus be described as neutral.

The bands for the frequency section of the ISTUP were determined as follows: never (1.00–1.80), sometimes (1.81–2.60), often (2.61–3.40), usually (3.41–4.20) and always (4.21–5.00). The means of the items probing how frequently students use the technological tools and programs indicated that the time students spent was below the mid-point corresponding to 11–24 h a week. Mean frequencies of technology use in the physics course ranged between 1.51 and 2.44, with an overall mean of 1.96, which corresponds to ‘sometimes’. Among the technological activities, the frequency of use was ‘often’ only for ‘following the channels or related videos of teachers who are teaching physics lessons on YouTube’. Students used all the remaining technological tools and programs at the level of ‘sometimes’ or ‘never’.
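The band-based interpretation of mean scores described above can be sketched in code. In this illustrative Python helper (the function name is ours; the cut-offs and labels follow Tekin 2002 as cited in the text), a five-point Likert mean is mapped onto its verbal category:

```python
def likert_band(mean, labels):
    """Map a five-point Likert mean onto one of five labelled bands.

    Cut-offs follow the text: 1.00-1.80, 1.81-2.60, 2.61-3.40,
    3.41-4.20 and 4.21-5.00.
    """
    cutoffs = [1.80, 2.60, 3.40, 4.20, 5.00]
    for cutoff, label in zip(cutoffs, labels):
        if mean <= cutoff:
            return label
    raise ValueError("mean must lie within 1.00-5.00")


effect_labels = ["absolutely decreases", "decreases", "no effect",
                 "increases", "absolutely increases"]
frequency_labels = ["never", "sometimes", "often", "usually", "always"]

# Overall means reported in the text:
print(likert_band(3.69, effect_labels))      # interest -> "increases"
print(likert_band(1.96, frequency_labels))   # frequency -> "sometimes"
```

Applied to the item means in Table 2, the same helper reproduces the reported categories, for example ‘often’ for the YouTube item (mean 2.67).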

The results for students’ preferences of technology use while learning physics

The rate of students preferring to use technological tools ranged from 40.1% to 73.5%. The most preferred tool was ‘virtual reality glasses’, whereas the least preferred tool was ‘multifunctional calculator’. In descending order, the percentages of students’ preferences for technological tools were 73.5% for virtual reality glasses, 72.5% for 3D printers, 67.6% for computers, 66.7% for computer-based laboratory tools, 63.3% for smart phones, 60.5% for robots, 59.6% for wearable technologies, 59% for tablets, 54.9% for smart boards and 40.1% for multifunctional calculators. On the other hand, the percentage of students preferring to use technological programs and applications ranged from 26.9% to 68.5%. Students preferred ‘lecturing videos’ the most and ‘other Web 2.0 tools’ the least. The percentages of students’ preferences for the technological programs and applications were 68.5% for lecturing videos, 68.2% for virtual reality applications, 67.3% for educational videos, 65.7% for simulations, 65.4% for animations, 61.1% for digital educational games, 55.9% for augmented reality applications, 52.5% for social media, 51.9% for mobile applications, 49.4% for educational websites, 46.9% for QR-encoded virtual reality environments, 38.9% for interactive videos, 35.2% for electronic printed resources, 33% for online concept maps, 31.8% for the distance education system and 26.9% for other Web 2.0 tools.

Students’ preferences were coded ‘1’ if the participant ticked the tool and ‘0’ otherwise. Since each preference was coded dichotomously, the phi correlation coefficient, which indicates the association between two dichotomous variables, was estimated for the correlations among students’ preferences of technological tools and of technological programs and applications. The magnitude of the phi values ranged from 0.01 to 0.58 (see Supplementary Material_Table A).
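For two 0/1-coded variables, the phi coefficient can be computed directly from the 2×2 contingency table (it equals Pearson’s r on the binary vectors). The following sketch illustrates the computation; the function name and example vectors are ours, not the study’s data:

```python
import math


def phi_coefficient(x, y):
    """Phi coefficient between two 0/1-coded preference vectors.

    Built from the 2x2 contingency table:
    phi = (ad - bc) / sqrt((a+b)(c+d)(a+c)(b+d)).
    """
    a = sum(1 for xi, yi in zip(x, y) if xi == 1 and yi == 1)
    b = sum(1 for xi, yi in zip(x, y) if xi == 1 and yi == 0)
    c = sum(1 for xi, yi in zip(x, y) if xi == 0 and yi == 1)
    d = sum(1 for xi, yi in zip(x, y) if xi == 0 and yi == 0)
    denom = math.sqrt((a + b) * (c + d) * (a + c) * (b + d))
    return (a * d - b * c) / denom if denom else 0.0


# Hypothetical preference vectors for two tools (1 = ticked, 0 = not ticked)
computers = [1, 1, 0, 1, 0, 1, 1, 0]
smart_phones = [1, 1, 0, 1, 0, 0, 1, 1]
print(round(phi_coefficient(computers, smart_phones), 2))  # -> 0.47
```

A positive phi here would indicate, as in the study, that students who ticked one tool tended to also tick the other.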

There was a strong positive relationship between students’ preference of computers and smart phones (φ = 0.50), while there was almost no association between preferences of computers and virtual reality glasses (φ = 0.01). That is, there was a consistency between students’ preference of computers and smart phones, while there was no consistency between their preference of computers and virtual reality glasses. There was also a strong positive correlation between students’ preferences of virtual reality glasses and 3D printers (φ = 0.58).

The phi correlation coefficients ranged between 0.02 and 0.60 for students’ preferences of technological programs and applications (see Supplementary Material_Table B). There was a strong positive relationship between students’ preferences of using online concept maps and other Web 2.0 tools (φ = 0.60). Considering the low percentages of preference for these two applications, it can be said that the students who did not prefer using online concept maps also did not prefer to use other Web 2.0 tools. There were also strong positive correlations between students’ preferences of the distance education system and other Web 2.0 tools (φ = 0.51); between online concept maps and interactive videos; between virtual reality applications and QR-encoded virtual reality environments; and between animations and simulations. There was almost no association between preferences of simulations and educational websites (φ = 0.02) or between simulations and lecturing videos (φ = 0.02).

The correlations between students’ preferences of technological programs/applications and technological tools ranged between 0.01 and 0.49 (see Supplementary Material_Table C). There was a strong positive relationship between students’ preferences of virtual reality applications and virtual reality glasses (φ = 0.49). There was also a strong positive correlation between students’ preferences of social media and smart phones as a tool (φ = 0.40). On the other hand, there was almost no association between preferences of simulations and smart phones (φ = 0.01) or between educational websites and 3D printers (φ = 0.01).

Discussion and conclusion

This study mainly aimed to develop ‘The Integrated Scale of Technology Use in Physics’ (ISTUP), measuring students’ perceptions and preferences of technological equipment and programs to be used in physics teaching and learning. To this end, validity and reliability analyses were conducted for the scale. The ISTUP was found to be a valid and reliable scale for students taking physics courses at high school level. Both the EFA and CFA results supported the construct validity of the scale. The well-planned and systematic scale development process adopted in this study may explain these valid and reliable results. The study put forth evidence for the factorial validity, construct validity and internal consistency of the ISTUP (Field 2009; Tabachnick and Fidell 2013). In addition, the structural, semantic and conceptual equivalences of the scale items were established by consulting the literature and experts. The ISTUP can therefore be used by other researchers and instructors to probe high school students’ perceptions regarding the effects of technology use on their physics interest and achievement. It can also be used in studies designed to investigate students’ class-related technology use frequencies and in studies aiming to identify students’ preferences of technological tools and applications to be used in physics teaching and learning.

The ISTUP can be used to integrate appropriate technology, based on students’ preferences, into physics lessons in order to increase students’ academic achievement. Supporting this result, the relevant literature implies that technologies integrated into learning environments (Saeed, Yang, and Sinnappan 2009) and adopted and chosen by students (Felder and Brent 2017; Landry, Griffeth, and Hartman 2006) are sustainable and have positive effects on students’ interest and academic achievement. Furthermore, the students in this study were open to using technological tools and applications while learning physics. This result is compatible with the findings of Başer et al. (2012), which demonstrated students’ positive perceptions of, and low resistance to, technology use. Similarly, Conole et al. (2008) indicated that students choose technological tools such as Wikipedia and mobile phones according to their own learning needs, and this may positively influence their learning processes.

This study also investigated students’ preferences of technological tools and applications and the interrelations among them. In terms of technological tools, more than 50% of students preferred to use virtual reality glasses, 3D printers, computers, computer-based laboratory tools, smart phones, robots, wearable technologies, smart boards and tablets, whereas only around 40% of students preferred to use multifunctional calculators in physics teaching and learning. More than 50% of students preferred lecturing videos, virtual reality applications, educational videos, simulations, animations, digital educational games, augmented reality applications, social media and mobile applications as technological programs and applications. The preference percentages discussed in the results section imply that students tend to choose popular but rarely used technological tools and applications over commonly used ones. This could be because students view new and different technologies as tools that should be used for effective learning, especially in courses such as physics that focus mostly on concept teaching.

Students also preferred technological applications such as lecturing videos, simulations and animations, with which they are more acquainted. One reason why students often preferred these technological programs and applications in learning physics can be found in the results of Korur, Toker, and Eryılmaz (2016), who reported that such applications reduced students’ cognitive load and positively affected their physics learning by increasing the processing time of information in short-term memory. However, it has also been stated that students who are used to learning with instructions may remain passive, so the expected increase in motivation or achievement may not occur (Ramma et al. 2018). Therefore, it would be appropriate for physics teachers to integrate these highly preferred technologies into the relevant subjects in the curriculum in a systematic order, taking into account the periods when students have to learn independently outside of school (such as the current pandemic period). Another reason why students prefer technological applications and tools at such high percentages may be their need to learn physics meaningfully, a need that their existing tools and equipment may not meet.

As Felder and Brent (2017) stated, students’ interaction with the web environment enables them to use technology for learning, and they prefer technologies that support their learning. This supports their preferences for technological applications that they frequently use and are familiar with. Engaging in concept learning in physics may stimulate students to prefer applications with which they have previous experience, such as lecturing videos, or that they believe to be useful, like virtual reality applications. More than half of the students preferred to use social media (such as Skype, Facebook or WhatsApp) for learning purposes. This is compatible with the results of studies implying that social networks can encourage students to develop a positive attitude towards the lesson by increasing teacher–student interactions, not only for students participating in the classroom but also for students who want to participate outside the classroom (Afzal and Fardous 2016; Albayrak and Yıldırım 2015; Liu 2018).

The correlations among the preferences imply that students preferring computers tended to use mobile phones but not virtual reality glasses. This might imply that students preferred technologies they are familiar with. In contrast, students choosing virtual reality glasses tended to use 3D printers. Thus, it can be said that students preferring novel technologies tended not to favour frequently used technologies. This might stem from students’ sense of wonder as well as their ineffective experiences with the commonly used technologies. The high correlations between technological applications that students rarely preferred (online concept maps and other Web 2.0 applications) can also be explained by this notion. In contrast, the high correlation between preferences of animations and simulations may stem from students’ positive experiences with both of these tools and their consequent high rates of choice. Thus, students may have preferred a technological tool if they perceived it as ‘beneficial’ in light of their previous experiences, whereas preferences for newly popular technological tools and applications may be affected by students’ sense of wonder and their views about the effectiveness of the technologies. In addition, students choosing virtual reality applications tended to prefer virtual reality glasses, and students favouring social media tended to prefer mobile phones. Similarly, students preferring simulations did not tend to prefer mobile phones, and students choosing educational web pages did not prefer 3D printers. Thus, students tend to prefer related technological tools and applications; specifically, students preferring particular technological applications are more likely to prefer the technological tools that can be used for those applications. This finding can be interpreted as evidence of the consistency of students’ responses on the scale.

This study has a number of limitations. First, the scale measures the effects and preferences of technology use based on students’ perceptions. Developing this kind of scale requires a careful item construction process, a pilot study and expert opinions, all of which were considered throughout the development of the scale. Nonetheless, self-evaluation scales of this kind may prevent students from expressing their real thoughts. This problem may be mitigated by constructing clear and comprehensible items, as intended in this study. Furthermore, the researchers tried to lessen the effect of this limitation by using few items in the scale, giving participants sufficient time, requiring participants to read all the items and conducting extreme-value analysis. Second, the ISTUP was administered to high school students in Turkey and validated only for this specific sample. The scale can be administered to high school students in different countries and cultures, and cross-cultural studies can be conducted. The technological tools, programs and applications specified in the scale were determined by the researchers of this study; thus, future replications in different cultures and at later times should take the unique features of their own contexts into consideration.

In conclusion, this study contributes to literature by developing a valid and reliable technology use scale. In addition, the findings obtained from the second part of the study can give different insights into the nature of students’ preferences of technological tools and applications. Determining students’ technology preferences with a systematic measurement may inform the design of relevant technology-integrated learning environments from students’ point of view. The findings of this study can be also used as a resource for studies that investigate variables related to students’ physics achievement as well as the impact of certain technology integrations.

Acknowledgements

This study was supported by the Bogazici University Research Fund (Project Number 14461).

References

Afzal, M. T. & Fardous, N. (2016) ‘Students’ preferences of technology usage for their learning engagement’, American Journal of Educational Research, vol. 4, no.10, pp. 749–751. doi: 10.12691/education-4-10-7

Akpınar, E. (2014) ‘The use of interactive computer animations based on POE as a presentation tool in primary science teaching’, Journal of Science Education and Technology, vol. 23, pp. 527–537. doi: 10.1007/s10956-013-9482-4

Albayrak, D. & Yildirim, Z. (2015) ‘Using social networking sites for teaching and learning: students’ involvement in and acceptance of Facebook® as a course management system’, Journal of Educational Computing Research, vol. 52, no. 2, pp. 155–179. doi: 10.1177/0735633115571299

Alpert, S. R. & Grueneberg, K. (2001) ‘Multimedia in concept maps: a design rationale and web-based application’, in Proceedings of Ed-Media 2001: World Conference on Educational Multimedia, Hypermedia and Telecommunications, eds C. Montgomerie & J. Viteli, pp. 31–36.

Başer, V. G., et al. (2012) ‘Perceptions of students about technology integration’, Education Sciences, vol. 7, no. 2, pp. 591–598.

Büyüköztürk, Ş. (2014) Sosyal bilimler için veri analizi el kitabı: İstatistik, araştırma deseni, SPSS uygulamaları ve yorum, 15th edn, Pegem Akademi, Ankara.

Çalik, M. (2013) ‘Effect of technology-embedded scientific inquiry on senior science student teachers’ self-efficacy’, Eurasia Journal of Mathematics, Science & Technology Education, vol. 9, no. 3, pp. 223–232. doi: 10.12973/eurasia.2013.931a

Çalik, M., et al. (2014) ‘Improving science student teachers’ self-perceptions of fluency with innovative technologies and scientific inquiry abilities’, Journal of Science Education and Technology, vol. 24, no. 4, pp. 448–460. doi: 10.1007/s10956-014-9529-1

Chen, P. S. D., Lambert, A. D. & Guidry, K. R. (2010) ‘Engaging online learners: the impact of web-based learning technology on college student engagement’, Computers & Education, vol. 54, pp. 1222–1232. doi: 10.1016/j.compedu.2009.11.008

Cohen, J. & Cohen, P. (1983) Applied Multiple Regression/Correlation Analysis for the Behavioral Sciences, 2nd edn, Prentice Hall, Hillside, NJ.

Conole, G., et al. (2008) ‘“Disruptive technologies”, “pedagogical innovation”: what’s new? Findings from an in-depth study of students’ use and perception of technology’, Computers & Education, vol. 50, no. 2, pp. 511–524. doi: 10.1016/j.compedu.2007.09.009

Davis, F. D., Bagozzi, R. P. & Warshaw, P. R. (1989) ‘User acceptance of computer technology: a comparison of two theoretical models’, Management Science, vol. 35, pp. 982–1002. doi: 10.1287/mnsc.35.8.982

Ebenezer, J. V. (2001) ‘A hypermedia environment to explore and negotiate students’ conceptions: animation of the solution process of table salt’, Journal of Science Education and Technology, vol. 10, no. 1, pp. 73–91. doi: 10.1023/A:1016672627842

Ebenezer, J. V., et al. (2012) ‘One science teacher’s professional development experience: a case study exploring changes in students’ perceptions of their fluency with innovative technologies’, Journal of Science Education and Technology, vol. 21, pp. 22–37. doi: 10.1007/s10956-010-9277-9

Ebert, E. K., et al. (2009) ‘Learning science with inquiry in the Clark County School District’, in Inquiry the Key to Exemplary Science (Vol. 6), ed R. E. Yager, NSTA Press, Arlington, VA, pp. 253–271.

Felder, R. M. (1996) ‘Matters of style’, ASEE Prism, vol. 6, no. 4, pp. 18–23.

Felder, R. M. & Brent, R. (2017) ‘Effective teaching: workshop’, Available at: https://engineering.purdue.edu/Engr/AboutUs/Administration/AcademicAffairs/Resources/Teaching/effective-teaching.pdf

Field, A. (2009) Discovering Statistics Using SPSS, 3rd edn, Sage Publications Ltd., London.

Fraenkel, J. R., Wallen, N. E. & Hyun, H. H. (2012) How to Design and Evaluate Research in Education, 8th edn, McGraw-Hill Inc., New York, NY.

Frantom, C. G., Green, K. E. & Hoffman, E. R. (2002) ‘Measure development: the children’s attitudes toward technology scale (CATS)’, Journal of Educational Computing Research, vol. 26, no. 3, pp. 249–263. doi: 10.2190/DWAF-8LEQ-74TN-BL37

Gunuc, S. & Kuzu, A. (2014) ‘Tendency scale for technology use in class: development, reliability and validity’, Eğitimde Kuram ve Uygulama, vol. 10, no. 4, pp. 863–884. doi: 10.17244/eku.19404

Hamilton, D. N. (2007) ‘The scholarship of teaching as transformative learning’, Transformative Dialogues, vol. 1, no.1, pp. 1–6.

Hinkin, T. R. (1998) A Brief Tutorial on the Development of Measures for Use in Survey Questionnaires, Available at http://scholarship.sha.cornell.edu/articles/521

Ho, R. (2013) Handbook of Univariate and Multivariate Data Analysis with IBM SPSS, 2nd edn., CRC Press, New York, NY.

Jonassen, D. H. (2006) Modelling with Technology: Mindtools for Conceptual Change, 3rd edn., Pearson, Boston, MA.

Korur, F., Toker, S. & Eryılmaz, A. (2016) ‘Effects of the integrated online advance organizer teaching materials on students’ science achievement and attitude’, Journal of Science Education and Technology, vol. 25, no.4, pp. 628–640. doi: 10.1007/s10956-016-9618-4

Kozhevnikov, M., Gurlitt, J. & Kozhevnikov, M. (2013) ‘Learning relative motion concepts in immersive and non-immersive virtual environments’, Journal of Science Education and Technology, vol. 22, pp. 952–962, doi: 10.1007/s10956-013-9441-0

Landry, B. J., Griffeth, R. & Hartman, S. (2006) ‘Measuring student perceptions of blackboard using the Technology Acceptance Model’, Decision Sciences Journal of Innovative Education, vol. 4, no.1, pp. 87–99. doi: 10.1111/j.1540-4609.2006.00103.x

Levy, D. (2013) ‘How dynamic visualization technology can support molecular reasoning’, Journal of Science Education and Technology, vol. 22, pp. 702–717. doi: 10.1007/s10956-012-9424-6

Linn, M. C. & Eylon, B.-S. (2011) Science Learning and Instruction: Taking Advantage of Technology to Promote Knowledge Integration, Routledge, New York, NY.

Liu, C. (2018) ‘Social media as a student response system: new evidence on learning impact’, Research in Learning Technology, vol. 26, pp. 1–29. doi: 10.25304/rlt.v26.2043

Ma, W. W., Andersson, R. & Streith, K. O. (2005) ‘Examining user acceptance of computer technology: an empirical study of student teachers’, Journal of Computer Assisted Learning, vol. 21, no. 6, pp. 387–395. doi: 10.1111/j.1365-2729.2005.00145.x

Mirriahi, N. & Alonzo, D. (2015) ‘Shedding light on students’ technology preferences: implications for academic development’, Journal of University Teaching & Learning Practice, vol. 12, no. 1, p. 6.

Morgado, F., et al. (2017) ‘Scale development: ten main limitations and recommendations to improve future research practices’, Psicologia: Reflexão e Crítica, vol. 30, Article 3. doi: 10.1186/s41155-016-0057-1

Oblinger, D. (2003) ‘Boomers, gen-Xers, millennials: understanding the new students’, Educause Review, vol. 38, no. 4, pp. 37–47.

Öksüz, C., Ak, Ş. & Uca Tabak, S. (2009) ‘A perceptions scale for technology use in the teaching of elementary mathematics’, Yüzüncü Yıl Üniversitesi Eǧitim Fakültesi Dergisi, vol. 6, pp. 270–287.

Önal, N. (2017) ‘Use of interactive whiteboard in the mathematics classroom: students’ perceptions within the framework of the Technology Acceptance Model’, International Journal of Instruction, vol. 10, no. 4, pp. 67–86. doi: 10.12973/iji.2017.1045a

O’Neil, J. (1995) ‘On technology and schools: a conversation with Chris Dede’, Educational Leadership, vol. 53, no. 2, pp. 6–13.

Özel, S. (2009) Development and Testing of Achievement from Multiple Modes of Mathematical Representation: Audio, Audio-Visual, and Kinesthetic, Doctoral Dissertation, Texas A&M University.

Pallant, J. (2007) SPSS Survival Manual: A Step by Step Guide to Data Analysis Using SPSS for Windows, 3rd edn, Open University Press, Maidenhead.

Pituch, K. A. & Stevens, J. P. (2016) Applied Multivariate Statistics for the Social Sciences: Analyses with SAS and IBM’s SPSS, 6th edn, Routledge, New York, NY.

Ramma, Y., et al. (2018) ‘Teaching and learning physics using technology: making a case for the affective domain’, Education Inquiry, vol. 9, no. 2, pp. 210–236. doi: 10.1080/20004508.2017.1343606

Reigeluth, C. M. (2012) ‘Instructional theory and technology for the new paradigm of education’, RED. Revista de Educación a distancia, vol. 32, pp. 1–18. doi: 10.6018/red/50/1b

Robertson, G. (2017) ‘Developing valid and reliable survey scales’, [online] Available at: https://i2ifacility.org/system/documents/files/000/000/010/original/Developing_valid_and_reliable_survey_scales_i2i_October_2017.pdf?1507625556

Saeed, N., Yang, Y. & Sinnappan, S. (2009) ‘Emerging web technologies in higher education: a case of incorporating blogs, podcasts and social bookmarks in a web programming course based on students’ learning styles and technology preferences’, Educational Technology & Society, vol. 12, no. 4, pp. 98–109.

Şahin, M. G. & Boztunç Öztürk, N. (2018) ‘Scale development process in educational field: a content analysis research’, Kastamonu Education Journal, vol. 26, no. 1, pp. 191–199. doi: 10.24106/kefdergi.375863

Schermelleh-Engel, K., Moosbrugger, H. & Muller, H. (2003) ‘Evaluating the fit of structural equation models: tests of significance and descriptive goodness-of-fit measures’, Methods of Psychological Research, vol. 8, no. 2, pp. 23–74.

Tabachnick, B. G. & Fidell, L. S. (2013) Using Multivariate Statistics, 6th edn, Pearson, Boston.

Tekin, H. (2002) Eğitimde Ölçme ve Değerlendirme, Yargı Yayıncılık, Ankara.

Venkatesh, V. & Bala, H. (2008) ‘Technology Acceptance Model 3 and a research agenda on interventions’, Decision Sciences, vol. 39, no. 2, pp. 273–315. doi: 10.1111/j.1540-5915.2008.00192.x

Yurdugül, H. & Aşkar, P. (2008) ‘An investigation of the factorial structures of pupils’ attitude towards technology (PATT): a Turkish sample’, Elementary Education Online, vol. 7, no. 2, pp. 288–309.

Appendix

The Integrated Scale of Technology Use in Physics - ISTUP

Student Id: ………..................... Grade: ……...…........ Gender: [ ] Female [ ] Male

Introduction: Dear students,

This scale has been prepared to determine how often you use technology to learn physics, and how its use affects your academic achievement and your interest in the physics course. The scale consists of two sections. There are no right or wrong answers. Your answers will not be shared with anyone. Thank you for your contributions.

PART 1: Evaluate how the situation in each of the following items affects your interest in physics and your achievement in physics, and mark your weekly usage frequency for the situation mentioned in the item. For each item you read below, you need to make three separate markings. Please do not leave any item blank.

Learning physics (at home or at school) by My Interest in Physics My Achievement in Physics My Weekly Usage Frequency No Comment
Absolutely Decreases Decreases No Effect Increases Absolutely Increases Absolutely Decreases Decreases No Effect Increases Absolutely Increases Never (0–1 hour) Sometimes (2–10 hours) Often (11–24 hours) Usually (1–2 days) Always (more than 2 days)
SAMPLE MARKING: Using a computer X X X
1. Using word processor (Word, etc.)
2. Using spreadsheet (Excel, etc.) programs
3. Using presentation (Power point, Prezi, etc.) programs
4. Communicating with my teacher through social media (including e-mail)
5. Following the channels or related videos of teachers who are teaching physics lessons on YouTube
6. Benefiting from pages related to the course I follow on social media
7. Using educational software (The Digital Educational Platform of Turkey-EBA, Vitamin, etc.)
8. Watching animations and simulations about the lesson
9. Using technology or computer-aided laboratory equipment
10. Using the smart board
11. Using robot sets (Makey Makey, Arduino, Lego Mindstorms, etc.)
12. Following or using online Web 2.0 tools (PowToon, flip quiz, cartoon maker, story jumper, etc.)
13. Using the websites that our teacher uses and recommends in the course
14. Using applications related to the course I downloaded on my mobile phone
15. Doing technology-supported projects (eTwinning etc.)

PART 2: Indicate the technological tools or applications you want to use while learning physics by ticking the following boxes. (You can tick more than one box.)


1. Indicate the technological tool you want to use while learning physics by ticking the following boxes. (You can tick more than one box.)
☐ Robot (Arduino, Makey Makey, Lego Mindstorms, etc.) ☐ Smart Board
☐ Computer ☐ Smart Phone
☐ Virtual Reality Glasses ☐ Multifunctional Calculator
☐ 3D Printers ☐ Wearable Technologies
☐ Tablet ☐ Computer Based Laboratory Tools (sensors, motion detector, etc.)
☐ Other (.....................................................)
2. Indicate the technological applications you want to use while learning physics by ticking the following boxes. (You can tick more than one box.)
☐ Websites (EBA, Vitamin, etc.) ☐ Simulations (PhET, etc.)
☐ Mobile applications (Socrative, quizlet, etc.) ☐ Animations (animatedphysics.com, animoto, PowToon, etc.)
☐ Digital Educational Games (Kahoot, etc.) ☐ Virtual Reality Applications (HP Reveal, etc.)
☐ Social media (WhatsApp, Facebook, Instagram, etc.) ☐ QR-encoded virtual reality environments (Plickers, Gokr, HP Reveal, etc.)
☐ Electronic Printed Resources (e-library, e-books) ☐ Augmented Reality Applications (Google Sky Map, QuiverVision, etc.)
☐ Educational Videos (BBC Science Documentaries, National Geographic, etc.) ☐ Online Concept Maps (Padlet, CMap, ÇİDKOM, etc.)
☐ Interactive Videos (Wyzowyl, eduCanon, etc.) ☐ Other Web 2.0 tools (Digital stories, etc.)
☐ Lecturing Videos (YouTube-Hocalara Geldik, Tonguç Akademi, etc.) ☐ Distance Education System (online.uzaktanegitim.com etc.)
☐ Other (.....................................................)