Effects of Computer-Based Training in Computer Hardware Servicing on Students' Academic Performance

Rex Perez Bringula, John Vincent T. Canseco, Patricia Louise J. Durolfo, Lance Christian A. Villanueva, Gabriel M. Caraos
DOI: 10.4018/IJTESSS.317410

Abstract

This study determined the effects of computer-based training in computer hardware servicing, delivered with a pedagogical agent named "DAC: The Builder," on the academic performance of computing students. Fifty-six university students (30 in the control group, 26 in the experimental group) participated in a two-week experiment. The majority of the experimental group exhibited gaming-the-system behavior but reduced it after the DAC intervention. The data showed that the null hypothesis of no significant difference between the pretest and posttest scores of the experimental group could be rejected. Moreover, the hands-on posttest scores of the two groups differed significantly. The study demonstrates that returning students to the lecture when they exhibit gaming-the-system behavior is an effective way to discourage that behavior. The use of DAC is therefore recommended for students taking computer hardware servicing. Implications and recommendations are also discussed.

1. Introduction

Computer hardware servicing is a technical skill in which students learn computer assembly, computer troubleshooting, software installation, system configuration, and computer maintenance (De Jesus, 2019). From basic secondary school to computer-related courses in tertiary education, instruction in computer hardware servicing is fundamental to computer education (Hsu & Hwang, 2014). However, the course poses challenges. The difficulty students experience in assembling a computer stems not only from a lack of practice but also from insufficient assistance and materials (Hwang et al., 2011). For example, to understand the functions of a motherboard, students need to see a fully functional one, so the ideal teaching method is to let them work with a functional motherboard. However, it is highly impractical to dismantle a working computer just to expose the motherboard. Moreover, providing individualized feedback to every student is tedious and time-consuming (Botarleanu et al., 2018).

One way to address these issues is to employ computer-based training (CBT) software (subsequently referred to as the software) for computer hardware servicing (De Jesus, 2019). However, prior work (e.g., De Jesus, 2019) included neither interventions for students who are gaming the system (GTS), a deliberate behavior of exploiting the system to obtain correct responses rather than learning the material (Baker et al., 2008), nor assistance from a pedagogical agent. This study was conceived to address these gaps: it developed computer hardware servicing software for computing students (Information Technology, Computer Science, and Information Systems) with a pedagogical agent capable of detecting GTS (a detection sketch follows the research questions below). Specifically, the study aims to answer the following research questions (RQ):

1. What is the software utilization of the students in the experimental group in terms of the number of lectures taken, time spent on the hands-on activities, number of hands-on errors, time spent gaming the system, and the lesson where GTS was observed?

2. What are the hardware servicing academic performances of the students in the control and experimental groups in terms of pretest scores, posttest scores, time spent on the hands-on activities, and number of hands-on errors?

3. Is there a significant difference between the academic performances of the students in the control and experimental groups in terms of pretest scores, posttest scores, time spent on the hands-on exercises, and number of hands-on errors?
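To make the GTS intervention concrete, the sketch below illustrates one way an agent like DAC could flag gaming behavior and trigger the return-to-lecture response described in the abstract. The article does not publish DAC's actual detection rule; the timing heuristic (consecutive rapid wrong answers, in the spirit of Baker et al.'s, 2008, characterization of GTS), the thresholds, and all names in this Python sketch are illustrative assumptions.

    from typing import Optional

    # Illustrative thresholds -- the article does not disclose DAC's actual rule.
    MIN_GAP_SECONDS = 3.0  # attempts faster than this suggest guessing, not reading
    RAPID_LIMIT = 3        # consecutive rapid wrong attempts before intervening

    class GTSDetector:
        """Flags gaming-the-system behavior from answer-attempt timing."""

        def __init__(self) -> None:
            self.last_time: Optional[float] = None
            self.rapid_wrong = 0

        def attempt(self, timestamp: float, correct: bool) -> bool:
            """Record one answer attempt; return True when GTS is suspected."""
            rapid = (self.last_time is not None
                     and timestamp - self.last_time < MIN_GAP_SECONDS)
            self.last_time = timestamp
            self.rapid_wrong = self.rapid_wrong + 1 if (rapid and not correct) else 0
            return self.rapid_wrong >= RAPID_LIMIT

    detector = GTSDetector()
    for t, ok in [(0.0, False), (1.2, False), (2.1, False), (3.0, False)]:
        if detector.attempt(t, ok):
            print(f"t={t}s: GTS suspected -- returning the student to the lecture")

Returning the student to the lecture, rather than merely blocking further input, corresponds to the intervention the study found effective for discouraging GTS.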

The following null hypotheses were tested in this study (a minimal testing sketch follows the list):

  • H0a: There is no significant difference in the pretest scores of the experimental and control groups.

  • H0b: There is no significant difference in the posttest scores of the experimental and control groups.

  • H0c: There is no significant difference in the time spent on the hands-on activities of the experimental and control groups.

  • H0d: There is no significant difference in the number of hands-on errors committed by the experimental and control groups.

  • H0e: There is no significant difference in the pretest and posttest scores of the students in the control group.

  • H0f: There is no significant difference in the pretest and posttest scores of the students in the experimental group.
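
The article preview does not state which statistical tests were used, so the sketch below simply illustrates one plausible analysis: independent-samples t-tests for the between-group hypotheses (H0a–H0d) and paired t-tests for the within-group pretest/posttest hypotheses (H0e, H0f). The choice of test, the alpha level, and the placeholder score lists are all assumptions, not the study's data; H0c and H0d would follow the same between-group pattern with time and error counts in place of scores.

    from scipy import stats

    ALPHA = 0.05  # assumed significance level; not stated in the preview

    def decide(label: str, p_value: float) -> None:
        verdict = "reject" if p_value < ALPHA else "fail to reject"
        print(f"{label}: p = {p_value:.3f} -> {verdict} the null hypothesis")

    # Placeholder scores (the study had n=30 control, n=26 experimental).
    control_pre, control_post = [12, 14, 11], [15, 16, 14]
    experimental_pre, experimental_post = [13, 12, 14], [19, 20, 18]

    decide("H0a (pretest, between groups)",
           stats.ttest_ind(experimental_pre, control_pre).pvalue)
    decide("H0b (posttest, between groups)",
           stats.ttest_ind(experimental_post, control_post).pvalue)
    decide("H0e (control, pre vs post)",
           stats.ttest_rel(control_pre, control_post).pvalue)
    decide("H0f (experimental, pre vs post)",
           stats.ttest_rel(experimental_pre, experimental_post).pvalue)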
