A Multi-Stakeholder Perspective of Analytics for Learning Design in Location-Based Learning

Gerti Pishtari, María Jesús Rodríguez-Triana, Terje Väljataga
Copyright: © 2021 | Pages: 17
DOI: 10.4018/IJMBL.2021010101

Abstract

Promoted by the growing access to mobile devices and the emphasis on situated learning, location-based tools are being used increasingly in education. Multiple stakeholders could benefit from understanding the learning and teaching processes triggered by these tools, supported by data analytics. For instance, practitioners could use analytics to monitor and regulate the implementation of their learning designs (LD), as well as to assess their impact and effectiveness. Also, the community around specific tools—such as researchers, managers of educational institutions, and developers—could use analytics to further improve the tools and better understand their adoption. This paper reports the co-design process of a location-based authoring tool that incorporates multi-stakeholder analytics for LD features. It contributes to the research community through a case study that investigates how analytics can support specific LD needs of different stakeholders of location-based tools. Results emphasise opportunities and implications of aligning analytics and LD in location-based learning.

1. Introduction

Advances in mobile and wireless technologies have made it possible to extend the boundaries where learning happens. Location-based authoring tools are an example of these technologies that assist the creation of context-aware mobile learning (m-learning) activities outside the classroom (Muñoz-Cristóbal et al., 2018). Practitioners can adopt these tools to create innovative learning activities in line with their learning design (LD) goals (Burden et al., 2019). However, due to the distributed nature of learning in these environments, where learning happens across spaces (e.g., physical and virtual) and settings (e.g., formal and informal), designing, monitoring, and evaluating LDs entail additional challenges, thus affecting the practices of practitioners and other involved stakeholders (Pishtari et al., 2020). In general, research in LD has regarded the community of practitioners around specific LD tools as its main stakeholders (Hernández-Leo et al., 2019). However, understanding the teaching and learning processes supported by LD tools is a matter of interest for other stakeholders as well. For instance, understanding the usage, impact, and adoption of these tools could also help researchers, developers, or managers of educational institutions.

The research communities of LD and Learning Analytics (LA) have provided different solutions for these issues. LD can help to guide and contextualise analyses, making them more meaningful for the involved stakeholders, while analytics can inform design decisions and help to evaluate LDs (Persico & Pozzi, 2015). In the context of m-learning, only a few works have inquired into the benefits of aligning analytics and LD (Pishtari et al., 2020), and, to the best of our knowledge, none has considered all the stakeholders around a given tool with an interest in LD practices.

This paper reports the first steps of the development of a location-based authoring tool that integrates analytics for LD features (as dashboards), following a co-design process. The tool aims to support stakeholders with analytics to design innovative m-learning activities outside the classroom, as well as to understand the usage, impact, and adoption of the tool. The paper presents insights from practitioners, researchers, and managers of educational institutions about how analytics can support their specific LD needs, and it is driven by the following research questions (RQs):

  • (RQ1) What kind of information could help practitioners during the creation of LDs outside the classroom with location-based authoring tools?

  • (RQ2) What aspects of the LDs would practitioners like to monitor or assess from the students?

  • (RQ3) How would practitioners assess the effectiveness of the LDs?

  • (RQ4) What would the researchers and managers of educational institutions like to know about the creation and usage of LDs?

To answer these questions, we carried out contextual inquiries with 5 practitioners (with a main focus on RQ1-RQ3), and semi-structured interviews with 2 researchers and 2 managers (RQ4) of two location-based tools (Avastusrada and Smartzoos). Findings were grouped using the AL4LD framework (Hernández-Leo et al., 2019) and the learning context to which they pertain (see Table 1). These groupings later served as guidelines for a design workshop that produced three dashboard prototypes (each based on a different metaphor) aiming to support different stakeholders’ LD practices through analytics, as well as for their subsequent evaluation.

The rest of the paper is structured as follows: section 2 presents the related work; section 3 gives an overview of the context of the research; section 4 describes the research methodology; section 5 presents the main findings of the interviews, the prototype dashboards, and their evaluation; section 6 discusses the implications from the results; section 7 concludes the paper and gives an outlook of future work.

2. Related Work

LD, as an artifact, is the sequence of learning tasks, resources, and supports developed by practitioners that captures the pedagogical intent of a unit of study (Lockyer et al., 2013). Contributions from the LD field include representations, authoring tools, design frameworks, and methodologies that support practitioners in creating, sharing, and implementing lesson plans (Persico & Pozzi, 2015). In m-learning, contributions include authoring tools, frameworks, and tools that integrate LDs across spaces (Pishtari et al., 2019a). Examples specific to location-based learning usually come in the form of authoring tools (e.g., Muñoz-Cristóbal et al., 2018).
