In the article "A Theory of Online Learning as Online Participation," Stefan Hrastinski (2008) presents empirical evidence that online participation drives online learning. Hrastinski points to numerous studies showing that interacting with others yields more benefits than individualistic approaches to participation (Alavi, 1994; Brown & Palincsar, 1989). Benefits include spending more time synthesizing ideas, integrating new concepts, problem solving, and critical thinking. In addition to these skills, time spent working cooperatively during online participation has a significant positive impact on student achievement. According to several studies, greater interaction among peers also equated to more favorable outcomes. In addition to the benefits of cooperative learning online, evidence suggests that online learning improves when learners participate and collaborate, and that learners who participated in collaborative or group learning achieved learning outcomes as high as or higher than those in traditional settings. In summary, Hrastinski (2008) states that "the research reviewed here suggests that online participation drives better learning outcomes, at least when learning is measured as perceived learning, grades, tests, and quality of performances and assignments" (p. 79).
So what is online learner participation? Following the presentation of evidence, Hrastinski discusses what online learner participation is and how one might conceptualize it. Hrastinski notes that Wenger's (1998) and Webster's definitions of the concept align. Participation is then defined as "to have or take a part or share with others." While researchers agree that participation is an integral part of online learning, how the term is defined in this context differs greatly. Some scholars argue that participation is simply the number of times a learner engages in an aspect of learning, while others argue that participation is a "complex phenomenon" and "is a process of learning by taking part and maintaining relations with others" (p. 80). Regardless of which definition one subscribes to, Hrastinski notes that there are several key characteristics of online learner participation. First, participation is a complex process that requires one to maintain relations with others. Second, participation is supported by physical and psychological tools. Third, participation is not synonymous with talking or writing. And finally, participation is supported by all kinds of engaging activities. Hrastinski concludes with a call to action of sorts, stating that if we want to improve online learning, we need to improve online learner participation.
In the editorial "TPACK -- time to reboot?," Saubern et al. (2020) argue that while TPACK has proven to be a highly popular foundation for educational technology research, the framework continues to be critiqued today. Specifically, critiques continue to explore the definition and delineation of the construct components, the relationships between construct components, measurement and validation, the predictive and prescriptive value of the framework, and the relationship between TPACK and practice (Angeli et al., 2016). This editorial reflects on how TPACK scholarship is continuing to evolve, with a focus on two key questions:
Using the search terms "TPACK" and "TPCK" to search the AJET website, 44 papers were reduced to 20 that used TPACK substantially as a theoretical or methodological base. The research problem, purpose, and research question for each paper were identified, and the discussions and conclusions were analyzed in relation to the framework's original goals. A common thread found throughout all of the papers is that "to use technology effectively in teaching and learning, teachers must integrate knowledge of technology with knowledge of pedagogy and content" (p. 3). However, Saubern et al. (2020) explain that a shift in the research occurs between the identification and the application of the TPACK framework. Rather than focusing on the integrated knowledge base of the framework, much of the research examines the several components of the framework separately. They note that few papers, if any, have investigated what it means to develop the knowledge that can only emerge from the application of all three components, or the relationship between this knowledge and what is considered effective classroom practice. The central idea of TPACK, Saubern et al. (2020) argue, seems to have slipped away. The editorial claims that the problem lies in the TPACK diagram because it encourages study of the seven components "as if they are the thing that we are interested in" (p. 5). Analysis of the 20 papers reveals that a great deal of time and energy has been spent on validating the structure of TPACK rather than focusing on the integrated knowledge base that emerges from integrating the seven components. For this reason, the authors argue for a reboot in TPACK research: a refocus on understanding the specialist form of knowledge that emerges when knowledge of technology is integrated with pedagogical knowledge and content knowledge.
The editorial concludes with a call to action for researchers to engage with the critical question to which TPACK may provide insight: how best to improve teaching and learning with technology? This editorial provides a strong meta-analysis of the existing research surrounding TPACK. The claims made by Saubern et al. (2020) are significant in that they highlight an area of research that has yet to be examined. Through careful analysis of 20 papers, they make a convincing claim that if researchers rebooted TPACK research to focus on the integration of the seven components and how they work together to allow a specialized, integrated knowledge base to emerge, the answer to how best to improve teaching and learning with technology may also emerge. The call to action in the conclusion provides future-facing language that is helpful for researchers looking to make new contributions to the field while building on the work of those who came before. As the authors mention, those who have spent time examining best-practice models and frameworks for technology integration are familiar with TPACK. What I found particularly interesting was how accurate the authors' argument about the use of the framework's diagram is. I have been introduced to the model by multiple instructors, and yet few have talked about the specialized, integrated knowledge that emerges at the intersection of all seven components. Instead, the all-too-common introduction is an oversimplification of the three intersecting circles. While I'm not particularly interested in exploring how TPACK can be used to answer the question of how best to improve teaching and learning with technology, the meta-analysis of the 20 papers certainly makes the case for someone who is interested in pursuing this line of thinking. References
Angeli, C., Valanides, N., & Christodoulou, A. (2016). Theoretical considerations of technological pedagogical content knowledge. In M. Herring, M. J. Koehler, & P. Mishra (Eds.), Handbook of technological pedagogical content knowledge (2nd ed., pp. 21–42). Routledge.
Saubern, R., Henderson, M., Heinrich, E., & Redmond, P. (2020). TPACK: Time to reboot? Australasian Journal of Educational Technology, 36(3), 1–9.
Güzey et al. (2021) discuss their investigation of the impact of the tool Edpuzzle on achievement in an undergraduate Assessment and Measurement course. Edpuzzle is a tool that enables technology-enhanced assessment. The tool became more heavily used during the pandemic, when schools went remote and instructors wanted to track seat time and better understand how students were engaging with their content. The study employed an exploratory research design with a pre- and post-test control group. Güzey et al. (2021) explain that tools like Edpuzzle provide a practical approach to technology-enhanced assessment because the instructor can create enhanced item types without coding. Users can quickly create new content or customize existing content based on their objectives. Edpuzzle also provides immediate feedback and scoring for students and enhanced reporting for the instructor. Güzey et al. (2021) describe the affordances of using Edpuzzle that have been discussed in the literature. Specifically, the researchers claim that the research shows that Edpuzzle increases student motivation, helps students construct conceptual knowledge, improves academic achievement while reducing cognitive load, and supports self-regulation skills. Having identified Edpuzzle as a web 2.0 tool, the researchers also point out the positive impact web 2.0 tools have been linked to having on student achievement in the literature. For this reason, this study sought to elaborate on the impact of using Edpuzzle, a web 2.0 tool, in an undergraduate course.
Güzey et al. (2021) took a pre- and post-test matched control groups design approach. One hundred sixty-six undergraduate students participated in the study. Groups were matched based on their teaching assignments and randomly assigned as control and experimental groups. The control group was made up of science and social studies teacher candidates, while the experimental group was made up of math and Turkish language teacher candidates. An achievement test from a nationwide public personnel selection exam used in earlier years was used to collect achievement data for the Assessment and Measurement course. Quizzes consisted of multiple-choice and short-response open-ended questions and were prepared for the following topics: fundamentals, statistics/central tendency, reliability, and validity. A rubric was used to evaluate students' performance on a performance assessment that incorporated writing items on a chosen topic and developing an achievement test. The analytics gathered from student engagement with the tool were used for analysis. A pre-test was given to both groups before the experimental intervention. Those in the experimental group watched videos and took quizzes in Edpuzzle each week. Student performance was used to identify gaps in the learning. Students in the experimental group also received corrective feedback and were exposed to instructional strategies like group activities, presentations, discussions, peer learning, and question-answer sessions. In the control group, the instructor taught in a traditional way. The results showed no significant difference between the control and experimental groups on their pre-test scores. However, there was a significant difference between the groups on their post-test scores (medium-level effect size). The experimental group also performed significantly better than the control group on the topics of fundamental concepts, central tendency, central dispersion, and validity.
The groups did not perform significantly differently on the performance assessment task. Güzey et al. (2021) conclude by explaining that the results suggest the group that utilized Edpuzzle as a web 2.0 tool achieved significantly better results than the control group that did not use the web 2.0 tool. Potential reasons for the non-significant difference in student achievement on the performance tasks are suggested as an avenue for future studies. Students in the experimental group received corrective feedback and were exposed to instructional strategies like group activities, presentations, discussions, peer learning, and question-answer sessions. In the control group, Güzey et al. (2021) state that the instructor taught in a traditional way. What "traditional" instruction is and looks like is not clearly defined by the researchers. The researchers also fail to recognize the limitations of the study. The research focused on adult learners participating in an undergraduate teaching course that incorporated opportunities for face-to-face learning experiences. These results cannot be generalized to adolescents or online teaching environments. This could also be mentioned as an area for future research, since it was noted in the introduction that interest in the tool stemmed from a desire to make sure students were really watching videos during the pandemic. While this behavior is not limited to adolescents, it does seem more characteristic of them than of adult learners. While web 2.0 tools are not an area of research I am interested in pursuing for my dissertation, I was interested in finding out what the research says about tools like Edpuzzle, Playposit, and Discovery Education Experiences video quizzes. Teachers have gravitated toward these tools because of their analytics, reporting, and progress-monitoring functionality, but little is discussed about their impact on teaching and learning.
While this study does show that Edpuzzle had a positive impact on undergraduate students' achievement in an Assessment and Measurement course, I'm not convinced these same results could be replicated in a K-12 environment. I also think that some concepts lend themselves better to video instruction, and some learners prefer to engage with the learning through video while others do not. This leaves a great deal of gray area in terms of the value tools like Edpuzzle have for teaching and learning.
Reference
Sulak Güzey, S., Akdoğdu Yıldız, E., Demir, M. C., & Aksu-Dünya, B. (2022). Impact of Edpuzzle use on the Assessment and Measurement course achievement. Journal of Hasan Ali Yücel Faculty of Education (HAYEF), 19(1), 52–60. https://doi.org/10.5152/hayef.2021.21045
Researchers have found that several problems emerge when a project-based learning approach is applied to large classrooms, for instance, difficulty motivating students, concentrating on the learning tasks, making connections between new content and prior knowledge, and effectively engaging in cooperative learning activities. While technology has resolved many of the problems related to the efficiency of cooperative learning activities, barriers to project-based learning such as promoting students' motivation and concentration on the learning tasks persist. Another challenge is the need to develop an effective instructional strategy for conducting project-based learning activities. Researchers have identified storytelling as an effective approach to overcoming these challenges. For this reason, Hung et al. (2012) explain that their study "aims to propose a project-based digital storytelling approach, which combines project-based learning and digital storytelling strategies to conduct learning activities for elementary school students" (p. 369). The experiment conducted sought to verify the effects of a project-based digital storytelling approach on enhancing students' motivation, problem-solving skills, and overall achievement. Through a review of the related literature, Hung et al. (2012) present several understandings about project-based learning. First, project-based learning is an instructional strategy that appeals to students. Second, project-based learning enhances collaboration and cooperation between group members, which in turn reinforces learning cognition and promotes learning achievement.
In addition to student appeal and enhanced collaboration and cooperation, understandings about the design of project-based learning experiences are discussed as well. For instance, scholars note that situations need to be established to guide students when designing project learning activities. In this study, Hung et al. (2012) integrate the digital storytelling strategy into the project-based learning approach. Digital storytelling as a strategy integrates multimedia applications and software with storytelling techniques to help learners become involved in a learning situation. Hung et al. (2012) note that in previous studies digital storytelling has been perceived as an effective strategy for encouraging cooperation and knowledge construction in classrooms. This study is significant because previous studies have not researched the effects of integrating digital storytelling and PBL on problem-solving competence and learning achievement. For this reason, this study investigates how digital storytelling can be implemented to develop PBL learning experiences that include taking pictures with digital cameras, developing a storyline using the captured images, producing a video with a soundtrack, and presenting the final product. A quasi-experiment with a pre-test and post-test non-equivalent groups design was conducted. The different learning modes served as the independent variables, with the experimental group participating in PBL with digital storytelling and the control group experiencing the traditional approach to PBL. The science learning motivation scale, the problem-solving competence scale, and the science course achievement test represented the dependent variables. In relation to the effect of project-based learning with digital storytelling on science learning motivation, the score of the experimental group proved superior to that of the control group. Hung et al. (2012) explain that this result shows that project-based learning with digital storytelling can effectively enhance the science learning motivation of students. Similarly, the effect of project-based learning with digital storytelling on problem-solving competence proved superior for the experimental group. From this result, Hung et al. (2012) again draw the conclusion that digital storytelling can effectively enhance the problem-solving competence of students. Finally, the results reveal that project-based learning with digital storytelling can also effectively enhance students' science learning achievement. No significant differences were found when looking at the results in relation to gender. Hung et al. (2012) conclude with a discussion of the limitations, noting it could be difficult to generalize the findings to other courses or subjects because this study was done in an elementary school science course. The software used in this study was also highly structured; more research would need to be done on the effects of PBL with digital storytelling using flexible software applications.
Hung et al. (2012) begin the introduction with a broad problem in the field. Specifically, they explain that "researchers have tried to develop various computerized systems or guiding strategies to assist students in improving their learning performance" (p. 368). However, effective instruction is required to cultivate key competencies in students, especially instruction that leverages technology to promote student-centered learning. A comprehensive review of related project-based learning and digital storytelling literature from the preceding decade follows. The quasi-experiment's pre-test and post-test non-equivalent groups design, with the different learning modes as the independent variables, brings into question whether the characteristics of the participants in the elementary science course provide a different explanation for the outcomes.
This is because in non-equivalent group designs researchers select groups that appear similar. I found the investigation into the effects of a project-based digital storytelling approach on enhancing students' motivation, problem-solving skills, and overall achievement interesting. Having taught a digital literacy and media design course that embraced PBL for ten years, I could relate to the difficulties with implementing project-based learning that were outlined in the introduction and review of related literature. Project-based learning places the teacher in the role of the guide on the side while students engage in inquiry and grapple with complex problems. This type of learning environment can be very engaging for some students, while other students tend to find the lack of structure overwhelming and experience difficulty focusing on the learning goals. Hung et al. (2012) note that the results of this study might be difficult to generalize because the technology used in the study was highly structured. Given that the ISTE Standards for Students promote students choosing their own technology applications based on a target audience and their message, I would be interested in finding out if the results of this study could be replicated when students are able to choose the technology they use to engage in the process of digital storytelling. Reference
Hung, C. M., Hwang, G. J., & Huang, I. (2012). A project-based digital storytelling approach to improving students' learning motivation, problem-solving competence and learning achievement. Educational Technology & Society, 15(4), 368–379.
Participants did not require any technical support or onboarding to design hypermedia at the beginning of the study. The majority of students had graduated from computer departments in vocational or technical high schools. An interview schedule was developed to collect qualitative data on the students' opinions and perceptions of the use of hypermedia as a cognitive tool in a constructivist learning environment. This schedule consisted of sixteen questions around the following topics:
Overall Assessment of the Constructivist Learning Environment
Developing Hypermedia
Group Work
Learning Activities Conducted in the Classroom
Assessment of the Students' Performance
Eight groups were interviewed in total at the end of the semester. The eight interviews were transcribed into interview files for content analysis. A predetermined set of terms from the literature was used for coding, which in turn was used to classify and organize the data. Themes were then applied to make meaningful connections among the codes. Finally, the coded data was given a description under the themes of problems and difficulties faced by students for interpretation and discussion.
Based on these results, the study reveals that "students found using hypermedia as a cognitive tool to be effective for constructing an understanding of the content" (p. 115). Yildirim notes that the amount of content covered in a learning environment of this nature must be taken into account. Another result of this study relates to group performance: the results show that most of the students benefited from the group work. Yildirim suggests that instructors provide more learning activities to make the strategy more effective. He concludes with a suggestion for future research, stating that "further research studies are needed to examine the full effects of hypermedia as a cognitive tool for knowledge acquisition, in comparison to traditional classroom instruction and other computer-based cognitive tools" (p. 116). Reference
Yildirim, Z. (2005). Hypermedia as a cognitive tool: Student teachers' experiences in learning by doing. Journal of Educational Technology & Society, 8(2), 107–117.