THE ROLE OF LOGICAL BELIEFS IN PREDICTING CHATGPT ADOPTION AND AVOIDANCE IN HIGHER EDUCATION
DOI: https://doi.org/10.53555/dv2gy534

Keywords: ChatGPT, logical beliefs, technology adoption, avoidance, higher education, epistemic beliefs, TAM, TPB

Abstract
This conceptual research paper examines how logical beliefs—defined here as cognitive, evidential, and epistemic judgments about an innovation—shape both the adoption and avoidance of ChatGPT in higher education. While current debates emphasize ethical risks, skill impacts, and institutional policy, less attention has been paid to how faculty and students form reasoned beliefs about ChatGPT’s reliability, usefulness, and limits, and how those beliefs translate into behavioral intentions. Integrating the Theory of Planned Behavior (Ajzen, 1991), the Technology Acceptance Model (Davis, 1989), and scholarship on epistemic beliefs (Hofer & Pintrich, 1997), this paper develops a theoretical model in which logical beliefs (accuracy beliefs, transparency beliefs, evidential beliefs, and boundary beliefs) influence perceived usefulness, perceived ease of use, normative pressure, and perceived behavioral control, and thereby predict both adoption and conscious avoidance. The paper proposes an empirical mixed-methods design to test the model across multiple higher-education contexts, outlines measurement approaches, and discusses implications for policy, instructional design, and faculty development. Practical recommendations are provided to help institutions reduce unwarranted avoidance and ensure responsible uptake. Limitations and future research directions are discussed.
References
1. Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and Human Decision Processes, 50(2), 179–211. https://doi.org/10.1016/0749-5978(91)90020-T
2. Bai, H., Wang, J., & Chai, C. S. (2023). Understanding teachers’ adoption of artificial intelligence in education: A systematic review. Educational Technology Research and Development, 71(5), 2261–2287.
3. Bond, M., Bedenlier, S., Marín, V. I., & Händel, M. (2020). Emergency remote teaching in higher education: Mapping the first global online semester. International Journal of Educational Technology in Higher Education, 17(44), 1–24.
4. Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319–340. https://doi.org/10.2307/249008
5. Dietvorst, B. J., Simmons, J. P., & Massey, C. (2015). Algorithm aversion: People erroneously avoid algorithms after seeing them err. Journal of Experimental Psychology: General, 144(1), 114–126. https://doi.org/10.1037/xge0000033
6. Dwivedi, Y. K., Hughes, L., Ismagilova, E., et al. (2021). Artificial intelligence (AI): Multidisciplinary perspectives on emerging challenges, opportunities, and agenda for research. International Journal of Information Management, 57, 101994.
7. Ertmer, P. A. (1999). Addressing first- and second-order barriers to change: Strategies for technology integration. Educational Technology Research and Development, 47(4), 47–61. https://doi.org/10.1007/BF02299597
8. Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention, and behavior: An introduction to theory and research. Addison-Wesley.
9. Gefen, D., Karahanna, E., & Straub, D. W. (2003). Trust and TAM in online shopping: An integrated model. MIS Quarterly, 27(1), 51–90.
10. Hofer, B. K., & Pintrich, P. R. (1997). The development of epistemological theories: Beliefs about knowledge and knowing and their relation to learning. Review of Educational Research, 67(1), 88–140. https://doi.org/10.3102/00346543067001088
11. Kasneci, E., Sessler, K., Küchemann, S., et al. (2023). ChatGPT for good? On opportunities and challenges of large language models for education. Learning and Individual Differences, 103, 102274.
12. King, W. R., & He, J. (2006). A meta-analysis of the technology acceptance model. Information & Management, 43(6), 740–755.
13. Marikyan, D., & Papagiannidis, S. (2023). Unified theory of acceptance and use of technology (UTAUT) revisited: A review and research agenda. Journal of Business Research, 154, 113337.
14. OpenAI. (2022). Introducing ChatGPT. https://openai.com/blog/chatgpt
15. Rogers, E. M. (2003). Diffusion of innovations (5th ed.). Free Press.
16. Selwyn, N. (2016). Education and technology: Key issues and debates (2nd ed.). Bloomsbury Academic.
17. Shin, D. (2021). The effects of explainability and causability on trust in AI systems. Telematics and Informatics, 58, 101494.
18. Sun, Y., & Zhang, P. (2006). The role of affect in information systems research. Communications of the Association for Information Systems, 17(1), 295–329.
19. Teo, T. (2011). Factors influencing teachers’ intention to use technology: Model development and test. Computers & Education, 57(4), 2432–2440.
20. Trust, T., Whalen, J., & Mouza, C. (2023). ChatGPT: Challenges, opportunities, and implications for teacher education. Journal of Digital Learning in Teacher Education, 39(2), 63–69.
21. Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: Toward a unified view. MIS Quarterly, 27(3), 425–478. https://doi.org/10.2307/30036540
22. Venkatesh, V., Thong, J. Y. L., & Xu, X. (2012). Consumer acceptance and use of information technology: Extending the unified theory of acceptance and use of technology. MIS Quarterly, 36(1), 157–178.
23. Wang, Y., & Wang, Y. (2010). Determinants of e-learning adoption in higher education. Information & Management, 47(3), 146–154.
24. Zawacki-Richter, O., Marín, V. I., Bond, M., & Gouverneur, F. (2019). Systematic review of research on artificial intelligence applications in higher education. International Journal of Educational Technology in Higher Education, 16(39), 1–27.
25. Zhai, X. (2022). ChatGPT user experience: Implications for education. Educational Technology & Society, 25(4), 1–15.






