James Elder Christie: The Pied Piper of Hamelin (public domain, CC0, Wikimedia Commons)
A long-standing argument in the educational technology debate has been that digital platforms and applications can widen access to education. To a large extent this has been achieved: open access to scientific articles, open educational resources published by institutions around the world, global collaboration projects such as Wikipedia, and the widespread use of learning management systems, video conferencing platforms and networking tools. The options for studying from home, even in sparsely populated regions, have increased dramatically, and it would be easy to claim that the mission of widened access to education has been accomplished.
However, as the power and sophistication of educational technology have increased, new inequalities and biases have appeared, and this is the topic of an excellent article by Laura Czerniewicz, Multi-layered digital inequalities in HEIs: the paradox of the post-digital society. Although millions of students and teachers adapted successfully to online education during the pandemic, many suffered. Many students had poor internet access or could not afford to pay for it. Many had to study on only a mobile phone or share one device with the rest of the family, and in many countries reliable electricity is still a dream. The rollout of modern edtech assumes that everyone has the right devices, internet access and reliable electricity, and thus the digital divide is further accentuated.
The digital divide is alive and well; indeed the digital paradox is that even as the basics of the divide are addressed through access, more complex layers of exclusion are added; digital inequalities thus morph into new complicated forms. Nevertheless, fair and equitable technological infrastructure is the foundation of inclusion in HE: electricity, devices, ubiquitous connectivity and cheap data. These are essential but insufficient.
The pandemic has also forced educational institutions to invest in new technology, and the urgency has meant that many have overlooked the serious implications for user privacy associated with much of today's IT industry. The growth of surveillance capitalism, as defined by Shoshana Zuboff, is particularly worrying, with the big five tech giants (Amazon, Google, Microsoft, Apple, Facebook) collecting and storing data about so much of our daily lives.
The intensive digitalisation catalysed by the pandemic and concomitant “online pivot” means that HE is in danger of fast becoming a site of surveillance capitalism, with the concomitant dangers for equity, little transparency and unequal terms of engagement.
With such an upsurge in educational technology, many institutions have signed major contracts with companies without questioning what those companies will do with all the student data they gather. We all happily accept the terms and conditions, cookie settings and other privacy settings without really understanding the implications; no matter how digitally literate we may be, those settings are not designed to be understood. The hidden biases of algorithms can further exclude some student groups, especially when combined with the rise of artificial intelligence in tracking student progress, identifying weaknesses and detecting cheating.
The amalgamation of the digital into higher education, through the dominant extractive economy, introduces complex and often invisible power dynamics into public higher education. The terms of engagement are imbalanced, hidden behind dense language and easy promises. There are especially profound implications for those with barriers to participation at individual and institutional levels. This has introduced several new inequities into the student experience and the sector.
Czerniewicz presents some strategies to prevent this rather dystopian development. Firstly, institutions need to develop better platform literacy and become more able to demand safeguards from technology providers. We need to decide whether the risks of buying a platform outweigh the advantages - no matter how cool it may be. Many technology decisions are governed by FOMO (fear of missing out), and this tendency must change. She advocates iterating towards equality: developing strategies to identify inequalities and minimising risks as far as possible, i.e. developing equitable, ethical data policies and frameworks.
In recent years, schools have adopted technologically-augmented surveillance tools, some of which rely on predictive analysis and automated responses to flag suspicious behavior, trigger investigative or disciplinary action, or prioritize crisis response resources. The surveillance tools can have a disproportionate impact on disabled students and students of color - and likely on disabled students of color in particular.

Although the European General Data Protection Regulation (GDPR) has increased our awareness of some of these dangers, there is still a need for a deeper understanding of the consequences of the IT systems we use. If we are not careful we risk following the Pied Piper into the cave, dancing merrily to his tune. Sometimes we must dare to say no to the lure of technology.