
What if I could predict your future from the colour of your shoes?


From Janine Arantes

This post is part of a short series based on a day-long symposium on ‘digital education futures’, hosted by DER in February 2019. The previous posts are here and here. In this short article, Janine Arantes writes about discrimination and bias in algorithms, exploring the implications for education.

We are embarking on the age of the impossible-to-understand reason, when marketers will know which style of shoe to advertise to us online based on the type of fruit we most often eat for breakfast, or when the police know which group in a public park is most likely to do mischief based on the way they do their hair or how far from one another they walk. (Paul Ohm (2012) p.81)

Algorithms and predictive modelling are already present in the K-12 space. Predictive modelling is the practice of algorithmically forecasting future trajectories from past patterns across diverse sets of data. It surfaces correlations in the data and, on that basis, predicts the likelihood of future behaviour. Algorithms are simply mathematical rules used to interpret and make sense of data too large to be analysed by mainstream databases (Big Data). Predictive modelling is fairly ubiquitous now: it is partially how Google and Yahoo provide predictive search results, and in doing so shape or modulate what we find. Other apps and platforms offer insights and recommendations to teachers about academic and/or socio-emotional aspects of learning.
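As a rough sketch of the idea, a predictive model can be as simple as counting which past observations correlated with which outcomes, then predicting the most correlated outcome for a new case. The features, outcomes and records below are all invented for illustration; they are not drawn from any real app or platform.

```python
# A minimal, illustrative sketch of predictive modelling: learn
# feature-outcome correlations from past records, then predict.
from collections import Counter, defaultdict

def train(records):
    """Count how often each observed feature co-occurred with each outcome."""
    counts = defaultdict(Counter)
    for features, outcome in records:
        for f in features:
            counts[f][outcome] += 1
    return counts

def predict(counts, features):
    """Predict the outcome most often correlated with the given features."""
    tally = Counter()
    for f in features:
        tally.update(counts[f])
    outcome, _ = tally.most_common(1)[0]
    return outcome

# Hypothetical past records: (observed features, later outcome).
history = [
    ({"missed_homework", "low_attendance"}, "dropped_out"),
    ({"missed_homework"}, "completed"),
    ({"low_attendance"}, "dropped_out"),
    ({"high_attendance"}, "completed"),
]
model = train(history)
print(predict(model, {"low_attendance"}))  # -> dropped_out
```

Note that the model only reports correlation, not causation: a student with low attendance is *predicted* to drop out simply because past students with that feature did.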

The recent Digital Educational Futures Workshop at Monash University discussed some of the implications of predictive modelling in the K-12 space. The workshop aimed to explore this topic from a critical stance. From such a positioning there are multiple perspectives that could be explored, both supporting and challenging the presence of predictive modelling and algorithms in K-12 classrooms. For example, predictive modelling has offered teachers the ability to analyse data, forecast when an ‘at risk’ student is likely to drop out of a course, and gain insights into how best to work with students’ personal interests and with those who have additional needs. From this perspective, predictive modelling could be seen as an essential tool that prompts teachers to act, intervene and then personalise their educational practice as a result. Inhibiting such innovation may diminish equal and accessible education for all.

Acknowledging that there is potential for significant benefit, what are some of the implications from a critical stance? Wider research shows that predictive modelling can constitute discrimination (Cheney-Lippold, 2011), influence judicial decisions (Lightbourne, 2014) and politics (Shorey & Howard, 2016), and increase intellectual isolation (Pariser, 2011). Discrimination and bias can become apparent in predictive modelling outputs when ‘training’ the algorithm to find correlations in the datasets. That is, to make predictions, multiple datasets are aggregated from multiple contexts, allowing interactions between datasets and distinctly different spaces to occur. Should there be historical or social biases in the training data, the predictions will perpetuate the same biases. Secondly, data in one context may be completely different to data in another, so what data represents ‘you’ may vary. Consider the answer you might give to “What are your strengths and weaknesses?” on LinkedIn, versus on Tinder.
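The point about biased training data can be made concrete with a toy example. Suppose past decisions flagged one group as ‘at risk’ more often than another for the same behaviour; a model trained on those decisions simply learns and repeats that pattern. The groups, labels and records here are invented purely for illustration.

```python
# A toy sketch of how historical bias in training data carries
# straight through into predictions.
from collections import Counter

def train(records):
    """For each group, tally the outcomes it was historically assigned."""
    counts = {}
    for group, outcome in records:
        counts.setdefault(group, Counter())[outcome] += 1
    return counts

def predict(counts, group):
    """Predict the outcome most often assigned to this group in the past."""
    return counts[group].most_common(1)[0][0]

# Hypothetical past decisions: group "B" was flagged 'at risk' more often.
history = [
    ("A", "not_at_risk"), ("A", "not_at_risk"), ("A", "at_risk"),
    ("B", "at_risk"), ("B", "at_risk"), ("B", "not_at_risk"),
]
model = train(history)
print(predict(model, "A"))  # -> not_at_risk
print(predict(model, "B"))  # -> at_risk  (the historical bias, reproduced)
```

Nothing in the code is ‘prejudiced’; the bias lives entirely in the historical records the model was trained on, which is exactly why it is so hard to spot in deployed systems.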

There are myriad other opportunities and challenges in exploring predictive modelling and algorithms from a critical perspective. This blog aims to touch on only a handful, and invites you to become part of the conversation to explore various critical ideas in more depth. Where predictive modelling may be hugely beneficial for some in the classroom, it may also unwittingly discriminate against minority groups. Therefore, there is a need to debate the presence of algorithms and predictive modelling in K-12 classrooms.

Q: How has predictive modelling been discussed at your school?

About the Author:

Janine Arantes is a PhD student from the University of Newcastle who is currently seeking Australian K-12 teachers who use apps and platforms as part of their educational practice. More information here: ‘Apps in Australian Classrooms’.

Follow Janine on Twitter @Aldous2018