Getting the middle finger from your audience isn’t typically a desired outcome for a talk, but it was exactly the type of reaction I was hoping for during my keynote at the Free Software Foundation’s LibrePlanet conference in mid-March.
You see, I had just asked a packed room of Free Software enthusiasts to agree to use a few edtech tools that would help support the learning experience of my talk. There was nothing to worry about, I assured them. These tools – a learning management system, a plagiarism detection tool, and a remote proctoring service – were standard at thousands of educational institutions around the world. I explained how their brilliant features – like facial recognition technology, user data collection, and keystroke tracking – are revolutionizing learning by generating personalized learning analytics, ensuring fairness, and wiping clean all of the annoying frictions of the day-to-day gruntwork of education.
So, I said to the audience, I just need everyone to nod in agreement to using these tools before continuing. That’s when the head shaking and appalled murmuring began. You’d think I had just asked the audience if they would voluntarily surrender the use of their limbs or the privacy of their thoughts. And in fact, in some ways, I had. To an audience devoted to the freedoms to run, study, modify, and share code, the surveillant, controlling edtech tools I had introduced likely looked like violations of basic human rights.
I stayed with the discomfort for as long as I could and then admitted I was joking. I was just trying to make a point. If the attendees had been among the millions of students around the world required to use these tools in their courses, refusal would have meant opting out of their educational institution altogether. As a result, these tools and the invasive, controlling practices they enable have been normalized, and they are teaching a vast segment of software users that digital surveillance and control are nothing to worry about. This “lesson” has in fact been so effective that many students have become resigned to digital surveillance, believing, as one research paper noted, that “concerns about privacy are akin to believing conspiracy theories.” At a moment when education should be preparing students to question and resist digital surveillance and manipulation, we are teaching them to be passive users, complacent with technology as it is given.
For many, this is just what edtech looks like, and given the lack of visible alternatives, it can be hard to imagine it being otherwise. In my research, however, I’ve found that edtech can be wildly more creative and intellectually stimulating – and far less dystopian – when academic communities have the freedom and resources to build, study, modify, and share their edtech tools. I spent the rest of my talk making this case by pointing to historical and present-day edtech projects that do just this in spite of major institutional hurdles. For example, in 1979, the graduate student Hugh Burns created a charming Aristotle-inspired chatbot that helped students with their writing by posing invention questions; Jason Orendorff recreated it after hearing about it in my !!Con 2020 keynote. I still think this vintage writing tool is more useful – and delightful – than many of the AI writing assistants emerging today.
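To give a flavor of what such a tool can feel like, here is a loose sketch of an invention-question chatbot in Python. To be clear, this is my own toy illustration, not Burns’s original code or Orendorff’s recreation, and the prompts are my paraphrases of the kind of Aristotle-inspired questions such a program might ask:

```python
# A loose, illustrative sketch of an invention-question chatbot in the
# spirit of Burns's 1979 program. This is NOT his code or Orendorff's
# recreation; the questions are paraphrased Aristotle-style prompts.
QUESTIONS = [
    "What is {topic}, in your own words?",
    "What is {topic} similar to, and how?",
    "What causes {topic}, and what follows from it?",
    "Who might disagree with you about {topic}, and why?",
]

def invention_session(topic: str) -> None:
    """Cycle through invention questions, nudging for fuller answers."""
    print(f"Let's explore '{topic}' together.")
    for question in QUESTIONS:
        answer = input(question.format(topic=topic) + "\n> ")
        if len(answer.split()) < 5:
            print("Hmm, can you say a little more about that?")
        else:
            print("Good thought. Hold onto it for your draft.")

if __name__ == "__main__":
    invention_session(input("What are you writing about? "))
```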
You can watch me discuss edtech at length in my LibrePlanet talk here if you like, but the point I want to make in this post is the relevance of these edtech issues to the project of software supply chain security. Though the two may seem unrelated, the challenges we see in both fields point to a shared need for two key user capacities. The first is user visibility into a software system: the ability to meaningfully understand every software component it contains and the consequences of running those components. In an edtech context, such visibility could let students know things like what types of personal data the software collects and who has access to that data. In a security context, this visibility (which can be provided today with a software bill of materials, or SBOM) lets an engineer see whether they are running software with vulnerabilities that need to be patched or otherwise resolved.
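To make the security half of that concrete, here is a minimal sketch of acting on SBOM-based visibility. It assumes a CycloneDX-style JSON SBOM saved as sbom.json, and the hardcoded list of vulnerable versions is a stand-in for what you would actually pull from a vulnerability database like OSV or the NVD:

```python
import json

# Hypothetical known-bad (name, version) pairs. In practice you would
# query a real vulnerability database such as OSV or the NVD.
KNOWN_VULNERABLE = {("log4j-core", "2.14.1"), ("openssl", "1.0.2k")}

def audit_sbom(path: str) -> None:
    """Print every component in a CycloneDX-style JSON SBOM and flag
    any that match a known-vulnerable version."""
    with open(path) as f:
        sbom = json.load(f)

    for component in sbom.get("components", []):
        name = component.get("name", "<unknown>")
        version = component.get("version", "<unknown>")
        status = "VULNERABLE" if (name, version) in KNOWN_VULNERABLE else "ok"
        print(f"{name} {version}: {status}")

if __name__ == "__main__":
    audit_sbom("sbom.json")
```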
This visibility is meaningless, however, unless the user is also given the second key capacity of user agency: the ability (and assumed responsibility) to make decisions about what types of software can and can’t run in their software system. In edtech, if such agency were enabled, students could prohibit software that tracks their browsing activity, eye movements, or deleted keystrokes. For security-minded engineers, it means the ability to disable or restrict software that does not meet their security requirements.
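Extending the same sketch (the component names below are, again, made up for illustration), user agency might take the form of a small policy check that refuses to proceed whenever the SBOM lists a component the user has decided not to run, whether that’s a keystroke tracker in the edtech case or an unvetted library in the security case:

```python
import json

# Illustrative denylist: component names this user refuses to run,
# e.g. anything that tracks keystrokes or eye movements. The names
# here are made up for the example.
DENYLIST = {"keystroke-tracker", "eye-tracking-sdk", "proctoring-agent"}

def enforce_policy(sbom_path: str) -> None:
    """Refuse to proceed if the SBOM lists any denied component."""
    with open(sbom_path) as f:
        sbom = json.load(f)

    denied = sorted(
        c.get("name") for c in sbom.get("components", [])
        if c.get("name") in DENYLIST
    )
    if denied:
        raise SystemExit(f"Refusing to run: denied components {denied}")
    print("Policy check passed: no denied components found.")

if __name__ == "__main__":
    enforce_policy("sbom.json")
```

The point isn’t the ten lines of Python; it’s that the decision about what runs sits with the user rather than the vendor.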
Though it might seem like a stretch to compare software supply chain security use cases with those of edtech, the problems arising in each point to a need for greater user visibility and agency across the software industry at large. Just as the food industry in the 20th century recognized the importance of consumer visibility and agency with regard to food products (and began requiring standardized food labels), the software industry is coming to recognize the range of social, technical, and safety benefits of those same affordances in its software products. As the industry works toward establishing tools, protocols, and policies around visibility and agency for different use cases, there is much that software sub-industries like cybersecurity, edtech, health tech, and AI can learn from each other.
As with food labels, safety might be the first and most obvious use case for visibility and agency over our software systems. But also as with food labels, this visibility and agency enables engineers and end users to make informed decisions based on concerns that go well beyond security, including ethical, environmental, and social issues. Ultimately, both the health of the software supply chain and society’s general relationship to software depend on our ability to create a culture of visibility and agency around the software that mediates our lives. Maybe it’s time our edtech tools started teaching those freedoms to our students.