Cambridge Festival participant Ella McPherson answers questions about everything from leading with courage in a world of Big Tech to academic freedom.
Courageous leadership is about a critical engagement with the zeitgeist. This isn’t just about thinking differently. It’s about recognising that there are deliberate forces behind what can feel like unstoppable ideas in society, about calling these forces out and making space for critiques and alternatives.
Ella McPherson
Ella McPherson [2004] is Professor of the Sociology of Media and Technology and Co-Director of the Centre of Governance and Human Rights at the University of Cambridge. She will be speaking at the Gates Cambridge event at the Cambridge Festival on 19th March [6-7.30pm], on a panel on leading with courage in today’s turbulent world. Fellow Scholars Dan Greenfield [2005], D’Arcy Williams [2019] and Cillian Ó Fathaigh [2014] are also speaking.
Ella is also Deputy Head of Cambridge’s School of the Humanities and Social Sciences. Her research is concerned with symbolic struggles surrounding media and technology in times of transition. This includes human rights fact-finding in the digital age, everyday resistance to Big Tech in the UK, and the rise of generative AI in academia. In 2020-21 she was Special Adviser to the House of Lords’ Communications and Digital Committee for its Freedom of Expression Online inquiry. She has also contributed research on technology and human rights practice, as well as on the digitally mediated freedom of assembly, to the United Nations.
Cambridge Festival asked Ella about her views on leading with courage, academic freedom, trust and resisting the sense that the power of Big Tech is inevitable.
Q: In your work on the sociology of new media and digital technology, how would you define ‘courageous leadership’ in today’s technology-infused world?
EM: Courageous leadership is about a critical engagement with the zeitgeist. This isn’t just about thinking differently. It’s about recognising that there are deliberate forces behind what can feel like unstoppable ideas in society, about calling these forces out and about making space for critiques and alternatives. I see this all the time in the interplay between technology and society.
Right now, we are in what I would call a zeitgeist of AI-modernity, where the intervention of generative AI into so many facets of our lives feels inevitable. However, the explosively growing generative AI market depends on our fatalism, and we know that Big Tech has an enormous PR engine behind it that uses techno-optimism to make new technologies hard to resist. In cases like these, courageous leaders expose these power relations and turn towards their constituencies, making collective spaces for their alternative visions of the future and ideas about how to get there.
Q: With the rise of disinformation and declining public trust in institutions, how can leaders use digital technology responsibly to build trust rather than amplify division?
EM: Disinformation’s consequences arise in part from how our political and media leaders have reacted to the potential and reality of disinformation; those reactions have harmed traditional forms of trust and belief just as the disinformation itself has. Fear of disinformation can close us off to listening to strangers – to the cosmopolitan attitude that underpins democracy.
Trust is an interpersonal, relational concept; if I trust you, I expect that you have my best interests at heart. Communications technology, by its very design, poses challenges for trust, because it intermediates that interpersonal relationship and introduces new interests to it, namely the interests of the technology itself, which are often tied to profit derived from our data. Technology, therefore, is not the tool I would recommend leaders turn to in order to build trust, but rather good, old-fashioned listening, promising and delivering.
That said, I do see signs of the nature of trust changing in the UK and US societies I know best. The prevalence of digital fakery that generative AI enables may mean we are moving from a trust anchored in fact, as exemplified by scientific and journalistic methods, to one anchored in truth, or how we know the world to be. It may be that we believe less in believing, and more in our experience of being.
Q: As Co-Director of the Centre of Governance and Human Rights [CGHR], what lessons have you learned about leading with courage when advocating for human rights under growing authoritarian pressure?
EM: I’ve learned so much from human rights defenders about leading with courage. For example, they have taught me about understanding and anticipating backlash, a fear of which can otherwise stymie us. Yes, backlash can be scary and frustrating, but backlash also means that your call for change is doing something; it is seen as a threat to the status quo. And even though it might seem to be about something else, backlash is often deeply about a defence of this status quo. I’ve built on this idea to theorise a conceptual family of backlashes: visibility backlashes, reacting to new people, ideas and information on the public stage; epistemology backlashes, attacking new forms of knowing; and ontology backlashes, which seek to defend worldviews about what and who are allowed to exist.
I have also learned from human rights defenders about the importance of friendship and fun in doing the difficult work of seeking accountability and justice. Protection against vicarious trauma – the trauma one can experience through bearing witness to others’ trauma – has been shown to come from feelings of support and solidarity, and from making sure you take time to do things like watching videos of cute puppies. Leading with longevity requires looking after yourself.
Q: How can sociological research into technology help leaders anticipate challenges and make courageous decisions in turbulent times?
EM: I will speak very specifically about leaders in the academy, and how to inform decisions about technology adoption, as our sector is rocked both by generative AI and by the worst global conditions for academic freedom in recent memory.
Research in the sociology of technology shows us, time and time again, that the political economy of technology matters in terms of its impacts on societies. Who owns it? Who controls it? How does it make money? What of our fears and hopes is it galvanising to get us on board? Which of our values does it enhance, and which does it threaten? As Big Tech has moved closer to politics, these questions matter even more.
From my perspective as Director of Education of Cambridge’s School of the Humanities and Social Sciences, as much as generative AI may promise in terms of efficiency, it also poses problems for our scholarly values across disciplines – including robust education, ethical research and the eurekas of discovery. Generative AI creeping into our scholarship tools can also be an academic freedom problem, if we define academic freedom not only in terms of autonomy over what we teach and research, but also over how we teach and research.
In other words, the politics of procurement matter, and we need to think about what we lose as a consequence of what we gain when we consider adopting new technologies.
Q: In a panel with leaders from business, activism and technology, what unique challenges do you see for those in academia or governance, and how do courage and risk-taking manifest differently in these sectors?
EM: In academia, we are in the business of discovery and innovation (as are the sectors of business, activism and technology, of course!). These take time, resources and autonomy. Leaders in the academy should be clearing the way for their colleagues to advance discovery and innovation through research and teaching. This means defending academic freedom and values from unhelpful external influences, whether regulatory or financial.
It also means supporting sustainable careers and workloads. The rise of what emeritus Cambridge Anthropology professor Marilyn Strathern called ‘audit culture’, where tracking and quantification replace trust in a system, creates a significant amount of paperwork and processes that can get in the way of our University’s vital mission ‘to contribute to society through the pursuit of education, learning and research at the highest international levels of excellence’.
In a way, audit culture is about institutional responses to mitigate risk, so part of the risk-taking leaders in the academy should do is to build in more opportunities for trust; if the trust is there, reducing auditing doesn’t actually create a risk.
Q: Could you share an example from your own career where you had to exercise courageous leadership, and what you learned from it?
EM: I’d like to mention the Academic Freedoms Research Network I co-convene with the support of CRASSH [the Centre for Research in the Arts, Social Sciences and Humanities], CGHR and GRIST [the Global Racisms Institute for Social Transformation]. Because we are at a University and in a nation where academic freedom is relatively well-protected, starting this network did not require any particular courage on my part, but through the network, I have heard so many stories of others’ courage.
All academics are leaders in their own way, forging new trails in research and teaching; around the world, so many colleagues are facing backlashes against their scholarship from states that interpret this scholarship as a threat to their political power and projects. Other colleagues are coming together to support their peers under threat with solidarity networks and institutional resources. The Academic Freedoms Research Network reminds me time and time again that we are working in a very courageous sector.
*Photo credit: Nick Saffell
**Tickets for the Cambridge Festival event are free, but you need to book your place via the Cambridge Festival website. NB this is an in-person event.
