One of the things I learned from studying maths and philosophy is an appreciation of what follows from what: identifying and understanding which assumptions are implicit in a given argument, and which axioms are required to establish a given proof.
So when I see or hear something that I disagree with, I feel the need to trace where the disagreement comes from – is there a difference in fact or value or something else? Am I missing some critical piece of knowledge or understanding, that might lead me to change my mind? And if I want to correct someone’s error, is there some piece of knowledge or understanding that I can give them, that will bring them around to my way of thinking?
(By the way, this skill would seem important for teachers. If a child struggles with simple arithmetic, exactly which step in the process has the child failed to grasp? However, teachers don’t always have time to do this.)
There is also an idea of the economy of argument. What is the minimum amount of knowledge or understanding needed in this context, and how can I avoid complicating the argument by bringing in other material that may be fascinating but is not strictly relevant? (I acknowledge that I don’t always follow this principle myself.) And when I’m wrong about something, how can other people help me see this without requiring me to wade through far more material than I have time for?
There was a thread on Twitter recently, prompted by some weak thinking by a certain computer scientist. @jennaburrell noted that
computer science has never been very strong on epistemology – either recognizing that it implicitly has one, that there might be any other, or interrogating its weaknesses as a way of understanding the world.
Some people suggested that the solution involves philosophy.
People in CS and machine learning have been haphazardly trying to reinvent epistemology while universities make cuts to philosophy departments. Instead of getting more STEM majors we might be better off if we figured out how to send more funding to the humanities.
— 🦄 Dietrich Epp 📡 (@DietrichEpp) June 24, 2020
I completely agree with Dietrich about the value of philosophy and the humanities in general. However, it felt like overkill for addressing the specific weaknesses identified by Professor Burrell, since her argument against this particular fallacy didn’t seem to require any non-STEM knowledge or understanding.
While I sympathize with this sentiment, you don’t always need a philosophy degree to spot incoherent CS thinking/epistemology. You just need to stay awake in your statistics class.
— Richard Veryard (@richardveryard) June 24, 2020
Of course, statistics is not the whole answer; but then neither is philosophy. I mentioned statistics as an example of a STEM discipline in which students should have the opportunity to unlearn naive epistemology; but any proper scientific discipline should include some understanding of scientific method. Although computing often calls itself a science, it is largely an engineering discipline; if you use the word methodology with computer people, they usually think you are talking about design methods. Social scientists (I believe Professor Burrell’s PhD is in sociology) tend to have a much better understanding of research methodology.
And of course, it’s not just epistemology but also ethics.
Nothing scares me as much as seeing naive engineers with no knowledge of structural injustice, pervasive power asymmetries, or conservative and racist history of the field of AI, being endowed with the power to make tech that infiltrates the social sphere.
— Abeba Birhane (@Abebab) June 29, 2020
One of the problems with professional philosophy is that it can be quite compartmentalized. There are philosophers who promote themselves as experts on technology ethics, but their published papers don’t reference any recent literature on the philosophy of science and technology, or reveal any deep understanding of the challenges faced by scientists and engineers.
So although there are undoubtedly good reasons for broader education in both directions, I’m sceptical about expecting clever people in one discipline to acquire a small but dangerous amount of expertise in some other discipline. I’m much more interested in promoting dialogue between disciplines. In his tribute to Steve Jobs, @jonahlehrer called this consilience.
What set all of Steve Jobs’s companies apart … was an insistence that computer scientists must work together with artists and designers—that the best ideas emerge from the intersection of technology and the humanities.
The final word should go to @Abebab.
In contrast to an underspecified data for good model, embracing pluralism offers a way to work toward a model of data for co-liberation. This means transferring knowledge from experts to communities and explicitly cultivating community solidarity in data work.
— Abeba Birhane (@Abebab) July 4, 2020
Jonah Lehrer, Steve Jobs: “Technology Alone Is Not Enough” (New Yorker, 7 October 2011)
Related posts: From Convenience to Consilience – “Technology Alone Is Not Enough” (October 2011), The Habitual Vice of Epistemology (June 2019), Limitations of Machine Learning (July 2020), Mapping out the entire world of objects (July 2020)