Generosity and Conviviality in the Age of Algorithmic Oppression

This abridged post was written by Marc Hudson and originally appeared at marchudson.net

This was a superb event. A diverse audience of somewhere between 80 and 90 attended yesterday's talk on 'algorithms of oppression' in Manchester. The event, hosted by Open Data Manchester with the support of The Federation and Manchester School of Art, was centred on a lecture and Q&A with Dr Safiya Noble of USC Annenberg. This blogpost is an attempt to convey the richness, breadth and generosity of her talk, and also to provide links and 'bookmarks'. It can't be a blow-by-blow account, but the event WAS live streamed and the organisers hope to get it up on a video sharing platform soon enough. Comments on the blog are welcome, especially if I have mangled something, there are typos, and that sort of thing… Stuff in [square brackets] is me editorialising/suggesting additional lines of enquiry/books — I wouldn't want you to get the impression that Noble has arcane and weird taste in anecdotes.

The event began with generous amounts of alcohol, fruit (grapes! Nom nom. Apples! Nom nom nom) and nibbles, with time for people to catch up with old friends and make new ones. Julian Tait of Open Data Manchester opened proceedings, explaining that ODM, which has been going for about 8 years, is a community-led organisation that tries to take a critical look at (open) data and what it might mean for democracy, participation, sustainability and all those Good Things. (The tagline says it best — Supporting responsible and intelligent data practice in Greater Manchester and beyond.) They have a bunch of events coming up, including the brilliantly named 'Joy Diversion' (scroll to the bottom), and he then asked if anyone else had events. There was one — this Thursday, 10 May, Meet Amazing Data Women, open to anyone who identifies as a woman.

A representative of the host building, The Federation, gave a short talk, mentioning that it's a newish community-led business with free desk space for tech businesses that are trying to do useful things around sustainability. They also have events coming up, including something on May 30 on technology and slavery, with Mary Mazzio, the director of the documentary 'I am Jane Doe', and the launch of a report about images and disinformation/misinformation and the recent UK and French elections on June 5.

Safiya Noble

Then it was on to the main event. Professor Farida Vis, from Manchester School of Art, introduced the speaker, Dr Safiya Noble. She is not (yet) well-known to British audiences, I suspect, but if tonight is anything to go by a) she should be and b) she will be. FT readers may have read the article about her typing 'black girls' into Google while looking for things to take her niece to in New York, and the results being, well, NSFW [that's 'not safe for work', for any reader who isn't quite as down wiv da yoof as the 47-year-old middle-class blogger who has to be told off by his wife for quoting The Wire as if it gives him urban(e) cred. Truedat.]

Noble works on the ways in which information technology, while seeming 'neutral' [the best trick the devil ever played…], actually perpetuates (intensifies) pre-existing prejudices. She's been working on this for years and she knows a hell of a lot, but wears it lightly. Noble has an engaging and charismatic stage presence. She clearly knows her stuff, knows why it matters, and is keen to communicate it, but also to engage with questions and critiques. She began with an anecdote about her new book. When she first got the contract with NYU Press, her editor there (whom she praised — Noble is generous at giving 'shout outs' to colleagues) said that there was no way the word 'algorithm' could be in the title — "nobody except you nerds knows what an algorithm is." Well, now, with bots crawling all over our minds like spiders hatching from an egg sac in a wound, everyone knows differently. Noble said that even her father-in-law is asking "what's up with those algorithms?" Noble then pointed out that while her research — and her talk — would be about the USA, what she is studying [warning about] is happening globally, at different speeds and in different ways.

This of course is part of a broader 'techlash' — a backlash against the utopian promises and hype [see Gartner Hype Cycles for more on this]. As a Wired article Noble referenced put it, "2017 was the year we fell out of love with algorithms."

The next thing that happened was a recurring theme: Noble enthusiastically cited the work of another academic (in this case Wendy Chun, and her 2006 book "Control and Freedom: Power and Paranoia in the Age of Fiber Optics"). Two things here — firstly, this is a super-helpful habit, sharing your overview of an issue and its back history. Secondly, she wasn't citing other scholars merely to say their work was incomplete and she had the missing pieces of the puzzle. This wasn't an alpha (male) academic exercise in the swinging of, er, citations, of the type that those of us privileged to live in the ivory tower so often encounter.

[Btw — strangely, many of the authors working on digital oppressions are African-American or BME. Very odd that African Americans might have the most acute and penetrating perceptions about the ways that power works. It's almost as if they have been on the pointy end of oppression for centuries. But anyway…]

She also mentioned Anna Everett, but I can't read my scrawl to get the context. [Ah, the irony — Google helps out. In Digital Diaspora: A Race for Cyberspace — "Deftly interweaving history, culture, and critical theory, Anna Everett traces the rise of black participation in cyberspace, particularly during the early years of the Internet". Noble had by this point already reminded us just how revolutionary and useful Google was when it arrived in 2000, making it actually possible to find stuff…]

Anyway, Noble's thesis, in her book — omfg, I haven't linked to her book yet — is that black bodies are 'data disposable', bodies upon which technology is practiced and perfected. [And for those of you who think 'conspiracy theory'/chip on shoulder, why don't you check out the Tuskegee Study of Untreated Syphilis in the Negro Male (the clue is in the name) and how the pill was tested on Puerto Ricans.] To back this up she introduced Vilna Bashi Treitler and her book The Ethnic Project: Transforming Racial Fiction into Ethnic Factions, which says that there is a 'core binary spectrum' (white and non-white), with immigrants striving to become white (think the Irish and Italians) [the same thing happened in Australia — whiteness is such a freaking fiction].

Noble mentioned a backlash against talking about race since the 1990s, with the rise of so-called 'colour-blind ideology', a favoured phrase of venture capitalists looking to fund projects. It has the odd effect of rendering white and Asian men invisible [so 'normal' as to be unseeable].

Two more academics working on this got a shout out — Michael Brown [can't find — perhaps a reference to the Ferguson victim?] and Helen A. Neville.

In the one moment that, for me, was questionable, Noble pointed out that the rise of 'computer knows best' ideology grew in the 1960s at the same time as Civil Rights legislation was being passed and participation in decision-making became at least thinkable for minorities. [That said, the use of technology to deskill and suppress workers' power is indisputable — see David Noble's Forces of Production. I just think this was a slightly long bow to draw…]

What we are seeing now is analogous to ‘redlining’ (where banks refused loans to entire categories of people — a practice now outlawed). Profiles of individuals are being built on a mass level — what does it mean to have a data profile about you that you can’t intervene on? For Noble, AI is going to be a massive Human Rights issue in the 21st century, and it is one we don’t have the legal/political frameworks for yet.

More fellow academics and thinkers then got a shout out.

[Noble says that this list is just 'scratching the surface', and that we need to mainstream the discussion of tech ethics.]

Noble told an amusing story about having been on Twitter for so long that she actually has the @safiya handle but can't use it because it gets flooded with mis-tagging (the same thing happens to a guy called @johnlewis, who has great fun with people mis-tagging him while looking for the store).

Searching questions
Moving on to the question of how trusted search engines are, Noble pointed to a 2012 Pew Research Center study which showed that most Americans are satisfied with search engines, most use Google (thus, Noble says, that's what she studies!), and most use it often. Search engines are therefore seen as a 'trusted public good', the people's portal. The cost of this is that we've lost the art of/respect for content curated by an expert.

Noble then shared the experience by which she might be known to a general UK audience — she googled 'black girls' (having been told by a colleague not to do it from a university computer). And sure enough, it was pages and pages of porn. Noble wrote an article on this for Bitch magazine, published in Spring 2012. By autumn, Google had suppressed the porn in the search results [a pattern that would continue — individual problems dealt with on an ad hoc, reactive basis].

Noble then asked if anyone had heard of a UK band called Black Girls, which still appears in the searches. One of the 90ish present had, leading Noble to observe that the band was better at search engine optimisation than it was at music distribution…

Next up Noble gave a shout out to an online collection of Jim Crow memorabilia at www.ferris.edu/jimcrow/jezebel, before going on to recount how, in 2015, DeRay Mckesson (with leverage acquired from having been followed by Beyonce) showed that when you googled the 'n-word' plus 'house', Google Maps took you to the White House.

Again, Google's response was to talk about 'glitches' (in otherwise perfect systems).

Another example — the following year a chap called Kabir Ali was livestreamed by his friends googling 'three black teenagers' and 'three white teenagers'. The former gave mugshots, the latter healthy, non-threatening, sportsballing folks.

[This stuff matters. Somewhere (Malcolm Gladwell?) there's an anecdote about someone regularly doing the Implicit Association Test, which furtles out the links you make 'unintentionally', and not being able to figure out why his results were improving — then he realised he was watching the Olympics, where black athletes were doing well/being praised.]

Anyway, the following day, it was tweaked.

Next up — googling "unprofessional hairstyles for work" came up with lots of black women, while "professional hairstyles for work" came up with white women with ponytails.

See also Jessica Davis and Oscar Gandy (1999), 'Racial Identity and Media Orientation: Exploring the Nature of Constraint', Journal of Black Studies, Vol. 29 (3), pp. 367–397.

[I mentioned this to the brilliant Sarah Irving, and she observed that if anyone needs to know that orientalism is still a thing, they should google 'sheikh' — still loads of images of kidnappy/rapey men on camels and their insatiable appetites for white flesh.]

So beyond being offensive and demoralising, what are the broader political implications, if any? Noble pointed to a 2013 study which showed that the type of results coming up on the first page when someone googled a candidate could influence who people would vote for, and that search engines need to be regulated. [I haven't got this totally.] This is the article, I think — Viability, Information Seeking and Vote Choice. (Of course, googlebombing is nowt new — see what Dan Savage did to Senator Santorum, way back in the day.)

Skip forward — after the 2016 Presidential Election, if you googled 'final election result' in the US you got taken to a lying site that said Trump won not only the Electoral College vote but ALSO the popular vote.


Neoliberal co-optation

So, the response has been predictable, and probably effective. Google has got behind 'Black Girls Code' — in this narrative the main problem is not structural racism but that five-year-old black girls haven't been getting involved enough… Noble cited Heather Hiles as noting that less than one per cent of venture capital goes into projects led by black women.

Noble then moved on to the deeper question of who makes the tech, and what damage is done in the making of it (the subject of her next work — following the production and value chains). All these techs are "resting precariously on extraction in the Global South" with enormous amounts of hidden labour [and ecosystem devastation] "in iPhone 12 or whatever we're going to be on in a week" — all part of the (story of) infinite linear progress of technology [ah, the hedonic treadmill, donchajustlove it].

This sense of technology as our (submissive because female) friend is there in the new personal digital assistants such as Microsoft’s Ms Dewey, Apple’s Siri and Amazon’s Alexa. Noble mentioned that a few weeks ago she was talking with robotics professors at Stanford who had not thought through the implications of children learning to bark commands at women’s voices…

[There's a great Onion story, "Congress Demands to Know How Facebook Got People to Give Up Their Civil Liberties Without a Fight", in which they quiz Mark Zuckerberg on how he convinced people to let bugs/spies into their houses in a way the FBI could only dream of.]

Other problems include so-called "predictive policing" and embodied software (Robocop Lives!!), with a shout out to Simone Browne on racialized policing. A concrete (in every sense) example — in Champaign, Illinois there are [were?], in the poor neighbourhoods, virtually no sidewalks. So, what do kids do? They walk in the street. And what do the cops do? Write them up for jaywalking. As the Violent Femmes once sang, 'this will go down on your permanent record'… [A neat example of how, as per critical realism, we have to think about material constraints, not just ideologies and 'rules' — see Sorrell, 2018: "Explaining sociotechnical transitions: A critical realist perspective".]

Final example — even videos of atrocities (Eric Garner dying, saying 'I can't breathe') attract advertising revenues because, well — lots of people watch them. [We monetise our own catastrophes, whether we like it or not…]

What is to be done?

This was the last, and by far the briefest, section of Noble's talk. She had five suggestions:

  • Build repositories and platforms that belong to the public. (Noble noted that the convergence of states and multinational companies made it hard to imagine platforms that were not based on advertising revenues)
  • Resist colorblind/racist/sexist technology development (sex dolls got a mention)
  • Decrease technology over-development and e-waste
  • More info and research visualisation for the public
  • Never give up

Noble closed by observing that the prediction is that by 2030, 1% of the population will own two-thirds of the world's wealth. There will be intensified datafication, more devices, with more promises of seamless and frictionless liberty. But we can't eat the digital. We can't make an iPhone sandwich…

Question and Answer

There was time for some questions. What was interesting here — besides the info itself — was that Noble gave quick and detailed answers, without waffle or using the Q&A as a chance for a thinly-veiled continuation of her lecture (we've all seen that happen, right?).

Please do NOT take anything I’ve ascribed to Noble as gospel. I may have got stuff wrong. I don’t do shorthand. Check the recording!

Question 1: Is Capitalism an algorithm of oppression?

Noble: Yes, of course… She went on to point to the racialized element of this: the 2008 financial crisis was the biggest wipe-out of black wealth since Reconstruction [e.g. here and here].

She cited Cathy O'Neil's Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy.

Question 2: Audre Lorde said that the master's tools cannot dismantle the master's house. Does this mean that we should be encouraging people to 'disconnect'?

Noble: No. It's cute to tell people to delete Facebook, but we need collective solutions, not individual ones. If I stop paying my rent, I'm not taking down capitalism. We need to strengthen our position in the situation we are in. For example, "fair sourcing", and design where nobody dies [a challenge she put to comp sci students who had to take her class recently]. We need a more powerful public response. We used to have 80% unionisation after World War Two. We need that back. That's not easy. In retrospect everyone was at the 1963 March on Washington, but of course it was no more than 10 per cent making the change at the time. We need to find that 10% again.

[And keep them for the long haul. The best book on the Civil Rights struggle I ever read was 'And We Are Not Saved' by Debbie Louis. Extraordinary.]

[I asked that question. What I MEANT to say was 'in your opinion, are there inherent properties within the technology that mean it will never be effective as a weapon of liberation? If so, what then?' but I bollocksed it up. I am sure there is a broader lesson in this — perhaps something about letting smarter-than-me black women speak for themselves and staying out of the way, but I can't quite see what it is…]

Question 3 (from a biracial woman): She was searching for home insurance with a white friend. They have the same financial profile, and asked for two quotes on the same house. Guess who was quoted the cheaper premium…

[At which point TV Smith’s ‘It’s expensive being poor’ sprang to mind.]

Noble: That's what we call coded data discrimination, which we can't see. We have to have public policy on this. And most discrimination laws are about proving intent — we should be looking at outputs and impacts. There is this very powerful 'tech is neutral' idea which we have to contest.

Noble then gave the example of a (black?) guy, the son of a financial planner and a financial planner himself (i.e. a 'responsible adult'), who had his Amex card declined. Eventually, after multiple calls, it emerged he'd once bought something in a Walmart in the Wrong Part of Town and the algorithm had 'decided' he was a credit risk.

Question 4: to what extent are Silicon Valley executives oblivious to the problem?

Noble: They’re largely underprepared. If you’re designing tech for society and you don’t know anything about society, you’re underprepared. The kind of people who end up in Silicon Valley mostly went to the top five universities and will have been able to transfer out of their humanities components early. For some the last humanities course they took will have been high school English. But even if by some miracle they’d taken ethnic/women’s studies, that wouldn’t necessarily help, since they are coding/designing within a set of institutions/beliefs/paradigms [my paraphrase/word salad].

And there is defensiveness/hostility in some companies about all of this.

Question 5: How much of this is unconscious bias?

Noble: I don't like that phrase. UB lets everyone off the hook. It gets us back into intent questions, when we need to look at outputs and impacts, and then use public policy, HR policies, hiring etc.

What kind of world do we want? One where people can't afford to eat? How do we do things differently?

Question 6: Thanks for opening my eyes about search engine bias: beyond moderation of search engines, what?

Noble: we must separate advertising content from knowledge. If you're looking for somewhere to eat, fine. But Dylann Roof (the 21-year-old who murdered nine black churchgoers in Charleston, South Carolina) was doing 'sense-making' on Trayvon Martin (murdered by George Zimmerman) and was led to lots of white supremacist sites. There's a chapter in the book on this.

We need to demarcate better, and realise that Google should not be a 'trusted public good'.
