
Kade Crockford – Can computers discriminate? Spoiler alert: yes. Public Interest Tech – Ford Foundation

Kade Crockford: When you look at Google, you very well are seeing an entirely different set of information choices than what someone else will see when they open their Google browser. And that’s because of the algorithm that Google operates, having made decisions about you and about what you are likely interested in. That’s really critical.

[Kade Crockford, Director, Tech for Liberty Program, ACLU Massachusetts. A white gender nonconforming person wearing business clothing.]

My name is Kade Crockford. I run the Technology for Liberty Program at the ACLU of Massachusetts. In many ways, we haven’t even caught up to the late 20th century when it comes to the integration of digital technologies into our laws and our statutory frameworks and constitutional frameworks around the country. Digital technologies, the information age, have changed the way that we live in ways that are really obvious— like, the fact that we all carry tracking devices with us everywhere we go—and in ways that are really opaque—like the various black box algorithms that every single day make decisions for us.

[An animated 3-D black box spins as chains made up of ones and zeros flow into it on all sides.]

A black box algorithm is a mathematical formula that companies like Google and Facebook, as well as even governments, use to process large quantities of information and make decisions about what you’ll see when you open up your web browser. They determine what price an airline will try to sell you a plane ticket for, and they can even determine how much your car loan will cost. That matters because it may very well be the case that someone in a rich white neighborhood gets charged substantially less for auto lending than someone who lives in a largely poor black neighborhood—despite the fact that those people have pretty much identical driving records. This also happens in the employment context, where employers are using black box algorithms to sort through large quantities of data about applicants.
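
A minimal sketch of the auto-lending example above may help make it concrete. Everything in it is hypothetical (the feature names, weights, and neighborhood labels are invented, not taken from any real lender), but it shows how a pricing formula that never mentions race can still quote different rates to two drivers with identical records, simply because it also scores where they live:

```python
# Hypothetical illustration only: a toy auto-loan pricing "black box".
# Every feature name, weight, and neighborhood label below is invented;
# this is not drawn from any real lender's model.

BASE_RATE = 0.05  # 5% baseline annual interest rate

# A neighborhood-level "risk" adjustment. Because housing is often
# segregated, a neighborhood feature can act as a proxy for race even
# though race never appears anywhere in the formula.
NEIGHBORHOOD_ADJUSTMENT = {
    "ZIP_A": -0.005,  # hypothetical wealthier, mostly white area
    "ZIP_B": +0.020,  # hypothetical poorer, mostly Black area
}

def quote_rate(zip_code: str, accidents: int, late_payments: int) -> float:
    """Return the annual interest rate quoted to an applicant."""
    rate = BASE_RATE
    rate += NEIGHBORHOOD_ADJUSTMENT.get(zip_code, 0.0)
    rate += 0.010 * accidents       # the driving record enters the formula...
    rate += 0.005 * late_payments   # ...but so does where the applicant lives
    return rate

# Two applicants with identical driving and payment records:
print(f"{quote_rate('ZIP_A', accidents=0, late_payments=0):.1%}")  # 4.5%
print(f"{quote_rate('ZIP_B', accidents=0, late_payments=0):.1%}")  # 7.0%
```

Because race never appears in the formula, the disparity is invisible to anyone who cannot inspect the model's features and weights.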

[An animated 3-D black box floats above three separate stacks of resumes, scanning each stack with a laser beam.]

The algorithm will automatically sort and dispose of many, many applicants before any human being even enters the process to decide who is going to get the job or who will get an interview. And those types of systems are in use in almost every industry today.

Right now, there’s a major information asymmetry between folks who work at Google and Facebook, about exactly what these tools are capable of and what they’re currently doing, and the vast majority of the public. We need to bridge that gap. And we need technologists alongside us in that fight.

Fifty years ago, when there was no public interest pathway for law students, really, besides working for the ACLU, we were not doing all we could as a society, frankly, to maximize what it means to be a lawyer, to maximize the benefits of a legal education—as far as, you know, impacting the society in general in a positive way. It’s equally important now for technologists to also come to the table and tell lawmakers exactly what these tools are doing, what the future looks like, and how to ensure that we don’t magnify exponentially the existing inequalities in our society. If we don’t bring those technologists into the public interest fold, I think we’re really looking at a very dangerous world in which technology does exacerbate and exponentially increase those inequalities.
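
The hiring example works the same way. Below is a deliberately simple, hypothetical screener (the rules and thresholds are invented; real systems are proprietary and far more elaborate, which is part of the "black box" problem) showing how applicants can be discarded before a human ever sees them, and how a seemingly neutral rule such as penalizing employment gaps can quietly disadvantage whole groups:

```python
# Hypothetical sketch of an automated resume screener. The rules and
# thresholds are invented for illustration only.

from dataclasses import dataclass

@dataclass
class Applicant:
    name: str
    years_experience: float
    employment_gap_months: int   # e.g., caregiving, illness, incarceration
    keywords_matched: int        # job-posting keywords found in the resume

def score(a: Applicant) -> float:
    """Produce a single screening score; no human reviews the inputs."""
    s = 2.0 * a.keywords_matched + 1.0 * a.years_experience
    # A "neutral"-looking penalty that falls hardest on people who took
    # time away from the workforce.
    s -= 0.5 * a.employment_gap_months
    return s

def screen(applicants: list[Applicant], threshold: float = 10.0) -> list[Applicant]:
    """Discard everyone below the threshold before any human review."""
    return [a for a in applicants if score(a) >= threshold]

pool = [
    Applicant("A", years_experience=6, employment_gap_months=0, keywords_matched=3),
    Applicant("B", years_experience=6, employment_gap_months=18, keywords_matched=3),
]
print([a.name for a in screen(pool)])  # ['A'] -- B never reaches a human
```

Here again, no protected attribute appears anywhere in the rules, so the disparate effect only becomes visible to someone who can examine the model and its outcomes.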

[This is tech at work for the public! Hashtag Public Interest Tech. Ford Foundation dot org forward slash tech. Ford Foundation logo: a globe made up of a series of small, varied circles.]

Source: https://www.fordfoundation.org/news-and-stories/videos/kade-crockford-can-computers-discriminate-spoiler-alert-yes-public-interest-tech/