What Humans Can Learn From Machines

Kesha Williams, Software Engineer at Chick-fil-A, gives us a sneak peek of her tech track topic for the upcoming Women of Silicon Valley 2018. She shares her work on the Suspicious Activity Monitor (SAM), predictive policing, and machine learning.

Imagine a world where machines rule. Imagine a world where robots and autonomous drones roam a city making crime-fighting decisions using their own judgement without human input. Imagine a world where there is hardly any crime because crime is stopped before it starts.

Does this sound like something you’ve seen before in a science fiction movie or read about in your favorite science fiction novel? Well, I am here to tell you, thanks to a technology called machine learning, science fiction is now a reality!

"Machine learning is a type of artificial intelligence that gives computers the ability to learn without being explicitly programmed." 

Does the sound of this scare you? If so, you would not be alone. Talk of machines taking over the world has been around for a long time. We have adventure-filled science fiction movies to thank for that. In reality, though, machine learning isn’t scary at all. Well, it shouldn’t be.

A machine learning program is nothing more than a computer program capable of making predictions about the future based on studying and learning from historical data. That doesn’t sound so scary after all, does it? Past behavior is often a good indicator of future behavior!
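To make that concrete, here is a minimal, purely illustrative sketch (using Python and scikit-learn, with made-up feature names and data, not any real policing system) of what such a program looks like: it studies historical examples and then predicts a new case.

```python
# Illustrative sketch only: made-up features and data, not a real policing system.
from sklearn.linear_model import LogisticRegression

# Historical data: each row is [hour_of_day, prior_incidents_nearby],
# and each label records whether an incident occurred (1) or not (0).
X_history = [[22, 5], [23, 4], [10, 0], [14, 1], [2, 6], [9, 0]]
y_history = [1, 1, 0, 0, 1, 0]

model = LogisticRegression()
model.fit(X_history, y_history)  # the "studying and learning" step

# Predict the likelihood of an incident for a new, unseen situation.
new_situation = [[21, 3]]
print(model.predict_proba(new_situation)[0][1])  # estimated probability of an incident
```

Now, predictive policing takes this one step further.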

"Predictive policing is a discipline that uses the power of machine learning and criminal behavior models to predict crime; it is said to stop crime before it starts!"

I’ve read studies published by prominent universities stating that predictive policing will be heavily relied upon by 2030. In little more than a decade, humans will lean on machines to stop crime before it starts. Now, this gives me pause. I don’t know about you, but if a machine is going to decide whether I am guilty or not, it should make a fair, bias-free decision. Herein lies the dilemma.

Machines start out bias-free (we can learn a thing or two from them) and then take on the biases (e.g., prejudices, partialities, favoritisms, predispositions, preferences, and preconceptions) of the computer programmer who teaches and trains them. This bias typically enters through the training data (the historical data machines learn from) or through explicit instructions from the computer programmer (i.e., the creator). When dealing with crime, bias is not a good thing.
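To illustrate how training data carries bias (the numbers below are fabricated purely to make the point), imagine historical records that over-report incidents in one neighborhood; a model trained on them simply learns and repeats that skew.

```python
# Fabricated example: bias baked into historical labels is learned by the model.
from sklearn.linear_model import LogisticRegression

# Single feature: neighborhood id (0 or 1).
# The historical labels over-report incidents in neighborhood 0.
X = [[0]] * 50 + [[1]] * 50
y = [1] * 40 + [0] * 10 + [1] * 10 + [0] * 40

model = LogisticRegression().fit(X, y)

# The model now predicts a much higher incident probability for neighborhood 0,
# not because of anything it observes today, but because of the skewed history.
print(model.predict_proba([[0]])[0][1])  # high
print(model.predict_proba([[1]])[0][1])  # low
```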

"We have all heard of racial profiling, and we all know the damage it does to families, race relations, and society as a whole."

When it comes to predictive policing, we definitely want the machine to make a bias-free decision.

Based on my experience, it is very easy to teach a machine to look at other factors, apart from race, in making a crime prediction. I can say this because I invented a computer program, SAM (Suspicious Activity Monitor), that removes human bias from policing. 

SAM is a predictive policing machine learning program (think "precrime" from Minority Report) that looks at a particular situation (using computer vision) and predicts the likelihood of crime (using machine learning). SAM looks at several attributes about a person and even uses their current location to make a crime prediction. When creating SAM, I intentionally excluded race as an attribute he considers because I didn’t want him accused of racial profiling. 
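SAM’s internals stay under the hood here, but the general idea of excluding a sensitive attribute can be sketched with hypothetical column names and toy data: the attribute may exist in the records, yet it is deliberately dropped before the model ever trains on it.

```python
# Hypothetical sketch (not SAM's actual code): exclude a sensitive attribute
# so the model never sees it during training or prediction.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

observations = pd.DataFrame({
    "race":           ["A", "B", "A", "B"],   # recorded, but never used
    "time_of_day":    [23, 14, 2, 10],
    "location_risk":  [0.8, 0.2, 0.9, 0.1],
    "loitering_mins": [45, 5, 60, 2],
    "incident":       [1, 0, 1, 0],
})

# Drop the sensitive attribute (and the label) before training.
features = observations.drop(columns=["race", "incident"])
labels = observations["incident"]

model = DecisionTreeClassifier().fit(features, labels)

# Predictions are made from the remaining attributes only.
print(model.predict(features.head(1)))
```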

"The decision to exclude race was an “a-ha” moment for me because it shows that machine learning can remove human bias from policing; thus, eliminating racial profiling."

The implications of this are huge and the benefits to society many.

My work with SAM has earned many recognitions and awards. I had the fortunate opportunity to present on the TED stage in NYC about machine learning as part of their Spotlight Presentation Academy. In addition, I routinely travel the world sharing the lessons I’ve learned creating SAM. I look forward to sharing more about machine learning, predictive policing, and SAM at the Women of Silicon Valley conference on March 21 & 22, 2018.

To see what else is on at Women of Silicon Valley 2018, take a look at the agenda and register to get your tickets now!
