“It Wasn’t Me, Ma — It Was the Algorithm!”

Apple [NASDAQ: AAPL] got pilloried on social media in November, several months into the rollout of the new Apple Card (managed by Goldman Sachs [NYSE: GS]).  Tech entrepreneur David Heinemeier Hansson, creator of the popular web-development framework “Ruby on Rails,” called out AAPL for giving him a much larger credit limit than his wife:

[Embedded tweet]  Source: Twitter

Mr Hansson’s ire was mostly directed at the response he received to all his efforts to take up the issue with AAPL: “The algorithm decided, not us; there’s nothing we can do.”  Ultimately, even Apple co-founder Steve Wozniak chimed in, reporting that the same thing had happened to him and his wife with their Apple Card application.

Mr Hansson’s wife’s credit limit was eventually raised — but without apology or explanation.

The specifics of this case are less interesting to us than the broad picture it suggests: the problems artificial intelligence will face as it becomes more prominent in consumer-facing enterprises.

Back in 2017, reporting on advances in machine learning, we wrote:

“Laypeople usually think of computer programming as an activity in which the programmer gives instructions to the computer, and then the computer executes those instructions.  Machine learning turns that around.  The programmer gives the computer a set of sample data and a desired outcome, and the computer generates its own algorithm on the basis of those data that it can apply to any future data… 

“The MIT Technology Review notes: ‘… By its nature, deep learning is a particularly dark black box.  You can’t just look inside a deep neural network to see how it works.  A network’s reasoning is embedded in the behavior of thousands of simulated neurons, arranged into dozens or even hundreds of intricately interconnected layers…  Just as many aspects of human behavior are impossible to explain in detail, perhaps it won’t be possible for AI to explain everything it does…  If that’s so, then at some stage we may have to simply trust AI’s judgment or do without using it.’”
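The pattern described in these passages can be illustrated with a minimal sketch, using entirely hypothetical toy data (not Apple’s or Goldman’s actual model): we hand the computer sample credit decisions and desired outcomes, and it derives its own decision rule as a set of learned numeric weights that encode no human-readable explanation.

```python
# A minimal sketch of supervised machine learning (hypothetical toy data).
# No approval rule is hand-coded; the program learns weights from examples.

def train_perceptron(samples, labels, epochs=100, lr=0.1):
    """Learn a linear decision rule from (features, label) examples."""
    n = len(samples[0])
    weights = [0.0] * n
    bias = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            # Current prediction: 1 ("approve") if the weighted sum is positive.
            pred = 1 if sum(w * xi for w, xi in zip(weights, x)) + bias > 0 else 0
            err = y - pred
            # Nudge the weights toward the desired outcome.
            weights = [w + lr * err * xi for w, xi in zip(weights, x)]
            bias += lr * err
    return weights, bias

# Hypothetical applicant features: (income in $10k, years of credit history).
samples = [(3, 1), (9, 10), (4, 2), (8, 7)]
labels  = [0, 1, 0, 1]          # desired outcome: 1 = approve, 0 = decline

weights, bias = train_perceptron(samples, labels)

def predict(x):
    """Apply the learned rule to a new applicant."""
    return 1 if sum(w * xi for w, xi in zip(weights, x)) + bias > 0 else 0
```

Even in this four-example toy, the “explanation” for any decision is just the learned weights; in a deep network with millions of such parameters, tracing a single approval or denial back to a human-intelligible reason becomes far harder.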

In short, a consumer-facing company may just not be able to explain why its credit determination algorithm came up with the result it did; a human resources firm may just not be able to explain why its hiring algorithm came up with the result it did.  That will produce a PR and regulatory conundrum for companies that rely on deep AI — especially when the algorithm’s results fly in the face of social and cultural norms and expectations.

Investment implications:  The flap over AAPL’s new credit card suggests that consumers — and, we are pretty confident, regulators — aren’t going to rest easy with the dodge that “the algorithm made me do it.”  In short, some of the limitations to the use of AI aren’t going to be technological — they’re going to be very, very human.  And there will always be a place — and a need — for a human being standing over an AI system, ready to hit the “bypass” button.

Please note that principals of Guild Investment Management, Inc. (“Guild”) and/or Guild’s clients may at any time own any of the stocks mentioned in this article, and may sell them at any time.  Currently, Guild’s clients own AAPL.  In addition, for investment advisory clients of Guild, please check with Guild prior to taking positions in any of the companies mentioned in this article, since Guild may not believe that particular stock is right for the client, either because Guild has already taken a position in that stock for the client or for other reasons.
