Four For Friday (39)

The terms AI, big data, and machine learning are often used interchangeably, muddling definitions and clarity.

The role of data in decision making is not in doubt. But do the decision makers know the data exist? How do the data from one setting, say, a car, interact with external uncertainties such as highway design and known and unknown risks? Knowing that data exist and having access to them are not enough. We must pull data from disparate sources together and make sense of them to inform our decisions.

But crucially it is not always about “big data”; the value of “small data” is often not understood. It should be used better, as this account of the death of Walter Huang in a Tesla Model X shows (Wired magazine).

Last week, the National Transportation Safety Board released dozens of new documents that provide a detailed understanding of the circumstances of Huang’s death. The documents confirm a claim by Walter Huang’s family that he had experienced this particular glitch, in this particular spot, multiple times prior to the crash. He complained to family and friends about the issue. However, the NTSB was not able to confirm another key claim: that Huang had reported the issue to Tesla.

Forensic data also suggests one reason Huang might not have been paying attention to the road in the final seconds before his death: he was in the habit of playing a game called Three Kingdoms in his car while driving to work. Logs from his Apple-provided iPhone showed that he used the app during his morning commute every day the week of his fatal Friday crash. However, those logs don’t provide enough information to show whether he was interacting with the game in the final seconds before his death.

The documents also point to a third possible factor in Huang’s death: the government officials who designed and maintained Highway 101. This exact turnoff had been the scene of multiple crashes in the years before Huang’s death—including a fatal one in 2015. One reason the 2015 crash had been fatal was that officials had been too slow to replace a crash attenuator—an accordion-like metal device designed to cushion a car’s impact. Unfortunately, Huang’s crash happened just two weeks after another crash in the same spot, and once again the crash attenuator hadn’t been replaced. This reduced Huang’s chances of surviving the crash.

There are other possible uses for what we are colloquially calling AI. Protecting our kids from online predation, for instance. There is promise, but there is also a need for caution (Wall Street Journal reporting): the bot must learn continually just to keep up with the humans involved in predation before there is any hope of staying a step ahead of such predators.

A spokeswoman for Microsoft said the AI tool could work in any text-based environment. The company didn’t provide additional details on how the tool would work with chat services that were end-to-end encrypted, such as Apple’s iMessage or Facebook’s WhatsApp.

Microsoft said it plans to continually teach the AI to recognize more phrases used by predators. But can the bots outsmart the predators?

Ms. Gregoire acknowledges it will be challenging. “The terminology used by these predators continues to evolve,” she said, adding that it’s sometimes even hard for humans to distinguish harmless chatter from something more sinister. “What is a true friendship and what is a predatory relationship?”

Is this the new frontier in surveillance (Economist newspaper) or just something that will have us all reaching for the fragrance wardrobes? What? You don’t have a fragrance wardrobe?

But it is not just DNA that people scatter to the wind as they go about their business. They shed a whole range of other chemicals as well, in their breath, their urine, their faeces and their sweat. Collectively, and somewhat inaccurately, these molecules are referred to as metabolites. Some truly are the products of metabolic activity within people’s bodies. Others are substances an individual has come into contact with, or consumed or inhaled. All, though, carry information of one sort or another.

Such information can reveal a lot. Your god? Regular exposure to burning incense, and thus frequent visits to a church that uses it, will be detectable from the chemicals in the smoke. Not a Christian? Kosher and halal diets are detectable by the absence of metabolites from certain foodstuffs those diets forbid. Your out-of-office activities? Habits like drinking, smoking and narcotic use are visible as numerous chemicals—not merely the active pharmaceuticals which produce the relevant high or low. Your exercise levels? These are flagged up by lower than normal levels of things like leucine, glycerol and phenylalanine. Your local environment? Breathing in polluted air has a marked impact on the profile of your metabolites. Your general health? Illnesses ranging from Parkinson’s disease (altered levels of tyrosine and tryptophan) to diabetes (sugars and sphingomyelin) leave abundant metabolic traces. “The day is coming soon”, observes Cecil Lewis, a molecular anthropologist at University of Oklahoma, who is studying the matter, “when it will be possible to swab a person’s desk, steering wheel or phone and determine a wide range of incredibly private things about them.”

We needn’t be nihilists ourselves, but we must know tech nihilism when we see it. Friedrich Nietzsche’s warning — “Beware that, when fighting monsters, you yourself do not become a monster… for when you gaze long into the abyss, the abyss gazes also into you.” — rang in my head as I read about the app that warns us of nearby smart devices snooping on us (Gizmodo reporting).

While the app’s a great step forward for data awareness, there’s a certain amount of irony that comes with downloading an app to manage this collection in the first place—especially when apps don’t have such a stellar reputation on the data-privacy front. And while an initial scan of the network traffic of the IoT Assistant app didn’t turn up anything fishy, there are certain parts of the privacy policy—particularly the app’s funding from DARPA’s Brandeis program—that certainly did. As the policy states:

The Federal government offices that oversee the protection of human subjects in research will also have access to research records to ensure protection of research subjects. The research sponsor (DoD and NSF) representatives are authorized to review research records.

So while the app might make for a handy download for the privacy-savvy among us, it also comes with the possibility that its profile of you could be used to power the research of the military industrial complex. Not the best tradeoff, but, uh, it could be worse?

A fun place to check out while you are here: Cornell’s Small Data Lab
