Catalyst - Issue 12 - Open Book

Open book

An intelligent view of AI

Data privacy, fake news and the rise of robots all increase public mistrust of the artificial intelligence tech that adds value to consumer experience. Marketers must respond to concerns quickly if they are to keep consumers on side.

Words: David Benady

Online privacy has soared to the top of marketers' agendas this year. With Facebook facing scrutiny over its treatment of personal data after the Cambridge Analytica scandal, and Google causing concern after its Duplex chatbot tricked people into believing it was human, pressure is mounting on technology companies to come clean about their activities.

This erosion of trust matters to marketers, who increasingly rely on tech platforms to understand and communicate with consumers. The more brands personalise their advertising using customer data, the better responses they expect to get, but they depend on the consent and goodwill of consumers to collect that data. Indeed, the EU's GDPR privacy rules make it easier for consumers to withhold their consent for brands to use personal data. Ultimately, a breakdown in trust between brands and consumers has the potential to undermine the power of marketing.

Targeting value

According to Carl White, co-founder of search-targeting agency Nano Interactive, younger generations generally accept the value of handing brands their data in exchange for access to free social media platforms or websites. "There is evidence to show that people don't mind ads being targeted if it adds value and is of relevance to their overall browsing experience," he says. They become concerned, however, when they realise the extent to which their data is being secretly gathered by third parties through web trackers, cookies and device identification.

"With clarity and transparency, most people are OK with some forms of tracking and technology. There are simple, easy-to-explain data-gathering techniques that don't make people uncomfortable."
"Part of what we have to do as an industry is explain that and say, 'Look, this is what we do; it means you see fewer ads and the ads are more targeted.'"

Nano Interactive analyses keyword search terms using artificial intelligence, builds a profile of a user's buying intentions, then shows them relevant ads based on that analysis. But White stresses that the company takes great care over the sorts of data it collects. "We have drawn up a list of personally sensitive keywords that relate to sexual orientation, medical conditions and other sensitive data, and we are not even collecting that or tracking that person when they make that search," he says. This is designed to assuage people's fears that their search activity will be used to profile their most personal details.

Explaining your ethics

Every company involved in online marketing needs to make its privacy policy clear and be transparent about what it does with data. It is a sign of how far the ethical issues around digital marketing have come that, after the Cambridge Analytica scandal, in which confidential data ended up in the hands of the data analytics firm, Facebook launched an ad campaign in the US, using TV, press and posters, proclaiming that it is going to change and get rid of fake news, clickbait and data misuse.

Meanwhile, in a move widely seen as a sideswipe at Facebook, Apple has announced that it is cracking down on web trackers. The next version of the Safari browser will show users a pop-up giving them the option to turn off trackers.

Google is also in the spotlight over transparency, after chief executive Sundar Pichai demonstrated the new chatbot Google Duplex by calling a restaurant and a hairdresser to book appointments. The bot chatted with the human receptionists, sounding thoroughly human by umming and ahhing and using upspeak (rising intonation at the end of spoken phrases). The humans on the other end of the line seemed to be unaware they were talking to AI bots.
The exercise was decried by the media as "creepy" and "unacceptable".

Humanising the tech

At media giant Hearst, AI is used to curate content on websites, but chief digital officer Phil Wiser thinks there is no need to tell consumers about this level of AI involvement. Speaking at the recent Digital Workforce Conference in New York, he compared using AI for content curation to using an HTML web page, explaining that users are not concerned about the technical tools used to create the web experience. "It is one of the unfortunate hype factors of AI," he says. "We immediately take something that is a statistical algorithm and try to humanise it and attribute all these dangerous and terrible elements to it. You have to be more thoughtful about it."

Wiser says that in the Google Duplex presentation there was a duty to let people know they were talking to a chatbot, because where there is personal, direct engagement there is an expectation of speaking to a human.

Such uses of AI will present marketers with major ethical challenges as it becomes increasingly embedded in digital processes. "The most important thing we have to do is focus on how AI is going to be used to produce fake information, and how you judge fake-driven data or insights versus real information," says Mark Minevich, an AI expert who advises software giant IPsoft, which markets the Amelia AI chatbot. He warns that fake information produced by AI could create crises in finance, healthcare and energy. "We need new regulatory agencies; we need to check for compliance; we need to check for violations. Just as in food, where you know exactly what the ingredients are, I expect the same thing will need to happen for the AI industry," he says. "Some level of regulation has to occur fairly rapidly, because we are heading into unsure territory."
Whether using AI, digital advertising or social media data collection, or running a data-management platform, marketers must offer consumers clear explanations about privacy and data control. This goes beyond simply complying with GDPR. People are becoming savvy about their data and will demand as much transparency about the technology they use as they do about the food they eat.

cim.co.uk/exchange