Hidden dangers in the debate between personalization and privacy

Have you ever searched for something on Google and then noticed a banner ad on a different site that calls out what you were looking at? Or maybe you left an item in an online shopping cart and then received an email prompt to complete your purchase? You’re far from alone.

Marketers are seeing great results from collecting, mining, and connecting customer data. Most are using it, and some are using it well, sometimes too well. They are constantly finding new and more effective ways to capture and control troves of customer data, from names and email addresses to purchasing habits and search terms, with the goal of delivering highly personalized customer experiences and generating higher returns. But at what cost?

“Whether we realize it or not, every digital step we take is indeed being watched—with the resulting data providing a frightening wealth of information about our lives,” Shelly DeMotte Kramer, co-CEO of marketing agency V3B, wrote in a blog post about data mining last year. “Those of us who work in the space know: Every card transaction, every website visited, every online social interaction—even our movements and exact location are routinely collected and analyzed to build up a picture of our habits and preferences.”

We all know that customers can make or break a company. As a result, companies are analyzing their customers’ online and offline attitudes, behaviors, preferences, and lifestyles via data streams, then adapting their products and marketing efforts accordingly. This practice enables hyperlocal and hyperpersonal targeting and optimization, which can materialize as unique, relevant, and engaging one-to-one experiences with customers.

For marketers, creating these experiences is essential because customers respond to them. To do so, they are investing billions of dollars in tracking, managing, storing, and analyzing as much customer data as possible.

On the surface, the collection and storage of this data might seem harmless, albeit Big Brother-esque. Most people don’t think twice about giving out their personal information to companies, especially if there are incentives involved. But there are real dangers that are rarely top of mind until there is a breach of data or trust, and no one really knows what the long-term repercussions will be.

Specific data points on their own may not be very sensitive or of any use to a hacker. But companies now have so much data—and sophisticated tools to perform analyses—that this data in aggregate could be considered sensitive. By connecting data pertaining to online searches, memberships, GPS locations, and purchases, for example, someone could conclude that certain people have particular medical conditions, are hunting for a new job, or are having (or even considering) an illicit affair.
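To make the risk concrete, here is a minimal sketch, in Python, of how that kind of aggregation works. The data sources, field names, and inference rule are entirely hypothetical and stand in for the sorts of records a marketer might hold; the point is that records that look bland in isolation can be joined on a shared identifier to support a sensitive conclusion that no single dataset contains on its own.

```python
# Hypothetical illustration only: these records do not come from any real system.
# Each dictionary represents a separate data stream keyed by the same identifier.
searches = {"pat@example.com": ["insulin pump reviews", "low-carb recipes"]}
purchases = {"pat@example.com": ["glucose test strips", "running shoes"]}
checkins = {"pat@example.com": ["Downtown Endocrinology Clinic", "Main St Gym"]}

def infer_condition(email):
    """Naive aggregation: individually unremarkable signals, combined, suggest a diagnosis."""
    signals = searches.get(email, []) + purchases.get(email, []) + checkins.get(email, [])
    diabetes_markers = ("insulin", "glucose", "endocrinology")
    hits = [s for s in signals if any(m in s.lower() for m in diabetes_markers)]
    # Several independent sources pointing the same way make a strong (and sensitive) inference.
    return ("likely managing diabetes", hits) if len(hits) >= 3 else ("no inference", hits)

print(infer_condition("pat@example.com"))
# -> ('likely managing diabetes', ['insulin pump reviews', 'glucose test strips',
#     'Downtown Endocrinology Clinic'])
```

None of the three inputs is a medical record, yet the joined result effectively is one, which is why aggregated marketing data can deserve the same protection as data that is obviously sensitive.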

The theft, mishandling, or exposure of this information could have damaging, if not devastating, results. In a matter of seconds, the same data that powers positive personalized experiences could just as easily be used to create negative ones, and that is where the real danger lies.

Adequately addressing privacy, safety, and security concerns has been a wicked problem ever since the Internet’s inception, and it may never be completely solved. But to gain and maintain customer confidence, marketers must prioritize the protection of the sensitive information they collect. They need to demonstrate self-governance in data collection and mining to avoid some very real and severe consequences, from major lawsuits and reputation damage to credit card fraud and threats to customer safety.

As with any good business practice, honesty is the best policy. Being transparent (not just compliant in the fine print of terms of service) about what data you are collecting, how you plan to use it, and what measures you are taking to protect it will help garner customer trust, especially in an age when information security is a critical aspect of personal safety.

This story originally ran on Readystatements.