Categories: Data, Social Media

Data is the New Dirt

You may have heard that “data is the new oil.” You might also sense a bit of marketing hoopla in the phrase. There are a few ways to define “big data,” but none of them fits this analogy. My definition comes from my experience transforming disparate data sets into business intelligence that enables an organization to accomplish its mission. The data by itself is useless. Data is the medium from which value is extracted, not the value itself. In this view it’s more accurate to say that data is the new dirt.

It’s true that data needs to be transformed just as oil needs to be refined, but there’s a key difference: oil is a finite resource, and data is effectively infinite. The scarcity of oil determines its value, and even oil in raw form has value. A barrel of crude is worth about US$60 today (I would’ve guessed much higher!), but raw data is worthless. In fact, it’s less than worthless. It cuts into the bottom line.

Imagine you’re the CIO of a big organization. You’re standing in a state-of-the-art data center humming away with endless rows of sleek cabinets packed with the latest server hardware, each hosting tens of thousands of virtual machines running database software on untold petabytes of storage. The lighting is low and the place pulses with high-tech power. It’s all very bad-ass. But as you walk down the center aisle you approach a wall with a huge LED display, its seven-digit number spinning out of control: the net cost, in millions of dollars per year, of storing all this data. Your gut tightens as you fathom the volumes of data flooding the data center and the costs spinning straight to hell. The numbers burn into your retinas as you stare up at the LED. Your face is about to melt off like the Nazis’ in the climactic scene of Raiders of the Lost Ark. But you make your wisdom saving throw and recover just in time, calling HR and telling them to hire some data professionals, now.

The point is that it’s not the data that’s valuable; it’s truth, and these days there’s a scarcity of truth. The number of things into which oil can be refined is limited. Data, on the other hand, can be molded into information anywhere on the scale between insanely valuable and totally useless. Transforming data must be a flexible, adaptive process, or its value is never realized. Somewhere in this mountain of data are facts, the rarest nuggets in the world.

Admittedly, my professional view spells big data with a little “b.” What I work with is nothing compared to the vast oceans of Big Data processed by Silicon Valley powerhouses and the internet of things. Big Data in this sense may very well be the new oil for a handful of tech companies, but anyone who has flown from Houston to Galveston knows the impact oil refining can have on the environment. The data centers that store all this data use a lot of juice, the production of which also affects the climate. And social media pumps out a massive amount of unchecked social pollution of its own.

Looking at the bigger picture, “data is the new oil” can also imply that everyone’s data is valuable and every individual should be getting rich, too. In the future there will be stronger, well-defined data rights to support this, but for now it’s delusional and dangerous thinking, a point made in the excellent blog post “Data isn’t the new oil, it’s the new CO2” by Martin Tisné, managing director at Luminate, a philanthropic organization I follow. I plan on writing more about Luminate in the future, as well as Mr. Tisné’s article in the MIT Technology Review, “It’s time for a Bill of Data Rights.”

Categories: Social Media

New Hope for a New Decade

On New Year’s Eve 1979, I was eleven, cozy in my pajamas after a nice hot bath. I lay on my belly on the carpet of my grandparents’ living room, filling the pages of a sketch pad with dreams of the new decade to come. The Eighties were going to be kick-ass!

My grandparents lounged in their La-Z-Boy recliners and smoked as we watched an episode of M*A*S*H. Later we watched Lou Grant. I know these details now because I still have the sketch pad, and I’m reading the notes. (I just confirmed these shows were broadcast on the day in question, although there’s no mention of what we watched during the hour between them.)

I didn’t take notes every evening, but this was the first time I was conscious of entering a new decade. I drew spaceships and electric cars (the kind of stuff Elon Musk works on now, in real life) and other beneficial tech that would no doubt improve our lives in the future. Technology was exciting and cool. It would have been impossible to imagine an uncontrolled psychological experiment performed on the human race for the purpose of selling products and ideas, let alone draw it in my sketch pad. That inability to conceive of such a thing is why many people have been slow to understand the negative impact of social media today.

(I didn’t intend for this to be another post about the negative impact of social media, but it’s useful as a way of further defining what I mean by beneficial tech.)

Flash back to the scene of me watching TV with my grandparents on the last day of 1979. This was the golden era of TV, when there were just three competing networks, all in business to sell advertising. At the start of each show there was a commercial break, and another cluster of ads every fifteen minutes thereafter, with a few national ads and maybe a couple of slots for local stuff. Twenty or thirty ads probably aired during the time my grandparents and I watched TV that night, but nobody was paying attention. Commercials were for socializing, grabbing a bowl of ice cream, or taking a bathroom break. TV shows like M*A*S*H were not created for the purpose of selling things. Their entertainment value was incidental to the advertising business that supported the platform on which they aired.

On the surface it looks like the content of Facebook is also incidental to its ads. But the amount of content is infinite, and “the algorithm” chooses what each individual sees. Put back in the TV age, this would be like everyone watching a slightly different version of the same show. At some point during an episode of M*A*S*H, Hawkeye would pause, look directly at me, and wink, holding the toy I hadn’t received for Christmas. “Still want one of these?” On an emotional level that’s how Facebook works.

Except it takes time for this magic to do its thing. The tech needs to learn every nuance of our behavior and moods, which requires hundreds of hours of our attention. To harvest the maximum amount of attention, social media companies employ armies of psychologists and developers to engineer addiction. The idea is to silo people into easily marketable groups, and the best way to do that is to get them hooked and incite emotional reactions. It just so happens that the easiest human emotions to tweak are all negative: anger, rage, hatred, and fear. Multiply this effect by billions of tweaked individuals and the result is a world of angry, depressed, divided masses who view the people on the other side of their engineered, bipolar worldview as sub-human.
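For the technically curious, the engagement logic above can be sketched as a toy model. To be clear, this is my own illustration, not any platform’s actual code: the field names, the numbers, and the scoring formula are all invented.

```python
# Toy model of an engagement-maximizing feed ranker.
# Everything here (field names, numbers, formula) is invented
# for illustration; no real platform's code or data is shown.

def predicted_engagement(post, user):
    """Score a post by how likely this user is to react to it."""
    score = post["base_interest"]
    # Emotionally charged content reliably drives more clicks and
    # comments, so an engagement-optimized ranker learns to boost it.
    score += post["outrage_level"] * user["reactivity"]
    return score

def build_feed(posts, user):
    """Return posts sorted so the most 'engaging' come first."""
    return sorted(posts, key=lambda p: predicted_engagement(p, user),
                  reverse=True)

posts = [
    {"id": "vacation-photos", "base_interest": 0.6, "outrage_level": 0.0},
    {"id": "calm-news-item",  "base_interest": 0.5, "outrage_level": 0.1},
    {"id": "rage-bait",       "base_interest": 0.2, "outrage_level": 0.9},
]
user = {"reactivity": 0.8}

feed = build_feed(posts, user)
print([p["id"] for p in feed])
# → ['rage-bait', 'vacation-photos', 'calm-news-item']
```

Even in this silly model, the post with the least intrinsic interest floats to the top of the feed, simply because it provokes the strongest reaction.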

This sucks.

It doubly sucks because on almost every measurable level we’re living in the best time in all of human history.

What to do? It doesn’t matter if someone doesn’t use social media, because everyone else does. Anyone who abstains from the madness is still breathing in the toxic fumes: second-hand smoke.

My grandparents were cool, not because the commercial brainwashing of the day had convinced me that smoking was cool (it very likely had), but because they were fun people and I loved them. They loved me too, and would’ve never intended to do me harm; but one side-effect of hanging out with them was a continual lungful of second-hand smoke. At the time, I didn’t mind the smoke. Nobody did. It was 1979.

Is social media even less beneficial than cigarettes? It’s possible to argue the point, even though on the micro level there are no doubt positive things that happen on Facebook and other platforms. Cigarettes were intentionally addictive, and non-smokers breathed in the harmful byproduct. Forty years later, a technology called social media is intentionally addictive, and non-users experience harmful byproducts. But there’s a difference: social media is also intentionally toxic, while cigarettes were only incidentally toxic. If there had been a way for tobacco companies to engineer their product so that it didn’t kill their customers, I’m sure they would’ve done it. Social media companies are colossal advertising platforms that purposely divide people and mass-produce negativity for the purpose of selling products and ideas.

Should social media be regulated like tobacco? Maybe, but we’d need new laws. Antitrust litigation isn’t the right tool. Alphabet and Facebook are indeed monopolies. (Their combined market capitalization was US$1.3 trillion as of last year, and no one else is even close.) But monopoly is not the problem. The problem is that social media companies are wielding unchecked control over a dangerous technology that does measurable harm to society. How is this not the most alarming thing in the world? Would it be beneficial to humanity if this market became more competitive through government antitrust intervention? I don’t think so. Instead, the ad-based business model needs to go.

Social media should be sold as a premium service. Over the past thirty years, companies like HBO and Netflix obliterated the ad-based model of the big three TV networks. It turned out there was a big market for quality entertainment, and the result was Peak TV. Shows like M*A*S*H were great, but any given show on Netflix today would have been the best show on TV forty years ago. This further defines what I mean by beneficial tech.

A hundred years from now historians will look back on this time and see early social media as humanity’s first big mistake with AI. The technology is at a nascent stage. It can be a bridge to something better down the road.

So what about my dreams of a kick-ass decade? The Eighties turned out to be filled with a mix of miracles and personal tragedies, like any decade for anyone. There was also an explosion of fantastic entertainment and exciting change. I’m optimistic about the upcoming decade.

Let the Roaring Twenties begin!

Categories: Social Media

Why I Avoid Social Media

Technology is a beautiful thing. For the past twenty-plus years I’ve been striving to make tech work for us, because that’s the way it should be. Tech should free up more of our time, and give us more agency over our lives. Aside from being beneficial, it should also be cool. I would never suggest that someone stop using a tool that meets the above criteria, but I think we can all agree that in the past decade social media has fallen short of all three.

By social media I mostly mean “Goobook,” though other apps like Twitter and Reddit can serve up their own versions of hell. I admired (but did not use) the original Instagram before it was more tightly integrated into its parent company’s advertising machine. I don’t know much about the other apps. I just say Goobook for short because it sounds funny and makes me smile.

In the big picture Goobook is probably in some rough period of early evolution, just as the mid-nineteenth century industrial revolution had its share of inhumane bumps in the road. One day the current tech will lead to something that serves us instead of the other way around. But to get there, I believe the ad-based model must go.

Time Is a Non-Renewable Resource

The most basic reason I don’t use Goobook or any of the other apps is because there are an infinite number of better things I’d rather do with my limited time on this earth. For example, nothing. Most people probably don’t remember, but doing nothing can be pretty cool. Some call it meditation. It works wonders for mental health.

In the ad-based business model, social media uses machine intelligence and slot-machine psychology to get us hooked. Like a casino, it wants the maximum amount of our time. It couldn’t care less about our mental health.

Netflix, as an example of the paid model, has some binge-worthy content, but it doesn’t care how much I watch as long as I pay my eight bucks per month. I’m happy to pay it for the value it provides.

Tricking someone into giving up their time is not cool, considering it’s the only resource we can’t get back. In Michael Ende’s classic book “Momo,” a homeless girl with special talents fights back against time thieves, the sinister “men in grey.” Written in 1973, it’s a critique of consumerism and stress, but it might as well be about what’s happening today, except a hundredfold.

Please Take My Money Instead

Goobook has bragged to its customers (its customers are advertisers, grandpa) that it can make an active user feel any emotion and she won’t know why. We like to think we’re rational beings, but emotions rule the world. Emotions are also how advertisers sell things, and negative emotions work best. Nobody buys stuff they don’t need because they’re happy and self-assured. Positivity is the arch enemy of advertising, and Goobook is the most powerful ad platform ever unleashed on the world.

How does this digital trickery work? From a tech point of view it’s fascinating stuff. From a human point of view, not so much. Machine intelligence records the micro-movements and reactions of individual users over time, cultivating a behavioral profile until it knows exactly which of our emotional buttons to push. Tristan Harris, former design ethicist at Goobook, has some truly eye-popping examples of how “technology hijacks our psychological vulnerabilities.” Fortunately for us, Tristan is now co-founder of the Center for Humane Technology and spokesperson for the Time Well Spent movement, making him one of the most awesome human beings on Earth.
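The profile-building idea can be sketched as another toy model: keep a running average of how strongly a user reacts to each emotion category, then push the button with the highest score. Again, the labels, numbers, and update rule are all my own invention for illustration, not anyone’s real system.

```python
# Toy behavioral profile: an exponential moving average of a user's
# observed reaction strength per emotion category. Purely
# illustrative; no real platform's model is shown here.

def update_profile(profile, emotion, reaction_strength, rate=0.1):
    """Nudge the stored sensitivity for one emotion toward the
    latest observed reaction."""
    old = profile.get(emotion, 0.0)
    profile[emotion] = (1 - rate) * old + rate * reaction_strength
    return profile

profile = {}
# Simulate many sessions in which anger-tinged posts draw strong
# reactions and joyful posts draw weak ones.
for _ in range(50):
    update_profile(profile, "anger", 1.0)
    update_profile(profile, "joy", 0.2)

# After enough observations, the profile reveals which button to push.
best_button = max(profile, key=profile.get)
print(best_button)  # → anger
```

The unsettling part isn’t the math, which is trivial; it’s the hundreds of hours of attention that feed it.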

Goobook also likes to brag that its so-called services will always be “free.” A complete, truthful disclosure of intent would be hilarious. Imagine if new users saw this upon signing up:

“We’re not going to lie to you. We employ an army of psychologists and engineers who design revolutionary technology for the sole purpose of getting you addicted to our app. Given enough time, they will succeed. But don’t worry, we’ll never charge you a cent!”

Fake promise

Well, I should fucking hope not. In fact, I’m thinking Goobook should pay me and everyone else for being lab rats in this uncontrolled psychological experiment they’re performing on the entire world. Let’s take a quick look at the results so far. Since Goobook really took off ten years ago, we’ve seen an astonishing rise in uncool.

Cavalier attitudes toward privacy have undermined democracy (Cambridge Analytica, Brexit, the weakening of the European Union, and the rise of nationalism everywhere).

Journalism got monopolized. All journalists are now “sharecroppers on the Facebook farm,” according to WIRED magazine, subject to censorship based on whatever users want to see; thanks to the negative feedback loop, this means Disaster, Tragedy, and Politics (which, let’s face it, are all the same category these days).

This has led to general grumpiness and negativity, despite the fact that nearly every quality-of-life statistic has improved over the past forty years. The preceding statement is factual, but the fact that we doubt it for even a second is proof that the concept of truth has also taken a hit.

Binary “thumbs up / thumbs down” thinking has polarized society and disabled our ability to think deeply about complex, nuanced issues, at the very time we need that ability most.

There’s also growing interest and debate over the possible role of Goobook in the uptick of U.S. opioid use and suicide rates in the past decade, especially among teens. I was skeptical when I first saw an article on the topic; it could be that instead of Goobook causing depression, depressed people were drawn to Goobook. And Goobook-bashing had become so trendy that I wanted to keep an open mind. At first the accusation reminded me of my teenage years growing up in the U.S. Bible Belt, when there was a lot of hysteria over the supposed evils of Dungeons & Dragons and heavy metal. (I was a big fan of both, by the way, and look how well I turned out!) But compared to apps whose stated business intent is to systematically and subliminally manipulate the thoughts and emotions of billions of people using AI-driven addiction techniques, music and role-playing games seem absurdly tame. Considering social media is most effective (for its customers) when its users are isolated and bummed out – and these happen to be the exact conditions under which drug addiction and suicide occur – it wouldn’t surprise me if there was a link.

The New Normal

Somehow we’ve accepted all this as totally normal, when just a few decades ago it would’ve seemed like dystopia on high. In 1984 I was in junior high school, and appropriately enough, George Orwell’s famous book was required reading that year. Imagine going back to the ’80s and explaining to me that thirty-five years in the future there would be a system used by everyone, whose stated purpose was to addict people and tweak their emotions over time.

My teenage eyes would go wide. “Dude… The Ministry of Truth actually exists in the future? The government has taken control?”

His naiveté stuns me for a second, but then I laugh, thinking of the recent congressional hearings with Goobook. “No, in the future the government is dumber and weaker than ever. Tech moves fast. Government is slow.”

He’s confused for a moment but then recovers. “So it’s like Tyrell Corporation in the movie Blade Runner?”

“Ah, good reference. Something like that. Except the tech we have in 2019 is much less cool. That, and the big tech companies of the future all have headquarters in beautiful, cheery places, where all the employees believe they’re making the world a better place.”

“Why would they believe that?”

I ponder this for a moment, thinking of something that would make sense to him. “You know all those people in your school who think pep rallies are awesome?”

The teenage me nods knowingly, but then shakes his head in dismay. “How did all this happen?”

“It happened over a long period of time. People just went along for the ride and then they were hooked. Remember those freaky propaganda films on drug addiction they made you watch last year? It turns out there’s a science to addiction, and the people who implemented Goobook have got it all figured out. Like any addiction, it starts as a fun escape.”

“I’d never fall for that shit.”

“Don’t worry, you won’t. But everyone else will, so it doesn’t matter. It becomes the new normal, and that’s how the world around you will change.”

“Can anything be done?”

“Yes. Fortunately there are ways out.”

I mentioned Tristan Harris at the Center for Humane Technology. There are others too, a rising wave of smart thinkers fighting this thing like Momo fighting the men in grey. One way out seems to be paid subscription, because then the app would serve us instead of us serving the machine. Goobook wouldn’t make as much money, at least not at first, but the paid model is sustainable and better for everyone in the long run. It’s a little like the early industrial days, when companies went unchecked and polluted the environment in their quest to maximize profit at all costs. Goobook is doing the same thing now, except it’s polluting our hearts and minds. We need more awareness of this issue to get it cleaned up. I’ll do my part.

Other champions of this effort are:

Roger McNamee (ZUCKED)
Douglas Rushkoff (TEAM HUMAN)
Jaron Lanier (Ten Arguments for Deleting Your Social Media Accounts Right Now)

For the sake of more beneficial tech and a cooler future, I wish us all the best of luck.

This post first appeared on my author site, May 1, 2019.