
How Attention Economy Works

Last Updated on October 25, 2023

What is the Attention Economy?


The attention economy was first described by Herbert Simon, a Nobel Prize-winning social scientist, who wrote in 1971:

“In an information-rich world, the wealth of information means the dearth of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention.”

Your attention is a scarce commodity – and therefore extremely valuable. You have at most 24 hours of it in a day, while the digital media that consume it grow exponentially. This is a problem: competition for your time becomes desperate.

To capture and hold your attention, digital media companies employ dark psychology in their user interface design – at the expense of your wellbeing. Profit is a priority; user happiness is not. Billions of dollars are at stake.

The top predators in this law-of-the-jungle food chain are Big Tech, and you are their prey.

The Formula: Attention Economy Equation


Let’s unpack the attention economy formula with the classic rate/volume financial analysis.
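In rate/volume terms, the money flowing through the attention economy boils down to one line – my framing of the analysis below, not a formula quoted from any source:

    Attention revenue = hours of attention captured (volume) × dollars extracted per hour (rate)

The volume side is limited only by screen time. The rate side, as we will see, is limited by what users are willing to pay – unless the platform also sells their data.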

What We Pay Them

Kevin Kelly, the visionary thinker who predicted the Internet revolution, calculated in his book “The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future” that media companies (music, books, movies, news) earn on average only about $3 per hour of our attention – and only if they offer high-quality content that keeps us interested. The rate is surprisingly stable: applying the same methodology to 1995, 2010, and 2015, Kelly came up with $3.08, $2.69, and $3.37. The value of our attention barely moved over 20 years.
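A quick sanity check of that $3 figure (my arithmetic, not Kelly’s):

    ($3.08 + $2.69 + $3.37) / 3 ≈ $3.05 per hour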

We seem to have an intuitive sense of what a media experience should cost and will not pay more, especially when the choices are ever-expanding. It makes sense – if Netflix suddenly doubled its subscription rate tomorrow, it would risk a massive backlash and lose customers to competing platforms.

If the user is only willing to pay $3 per hour, the only way to increase the rate is to SELL THE USER – to extract and sell their data.

The user becomes the product.

Surveillance Capitalism: How Much We Are Sold For

How much can you be sold for? Facebook’s average annual advertising revenue per user in the US was $53.56 in 2020. At the time of the Facebook scandal in 2021, the company’s profit (not revenue – profit!) amounted to $40 billion. Google’s advertising revenue in 2020 was $147 billion.

Big Tech owns the world of targeted behavioral advertising: unless businesses pay them, those businesses are invisible to customers.

Facebook and Google, however big, are just the most obvious examples of tech vying for your attention. A million smaller players make pennies compared to the tech giants, but all of them make something from selling the product of the attention economy.

What’s the product? Algorithmic behavior modification of the user.

Shoshana Zuboff, in her monumental book “The Age of Surveillance Capitalism”, calls the raw material the attention economy runs on “behavioral surplus”. Like Spanish conquistadors claiming the New World from its helpless natives, tech companies capture our human experience and declare it theirs to own and use as they see fit. Behavioral data is shaped into prediction products and sold to the platforms’ real customers, who pay for a desired outcome – be it buying a product or voting for a political party.

Google places a personalized ad on the website the moment you land there, based on your previous web behavior. Increasingly sophisticated AI targets users with advertising that modifies behavior with precision. A “relevant” ad gets us to click and commands a higher price for user attention.

The more effective the mind control, the more tech customers are willing to pay for it. It makes perfect business sense – if an ad works, advertisers keep coming back for more.

Another way to sell the user is to slip more subliminal advertising into the content, disguised as opinion, news, or entertainment. With everyone glued to their screens for 12 hours a day to be informed or entertained, the algorithmic mind control of billions of people never stops.

The App Under Your Skin

Screens are here to stay – and they are multiplying into a universe of “smart” objects that collect our data and predict our behavior. I once caught a glimpse of my coffee cup trying to sell me something at 6 a.m., and shuddered.

The goal of surveillance is to get under your skin. Have you noticed that everyone these days wants you to download their app? Do you ever wonder why? Are they really so nice that they spend time and money developing an app for your “convenience”?

Not really. Nothing gets built unless there is money to be made. Like an IV line connected to your bloodstream, an app on your phone is a mainline directly to you. It drips targeted behavioral advertising into your system and sucks out your valuable data – your location, your contacts, your purchasing history, your social media activity, all of it.
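To make that drip concrete, here is a minimal sketch of the kind of event a third-party analytics SDK bundled into a “free” app might upload. The field names are hypothetical – real SDKs differ – but the categories of data are the ones described above:

    import json

    # Hypothetical telemetry event - invented field names, for illustration.
    event = {
        "ad_id": "38400000-8cf0-11bd-b23e-10b96e40000d",  # persistent advertising identifier
        "event": "product_viewed",
        "location": {"lat": 40.74, "lon": -73.99},        # GPS fix
        "contacts_count": 212,                            # address-book access
        "recent_purchases": ["sneakers", "coffee maker"], # purchasing history
        "session_seconds": 847,                           # attention, measured
    }

    # In a real app this would be POSTed to a tracker's server;
    # here we just print what leaves the phone.
    print(json.dumps(event, indent=2))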

Nudged by the app’s suggestions, you buy the product on the spot, consume the digital content it places on your screen, and surrender data about your behavior for free.

Sometimes they nudge you gently. Sometimes they leave you no choice. A couple of examples from my recent experience:

  • Ticketmaster forced me to download an app – the printed ticket option was no longer available. The app came loaded with cross-selling and notifications to put their advertising in my pocket.
  • Hotels.com informed me that unless I downloaded their app, they would charge me $5 extra for every booking.
  • The Disney World app was a miracle of extracting money from our pockets, nudging a captive audience of tired parents to spend more on merchandise, line-skipping, food, and photos.
  • And don’t even get me started on the QR codes that EVERYONE forced on us in the times of COVID – from your neighborhood restaurant to a hiking trail. Every public place has a QR code that corrals you into a data-capture system with “timed entry”. All supposedly done for our safety – but in reality it is a perfect excuse to collect our data.

COVID-19 was a godsend for data collectors. The pressure to buy everything online and install apps was supercharged by the pandemic. All of a sudden, a virus made paper tickets and maps dangerous! Little scientific evidence was ever found for such claims, but they gave every store, museum, park, zoo, skating rink, and school portrait service an excuse to install itself permanently in our lives.

Everyone became a data broker, violating our privacy with surveillance and advertising. Their notifications would be ON by default; our email would fill up with THEIR marketing agenda. Data collected by one app would be sold to an unknown number of third parties, who would join the assault on our limited attention.

Thousands of data collectors are aggressively pushing themselves onto our already crowded screens, making us spend money on things we don’t need, extracting our data, and stealing the time of our lives.

Humans are losing the war for attention. Algorithms are winning.

Pressing Your Emotional Buttons

The easiest way for algorithms to control humans is through the subconscious mind, hardwired by evolutionary biology.

This never fails. Look around – in every social setting, people are physically there, but they are not present – they are on their phones. Our interactive screens are better at getting our complete undivided attention than our fellow humans, who are just as distracted as we are.

The machines are already learning the social skills humans have lost. The latest smartphones have gaze-tracking technology built in. By measuring the duration and direction of your gaze, AI can rank which content captures your attention and serve you more of it. The screens are watching us back. Voice-recognition devices in our homes are listening – always.
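A minimal sketch of how gaze-based ranking could work in principle – the categories and numbers are hypothetical, not any vendor’s actual code:

    from collections import defaultdict

    # Hypothetical gaze log: (content_category, seconds_of_gaze) pairs.
    gaze_log = [
        ("outrage_politics", 34.2),
        ("cat_videos", 11.5),
        ("outrage_politics", 48.9),
        ("cooking", 6.1),
    ]

    # Accumulate dwell time per category - the longer you stare,
    # the higher the category ranks.
    dwell = defaultdict(float)
    for category, seconds in gaze_log:
        dwell[category] += seconds

    # The feed then serves more of whatever held your eyes the longest.
    feed_priority = sorted(dwell, key=dwell.get, reverse=True)
    print(feed_priority)  # ['outrage_politics', 'cat_videos', 'cooking']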

Unlike your family and friends, Alexa is a great listener.

Persuasive design that manipulates our emotions is used to generate headlines of fear, anger, and division to keep us on the screens: negativity drives engagement better than empathy. Soon, our devices will be able to discern our emotions in real time – and make us feel better or worse, depending on which is more profitable.

If AI is still programmed for negativity-based engagement, be prepared for real terror.

If Elon Musk’s Neuralink succeeds in putting a wire into your brain, ads and political agendas could be uploaded directly into your subconscious mind, spelling the end of free will.

Today we can still put the phone down and escape algorithmic manipulation – if we choose to exercise our freedom to do so.

Profitability Measure: Time on Screen

There are no natural stopping points in our digital experiences. They are designed for maximum screen time. Why? Because maximum profitability is achieved by maximizing user engagement: staring at the screen 24 hours a day, 7 days a week, engaging with OUR digital product – while ignoring competing platforms and such trivial real-world pursuits as sleep.

To keep users on their screens longer, the science of persuasive design comes in handy, making digital media more “engaging”. More addictive. More negative and divisive. Appealing to the worst angels of our nature. Anything goes, as long as it increases the direct measure of profitability – eyeballs on screen.
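In recommender terms, that business logic collapses into a one-line objective. A minimal sketch with hypothetical scores – not any platform’s real ranking code:

    # Candidate items with predicted minutes of watch time - the proxy
    # for profitability. Note: there is no wellbeing term in the score.
    candidates = {
        "calm_documentary": 4.0,
        "outrage_thread": 23.0,
        "friend_update": 2.5,
    }

    # Engagement-maximizing feed: rank purely by predicted screen time.
    feed = sorted(candidates, key=candidates.get, reverse=True)
    print(feed)  # ['outrage_thread', 'calm_documentary', 'friend_update']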

Even if your initial impulse when you picked up the phone was wholesome – like working or calling your mom – you can be sidetracked for hours by highly engaging content, bathing the pleasure centers of your brain in dopamine. Or triggered by the emotions of fear, anger, and insecurity, which is just as effective.

Every digital media business is trapped in this attention economy framework – with no incentive to give users a break.

“If we don’t grab your attention, somebody else will” – by that logic, there is no point in trying to be the good guy. A platform that encouraged digital detox would lose money while changing nothing.

As a result, the attention economy pushes biological limits to the absurd: young people spend up to 11 hours a day on screens, and heavy users are estimated to touch their phones 5,427 times a day – to the detriment of their very humanity.

The Harm of Algorithmic Mind Control

The attention economy, however new, is a story as old as time – it’s all about the money. Humans have always been consumers. In the attention economy, humans have also become a commodity.

Which brings to mind some uncomfortable associations with slavery.

“The difference between technology and slavery is that slaves are aware they are not free,” wrote the modern-day philosopher Nassim Taleb. Slaves could maintain their inner freedom and humanity; we comply with digital manipulation more like robots – programmable for any behavioral outcome, all the while thinking it was our own idea.

Resisting algorithmic mind control takes awareness of psychological manipulation behind the user interface.

The ethical dilemma of the attention economy business model lies in the agency problem, described this way by Nassim Taleb in his book Antifragile: one party (the agent) has personal interests that are divorced from those of the one using his services (the principal).

Attention Economy Agency Problem: Tech companies are interested in their profits, not in the wellbeing of their users.

Our humanity is of no concern to Big Tech; we are just a resource, optimized for user engagement – maximum screen time. Even if it means that people end up frazzled, depressed, unproductive, isolated, and sleep-deprived. Or even dead – like the teens who die by suicide copying self-harm behavior they see on their screens.

There are no ethical limits imposed on the attention economy equation. Unlike doctors, tech companies are bound by no Hippocratic Oath of “Do No Harm” when they administer toxically high doses of manipulative content to us.

They are free to do as much harm as they find profitable.

Attention Economy Business Model

A simplified tech company income statement would look something like this:
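For illustration – the line items and numbers here are hypothetical, chosen to match the discussion below:

    Revenue
      Advertising revenue (engagement-driven, ethics-blind) ........ $100
    Expenses
      Engineering, infrastructure, content ......................... ($60)
      Ethical safeguards (optional) ................................ ($0)
    Profit ......................................................... $40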

There are no ethical parameters built in. Adding them is technically possible – but why bother? The industry will not voluntarily limit its algorithms. Why would any business consciously cut its own profitability? Why kill the goose that lays the golden eggs of advertising income – especially when a dozen competitors would immediately rush in to grab your market share?

There is no incentive for the company to reduce unethical revenue to $0 and, on top of that, spend money on optional ethical expenses – both would hurt the bottom line. The job description of an ethics department would be to cut the branch it sits on, reducing the very revenue its salaries come from. No wonder companies are reluctant to create these jobs.

Here is an experiment anyone can do: go on LinkedIn and search for jobs in “AI governance”. You will find plenty – with a law degree as a requirement. The companies want lawyers to make sure they don’t do anything ILLEGAL, not psychologists to make sure they don’t do anything UNETHICAL.

Making teenagers depressed is unethical – but legal and profitable, so the platforms see no reason to change. This is not governance, this is mere compliance. They don’t need digital ethicists – lawyers are sufficient.

There are two forces that can change the situation. One is policy – slow in coming. The other is more powerful: loss of customer trust, and the potentially catastrophic loss of future income that comes with it. Scandals like Facebook being exposed for knowing that its Instagram app makes teens depressed bring public awareness to the problem.

The government is hopelessly behind on policies that would force ethical guidelines into the compliance framework of tech companies. Our politicians have little understanding of what they are dealing with. And so it goes, business as usual: as long as an activity is not illegal, the industry is free to keep making money from it. No one is breaking the law, after all.

So, here we are:

  • The tech industry is trapped by its business model.
  • Tech users are trapped by tech addiction.
  • Tech regulators are trapped by inertia.

An Alternative to the Attention Economy

Let’s envision a world where AI is designed to direct us to the content that optimizes our happiness, performance, and learning, instead of the current default mode of optimizing the income extracted from our attention.

The Center for Humane Technology advocates for the creation of filters that optimize for our humanity instead.

Right now these filters – the ads shown to you on the websites you visit, the social media posts in your feed, gaming environments – are all designed to extract the most money from you. That translates into maximizing your screen time – which multiple studies have shown to be detrimental to human flourishing.

Sometimes our best interests and the interests of the platform align: when we watch motivational speakers on Youtube and Youtube recommends more, it expands our knowledge. The design of future filtering technologies can make this alignment intentional, not coincidental.
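What would a humane filter look like? A minimal sketch, assuming content could be scored for user benefit as well as engagement (both scores hypothetical):

    # Candidate items scored two ways: predicted engagement (minutes)
    # and a hypothetical user-benefit score in [0, 1].
    candidates = {
        "motivational_talk": {"engagement": 6.0, "benefit": 0.9},
        "conspiracy_rabbit_hole": {"engagement": 25.0, "benefit": 0.05},
        "friend_update": {"engagement": 3.0, "benefit": 0.7},
    }

    # Humane feed: rank by benefit first; engagement only breaks ties.
    def humane_rank(items):
        return sorted(items, key=lambda k: (items[k]["benefit"], items[k]["engagement"]), reverse=True)

    print(humane_rank(candidates))
    # ['motivational_talk', 'friend_update', 'conspiracy_rabbit_hole']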

Unfortunately, Youtube is more likely to recommend inflammatory political debates and conspiracy theories – content that makes the user angrier and more upset, and therefore more engaged with the platform.

Since our realm of information choices continues to expand exponentially, we genuinely need the technology of smart filters to tell us what to pay attention to. Our brains biologically cannot deal with millions of options. Even with our attention span reportedly degraded from 12 seconds in the year 2000 to 8 seconds today, the overall amount of attention remains finite at 24 hours a day.

Will future filters feed us content that helps us become better versions of ourselves? Or will Big Tech continue to exploit our human weaknesses, stealthily hacking our evolutionary cognitive biases to make us think, act, and vote as ordered by the manipulators of our data?

That is the big question of today – and certainly of tomorrow. Few of us are asking it.
