The reason TikTok will likely never face a full ban may be the strongest argument for why it should: It's just too effective at capturing attention.
Beyond issues of geopolitical surveillance and data privacy lies a deeper, less-discussed concern: the quality of the attention platforms like TikTok capture and the methods used to capture it.
Short-form video, infinite scrolling and hyper-targeted algorithms aren't neutral mediums. They shape, and often compromise, the attention they harvest, creating compulsive habits that warrant serious reflection – and which deserve consideration as a dimension of brand safety, quality and consumer well-being.
Hacking the system
For most of the history of media, attention has been earned through value exchange. Something informative, entertaining or useful was offered, and attention was given in return. Early social platforms adhered to this logic. Users connected with friends, discovered events, shared milestones.
But as digital media matured, formats emerged that bypassed this value exchange entirely. They don't just attract or invite attention. They capture it and make it harder to withdraw. The experience becomes less of a choice and more of a reflex.
Design patterns like infinite scroll, autoplay video, push notifications, algorithmic recommendations and short-form loops are optimized to reduce friction and extend engagement. They're engineered with direct reference to behavioral science and persuasive computing, trained to exploit reward loops and cognitive vulnerabilities. That engineering works by hacking our neurochemistry.
Controlled substances
Research has consistently linked heavy social media use, especially on these high-potency platforms, with negative mental health outcomes, from sleep disruption to anxiety, depression and suicidality.
The American Psychological Association has drawn parallels between “problematic social media use” and substance abuse. Both are defined by using when you want to stop, taking extreme or deceptive action to maintain access, having strong cravings and using more than intended.
Content plays a role in algorithmic performance, with the most provocative and polarizing posts rising to the top. But the engagement format itself is the issue.
There’s a growing consensus among policymakers that this engagement format deserves to be treated the way we treat controlled substances like alcohol, nicotine, cannabis or even gambling (e.g., with age restrictions and other controls).
In 2023, the US Surgeon General issued an advisory on social media use by minors. Utah has passed legislation imposing restrictions on algorithmic targeting for minors and sued Meta and TikTok to “Remove features causing excessive use: autoplay, perpetual scrolling and push notifications.” Similar laws are being considered in other states and internationally. While most of these measures focus on minors, the same concerns apply to adults.
The lived experience of many users supports this movement. Ask around: How many people have deleted an app from their phone to take a break? How many describe their usage in terms of control, compulsion, abuse or regret? I know I have.
Or perhaps one only needs to look at our shared language for these experiences, with the mainstreaming of terms like “doom-scrolling,” “binge-watching,” “dopamine-farming” and “brain rot,” just to name a few. We know it’s bad for us, but we can’t stop.
Breaking the cycle
Advertisers can’t stop, either. Platforms have effectively captured consumers’ attention, creating a powerful economic incentive (even an imperative) for brands to spend there.
The platforms promise and deliver immediate, measurable results, all the while conveniently grading their own homework. Every like, share, view, comment, site visit or purchase offers marketers instant gratification, creating their own cycle of dependence.
With billions being spent to buy attention and ensure safe and suitable environments, there’s more than enough reason to question whether compulsive use patterns are good for business. Not all attention is created equal. Media environments that blur the line between consumption and compulsion may not be delivering the kind of attention that advertisers think they’re buying.
A dimension of safety and quality
Mental health is already making its way into the brand safety conversation. Some brand safety providers offer filters that steer advertisers away from content related to mental health, trauma or other emotionally loaded topics. That’s a start. But these tools still rely on a paradigm of brand safety predicated on content adjacency alone.
What’s missing from the conversation is the experience in which the ad is delivered. Not just what the user sees but how they got there. What state of mind they’re in. How much control they have over what they’re consuming or how they consume it. Whether they feel trapped. Whether they’re even able to register a brand message with the cognitive clarity that effective advertising requires.
These questions should be part of how we define quality, suitability, safety and accountability in media. Improving brand safety with deeper contextual and semantic understanding is important, but the context isn’t just on the page. There’s a social and human context, shaped by the medium itself, that needs to be considered as a dimension of safety for brands and audiences alike.
Currently, there’s no established framework for qualifying “healthy” versus “unhealthy” media experiences within targeted advertising, but such a framework would be relatively straightforward to build. Platforms could cap the advertising shown to users exhibiting compulsive scrolling behaviors or flag users engaging at unusual hours. Perhaps certain age groups could be restricted entirely from receiving targeted advertising through addictive media formats.
None of these checks would pose a technological challenge; they simply require developing pragmatic, measurable standards for what constitutes safe and responsible media consumption.
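To make the point that these checks are technologically trivial, here is a minimal sketch in Python of what such gating logic might look like. Every threshold, function name and age cutoff below is hypothetical, chosen for illustration only; real standards would have to be defined by the industry or by regulators.

```python
from datetime import datetime

# Hypothetical thresholds -- illustrative values, not established standards
MAX_SESSION_MINUTES = 45    # sessions longer than this get flagged
MAX_ITEMS_PER_MINUTE = 30   # scroll velocity suggesting compulsive use
QUIET_HOURS = range(1, 5)   # 1:00-4:59 am counts as "unusual hours"

def is_compulsive_session(start: datetime, end: datetime,
                          items_viewed: int) -> bool:
    """Flag a session that shows signs of compulsive scrolling."""
    minutes = (end - start).total_seconds() / 60
    if minutes >= MAX_SESSION_MINUTES:
        return True
    if minutes > 0 and items_viewed / minutes > MAX_ITEMS_PER_MINUTE:
        return True
    # Engagement during quiet hours is treated as a warning sign
    return start.hour in QUIET_HOURS

def should_serve_targeted_ad(session_is_compulsive: bool,
                             user_age: int) -> bool:
    """Gate targeted ads: none for minors, none during flagged sessions."""
    if user_age < 18:
        return False
    return not session_is_compulsive
```

The hard part, as the paragraph above notes, is not writing this logic but agreeing on the thresholds and on who audits them.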
As with privacy, change around compulsive media experiences will either emerge proactively from the advertising industry or eventually be imposed upon it through regulation.
If the goal is to build lasting relationships with real people, then advertisers need to consider not just the message or the content, but the medium itself. The medium is the message, and it’s telling us something. The question is whether we’re willing to listen.
“Data-Driven Thinking” is written by members of the media community and contains fresh ideas on the digital revolution in media.
Follow Broadsheet Communications and AdExchanger on LinkedIn.


