TikTok’s Secret ‘Heating’ Button Can Make Anyone Go Viral
TikTok and ByteDance employees regularly engage in “heating,” a manual push that ensures specific videos “achieve a certain number of video views,” according to six sources and documents reviewed by Forbes.
For years, TikTok has described its powerful For You Page as a personalized feed ranked by an algorithm that predicts your interests based on your behavior in the app.
But that’s not the full story, according to six current and former employees of TikTok and its parent company, ByteDance, and internal documents and communications reviewed by Forbes. These sources reveal that in addition to letting the algorithm decide what goes viral, staff at TikTok and ByteDance also secretly hand-pick specific videos and supercharge their distribution, using a practice known internally as “heating.”
“The heating feature refers to boosting videos into the For You feed through operation intervention to achieve a certain number of video views,” an internal TikTok document titled MINT Heating Playbook explains. “The total video views of heated videos accounts for a large portion of the daily total video views, around 1-2%, which can have a significant impact on overall core metrics.”
TikTok has never publicly disclosed that it engages in heating — and while all tech giants engage, to some degree, in efforts to amplify specific posts to their users, they usually clearly label when they do so. Google, Meta, and TikTok itself, for example, have partnered with public health and elections groups to distribute accurate information about COVID-19 and help users find their polling place, making clear disclosures about how and why they chose to promote these messages. (Disclaimer: In a former life, I held policy positions at Facebook and Spotify.)
But sources told Forbes that TikTok has often used heating to court influencers and brands, enticing them into partnerships by inflating their videos’ view count. This suggests that heating has potentially benefitted some influencers and brands — those with whom TikTok has sought business relationships — at the expense of others with whom it has not.
“We think of social media as being very democratizing and giving everyone the same opportunity to reach an audience,” said Evelyn Douek, a professor at Stanford Law School and Senior Research Fellow at the Knight First Amendment Institute at Columbia University. But that’s not always true, she cautioned. “To some degree, the same old power structures are replicating in social media as well, where the platform can decide winners and losers to some degree, and commercial and other kinds of partnerships take advantage.”
Heating also reveals that, at least sometimes, videos on the For You page aren’t there because TikTok thinks you’ll like them; instead, they’re there because TikTok wants a particular brand or creator to get more views. And without labels, like those used for ads and sponsored content, it’s impossible to tell which is which.
Employees have also abused heating privileges. Three sources told Forbes they were aware of instances where heating was used improperly by employees; one said that employees have been known to heat their own or their spouses’ accounts in violation of company policy. Documents reviewed by Forbes showed that employees have heated their own accounts, as well as accounts of people with whom they have personal relationships. According to one document, a heating incident of this type led to an account receiving more than three million views.
Moreover, documents show that staff — including those at TikTok’s parent company, ByteDance, and even contractors working with the company — exercise considerable discretion in deciding which content to promote. A document called TikTok Heating Policy says that employees may use heating to “attract influencers” and “promote diverse content,” but also to “push important information” and “promot[e] relevant videos that were missed by the recommendations algorithms.” Two sources told Forbes employees have often felt left to their own devices to determine whether a video fell within these guidelines.
In response to a detailed set of questions about how and by whom heating has been used, TikTok spokesperson Jamie Favazza wrote: “We promote some videos to help diversify the content experience and introduce celebrities and emerging creators to the TikTok community. Only a few people, based in the U.S., have the ability to approve content for promotion in the U.S., and that content makes up approximately .002% of videos in For You feeds.”
Documentation about heating within TikTok and ByteDance is substantial, but poorly organized. Documents purporting to govern heating exist across multiple teams and regions, including the Content Programming and Content Editorial Team based in Los Angeles, and the Live Platform and Product Operational Teams, based in China. In addition to the MINT Heating Playbook, there are documents titled MINT Heating Operation Policy 101, Heating Quota Guidelines, TikTok Heating Policy and U.S. Heating Strategy Guidelines.
These documents suggest that TikTok and ByteDance initially turned to heating for a mundane, legitimate business purpose: to diversify TikTok’s content away from lip synching and dancing teens, and toward videos that would interest more users. “The purpose of this feature is to promote diverse content, push important information, and support creators,” says the MINT Heating Playbook. “If you make good use of it, heating resources will bring a leverage effect, a small amount of heating resources will bring about growth of midrange users, and a more diverse content pool.”
One source told Forbes that heating has also been used to boost high-profile collaborations between TikTok and external actors, including NGOs and artists being courted by the platform, and that it was also supposed to be used when a creator in one category (e.g. beauty) created a video in another category (e.g. cooking). In those situations, the person said, heating “can help the algorithm find the right audience.”
There is a fraught history of tech platforms using their discretion to increase specific posts’ reach. Human curation has helped platforms create safe experiences for children and keep misinformation in check, but it has also led to claims that companies use curation to impose their own political preferences on users.
For TikTok, fears of political manipulation are tied to concern that the Chinese government could coerce the platform’s Chinese owner, ByteDance, into amplifying or suppressing certain narratives on TikTok. TikTok has acknowledged that it previously censored content critical of China, and last year, former ByteDance employees told BuzzFeed News that another ByteDance app, a now-defunct news aggregator called TopBuzz, had pinned “pro-China messages” to the top of its news feed for U.S. consumers. ByteDance denied the report.
TikTok declined to answer questions about whether employees located in China have ever heated content, or whether the company has ever heated content produced by the Chinese government or Chinese state media.
After this story published, TikTok spokesperson Maureen Shanahan said in a statement: “Under the national security agreement currently being considered by CFIUS, all protocols and processes for promoting videos in the United States would be auditable by CFIUS and third party monitors; only vetted TikTok USDS personnel would have the ability to ‘heat’ videos in the U.S. In addition, source code review by Oracle will verify that there are no alternate means of promoting content.” Oracle did not immediately respond to a request for comment.
TikTok is currently negotiating a contract with the Committee on Foreign Investment in the United States (CFIUS) that it says would address all national security issues raised by the app’s foreign ownership. But an increasing number of lawmakers are seeking to ban TikTok over fears that the CFIUS agreement may be too little, too late. Last month, TikTok parent company ByteDance admitted that a team of employees led by a Beijing-based executive had surveilled the physical location of journalists, including this reporter, in an effort to identify their sources. ByteDance fired employees involved in the surveillance.
In December, TikTok announced that it would add a new panel to recommended videos titled “Why This Video,” which would tell users how a given video had been chosen for them. Examples in the blog post, which touted the new feature as “meaningful transparency,” included explanations like “This video is popular in the United States” and “you are following [account]” — but the post didn’t mention heating.
When asked whether the new feature would disclose when videos had been heated, Favazza wrote, “we’re continuing our work to expand our ‘why this video’ feature and provide more granularity and transparency to content recommendations.”
Douek, the Stanford professor, said disclosing where and how TikTok uses heating “would be a first step” to getting users comfortable with the tool. “But sometimes, the reason why they don’t [use clearer labels] is because transparency allows for criticism.”
This story has been updated with additional comment from TikTok.