The Delhi High Court’s direction to take down URLs of a Yo Yo Honey Singh and Badshah song from all social media and sharing platforms is not just a story about one song from 2006. It is a story about what courts can require of the world’s most powerful digital platforms operating in India, what obligation those platforms have toward content that was uploaded years or decades ago, and what the Honey Singh case signals about the future of platform liability for offensive content in one of the world’s largest internet markets.
For YouTube, Spotify, Instagram, X, Facebook, and every other platform that hosts user-generated or licensed music content in India, the Delhi High Court’s order is worth reading carefully. The principle it establishes, that a court can direct comprehensive URL-level takedowns of content deemed obscene and derogatory toward women regardless of when that content was created or how long it has been available, applies to far more than one Punjabi track from 2006.
What the Platforms Are Being Asked to Do
The court’s direction to take down URLs from all social media platforms and sharing platforms is operationally more complex than it might appear. A song that has been available online for nearly two decades does not exist as a single URL. It exists as thousands of URLs across multiple platforms, uploaded by the original rights holders, by fan accounts, by music aggregators, by clip accounts that have extracted portions for reels and shorts, and by ordinary users who shared it across nearly 20 years of social media history.
A comprehensive URL takedown of this kind requires each platform to identify every instance of the content on its service and remove it, not just the primary licensed upload but every user upload, every clip, every reel, every short, and every share that contains the song or recognisable portions of it. For a song that has existed in digital form since the mid-2000s, that is a significant content moderation undertaking.
Platforms have the technical tools to do this. YouTube’s Content ID system, for example, can identify audio fingerprints across the platform’s entire library and block or remove matching content automatically once a reference file is registered. Spotify can remove licensed tracks from its catalogue with a relatively straightforward administrative action. Instagram and other social platforms have audio identification systems that can detect and remove specific tracks from videos and reels.
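The matching idea behind systems like Content ID can be illustrated with a deliberately simplified sketch. The code below hashes overlapping windows of raw sample values; real fingerprinting systems hash spectral peaks extracted from the audio and are far more robust, so this is only an assumption-laden toy showing why an extracted clip still matches the registered reference.

```python
import hashlib

def fingerprint(samples, window=4):
    # Hash every overlapping window of samples into a compact set of
    # fingerprints. Real systems hash spectral peaks, not raw samples;
    # this toy version only illustrates the matching idea.
    prints = set()
    for i in range(len(samples) - window + 1):
        chunk = bytes(s % 256 for s in samples[i:i + window])
        prints.add(hashlib.sha1(chunk).hexdigest()[:10])
    return prints

def match_score(reference, upload):
    # Fraction of the upload's fingerprints that appear in the reference
    # track. A clip cut out of the track still scores 1.0, which is why
    # reels and shorts containing excerpts can be caught.
    ref, up = fingerprint(reference), fingerprint(upload)
    return len(up & ref) / len(up) if up else 0.0

# A registered reference track and two hypothetical uploads.
track = list(range(200))       # stands in for decoded audio samples
clip = track[50:90]            # an excerpt, as in a reel or short
unrelated = [7] * 200          # audio with no overlap

print(match_score(track, clip))       # 1.0: every clip window matches
print(match_score(track, unrelated))  # 0.0: no windows match
```

Once a reference file is registered, every new and existing upload can be scored against it, which is what makes automated blocking of matching content feasible at platform scale.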
The question is not whether the platforms can comply. The question is whether a court direction of this kind establishes a precedent that requires them to proactively hunt down every instance of identified content rather than simply responding to individual takedown notices.
The Safe Harbour Question
Platform liability for user-uploaded content in India is governed by the Information Technology Act, 2000 and the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, which together provide a safe harbour framework, anchored in Section 79 of the IT Act, under which platforms are not liable for third-party content as long as they remove flagged content expeditiously on gaining actual knowledge and follow the Rules’ due diligence requirements. The framework is broadly similar to the DMCA safe harbour in the United States and the hosting safe harbour of Europe’s e-Commerce Directive, since carried into the Digital Services Act.
The Delhi High Court’s comprehensive URL takedown direction goes beyond the standard safe harbour compliance model of responding to individual complaints. It directs platforms to proactively remove all URLs of identified content, which is closer to a proactive monitoring obligation than a reactive takedown obligation. Whether this direction, if it becomes a permanent order after the May 7 hearing, is consistent with the safe harbour framework under the IT Act is a legal question that will be debated extensively in the proceedings.
For platforms, the distinction matters significantly. A reactive takedown model requires compliance with specific notices and requests. A proactive removal model requires ongoing content surveillance and identification, which is operationally expensive, technically demanding, and raises serious questions about the scope of platform obligations and the costs of operating in the Indian market.
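The asymmetry between the two compliance models can be made concrete with a small sketch. The catalogue, URLs, and content identifiers below are hypothetical; real platforms work against distributed indexes, but the difference in work is the same: a reactive takedown touches only the URLs named in notices, while a proactive removal must sweep, and first identify, everything the platform hosts.

```python
def reactive_takedown(catalogue, noticed_urls):
    # Safe-harbour style compliance: remove exactly the URLs named in
    # takedown notices, nothing more. Work scales with the notices.
    removed = [url for url in noticed_urls if url in catalogue]
    for url in removed:
        del catalogue[url]
    return removed

def proactive_removal(catalogue, banned_content_id):
    # Court-direction style compliance: scan every hosted item and
    # remove all instances of the identified content. Work scales with
    # the catalogue, and each item must first be identified (e.g.
    # audio-fingerprinted) against the banned reference.
    removed = [url for url, cid in catalogue.items()
               if cid == banned_content_id]
    for url in removed:
        del catalogue[url]
    return removed

# Hypothetical catalogue: URL -> content identifier.
catalogue = {
    "https://example.com/v/1": "song-x",   # primary licensed upload
    "https://example.com/v/2": "song-x",   # fan re-upload
    "https://example.com/v/3": "song-x",   # clip account excerpt
    "https://example.com/v/4": "song-y",   # unrelated track
}

print(reactive_takedown(catalogue, ["https://example.com/v/1"]))
# ['https://example.com/v/1']  -- /v/2 and /v/3 remain live

print(proactive_removal(catalogue, "song-x"))
# ['https://example.com/v/2', 'https://example.com/v/3']
```

Under the reactive model, the two re-uploads survive until someone notices them; under the proactive model, compliance is only complete when the entire catalogue has been swept.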
The Old Content Problem
The song’s 2006 to 2007 origin is the aspect of this case with the most far-reaching implications for how platforms think about legacy content.
The internet has a nearly perfect memory. Content that was produced and distributed in limited form before the age of ubiquitous smartphones and social media has been digitised, uploaded, shared, and reshared across the past two decades, giving it a reach and persistence its creators could not have anticipated. A song that might have sold a few thousand physical copies in 2006 may have been streamed millions of times by 2026 across multiple platforms.
The legal and ethical frameworks governing what content is acceptable have also evolved significantly in the same period. Material that was produced and distributed in 2006 without triggering legal challenge may not meet the standards that courts, regulators, and public opinion apply in 2026. The #MeToo movement, the increased judicial attention to content derogatory toward women, and the broader cultural shift in how misogynistic content is evaluated have all changed the landscape in which this old content now exists.
The Delhi High Court’s order essentially says that the age of content does not shield it from contemporary legal standards if that content is actively accessible and circulating today. The fact that the song was made 20 years ago does not mean its current availability on streaming platforms is exempt from the court’s authority to order its removal.
This has significant implications for platforms that host vast libraries of old content across music, video, and social posts. If the principle that courts can direct removal of legacy content based on contemporary standards of obscenity and dignity is established and upheld after May 7, it creates an ongoing compliance obligation for every piece of old content on every platform that might be evaluated differently today than when it was first uploaded.
What the May 7 Hearing Will Determine
The notice issued to respondents and the May 7 listing date means this case is at an early stage. The interim direction to take down URLs has been issued, but the full legal arguments from the respondents, which will include the platforms and potentially the artists or rights holders, have not yet been heard.
The respondents are likely to argue on multiple grounds: artistic freedom and free speech under Article 19(1)(a) of the Constitution; the retrospective application of contemporary standards to content created under different norms; the scope of platform obligations under the IT Act’s safe harbour provisions; and the practical question of what comprehensive URL removal of legacy content actually requires of platform operations.
The court will also have to consider whether the content in question meets the legal threshold for obscenity under Indian law, which requires more than mere offensiveness: since Aveek Sarkar v. State of West Bengal (2014), Indian courts assess obscenity by contemporary community standards rather than the colonial-era Hicklin test.
The May 7 hearing is where these arguments will be made. What the Delhi High Court decides then will determine whether the interim URL takedown direction becomes a permanent order and whether the principles underlying it become established precedent for how Indian courts approach platform liability for offensive legacy content.
The Bigger Picture for India’s Digital Platforms
India has over 800 million internet users and is one of the world’s most important markets for every major digital platform. The regulatory environment for those platforms has been evolving rapidly through the IT Rules of 2021, the proposed Digital India Act, and an increasing number of court directions on content moderation.
The Honey Singh case adds a specific and significant data point to that evolving framework. It demonstrates that Indian courts are willing to issue comprehensive platform-level takedown directions for content deemed derogatory toward women, that the age of content does not provide protection from such directions, and that artistic freedom arguments will not automatically prevail when courts find content crosses the line into obscenity.
For platforms operating in India, the case is a signal that legacy content libraries are not a set-and-forget compliance matter. Content that was uploaded years ago and has been available without challenge may still be subject to court-directed removal if it comes to judicial attention and is found to violate contemporary standards.
One song from 2006. One Delhi High Court order in 2026. And a set of questions about platform liability, legacy content, and artistic freedom whose answers the entire Indian digital industry will be watching for on May 7.