India Brings AI-Generated Content Under Formal Regulation with IT Rules Amendment
As Sandeep Shukla puts it, “Well, I can say that Indian Companies so far has been rather negligent of customer's privacy. Anywhere you go, they ask for your mobile number.” The DPDP Act is designed to ensure that such casual indifference to personal data does not survive the next decade. Below are eight fundamental ways the DPDP Act will change how Indian companies handle data in 2026, with real-world implications for businesses, consumers, and the digital economy.
According to Shashank Bajpai, CISO & CTSO at YOTTA, “The DPDP Act (2023) becomes operational through Rules notified in November 2025; the result is a staggered compliance timetable that places 2026 squarely in the execution phase. That makes 2026 the inflection year when planning becomes measurable operational work and when regulators will expect visible progress.” In 2026, privacy decisions will increasingly sit with boards, CXOs, and risk committees. Metrics such as consent opt-out rates, breach response time, and third-party risk exposure will become leadership-level conversations, not IT footnotes.
As Gauravdeep Singh, State Head (Digital Transformation), e-Mission Team, MeitY, explains, “Data Principal = YOU.” Whether it’s a food delivery app requesting location access or a fintech platform processing transaction history, individuals gain the right to control how their data is used, and to change their mind later.
Shukla highlights how deeply embedded poor practices have been: “Hotels take your aadhaar card or driving license and copy and keep it in the drawers inside files without ever telling the customer about their policy regarding the disposal of such PII data safely and securely.” In 2026, undefined retention is no longer acceptable.
As Shukla notes, “The shops, E-commerce establishments, businesses, utilities collect so much customer PII, and often use third party data processor for billing, marketing and outreach. We hardly ever get to know how they handle the data.” In 2026, companies will be required to audit vendors, strengthen contracts, and ensure processors follow DPDP-compliant practices, because liability remains with the fiduciary.
As Bajpai notes, “The practical effect is immediate: companies must move from policy documents to implemented consent systems, security controls, breach workflows, and vendor governance.” Tabletop exercises, breach simulations, and forensic readiness will become standard, not optional.
As Bajpai observes, “This is not just regulation; it is an economic strategy to build domestic capability in cloud, identity, security and RegTech.” Consent Managers, auditors, privacy tech vendors, and compliance platforms will grow rapidly in 2026. For Indian startups, DPDP compliance itself becomes a business opportunity.
One Reddit user captured the risk succinctly: “On paper, the DPDP Act looks great… But a law is only as strong as public awareness around it.” Companies that communicate transparently and respect user choice will win trust. Those that don’t will lose customers long before regulators step in.
As Hareesh Tibrewala, CEO at Anhad, notes, “Organizations now have the opportunity to prepare a roadmap for DPDP implementation.” For many businesses, however, the challenge lies in turning awareness into action, especially when clarity around timelines and responsibilities is still evolving. The concern extends beyond citizens to companies themselves, many of which are still grappling with core concepts such as consent management, data fiduciary obligations, and breach response requirements. With penalties tiered by the nature and severity of violations, ranging from significant fines to amounts running into hundreds of crores, this lack of understanding could prove costly. In 2026, regulators will no longer be looking for intent; they will be looking for evidence of execution. As Bajpai points out, “That makes 2026 the inflection year when planning becomes measurable operational work and when regulators will expect visible progress.”
As Sandeep Shukla cautions, “It will probably take years before a proper implementation at all levels of organizations would be seen.” But the direction is clear. Personal data in India can no longer be treated casually. The DPDP Act marks the end of informal data handling and the beginning of a more disciplined, transparent, and accountable digital economy.
As Leaver explains, “The social media ban only really addresses one set of risks for young people, which is algorithmic amplification of inappropriate content and the doomscrolling or infinite scroll. Many risks remain. The ban does nothing to address cyberbullying since messaging platforms are exempt from the ban, so cyberbullying will simply shift from one platform to another.”
Leaver also noted that restricting access to popular platforms will not drive children offline. With the ban in place, young users will explore whatever digital spaces remain, which could be less regulated and potentially riskier.
“Young people are not leaving the digital world. If we take some apps and platforms away, they will explore and experiment with whatever is left. If those remaining spaces are less known and more risky, then the risks for young people could definitely increase. Ideally the ban will lead to more conversations with parents and others about what young people explore and do online, which could mitigate many of the risks.”
From a broader perspective, Leaver emphasized that the ban on social media will only be fully beneficial if accompanied by significant investment in digital literacy and digital citizenship programs across schools:
“The only way this ban could be fully beneficial is if there is a huge increase in funding and delivery of digital literacy and digital citizenship programs across the whole K-12 educational spectrum. We have to formally teach young people those literacies they might otherwise have learnt socially, otherwise the ban is just a 3 year wait that achieves nothing.”
He added that platforms themselves should take a proactive role in protecting children:
“There is a global appetite for better regulation of platforms, especially regarding children and young people. A digital duty of care which requires platforms to examine and proactively reduce or mitigate risks before they appear on platforms would be ideal, and is something Australia and other countries are exploring. Minimizing risks before they occur would be vastly preferable to the current processes which can only usually address harm once it occurs.”
Looking at the global stage, Leaver sees Australia's social media ban as a potential learning opportunity for other nations:
“There is clearly global appetite for better and more meaningful regulation of digital platforms. For countries considering their own bans, taking the time to really examine the rollout in Australia, to learn from our mistakes as much as our ambitions, would seem the most sensible path forward.”
Other specialists continue to warn that the ban on social media could isolate vulnerable teenagers or push them toward more dangerous, unregulated corners of the internet.
A lengthy standoff over privacy rights versus child protection ended Wednesday when EU member states finally agreed on a negotiating mandate for the Child Sexual Abuse Regulation, a controversial law requiring online platforms to detect, report, and remove child sexual abuse material, even as critics warn the measures could enable mass surveillance of private communications.
The Council agreement, reached despite opposition from the Czech Republic, Netherlands, and Poland, clears the way for trilogue negotiations with the European Parliament to begin in 2026 on legislation that would permanently extend voluntary scanning provisions and establish a new EU Centre on Child Sexual Abuse.
The Council's position introduces three risk categories of online services, based on objective criteria including service type, with authorities able to oblige providers classified as high-risk to contribute to developing technologies that mitigate risks relating to their services. The framework shifts responsibility to digital companies to proactively address risks on their platforms.
One significant provision permanently extends voluntary scanning, a temporary measure first introduced in 2021 that allows companies to voluntarily scan for child sexual abuse material without violating EU privacy laws. That exemption was set to expire in April 2026 under current e-Privacy Directive provisions.
At present, providers of messaging services may voluntarily check content shared on their platforms for online child sexual abuse material, then report and remove it. According to the Council position, this exemption will continue to apply indefinitely under the new law.
Danish Justice Minister Peter Hummelgaard welcomed the Council's agreement, stating that the spread of child sexual abuse material is "completely unacceptable." "Every year, millions of files are shared that depict the sexual abuse of children. And behind every single image and video, there is a child who has been subjected to the most horrific and terrible abuse," Hummelgaard said.
The legislation provides for the establishment of a new EU agency, the EU Centre on Child Sexual Abuse, to support implementation of the regulation. The Centre will act as a hub for child sexual abuse material detection, reporting, and database management, receiving reports from providers, assessing risk levels across platforms, and maintaining a database of indicators.
The EU Centre will assess and process information supplied by online providers about child sexual abuse material identified on services, creating, maintaining and operating a database for reports submitted by providers. The Centre will share information from companies with Europol and national law enforcement bodies, supporting national authorities in assessing the risk that online services could be used to spread abuse material.
Online companies must assist victims who want child sexual abuse material depicting them removed, or access to such material disabled. Victims can ask for support from the EU Centre, which will check whether the companies involved have removed or disabled access to the items victims want taken down.
The breakthrough comes after months of stalled negotiations and a postponed October vote, when Germany joined a blocking minority opposing what critics commonly call "chat control." Berlin argued the proposal risked "unwarranted monitoring of chats," likening it to opening other people's private letters.
Critics from Big Tech companies and data privacy NGOs warn the measures could pave the way for mass surveillance, as private messages would be scanned by authorities to detect illegal images. The Computer and Communications Industry Association stated that EU member states made clear the regulation can only move forward if new rules strike a true balance protecting minors while maintaining confidentiality of communications, including end-to-end encryption.
Former Pirate MEP Patrick Breyer, who has been advocating against the file, characterized the Council endorsement as "a Trojan Horse" that legitimizes warrantless, error-prone mass surveillance of millions of Europeans by US corporations through cementing voluntary mass scanning.
A European Parliament study heavily critiqued the Commission's proposal, concluding that no current technological solution can detect child sexual abuse material without high error rates affecting all messages, files, and data on a platform. The study also concluded the proposal would undermine end-to-end encryption and the security of digital communications.
Statistics underscore the urgency: 20.5 million reports and 63 million files of abuse material were submitted to the National Center for Missing and Exploited Children's CyberTipline last year, with online grooming increasing 300 percent since negotiations began. Every half second, an image of a child being sexually abused is reported online.
Sixty-two percent of abuse content flagged by the Internet Watch Foundation in 2024 was traced to EU servers, with at least one in five children in Europe a victim of sexual abuse.
The Council position allows trilogue negotiations with the European Parliament and Commission to start in 2026. Those negotiations will need to conclude before the already postponed expiry of the current e-Privacy exemption under which companies can conduct voluntary scanning. The European Parliament reached its negotiating position in November 2023.