Today, with fairly basic skills and competent prompting of artificial intelligence (AI) tools, novice hackers by the thousands can quickly gain the advanced blueprints needed to become industrial-scale streaming pirates. When we say "industrial scale," we're referring to the raw programming throughput, the operational know-how needed to coordinate people and resources, and the capability to distribute pirated content on a massive scale. Such piracy networks can rival legitimate businesses in their scope and operational complexity (aka Netflix-in-a-box).
In fact, the UK's National Cyber Security Centre (NCSC) has warned that artificial intelligence tools could fuel a surge in cybercrime, predicting that AI will enable hackers of varying skill levels to intensify their activities. The NCSC also cautions that well-resourced criminal groups might develop their own AI systems to create malware capable of bypassing existing security measures. Likewise, a recent IBM X-Force report points to the fast-growing presence of discussions about AI and cybercrime on illicit and dark web forums.
Compared to just a few years ago, the media and entertainment (M&E) space finds itself in a position where the streaming revenues it relies on to propel its reach are threatened by lesser-skilled programmers who've learned how to harness the power of AI to wreak havoc, steal content and disrupt revenue streams.
A huge swath of these would-be content crooks likely would never have considered pirating without AI. But, perhaps even inadvertently, they have realized that with AI prompting and a few other skills they are off to the races. After all, it doesn't take long to piece together well-intended AI knowledge and use it to penetrate high-value content, which inevitably impacts user experiences, ad revenues, subscriber retention and CDN costs. Even working only in their spare time on weekends, these pirates take serious dollars away from streaming operators.
Generative AI technology is user-friendly enough that even less skilled attackers can use it, broadening its appeal across a wider range of cybercriminals. For example, novice video pirates are promoting "jailbreaks" for ChatGPT: carefully crafted prompts and inputs designed to coerce the system into releasing sensitive information, generating inappropriate content, or writing malicious code.
Pirates have already combined AI with popular cloud software tools to swiftly extract premiere movies and television shows from legitimate online platforms and host them on their own unauthorized services, causing billions in annual losses from ads and subscriptions.
How AI Is Contributing to the Trickle-Down Effect From Piracy
Since the onset of the pandemic in 2020, the sheer number of streaming services has exploded, and the number of services used per household has correspondingly increased, spanning nationally known big network brands to niche, lesser-known content providers and organizations. Given this growth in subscribers, high-value content and high-demand live events, and the ever-growing importance of ensuring stellar customer experiences, the delivery and security infrastructure for streaming operators needs to be robust and proactively managed.
If anything degrades the experience for legitimate users, including disruptions caused by content piracy, current customers are far more likely to flee and retention numbers can plummet, dramatically hurting businesses, especially those just starting to offer niche content. Those startups must prove their value to customers who are challenging to attract in the first place, and the smaller the streaming start-up and its audience, the more acutely it feels the effects of piracy. New streaming businesses need to keep as many ongoing subscribers as possible; their operations depend on it. Investments in content protection are especially vital for such organizations and are not something that can be swept under the rug.
Indeed, many people underestimate the extent of piracy's impact, perceiving it as a minor issue that barely registers. Some may even think its effects fall mostly on millionaire actors, actresses, sports figures, or other VIPs. Nothing could be further from the truth. Piracy inflicts significant harm across the entire M&E ecosystem: the lost revenue not only affects those high-profile figures but ripples through the broader industry, limiting opportunities far more dramatically for hair stylists, caterers, set builders, security guards, writers, musicians, makeup artists, and the many other personnel needed to make it all possible. Piracy undermines the financial health and sustainability of the entire sector.
The Content Owner and Operator ‘Protection Misconception’
One big misconception among streaming operators and others in the space is that protecting content is mostly the content owner's (i.e., the studio's) job. That's simply not true, and AI proves it. Without operators, the content typically can't make its way to the consumer in the first place. So while content creators may start the process of protecting their content, it's up to the streaming service delivering it to protect access to that content, including against the kind of AI-assisted attacks described above. In the end, doing so also protects the operator's own revenue.
Operators do this by properly securing the platforms on which the content is hosted, distributed, or sold; safeguarding against breaches that could lead to content theft or unauthorized access; and implementing systems that continually uphold content integrity as required by industry standards. That said, effective content protection is a collaborative effort that requires active participation and coordination among all parties involved.
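To make "protecting content access" concrete, consider one common building block: tokenized delivery, where stream URLs expire quickly so that a leaked link cannot feed an unauthorized re-stream indefinitely. The sketch below is a minimal, hypothetical illustration only; the key handling, URL format, and function names are assumptions for the example, not any particular operator's or CDN's API.

import hashlib
import hmac
import time
from urllib.parse import urlencode

# Hypothetical shared secret; in practice it is stored securely and rotated.
SECRET_KEY = b"rotate-me-regularly"

def signed_stream_url(base_url: str, path: str, user_id: str, ttl_seconds: int = 300) -> str:
    """Issue a stream URL that expires after ttl_seconds."""
    expires = int(time.time()) + ttl_seconds
    # Bind the token to the asset path, the subscriber, and the expiry time.
    message = f"{path}|{user_id}|{expires}".encode()
    token = hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()
    query = urlencode({"user": user_id, "expires": expires, "token": token})
    return f"{base_url}{path}?{query}"

def is_valid(path: str, user_id: str, expires: int, token: str) -> bool:
    """Edge-side check: reject expired or tampered requests."""
    if time.time() > expires:
        return False
    expected = hmac.new(SECRET_KEY, f"{path}|{user_id}|{expires}".encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token)

if __name__ == "__main__":
    print(signed_stream_url("https://cdn.example.com", "/live/event123/manifest.m3u8", "subscriber-42"))

On its own, short-lived signed URLs are only one layer; in practice they sit alongside DRM, watermarking, and monitoring, which is why the collaborative approach described above matters.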
As AI makes pirating easier, all parties need to recognize it as one more reason to address content protection upfront rather than as an afterthought that can end up significantly impacting their bottom line.
The ease with which AI equips even the most amateur hacker with sophisticated tools underscores a pressing need for all stakeholders in the M&E industry to fortify their defenses. Only through collective vigilance and robust protective measures can the industry stem the tide of piracy that AI has made more accessible, preserving both their businesses and the integrity of the content ecosystem.
Maria “Mascha” Malinkowitsch currently serves as director of product management at Verimatrix (www.verimatrix.com).