The following is a guest contribution by Jordan Mitchell, founder of integrated communications and content agency Growth Stack Media. Opinions expressed are the author’s own.
Throughout my career, I’ve seen firsthand the challenges content creators face as they strive to make a living while pursuing their passion. While honing your technical skills and storytelling abilities is important, it’s equally important for creators to understand the business side of the creative marketing industry.
As emerging technologies such as artificial intelligence (AI) become powerful enough to replicate human voices without consent or create compelling short films that would otherwise require a skilled professional to produce, proper attribution and collaboration between creators and the marketers who partner with them become increasingly important.
While laws like the NO FAKES Act are on the way, I believe they are not enough to protect the integrity of branded content created by individual creators, in-house teams of marketers, and creative agencies. To help bridge this gap, I’d like to share some tips on how creators and marketers can protect their rights while effectively leveraging AI tools to thrive in this new era.
Working closely together
Establish clear guidelines regarding content usage and attribution. Maintain open communication so all parties understand and agree to the terms of use.
Clearly define the scope of work from the start to establish a mutual understanding of deliverables. If a client is only paying for specific content deliverables, creators are not obligated to provide source files or raw materials. It’s important to agree on how the content will be used, where it will be distributed, and any licensing requirements.
Here’s why: If a creator delivers approved final assets and the client then starts making its own edits, that could constitute a breach of contract. The client should instead pay the creator for additional revisions. Even setting the contract aside, client-side edits are inadvisable, because clients often lack the technical skills to maintain the integrity of the content.
It’s also important to discuss distribution plans and specifically ask your client to tag you in their social media posts when promoting their content externally. If they don’t agree, let them know you plan to re-share their content on your own channels in the future and use the work you’ve created for them as part of your own portfolio.
If the deliverables will be used for internal purposes or for specific offline use cases, ensure that the appropriate documentation, such as a non-disclosure agreement (NDA), is in place to avoid any future confusion. This applies to any form of digital media.
Include a clause stating that the creator must be credited regardless of how the work is used, and negotiate royalties or other forms of compensation if the work is used for AI training or other derivative works.
Leverage technology
To ensure your exported content is protected, it’s important to leverage the tools available to you in addition to your contract: Make it a priority to include metadata with all exported assets, embedding information such as authorship and copyright directly into the media file itself.
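To make the idea concrete, here’s a minimal sketch of embedding authorship metadata directly into a media file. It writes a standard tEXt chunk into a PNG using only Python’s standard library; the function name and the “Copyright” keyword are illustrative, and in practice creators would more likely use an export setting or a tool like ExifTool rather than hand-rolled code.

```python
import struct
import zlib

def add_png_text_chunk(png_bytes: bytes, keyword: str, text: str) -> bytes:
    """Insert a tEXt metadata chunk (e.g. authorship or copyright info)
    into a PNG, right after its IHDR header chunk."""
    assert png_bytes[:8] == b"\x89PNG\r\n\x1a\n", "not a PNG file"
    # IHDR is always the first chunk: 4 (length) + 4 (type) + 13 (data) + 4 (CRC)
    ihdr_end = 8 + 4 + 4 + 13 + 4
    # tEXt chunk data is keyword, a NUL separator, then the text (Latin-1)
    data = keyword.encode("latin-1") + b"\x00" + text.encode("latin-1")
    chunk = struct.pack(">I", len(data)) + b"tEXt" + data
    # CRC covers the chunk type and data, per the PNG specification
    chunk += struct.pack(">I", zlib.crc32(b"tEXt" + data) & 0xFFFFFFFF)
    return png_bytes[:ihdr_end] + chunk + png_bytes[ihdr_end:]
```

The metadata survives copying and re-uploading of the file, but not re-encoding or screenshots, which is why metadata works best alongside a contract rather than in place of one.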
Content authentication efforts like TikTok’s partnership with Adobe are a step in the right direction, but they can’t prevent tampering: watermarks can be easily removed with freely available editing tools, and AI can alter videos in ways that make them extremely difficult to trace back to their original source.
While there is no perfect solution, blockchain technology holds great promise: By using the blockchain to track the provenance and ownership of content, creators can establish a transparent record of their work that is much harder to tamper with than traditional methods.
This is already happening with NFTs, helping creators protect their digital works by proving ownership and preventing unauthorized duplication, and similar blockchain-based solutions could be applied to other types of content to ensure creators are credited when their work is used.
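The core primitive behind this kind of provenance tracking can be sketched in a few lines: fingerprint the content with a cryptographic hash, then chain each ownership record to the previous one so earlier records can’t be quietly rewritten. This is a toy illustration, not a real blockchain; the `ProvenanceLedger` class and its method names are invented for this example.

```python
import hashlib
import json

def fingerprint(content: bytes) -> str:
    """A SHA-256 digest uniquely identifies this exact file's bytes."""
    return hashlib.sha256(content).hexdigest()

class ProvenanceLedger:
    """Toy hash-chained ledger: each entry commits to the previous one,
    so altering any earlier record invalidates every later hash."""

    def __init__(self):
        self.entries = []

    def register(self, creator: str, content: bytes) -> dict:
        prev = self.entries[-1]["entry_hash"] if self.entries else "0" * 64
        record = {"creator": creator,
                  "content_hash": fingerprint(content),
                  "prev": prev}
        # Hash the record itself so later tampering is detectable
        record["entry_hash"] = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()).hexdigest()
        self.entries.append(record)
        return record

    def verify(self) -> bool:
        """Recompute every hash in order; any edited record breaks the chain."""
        prev = "0" * 64
        for e in self.entries:
            body = {k: e[k] for k in ("creator", "content_hash", "prev")}
            expected = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if e["prev"] != prev or e["entry_hash"] != expected:
                return False
            prev = e["entry_hash"]
        return True
```

A real blockchain adds distributed consensus on top of this chaining, which is what makes the record hard for any single party to tamper with.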
Obey the law
The recently introduced NO FAKES Act aims to protect all individuals, including artists, musicians, and actors, from having their likeness replicated by AI without their permission. But media attention to high-profile cases, such as Scarlett Johansson’s concerns that ChatGPT appeared to replicate her voice, has obscured the vulnerability of ordinary creators and the general public.
Even creators who don’t mind having their voices or likenesses reproduced will likely struggle to receive proper credit and compensation under the current NO FAKES Act, and the government has a poor track record of preventing piracy and enforcing copyright laws, especially as new technology continues to outpace regulations.
The infamous file-sharing service Napster was one of the few defendants in copyright infringement cases in the past few decades to actually be punished, and that was in 2001. If the government had a hard time cracking down on MP3 sharing, how can we expect regulators to keep up with the pace of AI development without a focus on public-private partnerships?
Stay ahead of the curve
Be aware that generative AI tools typically scrape data and may use your original work without your consent to train large language models (LLMs). Before using generative AI, read the terms of use to understand how your data will be handled. Be careful with the prompts and the content you upload to these tools; there may be clauses that grant the company consent to reuse your content. If you have an NDA in place, this could be problematic for both you and your client.
AI-generated content is becoming increasingly sophisticated. For example, Luma AI recently released Dream Machine, an AI model that quickly creates high-quality, realistic videos from text and images. The technology has an incredible understanding of real-world physics, producing videos that are nearly indistinguishable from live-action footage. As AI evolves, it becomes increasingly difficult to distinguish original content from AI-generated replicas.
Stay up to date on the latest AI technologies and their potential impact on the creative industries. Join trade associations and unions that represent creators’ interests, such as SAG-AFTRA, to partner with professional organizations and advocate for creators’ rights.
Look on the bright side
Despite the risks that AI poses to creators, it’s not all doom and gloom. AI tools can offload time-consuming tasks throughout the creative process, lowering the barrier to entry and allowing more people to freely express themselves. No matter the type of content, AI can help speed up each stage of the creative process.
From generating ideas and outlines to creating drafts, enhancing visuals, and selecting key excerpts from longer pieces of text, AI tools can be extremely helpful. But the human element remains crucial: it takes a person to use these tools effectively, bring everything together, optimize budgets, and deliver unique pieces that drive business impact by inspiring audiences to take action or make purchasing decisions.
Ultimately, for AI to be a positive force for the creative community, we need stronger protections for creators’ rights. Ensuring proper credit and compensation is fundamental to building a thriving symbiotic relationship between human creativity and AI.