Artificial Intelligence (AI) and Machine Learning
The AMA recognizes that AI is increasingly integrated into creative workflows and that its use introduces new considerations around rights, value, and risk. This resource is intended to help the creative community establish shared language and practical frameworks for evaluating AI use, negotiating terms, and protecting creative work.
As AI tools and policies continue to evolve, the AMA believes these conversations are best advanced collaboratively. Member feedback and real-world experience will continue to inform how this guidance develops over time.
To support these conversations, the AMA has organized the guidance below into a series of practical reference sections that address common AI considerations in creative production. Each section is intended to help members build shared understanding, ask more informed questions, and make clearer decisions as AI tools become more embedded in creative workflows.
*Please share any AI news or updates you learn of that are not listed here: info@artistmanagementassociation.org
Information on this page was last updated: April 14, 2026. Please verify details independently, as laws/policies may have changed since this update.
If you are able, please donate. Donations cover AI-related efforts, including attorneys' fees, lobbying, and guest speakers. No donation is too small or too big.
-
As AI becomes more embedded in creative workflows, it’s essential for artists, agencies, and clients to understand how different AI systems handle data, intellectual property, and risk. One of the most important distinctions is whether a tool operates as a walled garden or an open system, as this directly impacts how creative inputs are used, stored, and protected. Understanding this distinction helps the creative community align on expectations and establish clearer standards as AI tools continue to evolve.
-
As AI becomes increasingly integrated into creative production, it’s critical to define how commissioned work may be used and where boundaries should be set. Not all AI use carries the same level of risk, value, or impact on creative rights, and each category requires thoughtful consideration. The framework below outlines common AI use tiers, how they typically function in practice, and when additional fees, protections, or contract terms may be warranted.
-
This section offers a practical set of questions to help clarify how AI is being used, what new value it creates, and where responsibility sits, so decisions around permissions, fees, and risk can be made intentionally.
-
As it currently stands, work created by AI alone cannot be copyrighted, so licensing it to a commissioning party and collecting usage fees for the work is difficult, if not impossible. Additionally, no system exists to compensate artists for work that has been ingested by LLMs without consent.
U.S. Copyright Office's Stance: The U.S. Copyright Office has explicitly stated that it will not register works generated solely by AI. It emphasizes that copyright law protects only "original works of authorship" created by humans.
Lack of Human Authorship: The core issue is the absence of a human author in the creation process of AI-generated art. While AI can be used to generate images, the creative input and decision-making are attributed to the AI, not a human.
Public Domain: Consequently, AI-generated works without sufficient human input are considered to be in the public domain, meaning no one can claim copyright ownership.
Based on the current Administration's AI Action Plan and recent remarks by the President, we cannot predict exactly how these regulations may change. However, the Administration, in its push for AI dominance, favors deregulation and does not appear to support artists' rights. Our expectation is that any guardrails enacted to protect copyrighted materials, as well as any path for artists to obtain a copyright in newly created AI-generated work, will both be eliminated.
-
The following sample language is provided as a reference to help members translate the concepts discussed in this resource into contractual terms. It reflects provisions that artists and their representatives may include in their own terms, invoices, estimates, or statements of work when addressing artificial intelligence, data learning, and emerging technologies such as NFTs.
Members should be aware that clients may require their own contractual terms, which are often set forth in purchase orders (POs), master service agreements (MSAs), or other governing documents that artists are asked to sign. It is important to clearly understand which terms control the engagement and to ensure that any AI-related permissions, restrictions, or protections that have been negotiated are accurately reflected in the governing agreement.
This language is not intended to be used verbatim in all circumstances. Members should review and adapt it to reflect the specific facts of each project and are strongly encouraged to consult with their own legal counsel to confirm that all terms are consistent, enforceable, and aligned with the agreed scope of work.
Sample Contract Language
LIMITATIONS ON ARTIFICIAL INTELLIGENCE AND NFTs: Unless explicitly stated otherwise in the applicable invoice or SOW, Client may not use the Deliverables (i) in connection with an immutable digital asset intended for sale, including but not limited to non-fungible tokens (NFTs); or (ii) in connection with machine learning, artificial intelligence, or biometric technology, including but not limited to the use or ingestion of caption information, keywords, or other metadata associated with the Deliverables. Client may not use AI or similar tools to create derivative works, substantially alter the appearance, poses, expressions, or physical characteristics of talent, generate new or composite images, or materially alter the scenario from how it was captured on camera. If AI is used in the post-production process, the photography must remain behind a secure, project-specific garden wall and must not be ingested into any general LLM. For clarity, this restriction does not prohibit Client from using Deliverables in such a manner for its own internal, non-commercial purposes.
-
Perfect 10 v. Google (9th Cir. 2007)
Perfect 10 sued Google over Image Search, alleging that thumbnail copies and the display of full-size images via in-line linking infringed its copyrights. The Ninth Circuit held that Google’s thumbnails were fair use because they served a transformative search function, and adopted the “server test,” finding no direct infringement for embedded images not stored on Google’s servers. Rights holders argued this weakened online enforcement, while platforms saw it as clarifying how search and linking can lawfully display images. An influential precedent that still shapes web practices.
Grecco v. RADesign, Inc. (SCOTUS cert. denied 2025)
Photographer Michael Grecco sued RADesign for unauthorized use of his images, raising a key question about the Copyright Act’s three-year statute of limitations, whether claims accrue under the discovery rule (when the owner discovers or should discover the infringement) or the injury rule (when the infringement occurs). After the Second Circuit allowed the case to proceed without rejecting the discovery rule, defendants sought Supreme Court review; the Court declined to hear it, leaving the discovery rule widely available to rights holders even as debate over accrual continues.
Warner Bros. v. Midjourney
Warner Bros. has filed a lawsuit against the AI company Midjourney, accusing it of illegally using copyrighted characters like Superman, Bugs Bunny, and Batman to train its system and generate new images and videos. The studio argues this misleads users and harms its intellectual property, while seeking up to $150,000 per violation. Midjourney denies wrongdoing, claiming its training methods are protected “fair use” and comparing its system to a search engine. This case follows similar lawsuits from Disney and Universal and could have major implications for how AI tools use copyrighted material.
Disney, Warner Bros, Universal sue MiniMax
Disney, Warner Bros. Discovery, and Universal Pictures have filed a joint lawsuit against Chinese AI company MiniMax, alleging its tool Hailuo AI illegally uses their movies and TV shows to generate images and videos of copyrighted characters like Minions, Superman, and Darth Vader. The studios call the practice an existential threat to Hollywood, warning the technology could soon create full-length unauthorized films. They’re seeking damages and a court order to stop MiniMax from exploiting their works.
Encyclopedia Britannica and Merriam-Webster sue Perplexity
Encyclopedia Britannica and Merriam-Webster have sued AI company Perplexity, alleging its “answer engine” copies and summarizes their content without permission. They argue this not only infringes copyright and trademarks but also misleads users by attributing inaccurate AI-generated information to their brands. The lawsuit seeks damages and a halt to Perplexity’s practices, highlighting growing concerns that AI tools can undercut publishers and creators by diverting traffic and revenue away from original sources.
Anthropic Settlement - $1.5 Billion
A group of authors sued Anthropic, alleging it used pirated books to train its AI chatbot, Claude. In a major ruling, the court said training on legally obtained books can qualify as fair use, but storing and using pirated copies is not. Anthropic agreed to a $1.5 billion settlement and must destroy the pirated files.
This case sets an important precedent. AI companies may defend training on lawful data as fair use, but they face serious liability if they rely on pirated or improperly sourced material. For creators, it underscores the need to protect how their work is acquired and used in AI training.
-
U.S. Legal Requirements: None Yet
As of September 2025, there is no federal law in the U.S. explicitly requiring brands to disclose that artwork was generated using AI. Copyright law places a strong emphasis on human authorship: works created entirely by AI are not eligible for copyright, and AI-generated material must be disclosed if submitted for registration.
Upcoming / International Trends
In the EU, the AI Act (with obligations phasing in from 2025) mandates disclosure and watermarking for AI-generated content. A U.S. congressional proposal, the AI Labeling Act of 2023, would require similar disclosures, but it has not been enacted.
Brand Risk & Best Practices
Even without a legal requirement, transparency is critically important. Brands that disclose AI usage, especially in marketing or creative contexts, establish trust and align with growing consumer and regulatory expectations. Platforms like Google and Instagram are already pushing for more AI content labeling for clarity and user trust.
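One concrete labeling mechanism that already exists is IPTC's Digital Source Type vocabulary, which defines a value ("trainedAlgorithmicMedia") for media created by generative AI. The sketch below is a minimal illustration of assembling such a disclosure label; the helper function and field layout are our own assumptions, and actually embedding the values into an image file would be done with a metadata tool such as exiftool.

```python
# Minimal illustration of an AI-disclosure label using the IPTC
# Digital Source Type vocabulary. The helper function is hypothetical;
# writing the values into a file requires a metadata tool.

# IPTC NewsCodes URI for fully AI-generated media:
TRAINED_ALGORITHMIC_MEDIA = (
    "http://cv.iptc.org/newscodes/digitalsourcetype/trainedAlgorithmicMedia"
)

def ai_disclosure_metadata(creator: str, tool_used: str) -> dict:
    """Assemble XMP-style fields declaring that an image is AI-generated."""
    return {
        "Iptc4xmpExt:DigitalSourceType": TRAINED_ALGORITHMIC_MEDIA,
        "dc:creator": creator,          # who is credited for the asset
        "xmp:CreatorTool": tool_used,   # the generative tool, if disclosed
    }

meta = ai_disclosure_metadata("Studio X", "Generative model (unspecified)")
```

A label like this lets downstream platforms and clients detect AI provenance programmatically rather than relying on caption text.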
-
Closed Garden
A restricted environment where data is used only for a specific purpose and is prevented from influencing or flowing into broader systems or models.
Sandboxing
Running data or processes in an isolated environment so they can be used safely without affecting the main system, model, or production environment.
Data Isolation
The technical practice of keeping a dataset fully separated from other datasets to prevent cross-contamination, reuse, or unintended learning.
Training Exclusion
A restriction ensuring that data is not used to train, fine-tune, or improve any general or future AI models.
Fine-Tuning
A process where a model is further trained on specific data to improve its performance; excluded in protected or closed-garden use cases.
Inference-Only Use
Using data solely to generate outputs in the moment, without storing, learning from, or retaining the data afterward.
Model Contamination
The unintended incorporation of proprietary or licensed data into a broader model's behavior, outputs, or internal parameters.
Embeddings
Numerical representations of data (such as images or text) used by models to understand similarity; embeddings can unintentionally persist learnings if not restricted.
Weight Updates
Changes to a model's internal parameters that occur during training or fine-tuning; prohibited when protecting licensed content.
Data Retention
How long data is stored after use; closed-garden systems typically require limited retention or deletion after the project ends.
Ephemeral Processing
Data is processed temporarily and discarded immediately after use, leaving no lasting trace in the system.
Controlled Scope of Use
A clearly defined limitation on how, where, and for what purpose data may be used.
Derivative Works (AI Context)
New outputs generated from original content, where ownership and usage rights depend on how the source data is handled and isolated.
Generalized Model
A shared or production-level AI model that learns from many datasets; licensed content should not be absorbed into these models.
Data Leakage
The accidental exposure or reuse of data outside its approved environment or purpose.
IP Containment
Safeguards that prevent intellectual property from being absorbed, replicated, or reused beyond agreed-upon boundaries.
-
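To make terms like training exclusion, inference-only use, and ephemeral retention concrete, here is a minimal, hypothetical sketch of how a closed-garden request to an AI vendor might be expressed. The function, field names, and flags are illustrative assumptions, not any real vendor's API; they simply show the kinds of commitments a manager might look for in a vendor's documentation or contract.

```python
# Hypothetical sketch of a closed-garden request. All field names are
# illustrative assumptions, not a real vendor's interface.

def closed_garden_request(asset_id: str, prompt: str) -> dict:
    """Build an inference-only request that mirrors the glossary terms above."""
    return {
        "asset_id": asset_id,
        "prompt": prompt,
        "usage": {
            "mode": "inference_only",    # outputs only; no learning from inputs
            "allow_training": False,     # training exclusion: no fine-tuning or weight updates
            "retention": "ephemeral",    # discard inputs once the response is returned
            "scope": "project-specific", # controlled scope of use, tied to one engagement
        },
    }

req = closed_garden_request("IMG-001", "extend background sky")
```

The point of spelling this out is that each glossary concept should map to a verifiable setting or contractual term, rather than an informal assurance.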
The Human Artistry Campaign is a coalition of over 180 member organizations advocating for responsible AI development and use that respects the rights of creators and rights holders and strives to enhance human creativity, not replace it.
The Digital Creators Coalition (DCC) is a group of associations, companies, and organizations that represent individual creators, independent producers, small- and medium-size enterprises (SMEs), large businesses, and labor organizations from the American creative communities. Its members contribute significantly to U.S. GDP, exports, and employment, collectively employing or representing millions of American creators and contributing billions of dollars to the U.S. economy.
Donations to 501(c)(6) organizations are generally not tax-deductible as charitable contributions for federal income tax purposes. This is because 501(c)(6) entities, such as business leagues, chambers of commerce, and professional associations, are not classified as charities by the Internal Revenue Service (IRS).
However, if a business makes a payment to a 501(c)(6) organization for a purpose that is directly related to the business's own trade or business, such as a membership fee that provides direct business benefits like networking opportunities, advertising, or access to business information, that payment may be deductible as a business expense. It's important to note that the deductibility as a business expense is subject to the ordinary and necessary rules of business expenses under IRS regulations.
It's always recommended to consult with a tax professional or accountant to understand the specific tax implications for your situation and to ensure compliance with IRS rules and regulations when making donations.