
Artists say AI scraping without permission isn't innovation, it's exploitation

The Government has proposed an exception to copyright laws to allow AI developers to source material from thousands of Australian artists and writers. Penelope Benton reports.

THE FEDERAL GOVERNMENT'S recent three-day economic reform roundtable concluded with unions and tech companies reportedly exploring a model for compensating creators for AI training data. While this sounds like progress, the National Association for the Visual Arts (NAVA) asserts that any agreement must be firmly grounded in existing copyright law, not in workarounds.

Australia's Copyright Act provides essential protections for artists. It ensures that the use of artworks, whether for publication, reproduction, or AI training, requires permission and payment. Yet in early August, the Productivity Commission released an interim report proposing a new exception to copyright for text and data mining. Similar to frameworks adopted overseas, this exception would give AI developers legal cover to use copyrighted material without permission or compensation.

NAVA has strongly opposed this recommendation and joined arts peak bodies and thousands of artists and writers in rejecting the proposal, which would entrench a system where creators are not asked, not paid, and not protected.

Enforcement is already a major challenge for individual copyright holders. The argument that there are potential avenues for compensation under a "fair dealing" exception for text and data mining misses the point: artists would still bear the burden of identifying and pursuing unauthorised use of their work. Power imbalances, platform opacity, and legal complexity make this all but impossible for most. These are not flaws of copyright law itself, but of the systems that fail to uphold it.

A recent survey conducted by NAVA of more than 890 visual artists found:

- over 80% believe AI poses risks to their income, practice, and moral rights;
- 73% support a compensation scheme when work is used to train AI;
- many reported that their work and personal data have already been scraped without consent; and
- most face major barriers to identifying whether their work has been used, citing lack of transparency, legal complexity, and power imbalances.

Artists repeatedly emphasised that AI scraping without permission is not innovation, it is exploitation. Many called it theft, saying their creative labour is being used in AI training without credit, consent, or compensation, deepening the economic precarity visual artists already face. They also raised growing concerns about their identities and visual styles being mimicked or monetised without consent, and some reported that AI tools had scraped personal images to create fake accounts or impersonations. Many now feel unsafe sharing work online.

Beyond copyright concerns, artists voiced strong fears about the ethical, cultural, and environmental impacts of AI. They noted that generative AI is producing faster, cheaper content that pushes aside slower, process-based, or experimental practices. Respondents warned that the pressure to adopt AI is not about enhancing creativity, but about meeting unrealistic expectations around productivity.

Environmental impacts were another recurring theme. Artists called attention to the enormous water and energy demands of AI infrastructure, particularly data centres, and urged greater public transparency on the environmental and community costs of these systems.

Artists want to be included in shaping the rules that will govern AI and creative practice. They are calling for clear legal protections and greater transparency in AI development, stronger education around artists' rights, public investment in local creative industries, and policy frameworks that value artistic labour, creative process, and Indigenous Cultural and Intellectual Property (ICIP).

Survey respondents also called for enforceable tools to find out if their work is in training datasets, along with opt-out rights and penalties for infringement. Most said they feel powerless under current laws and platform policies, unable to track usage or afford enforcement. The barriers, they said, are legal, financial, and emotional, and all too high.

Almost half of respondents (46%) said they currently use generative AI in their creative practice, while 41% do not. Another 13% said they may use it in the future. Among those who use it, AI is most commonly used for writing-related tasks, such as editing or grammar correction (49%), drafting written content (49%), and grant writing or admin (36%). Around 40% use it for research and development, and 34% for brainstorming ideas. Fewer artists use it for visual outputs: only 22% use AI to generate sketches or reference images, and just 6% use it to produce final artworks. This shows that artists are not rejecting AI. While many are experimenting with the technology, particularly in administrative and research contexts, they continue to call for its use to be lawful, transparent, and fair.

Confusion around copyright ownership of AI-generated works adds another layer of vulnerability. Many artists expressed frustration over unclear rules around authorship and attribution, especially when AI has been trained on human-made content without permission. Clarifying these issues is essential for ensuring accountability, attribution, and remuneration.

Tools that help artists thrive should not come at the expense of their rights. What artists reject is a system that extracts their work without permission, under the guise of innovation.

NAVA urges the Productivity Commission and the Australian Government to reject any new copyright exceptions that allow AI companies to scrape artists' work without authorisation. Instead, the focus should be on strengthening the enforcement of existing laws and developing stand-alone AI legislation that upholds artists' rights. Protections must include transparent AI training datasets, meaningful consent processes, and clear avenues for attribution and compensation.

Penelope Benton is NAVA's Executive Director and is passionate about improving clarity, transparency and equity across the visual arts sector.

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Australia License
