
#486
Arts Law Centre of Australia
15 Sep 2023


11 August 2023

Regulation of Emerging Technologies Team
Department of Industry, Science and Resources

By email: digitaleconomy@industry.gov.au

Dear Regulation of Emerging Technologies Team

Safe and responsible AI in Australia: Discussion paper

Arts Law is grateful for the opportunity to respond to the Department of Industry, Science and
Resources’ discussion paper, Safe and responsible AI in Australia: Discussion paper (Discussion
Paper).

Arts Law was established in 1983. It is a national community legal centre providing free or low-cost legal advice to Australian artists and arts organisations. Each year, Arts Law assists thousands of
Australian artists and arts organisations across all Australian states and territories. Arts Law would like to acknowledge the Traditional Owners of the various lands on which Arts Law works and pay our respects to Elders past and present.

Our submission is informed by Arts Law’s nearly 40 years of experience providing legal advice and education to the Australian creative community. In 2022, we delivered 2487 legal advices and educated 2414 webinar and workshop participants to support their professional development. Arts
Law’s clients include creatives across the arts community, including visual arts and crafts, film, multimedia, digital, new media, performing arts, games, fashion, design and music.

We respond below to each of the consultation questions set out on pages 34-35 of the Discussion Paper.

Definitions

1. Do you agree with the definitions in the discussion paper?

Yes.

ARTS LAW CENTRE OF AUSTRALIA Haymarket Creative, UTS Building 5, Block B, 1-59 Quay Street, HAYMARKET
NSW 2000 | GPO Box 2508 SYDNEY NSW 2001
T +61 2 9356 2566 1800 221 457 (toll-free) E artslaw@artslaw.com.au W artslaw.com.au ABN 71 002 706 256
Potential gaps in approaches

2. What potential risks from AI are not covered by Australia’s existing regulatory approaches? Do
you have suggestions for possible regulatory action to mitigate these risks?

Arts Law is the only national community legal centre working across the arts and the law in
Australia. Arts Law fills an important marketplace gap for artists and organisations, enabling them to access free or low-cost legal advice, professional development and other resources which they otherwise could not afford. Arts Law takes a leadership role in advocacy for creators and continues to be a strong voice for creators on issues, including the impact of AI.

When we refer to creators and creative organisations and communities, this includes visual artists, craft practitioners, authors, performers, musicians, composers, screen creators, filmmakers, actors, dancers, choreographers, game creators and designers as well as arts organisations, theatre companies, music schools, festival organisers, artist-run initiatives and Aboriginal and Torres Strait
Islander artists and art centres, community cultural development artists and artists with disability as well as the organisations which support Australian creators.

The potential risks from AI to the cultural and creative sectors in Australia are not addressed at all in the Discussion Paper, and the existing regulatory approaches in Australia do not encourage or require responsible AI practices via compliance with existing legislation that protects Australian creators and creative communities and organisations.

The negative impact of AI on creators and creative organisations and communities should be considered as a risk in Australia, as should the impact this technology will have on First Nations creators and Indigenous Cultural and Intellectual Property (ICIP).

Arts Law understands that the Discussion Paper does not seek to consider all issues related to AI, for example intellectual property including copyright. However, we refer to Arts Law’s submission dated 7 March 2023 in response to the Attorney-General’s Department ‘Copyright Enforcement Review’ Issues Paper, December 2022. In this submission Arts Law focussed on the obligation and burden placed on individual creators and copyright owners to enforce their copyright and moral rights.
Similar issues arise with AI and enforcement when copyright and moral rights infringement arises – where again the individual creator and copyright owner must pursue the infringer and there is no regulator to step in on behalf of these rights holders. Accordingly, any consideration of governance mechanisms to ensure AI is used safely and responsibly should also include consideration of the impact and risks AI poses to creators and creative communities and organisations in Australia, and in particular the impact on First Nations creators and the protection of ICIP.

During July 2023, Arts Law and the National Association for the Visual Arts (NAVA), with the support of the Australian Society of Authors (ASA), conducted a survey to understand how creators are using generative AI and the potential impacts on Australian creators (AI Survey). There were 36 responses in total. The survey results demonstrate that almost 40% of the respondent creators are using AI tools as part of the creative process; however, there is overwhelming concern about the threat generative AI poses to creative output, ownership of copyright material and employment prospects for creators.

Most respondents (86%) are visual artists. Responses were also received from writers, designers, filmmakers, musicians, craftspersons, experimental artists, digital/protection artists, transdisciplinary/participatory practice artists and a curator.

Supporting Responsible AI Discussion Paper: Arts Law Centre of Australia, 11 August 2023 2
Almost 40% of the respondents use generative AI in the creative process, ranging from assistance with written works such as editing and grant writing via ChatGPT, to creative content and ideation processes. However, around 86% of respondents confirmed that, if they use generative AI in the creative process, only about 10% of the final work comes from generative AI use.

64% of respondents think that generative AI is a threat to the creative professions, 48% are concerned that generative AI will affect the amount of money a creator can make from creative work
(34% are not sure), and 51% are concerned that generative AI will be used to replace human creators.

Some respondents commented that ‘some artforms will become less viable and others will be enabled as a result’ and that ‘transparency of use is essential to this tool not threatening creatives.’

On the issue of whether creators are seeing their work used on a generative AI platform without their permission only 11% said yes, 28% said no, and 61% said they don’t know. Some creators used
Google image search or haveibeentrained.com to work this out, and one respondent noted that their work being used on a generative AI platform is no different from any other infringer, as they still don’t have the resources to legally pursue the copyright infringer:

‘Prior to AI my creative work has been used without my permission. So really not much difference if AI can access and replicate my work. It is a problem, but I don’t have the resources to be able to police this.’

Respondents to the AI Survey also noted their concern that AI will have a negative impact on creatives and creative communities in Australia:

‘… even though I see a use as a tool for generating ideas (which also traces AI to further replicate us/our styles) what AI is doing to our arts is ethically, as a society WRONG.’

‘It cheapens and lessens the works and training of arts creators. Why bother involving people if an AI can do it without negotiations?’

‘Generative AI is a danger to the visual arts industry for many reasons. Generative AI is bad for artists, not just because it scrapes the internet for images to use for training, but also because it threatens to deskill the entire industry. Visual artists train for years to develop the skills they have to create their art, but AI steals all the fruits of that training. Artists already live in poverty and find it difficult to make a living from their work, yet AI is threatening to take the few jobs that exist for artists, as generative AI can be used as a disciplinary tool to erode the bargaining power of working artists. If companies are willing to replace real artists with AI (whether or not the art is of worse quality), artists will never achieve better rights at work. The writers’ strike in the US already shows this. Our entire creative culture is worse off when we buy into the hype of AI. Also, generative AI emits tonnes of carbon – the climate cost of generative AI has not been explored anywhere in mainstream media.’

‘You should be asking is there any point in becoming an artist, writer, illustrator, filmmaker, actors? …
Since AI popped up it’s ugly head I haven’t had one brief or new freelance job offered in 6 months when normally I would have 10-20!’

‘Should also be considering other harm impacts on work / labour employment gig economy that artists creatives operate in that supports the elite in the creative sector. Every time a company or person chooses to use a LLM Gen AI to write something for publication commercial gain …. A writer loses a gig….’

Any regulatory action should mitigate the risks of AI to Australian creators and creative communities and provide low cost options for resolving copyright and moral rights infringement disputes, as discussed further in our submission to the Copyright Enforcement Review Issues paper referred to above.

One example is to enact a statute that requires a machine learning algorithm to comply with the Copyright Act 1968 (Cth); however, this would also require amending the Act to ensure it covers the way AI operates. Additional funding for organisations such as Arts Law to consider further policy solutions and approaches would also assist with developing a regulatory response to the impact of AI on creators and creative communities in Australia.

3. Are there any further non-regulatory initiatives the Australian Government could implement to
support responsible AI practices in Australia? Please describe these and their benefits or
impacts.

Arts Law supports the promotion of Australia’s AI Ethics Principles, and in particular principles 6, 7 and 8 in relation to the impact on Australian creators and creative communities and organisations:

6. Transparency and explainability: There should be transparency and responsible disclosure
so people can understand when they are being significantly impacted by AI, and can find out
when an AI system is engaging with them.

7. Contestability: When an AI system significantly impacts a person, community, group or
environment, there should be a timely process to allow people to challenge the use or
outcomes of the AI system.

8. Accountability: People responsible for the different phases of the AI system lifecycle should
be identifiable and accountable for the outcomes of the AI systems, and human oversight of
AI systems should be enabled.

Additional funding should be provided to creative and arts peak bodies and organisations such as
Arts Law and the National Association for the Visual Arts (NAVA), the Australian Society of Authors
(ASA) and others, to develop ethical principles for AI in the creative sectors and to undertake professional development, develop resources and advocate for compliance with these ethical principles. This will also support the continued growth of Australian creators and creative organisations and communities and ensure detailed consideration of the impact of AI and of what is best for the creative and cultural economy and society as a whole.

4. Do you have suggestions on coordination of AI governance across government? Please outline
the goals that any coordination mechanisms could achieve and how they could influence the
development and uptake of AI in Australia.

Any coordination of AI governance across government should be undertaken alongside Australia’s AI
Ethics Principles.

Responses suitable for Australia

5. Are there any governance measures being taken or considered by other countries (including
any not discussed in this paper) that are relevant, adaptable and desirable for Australia?

Arts Law supports the approach in the United Kingdom (UK) of legislating to create a statutory duty for regulators to have regard to the UK framework and principles for responsible development and use of AI. The same approach could be taken in Australia to create a statutory duty to comply with Australia’s AI Ethics Principles.

The AI Survey revealed that 93% of respondents support the introduction of either guidelines, regulations, a code of practice or legislative protections to regulate generative AI platforms.

Target areas

6. Should different approaches apply to public and private sector use of AI technologies? If so,
how should the approaches differ?

No. The private sector should also have a statutory duty to comply with Australia’s AI Ethics
Principles.

7. How can the Australian Government further support responsible AI practices in its own
agencies?

The Australian Government should introduce a statutory duty for the Australian Government and agencies to comply with Australia’s AI Ethics Principles.

8. In what circumstances are generic solutions to the risks of AI most valuable? And in what
circumstances are technology-specific solutions better? Please provide some examples.

Overall, generic solutions to the risks of AI in the form of ethical principles are most valuable; however, in the creative sectors specific consideration may need to be given to the impact of particular technologies on creators, and in that instance technology-specific solutions may be appropriate. For example, where it is evident that large amounts of artistic, literary or musical works have been used to train a machine learning algorithm, there should be a statutory requirement that existing laws such as the Copyright Act 1968 (Cth) are complied with. Otherwise, a burden is placed on individual creators to enforce their rights under the Act as copyright owners.

9. Given the importance of transparency across the AI lifecycle, please share your thoughts on:
a. where and when transparency will be most critical and valuable to mitigate potential AI
risks and to improve public trust and confidence in AI?
b. mandating transparency requirements across the private and public sectors, including how
these requirements could be implemented.

a. For creators and creative organisations and communities, transparency is most critical and valuable at any stage where copyright material is reproduced, communicated, or published without the permission of the copyright owner and without attribution provided to the creator. This will improve public trust and confidence that AI is complying with the Copyright Act 1968 (Cth). There should also be transparency where creative material is created by AI rather than an individual creator (ie replacing the creative work and role of a creator with AI).

b. Mandating transparency requirements across the private and public sectors may be undertaken via a statutory duty to comply with Australia’s AI Ethics Principles and ensuring ongoing compliance with other legislative requirements, eg the Copyright Act 1968 (Cth).

10. Do you have suggestions for:
a. whether any high-risk AI applications or technologies should be banned completely?
b. criteria or requirements to identify AI applications or technologies that should be banned,
and in which contexts?

Arts Law does not have a response to this question.

11. What initiatives or government action can increase public trust in AI deployment to encourage
more people to use AI?

Arts Law would like to see a government initiative that funds creative and arts peak bodies and organisations such as Arts Law and the National Association for the Visual Arts (NAVA), the Australian
Society of Authors (ASA) and others, to engage the community and deliver education.

Implications and infrastructure

12. How would banning high-risk activities (like social scoring or facial recognition technology in
certain circumstances) impact Australia’s tech sector and our trade and exports with other
countries?

Arts Law does not have a response to this question.

13. What changes (if any) to Australian conformity infrastructure might be required to support
assurance processes to mitigate against potential AI risks?

Arts Law does not have a response to this question.

Risk-based approaches

14. Do you support a risk-based approach for addressing potential AI risks? If not, is there a better
approach?

Arts Law supports a risk-based approach; however, this approach should take into consideration the risk to Australian creators and the cultural and creative sectors, addressed in part above.

15. What do you see as the main benefits or limitations of a risk-based approach? How can any
limitations be overcome?

The main limitation at this point is that the current risk-based approach does not take into consideration the risk to Australian creators and the cultural and creative sectors.

16. Is a risk-based approach better suited to some sectors, AI applications or organisations than
others based on organisation size, AI maturity and resources?

As noted above, the risk-based approach needs to consider the risk of AI to Australian creators and the cultural and creative sectors.

17. What elements should be in a risk-based approach for addressing potential AI risks? Do you
support the elements presented in Attachment C?

Yes. However, transparency should include information regarding the way AI has used any copyright materials in the AI system, or to create the AI system. And when considering impact, there should be specific consideration of the impact on creators and creative organisations and communities.

18. How can an AI risk-based approach be incorporated into existing assessment frameworks (like
privacy) or risk management processes to streamline and reduce potential duplication?

As privacy is not the only risk that should be taken into consideration, the AI risk-based approach should not be incorporated into existing assessment frameworks but should sit alongside any statutory duty to comply with Australian AI Ethics Principles.

19. How might a risk-based approach apply to general purpose AI systems, such as large language
models (LLMs) or multimodal foundation models (MFMs)?

Arts Law does not have a response to this question.

20. Should a risk-based approach for responsible AI be a voluntary or self-regulation tool or be
mandated through regulation? And should it apply to:
a. public or private organisations or both?
b. developers or deployers or both?

A risk-based approach should take into consideration the risk to the cultural and creative sectors, should be mandated through regulation, and should apply to public and private organisations, and to both developers and deployers.

Conclusion

Arts Law appreciates the opportunity to make these submissions and welcomes any further discussion and consultation. Please contact Arts Law by email to artslaw@artslaw.com.au or (02)
9356 2566 if you would like us to expand on any aspect of this submission, verbally or in writing.

Robyn Ayres
CEO, Arts Law

Katherine Giles
Head of Legal and Operations, Arts Law

