Department of Industry, Science and Resources
Technology Strategy Branch
DigitalEconomy@industry.gov.au
August 2023
____________________________________________________________________________
Re: Supporting Safe and Responsible AI in Australia
NAVA welcomes the opportunity to respond to the Safe and responsible AI in Australia
Discussion paper.
____________________________________________________________________________
The National Association for the Visual Arts (NAVA) acknowledges the Gadigal, Wangal, Dharug,
Dharawal, Ngunnawal, Ngambri, Dja Dja Wurrung, Kabi Kabi and Jinibara peoples as the Traditional
Custodians and knowledge-holders of the lands on which we live, learn and work.
The NAVA community is based across hundreds of sovereign Nations and unceded lands throughout
the continent that has become colonially known as Australia. NAVA pays our deepest respects to all
Custodians of Country to whom these lands belong.
We acknowledge Aboriginal and Torres Strait Islander peoples as the first artists and storytellers on
this continent, and pay respect to First Nations communities' Ancestors and Elders.
Sovereignty was never ceded. Always was, always will be Aboriginal land.
____________________________________________________________________________
The words ‘Aboriginal and Torres Strait Islander’, ‘Indigenous’ and ‘First Nations’ are used in this
document to refer to both Aboriginal and Torres Strait Islander peoples, and global First Nations
artists in the Australian arts and culture sector. Although the usage of these words is complex, and
some First Nations peoples may not be comfortable with some of these words, we use them to reflect
the variety of terminology that First Nations peoples may identify with throughout this continent. We
would like to make known that only the deepest respect is intended in the use of these terms.1
____________________________________________________________________________
The National Association for the Visual Arts (NAVA) is an independent membership organisation
which brings together the many voices of the visual arts, craft and design sectors to improve the
fundamental conditions of work and practice. We do this through advocacy, education and
promotion of the Code of Practice for the Visual Arts, Craft and Design. Our network comprises
over 50,000 artists, arts workers, galleries, arts organisations and industry bodies. Since its
establishment in 1983, NAVA has been influential in bringing about policy and legislative change to
encourage the growth and development of the visual arts sector and to increase professionalism
within the industry.
1. This note on terminology is adapted from a section by Georgia Mokak, ‘Change the Conversation From Surviving to Thriving’, https://visualarts.net.au/news-opinion/2019/change-conversation-surviving-thriving/, 2019.
PO Box 60, Potts Point NSW 1335
1800 04 NAVA (6282)
nava.net.au | nava@visualarts.net.au
ABN 16 003 229 285
NAVA has participated in recent roundtables with the Attorney-General's Department and key industry stakeholders to discuss several proposed changes to copyright and Artificial Intelligence (AI) regulation. NAVA also joined 20 co-signatories from the creative industries on recent letters to the Hon Tony Burke MP, Minister for the Arts; the Hon Michelle Rowland MP, Minister for Communications; and the Hon Ed Husic MP, Minister for Industry and Science, raising concerns on behalf of Australian creators, artists, creative industries and rights holders about generative artificial intelligence platforms, products and services.
Although AI has been around for decades, recent developments have made it more accessible than ever before, prompting questions and speculation about its use and impact on artists’ work, practices and livelihoods. Generative AI is extremely powerful and offers a world of creative and supportive potential. However, it also poses significant challenges, from copyright, privacy and data protection to its environmental footprint.
A range of artists are producing brilliant work by interacting and collaborating with one or more of the rapidly increasing generative AI platforms in their creative practices. There are also many examples of AI platforms serving increasingly supportive roles in administration. However, digital innovation moves faster than the law and other forms of regulation, leaving space for potential and ongoing harm to the work, incomes and rights of independent practitioners. While NAVA embraces some of AI’s potential and realised benefits, the primary focus of current advocacy work is to ensure the risks are central considerations in the development of AI tools and platforms and the laws, regulation and policy that govern them.
NAVA understands that the scope of the discussion paper does not cover all issues associated with AI, such as implications for the labour market and skills, and intellectual property. These issues are nonetheless present in this submission, as the risks faced by the creative industries are intrinsically linked with work, copyright and Indigenous Cultural and Intellectual Property (ICIP). If a regulator is unable to ensure compliance in this space, the burden will fall on the individual; for an artist, this burden will be beyond their capacity, financial or otherwise.
NAVA’s submission supports and will reference public comments and ‘Safe and responsible AI in Australia’ submissions from other leading organisations within the creative industries: the Arts Law Centre of Australia (Arts Law), the Australian Society of Authors (ASA) and A New Approach (ANA), as well as additional input from the Australian Research Council Linkage Project (LP210300009) ‘Empowering Australia’s Visual Arts via Creative Blockchain Opportunities’, led by A/Prof Brian Yecies at the University of Wollongong.
Discussion paper questions
Definitions
1. Do you agree with the definitions in this discussion paper? If not, what definitions do you prefer and why?
Yes.
Potential gaps in approaches
2. What potential risks from AI are not covered by Australia’s existing regulatory approaches? Do you have suggestions for possible regulatory action to mitigate these risks?
Last month, NAVA and Arts Law (with the support of the ASA) surveyed a representative group of artists and creators on how AI impacts and/or feeds into their work, and what regulation should be considered around AI platform development and output.
Almost 40% of survey respondents use generative AI in their creative process. Some use it to assist with written work, including editing and grant applications through ChatGPT, and others for the development of creative content and ideation. Of respondents who use generative AI in their creative process, 86% said that just 10% of the final work is derived from it.
Concern about the impacts of AI on the creative sector is high. 64% of respondents expressed anxiety over its threat to their employment prospects. Half of respondents are concerned that generative AI could reduce the income creators can earn from their creative work, and half fear that the use of generative AI will replace human creators.
Asked whether they had found their work on a generative AI platform without permission, 11% of respondents said yes, 28% said no, and 61% said they don’t know, indicating a lack of transparency and information about the use of creative work in generative AI input and training.
The survey results show that while there are potential benefits to the use of generative AI, these are outweighed by fears of the risks and impacts on the professional work of independent artists.
Transparency and copyright are essential to protect creator livelihoods.
Although current laws afford artists a level of protection, with avenues to stop unpermitted use and recover payment, the Australia Council’s economic study of professional artists in Australia, Making Art Work, found that while a quarter of artists experience copyright infringement, only some 40% of those artists take action, and only some 60% of those actions are successful, meaning that “the majority of those suffering infringement finish up with no redress.”
Of artists whose rights were infringed:
● 56% of visual artists and 89% of craft practitioners had their work repurposed without attribution;
● 48% and 54% respectively had work repurposed without permission;
● these were among the higher rates of rights infringement in comparison to practitioners of other artforms.2
If individual artists are to gain the full economic benefit to which their creative endeavour entitles them, their intellectual property must be adequately protected against unauthorised exploitation or appropriation. The copyright held by some visual artists and craft practitioners in the works that they create contributes in varying degrees to their economic survival.3
Indigenous Cultural and Intellectual Property (ICIP)
The Productivity Commission’s recent report, Aboriginal and Torres Strait Islander Visual Arts and
Crafts, notes the following:
Inauthentic Aboriginal and Torres Strait Islander arts and crafts — which include Indigenous-style products created by non-Indigenous people, products that use Indigenous Cultural and Intellectual Property (ICIP) without the authorisation of traditional custodians, and products that infringe copyright — are a pervasive and longstanding problem.
● Non-Indigenous authored products accounted for up to $54 million of spending, representing well over half of total spending on Aboriginal and Torres Strait Islander souvenirs in 2019-20.4
First Nations artists in Australia are already harmed by those who do not respect ICIP.
We stand by ASA’s public comments on ICIP and the potential for AI to further this issue:
We are disturbed by the potential of Generative AI models to produce and perpetuate inauthentic and fake art, appropriating Aboriginal and Torres Strait Islanders’ art, stories and culture without reference to Traditional cultural protocols, at a time when the National Cultural Policy has put ‘First Nations first’ and is working on stand-alone legislation to acknowledge and protect ICIP.5

2. David Throsby and Katya Petetskaya (2017), Making Art Work: An Economic Study of Professional Artists in Australia, Department of Economics, Macquarie University, p. 107.
3. Ibid., p. 103.
4. Productivity Commission (2022), Aboriginal and Torres Strait Islander Visual Arts and Crafts, Study Report (overview), November 2022.
3. Are there any further non-regulatory initiatives the Australian Government could implement to support responsible AI practices in Australia? Please describe these and their benefits or impacts.
NAVA recommends Australia’s AI Ethics Principles be broadly promoted and that industry leaders are encouraged to endorse and share these principles with their members and peers.
NAVA reiterates Arts Law’s call for arts peak organisations, such as NAVA, Arts Law and the ASA, to be funded to develop sector-specific guidelines, and to be supported in promoting, educating and advocating for an ethical approach to the use and implementation of AI within work and practice.
4. Do you have suggestions on coordination of AI governance across government? Please outline the goals that any coordination mechanisms could achieve and how they could influence the development and uptake of AI in Australia.
NAVA supports ANA’s submission to this inquiry which recommends:
Using coordination mechanisms that support policy decisions on AI governance with both AI expertise and portfolio expertise (such as interdepartmental committees, steering committees and cross-jurisdiction bodies at ministerial and official levels).
The Attorney-General’s Department is currently undertaking consultation on copyright and AI issues that overlap with this inquiry. One proposal from that consultation is a forum to hear small intellectual property claims, similar to the US Copyright Claims Board (CCB) and the UK Intellectual Property Enterprise Court (IPEC). While we understand that the scope of this consultation is broad-reaching and cross-industry, NAVA would like to impress upon the Department of Industry, Science and Resources Technology Strategy Branch that the individual creator and copyright owner cannot be forgotten in any regulatory development around AI, and more specifically generative AI. While the development and uptake of AI will likely progress quickly relatively unaided, it will be more beneficial and valuable to our civic society if it is guided by ethical principles and if there are mechanisms by which creators can voice and defend the rights in their work.
Of the four respondents in the NAVA and Arts Law joint survey who found their creative work had been used by a generative AI platform without permission and consequently raised the issue directly with the platform, none received a response. Although a small sample, it is concerning that the platforms regarded any communication on this issue as unworthy of their time.
Responses suitable for Australia
5. Are there any governance measures being taken or considered by other countries (including any not discussed in this paper) that are relevant, adaptable and desirable for Australia?
NAVA supports a similar model to that of the United Kingdom (UK):
● Statutory compliance with cross-sector AI-specific principles (in Australia, the AI Ethics Principles)
● An emphasis and strategy that delivers greater transparency
● A toolkit to assist organisations to assess the risks to individual rights and freedoms caused by their AI platforms
The AI survey showed that 93% of respondents support the introduction of either guidelines, regulations, a code of practice or legislative protections to regulate generative AI platforms.
Target areas
6. Should different approaches apply to public and private sector use of AI technologies? If so, how should the approaches differ?
No. There should be an equitable regulatory approach to both sectors.

5. ASA raises concerns with Government about risks of AI, July 2023.
7. How can the Australian Government further support responsible AI practices in its own agencies?
Endorse and follow any regulation and Australia’s AI Ethics Principles, while communicating and promoting good practice in AI platform usage, in particular the use of generative AI.
8. In what circumstances are generic solutions to the risks of AI most valuable? And in what circumstances are technology-specific solutions better? Please provide some examples.
Generic solutions are broadly applicable in the creative industries, but technology-specific solutions may be necessary in the flagging of ICIP and in circumstances where a significant portion of copyrighted material has been used in the production of generative AI outputs.
9. Given the importance of transparency across the AI lifecycle, please share your thoughts on:
a. where and when transparency will be most critical and valuable to mitigate potential AI
risks and to improve public trust and confidence in AI?
As per Arts Law’s submission:
For creators and creative organisations and communities, transparency is most critical and valuable at any stage where copyright material is reproduced, communicated, or published without the permission of the copyright owner and without attribution provided to the creator.
b. mandating transparency requirements across the private and public sectors, including
how these requirements could be implemented.
As per Arts Law’s submission:
Mandating transparency requirements across the private and public sectors may be undertaken via a statutory duty to comply with Australia’s AI Ethics Principles and ensuring ongoing compliance with other legislative requirements, e.g. the Copyright Act 1968 (Cth).
Risk-based approaches
14. Do you support a risk-based approach for addressing potential AI risks? If not, is there a better approach?
Yes, one grounded in international human rights law.
15. What do you see as the main benefits or limitations of a risk-based approach? How can any limitations be overcome?
As per ANA’s submission:
A main benefit of a risk-based approach is its ability to systematically analyse the impacts of AI, including on Australian interests in arts, culture and creativity. The wide-ranging impacts of AI require a risk-based approach that explicitly considers risks to the human rights of Australians, including freedom of expression. Risks to human rights are currently considered a high-risk area in the proposed EU AI Act.6

A key limitation of a risk-based approach is that it cannot directly account for risks that are emerging or unforeseeable. The European Commission has explained how a proportionate risk-based framework would involve prohibiting uses of AI with unacceptable risks, regulation for uses with high risks, and limited transparency obligations (such as flagging ‘the use of an AI system when interacting with humans’) for other applications of AI.7 An obligation to make AI interactions with humans transparent provides some view of emerging risks, and partly addresses this limitation.8

6. European Parliament (2023), press release: AI Act: a step closer to the first rules on Artificial Intelligence.
7. European Commission (2021), COM/2021/206 Proposal for an AI Act.
17. What elements should be in a risk-based approach for addressing potential AI risks? Do you support the elements presented in Attachment C?
Yes, NAVA supports the elements presented in Attachment C. However, transparency in the use of copyrighted materials for the training of a generative AI system and its output should be clear, and any impact on individual creators should be regularly assessed.
20. Should a risk-based approach for responsible AI be a voluntary or self-regulation tool or be mandated through regulation? And should it apply to:
a. public or private organisations or both?
b. developers or deployers or both?
A risk-based approach should be mandated and apply to public and private organisations, and developers and deployers alike. Regulation is far more likely to be adhered to by all these entities, and understood by creators, if it is administered with the same rules. Mandating the approach signals to Australia’s creative community that their contributions are valuable and that they are protected by the Government.
Please do not hesitate to contact me for any further information I can provide.
Sincerely,
Georgie Cyrillo
Deputy Director, National Association for the Visual Arts
8. The EU has already introduced a similar obligation for ‘prominent markings’ to provide some transparency on deepfakes and similar manipulated images, audio and videos: Regulation (EU) 2022/2065 (Digital Services Act).