Make a submission: Published response

#357
NSW Ombudsman
31 Jul 2023

Published name

NSW Ombudsman

Upload 1

Automated Transcription

OFFICIAL

26 July 2023

Technology Strategy Branch
Department of Industry, Science and Resources via email: DigitalEconomy@industry.gov.au

NSW Ombudsman submission – “Safe and Responsible AI in Australia” discussion paper

The Australian Government is seeking feedback on appropriate regulatory and policy responses to support the safe and responsible use of AI, and the Commonwealth Department of Industry, Science and Resources has issued a discussion paper, ‘Safe and responsible AI in Australia’, to facilitate consultation.
I am writing to draw to the attention of the Department a special report I tabled in the NSW Parliament that addresses a number of the issues raised in your discussion paper.

The role of the NSW Ombudsman
The NSW Ombudsman is an independent integrity body that pursues fairness for the people of NSW. In particular, we strive to ensure that those entrusted with public power and resources fulfil their responsibilities and treat everyone fairly.
A central function of the NSW Ombudsman is to receive complaints about, to monitor, and to investigate, the conduct of NSW public authorities. This includes State Government departments and agencies, NSW statutory bodies, and local councils.
We aim to ensure that public authorities are conducting themselves lawfully, making decisions reasonably, and treating all individuals equitably and fairly. When public authorities fail to do this, we may make findings that they have engaged in ‘maladministration’.1
In relation to the development and use of AI, the NSW Ombudsman’s oversight of NSW public authorities will include their conduct in developing and/or using AI and related technologies in the exercise of their own functions or service-delivery operations, including, in particular, any use of automated decision-making (ADM).
The NSW Ombudsman does not generally regulate or oversee the private sector,2 and so will not have direct jurisdiction to scrutinise the development or use of AI by or in the private sector. However, private sector development and deployment of AI could come to our attention if questions are raised about the

1 More formally, section 26 conduct (referring to section 26 of the Ombudsman Act 1974 (NSW)), which sets out the various categories of
wrong conduct about which the Ombudsman may make findings.
2 The NSW Ombudsman’s jurisdiction is, however, extended by the Community Services (Complaints, Reviews and Monitoring) Act 1993
(NSW) and other legislation to include some private sector entities, including non-government community service providers that are funded
by the NSW Government, and private managers of state correctional facilities.

conduct of NSW public authorities, whether that be as direct or indirect users of the technology themselves, or otherwise in their role as regulators of private sector AI.
To date, our focus in this area has primarily been on the use of ADM by public authorities themselves.

The Ombudsman’s Machine Technology (ADM) report
Our report, titled ‘The new machinery of government: Using machine technology in administrative decision-making’ was tabled in the NSW Parliament on 29 November 2021 under s 31 of the
Ombudsman Act 1974 (NSW).
A copy of the report is enclosed and, as noted, it touches on many of the issues concerning ADM that are raised by your discussion paper. It also includes a range of case studies of real-world applications of ADM in the NSW public sector.
We were prompted to write the above report after becoming aware that a NSW public authority
(Revenue NSW) had been using an ADM system for the performance of a statutory function (the garnisheeing of unpaid fine debts from individuals’ bank accounts), in a way that was having a significant impact on individuals, many of whom were already in situations of financial vulnerability. A detailed summary of the Revenue NSW matter can be found in Annexure A of the report.3
Our experience with Revenue NSW, together with a lack of visibility of other use of ADM within the NSW public sector, raised our concern that there may be inadequate attention being given to fundamental aspects of public law that are relevant to the adoption of ADM systems in the public sector.
Drawing on key themes and observations made in our report, we provide the following comments relevant to the Department’s discussion paper:
• Existing frameworks. ADM technologies are not being introduced into a complete legal or
regulatory vacuum. The legal environment into which public sector ADM is introduced is the one
that is governed by public administrative law. Administrative law is essentially principles-based
which means it is, generally speaking, technology agnostic. While the technology used in
government decision making may change, the underlying concerns and norms that underpin
administrative law will likely remain unchanged. For simplicity, we group the well-recognised
requirements for good decision-making in our report as follows: proper authorisation,4 appropriate
procedures,5 appropriate assessment,6 and adequate documentation.7 It will be useful to consider
any deployment of ADM technologies against those requirements.
• Principles-based approaches. While a principles-based approach to regulation in this area seems
appropriate, given the diversity of potential applications of ADM, the challenge will be to ensure
that this approach is sufficiently general to be applicable across those various current and future

3 NSW Ombudsman, The New Machinery of Government: using machine technology in administrative decision-making, Annexure A: Revenue NSW Case Study. URL: https://www.ombo.nsw.gov.au/__data/assets/pdf_file/0004/138208/The-new-machinery-of-government-special-report_Annexure-A.pdf
4 NSW Ombudsman, The New Machinery of Government: using machine technology in administrative decision-making, Section 7, pp. 27-33. URL: https://www.ombo.nsw.gov.au/__data/assets/pdf_file/0003/138207/The-new-machinery-of-government-special-report_Front-section.pdf
5 Above n 4, Section 8, pp. 34-41.
6 Above n 4, Section 9, pp. 42-47.
7 Above n 4, Section 10, p. 48.


applications, while also being fit for purpose and responsive to the particular contexts and risks of
particular use cases. In that regard, it will be essential to ensure that any principles are supported by
clear and comprehensive implementation guidance and processes. In our view (and as explained
further in Section 15 of our report), any principles should provide for consideration to be given to
enacting specific legislation to authorise and regulate a specific proposed use case for ADM.
The discussion paper specifically seeks views about whether a risk-based approach is the best
approach to regulation. In our report, we suggested a list of ‘properties’ that could be considered
when enacting legislation that authorises use of ADM for a statutory function.8 The properties target
some of the key areas of risk when agencies use ADM, such as (lack of) visibility and (in)accuracy.
• Transparency. A key observation in our report was the current lack of visibility of public sector use
of ADM. Indicating when and how ADM systems are used is crucial to effective oversight. When a
public authority gives reasons to an individual affected by a decision, those reasons must be
meaningful. Among other things, a meaningful explanation should include: that automation was
involved; the extent to which automation was used; what information is processed by the ADM
system; the date and version of any technology; how (in lay terms) the technology works. The
statement should also include the usual requirements for decision notices, including details of how
the decision may be challenged or reviewed, and by whom.9
• Oversight. We would support further development in the area of mandatory standards to enhance
consistency in the design, development, deployment, monitoring and decommissioning of ADM
systems.
Oversight of ADM would also be more straightforward if there are clear requirements and
expectations about what a public authority must do when it designs, implements or uses an ADM
system – for example, standards (whether legislated or otherwise) that require agencies, prior to
deployment and/or at particular times following deployment:
– to commission an independent algorithmic audit by an expert auditor accredited for that
purpose, and
– to obtain a comprehensive legal certification that the system is compliant with the laws
governing the relevant function.
If such standards exist, then the role of an independent oversight authority (such as ours, in the case
of NSW public authorities) would be to consider, at least as a first step, whether those requirements
are met. Rather than asking very difficult technical questions like: ‘Is this ADM infected by
algorithmic bias?’ we could start – and possibly end – by asking a range of questions in relation to
the design, implementation and operation of the system, like: ‘Was the ADM properly tested for
algorithmic bias?’. In that case, existing oversight bodies would be able to quickly add value using
existing powers of investigation.10

8 Above n 4, Section 15, pp. 77-79.
9 Above n 4, Section 13, pp. 58-62.
10 See further NSW Ombudsman, ‘Avoiding (and investigating) automated maladministration’. URL: https://www.ombo.nsw.gov.au/__data/assets/pdf_file/0009/138789/Paul-Miller-Avoiding_and-investigating_automated-maladministration-speech_4-July-2023.pdf


This observation does not obviate the need (as discussed in our report) to consider whether new or
enhanced oversight capabilities, beyond the existing bodies and their existing powers, might be
warranted to respond to the burgeoning use of ADM.11
As noted in the discussion paper, internationally, there are a range of AI and ADM governance
initiatives that anticipate requirements for the assessment and audit of a system’s conformity with
established requirements. These may provide some guidance to the Australian context, particularly
around how an audit should be conducted and by whom.

References to ‘Responsible AI’
I note that the title of the discussion paper released by the Department refers to ‘Responsible AI’.
This terminology appears to be more frequently used in recent times, including by some of the private sector entities at the forefront of AI development and deployment, including Google12 and Microsoft,13 as well as professional services consultancies that advise both the private sector and government.14
I take the opportunity to make a quick observation about the use of this terminology by government.
The concept of responsibility, in the context of law and morality, refers to the status of deserving praise or blame, reward or punishment, or some other relevant reaction or consequence. It is a concept that requires, at the very least: (a) a person (being a legal or moral agent) who bears responsibility, (b) to another person or other persons, (c) for some thing (which may be actions or inactions, or consequences or outcomes, etc), (d) having regard to relevant legal or moral norms or criteria (for example, rules or standards about what the person should or should not have done, or about what outcomes or harms the person is expected to bear responsibility for).15
This means that there is no such thing as ‘Responsible AI’.16 Instead, we suggest referring to responsible actors (such as in the term ‘responsible officers’, as used in the NSW AI Assurance Framework)17 or at the very least to responsible activities (like ‘responsible AI development’ or ‘responsible use of AI’, as used elsewhere in your discussion paper), which will usually implicate the relevant actor or actors undertaking those actions.18
Our concern here is more than a matter of semantics. We worry that referring to ‘Responsible AI’ risks obscuring the real questions about responsibility for AI. Those questions include who is responsible, and what is their responsibility, in respect of the development of AI, the deployment of AI, any uses or harms

11 Above n 4, Section 16, pp. 80-81.
12 ‘Responsible AI’, Google (Web Page).
18 To take an example from a different context, reference is often made to “responsible lending [by banks]”, not to “responsible loans”.

caused by AI (whether known or unanticipated), and so on. These are critical questions that will need to be considered in the development of regulatory responses to AI.19
By referring to ‘Responsible AI’, as if it were a thing distinct from legal and moral actors and their actions, attention might easily be drawn away from thinking more directly and deeply about those questions – who is (or should be) responsible, to whom, for what, and by reference to what rules or standards?
Additionally, any regulatory framework needs to make clear that responsibility is both more complex and more enduring than might be suggested by a simple consideration of whether the AI (at some particular point in time) has met some checklist of ‘Responsible AI’. For example, AI that is lawful and appropriate when designed or at a particular point in time may become problematic when used in ways that were not originally anticipated, or when there are changes in the surrounding environment, whether that be through legislative change, societal change, or a change to the AI learning environment.
***
We look forward to the outcomes of this consultation process. If you would like any further information about our report or work generally in this area, please contact Chris Clayton, Chief Operating Officer at cclayton@ombo.nsw.gov.au or on (02) 9265 0430.

Yours sincerely

Paul Miller
NSW Ombudsman

19 We note that the discussion paper begins to unpack some of these issues.


This text has been automatically transcribed for accessibility. It may contain transcription errors. Please refer to the source file for the original content.

Upload 2

Automated Transcription

The new machinery of government: using machine technology in administrative decision-making

A special report under section 31 of the Ombudsman Act 1974

29 November 2021


Acknowledgements
We thank Professor John McMillan AO, former Commonwealth and NSW Ombudsman, Dr Lachlan McCalman, Chief Practitioner, Gradient Institute, and Associate Professor Will Bateman, Associate Dean of Research at the ANU Law School, for providing expert comments on early drafts of this report. We are also grateful to Bill Simpson-Young and Dr Tiberio Caetano of the Gradient Institute for the discussions we have had around some of the technical concepts covered in the report.
We also thank James Emmett SC and Myles Pulsford for allowing us to publish their legal opinion, and for their additional comments on a draft of the report.
We also acknowledge and thank Revenue NSW for its co-operation particularly in the preparation of the Statement of Facts upon which that legal opinion was sought, and which is reproduced in annexure A.
All views expressed in this report are those of the NSW Ombudsman.

ISBN: 978-1-925885-92-7
NSW Ombudsman
Level 24, 580 George Street
Sydney NSW 2000
Phone: 02 9286 1000
Toll free (outside Sydney Metro Area): 1800 451 524
National Relay Service: 133 677
Website: www.ombo.nsw.gov.au
Email: nswombo@ombo.nsw.gov.au
© State of New South Wales, 29 November 2021
This publication is released under a Creative Commons License CC BY 4.0


The Hon Matthew Mason-Cox MLC
President
Legislative Council
Parliament House
SYDNEY NSW 2000

The Hon Jonathan O’Dea MP
Speaker
Legislative Assembly
Parliament House
SYDNEY NSW 2000

Dear Mr President and Mr Speaker

Pursuant to section 31 of the Ombudsman Act 1974, I am providing you with a report titled The new machinery of government: using machine technology in administrative decision-making.

I draw your attention to the provisions of section 31AA of the Ombudsman Act 1974 in relation to tabling this report and request that you make the report public forthwith.

Yours sincerely

Paul Miller
NSW Ombudsman

29 November 2021

Contents
Foreword from the NSW Ombudsman ..................................................................................... iv
1. Executive summary ........................................................................................................... 1
1.1 Machine technology is on the rise, and offers many potential benefits ....................................... 2
1.2 Why we have written this report ................................................................................................... 2
1.3 Administrative law and practice must be given central attention ................................................. 3
1.4 Administrative law requirements for good decision-making......................................................... 3
1.5 Good practice for designing and implementing machine technology ........................................... 5
1.6 The role of Parliament in authorising machine technology ........................................................... 5
1.7 The way forward – starting with increased visibility ..................................................................... 7

Part 1: Machine technology .................................................................................................. 8
2. Introduction ..................................................................................................................... 9
2.1 The rise of machine technology ..................................................................................................... 9
2.2 Why is the Ombudsman interested in machine technology? ........................................................ 9
2.3 What has prompted this report? ................................................................................................. 11
2.4 The purpose and structure of this report .................................................................................... 12
2.5 What we will do next ................................................................................................................... 13
3. The new technologies ..................................................................................................... 14
3.1 What we mean by ‘machine technology’ .................................................................................... 14
3.2 Machine technology is not just one thing .................................................................................... 14
3.3 The need to design machine technology for particular applications ........................................... 15
4. The promise of machine technology in government decision-making ............................... 16
4.1 Machine technology within a decision-making system ............................................................... 16
4.2 How are governments using machine technology? ..................................................................... 18
4.3 The potential benefits of machine technology ............................................................................ 19

Part 2: Administrative law and machine technology............................................................ 21
5. Why administrative law matters for new technology ....................................................... 22
5.1 Public sector decision-making is different ................................................................................... 22
5.2 Good government according to law ............................................................................................ 22
5.3 Existing laws apply to new technologies ...................................................................................... 23
5.4 How administrative law applies to new machine technologies ................................................... 23
6. Key administrative law issues for machine technology ..................................................... 25
6.1 The essential requirements for good (and lawful) administrative decision-making ................... 25
7. Proper authorisation ....................................................................................................... 27
7.1 Legal power to make the decision ............................................................................................... 27
7.2 Legal authority of the person making the decision...................................................................... 28
7.3 The scope of the decision-making power – including the extent of any discretion .................... 30
8. Appropriate procedures .................................................................................................. 34
8.1 The decision has followed a fair process ..................................................................................... 34
8.2 Other legal and ethical obligations .............................................................................................. 37
8.3 Reasons are given for the decision (particularly decisions that affect the rights or interests
of individuals) ............................................................................................................................... 39
9. Appropriate assessment ................................................................................................. 42
9.1 The decision answers the right question (which necessitates asking the right question) ........... 42
9.2 The decision is based on a proper analysis of relevant material ................................................. 44
9.3 The decision is based on the merits and is reasonable in the circumstances.............................. 45


10. Adequate documentation ............................................................................................... 48
10.1 The circumstances surrounding the making of decisions is adequately documented and
records kept ................................................................................................................................. 48

Part 3: Designing machine technology to comply with the law and fundamental principles
of good government ................................................................................................ 49
11. Putting in place the right team ........................................................................................ 50
11.1 It’s not an IT project ..................................................................................................................... 50
11.2 Having lawyers on the design team is essential ........................................................................... 51
11.3 Ensuring policy choices are made by the right people, at the right level of authority ................ 51
12. Determining the necessary degree of human involvement ............................................... 53
12.1 The administrator must engage in an active mental process ...................................................... 53
12.2 The division of tasks between machine and human .................................................................... 54
12.3 The risk of technology complacency and ‘creeping control’........................................................ 55
12.4 Practical indications of active human involvement ..................................................................... 56
13. Ensuring transparency .................................................................................................... 58
13.1 Reasons and the right to an explanation ..................................................................................... 58
13.2 Accountability and reviewability .................................................................................................. 60
13.3 Publishing source code ................................................................................................................ 62
14. Verification, testing and ongoing monitoring ................................................................... 63
14.1 Testing before adoption............................................................................................................... 63
14.2 Undertake monitoring, review and periodic evaluation .............................................................. 64
15. Statutory provisions that authorise machine technology.................................................. 70
15.1 Stating, in simple terms, that an administrator is authorised to use a machine ........................ 70
15.2 Attributing machine outputs to an administrator ....................................................................... 71
15.3 More sophisticated authorisation provisions .............................................................................. 73
15.4 Transforming the substantive statutory function ........................................................................ 76
15.5 Mandatory properties of machine technology ............................................................................ 77
16. Coda – new laws for new technology? ............................................................................. 80
Endnotes ............................................................................................................................... 82
Annexure A – Revenue NSW case study .................................................................................. 98

Machine technology in practice
Reviewing ‘decisions’ versus investigating ‘conduct’ ...................................................................................... 10
Mobile phone detection cameras.................................................................................................................... 16
Automating procedural court rules ................................................................................................................. 18
Some machine technology use by government will be ‘legally unexceptional’ .............................................. 26
Services Australia Centrelink’s automated income compliance program (robodebt)..................................... 27
Machine technology and the ‘rule against dictation’ ...................................................................................... 31
Machine technology as a means of delivering ‘guidance’ to administrators .................................................. 32
Algorithmic bias ............................................................................................................................................... 35
Using machine technology to administer Commonwealth child support payments ...................................... 40
Lost in translation – a simple error converting legislation into code .............................................................. 43
Machine technology in sentencing – COMPAS ................................................................................................ 46
Providing explanations for decisions that are ‘instructive, informative and enlightening’ ............................. 58
Use of the ‘Structured Decision-Making’ tool in the NSW child protection system ........................................ 65
‘Nearly identical’ under the Commonwealth Business Names Registration Act 2011 .................................... 72


Foreword from the NSW Ombudsman
We have entered a new digital age, and it is widely accepted that governments must transform themselves to be fit for this future.1 The NSW Government’s first Digital Strategy spoke of the need for government to be ‘digital by design’ and ‘digital by default’.2
It is unsurprising then, that digital innovation has also begun to permeate the methods by which public officials and agencies exercise their roles as administrators – the ways they make decisions and exercise powers granted to them by Parliament through legislation.3
This report is about this shift toward machine technologies, a term we use for the broad range of digital and data enabled systems and processes that are, or might in future, be used to guide, assist or even determine when and in what ways administrative powers will be exercised.4
There is no doubt that machine technologies have the potential to bring significant benefits to government agencies and the public they serve, including in terms of speed, efficiency, accuracy and consistency.
However, the public sector has a unique constitutional role – it is that arm of government that administers laws, and as such it is uniquely subject to legal rules and standards of good conduct as to when and how it does so. This administrative law – the legal framework that controls government action – does not necessarily stand in the way of machine technology adoption, but it will significantly control the purposes to which it can be put and the ways in which it can operate in any particular context.
Failure to comply with the norms of administrative law risks maladministration – something at the forefront of the Ombudsman’s jurisdiction. Some contraventions may also result in decisions or actions being held by a court to have been unlawful and/or invalid.
This is one reason why, as far back as 2004, the Administrative Review Council emphasised the need for lawyers to be actively involved in the design of machine technology for government5
– a key point we take up in chapter 11 of this report.
Since that time, a small but growing body of legal academic literature has emerged, both in Australia and elsewhere, that seeks to examine public sector use of machine technology through the lens of administrative law.
It is not clear to us, however, that this body of work is always reaching the audience it needs to: law-makers, policy-makers, government lawyers and particularly those who are at the front lines of implementing and operationalising machine technologies.
A primary aim of this report is to help to bridge that gap. We also offer guidance on the important practical steps that agencies need to take when considering the adoption of machine technology to support the exercise of administrative functions.
At the end of this report we touch on the question of whether the rise of machine technologies may also warrant a reconsideration of the legal frameworks, institutional arrangements and rules that apply.
For example, it may be that traditional administrative law mechanisms of redress, such as judicial review or complaint to the Ombudsman, will be considered too slow or otherwise too individualised to provide an adequate response to forms of systemic maladministration that could arise from ‘algorithmic bias’. Modified frameworks may be required – for example, to require proactive and ongoing external testing and auditing of systems, in addition to reactive individual review rights.


On the other hand, new or amended laws may also be needed to expressly facilitate the beneficial use of new technologies in some areas, where the operation or uncertainty of existing rules might otherwise unduly stand in the way.

Paul Miller
NSW Ombudsman


Executive summary


Our role at the NSW Ombudsman is to oversee government agencies and officials – helping to ensure they are conducting themselves lawfully, making decisions reasonably, and treating all individuals equitably and fairly (chapter 2).
When agencies and officials fail to do this they are said to have engaged in maladministration or, more formally, section 26 conduct (referring to section 26 of the Ombudsman Act 1974 (NSW), which sets out the various categories of wrong conduct).
Clearly, the use by government agencies of machine technology – which might be referred to as artificial intelligence or automated decision-making (see chapter 3) – is not inherently a form of maladministration.
There are many situations in which government agencies could use appropriately designed machine technologies to assist in the exercise of their functions, in ways that would be compatible with lawful and appropriate conduct. Indeed, in some instances machine technology may improve aspects of good administrative conduct – such as accuracy and consistency in decision-making – and mitigate the risk of individual human bias.
However, if machine technology is designed and used in a way that does not accord with administrative law and associated principles of good administrative practice, then its use could constitute or involve maladministration. It could also result in legal challenges, including a risk that administrative decisions or actions may later be held by a court to have been unlawful or invalid.

1.1 Machine technology is on the rise, and offers many potential benefits
The use and sophistication of machine technology are increasing worldwide, and it has the potential to bring many benefits to government and the public (chapter 4).
These include:
• Efficiency and cost savings for government.
• Reduced red tape.
• Increased accuracy.
• Improved consistency.
• Increased productivity and re-focusing of staff to ‘higher value’ activities.
• Better customer service and experience.
• Insights and learning.
Of course, benefits cannot be assumed to follow as a matter of course, and it is important to be realistic about what benefits (and risks) particular technology will deliver in a particular context. Untested assumptions or utopian beliefs about technology should not drive automation strategies.

1.2 Why we have written this report
We were prompted to write this report after becoming aware of one agency (Revenue NSW) using machine technology for the performance of a discretionary statutory function (the garnisheeing of unpaid fine debts from individuals’ bank accounts), in a way that was having a significant impact on individuals, many of whom were already in situations of financial vulnerability.
Following a series of complaints to our office, Revenue NSW worked responsively with us over time to ensure that its garnishee system operated more fairly, by taking account of vulnerability and situations of hardship. However, we still had questions as to whether Revenue NSW’s system of garnishee automation was legally consistent with its statutory functions.
We sought legal advice from Senior Counsel, which confirmed our doubts. The full Revenue NSW case study, including the legal advice, is set out in annexure A.
Currently, we do not know how many other NSW Government agencies are using, or developing, machine technology to assist them in the exercise of their statutory functions.
However, our experience with Revenue NSW and a scan of the Government’s published policies on the use of ‘AI’ and other digital technologies suggests that there may be inadequate attention being given to fundamental aspects of public law that are relevant to machine technology adoption.

1.3 Administrative law and practice must be given central attention
Some of the broader concerns about machine technology use by the private sector, in terms of privacy, human rights, ethics and so on, also apply (in some cases with greater force) to the public sector.
However, the powers, decisions and actions of government agencies and officials are constitutionally different from those of the general private sector.
This means that the public sector’s use of machine technology, particularly for the purposes of statutory decision-making, must also be assessed from an administrative law perspective (chapter 5). We believe that this assessment must be central to the use of this technology.

1.4 Administrative law requirements for good decision-making
For simplicity, we can broadly group the requirements for good decision-making in the following ways (chapter 6):

Proper authorisation – this means that there is legal power to make the relevant decision, that the
person making the decision has the legal authority to do so, and that the decision is within the
scope of decision-making power (including, in particular, within the bounds of any discretion
conferred by the power) (chapter 7).

The requirement for proper authorisation means that statutory functions are not and cannot be
directly given or delegated to a machine. It does not necessarily mean that the authorised person
cannot be assisted by machine technology.
There is, however, no uniform answer as to what forms of machine technology can be used, and to
what extent, in the performance of a particular statutory decision-making function. This must be
carefully considered on a case-by-case basis by looking at the particular statute, its purpose, and
the context in which it applies.
However, if the function is discretionary, machine technology must not be used in a way that
would result in that discretion being fettered or effectively abandoned. In effect, this means that
discretionary decision-making functions cannot be fully automated.
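The distinction between machine assistance and a fettered discretion can be illustrated in code. The following is a minimal hypothetical sketch (the names, thresholds and rules are our own illustration, not any agency’s actual system): the machine produces only a recommendation, and a decision comes into existence only when the authorised officer considers the individual case and records their own conclusion, which may depart from the recommendation.

```python
from dataclasses import dataclass

@dataclass
class Recommendation:
    """Machine output: assistance to the decision maker, not a decision."""
    case_id: str
    suggested_action: str   # e.g. "issue_order" or "defer"
    rationale: str

def machine_assess(case_id: str, debt: float, days_overdue: int) -> Recommendation:
    # Illustrative rule: the machine only suggests an action.
    if debt > 1000 and days_overdue > 90:
        return Recommendation(case_id, "issue_order", "large debt, long overdue")
    return Recommendation(case_id, "defer", "below thresholds")

def officer_decide(rec: Recommendation, officer_accepts: bool, reasons: str) -> dict:
    # The authorised officer considers the individual case and may adopt
    # or depart from the recommendation; the decision is the officer's own act.
    action = rec.suggested_action if officer_accepts else "defer"
    return {"case_id": rec.case_id, "action": action,
            "reasons": reasons, "machine_rationale": rec.rationale}

rec = machine_assess("C-001", debt=2500.0, days_overdue=120)
# The officer departs from the suggestion after considering individual hardship.
decision = officer_decide(rec, officer_accepts=False,
                          reasons="debtor in financial hardship; defer recovery")
```

The design point is that the machine’s output is one input to the decision, not the decision itself: the discretion remains with the officer in every individual case.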

Appropriate procedures – this means that the decision has followed a fair process, that it has met
other legal and ethical obligations, and that reasons are given for the decision (particularly where it
significantly affects the rights or interests of individuals) (chapter 8).


Generally, a fair process requires decisions to be made without bias on the part of the decision
maker (‘no-bias rule’) and following a fair hearing of the person affected (‘hearing rule’). Machine
technology can introduce the possibility of a different form of bias known as ‘algorithmic bias’.
Algorithmic bias arises when a machine produces results that are systemically prejudiced or unfair
to certain groups of people. It is unclear whether the presence of algorithmic bias would
necessarily constitute a breach of the no-bias rule (as that rule is traditionally concerned with
actual or apprehended bias on the part of the particular decision maker). Even if it does not,
however, algorithmic bias may still lead to unlawful decisions (because they are based on irrelevant considerations or contravene anti-discrimination laws) or other maladministration (because they involve or result in conduct that is unjust or improperly discriminatory).
Where machine technology is used in the exercise of a function under a particular statute it also
needs to comply with other statutes and common law requirements. Privacy, freedom of
information and anti-discrimination laws, in particular, will almost always be relevant.
Having appropriate procedures also means providing where required, or being able to provide
where requested, reasons to those who are affected by a decision. In our view, this means also
informing those affected if a machine has made (or contributed to the making of) a decision.
Where reasons are required, they must be accurate, meaningful, and understandable, which can
raise particular challenges when machine technology is used.
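Algorithmic bias of the kind described above can often be surfaced by routine statistical checks, for example by comparing the rates at which a machine selects different groups of people for action. The sketch below is purely illustrative (the data, group labels and the 0.8 threshold are our own assumptions, not a legal standard):

```python
from collections import defaultdict

def selection_rates(outcomes):
    """outcomes: list of (group, selected) pairs; returns selection rate per group."""
    totals, selected = defaultdict(int), defaultdict(int)
    for group, was_selected in outcomes:
        totals[group] += 1
        if was_selected:
            selected[group] += 1
    return {g: selected[g] / totals[g] for g in totals}

def disparity_flags(rates, threshold=0.8):
    """Flag groups selected at less than `threshold` times the highest group's rate."""
    top = max(rates.values())
    return {g: rate / top < threshold for g, rate in rates.items()}

# Illustrative audit data: (group, whether the machine selected the person for action)
audit = [("A", True)] * 8 + [("A", False)] * 2 + [("B", True)] * 4 + [("B", False)] * 6

rates = selection_rates(audit)   # A: 0.8, B: 0.4
flags = disparity_flags(rates)   # group B is flagged (0.4 / 0.8 = 0.5 < 0.8)
```

A check of this kind does not establish that a breach of any legal rule has occurred, but it can identify systemically prejudiced outputs of the sort that may lead to maladministration.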

Appropriate assessment – this means that the decision answers the right question, that the
decision is based on a proper analysis of relevant material, and that the decision is based on the
merits and is reasonable in all the circumstances (chapter 9).

Using machine technology in the exercise of statutory functions means translating legislation and
other guidance material (such as policy) into the form of machine-readable code. A key risk is the
potential for errors in this translation process, and the consequent potential for errors and
unlawful decisions being made at scale.
When designing and implementing machine technology, it is also essential to ensure that its use
does not result in any obligatory considerations being overlooked or extraneous considerations
coming into play. While the use of machine technology may enhance the consistency of outcomes,
agencies with discretionary functions must be conscious of the duty to treat individual cases on
their own merits.
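The translation risk can be made concrete with a toy example. Suppose, hypothetically, a statute confers an entitlement on persons ‘aged 65 years or over’; a single misplaced comparison operator in the coded rule excludes every person aged exactly 65, and does so consistently, at scale:

```python
def eligible_buggy(age: int) -> bool:
    # Incorrect translation: "65 years or over" coded as a strict comparison.
    return age > 65

def eligible_correct(age: int) -> bool:
    # Faithful translation of "aged 65 years or over".
    return age >= 65

# At scale, the boundary error silently produces unlawful refusals:
applicants = [63, 64, 65, 65, 66, 70]
wrongly_refused = [a for a in applicants
                   if eligible_correct(a) and not eligible_buggy(a)]
# Every applicant aged exactly 65 is refused an entitlement the statute confers.
```

A human officer making the same error would likely make it inconsistently and be corrected; a machine will repeat it in every case until the code is fixed.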

Adequate documentation – agencies are required to properly document and keep records of
decision-making (chapter 10).

In the context of machine technology, this means keeping sufficient records to enable
comprehensive review and audit of decisions. Documentation relating to different ‘versions’ of the
technology, and details of any updates or changes to the system, may be particularly important.
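One way to think about ‘sufficient records’ is that each machine-assisted output is logged with enough detail to reconstruct it later, including the version of the system that produced it. A hypothetical sketch of such a record (the field names are our own illustration):

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class DecisionRecord:
    case_id: str
    system_version: str   # which build/ruleset of the machine produced the output
    inputs: dict          # the data the machine actually relied on
    output: str           # what the machine produced
    timestamp: str

def log_decision(case_id, system_version, inputs, output) -> str:
    record = DecisionRecord(
        case_id=case_id,
        system_version=system_version,
        inputs=inputs,
        output=output,
        timestamp=datetime.now(timezone.utc).isoformat(),
    )
    return json.dumps(asdict(record))   # e.g. appended to an audit log

entry = log_decision("C-001", "ruleset-2.3.1",
                     {"debt": 2500.0, "days_overdue": 120}, "issue_order")
```

Recording the system version alongside each output is what makes it possible, on later review or audit, to identify which cases were affected by a particular update or change to the system.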


1.5 Good practice for designing and implementing machine technology
In light of the above, there are some key proactive steps that agencies should take when considering the design and adoption of machine technology that will help them to ensure they comply with principles of administrative law and good decision-making practice.
In particular, when setting out to design machine technology for use in the exercise of statutory functions, agencies should:
1. establish a multi-disciplinary design team that involves lawyers, policymakers, and operational
experts, as well as technicians, with roles and responsibilities that are clearly defined
(chapter 11)
2. assess the appropriate degree of human involvement in the decision-making processes, having
regard to the nature of the particular function and the statute in question (chapter 12)
3. ensure appropriate transparency, including by deciding what can and should be disclosed about
the use of machine technology to those whose interests may be affected (chapter 13)
4. test before operationalising, and establish ongoing monitoring, audit and review processes
(chapter 14)
5. consider whether legislative amendment is necessary or prudent (chapter 15).

1.6 The role of Parliament in authorising machine technology
If legislation is introduced to enable the use of machine technology, then this provides an opportunity for public and Parliamentary debate on the properties that should be required of that technology.
Whether or not these are ultimately prescribed as mandatory requirements in the legislation itself, the kinds of questions that might be asked of government agencies that are seeking legislative authorisation of machine technology could include:

Properties, with examples of qualities that could be prescribed:

Is it visible? – What information does the public, and especially those directly affected, need to be told regarding the involvement of the machine, how it works, its assessed accuracy, testing schedule etc? Are the design specifications and source code publicly available – for example as ‘open access information’ under the Government Information (Public Access) Act 2009? Is an impact assessment required to be prepared and published?8

Is it avoidable? – Can an individual ‘opt out’ of the machine-led process and choose to have their case decided through a manual (human) process?

Is it subject to testing? – What testing regime must be undertaken prior to operation, and at scheduled times thereafter? What are the purposes of testing (eg compliance with specifications, accuracy, identification of algorithmic bias)? Who is to undertake that testing? What standards are to apply (eg randomised control trials)? Are the results to be made public?

Is it explainable? – What rights do those affected by the machine outputs have to be given reasons for those outcomes? Are reasons to be provided routinely or on request? In what form must those reasons be given and what information must they contain?

Is it accurate? – To what extent must the predictions or inferences of the machine be demonstrated to be accurate? For example, is ‘better than chance’ sufficient, or is the tolerance for inaccuracy lower? How and when will accuracy be evaluated?

Is it subject to audit? – What audit records must the machine maintain? What audits are to be conducted (internally and externally), by whom and for what purpose?

Is it replicable? – Must the decision of the machine be replicable in the sense that, if exactly the same inputs were re-entered, the machine will consistently produce the same output, or can the machine improve or change over time? If the latter, must the machine be able to identify why the output now is different from what it was previously?

Is it internally reviewable? – Are the outputs of the machine subject to internal review by a human decision maker? What is the nature of that review (eg full merits review)? Who has standing to seek such a review? Who has the ability to conduct that review and are they sufficiently senior and qualified to do so?

Is it externally reviewable? – Are the outputs of the machine subject to external review or complaint to a human decision maker? What is the nature of that review (eg merits review or review for error only)? Who has standing to seek such a review? If reviewable for error, what records are available to the review body to enable it to thoroughly inspect records and detect error?

Is it compensable? – Are those who suffer detriment by an erroneous action of the machine entitled to compensation, and how is that determined?

Is it privacy protective and data secure? – What privacy and data security measures and standards are required to be adhered to? Is a privacy impact assessment required to be undertaken and published? Are there particular rules limiting the collection, use and retention of personal information?


1.7 The way forward – starting with increased visibility
We are hopeful that this report will contribute to public and especially Parliamentary debate about the adoption of machine technology by government, and its proper limits and regulation.
In the final chapter of this report we identify avenues for future consideration, including a question around whether some forms or applications of machine technology might raise such significantly new issues and risks that consideration should be given to new forms of regulation – including mandatory requirements around transparency, pre-operation validation testing and routine auditing, and external review and oversight (chapter 16).
One risk, for example, may be that machine technology will be capable of producing new forms of extremely large-scale systemic injustices, to which the existing framework and institutions of administrative law are ill-equipped to respond.
However, a significant impediment to meaningful debate about the future governance of machine technology use by government is an almost complete lack of transparency about that use.
As mentioned above, we do not know how NSW Government agencies may currently be using machine technology to assist them in the exercise of statutory decision-making functions – and so we do not know how those systems have been designed, what they are being used for, and what (if any) assurance has been obtained that they are operating lawfully and in accordance with principles of good administrative practice.
This is a significant problem. Some machine technology may be lawfully and appropriately designed and used, but other technology may not be.
While we do not consider that visibility is, of itself, a sufficient remedy to address potential concerns that might arise with the use of machine technology, it is an essential starting point.
Following this report, therefore, we will seek to work with relevant bodies, including Digital NSW (part of the Department of Customer Service) and the Office of Local Government, to comprehensively map current and proposed types and uses of machine technology (chapter 2). We will also look inward to consider what more we can do to support agencies and citizens, as well as our own staff, to understand the use of machine technology – and to ensure that administrative law and the enduring values of good public administration, including legality, transparency and fairness, are given central attention.


Part 1:
Machine technology


2. Introduction

2.1 The rise of machine technology

Use of machine technologies is increasing in the public
sector, and their sophistication and use will only grow
in the future.
Recent NSW Government announcements reveal an intention to increase work on – and investment in – machine technology. In September 2020, the NSW Artificial Intelligence (AI) Strategy was released, which is ‘focused on improved service delivery and government decision-making’.9 At the same time, the Government also released the Artificial Intelligence (AI) Ethics Policy.10
In chapter 3 we discuss what machine technology is, how it is currently being used by governments, and how it may be used in the future.

2.2 Why is the Ombudsman interested in machine technology?
We are always concerned to ensure that government agencies and officials conduct themselves lawfully, make decisions reasonably, and treat all individuals equitably and fairly.
One of our primary functions is to handle complaints about the conduct of government agencies and public officials. We can generally investigate these complaints if we think that conduct may fall within any of the following categories set out in section 26 of the Ombudsman Act 1974:
(a) contrary to law,
(b) unreasonable, unjust, oppressive or improperly discriminatory,
(c) in accordance with any law or established practice but the law or practice is, or may be,
unreasonable, unjust, oppressive or improperly discriminatory,
(d) based wholly or partly on improper motives, irrelevant grounds or irrelevant consideration,
(e) based wholly or partly on a mistake of law or fact,
(f) conduct for which reasons should be given but are not given,
(g) otherwise wrong.11
Conduct of the kinds set out above may be said to constitute ‘maladministration’. Where we suspect maladministration, we can also make inquiries about and investigate conduct on an ‘own motion’ basis, without the need for someone to have made a complaint.
One way that an agency’s conduct may constitute maladministration (ie be unlawful or unreasonable or unjust, etc) is if it is using machine technology in a way that is inconsistent with administrative law and principles of good administrative decision-making. This report highlights some of the ways this might happen.
Going forward, we will consider what further guidance we can provide to help agencies and public officials understand the matters that we will consider when handling complaints about the use of machine technology in the performance of their administrative functions.


Reviewing ‘decisions’ versus investigating ‘conduct’
It has been observed that, if a human decision maker is fully replaced by a machine to exercise
administrative functions, one potentially adverse consequence may be that certain rights to
challenge the exercise of those functions in court could be lost.12
This is because some rights may be premised on there being a ‘decision’ that can be the
subject of challenge. As noted in chapter 12, the Federal Court has suggested that an essential
element of a decision generally is that a relevant decision maker has engaged in a subjective
mental process of reaching a conclusion. As an autonomous machine does not have a
subjective mental capacity, a ‘decision’ of the machine may not be recognised by law as a
decision.13
However, the automation of some or all of an agency’s activities should not limit the
jurisdiction of the Ombudsman to receive complaints and undertake investigations about
those activities. This is because the Ombudsman is concerned with conduct.
Under section 5(1) of the Ombudsman Act, conduct of a public authority is defined as follows:
conduct means—
(a) any action or inaction relating to a matter of administration, and
(b) any alleged action or inaction relating to a matter of administration.
Conduct includes (but is much broader than) actions involved in making or implementing
a decision.
For example, any or all of the following could be scrutinised by an Ombudsman to determine
whether conduct has occurred that is unlawful, unreasonable, improperly discriminatory,
unjust or otherwise wrong:
• the decision to adopt machine technology
• the way the machine has been designed
• the data used by the machine
• the policy and business rules underpinning the machine
• the people involved in designing and building the machine, what consultation occurred, and any external procurement activities
• whether and how the machine was validated, tested, audited and monitored
• whether and what safeguards were put in place to identify and address potential algorithmic bias
• whether and what information has been disclosed publicly about the machine
• the use of the machine for particular functions of the agency
• the consideration or effect that is given to the outputs of the machine, either generally or in a particular case.
More generally, if an agency uses machine technology then whatever that machine does will
be ascribed to the agency itself – at least for the purposes of an Ombudsman investigation.
Accordingly, if the processes or outputs of machine technology are unlawful, unreasonable,
unjust or improperly discriminatory, then the agency’s conduct in using that machine will likely
be considered by us to have been unlawful, unreasonable, unjust or improperly discriminatory
under s 26 of the Ombudsman Act.


2.3 What has prompted this report?
The immediate impetus for this report was an investigation we commenced following complaints we received about Revenue NSW.

Revenue NSW’s garnishee machine
Revenue NSW is the Government’s debt collecting agency. The head of Revenue NSW, the
Commissioner of Fines Administration, is permitted by legislation to issue garnishee orders to recover debts in certain circumstances. A garnishee order is an indirect method of recovering a debt from someone. The order allows a creditor, such as Revenue NSW, who is owed money by a debtor to recover the debt by obtaining payment from a third party who owes money to the debtor. Third parties who can be garnisheed include a person’s employer (who owes the debtor their salary) or a person’s bank (who owes the debtor what is held in the person’s savings account). This power to issue garnishee orders is a debt recovery method that originated in (and is still available from) courts of law.
The complaints we handled were about Revenue NSW garnisheeing the bank accounts of people who had failed to pay fines they owed to the Government. Many of the complainants were financially vulnerable individuals who, in some cases, had their bank accounts emptied.
We engaged with Revenue NSW, and over time we became satisfied by the steps it was taking to address the issues raised in the complaints. For example, Revenue NSW adopted a ‘minimum protected balance’, meaning that its garnishee orders would not result in bank accounts being completely emptied and left with a nil balance. This 'minimum protected balance' protection was later put into legislation.
During our investigation, we became aware that Revenue NSW had been using machine technology in the exercise of those garnishee powers. However,

those who complained to us about Revenue NSW’s
activities were not complaining about the use of machine
technology – they were not even aware of it.
They were just concerned that their money had been taken, in some cases leaving them with no money in their account to pay rent or buy groceries.
As we learned more about how Revenue NSW was using machine technology to issue garnishee orders, we became increasingly concerned about the lawfulness of its conduct. We used our power under section 31AC of the Ombudsman Act to make a number of formal suggestions including that Revenue
NSW seek expert legal advice on the legality and design of its machine technology system.
Revenue NSW responded positively to our suggestions – for example, by publishing a new hardship policy. We decided to discontinue our investigation on the basis of actions Revenue NSW was already taking, as well as future actions it told us it would take.
Eleven months after we suggested that Revenue NSW seek legal advice, we followed up to check on any legal advice received and any action it had taken in response. We were advised that it had not sought that legal advice, either externally or from the legal branch of the Department of Customer Service
(DCS), of which Revenue NSW is a part. Revenue NSW advised us that it did not consider it necessary to seek such advice, as it considered that changes it had made to its process for issuing garnishee orders had addressed any potential legal concerns.


We continued to have doubts and decided to seek our own legal advice about Revenue NSW’s system – this also helped inform our broader understanding of the legal issues associated with public sector use of machine technologies. Revenue NSW cooperated throughout the process of seeking that legal advice, including by assisting in the preparation of a detailed statement of facts that we then provided to legal counsel for the purpose of obtaining their advice.
The legal advice, and the Revenue NSW case study, is set out in full in annexure A of this report. We understand Revenue NSW is currently considering further changes to its garnishee system. We will continue to monitor developments.

2.4 The purpose and structure of this report
This report provides a starting point for agencies and their officials to better understand why and when the Ombudsman (and other bodies, including courts) could hold concerns about their adoption and use of machine technology, and to identify some proactive steps they could take to ensure compliance with principles of administrative law and good practice.
It is not intended to be a comprehensive guide, either to the technology or to the legal and practice issues that its use might raise. Rather, we highlight some of the more important issues that we foresee will likely arise with the use of machine technology.
In doing so we hope to contribute to public debate about these technologies, and in particular their use by government, with a view to ensuring that fundamental and enduring ‘public law values’ are placed squarely at the centre of those discussions.14
We recognise that both machine technology and administrative law are, in their own different ways, highly technical fields that can be challenging for non-experts to understand. Indeed, one reason why machine technology use in the field of government administration may be particularly risky is because those who are expert in machine technology may lack experience in administrative law, and vice versa.15
However, we have sought as far as possible to write this report in non-technical language. Our hope is that it can be read and understood by any agency official likely to encounter machine technology, and by policymakers and the general public.
In this report we:
• outline what we mean by machine technology, its potential benefits and how we see our role in this context (part 1)
• examine and highlight the intersection between machine technology and administrative law and practice (part 2)
• offer some practical suggestions for machine technology design and implementation (part 3).
The report includes a number of short case studies as examples, as well as the more detailed case study of Revenue NSW’s use of machine technology (annexure A).
We end the report with a question about whether there is a need for new laws – not to restrain innovation, but to ensure appropriate governance, transparency, accountability and oversight in government use of machine technology (chapter 16).


2.5 What we will do next
The NSW Parliamentary Research Service recently observed that, while there had been some international progress on transparency of automated decision-making,

no Australian jurisdiction appeared to be working
on creating a registry of automated decision-making
systems.16

Following the publication of this report, we will seek to work with relevant bodies – including Digital
NSW (part of the Department of Customer Service) and the Office of Local Government – with a view to mapping in detail the types of machine technology currently in use, or under development, across NSW
Government and Local Government. Following that work, we will explore whether there is a need for a centralised registry or other approaches to enhance transparency on an ongoing basis, such as by mandating that each individual agency make details of their machine technology use publicly available as ‘open access information' under the Government Information (Public Access) Act 2009 (as has been suggested by the NSW Information Commissioner).17
We will also begin work to develop more practical and comprehensive guidance to support agencies, recognising that the internal and external demands for them to adopt machine technology will inevitably continue to grow.
In particular, we will:
• Prepare a new edition of our publication, Good Conduct and Administrative Practice: Guidelines for State and Local Government,18 to include guidance around the use of machine technology
• Update our training services, including in particular our course on Administrative Law in the Public Sector,19 to specifically address the implications of machine technology for administrative law and practice.


3. The new technologies

3.1 What we mean by ‘machine technology’
The continual, rapid pace of technological change means that the terms used to name and describe various technologies are not settled and can differ depending on the context.
In this report we have chosen to use the term machine technology to refer to a broad cluster of current and future systems and processes that, once developed, run with limited or no human involvement, and whose output can be used to assist or even displace human decision-making (and specifically in the context of this report, within a public sector administrative context).20 The complexity of this technology ranges from relatively rudimentary to extremely sophisticated.
A machine in this context does not necessarily mean a computer or other physically embodied device.
Machine technology will often take the form of software code and, as we will see from the example in chapter 14, it may even involve a methodological tool that can be operationalised by simply using pen and paper.
We have sought where possible to avoid the use of terms such as ‘artificial intelligence’ (AI) or ‘automated decision-making’,21 although these would generally be covered by what we mean by machine technology.

Our focus is not on the technical aspects of machine technology, but on its outcomes and the risks involved in using it in the public sector.
3.2 Machine technology is not just one thing
While we have not attempted to define or classify the various types of machine technology that are currently in use and under development, one important distinction is between machine technologies that adopt a ‘rule-based’ approach and those that involve adaptive ‘machine-learning’ techniques:
• A rule-based process is one that simulates a human decision-making process by following a logical set of rules or formulae which could ultimately be reduced to an expression (or a series of expressions) in the form of: ‘If x, then do y; if not-x, then do z’.22 This is sometimes described as ‘human coding’ or ‘good old fashioned AI’ – in which human programmers construct explicit rules for intelligent behaviour.23
A critical feature of a rule-based process is that, at least in theory, its rules could be written out in a way that would be comprehensible to a human, or at least to one who also understands the language of computer coding. In practice, however, some rule-based systems may involve such density and complexity that no human could ever realistically grasp their full end-to-end process.
Rule-based processes, because of their bulk (‘brute force’) processing capability, are often used to perform functions at scale, such as data-matching, processing online forms, calculating amounts, and issuing system-generated notices and correspondence.
• A machine learning process is one that first uses historical data known as ‘training sets’ – which may include the machine’s own ‘experiences’ – to identify correlations and patterns in data. It can then be fed new, previously unseen ‘real world’ data and make inferences and predictions based on whether and how that new data matches the correlations or patterns previously recognised in the training sets.
It does this by determining ‘features’ of the data and assigning ‘parameters’ (that is, weights) to those features by identifying, typically through an iterative process of trial and error, which of all the possible features and parameters optimise the proportion of ‘right’ inferences and predictions that it makes over time.
These systems are said to ‘learn’ because they are ‘capable of changing their behaviour to enhance their performance on some task through experience’24 and without being explicitly programmed.
Machine learning systems can be used for various functions, including grouping together cohorts of people based on characteristics or categorising images.
Of course, a decision-making system may combine elements of machine learning processes and rule-based processes (as is the case in the current Revenue NSW garnishee system – see annexure A).
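The distinction drawn above can be sketched in code. The following toy example is ours, not drawn from any actual government system: the first function is a rule-based check whose logic a human can read directly as ‘if x, then do y’ rules, while the second derives (‘learns’) a decision threshold from labelled historical data.

```python
# Hypothetical illustration of the rule-based / machine-learning distinction.
# Neither function reflects any real government eligibility system.

def rule_based_decision(age, income):
    """Rule-based: explicit 'if x, then do y; if not-x, then do z' logic."""
    if age >= 65 and income < 50_000:
        return "eligible"
    return "not eligible"

def learn_income_threshold(training_set):
    """'Machine learning' in miniature: derive a parameter (an income
    threshold) from labelled historical data instead of coding it by hand."""
    eligible = [inc for inc, label in training_set if label == "eligible"]
    ineligible = [inc for inc, label in training_set if label == "not eligible"]
    # Place the threshold midway between the highest eligible income
    # and the lowest ineligible income seen in the training data.
    return (max(eligible) + min(ineligible)) / 2

history = [(20_000.0, "eligible"), (30_000.0, "eligible"),
           (70_000.0, "not eligible"), (90_000.0, "not eligible")]
threshold = learn_income_threshold(history)

print(rule_based_decision(70, 30_000.0))      # prints "eligible"
print("learned threshold:", threshold)        # prints 50000.0
print("new case eligible:", 40_000.0 < threshold)
```

In the rule-based case the eligibility rule is visible in the source code; in the learned case the threshold exists nowhere in the code – it emerges from the training data, which is one reason learned parameters can be harder to inspect and explain.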
3.3 The need to design machine technology for particular applications
Whichever type of machine technology is used, every particular application will be unique to the task it has been designed for.
Leaving aside the speculative possibility of some future ‘general AI’ (that is, an intelligence able to understand and learn intellectual tasks equalling or surpassing that of humans),25 every application of machine technology to a particular administrative decision-making context requires human designers to make decisions about the design of that technology in that context.
Even machine technology that may have self-improving (learning) capabilities will require humans to make a multitude of design choices. For example, human designers will ‘collect, curate and label’ training sets from which the technology will learn.26 Human designers will set the objectives – that is, what it is the technology is learning to optimise. Further, while machine learning technology learns its own parameters through complex and iterative processes of ‘trial-error-adjustment-retrial’, there are various deeper aspects of the technology (known as ‘hyperparameters’) that humans must set up or ‘tune’ before learning can begin.27
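The distinction between learned parameters and human-set ‘hyperparameters’ can be illustrated with a toy model. This sketch is purely illustrative (a one-variable model fitted by trial-error-adjustment-retrial); the function and variable names are our own invention.

```python
# Toy illustration: humans choose the hyperparameters (learning rate,
# number of iterations); the machine learns its own parameter (the weight w).

def fit_weight(data, learning_rate=0.01, iterations=1000):
    """Fit y ~ w * x by repeatedly adjusting w to reduce the error."""
    w = 0.0  # the learned parameter: starts arbitrary
    for _ in range(iterations):
        for x, y in data:
            error = w * x - y
            w -= learning_rate * error * x  # trial-error-adjustment-retrial
    return w

# Data generated by the rule y = 2x; the machine must rediscover w = 2
# without that rule ever being written into the code.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
w = fit_weight(data, learning_rate=0.01, iterations=1000)
print(round(w, 3))  # converges towards 2.0
```

The learning rate and iteration count are never learned: if the human designer tunes them badly, the system may converge slowly or not at all, which is why these human design choices matter even for ‘self-improving’ technology.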
Machine technology is not just used for an administrative decision-making task; it must first be designed and built for such use.

We return to this important observation in part 3 below, when we consider what steps can be taken to better ensure that machine technology is designed and built so that it is not used in a way that may be unlawful or result in maladministration.
4. The promise of machine technology in government decision-making
4.1 Machine technology within a decision-making system
The extent to which humans might be involved in the implementation of a system that utilises machine technology can vary widely. Generally speaking, where humans play some active role, the system can be referred to as a ‘human-in-the-loop’ system. For the purposes of administrative decision-making, the most important type of human-in-the-loop system is a ‘human-on-top’ system. In these instances, the final step in the system – say, to grant a permit, approve an application or provide a benefit – is ultimately made by a human with the outputs of the machine technology being used to inform or support their decision.
At the other extreme are fully automated systems, in which the outputs of the machine technology (for example, to issue or cancel a licence of some kind) are both generated and actioned (that is, given effect as an administrative decision) without any intervening human decision-making or approval.
As we explain further in chapter 7, understanding the extent to which decision-making is automated, and what role (if any) humans have in the performance of a function, will often be critical to determining whether the use of the machine technology has been lawful and appropriate.
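The spectrum described above, from ‘human-on-top’ to fully automated, can be expressed as a simple control flow. This is a generic sketch of the concept only; the functions and the ‘application’ structure are invented for illustration and do not depict any real system.

```python
# Generic sketch: the machine produces an output, but in a 'human-on-top'
# system a human makes the operative decision; in a fully automated system
# the machine output is actioned directly.

def machine_assessment(application):
    """Machine output: a recommendation only."""
    return "approve" if application.get("complete") else "refuse"

def human_on_top(application, officer_review):
    """The final decision is made by a human, informed by the machine output."""
    recommendation = machine_assessment(application)
    return officer_review(application, recommendation)

def fully_automated(application):
    """The machine output is actioned with no intervening human decision."""
    return machine_assessment(application)

def cautious_officer(application, recommendation):
    """A human officer may depart from the machine's recommendation."""
    if recommendation == "approve" and application.get("flagged"):
        return "refer for manual assessment"
    return recommendation

app = {"complete": True, "flagged": True}
print(fully_automated(app))                 # prints "approve"
print(human_on_top(app, cautious_officer))  # prints "refer for manual assessment"
```

The same machine output leads to different operative decisions in the two modes, which is why identifying where the human sits in the loop matters to the legal analysis that follows.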
Mobile phone detection cameras
Since 1 March 2020, Transport for NSW (TfNSW) has been using mobile phone detection cameras, including fixed and transportable cameras, to identify drivers illegally using mobile phones.
In New South Wales it is generally illegal for a driver to use a mobile phone except in limited circumstances, such as to answer a phone call using hands-free Bluetooth or voice activation, or where the phone is in a fixed cradle that does not obscure vision.28
Images taken by TfNSW’s mobile phone detection cameras are reviewed using machine learning (AI) systems to filter those images that may show potentially illegal mobile phone use while driving. The machine technology ‘automatically reviews images and detects potential offending drivers, and excludes images of non-offending drivers from further action.’29 Images are then reviewed by an authorised officer and penalty notices are issued by Revenue NSW if illegal mobile phone use is determined.30 This human verification process is an example of a human-on-top system, and is similar to the checks performed before a penalty notice is issued in relation to camera-detected speeding and red-light offences.31
In September 2019, to address concerns that the courts may be inundated with spurious challenges to infringement notices based on images caught by the cameras, the NSW Government introduced legislation that would ‘reverse the onus of proof’ by deeming that an object being held by a driver and shown in a photograph from a device approved for mobile phone use offences is a mobile phone unless the accused driver can establish that it is not.32 The Bill passed the Legislative Assembly on 15 October 2019 and has been introduced into the Legislative Council. Debate on the Bill was adjourned on 20 November 2019, and has not resumed.
The Bill contains no provisions concerning the publication of information about, or the testing or auditing of, the technology. A Parliamentary Committee report on the Bill stated that:
    While it is acknowledged that there will be human intervention and review prior to any infringement notices being issued, the task of winnowing down millions of images to identify prima facie criminal conduct will still be handled by artificial intelligence. Given this, there should be transparency in how the artificial intelligence identifies potential offenders including the ability to test whether or not the algorithms contain any inadvertent or inherent biases.33
The Bill also contains no provisions relating to the use or destruction of captured images. The mobile phone detection scheme relies on broad permissive provisions in the Privacy and Personal Information Protection Act 1998, which allow the collection and use of personal information for law enforcement purposes.34 In the second reading speech for the Bill, the Minister stated:
    In relation to privacy, information relating to drivers and passengers is captured for law enforcement and road safety purposes only. As committed to during the introduction of the Road Transport Legislation Amendment (Road Safety) Act 2018, Transport for NSW undertook detailed consultation with the NSW Privacy Commissioner during the pilot of the program, and will continue to engage with both the Privacy Commissioner and the Information Commissioner on the rollout of the program.35
On 19 November 2019, the NSW Privacy Commissioner issued a media release noting that:
    The Privacy Commissioner provided advice and assistance to the agency to ensure that privacy rights were considered, and appropriate risk mitigation strategies put in place to minimise privacy harm to the public such as:
    o minimising the retention of images
    o cropping or pixilation of images when viewed for verification purposes
    o the use of strong encryption and other security measures
    o need for strong contractual requirements on any provider to comply with the PPIP Act.36
There is otherwise limited publicly available information concerning the privacy protection measures that have been put in place.37
More than 260,000 penalties have been issued since the mobile phone detection cameras became operational on 1 March 2020.38
With the learning capability of the system, TfNSW expects that the machine technology will improve over time, meaning that it will become more accurate at detecting potential mobile phone use.39 The technology is also currently being tested for use in detecting seatbelt offences.40
4.2 How are governments using machine technology?
The use of simpler forms of machine technology in public sector decision-making is not new. However,

what is changing is the power, complexity, scale, and prevalence of machine technologies, and the extent to which they are increasingly replacing processes that have, up to now, been the exclusive domain of human decision-making.
One of the first such machine technology systems used by the NSW Government was the (then) Department of Fair Trading’s automated business-name registration process in 1999. That system ‘supported the registration of business names and the incorporation of associations’.41
Today, use of machine technology in the NSW public sector is likely to be extensive, and it is growing rapidly. We say ‘likely’, because, as already noted, there is currently no mandatory reporting or means of comprehensively tracking technology use by the NSW Government.
It is, however, clear that over the 2 decades since the adoption of an automated business names registration process (and as described in the examples throughout this report), machine technology has become a significant component of much government service delivery. Across NSW Government agencies, machine technology is a critical tool in a wide variety of areas, from traffic and fines enforcement42 to assessment of child protection risk,43 and from determining grants of legal aid44 to triaging claims from injured workers.45
Machine technology is also heavily relied on to deliver Australian Government services, including social services, immigration, and taxation. For example, the Australian Tax Office (ATO) has said that it is delivering greater automation and digital services and is using machine learning to accelerate decision- making.46
The rise of machine technology is a global phenomenon, and governments around the world increasingly rely on it to deliver core government business.47 Internationally, it has frequently been observed that machine technology is disproportionately used in areas that affect ‘the most vulnerable in society’ – areas such as policing, healthcare, welfare eligibility, risk scoring and fraud detection.48
Automating procedural court rules
In Hemmett v Market Direct Group Pty Ltd [No 2] [2018] WASC 310, a claimant had had his proceedings to recover a debt dismissed by an automated court case management system. The system was programmed to ‘dismiss’ claims, without human intervention, when claimants had not taken any action for a prescribed period of time. The claimant in this case was unable to bring a new claim, as the limitation period had expired.
The Supreme Court of Western Australia set aside the dismissal of the claimant’s case, but on a technical point. Under the court rules, a case could only be dismissed from an ‘Inactive Cases List’, and the automated case management system did not keep such a list. Although the claim may have been registered as inactive in the system, that did not mean it was on an actual list of inactive cases, and so it could not be dismissed.
The Court did not need to consider whether the court rules around dismissing inactive cases could be automated. However, the Court gave a ‘provisional view’ that ‘a degree of automated decision-making – better described as ‘technology-assisted decision-making’ – may be permissible’ provided it ‘preserves accepted accountability structures’.49
4.3 The potential benefits of machine technology

There are many potential benefits of machine technology for public sector agencies and citizens.50 Frequently cited benefits include:
• Efficiency and cost savings for government: there are clear efficiency benefits and cost savings for agencies using technology to streamline processes and perform repetitive tasks at scale. In addition to making faster decisions, machine technology may also enable agencies to reach more citizens than is possible using staff alone. Increased efficiency also benefits citizens interacting with the government.
• Reduced red tape: citizens can benefit from machine technology as decisions are made faster and require less direct and unnecessary engagement with government. This is premised on a view that ‘citizens have limited time and energy to engage with government.’51 This is certainly the case where government activities impose red tape – resulting in time and financial costs for agencies and citizens as they identify and meet regulatory requirements. The NSW Government has said that it is seeking to use technology to ‘make compliance easy’52 for the citizen.
• Increased accuracy: machine technology is less prone to certain types of errors arising from inherently human frailties such as distraction, fatigue, negligence or lack of training. In this context, machine technology promises to be more accurate than human decision makers. Tools that use machine technology to support (rather than fully replace) staff through complex legislative rules could also support greater accuracy in the performance of functions.
• Improved consistency: as the output of machine technology is limited to what it has been designed to process, it will produce consistent outcomes. Some machine technology will produce more consistent outcomes than multiple human decision makers (although there may be exceptions, such as some machine learning processes that ‘learn’ over time, which prioritise improved accuracy over consistency across time: see chapter 9). A related benefit is the ability to produce an audit trail of the steps taken to reach an outcome, which might in some instances be a more comprehensive form of transparency than a human decision maker’s account of how they arrived at an outcome.
• Increased productivity and re-focusing of staff to ‘higher value’ activities: the 2019 review of the Australian Public Service (APS) found that about 40% of APS employee time is currently spent on ‘highly automatable tasks.’53 There is potential for machine technology to free up staff to focus on other functions that perhaps cannot (and arguably should not) be automated, such as complex individual case management.
• Better customer service and experience: there may be an expectation that government should keep pace with private sector service standards, providing instant, seamless and increasingly digital service. Reliability, speed and simplicity, as well as fewer mistakes, can contribute to a better experience for customers of public services. Automation of large-volume routine tasks could also allow client-facing staff to devote more of their time to providing more complex and caring services, giving more attention to those who need it.54
• Insights and learning: an indirect benefit of machine technology use in decision-making is that it inherently involves the creation of a rich mine of data about both inputs and decision outputs, which can inform improvements in future public sector policies and practices.
Of course, benefits cannot be assumed to follow as a matter of course. For example, machine technology that has been coded with errors will not only result in inaccurate outcomes, it will also likely result in inaccuracies at much greater scale than would otherwise be possible (see chapter 9).
It is important to be realistic about what benefits (and risks) particular technology will deliver in a particular context, and not to allow untested assumptions or utopian beliefs about technology to drive automation strategies.55
Part 2: Administrative law and machine technology
5. Why administrative law matters for new technology
5.1 Public sector decision-making is different
The use of machine technology by the private sector obviously raises technical, legal and ethical issues.
Many of these issues also arise in the context of public sector use of machine technology.56
Questions about the ethics of permitting risk allocation decisions to be made by autonomous devices, the collection and use of facial recognition and other personal information, and issues of potential bias and discrimination, are equally important to private and public sector use of machine technology.
However, the use of machine technology in the exercise of the government’s administrative functions both heightens the impact of some of those considerations, and raises new ones. Public authorities exercise powers that impact virtually all aspects of an individual’s life; there is ‘scarcely any field of human activity which is not in some way open to aid or hindrance by the exercise of power by some public authority’.57
The inherently ‘public’ nature of such functions (such as health, education, and transport) and the specific focus of some government service provision on vulnerable groups means that the government’s use of machine technology will necessarily, and often significantly, impact most of society. Recipients of government services – unlike customers of private sector businesses – are also typically a captive market, unable to access alternative providers or to opt out entirely if they do not like the way decisions are made and services are provided.
Most importantly, governments do not just provide services; they also regulate the activity of citizens and exercise a monopoly over the use of public power and coercive force – taxation, licensing, law enforcement, punishment, forms of detention, and so on. It is in the exercise of functions like these, which can affect people’s legal status, rights and interests, that administrative decision-making principles raise particular issues that are unique to the public sector.
5.2 Good government according to law
The government has a monopoly over public administrative power, and this means that the exercise of that power is controlled through public administrative law.
Any use of machine technology by government agencies must therefore be considered from an administrative law perspective (this is not to disregard or diminish other perspectives, such as broader ethical58 and human rights59 perspectives).
Ultimately, all administrative law principles may be seen to support a single underlying principle: while citizens may generally do whatever they please unless it is prohibited by law,

those exercising public or governmental power must not only avoid what is prohibited by law, they must also do only what they have been authorised by law to do.
That is, a government agency or public official needs express or implied legal authority to make and give effect to an administrative decision. This means that agencies and their administrators may exercise only those functions that have been granted to them – which today is usually done through legislation – along with any ancillary or incidental powers that are necessarily implied to facilitate the exercise of those functions.60
The need for legal authority also means that functions can only be exercised ‘by reference to correct legal principles, correctly applied’.61 Those correct legal principles are concerned with upholding the values of good government decision-making, and include openness, fairness, accountability, consistency, rationality, legality and impartiality.
The ultimate aim of administrative law is good government according to law.62
5.3 Existing laws apply to new technologies
When new technology is introduced, it is introduced into an existing legal environment.

No technology is ever introduced into a complete legal vacuum.
The technology may be more or less adapted to that legal environment, and the law itself may be more or less hospitable to the technology. It may also (at least initially) be unclear exactly how the legal environment will accommodate and respond to the technology.
Moreover, the combination of new technology and existing law can generate gaps, inconsistencies or other undesirable outcomes. Where this is the case it may be necessary or desirable to make conscious changes to the law, from minor tweaks to radical overhauls, to meet the challenges of the new technology – a possibility we explore further in chapter 16.
Of particular relevance to this report, the legal environment into which machine technologies are now being introduced is one that is governed by public administrative law – the law which controls government decision-making.63
That legal environment includes courts, which undertake judicial review of administrative decisions, and administrative tribunals, which can have a role in undertaking merit reviews of some decisions. It also includes a range of integrity bodies such as ombudsman institutions, which – while they may not make legally binding determinations – have a broad remit in terms of investigating and making findings about wrong administrative conduct.
5.4 How administrative law applies to new machine technologies
Administrative law has developed over many centuries, although many of its modern features have developed in the last half century.64 However, it is essentially principles-based and can therefore be considered, conceptually at least, to be ‘technology-agnostic’. This means that,

while the technology used in government decision-making may change, the norms that underpin administrative law remain constant.
There is no reason to expect that administrative law will not evolve in response to the challenges raised by new technology. Indeed, the growth and importance of administrative law over the past half century or so is a by-product of its application and refinement to meet other challenges of modern government such as the rise of the welfare state, large scale bureaucracy, and privatisation.65
We can be confident that these laws will continue to be interpreted and applied as the technological context continues to change. A recent survey of Australian academics and legal practitioners about the
impact of information technology on the teaching of administrative law found that ‘many interviewees expressed the view that technological change would not impact fundamental administrative law principles, but instead would be relevant to the interpretation and application of those principles in practice.’66 Generally, that is the perspective we have also taken in this report.
In the next chapters we will consider some of the important elements of current administrative law and practice, and how they will likely affect and control the adoption and use of machine technology by government decision makers.
However, we also note in these chapters some potential gaps, or at least uncertainties, in the capacity of existing administrative law rules and associated frameworks to respond to novel issues that may arise with the use of new technologies. In the final chapter of this report, we will ask whether new or additional legal approaches should also be considered.
6. Key administrative law issues for machine technology
In the next 4 chapters we look at some of the key issues raised by administrative law that will likely be most important to machine-aided administrative decision-making.
6.1 The essential requirements for good (and lawful) administrative decision-making
When we provide training to non-lawyers in the public sector on administrative law,67 we group the essential requirements of administrative law for good decision-making as follows:
A. Proper authorisation
1. There is a legal power to make the decision.
2. The person making the decision has the legal authority to do so.
3. The decision is within the scope of the decision-making power (including, in particular, within the bounds of any discretion that is a component of the power).
B. Appropriate procedures
4. The decision has followed a fair process.
5. The procedure meets other legal and ethical obligations.
6. Reasons are given for the decision (particularly decisions that affect the rights or interests of individuals).
C. Appropriate assessment
7. The decision answers the right question (which necessitates asking the right question).
8. The decision is based on a proper analysis of relevant material.
9. The decision is based on the merits and is reasonable in the circumstances.
D. Adequate documentation
10. The circumstances surrounding the making of decisions are adequately documented and records kept.
Administrative law is obviously more complex than this simple list may suggest, and there are more technically rigorous ways of classifying its requirements.68 For simplicity, however, we will stick with the familiar and simple list above.
In the next chapters we will examine the use of machine technology in the context of administrative decision-making by looking at each of the above elements in turn. These chapters are not intended to be exhaustive or definitive. There are myriad ways in which administrative decision-making can go wrong – we are highlighting just some of the more obvious ways things can go wrong when machines are used.
In so doing, we aim to demonstrate why and how

the well-established elements of good administrative decision-making listed above must continue to be given central focus even – or perhaps especially – when new technologies are being used.
Some machine technology use by government will be ‘legally unexceptional’
It is important to acknowledge that there will be many governmental uses of machine technology that will likely be considered legally unexceptional, in the sense that they will probably raise few or no significant concerns from an administrative law perspective.
This does not mean that the principles of administrative law summarised above and considered further below (and especially the underlying requirement of government agencies to act only within their legal authority) do not apply. However, the risks of automating some administrative tasks of government will obviously be much lower in some cases.
There are numerous straightforward administrative decisions that are non-discretionary – where ‘if X’ is the case then the decision must be ‘Y’, and where the question of whether X is the case will be obviously and incontrovertibly true or false. These would seem to be the kinds of functions that may be suited to automated processes, particularly as the reasons for any decision will be clear and there should be no difficulty in identifying if the decision was wrong, and obtaining redress if it was. In such very simple (and especially high-volume) decision-making, community benefits of automation in terms of accuracy, speed and efficiency may ‘count for a great deal’.69
7. Proper authorisation
7.1 Legal power to make the decision
We are primarily concerned with the legal principles that govern the performance of statutory functions – that is, activities where the source of power is legislation (including Acts, Regulations and other instruments). There are also ‘non-statutory’ sources of government power, such as the powers the Crown can exercise in common with other legal persons (sometimes referred to as ‘executive power’).70 These include establishing and running workplaces and other enterprises, entering into contracts, procuring goods and services, and bringing and defending proceedings.71 Of course, machine technology can support the exercise of non-statutory powers too, and its use there has the potential to raise various legal and ethical concerns as well as potential benefits (see chapter 4 above).
However, our main concern here is with the exercise of statutory powers. As is the case with statutory powers exercised exclusively by humans, those exercised by, or with the assistance of, machines will only be lawful if they are consistent with the statute that provides the source of the relevant power.
A decision cannot be made to do something that is not within the power given by the relevant statute. This will obviously continue to be true when a machine may be involved in the decision-making process. That point may seem obvious enough. However, its central importance cannot be overstated (see ‘Services Australia Centrelink’s automated income compliance program (Robodebt)’ below).
Services Australia Centrelink’s automated income compliance program (Robodebt)
Much has been written about the issues associated with Centrelink’s automated compliance
program, and it offers a cautionary tale for government use of machine technology.
Under its compliance program, Centrelink sought to use machine technology to raise and collect
debts arising from overpayment of social security benefits and, in some cases, apply a
discretionary 10% recovery fee. The automated system used by Centrelink was flawed and
generated erroneous debt notices. It did this by matching data from the Australian Tax Office with
Centrelink data and averaging the income of social security recipients over a period of time. This
failed to account for periods of fluctuating income, which were important for the correct
calculation of social security benefits.72 The erroneous debt notices were sent directly to
Centrelink customers following notification of a possible discrepancy in their payment.
The automated process replaced the previous manual fact-finding process. While the manual
process may also have used averaged income data to identify and question possible overpayment,
the automated process now treated this data as evidence of a debt under social security
legislation. This triggered a shift in responsibility for proof of debt – the alleged debtor was
required to prove that a debt was not owed.73
Failure to pay could also result in garnishee action by Centrelink, as in the case of Amato v The
Commonwealth of Australia.74 In that matter, declarations and orders were made by consent
acknowledging that the Commonwealth could not be satisfied, based on the income averaging
method, that a debt was owed by the applicant, and that there was no foundation for imposing a
penalty or taking garnishee action.75 In November 2019, Centrelink stopped raising debts on the
basis of income averaging and a class action lawsuit was filed against the Commonwealth of
Australia. Centrelink is currently refunding eligible people who paid debts raised using averaged
income information.76


Centrelink’s automated income compliance program has been subject to two Senate inquiries in
addition to scrutiny through the Senate Estimates process. The Senate Community Affairs
References Committee made several recommendations including a review of the legal
requirements for all Services Australia compliance activities relating to overpayment.
This case study demonstrates how errors can have impacts on a large scale when machine
technology is involved, and the importance of agencies obtaining and thoroughly considering legal
advice when designing machine technologies, to ensure they apply the correct interpretation of
the legislation.
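The flaw at the heart of the income-averaging approach can be illustrated with a few lines of arithmetic. The sketch below uses an invented, simplified income test (the maximum rate, free area and taper are hypothetical, not the real social security rules) to show how averaging annual income can manufacture an apparent overpayment for a person whose income fluctuated:

```python
# Hypothetical illustration only: the figures and the income test below are
# invented for demonstration and are NOT the actual Centrelink rules.

def fortnightly_entitlement(income, max_rate=500.0, free_area=150.0, taper=0.5):
    """Benefit payable for one fortnight under a simple means test."""
    reduction = max(0.0, income - free_area) * taper
    return max(0.0, max_rate - reduction)

# A recipient who worked for half the year and was unemployed for the other half.
fortnights = [2000.0] * 13 + [0.0] * 13   # fluctuating fortnightly income

# Correct approach: apply the income test fortnight by fortnight.
correct = sum(fortnightly_entitlement(i) for i in fortnights)

# Flawed approach: spread annual income evenly across all fortnights.
average = sum(fortnights) / len(fortnights)
averaged = sum(fortnightly_entitlement(average) for _ in fortnights)

print(f"Correctly assessed entitlement: ${correct:,.2f}")
print(f"Entitlement using averaged income: ${averaged:,.2f}")
print(f"Phantom 'overpayment' raised as a debt: ${correct - averaged:,.2f}")
```

On these invented figures, the fortnight-by-fortnight test correctly pays the recipient during the unemployed fortnights, while the averaged figure implies a much lower entitlement and therefore a debt that was never actually owed.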

7.2 Legal authority of the person making the decision
When Parliament creates a statutory function, it gives someone (or more than one person) power to exercise that function. This person must be a ‘legal person’. A legal person can be a natural person (a human being) or a legally-recognised entity, such as a statutory corporation. In other words, statutory functions are granted to someone who is legally capable of exercising powers and being held accountable for obligations.77
Commonly, when Parliament gives someone power to exercise a function, it will also permit that person to formally delegate the function to a delegate.78 Those delegates can then perform the function, as long as they comply with any conditions set out in the statute or the instrument of delegation.79
Just as a statutory function can only be given by Parliament to a legal person, the function can only be delegated to a legal person.
When a function is delegated, the delegate can independently exercise the function in the same way as the person on whom Parliament conferred the function.80 This is the way in which many statutory functions are performed.
At law, if a person purports to perform a function:
 without Parliament having given them the power to do so, and
 without a proper delegation,
their exercise of the function may be invalid.
Statutory functions are not, and cannot be, granted to or delegated to a machine.81 The authority and responsibility for exercising a statutory function can only be conferred on or delegated to a legal subject (a someone) and not a legal object (a something).82

Administrative assistance (the Carltona principle)
Even when a function has not been formally delegated, the person who has been conferred the function may be able to obtain assistance in the exercise of the function. Bodies corporate can only act through human agents, but even human administrators may be assisted in performing their statutory functions, at least to some extent.83
This principle, sometimes referred to as the Carltona principle,84 recognises that, in conferring a statutory function on an administrator, Parliament does not necessarily intend that the administrator personally undertake every detailed component or step of the function. As a matter of ‘administrative necessity’, some elements of a function might need to be shared with others who are taken to be acting on the administrator’s behalf. The extent to which performance of functions can be shared under the
Carltona principle will depend on the particular statutory function.


The reasoning underlying the Carltona principle appears to be sufficiently general that it could extend to permit at least some uses of machine technology. That is, if the holder of a statutory function, having regard to ‘practical necessity’,85 cannot be expected to personally perform every step of a function in every case, there seems no reason why they should be limited to assistance only from human agents.
Instead, they may be able to share performance of components of the function with a machine.
However, whether using human or machine assistants, the Carltona principle only permits assistance that is consistent with the administrator remaining, at all times, the one who ultimately retains control of the function and is accountable for its performance. There may also be doubt as to whether assistance can extend to activities that are not routine or that involve the exercise of a statutory discretion.86
Further, the principle is based on a necessity imperative. The holder of a statutory function cannot rely on it to authorise sharing performance of a function merely on the basis that it might be more efficient or otherwise desirable to do so.87 While it is possible the Carltona principle may be extended in the future,88 whether and how that might happen is not clear.
To date, the Carltona principle has been concerned only with the ability of administrators to rely on human agents. The reasoning that underpins that principle means it has the potential also to support some uses of machine technology.89

Relevant inputs in decision-making
The Carltona principle is not the only means by which administrators may obtain assistance, whether from other people or other things, to help them better perform their functions.
For example, depending on the particular function, administrators can (and in some cases should, or even must) draw upon the scientific, medical and other technical expertise of others whose input will be relevant contributions to their decisions. Sometimes, this input can even be adopted as a component of the decision of an administrator for certain purposes. For example, an expert medical assessment that provides a report on a person’s level of impairment may be adopted by an administrator for the purposes of then making a compensation decision.90
Of course, apart from these expert human inputs, administrators also use traditional forms of technology to provide inputs into their performance of statutory functions: to record, test, calculate, detect, measure, model, and so on.
Inevitably questions will arise as to the extent to which new machine technologies might be recognised as merely an example, or an extension, of these situations.
More simple machine technologies might be viewed as legally comparable to existing tools that administrators are permitted to use in the exercise of their functions.91 On the other hand, as machine technologies become more sophisticated, might their outputs be recognised as equivalent to the advice of a human expert, where an administrator may take such expertise into account in their decision-making?
We expect that, like the obtaining of expert advice and the use of traditional forms of technology, there will be at least some forms and uses of sophisticated machine technology that will come to be recognised as legitimate tools administrators can use to assist them to perform their functions, within the implicit authority conferred on them by the statute.
However, whether and the extent to which this is so will need to be carefully considered on a case-by-case basis, taking into account the particular statutory function, the proposed technology and the broader decision-making context in which the technology will be used.


7.3 The scope of the decision-making power – including the extent of
any discretion
Most administrative powers given to decision makers by legislation involve at least some element of discretion.92
This raises particular challenges when it comes to the automation of decision-making through machine technology, as any automation will need to be consistent with the discretionary nature of the power.

What is discretion?
We are using the term ‘discretion’ in a broad sense.93 A function is ‘discretionary’ where there is no one right outcome, or where no one consideration or combination of considerations is ‘necessarily determinative’ of the outcome in all cases.94 This means that there is some element of ‘decisional freedom’ – the outcome will be one about which there is ‘room for reasonable differences of opinion’95 or a ‘choice of legally available alternatives’.96
A discretionary function is therefore one in which an administrator has some freedom as to any one or more of the following:
 whether to exercise the function
 how to exercise the function
 the output of the function (i.e. what is ultimately done or not done).
That freedom does not have to be absolute. Indeed, no discretion is ever completely unfettered. There will always be some constraint on how the administrator acts. There is also a statutory presumption which, if it is not displaced, requires the discretion to be exercised in a manner that is consistent with standards of ‘legal reasonableness’.97
The types of functions that fall within this broad concept of ‘discretion’ include those where an administrator:
(1) is able to decide whether or not to exercise the function on any given occasion (typically
identifiable by the use of the term ‘may’ in the relevant legislative provision)98
(2) can exercise the function in more than one way (for example, may grant a licence with or
without conditions)
(3) is called upon, when exercising the function, to take into account a range of factors, some of
which may ‘pull in different directions’99 without a fixed formula (or ‘recipe’) for applying those
factors
(4) is to evaluate whether someone falls within a category, class or definition contained in
legislation (for example a ‘fit and proper person’ or ‘de facto relationship’) that involves an
evaluative judgment100
(5) is permitted or required to exercise a function only if they possess a particular state of mind
(such as ‘reasonable suspicion’, ‘reasonable belief’ or ‘satisfaction’).101

A technical note about ‘discretionary powers’
It should be noted that the concept of discretion we are using here may be broader than what a court or a lawyer will be referring to when they speak of a ‘discretionary power’.
For example, an administrator who is required to do Y if (and only if) satisfied of X may be said to have a legal duty (and not merely a discretionary power) to do Y. This is because, provided they are satisfied of
X, they must do Y. Although being satisfied of X may involve some element of subjective evaluation or choice, that does not, under this more technical sense of discretion, change the function from a duty to a discretionary power.
Where it is necessary in this report to distinguish between these different concepts of discretion, we refer to the narrower and more technical concept as a ‘formal discretionary power’. Otherwise, when we refer simply to ‘discretion’ we mean it in its more general sense as explained above.102

The imperative to preserve discretion
By giving an administrator a discretion, Parliament has relinquished some element of control over individual outcomes, recognising that those outcomes cannot be prescribed or pre-ordained in advance by fixed rules.
But at the same time, Parliament is also prohibiting the administrator from setting and resorting to fixed rules that Parliament itself did not fix. If Parliament had intended to lay down fixed, pre-determinative rules for the exercise of these functions, it would have done so.103 Where it has chosen not to do so, that decision must be respected.
This means that exercising the discretions that Parliament has given to an administrator is just as important as administrators complying with any fixed rules Parliament has prescribed.

Potential issues with using machines in the exercise of discretionary decisions
Over time, administrative law has developed specific rules concerning the exercise of statutory discretions. These include the so-called ‘rule against dictation’ – see ‘Machine technology and the ‘rule against dictation’ below.

Machine technology and the ‘rule against dictation’
The so-called ‘rule against dictation’ requires an administrator who has been tasked with a
discretionary function to exercise that discretion themselves, and not in automatic obedience
to the directions or instructions of another.104
This rule prohibits not just circumstances where the administrator is given an express order or
direction from another to act in a particular manner. It also prohibits circumstances where the
administrator feels obliged to exercise their discretion in a particular way based on the
conclusions or wishes of another – even in circumstances that fall short of a direction or
command.105
This does not mean an administrator cannot consider the views of others. However, there is
sometimes a fine line between taking into account what is said by another person (which is
permitted) and acting at another’s behest (which is not).106
We are not aware of any judicial consideration to suggest that the rule against dictation might
be applied directly to machine technology. However, it is easy to see how the principle that
underpins that rule – the need for the nominated administrator to exercise their own
discretion – could have implications for the lawful use of machine technology by
administrators. In particular, an administrator who is under a statutory duty to exercise
discretion may be acting unlawfully if they automatically or unreflectively adopt the
instructions, recommendations, advice or output of someone or something else, as to do so
would not constitute a genuine exercise of their own discretion.107

There are also rules governing (and limiting) the use of policies and other guidance material to regulate the exercise of discretion – see ’Machine technology as a means of delivering ‘guidance’ to administrators’ below.

Machine technology as a means of delivering
‘guidance’ to administrators
Administrators exercising discretion can be aided in that task by policies, guidelines and other
similar resources (‘guidance material’). Might it be appropriate to consider at least some
forms of machine technology as effectively just guidance material delivered in a new way? Rule-based
systems, in particular, that guide administrators through a series of questions and
decision-trees may ultimately contain the same information in substance as what was
previously provided to them in a written policy or manual.
The development and use of such guidance material to aid the exercise of discretionary
functions is generally recognised, often encouraged,108 and may sometimes even be required
by statute.109 Guidance material assists in addressing the tension that may exist between the
flexibility and individualisation that discretion permits, and the consistency that public law and
good administrative practice requires.110 Policies can improve consistency and certainty,
mitigating the extent to which outcomes are affected by ‘individual predilection’ (that is,
individual preference).111
Guidance material (some of which can be relatively detailed)112 is seen as particularly desirable
in the exercise of what are sometimes described as ‘high volume’ functions.113 Use of this
material in these contexts may be necessary to avoid ‘substantial injustice’,114 or ‘blinkered
and individualised decision-making [which] would be a recipe for maladministration’.115
If machine technology were employed to guide the exercise of discretionary functions,
established principles about the lawful use of existing forms of guidance material might be
helpful. For example, courts have generally recognised that guidance material to assist in the
exercise of discretionary functions will be lawful provided it:
 does not give effect to purposes inconsistent with the purposes of the legislation that
created the function116
 leaves the range of discretion intact,117 and does not inappropriately exclude or narrow
the interpretation of criteria prescribed by legislation118
 does not create inflexible rules that an administrator cannot depart from in the
individual case119 and
 is not treated by administrators as giving rise to fixed determinative rules to be adhered
to regardless of the merits of the individual case.120
Even if a particular machine could, in principle, be considered as conceptually equivalent to
some traditional guidance materials, there may be an important distinction in practice: the
tendency humans have to uncritically accept and inappropriately rely upon the output of
machine technology systems. If that happens, then there is a risk that the machine moves
beyond merely guiding the exercise of discretion (permissible) and instead operates in a way
that means that discretion has effectively been lost (impermissible). This concern was noted by
the Commonwealth Ombudsman in a 2007 report about a series of investigations conducted into
immigration decisions.121

Such rules are best viewed as applications of a more general principle that seeks to preserve the discretion that Parliament has incorporated into the function and conferred on a particular person or persons: where a statute gives discretion to an administrator, the administrator must remain capable of exercising, and must in practice exercise, that discretion. Those given a discretionary statutory function must ‘keep their minds open for the exceptional case’.122
Given this principle, there may be risks in using some forms of machine technology in the exercise of statutory functions that have discretionary elements.
This was the view of the Administrative Review Council in 2004. It concluded that, while ‘expert systems’ might be used to assist an administrator to exercise a discretionary function, the exercise of the discretion should not be automated and any expert systems that are designed to assist in the exercise of discretionary functions should not fetter the exercise of that function by the administrator.123
In summary, and at least on the current authorities, it should be assumed that:
Machine technology cannot be used in the exercise of discretionary functions if (and to the extent that) it would result in the discretion being effectively disregarded or fettered.
If a discretion has been given to an administrator, the discretion must remain the preserve of that administrator and there must, in law and in practice, continue to be a genuine exercise of that discretion by that administrator.124
If the introduction of machine technology into a discretionary decision-making system has the effect that the administrator is no longer able to – or does not in practice – continue to exercise genuine discretion, that system will be inconsistent with the statute that granted the discretion, and its outputs will be unlawful.125
In practice, this likely means that discretionary decisions cannot be fully automated by a machine.
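One way of giving effect to the principle above is a ‘human-in-the-loop’ design, in which the machine may triage and recommend but the nominated administrator makes, and remains free to depart from, the final decision. The following is a minimal sketch only; all function names, fields and rules are hypothetical, and a real system would need to reflect the particular statute:

```python
# A minimal human-in-the-loop sketch: the machine recommends, the administrator
# decides. All names, fields and triage rules here are hypothetical.

from dataclasses import dataclass, field

@dataclass
class Recommendation:
    outcome: str                    # e.g. "grant" or "refer for detailed assessment"
    reasons: list = field(default_factory=list)  # shown to the administrator

def machine_recommendation(application: dict) -> Recommendation:
    """Rule-based triage only; mirrors written guidance and decides nothing."""
    if application["complete"] and not application["risk_flags"]:
        return Recommendation("grant", ["application complete", "no risk flags"])
    return Recommendation("refer for detailed assessment",
                          ["requires evaluative judgment"])

def decide(application: dict, administrator_decision: str,
           considered_merits: bool) -> str:
    """The administrator's decision is the operative one; the machine's
    recommendation is an input they are free to reject in the exceptional case."""
    if not considered_merits:
        raise ValueError("no genuine exercise of discretion")
    rec = machine_recommendation(application)
    # The recommendation is recorded for transparency but does not bind.
    print(f"Machine recommended: {rec.outcome}; "
          f"administrator decided: {administrator_decision}")
    return administrator_decision
```

The design choice worth noting is that the recommendation and the decision are separate objects: the system records what the machine suggested, but an output only becomes a decision through the administrator, and the guard condition models the requirement that they actually turn their mind to the individual case.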


8. Appropriate procedures

8.1 The decision has followed a fair process
Good administrative decision-making requires a fair process. The process will only be fair if it is also reasonably perceived to be fair by those affected. There are many elements that make up a fair process
– those elements include transparency, accountability, and proper management of expectations.
In administrative law, a core requirement for a fair process is known as ‘procedural fairness’ (sometimes also termed ‘natural justice’). Procedural fairness requires:
 that the decision maker acts free of bias (the ‘no bias’ rule), and
 that those directly affected by decisions be given a genuine opportunity to be heard
(the ‘hearing rule’).126
Unless clearly excluded by legislation, administrators must apply procedural fairness principles when exercising statutory functions which could affect the rights or interests of individuals.127 However, what procedural fairness requires in a given circumstance will vary depending on the legislative context128 – there is no one-size-fits-all prescription that will answer all procedural fairness requirements.
The importance of affording procedural fairness may pose several challenges for machine technology.129

Bias
One of the most commonly cited risks of machine technology is the introduction or amplification of inherent biases in the input data. This may be because of the way it has been coded, or it may be as a result of the way in which it has ‘learned’ from data sets that were themselves affected by historical or systemic (and perhaps hidden) biases.
However, the no-bias rule is traditionally concerned with the requirement that the administrator bring an ‘impartial mind’ to the making of the decision.130 Bias may be ‘actual’ or ‘apprehended’.
Apprehended bias occurs where a fair-minded observer might reasonably perceive that a person might not bring an impartial mind to their task.131 Actual bias will be established when the administrator is ‘so committed to a conclusion already formed as to be incapable of alteration, whatever evidence or arguments may be presented'.132
It is unclear if and how the no-bias rule will be applied to address other kinds of systemic bias that may be introduced by machines.133 Similarly, it is unclear whether algorithmic bias – systematic and repetitive errors that result in unfair outcomes – may come to be accepted as a ground of ‘bias’ in judicial review proceedings.
Whether or not that happens, algorithmic bias may result in other kinds of unlawful conduct or maladministration – for example, if the effect is that the decision has been based on ‘extraneous considerations’ (see chapter 9), is unreasonable, has involved unlawful discrimination, or otherwise generates systemically unjust or improperly discriminatory outcomes.134


Algorithmic bias
One of the most significant concerns about machine technology, and in particular those that
apply machine learning techniques, is its potential to reflect and amplify bias against minority
and vulnerable groups. These concerns are heightened by the challenges in detecting such
biases given the complexity and inherent opaqueness of the technology (see chapter 13).
Bias most commonly reflects and amplifies historical biases and inequalities contained,
sometimes hidden, in the data sets from which machines learn.
Bias can also arise where that data set is unrepresentative or incomplete. For example, if
training data135 is more representative of some groups than others, the accuracy of the
model’s outputs tends to be systematically worse for under-represented groups. This has been
observed in the case of facial recognition technology trained disproportionately on lighter skin
tones and therefore significantly less accurate for darker skinned individuals.136 On the other
hand, bias can also result where more data fields are available for some groups than others.
For example, in the United States a child welfare screening tool that is able to use data from
means-tested programs (such as mental health counselling or drug treatment services) will
have that data for lower-income families without having corresponding data on the use of
similar services by wealthier families, with the result that child welfare risks may be
disproportionately rated higher for poorer families.137
Importantly, even training data that does not explicitly include sensitive attributes like race or
gender may be susceptible to bias, because a learning algorithm can develop proxies for
sensitive attributes. For example, in the United States zip codes can often be a proxy for race.
Height or weight may be proxies for gender. Training data that excludes gender fields but
includes names might give rise to gender proxies – for example, an algorithm may learn to
generate different results if the name on record is ‘Tony’ or ‘Toni’.138
This means that an algorithm that is blind to a sensitive attribute may produce a similarly
biased outcome as one that overtly uses the attribute in a discriminatory manner.139 Indeed, in
some cases to simply omit any sensitive attributes may be counter-productive – both because
it may lead to complacency and a failure to recognise and address proxies hidden elsewhere in
the data, and because it may prevent the designers from building in processes that attempt to
‘correct’ for historical bias (a so-called ‘fairness-through-awareness’ approach).140
The key point is that algorithmic bias may arise without any intention to discriminate, without
any awareness that it is occurring, and despite the best intentions of designers to exclude data
fields that record any sensitive attributes or any obvious (to humans) proxies. This is one
reason why formal evaluation, auditing and ongoing monitoring processes are essential.
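The proxy effect described above can be demonstrated on entirely synthetic data. In the toy sketch below (invented population, thresholds and postcodes), a scoring rule that never sees the sensitive attribute still produces starkly different approval rates for the two groups, because postcode is correlated with group membership:

```python
# Toy illustration with synthetic data: an 'attribute-blind' rule can still
# discriminate through a proxy. Everything here is invented for demonstration.
import random

random.seed(0)

# Group membership is hidden from the model, but postcode is highly
# correlated with it (as zip codes can be with race in the US example).
def make_person(group):
    postcode = "2000" if (group == "A") == (random.random() < 0.9) else "3000"
    return {"group": group, "postcode": postcode}

people = ([make_person("A") for _ in range(1000)] +
          [make_person("B") for _ in range(1000)])

# A scoring rule learned from historical outcomes that happened to track
# postcode; it never references 'group' at all.
def approved(person):
    return person["postcode"] == "2000"

def approval_rate(group):
    members = [p for p in people if p["group"] == group]
    return sum(approved(p) for p in members) / len(members)

print(f"Approval rate, group A: {approval_rate('A'):.0%}")
print(f"Approval rate, group B: {approval_rate('B'):.0%}")
# The disparity tracks the postcode proxy even though the rule is 'blind'.
```

Because 90% of group A and only 10% of group B live in the approved postcode in this synthetic population, the ‘blind’ rule approves roughly 90% of one group and 10% of the other, which is why auditing outcomes by group, rather than merely excluding sensitive fields, is the meaningful check.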
Some of the growing number of examples from around the world where machine technology
has been shown to generate or amplify bias include:
 Recently in the United Kingdom, government use of machine technology came under
scrutiny when the Department for Education used automation to grade school leavers who
could not sit exams due to the coronavirus pandemic. The system downgraded the results
of large numbers of students in the state school system based on a model which was
found to be favourably biased toward private schools. Ultimately, the grades awarded
using automation were withdrawn in favour of predictions made by teachers.141
 A team of researchers were recently able to demonstrate hidden racial bias in an algorithm
used widely in the US health care system to prioritise referral to a program aimed at
improving outcomes for patients with complex medical needs. The algorithm used the cost
of healthcare as a proxy for illness, which resulted in bias against African American
patients as they had systemically unequal access to care, meaning that less money was spent
on their health care. By failing to take those systemic differences into account, the system
effectively assumed that African American people were healthier than they were, and
correspondingly required African American people to be sicker in order to be assessed as
eligible for additional assistance. The algorithm developer later confirmed the results
found by the researchers.142
 Amazon’s experimental hiring tool used machine technology to review and score job
applicants’ resumes from 1 to 5 stars. The experiment was shown to be biased towards
men, because the machine had been trained using the resumes submitted to Amazon over
the previous decade, and most of those were from men (in a technology industry still
dominated by male employees).143 The algorithm had learnt to model predicted
employment outcomes based on word patterns in the resumes, rather than relevant skill
sets. Consequently, it penalised resumes that included the word ‘women’s’ or referred to
women’s colleges. Although Amazon ‘scrubbed’ the data to prevent it from overtly
discriminating based on those parameters, there was no way to ensure the algorithm
would not learn an alternative model that would also unfairly sort and rank male
candidates higher, and the algorithm was scrapped.144
 Amazon also used an algorithm to decide which neighbourhoods would be eligible for,
or excluded from, its same-day Prime delivery system. The decision relied on whether
a neighbourhood had a sufficient number of existing Prime members, proximity to a
warehouse, and availability of willing delivery couriers. The purpose was to exclude
unprofitable neighbourhoods. However, the result was unintentionally discriminatory,
as the model resulted in the exclusion of poor and predominantly African American
neighbourhoods.145
 A Georgetown Law School study found significant overrepresentation of African American
people in ‘mug-shot’ data bases. This meant that facial recognition networks used by law
enforcement produced a biased effect, as the faces of African American people were more
likely to be falsely matched.146
 In a well-reported case, the Allegheny County Department of Human Services purchased a
decision tool (the Allegheny Family Screening Tool) to generate scores as to which children
are most likely to be removed from their homes within two years or to be re-referred to
the child welfare office for suspected abuse. The tool was rebuilt after the County
undertook an independent evaluation, which identified statistical unfairness, including
racial bias.147

A right to be heard
The hearing rule might typically require an administrator to notify an individual of a possible or proposed decision or course of action and invite them to respond, with information or arguments, before the decision is finalised or course of action taken.148
One of the important questions that arises here is whether it is necessary, before the affected person is invited to provide their views on the proposed decision or action, to inform them that a machine has proposed, or been involved in proposing, that decision or action.


We are not aware of any court decision that has directly addressed this question. We expect that, from a strictly legal perspective, the answer may be that it depends on the particular decision or action in question and the nature and extent of the particular machine process involved.
However, as a matter of good administrative practice, our view is that this information should always be disclosed to the person.
Even if the administrator does not consider that the machine’s involvement could be relevant in any way to anything the person might possibly wish to put forward for consideration, the person may have a different view.
Perhaps even more importantly, the mere fact that a machine has made (or was substantially involved in making) the proposed decision may be important to the person in deciding whether or not to make any submission at all. Knowing that an adverse decision was merely the output of some machine process may impel the affected person to make a submission when they otherwise might have simply accepted the outcome. If a decision is going to be made that affects them adversely, and even if it turns out that the decision is correct and cannot be changed, they may reasonably want to ensure that their situation has at least been considered by someone with a genuine (human) capacity to understand the decision and its consequences and impact for them. The right to be heard is not just about ensuring the correct decision is made in light of all relevant considerations – it is also a right of ‘respect’ to the person affected by the decision.
In our view, as a matter of good administrative practice, a right to be heard before a decision is finalised generally requires the person also to be told if a machine has made, or has materially contributed to the making of, the proposed decision.

8.2 Other legal and ethical obligations
Compliance with other legal and ethical obligations in an administrative decision-making process involves such things as acting honestly and avoiding conflicts of interest. To some extent, the use of machine technology could help to mitigate the risk of contraventions of these obligations that might otherwise result from human failings.
However, the requirement to meet other obligations also means complying with laws beyond the statute that creates the relevant function. This reflects an important element of the ‘rule of law’: like citizens, government must also abide by the law.
Some of these other laws, such as those governing privacy, freedom of information and anti-discrimination, have general application to most administrative functions, but will have particular implications for processes that involve machine automation:

(a) Privacy
Administrative agencies are required to comply with general privacy obligations concerning the
collection, storage, use and disclosure of personal information. In NSW, those obligations are
imposed by the Privacy and Personal Information Protection Act 1998 (PPIP Act) and the Health
Records and Information Privacy Act 2002 (HRIP Act).
Concerns have been raised about machine technology within the broader context of data
protection, as governments increasingly digitise their operations. The use of machines will, in many
instances, involve collecting, translating and reducing personal information to a form that is suitable
for use by machine technology.149 That process will also be coloured by assumptions and
interpretations of the designers of the system in determining what personal information is relevant
to code and the significance of any particular piece of information.150 This may raise particular issues
for agencies’ obligations concerning the currency, accuracy and completeness of the personal
information they hold, and whether that information might be misleading. There may also be issues to consider relating to obligations concerning the permitted uses, retention, safe storage, and destruction of the personal information that is being held.
Agencies will need to have proper regard to these issues in the design and implementation of
machine technology systems. The NSW Privacy Commissioner has noted that presently there is
no mandatory legal requirement even to conduct a privacy impact assessment before adopting
machine technology, even though such technologies ‘can give rise to unique and complex
privacy issues’.151
While allegations of violations of a person’s privacy are excluded from the Ombudsman’s
jurisdiction (as they are matters for the NSW Privacy Commissioner), we suggest that a privacy
impact assessment should be included as an essential element of any machine technology design
process, and it should be made public.

(b) Freedom of information
NSW’s primary freedom of information law – the Government Information (Public Access) Act 2009
(NSW) (GIPA Act) – deals with information and the records that hold it in a way that is
intentionally technology-agnostic.152
The aims of the legislation are to open government information to the public to maintain and
advance a system of responsible and representative democratic government.
The GIPA Act places various obligations on agencies within NSW in respect of the publication and
release of the information that they create and hold. The GIPA Act also provides rights for people to
apply for access to government information.
These rights remain applicable where government uses technology to provide services and inform
decisions.153
The NSW Information Commissioner has issued guidance noting that:
This technology [automated decision-making systems that involve a computerised process that either
assists or replaces the judgement of a human decision-maker] can perform many functions that
previously could only be done by humans. As these systems are adopted by governments, citizens
will increasingly be subject to actions and decisions taken by, or with the assistance of, automated
decision-making systems. To fully exercise their rights, it is important that individuals are able to
access information on how a decision is made and what information was used to reach that
decision.154

(c) Anti-discrimination
Under the Anti-Discrimination Act 1977 (NSW), it is unlawful, in the provision of services and in
a broad range of other contexts, to discriminate against a person because of any of the following
characteristics: disability (including disease and illness), sex (including pregnancy and
breastfeeding), race, age, marital or domestic status, homosexuality, transgender status and
carer’s responsibilities.155 The protections afforded by this Act are long established and similar
laws apply in other Australian jurisdictions.
Discrimination may be direct or indirect.156 The use of machine technology can result in outcomes
that involve either direct or indirect discrimination.157 Furthermore, the use of such technology
may also make it more difficult to detect or to understand if unlawful discrimination is occurring.
Apart from the difficulties in obtaining accessible information about how the technology has
generated its output, there may be a tendency to assume that these systems are neutral, free
of human bias and therefore incapable of unlawful discrimination.158 This may lead agencies to discount or minimise the need to ensure that the operation of machines complies with anti-discrimination obligations.


However, in the case of pre-programmed rules, a machine will obviously reflect and potentially
amplify any intended or unintended pre-existing bias or assumption which influenced its
programming.159
More recent machine learning technologies can present a more complex challenge. Current
forms of machine learning effectively develop their own rules based on statistically significant
correlations from available data. Such self-developed rules may be constructed from and
promote proxies for protected characteristics within their decision-making matrices, whether
as a result of assumptions in the training data or learnt correlations.160 The learning capabilities
of these systems also mean that these rules are not static and may change over time, for better
or worse.
This means that it is not sufficient for agencies to design machines in ways that are not overtly
discriminatory, and to avoid training data sets that explicitly record protected characteristics.
Agencies will need to test, and then regularly monitor and audit, the operation of their learning
machine to ensure it continues to operate in accordance with anti-discrimination legislation.161
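By way of illustration only, the kind of regular output monitoring suggested above can be sketched in a few lines of code. Everything here is hypothetical: the group labels, the approval data and the 0.8 threshold (a ‘four-fifths’ convention drawn from overseas practice, not from NSW law) are assumptions, not features of any actual agency system.

```python
from collections import defaultdict

def selection_rates(decisions):
    """decisions: iterable of (group, approved) pairs.
    Returns the approval rate observed for each group."""
    totals = defaultdict(int)
    approved = defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        if ok:
            approved[group] += 1
    return {g: approved[g] / totals[g] for g in totals}

def disparity_flags(rates, threshold=0.8):
    """Flag any group whose approval rate falls below `threshold`
    times the best-performing group's rate. The 0.8 figure is an
    illustrative convention only; it is not an NSW legal test."""
    best = max(rates.values())
    return {g: rate / best < threshold for g, rate in rates.items()}
```

An audit of this kind only surfaces disparities for investigation; whether any flagged disparity amounts to unlawful indirect discrimination remains a legal question.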

There may also be less obvious legal rights and obligations that interact with particular statutory functions, which must be considered when functions are to be handled by machine technology. Few statutory functions operate independently of any other written or unwritten laws.
In the Revenue NSW case study, for example, the function being exercised was the issuing of garnishee orders on banks and other financial institutions under the Fines Act 1996 (NSW). These orders direct third parties – financial institutions – to deduct specific amounts from the accounts of account holders and transfer them to Revenue NSW. Failure by a financial institution to comply with a direction of this kind can constitute an offence.
However, any technological process to issue such orders, even if it adhered to all of the provisions of the Fines Act, might still need to consider insolvency and bankruptcy legislation provisions that govern the priority in which debtors of an insolvent company or bankrupt individual are to be paid.162
There are also principles of unwritten law that prevent garnishee orders being issued on joint accounts held with financial institutions, or accounts where the funds are held in trust.163
Few statutory functions operate in isolation. Many of them must comply with other written and unwritten laws. It will be necessary to take steps to identify these and ensure that any use of machine technology is compliant with them.
Some requirements – such as those imposed by privacy, freedom of information and anti-discrimination laws – will likely always be relevant to some extent, and need to be considered.
Care is particularly needed to ensure that the risk of algorithmic bias does not result in conduct or decisions that would amount to indirect discrimination.

8.3 Reasons are given for the decision (particularly decisions that affect
the rights or interests of individuals)
The giving of reasons164 is a basic principle of good administration – a person, especially one whose individual interests or rights have been adversely affected, is generally entitled to an explanation as to why that has happened.
In some cases the requirements of procedural fairness will mean that there is also a legal duty to provide reasons.165 Sometimes this requirement is expressly imposed by statute, either specifically in relation to a particular decision, or more generally under certain circumstances.166 In other cases it is implied because of the nature of the function, the person exercising it and the impact it has on those affected by it.167 If, for example, there is a right of review or appeal, a requirement to provide reasons will usually be implied, as knowing the reasons for a decision is essential to a person’s ability to decide whether and how to challenge it.
Even where there is no specific legal requirement, the NSW Ombudsman’s Office and other ombuds have consistently taken the view that reasons are an essential requirement of good administrative practice and should be provided to any person whose interests are significantly affected by a decision, at least upon request.
The purposes of reasons include:
(a) Transparency
The person is better able to see:
• The facts and reasoning that were the basis of the decision.
• That the decision was not made arbitrarily or based on mere speculation.
• To what extent any arguments they put forward have been understood or considered.
• Whether they have been dealt with fairly.
• Whether or not they might choose to exercise any rights of objection, review, appeal or complaint, and the arguments they will have to respond to if they do.
• How they might need to adjust their position to achieve more favourable decisions in the future.
(b) Accountability
Decision makers who are required to explain their decision have a greater incentive to ensure those
decisions are defensible and based on acknowledged facts. Supervisors, as well as those with an
external review role, are also in a better position to assess the decision, including whether it was
reached lawfully, based on relevant considerations, and on the merits of the case.
(c) Quality
Decision makers who are required to explain their decision have a greater incentive to carefully
identify and assess the relevant issues and apply rigour in their reasoning. Other decision makers
can use reasons as guidance for the assessment or determination of similar issues in future.168
Where reasons are required, the degree of detail required in those reasons may also be prescribed by legislation but, where it is not, will depend on the nature of the administrator and the function, and the circumstances in which it is exercised.169
The prevalence of complaints, proceedings and applications in which individuals claim they have not been given adequate reasons for decisions is testament to the importance of (and the complexity involved in) ensuring that actions are adequately explained and capable of being understood.

Using machine technology to administer
Commonwealth child support payments
Commonwealth child support legislation imposes an obligation on certain parents to make
periodic payments through the Child Support Registrar to the other parent of their child or
children. The legislation recognises that the liable parent may also make payments to third
persons, for example when school fees are paid directly to the child’s school.


When these payments are made, the registrar has the discretion, in certain circumstances, to
deduct those other payments from the amount that would otherwise be required to be paid to
the other parent.
In proceedings concerning the registrar’s decisions to deduct certain payments of this kind, it
became evident that the registrar had been using an ‘automation’ process. This had been done
by automatically crediting the amount of school fees paid by the liable parent over several
months. The registrar would then advise the other parent that the payment had been made
and an equivalent amount had been deducted from the child support payment, giving them an
opportunity to object.
The Tribunal criticised this practice, of issuing ‘provisional’ decisions and inviting the payee to
object, as not in compliance with the legislation, describing it as ‘well intentioned’ but ‘legally
flawed’.170 The Tribunal also added that ‘[i]t must be doubtful that the sort of automation
contended for can be consistent with a discretionary administrative decision in any event’.171

Where reasons are required to be given, there is no uniform standard that will be adequate in all circumstances. Generally, reasons should, when read as a whole, show the ‘actual path of reasoning’ taken by the administrator,172 or provide ‘an explanation connecting any findings of fact with the ultimate decision’.173 Interpretation legislation in some Australian jurisdictions may also mandate certain content.174
The giving of reasons is just as important when machine technology has been involved in the making of decisions.
In our view, if machine technology has been materially involved in the process of making a decision, and the decision is one for which reasons ought to be given, then it would be unreasonable for the statement of those reasons to fail to include some explanation of the fact, nature and extent of the machine technology’s involvement.
Reasons why machine involvement should be disclosed when giving reasons include:175
• Failure to disclose this information appears misleading by omission, as most people would otherwise assume that the statement reflects the human decision maker’s own reasoning processes. If, however, someone or something else has taken much of the cognitive load of the decision-making, then the decision maker’s reliance on or consideration of their work is part of the reasons for the decision and should be disclosed as such.
• The involvement of a machine in the decision-making process may affect the ways in which the decision can be challenged or reviewed. Informing the person affected of that involvement is therefore important to give them a genuine opportunity to decide whether and how to exercise their rights of challenge or review.
• Disclosure provides the opportunity to build confidence in decisions and decision-making processes. People may be reassured by the use of properly-designed technology that has helped to ensure the impartiality, rigour, consistency and accuracy of decisions. On the other hand, secrecy about the involvement of technology is likely to undermine public confidence and raise suspicions about why that involvement was not disclosed.176
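To illustrate the suggestion that a statement of reasons should record the fact, nature and extent of machine involvement, a minimal record structure might look like the following sketch. The field names are invented for illustration and are not drawn from any actual agency system.

```python
from dataclasses import dataclass

@dataclass
class StatementOfReasons:
    """Illustrative reasons record that discloses machine involvement."""
    findings_of_fact: list
    reasoning: str
    machine_involved: bool = False
    machine_role: str = ""     # e.g. "risk score considered by the officer"
    machine_output: str = ""   # what the system actually produced

    def render(self) -> str:
        parts = ["Findings: " + "; ".join(self.findings_of_fact),
                 "Reasoning: " + self.reasoning]
        if self.machine_involved:
            parts.append(f"Machine involvement: {self.machine_role} "
                         f"(output: {self.machine_output})")
        return "\n".join(parts)
```

The point of the structure is simply that machine involvement is a recorded field of the reasons themselves, not an optional afterthought.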
In chapter 13 we will consider in more detail how requirements of transparency, including the requirement to give reasons, should be considered when designing and implementing machine technology.


9. Appropriate assessment

9.1 The decision answers the right question (which necessitates asking
the right question)
The development of machine technology for use in the exercise of statutory functions will require the translation of legislation, and related guidance material (such as directions or policies), into a form capable of being turned into code that machines can read.
Whether, and to what extent that can be done will depend on the type and style of legislation, the correct interpretation of the legislation,177 and the capabilities and limitations of the particular technology employed. Importantly, it will also depend on the expertise of those who translate the legal text into code, and the processes developed to undertake that task.
Perhaps the most basic error that can be made when introducing automation into the exercise of statutory functions is to misinterpret or misapply the legislative scheme – effectively, to ask the machine the wrong question. Of course, this is not a risk that is confined to technology: human beings are quite capable of misinterpreting and misapplying legislation.
However, for a variety of reasons the risk may be magnified when technology is involved:

(a) The likelihood of error may be greater
Laws drafted and made by humans, to be read and implemented by humans, do not readily lend
themselves to translation into code. Computer code generally is ‘more precise and has a narrower
vocabulary than natural language’ used in legislation.178 The need to translate law into code
introduces an additional step in realising the intention of Parliament.
Those involved in designing technology to exercise statutory functions will typically not have
expertise or experience in interpreting legislation or exercising administrative functions.
This translation process must also be repeated every time the relevant legislation is amended, and
when judicial decisions or changes to other laws affect the way the legislation is interpreted or
applied.
The risk of error with technologically embedded compliance processes is highlighted by a growing
body of court cases where contraventions of statutory obligations by private entities have been
attributed, at least in part, to ‘information technology system issues’.179

(b) The consequences of error may be more significant
Error in code will almost inevitably affect more outcomes (and therefore more individuals) than an
error committed by a particular administrator. One of the key advantages of machines – their
potential to process high volumes of data at high speed – means that errors may be replicated at a
rate exceeding that of any human administrator. Consequently, the number of people adversely
affected by a single error may be substantial.

(c) The detection of error may be more difficult
While administrators have internal processes for detecting human errors in the exercise of
functions, detecting errors in the outcomes of machine processes will call for interrogation of a
kind beyond the capability of most administrators. Those affected by erroneous decisions,
particularly if they are already vulnerable, may also be less able to identify or effectively challenge
a machine error that arises from within the technology design and where the error is not
immediately apparent in the output of any individual case.180


(d) Rectifying an error may be more costly and take more time
If a human decision maker makes an error, then their conduct can easily be corrected for future
decisions. Even a systemic error perpetuated by an error in policy or other guidance material can
generally be remedied quickly. However, if an error is detected in machine technology, fixing the
error may be difficult, costly and time consuming. This may be particularly so if the technology has
been procured through an external vendor. An agency that is aware that its machine technology
contains an error, but is unable immediately to fix it, may be in a difficult position if the
move to automation has left it with no other means of exercising the function.
Any errors in the translation process may mean that, even in circumstances where technology can otherwise be used consistently with principles of administrative law, doubts will arise about the legality and reliability of any decisions and actions of the public agency relying upon the machine process.

Lost in translation – a simple error converting
legislation into code
A recent complaint handled by our office illustrates the challenges involved in automating
even those statutory functions that on paper seem very simple.
If the holder of a NSW driver licence exceeds the permitted number of demerit points within a
3 year period, Transport for NSW (TfNSW) may suspend their licence and/or declare them
ineligible to obtain a licence for a period of time (a licence suspension).181
If the driver does not wish to serve the licence suspension period, they can opt instead to
enter into a 12-month good behaviour period.182 If the holder incurs any more than 2 demerit
points within that period, they must incur a licence suspension for a period twice as long as the
original licence suspension.183 A licence suspension in those circumstances is not discretionary.
Licence suspensions are initiated through the issuing of suspension notices. These notices
specify the date (a certain number of days after the date of the notice) when the suspension
will begin and how long it will last.
TfNSW has automated its process for issuing notices of these licence suspensions through the
use of machine technology (the DRIVES system). DRIVES has been programmed in such a way
that a different process is followed depending on how long the driver’s licence has until expiry
at the time of the suspension:
• If there are 35 or more days left to expiry, the notice will be automatically issued.
• If there are fewer than 35 days left to expiry, no notice is issued. Instead, when the driver applies to renew their licence they will be denied a licence and given a licence suspension notice.
In the complaint we received, a driver had incurred more than 2 demerit points during the
good behaviour period. At the relevant time, their licence had fewer than 35 days to expiry.
Aware that they faced a 6-month period of licence suspension given the recent demerits, the
driver did not immediately seek to renew their licence. They assumed that their licence was or
would promptly be suspended.
However, because the automated notice system was programmed not to issue a notice unless
or until a new licence application was made, no licence suspension was triggered.


Some months later, the driver applied for a new licence. The application was refused and they
were only then issued with notice of a 6-month licence suspension. As a notice can only set a
licence suspension to commence in the future, this meant that the suspension period only
began then and not months earlier when it should have been triggered.
The case suggests that those coding TfNSW’s machine made certain assumptions, including
that any driver whose licence had expired would apply promptly for a new licence.
The lengthy delay before the notice of licence suspension was issued meant that there was a
lengthy delay before the suspension period commenced. TfNSW acknowledged the error but
noted that it had no power to ‘backdate’ the suspension. It did, however, apologise to the
complainant.
It seems possible that the machine would have been coded differently had the legislation
explicitly set a specific time limit within which any notice must be sent – that is, if the
legislation expressly stated that a notice of licence suspension must be sent within so many
days of a bond breach occurring. However, although the legislation does not say this, a
requirement to issue a notice within a reasonable time is implied by common law, taking into
account the purpose of the statutory requirement and the surrounding legislative
provisions.184 Such an implied requirement may not have been obvious to those involved in
designing the code for the machine if they were not experienced in statutory interpretation.
TfNSW has acknowledged to us that its code is incorrect in this respect, and that notices of licence suspension should always be issued promptly. However, while it is committed to fixing the error, it will not be possible to do so until the next scheduled system update. In the meantime, it will consider whether any interim measures can be put in place.

9.2 The decision is based on a proper analysis of relevant material
No decision-making discretion is given to an administrator in absolute or unconditional terms. All functions are qualified to some extent by available, obligatory and extraneous considerations, each of which impacts on the exercise of a function in different ways:
Available considerations185 are facts or matters that may be taken into account but are not
required to be taken into account.
Obligatory considerations186 are facts or matters that Parliament has determined must be taken
into account when exercising a discretionary power.
A failure to take an obligatory factor into account may render the exercise of power unlawful, and
potentially invalid.187
In many cases, obligatory considerations are expressly stated in the legislation that created the
function.188 Even where legislation giving a function to an administrator does not expressly set out
any obligatory considerations, they may be implied having regard to the subject matter, scope and
purpose of the statute.189
Extraneous considerations190 are facts or matters that must not be taken into account by an
administrator when exercising a function.
Extraneous considerations, like obligatory considerations, may be set out in the statute191 or
(more generally) they will be implied having regard to the subject matter, scope and purpose of
the power being exercised.192 Where an administrator takes into account an extraneous consideration, their conduct can be seen to reflect ‘an extraneous or improper purpose or to render the decision arbitrary or capricious’.193 The result may be that the decision is invalid.
While an administrator may be given considerable freedom with regard to available considerations, they must have regard to obligatory considerations and must not have regard to extraneous considerations.194
When designing and implementing machine technology, it is essential to ensure that doing so does not result in any obligatory considerations being overlooked or extraneous considerations coming into play.
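For illustration, the three categories can be expressed as a simple design-time check on the inputs a machine process is permitted to use. The feature names below are invented; the actual obligatory and extraneous considerations for any function can only be identified from the relevant statute and case law.

```python
OBLIGATORY = {"income", "hardship_claim"}   # must be taken into account
EXTRANEOUS = {"postcode", "ethnicity"}      # must not be taken into account

def check_inputs(features):
    """Return a list of problems with the features a machine process
    would use; anything not listed above is merely 'available'."""
    problems = []
    missing = OBLIGATORY - features
    if missing:
        problems.append(f"obligatory considerations omitted: {sorted(missing)}")
    improper = features & EXTRANEOUS
    if improper:
        problems.append(f"extraneous considerations included: {sorted(improper)}")
    return problems
```

A check of this kind cannot settle what the obligatory and extraneous considerations are, but once they are identified it makes their treatment explicit and auditable in the system design.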

9.3 The decision is based on the merits and is reasonable in the
circumstances
As already noted, many administrative functions involve an element of discretion. This permits administrators to deliver appropriately individualised outcomes when exercising functions in contexts and circumstances that are inherently unique, and which cannot be perfectly foreseen or prescribed in advance. In other words, decisions must be based ‘on the merits’ of each particular case.
This requirement overlaps many of the matters already discussed – for example, a decision that is affected by bias, or that has been made on the basis of discriminatory or otherwise extraneous considerations, is not a decision made on its (legal) merits.
At the same time, any exercise of discretion must also be reasonable. In assessing what is reasonable in any particular case it may be appropriate to consider whether the decision is consistent with decisions made in other cases – that is, are like cases treated alike and are different cases treated differently?195
While not a stand-alone ground of review in administrative law, a lack of consistency may indicate a decision has not been made on its merits, is arbitrary and not reasonable, or is ‘infected’ by some other specific error. In addition, there will also be ‘limits beyond which such inconsistency itself constitutes a form of injustice.’196
Machine technology has the potential to enhance consistency of outcomes between like cases by controlling for the risk of human biases and other idiosyncrasies associated with multiple human decision makers.
However, there may be a tension between the attainment of this consistency and the requirement to treat each individual case on its merits.
Although a judicial rather than an administrative decision, criminal sentencing decisions provide a clear example where such tensions could arise. Under the laws of sentencing, the task of the sentencer is said to be to arrive at an ‘instinctive synthesis’.197 Each individual to be sentenced and each case is inherently unique, and there are multiple factors the sentencer must take into account. These factors are
‘incommensurable, and indeed, in many respects, inconsistent’.198 For this reason, the goal of
‘reasonable consistency’ between sentences is considered incapable of any ‘mathematical expression.’199
In cases such as this, consistency is important, but the kind of consistency that is sought is a consistent application of principles and reasoning to each different decision, and not merely some formulaic or statistical consistency. The High Court has urged caution about the use of statistics and ‘guideline judgments’ in exercising sentencing discretion:200
[R]ecording what sentences have been imposed in other cases is useful if, but only if, it is accompanied by an
articulation of what are to be seen as the unifying principles which those disparate sentences may reveal.
The production of bare statistics about sentences that have been passed tells the judge who is about to pass
sentence on an offender very little that is useful if the sentencing judge is not also told why those sentences
were fixed as they were.201


...
To focus on the result of the sentencing task, to the exclusion of the reasons which support the result, is to
depart from fundamental principles of equal justice. Equal justice requires identity of outcome in cases that
are relevantly identical. It requires different outcomes in cases that are different in some relevant
respect. Publishing a table of predicted or intended outcomes masks the task of identifying what are relevant
differences.202
This does not mean that machine technology could not come to play any role at all in such complex and inherently case-by-case decision-making. However, it does suggest that the role of machine technology will be limited.
For example, a machine that outputs a suggested sentence or sentence range would seem to be significantly less legally safe for the sentencing decision maker than a machine that provides, with some empirically validated degree of accuracy,203 a numerical rating of the risk of reoffending – that is, something the sentencer could feasibly consider within their overall ‘instinctive synthesis’.204 Of course, there are other obvious risks there, including the risk that inherent but hidden biases in the historic data sets – such as racial stereotyping – will be entrenched or even amplified in such machine-generated risk ratings (see chapter 8).
Unsurprisingly, in other jurisdictions where machine technology has been used for judicial decisions like sentencing, significant concerns have been raised about them – see below ‘Machine technology in sentencing – COMPAS’.

Machine technology in sentencing – COMPAS
The Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) is a risk
assessment tool used in the United States that aims to predict the likelihood of reoffending.
Originally designed for use in post-sentencing supervision decisions, it is now used in
sentencing and other criminal justice processes. COMPAS works by analysing an individual
against certain criteria and historical data – an output is produced by way of a score ranking
them from low risk to high risk of committing future crime. The specifics of how the system
uses inference in performing its calculations are not known.
COMPAS has been widely criticised due to doubts about its accuracy and concerns about
discrimination, with one researcher claiming that COMPAS ‘is no more accurate or fair than
predictions made by people with little or no criminal justice expertise.’205
A review of the tool by ProPublica (a non-profit organisation) found that the system was
unreliable in predicting violent crime, as well as being racially biased. ProPublica claims that it
wrongly predicts African American defendants to be reoffenders at almost twice the rate that it
does for white defendants.206 It is understood that ‘race’ itself is not a variable in COMPAS, but
that bias appears to result from the relationships between race and other characteristics
concerning socioeconomic factors (which may operate as proxies for race), as well as because
of historical training data that reflects human biases present in policing decisions.
The creator of COMPAS has disputed ProPublica’s findings,207 and it has since been noted that
the tool creator and ProPublica each measured fairness based on different and incompatible
measures.208
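The incompatibility between the two fairness measures can be made concrete with invented numbers (these are illustrative only, not actual COMPAS figures). A score can flag reoffenders with the same precision in two groups (roughly the calibration measure relied on by the tool’s creator) while still producing very different false positive rates (the measure ProPublica used), whenever the underlying base rates of reoffending differ between the groups:

```python
# Illustrative sketch only - hypothetical confusion matrices, not COMPAS data.
# tp = flagged high-risk and reoffended; fp = flagged but did not reoffend;
# fn = not flagged but reoffended; tn = not flagged and did not reoffend.

def rates(tp, fp, fn, tn):
    """Return (precision among those flagged, false positive rate)."""
    precision = tp / (tp + fp)   # of those flagged high-risk, how many reoffend
    fpr = fp / (fp + tn)         # of those who never reoffend, how many were flagged
    return precision, fpr

# Numbers chosen so that precision is identical across the two groups.
group_a = dict(tp=60, fp=40, fn=20, tn=80)    # higher base rate of reoffending
group_b = dict(tp=30, fp=20, fn=30, tn=120)   # lower base rate of reoffending

prec_a, fpr_a = rates(**group_a)
prec_b, fpr_b = rates(**group_b)

print(f"Group A: precision={prec_a:.2f}, false positive rate={fpr_a:.2f}")
print(f"Group B: precision={prec_b:.2f}, false positive rate={fpr_b:.2f}")
```

On these invented figures the score is equally ‘accurate’ for both groups on one measure, yet non-reoffenders in group A are wrongly flagged at more than twice the rate of those in group B on the other; both measures cannot generally be satisfied at once when base rates differ.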
Tools like COMPAS again raise concerns about the degree of reliance and trust decision
makers place in the outputs of machine technology. ProPublica provides an example of a US
judge who overturned a plea deal more favourable to the defendant to impose an increased
prison sentence after considering the COMPAS prediction that the defendant had a high risk
of future violent crime.209
In 2016, the Wisconsin Supreme Court upheld the use of COMPAS in sentencing.210 The Court
required, however, that a written warning (or ‘advisement’) be given to judges using COMPAS
about its limitations. However, questions remain about how effective such a warning could
be.211 The editors of the Harvard Law Review observe:

[The larger problem] is simply that most judges are unlikely to understand algorithmic
risk assessments… [T]he court was mistaken to think that as long as judges are
informed about COMPAS’s potential inaccuracy, they can discount appropriately.

Additionally, the warning will likely be ineffectual in changing the way judges think
about risk assessments given the pressure within the judicial system to use these
assessments as well as the cognitive biases supporting data reliance…. Research
suggests that it is challenging and unusual for individuals to defy algorithmic
recommendations.212 [references omitted]


10. Adequate documentation

10.1 The circumstances surrounding the making of decisions are adequately
documented and records kept
One of the easiest, and unfortunately most common, findings an Ombudsman can make is that an agency failed to properly document and keep records of its decision-making.
In general, the basic documentation required to be kept for any decision will include:
• details of the decision itself
• reasons for the decision
• the identity of the decision maker
• the date of the decision
• copies of any written notification, or file-notes of any non-written communication, of the
decision, to the person affected or to any other person.
These requirements apply equally to decisions made with the assistance of machines. Under the State
Records Act 1998 (and related information legislation, including the GIPA Act), terms like ‘record’,
‘information’ and ‘document’ are defined in technology-neutral ways:
record means any document or other source of information compiled, recorded or stored in written form
or on film, or by electronic process, or in any other manner or by any other means. 213
Machine technology, however, can raise particular issues for the maintenance of appropriate records.
For that reason, it is important that agencies’ recordkeeping policies include provisions that directly address the records required to be kept in relation to any machine technology in use by the agency.
For example, agencies will need to ensure that they maintain a register of all versions of their systems, with their dates and a description of the changes made between each version.214 The changes should specifically note any updates that have arisen because of a change in law or policy.
Ideally, previous versions should be kept in full, so that past decisions made using that version can be replicated and reviewed, if necessary.
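The kind of version register described above can be sketched in a few lines. This is a minimal illustration under stated assumptions (the class and field names are hypothetical, not drawn from any agency system): each entry records the version, its date, a description of the changes, and a flag for updates driven by a change in law or policy, and the register can answer which version was in force on the date a past decision was made.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class SystemVersion:
    version: str                        # e.g. "2.3"
    released: date                      # date this version went into use
    changes: str                        # description of changes from the prior version
    law_or_policy_change: bool = False  # flag updates arising from a change in law or policy

@dataclass
class VersionRegister:
    entries: list = field(default_factory=list)

    def record(self, entry: SystemVersion) -> None:
        self.entries.append(entry)

    def version_in_force(self, when: date) -> SystemVersion:
        """Return the version in use on a given date - the information needed
        to replicate and review a past decision made with that version."""
        in_force = [e for e in self.entries if e.released <= when]
        if not in_force:
            raise LookupError("no version in force on that date")
        return max(in_force, key=lambda e: e.released)
```

For example, a register with a version released in January 2020 and another in June 2021 would return the 2020 version for any decision made between those dates, supporting the replication and review of past decisions noted above.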
In chapter 13, we consider in more detail the importance, when designing machine technology, of identifying the information that will need to be kept and published to ensure transparency and promote accountability.


Part 3:
Designing machine technology to comply with the law and fundamental principles of good government


11. Putting in place the right team
In this and the next 4 chapters, we consider some of the practical steps that government agencies should take when designing and implementing machine technology to support the exercise of an existing statutory function.
These chapters proceed on the assumption that the relevant agency or official already has the relevant function, is currently exercising that function without the use of machine technology, and is contemplating the adoption of some form of machine technology to assist its exercise of that function in the future.
We focus on 5 critical steps that agencies should take:
1. establish a multi-disciplinary design team (lawyers, policymakers, operational experts,
and technicians), with clearly allocated roles and responsibilities (chapter 11)
2. assess the appropriate degree of human involvement in the decision-making processes, having
regard to the nature of the particular function and the statute in question
(chapter 12)
3. ensure transparency, by deciding what can and should be disclosed about the use of machine
technology to those whose interests may be affected by the relevant function (chapter 13)
4. test before operationalising, and establish ongoing monitoring, audit and review processes
(chapter 14)
5. consider whether legislative amendment is necessary or prudent (chapter 15).
In this chapter we start with step one: establishing the right design team.

11.1 It’s not an IT project

Adopting machine technology to support a government
function should not be viewed as simply, or primarily, an
information technology project.
It is, rather, a coordinated exercise of legal, policy and operational process development, aided by technology. As such, the team that is formed to design and implement the project needs to include – and indeed, be led by – those with relevant, and sufficiently senior, legal, policy and operational expertise, who can work with and guide the technology specialists.
Each of these individuals must have appropriate degrees of involvement and authority throughout the project. Legal, policy and operational experts should not be relegated to ‘consultation’ or ‘advisory’ status, and nor should they simply be given a near-finished product for review or endorsement.
It is clearly better for all parties (including for the efficiency and reputation of the agency itself) if machine technology is designed by those who are best placed to know whether it is delivering demonstrably lawful and fair decisions, rather than having to try to ‘retrofit’ that expertise into the system later when it is challenged in court proceedings or an Ombudsman investigation.215


11.2 Having lawyers on the design team is essential
Our concerns with Revenue NSW (see annexure A) arose largely because it did not seek any expert internal or external legal advice on the design and operation of its machine-aided garnishee system, even after we told them they should.
Agencies that have been exercising functions under a statute for a long period of time no doubt develop a good understanding of how that legislation operates. However, when new technologies and new modes of exercising the function are being considered, it is essential that the source legislation be carefully considered afresh.
Interpreting a statute to arrive at its correct meaning can be a complex task at the best of times, one that can challenge both highly experienced administrative officials and lawyers.216 Even legal rules that on their face appear to be straightforward and ‘black and white’, and which may be the most appropriate candidates for machine technology development, can nonetheless have a nuanced scope and meaning. They may also be subject to administrative law principles – such as underlying assumptions (for example, the principle of legality)217 and procedural fairness obligations – which would not be apparent on the face of the legislation.
This is not to overstate the mystery of the law. However, it is to say that legislative interpretation requires specialist skills, and the challenge involved is likely to be especially pronounced when seeking to translate law into what amounts to a different language – ie a form capable of being executed by a machine.218
The structure, underlying assumptions, nuance and ambiguities of the English language, the internal logic of a particular statute and its relationship to the unwritten law, and the need to apply the text in the real world mean that statutory provisions do not typically translate in any straightforward and definitive way into a form required by machine technology. While interpreting legislation starts with interpreting the ordinary and grammatical meaning of the words used, it also involves considering the context in which those words are used.219 That context includes the surrounding legislative provisions and the statute taken as a whole. An analysis of that context may result in those words ultimately being given a different meaning than a purely literal interpretation would produce.220 Any coding of the relevant law (or parts of it) will almost always require the making of interpretative choices to enable it to fit within the different language and logic of the machine form.221 This must be done in a way that does not result in the meaning or effect of the law being impermissibly altered.
Most government agencies are well-resourced with highly-qualified legal professionals who are skilled in statutory interpretation. Agencies that wish to use machine technology should ensure that they utilise that expertise from the very outset of any design process.

11.3 Ensuring policy choices are made by the right people, at the right
level of authority
Developing and implementing machine technology will rarely, if ever, be a simple mechanical process.
Any design team, presented with a description of a function and an explanation of how it is currently being performed, will not come up with precisely the same device, with exactly the same specifications, functionality and performance, as any other design team. A multitude of decision-points and therefore choices will arise at various steps in the process.
Some of those choices may have profound impacts on the operation of the machine technology, and therefore on the ultimate exercise of the relevant functions and their impact in the real world. They are therefore important public policy choices.
Who is making those choices?


We would have some concern if some of these choices were effectively outsourced to the technicians tasked with developing the technical design specifications for a machine and/or with the detailed coding, design and build of the machine itself. There are three reasons why this could be problematic:
1. First, these technicians will generally have neither the necessary legal and policy expertise, nor the
administrative operational experience, to appreciate all of the legal, policy and operational issues
that may come into play when developing machine technology for the exercise of statutory
administrative decision-making and its impact in the real-world environment.
2. Secondly, coding and other highly technical skills are frequently obtained through private sector
procurement. Even if those technicians did have the knowledge and experience to make appropriate
legal, policy and administrative decisions affecting the public, it would be inappropriate for them to
do so. They are not public servants, and are not subject to the same constitutional, employment,
cultural and professional ethics frameworks that apply to those who work in public service.
3. Thirdly, the kinds of policy choices involved when designing machines can be thought of as
elements relating to the ‘quality’ of the machine. Some of those choices may have cost
implications. If there are such quality/cost trade-offs, the decision should not be made unilaterally
by the (profit-driven) vendor as that decision may differ from the decision the government agency
itself would make in the public interest. There may be other choices that also involve public policy
trade-offs other than cost (e.g. whether the machine is designed to minimise ‘false positives’ or
‘false negatives’). However, depending on the outsourcing process, it may be that the vendor will
not even need to inform the government agency that these choices can be made.
Concerns of ‘undue influence’ might be raised if important decisions about the design and specification of a machine, including what data sets are or are not appropriate to be included, are left to private sector technology vendors (and especially if those vendors might later claim trade secrecy over those matters).222
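The quality trade-off mentioned above between ‘false positives’ and ‘false negatives’ can be illustrated with a small sketch (the scores below are invented purely for illustration). With any scoring machine, where the flagging threshold is set determines how the two kinds of error trade off against each other, which is why that choice is a public policy decision rather than a purely technical one:

```python
# Hedged illustration: raising the threshold reduces false positives but
# increases false negatives, and vice versa. All scores are hypothetical.

def counts(scores_pos, scores_neg, threshold):
    """Return (false negatives, false positives) at a given flagging threshold."""
    fn = sum(s < threshold for s in scores_pos)   # cases that should be flagged but are missed
    fp = sum(s >= threshold for s in scores_neg)  # cases flagged that should not be
    return fn, fp

eligible   = [0.9, 0.8, 0.7, 0.55, 0.4]   # machine scores for cases that should be flagged
ineligible = [0.6, 0.45, 0.3, 0.2, 0.1]   # machine scores for cases that should not be

for threshold in (0.3, 0.5, 0.7):
    fn, fp = counts(eligible, ineligible, threshold)
    print(f"threshold={threshold}: false negatives={fn}, false positives={fp}")
```

On these invented scores, the low threshold misses nothing but wrongly flags three ineligible cases, while the high threshold wrongly flags none but misses two eligible cases: no threshold eliminates both errors, so someone must decide, in the public interest, which error matters more.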

Lessons from legislative drafting?
When Government prepares legislation, the process involves senior policymakers (such as Cabinet,
Ministers and senior public officials) deciding the overall policy objectives and parameters, in-house legal experts within agencies drawing up highly-detailed ‘drafting instructions’, and expert drafters at the office of the NSW Parliamentary Counsel preparing the draft statute or other instrument.
This process is iterative, recognising that the drafting process may identify gaps or ambiguities, or other policy decisions that need to be resolved. When this happens, the drafters seek further instructions from the agency. Ultimately, the final Bill returns to the senior policymakers, together with a report certifying that it has implemented their policy decisions.
A similar process could be considered for the development of decision-making machines – with design specifications taking the place of drafting instructions, and machine code taking the place of legislation.
Such a process means that, as with the drafting of legislation, the process of designing and developing machine technology in respect of legislative functions:
• Is seen as an inherently iterative process between those making design/policy decisions and
those implementing them.
• Clearly differentiates the roles and authorities of the policymakers, those who translate policy
into instructions/specifications, and the technicians.
• Ensures that any significant decisions are only made by appropriate government members of
the team, and are escalated where necessary.


12. Determining the necessary degree of human involvement
In chapter 7 above, we noted that it is legally essential that a person given a discretionary decision-making function must genuinely exercise that discretion and make their own decision.

Discretionary decision-making requires some degree of
human involvement – it can never be fully automated.
How far a discretionary decision-making process can be automated is not an easy question. It needs to be assessed in the context of the particular function and the statute in question.
As already discussed, the law has recognised that policies may play a legitimate role in guiding discretionary decision makers, provided decisions are not impermissibly fettered by the terms of the policy or the way it is used. The law likewise recognises that a discretionary decision maker may take into account, and where appropriate act on, the advice and recommendations of others, provided they are not impermissibly acting under the other’s dictation and abdicating their own discretion.223 This reflects the practical reality that administrators often need to rely on others, such as their staff and other experts, when carrying out their functions.
It would seem a short and legally uncontroversial step to accept that an administrator exercising a discretionary function is also not precluded from considering the outputs of a relevant and well- designed machine.224 However, those outputs must not impermissibly control the administrator’s exercise of the function.

12.1 The administrator must engage in an active mental process
Minimally, any statutory discretion requires there to be a person (the person to whom the discretion has been given or delegated) who makes a decision whether and how to exercise discretion in the particular case or cases before them.
However, merely placing a ‘human-on-top’ of a process will not, of itself, validate the use of machine technology in the exercise of a discretionary function. As the external legal advice we obtained noted:
Although the response of administrative law to the use of information technology may be nascent, ordinary
administrative law principles require there to be a “process of reasoning” for the exercise of discretions. This
can also be seen in our conceptions of what it means to make a “decision”, with two members of the Full
Federal Court … accepting that one of the elements generally involved in a “decision” is “reaching a
conclusion on a matter as a result of a mental process having been engaged in”.225 [case references omitted]
This means that, even if a person officially ‘signs off’ at the end of a process, the decision-making process may still be unlawful if in reality that person is merely acting as a rubber stamp, accepting the outputs of a machine ‘as a matter of course’ and ‘without engaging in a mental process to justify that conclusion’.226
The need for functions to be exercised by the person to whom they are given (or delegated) has also been emphasised in Federal Court decisions concerning the exercise of immigration discretions, which have referred to the need for there to be ‘active intellectual consideration’,227 an ‘active intellectual process’,228 or ‘the reality of consideration’229 by an administrator when making a discretionary decision.
Among other things, these cases looked at the amount of time an administrator had between when they received relevant material and the time when they made their decision. In some cases, this time period was shown to have been too short for the administrator to have even read the material before them. The Court concluded that there could not have been any ‘active intellectual consideration’ undertaken in the exercise of the function, and therefore overturned the decisions on the basis that there had been no valid exercise of discretion.230
Not all administrative functions have consequences as significant as those concerning immigration.
The ‘reality of consideration’ may look different in different administrative contexts, in proportion to the nature of the function being exercised and the consequences it has for those it may affect.
However, the principle remains relevant to the exercise of all statutory functions by administrators: in the exercise of a statutory discretion given to a person, some level of genuine and active decision-making by that person is required. As noted in chapter 7, where Parliament has chosen not to adopt fixed rules for the exercise of a statutory function, the discretion it has given to an administrator must be recognised and exercised.

12.2 The division of tasks between machine and human
In designing a decision-making process supported by machine technology, thought needs to be given not only to ensuring that the human decision maker genuinely makes the final decision, but also to the division of tasks between human and machine throughout the decision-making process.
As we have already seen, most discretionary decisions will include a range of obligatory and available considerations. Some of those considerations may be more appropriate than others to be addressed by machine technology.
Consider, for example, a simple statutory payment scheme that requires an administrator to decide whether to make a discretionary payment to a person having regard to their:
• Age (the person must be above a certain age to be eligible).
• Place of residence (the person must live in a certain area to be eligible).
• ‘Need’ (the person’s need for the payment is to be taken into account).

There would seem to be no issue with the decision maker being assisted by a machine that can generate outputs about a person’s age and place of residence.231 For example, a machine might sort or filter a list of all those who have applied for the payment by reference to those two fields in order to identify those eligible.
The decision maker would then be required to separately consider the question of need. Provided the decision maker does so and considers both the outputs of the machine (age and place of residence) and then additionally considers need, the decision maker will have met the legal requirement of having taken into account all obligatory considerations.232
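The division of tasks described above can be sketched in a few lines (a minimal illustration only; the thresholds, postcodes and field names are hypothetical, not drawn from any real scheme). The machine handles only the objective, right-or-wrong criteria of age and place of residence, and the evaluative assessment of ‘need’ is deliberately left to the human decision maker:

```python
from dataclasses import dataclass

MIN_AGE = 18                       # hypothetical eligibility age
ELIGIBLE_AREAS = {"2000", "2010"}  # hypothetical eligible postcodes

@dataclass
class Applicant:
    name: str
    age: int
    postcode: str

def machine_filter(applicants):
    """Machine-suitable step: sorts applicants by the two objective criteria.
    The evaluative judgment of 'need' is NOT attempted here - it remains
    for the human decision maker's own active consideration."""
    return [a for a in applicants
            if a.age >= MIN_AGE and a.postcode in ELIGIBLE_AREAS]

applicants = [
    Applicant("A", 25, "2000"),  # meets both objective criteria
    Applicant("B", 17, "2000"),  # below the age threshold
    Applicant("C", 40, "2999"),  # outside the eligible area
]
shortlist = machine_filter(applicants)
```

The machine’s output here is only a shortlist of the objectively eligible; for each shortlisted applicant the decision maker must still assess need, weigh the considerations, and make the decision themselves.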
A more sophisticated machine might go further and also generate an output that seeks to rank or score applicants by assessed need, having regard to parameters such as their income, assets, dependents, and so on.
Unlike ‘age’ or ‘place of residence’ (for which there will, generally, be an objectively right or wrong factual answer) assessing ‘need’ involves the exercise of a complex, evaluative judgment. Therefore, even if the machine generates some score or ranking of need based on pre-determined criteria, the decision maker will still need to apply their own ‘active intellectual consideration’ to the output, as well as take into account any other considerations that have not been addressed (or have not been addressed fully) by the machine’s outputs. They will also need to determine the relative weight given to each of these considerations in their overall decision: ‘What is required is a human judge exercising their discretion to decide which factors are the most important in a particular factual scenario.’233


12.3 The risk of technology complacency and ‘creeping control’
The simple example above may suggest that designing a machine system that meets a minimum required threshold for human involvement in discretionary decision-making will not be particularly challenging.
However, what matters is not just that there is the required degree of human involvement on paper; there must be that human involvement in practice.
Even if a decision-making system is appropriately designed with a human decision maker in the process who is to (lawfully) consider machine outputs, there are a number of reasons why, over time, the decision maker may tend toward ‘technology complacency’. This means that their decisions may tend to become increasingly (and potentially unlawfully) controlled by the machine’s outputs.
Reasons for this tendency may include:
(a) A bias toward uncritical acceptance
It is well-recognised that there can be a natural bias for administrators to uncritically accept
information provided to them by technology, especially where the outputs generated are
presented in a form that appears to constitute objectively quantifiable fact.234
In the Commonwealth Ombudsman’s review of immigration detention decisions, for example,
the Ombudsman noted a tendency of government staff to accept the accuracy of the
information they accessed through the use of technology, even in the face of conflicting or
contradictory information from other sources.235
(b) Blame-avoidance and ‘path of least resistance’
Even if a human administrator is not completely certain that the machine has produced the
‘right’ answer, they know that accepting that output (even if it turns out to have been wrong) is
unlikely to result in them being held personally responsible for any adverse outcome. If, on the
other hand, they actively overrule the machine, then the risk of their being blamed for the
outcome is likely to be very high.
Accepting the output becomes not just less work, but it is also the lowest risk option for the
individual decision maker.
A decision maker may avoid even questioning the output generated by the machine for fear of
causing unnecessary delay or being seen as causing problems for management. Questioning the
output may be particularly challenging if the machine also required a significant investment of
public funds, or if it was designed, launched and lauded as the ‘next great thing’ by those more
senior than the decision maker.
(c) Practical and technical impediments to scrutiny
There are also practical and technical challenges to even the most conscientious administrator
who seeks to make their own independent decision rather than being controlled by a machine’s
output.
Typically, most machine technology will be designed and maintained in non-operational
technical areas that are organisationally and functionally separated from the administrators
who will use its outputs to exercise functions. Administrators may be insufficiently aware of the
scope, capacities and limits of the machine to even know when they should be seeking further
information or clarification about how it works. If they do wish to seek clarification or more
information to inform their decision whether to adopt a machine output, the machine may not
be configured to provide them with the particular information needed or they may not have
access to it. Pursuing those questions may require engaging at a technical level with technology
support personnel, which may not be feasible, either practically or culturally.


Agencies need to be wary of the risk that even a well-designed decision-making process involving machine technology could come to cross a line in practice which may render decisions made using it unlawful. The risk is likely to increase with the level of technical opacity of the machine.

12.4 Practical indications of active human involvement
When designing and implementing machine technology, government agencies must therefore also consider how the system will work in practice and over time, having regard to ‘soft’ issues like natural human biases and behaviour and organisational culture.
They must also recognise that those who in future will be making decisions supported by the machine will not necessarily be the people who were involved in its original conception, design and implementation. The controls and mitigations that are needed to avoid ‘creeping control’ by the machine technology will need to be fully documented so they can be rigorously applied going forward.
The following are some of the factors that are likely to be relevant to consider in determining whether there is an appropriate degree of human involvement in a machine-supported decision-making system:236
(a) Time
Does the process afford the administrator sufficient time to properly consider the outputs of the
machine and any other relevant individual circumstances of the case(s) in respect of which the
function is being exercised? Does the administrator take this time in practice?
(b) Access to source information
Is the administrator able to consider the source material used by the machine? Do they have
access to other material and information that may be relevant to their decision?
(c) Seniority and experience
Does the administrator have the appropriate organisational seniority and level of experience
that would be expected for the type of decision they are making (with or without the support
of machine technology)?
(d) Decision-making ownership
Does the administrator always take ownership of their decisions, even when they are following
the outputs of the machine? Organisationally, is the administrator considered responsible for the
decisions they make?
(e) Cultural acceptance
Are there systems in place to overcome or mitigate automation-related complacency or
technology bias, to scrutinise and raise queries about the output of the machine technology, to
undertake their own further inquiries, and – if the administrator considers it appropriate to do so –
to reject the output of the machine? Is the authority of the administrator to question and reject
the machine’s outputs respected and encouraged? Does it happen in practice?
(f) Understanding of the reasoning process
Does the administrator have a thorough understanding of the operation of the machine
technology as a whole, at least conceptually, in order to be able to form a view on a reasonable
and rational basis about its outputs?237 Is the administrator able to provide comprehensible
reasons for their decision?


(g) Input into decision-making process design
Can the administrator make or require changes to be made to the machine to better support their
decision-making?
(h) Appreciation of decision-making impacts
Does the administrator have a genuine understanding of what their decision (and what a different
decision) would mean in reality, including for the individuals who may be affected by the
decision?238
The list above also highlights the importance of ensuring that those humans who will be involved in using the machine technology are given the appropriate training and skills to ensure they are assisted but not controlled by its outputs.
It is particularly important that the relevant administrator, and others responsible for analysing or working with the outputs of the technology, have a sufficient understanding of the technology and what its outputs actually mean in order to be able to use them appropriately. This is likely to mean that comprehensive training, both formal and on-the-job, will be required.
This training will need to be ongoing, as the technology is modified or updated, as staff may change, and as reinforcement may be required for existing staff to mitigate the risk of declining skills or creeping complacency over time.
There is also a need to ensure that the machine itself is designed so that its outputs will be presented in a manner – whether that be through dashboard designs or data visualisations – that will be most conducive to active mental engagement, human understanding and appropriate scepticism.239


13. Ensuring transparency

13.1 Reasons and the right to an explanation
In designing machine technology, agencies must ensure that meaningful reasons can still be provided to those whose legal or other significant interests may be affected by decisions. Those reasons should also note whether machine technology was involved in the decision. In our view, the information to be provided in this regard should include, at a minimum:
(a) the fact that machine technology was involved
(b) the nature and extent of that involvement
(c) what information about them is processed by the machine, including any assumptions, proxies or inferences
(d) the particular version of the technology, program or application used, and the date of that version, and
(e) an explanation of how the technology works in a way that is meaningful and intelligible to an ordinary person.
Of course, the statement should also include the usual requirements for decision notices, including details of how the decision may be challenged or reviewed, and by whom – see chapter 8.
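As an illustration only, the minimum disclosures listed in (a)–(e) above could be captured in a structured record such as the following sketch. All field names and example values are hypothetical; they are not drawn from any actual agency system.

```python
from dataclasses import dataclass

@dataclass
class DecisionNotice:
    """Hypothetical sketch of the minimum disclosure fields for a
    machine-assisted decision notice, mirroring items (a)-(e) above."""
    machine_involved: bool            # (a) whether machine technology was used
    nature_of_involvement: str        # (b) nature and extent of that involvement
    data_processed: list              # (c) inputs, assumptions, proxies, inferences
    tool_version: str                 # (d) version of the technology used
    version_date: str                 # (d) date of that version
    plain_language_explanation: str   # (e) intelligible to an ordinary person
    review_rights: str                # usual requirement: how the decision may be reviewed

# Illustrative example only - all values invented.
notice = DecisionNotice(
    machine_involved=True,
    nature_of_involvement="An automated tool suggested a response priority",
    data_processed=["reported incident details", "prior report count (used as a proxy for risk)"],
    tool_version="v2.3",
    version_date="2023-01-15",
    plain_language_explanation="The tool compares the report against factors drawn from past cases",
    review_rights="The decision may be reviewed on request to the agency or by the Ombudsman",
)
```

Structuring the notice in this way makes it straightforward to confirm that every mandatory element is present before the notice is issued.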

Providing explanations for decisions that are ‘instructive, informative and enlightening’
There appear to be very few court or tribunal decisions that have grappled with the adequacy of explanations given when machine technology has been involved in decision-making. In Schouten and Secretary, Department of Education, Employment and Workplace Relations,240 the issue arose in an application to review the amount of social services benefit (Youth Allowance) payable by Centrelink to an individual.
While affirming that the amount being paid to the individual was correct, the Tribunal noted that it was not until a government employee gave evidence to the Tribunal about the process employed to calculate the rate of benefit payable that the individual, and the Tribunal itself, could understand the process. The Tribunal noted that the case highlighted ‘the difficulty where government agencies make “automated decisions” and the decision is complex.’ It noted that:

The citizen will not understand and therefore be unable to challenge a decision about which they feel aggrieved unless provided with a plain English explanation of the basis for the decision. As in this case, the initial decision-maker is sometimes unable to provide that explanation. The Administrative Review Council in its report to the Attorney-General, “Automated Assistance in Administrative Decision-Making Report No. 46” noted that care was needed to ensure that the values of transparency and external scrutiny are not compromised where automated decision-making is employed … A major challenge for government agencies dealing with citizens is to ensure that their decisions are instructive, informative and enlightening. In this case, Centrelink has not met that challenge.241


How can reasons be provided when machine technology is used?
The use of machines can create additional challenges when providing reasons. One challenge is the tension that may exist between providing reasons that are technically accurate (in terms of describing how the machine works, whether or not it is also possible to show exactly how it came to generate a certain output)242 and providing reasons that serve as an explanation of, or justification for, the decision that will be intelligible and useful to the person affected.
It is clear that an explanation for an outcome that is technically accurate, but is otherwise unintelligible, cannot achieve its purpose and should therefore not be accepted as appropriate reasons at all.
When a human makes a decision, the reasons given do not refer to their brain chemistry or the intricate details of a process that commences with a particular set of synapses firing and culminates in a movement of the physical body giving rise to vocalised or written words. Likewise, explaining how a machine works, even if that explanation is fully comprehensive and accurate, will not necessarily satisfy the requirement to provide ‘reasons’ for its outputs.

Reasons must be accurate; they must also be meaningful and intelligible to the person who is to receive them. They must provide an ‘explanation’.243
While there has been much discussion about whether a person affected by a relevant machine-made decision should have a right to the underlying code used by the machine, a more immediate issue is ensuring that a statement of reasons is prepared with its purpose and audience in mind. Even where code is made available, it is unlikely to satisfy a requirement to provide ‘reasons’ even for the small number of individuals who could understand it, and even in the case of the very simplest of code (the Business Names Registration Act 2011 (Cth) may be an example). It is hard to see how the provision of source code could satisfy a requirement to give reasons – it would not, for example, set out findings on questions of fact, refer to the evidence on which those findings were based, or otherwise explain why the decision was made.
Generally, reason statements should be in plain English, and provide information that would be intelligible to a person with no legal or other relevant technical training. What is required is something approximating a ‘path of reasoning’, bridging the relevant findings of fact with the outcome.
In the case of machine-assisted decisions, such an explanation might include information about the machine’s objectives, what data has been utilised, its accuracy or success rate, and information about whether and what is measured. It would seem at least arguable that doing so would satisfy the requirement for reasons.
This does not mean that the more technical details of the design and operation of a machine should not also be provided. We note the NSW Information Commissioner has advised that such information should, at least presumptively, be treated and made available as ‘open access information’.244 We agree with that sentiment. However, merely publishing technical specifications or the underlying code will generally not satisfy a requirement to provide reasons.

The risk of automating reason-giving
Just as machine technology could be used to generate decisions or components of decisions, so too it is easy to imagine machine technology being used to generate statements of reasons for decisions or components of the reasons.


Template letters and standard paragraphs containing formulaic expressions of reasons are already in common use by government agencies. Nor are they impermissible, provided ‘the formula is used to guide the steps in making the decision and reveals no legal error’. However, a formula must not be used in a way that would ‘cloak the decision with the appearance of conformity with the law when the decision is infected’ by error. In such a case, ‘the use of the formula may even be evidence of an actionable abuse of power by the decision-maker’.245
The use of even more sophisticated machine technologies for the generation of statements of reasons raises greater concerns that they ‘will provide a façade of accuracy or objectivity that masks flawed decisions’. That is, machine-generated statements of reasons may ‘merely enhance the appearance of a lawfully made decision’.246
We suggest that it is safer, if a machine technology process is to be used in the decision-making process, that this process not also be tasked with generating a statement of reasons for the decision. Instead, the machine could produce the necessary audit records of its inputs, outputs, and processing, which can be taken into account by the human administrator as they develop a statement of reasons. Where practical, if the human administrator actually authors the statement of reasons (rather than simply adopting a statement that has been generated by a machine), this could provide important evidence to support a claim that the relevant administrator did in fact engage in the ‘process of mental reasoning’ necessary for them to be considered a genuine decision maker (see chapter 12).
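In a digital system, the audit records suggested above might look something like the following sketch. The function name, fields and hashing approach are our own illustrative assumptions, not a prescribed format; the point is that the machine records what it received, did and produced, leaving the human administrator to author the reasons.

```python
import hashlib
import json
from datetime import datetime, timezone

def audit_record(case_id, inputs, output, processing_steps, tool_version):
    """Hypothetical sketch: capture what the machine received, did and
    produced, so the human administrator can draw on it when authoring
    their own statement of reasons."""
    record = {
        "case_id": case_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "tool_version": tool_version,
        "inputs": inputs,                      # data the machine processed
        "processing_steps": processing_steps,  # ordered trace of rules applied
        "output": output,                      # what the machine generated
    }
    # A content hash helps demonstrate the record was not altered after the fact.
    record["sha256"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    return record
```

A record like this supports reviewability without the machine itself drafting the statement of reasons.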

13.2 Accountability and reviewability
In traditional administrative decision-making, a properly prepared statement of reasons will promote accountability in at least two ways:
• explainability – it enables the person who is affected by the decision to understand it, and provides a meaningful justification for the decision, and
• reviewability – it provides the primary basis upon which the decision and the process that led to that decision can be reviewed, whether by the person affected themselves, or by another person or body, such as an ombuds or a court, to verify that it was lawful, reasonable and otherwise complied with norms of good decision-making.
With machine-assisted decision-making, however, these two aspects of accountability tend to become more distinct.
In particular, a statement of reasons for a machine-assisted decision that provides an appropriate and readable explanation of the decision to the person affected (explainability) is less likely to provide a sufficient basis upon which the decision and its associated decision-making processes can be properly assessed and reviewed (reviewability).
Reviewability will generally necessitate both a broader and deeper (more introspective) examination of what has occurred in the process leading to the decision, including matters relating to the design, training and testing of the machine, the data used in decision-making, the context of its deployment, and all of the surrounding technical and organisational workflows.247
This means that, when designing and deploying machine technology, it will not be sufficient that a (traditional) statement of reasons can be generated for each decision.
Agencies must also consider what other information needs to be kept and published to ensure that their processes and decisions can be properly reviewed for compliance with legal and good decision-making requirements (including avoiding legal non-compliance because of non-obvious features such as algorithmic bias).


In particular, agencies must ensure that any decision-making process is designed so that full, and meaningful, records of the process will be available that can enable the Ombudsman, courts or other review bodies to be satisfied that there has been no unlawful, unjust or otherwise wrong conduct.
A failure to keep such records may itself lead to an inference that the agency has engaged in wrong conduct.

Table: Why transparency of machine technology is important248

Role of transparency – Purpose or benefit

Dignity and respect – Respecting a person’s right to an explanation as to why, how and by whom decisions were made that affect their legal or other significant interests (especially when the decision has not gone their way)

Accountability – Enhancing accountability in the exercise of public power and exposing and deterring unethical, negligent or otherwise inappropriate conduct

Early warning system – Increasing opportunities for early identification and rectification of legal and other flaws

Stakeholder input and crowd-sourcing – Encouraging both expert and lay input to improve the technology or its ‘fitness’ to particular contexts

Informed choice – Enabling individual choice, including whether to ‘opt out’ of machine processes (if possible) and/or to seek a review of a machine outcome by a human

Informed public debate – Informing democratic deliberation about the relevant function and the associated use of the technology in the exercise of that function, and about machine technology generally

Review – Allowing identification of grounds for potential challenge and enabling proper inquiry into decisions and outcomes to be undertaken to identify any error or unfairness

Recordkeeping policies and practices
Agencies that use, or intend to use, machine technology should therefore also ensure that their recordkeeping policies and practices are reviewed and explicitly address the records that need to be generated and retained in respect of machines – see chapter 10.
This may also mean explicitly providing that previous versions of technology that can ‘read’ the relevant records are also properly kept and maintained, and that staff continue to be trained to know how to use them.
Again, if an agency does not have in place recordkeeping policies and practices that ensure proper records of decision-making processes are kept and can be comprehensively reviewed, that failure may justify a finding that the agency has engaged in wrong conduct.


13.3 Publishing source code
As we noted above, providing reasons does not necessarily require releasing the detailed specifications or source code for a machine. Indeed, doing so would rarely satisfy the requirement for ‘explainability’ – that is, the provision of reasons that the person can understand.
However, scrutiny of the underlying specifications and code may be necessary if decisions are to be properly reviewable. Accordingly, these records will need to be available to review and oversight bodies.
In any case, there should be (at least) a presumption in favour of proactively publishing the specifications and source code of machine technology used for government decision-making.
As well as enhancing the transparency and accountability of government decision-making, doing so has the added benefit that it exposes the technology to appraisal by outside experts.
Indeed, just as government may release policy white-papers or exposure draft bills to draw on the expertise of interested stakeholders, agencies should consider releasing draft specifications, code and even ‘beta’ versions of new machines to draw on external expertise and help to identify flaws or potential improvements before the technology is put into operation.249

Trade secrets and commercial-in-confidence arrangements
A key transparency issue arises when an agency engages an external provider for machine technology expertise. Trade secrets and commercial-in-confidence arrangements must not be allowed to override the value of transparency or the requirement, where it exists, to provide reasons. Contractual confidentiality obligations negotiated between parties must also be read as being subject to legislation that compels the production of information to a court, tribunal or regulatory or integrity body.250
Furthermore, even if courts are willing to protect algorithms as intellectual property, the tension can be avoided by good procurement practices that demand transparency from industry and ensure ‘that trade secrets and copyright claims do not trump the values of good governance’.251 We agree with the advice that ‘[o]fficials should refuse to work with vendors who are not willing to make their system sufficiently transparent for appropriate auditing and review.’252
The NSW Information Commissioner has noted that ‘…there is scope to strengthen existing information access laws to better facilitate access to AI-informed decision-making, particularly where governments partner with the private sector and NGOs in using these technologies.’253
Section 121 of the GIPA Act sets out requirements for inclusion of a contractual provision relating to an immediate right of access by the agency to certain information held by a contractor. Such a provision would mean that in effect, certain information held by the contractor would be government information for the purposes of the GIPA Act. However, there are exceptions to s 121 and it only applies in certain circumstances.
The Information and Privacy Commission’s guidance for agencies negotiating confidentiality clauses is to ask and consider the responses to three key questions:
1. Who holds the information?
2. In what form is it held?
3. How will access be provided?254
As a minimum, agencies should ensure that the terms of any commercial contracts they enter in respect of machine technology will not preclude them from providing comprehensive details (including the source code and data sets) to the Ombudsman, courts or other review bodies as required for them to review the agency’s conduct for legal compliance.


14. Verification, testing and ongoing monitoring

14.1 Testing before adoption
Agencies need to identify ways of testing that go beyond whether the machine technology is performing according to its programming to consider whether the outputs of the machine technology are legal, fair and reasonable.
Verification and validation testing of the outputs of the machine technology must be relevant to the specific functional area, including whether it is delivering effectively against the relevant legislative mandate and policy imperatives.

Legal audit of the correctness of legal interpretation
Given the inherent risk of interpretive errors being embedded in the code of automated systems, an initial verification process should involve a thorough legal audit of the system before it is implemented.
Ultimately, only a court can provide a conclusive determination of the meaning of a statute. However, as courts are generally unable to provide advisory opinions, legal advice on the correctness of the interpretation of a statute encoded in a machine will need to be sought from legal experts.255
Ideally, those tasked with undertaking a legal audit prior to launch should not be the same lawyers as those who were involved in the design of the technology. A risk-based assessment may be appropriate to guide the nature and scope of legal audit and who should do it (for example, whether it is appropriate to seek a formal opinion from senior counsel).

Validation and accuracy testing
There are various examples that demonstrate the need to verify and validate machine technology at the outset and periodically after implementation. The domestic and family violence risk assessment tools used by Police in NSW and the ACT are illustrative. Those tools have been found to perform poorly in terms of predictive validity.256 In 2018, the NSW Bureau of Crime Statistics and Research (BOCSAR) examined the Domestic Violence Safety Assessment Tool used by NSW Police to determine its ability to accurately predict a victim’s risk of repeat intimate partner victimisation. BOCSAR concluded the tool performed poorly and found that the ‘study highlights the importance of empirical validation when developing a risk assessment tool’.257
The Queensland Police are reportedly now trialling a new risk assessment tool to be used in the domestic and family violence context – incorporating lessons learned from other jurisdictions. The Queensland Police are aware of the potential for bias in the data model and will ‘develop a framework about monitoring and managing models before they are rolled out’ in addition to a ‘model monitoring tool’ to identify and address bias on an ongoing basis.258
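The kind of predictive-validity check described above – comparing a tool’s risk flags against outcomes later observed in a holdout sample – can be sketched in simplified form as follows. This is an illustration of the general technique only, not the BOCSAR methodology; the function and data are invented.

```python
def predictive_validity(flags, outcomes):
    """Illustrative sketch of predictive-validity testing.
    flags: True where the tool rated the case high risk.
    outcomes: True where repeat victimisation was later observed."""
    tp = sum(f and o for f, o in zip(flags, outcomes))          # correctly flagged
    fn = sum((not f) and o for f, o in zip(flags, outcomes))    # missed cases
    fp = sum(f and (not o) for f, o in zip(flags, outcomes))    # false alarms
    tn = sum((not f) and (not o) for f, o in zip(flags, outcomes))
    sensitivity = tp / (tp + fn) if tp + fn else 0.0  # share of true repeat cases caught
    specificity = tn / (tn + fp) if tn + fp else 0.0  # share of non-repeat cases cleared
    return sensitivity, specificity
```

A tool that ‘performs poorly’ in the BOCSAR sense would show low values on measures of this kind when evaluated against real observed outcomes.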

Testing for algorithmic bias and other unintended consequences
Systems and processes need to be established up front to safeguard against inaccuracy and unintended consequences, such as algorithmic bias. It is important at the project planning stage and as part of the risk management strategy for the machine technology that agencies determine testing procedures enabling them to define:
(a) What will be tested, including key components of the machine technology such as data, training data models, and business rules.
(b) What testing methods will be used from the range of possible techniques available to test the robustness of the machine technology and identify vulnerabilities and other issues prior to operationalising.
(c) The frequency of testing, including what major system modifications would trigger additional unscheduled testing.
(d) Who will be involved in the design and performance of the testing. We note that the European Commission Expert Group suggests that testing processes ‘should be designed and performed by an as diverse group of people as possible’.259
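One concrete form of the bias testing contemplated above is to compare error rates across demographic groups. The following is a simplified sketch of that idea only; the function name and record format are hypothetical.

```python
from collections import defaultdict

def false_positive_rates_by_group(records):
    """Illustrative bias test: compare false-positive rates across groups.
    records: iterable of (group, predicted_positive, actual_positive) tuples.
    A large gap between groups' rates is a signal of possible algorithmic bias."""
    fp = defaultdict(int)         # cases wrongly flagged, per group
    negatives = defaultdict(int)  # all truly negative cases, per group
    for group, predicted, actual in records:
        if not actual:
            negatives[group] += 1
            if predicted:
                fp[group] += 1
    return {g: fp[g] / negatives[g] for g in negatives if negatives[g]}
```

In practice such a check would sit alongside, not replace, the broader testing regime described in (a)–(d).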

Establishment of quality assurance process and audit trail
Prior to implementation agencies also need to develop appropriate quality assurance processes and establish performance metrics for ongoing system monitoring. A key consideration of quality assurance is the ability of the machine technology to generate a comprehensive audit trail to support scrutinising the system and ensure transparency and accountability.

14.2 Undertake monitoring, review and periodic evaluation
It is essential that machine technology be subject to ongoing monitoring, review and periodic evaluation to ensure that the technology continues to support lawful decision-making consistent with principles of good public administration. Adopting machine technology to support the exercise of an administrative function should not involve a ‘set and forget’ approach.
Agencies need to assess whether the machine technology is working as expected, and must actively continue to monitor its accuracy and the fairness of outcomes. The use of enforcement cameras such as fixed-speed cameras and red-light cameras provides an example of existing NSW legislation requiring ongoing confirmation of the accuracy of tools used in a machine technology system. Enforcement cameras must be routinely tested for accuracy and calibrated every 12 months.260 Certification of the cameras is required at 90-day intervals with the testing and calibration performed by a TfNSW team, an accredited laboratory under the national scheme.261
A monitoring and review regime recognises that changes in the external environment cannot be ‘known’ to the machine technology – for example, a statutory amendment or judicial interpretation that shifts the basis upon which the machine technology has been designed, or even something like a natural disaster or other external event that might require adjustments to be made to policy settings. It is also important to ensure that any changes over time – especially through machine learning – operate to increase accuracy and fairness, and do not introduce any unintended consequences such as algorithmic bias.
Machine technology governance must be fit for purpose and keep pace with machine technology capabilities.262 We noted above that agencies should establish early a monitoring and review cycle, including assigning responsibilities, the scope of information and data to be reviewed, and a mechanism for monitoring the progress of any recommended changes.263
Ongoing monitoring and review of machine technology as business-as-usual may include:
1. A sustainable schedule of review and internal reporting on outcomes, aligned with existing governance arrangements for risk analysis and mitigation.
2. Routine certification, testing and auditing of machine technology undertaken by an appropriate independent expert.
3. Systematic review of identified errors, false positives and false negatives.
4. Audits of the machine technology outputs as part of the agency’s overall quality assurance processes. This might include consideration of information owned by the agency, such as complaints data and feedback from staff, that may provide insights into the operation of the machine technology.264
5. Random auditing or ‘benchmarking’ of individual cases, by holding out a sample of cases for human decision independently of the machine process.265
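The random benchmarking described in point 5 above could be sketched as follows. The function and parameters are illustrative assumptions; the essential features are that selection is random and that the selection process itself is reproducible for later audit.

```python
import random

def select_benchmark_sample(case_ids, fraction=0.05, seed=42):
    """Illustrative sketch: randomly hold out a sample of cases to be
    decided by a human independently of the machine, so that machine
    and human outcomes can later be compared."""
    rng = random.Random(seed)  # fixed seed makes the sample reproducible for auditors
    k = max(1, round(len(case_ids) * fraction))
    return set(rng.sample(case_ids, k))
```

Comparing the human decisions on the held-out sample against what the machine would have produced gives an ongoing empirical check on the machine’s accuracy and fairness.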
Comprehensive records of how an agency has undertaken monitoring, review and evaluation of the machine technology are not only an important part of transparency and accountability, but may also be required if the machine technology is subject to external review by an oversight body such as ours or by a court.

Use of the ‘Structured Decision-Making’ tool in the NSW child protection system
Since 2011,266 the Department of Communities and Justice (DCJ) has been using a set of tools, known as ‘Structured Decision-Making’ (SDM), to assist in the performance of its functions under the Children and Young Persons (Care and Protection) Act 1998.
SDM was developed and is trade-marked in the United States by a non-government not-for-profit organisation called ‘Evident Change’. Evident Change was previously known as the National Council on Crime and Delinquency, which was ‘established in 1907 to assist private and public agencies serving delinquent youth’.267
DCJ’s website describes SDM as:
‘a process that ensures each key decision in child protection is informed by information known through research to be relevant to that decision. A number of decision-making tools underpin SDM and assist staff in making key decisions.’268
The core components of SDM typically comprise decision trees as well as scoring checklists. Additionally, a written narrative is required to be entered by the user to capture analysis and conclusions about particular items included in the score (or not).
An example element of a decision tree might look something like this:269
[decision tree diagram not reproduced]


The reference in the above decision tree to ‘levels’ refers to the suggested response that may be warranted by a case worker; for example:
[table of response levels not reproduced]
An example element of a scoring checklist might be:270
[scoring checklist not reproduced]
By tallying the various scores for individual items, an overall score is obtained for some relevant multi-factor consideration – in this example, ‘risk’:
[risk score table not reproduced]
These SDM tools are an example of assisted decision-making developed using machine technology but which may not necessarily be digital in their operation. Many of the SDM tools are capable of being used, and in practice have often been used by case workers in the field, in a paper form.
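The checklist-tally mechanism described above can be sketched as follows. The item names, weights and band thresholds here are entirely invented for illustration; the real SDM instruments are proprietary.

```python
# Hypothetical item weights - invented for illustration only.
RISK_ITEMS = {
    "prior_reports": 2,
    "caregiver_substance_abuse": 3,
    "child_under_two": 1,
}

def risk_band(answers):
    """Sum the scores of items answered 'yes' and map the total to a band,
    mirroring the tally-then-classify structure of a scoring checklist."""
    total = sum(RISK_ITEMS[item] for item, present in answers.items() if present)
    if total >= 5:
        return total, "high"
    if total >= 2:
        return total, "moderate"
    return total, "low"
```

Note that such logic works identically on paper or in software, which is why SDM tools can be (and often are) used in paper form.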


The US-designed SDM tools are used in the NSW child protection system in relation to decision-making around reporting possible risk of significant harm, screening reports, determining response, assessing safety and risk, and assessing when it is safe for a child to be restored home.271 The publicly available information indicates that SDM is used by DCJ in the performance of statutory functions alongside professional judgement, as opposed to replacing human judgement.272
As noted above, the tools take the user through a series of definitions and questions considered relevant to the decision being made. The output of SDM in child protection depends on the specific tool. Some tools guide users to suggested actions, while others suggest an assessment outcome based on the data inputs.
For example, at the Child Protection Helpline, the SDM Screening and Response Priority Tool may generate an outcome of ‘screened in’, meaning the matter is to be referred to the local DCJ office for response. Based on the data inputs, the SDM will also generate a suggested response time, which may be from ‘within 24 hours’ to ‘within 10 days’.273 We understand there is a limited discretionary override available to staff to change the response time based on the individual circumstances of a case and professional judgement.
DCJ’s view is that SDM ensures consistency, accuracy and timeliness in decision-making.274 In a 2017 examination of the use of SDM in Los Angeles, the Office of Child Protection there also noted that:

[o]ne of the strongest identified benefits of using SDM is that, because it is a data-driven tool, it is more objective than professional judgment. When used correctly, it weighs its information uniformly and is not subject to human biases and stereotypes. It can help to guide a case worker’s thinking about a case, particularly when the factors of that case are not clear cut. It may also help to address disproportionality by assessing case characteristics, risk factors, and family functioning equally across families of varying social backgrounds.275 [references omitted]

However, SDM tools are also subject to the same vulnerabilities as decision-making in other contexts – including user error, knowledge and training gaps, and non-compliance. The LA Office of Child Protection also noted the potential for information entered to be ‘manipulated or skewed to support predetermined thinking’.276
The Australian Institute of Family Studies (AIFS) identified other potential weaknesses of consensus-based and actuarial risk assessment tools like those used in SDM. Actuarial tools like the one used by DCJ to assess risk may not consider unusual or context-specific factors and may be insufficiently flexible to incorporate professional judgment. For example, an SDM tool concerning risk of harm to a child may not give the user the option to input information about the strengths of a family unit, which could be relevant to the outcome. Although the user may retain an ability to exercise professional judgment to ‘override’ the results of the SDM, it may not be clear even to a highly expert user how such an additional factor should be weighed against the output (a composite risk score) generated by the SDM.
The AIFS also raised the potential for trust in SDM to affect whether a user rejects or accepts an output. In particular, bias may occur where a user assumes the SDM is always accurate.277
The Los Angeles Office of Child Safety noted that:

[o]ne of the most cited weaknesses of SDM is that, because the model is proprietary, there is a lack of transparency about how its algorithms are constructed and various factors weighted (thus earning its classification as a “black box” model). This is concerning to users and evaluators alike, as no way exists to understand how the decision-making process is being influenced by these elements, and if any systemic biases are inherent in the tool.278

All tools used in the performance of administrative functions must be considered thoroughly before implementation and subject to ongoing monitoring and review to ensure that they support lawful decision-making consistent with principles of good public administration.
The US-based creator of SDM, Evident Change, states that it ‘works closely with each jurisdiction to ensure that assessments are constructed, validated, and customized for the population served. All risk assessments are tested to ensure racial equity...’279 and that ongoing evaluation of SDM tools is strongly encouraged.280 DCJ’s website states that a ‘preliminary risk calibration study’281 was to be completed as part of the implementation process, but it is not clear what ongoing validation of SDM was conducted by DCJ after 2011. There is little publicly available information about what jurisdiction-specific calibration and evaluation has taken place in relation to the use of SDM in the context of NSW families and children, or in respect of different local populations within this State.
In 2017, the NSW Legislative Council General Purpose Standing Committee No 2 Inquiry into Child Protection questioned the effectiveness of DCJ’s SDM tools and recommended an independent review of them.282 Later, in 2019, issues with SDM including cultural bias were considered by the Family Is Culture: Independent Review of Aboriginal Children and Young People in Out of Home Care (FIC) review. The FIC review found that in practice there was little Aboriginal consultation in the application of the SDM, which ‘considerably reduces the competency of the tool’.283 It found that the SDM could be manipulated by staff to result in a punitive approach to assessing Aboriginal families.284
The FIC review made a similar recommendation to the 2017 NSW Legislative Council General Purpose Standing Committee: that there be an independent review of SDM tools.285 The FIC review added that the independent review should occur in partnership with Aboriginal communities to examine adequacy from a cultural perspective.286 In November 2020, the NSW Government reported that DCJ was scoping a possible review of SDM tools in consultation with Aboriginal stakeholders – to be completed by July 2021.287
In June 2021, DCJ advised us that a Quality Services Review of its SDM tools would commence later that month. DCJ noted that:

The reviews will focus on co-designing updates with Aboriginal people, practitioners and researchers to improve racial equity, validity and accuracy to NSW population data, practice and legislative and policy settings. Implementation of the updated tools will focus on workforce and leadership development and bolstering systems to safeguard practice and decisions. These factors are pivotal requirements to ensure any assessment tool is used effectively, accurately and consistently.288

We have been told the review is expected to be completed over a period of 2 years, including implementation of the updated SDM tools and any related changes to practice and process. Additionally, DCJ advised that it was developing an additional SDM Family Strengths and Needs Assessment tool which will be used ‘to develop a more fulsome understanding of the family’s experiences and characteristics and to support practitioners to case plan with families in an approach that targets their needs, and utilises their strengths, rather than just recognising danger or risk.’

The new machinery of government: using machine technology in administrative decision-making 68
NSW Ombudsman

Cost implications
One of the key benefits of machine technology for government is its potential for efficiencies and consequent cost savings. However, when preparing a ‘business case’ for a proposed machine technology project, it is important that all costs are factored into the cost-benefit equation.
In particular, the need that has been highlighted in this chapter for rigorous pre-deployment testing, as well as ongoing monitoring and auditing, is a significant cost that must be taken into account.289 So too is the cost of maintaining and updating the machine over time (including as legislation may change in the future), as well as the current and future training needs of operational staff.
It would also be prudent to consider contingency costs that might be incurred in future if things go wrong – for example, if an error is detected in the machine design that means it needs to be substantially re-coded or manual work-arounds put in place. Of course, errors can also have the potential to result in costly legal disputes and compensation claims.
Simplistically comparing a machine’s build and basic operating costs against the expenses (usually in wages) of existing manual processes will present an inaccurately inflated picture of the financial benefits of the technology.
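By way of illustration only, the fuller accounting described above can be sketched in a few lines of code. All figures and cost categories below are hypothetical; the point is simply that the apparent net benefit of a machine technology project shrinks once testing, monitoring, maintenance, training and contingency costs are counted alongside the build cost.

```python
# Illustrative only: hypothetical figures showing how a 'business case'
# changes once the full range of costs identified in this chapter is
# included, rather than only the build cost.

def net_benefit(annual_wage_savings, years, **costs):
    """Total projected savings over the period minus all listed costs."""
    return annual_wage_savings * years - sum(costs.values())

# Simplistic comparison: build cost against wage savings only.
naive = net_benefit(400_000, 5, build=600_000)

# Fuller picture: pre-deployment testing, ongoing monitoring and audits,
# maintenance (including for legislative change), staff training, and a
# contingency reserve for re-coding or manual work-arounds.
fuller = net_benefit(
    400_000, 5,
    build=600_000,
    testing=250_000,
    monitoring_and_audit=5 * 60_000,
    maintenance=5 * 50_000,
    training=100_000,
    contingency=200_000,
)

print(naive)   # 1400000 - the inflated picture
print(fuller)  # 300000 - once all costs are counted
```

On these (invented) numbers, the simplistic comparison suggests a benefit more than four times larger than the fuller accounting supports.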


15. Statutory provisions that authorise machine technology
As we have seen in chapters 5 to 10, there can be legal risks associated with the use of machine technology to support the exercise of a statutory function, especially one that requires a decision maker to exercise discretion.
It may be that, after applying the steps identified in chapters 11 to 14, the design team concludes that it would be unlawful or legally risky for the proposed new technology-assisted decision-making process it has designed to be used for the particular function as the law currently stands.
That raises the question: can and should the statute be amended to expressly authorise the use of machine technology?

15.1 Stating, in simple terms, that an administrator is authorised to use
a machine
The simplest form of authorisation provision would be to merely add to the existing statutory function a statement that the person named is authorised to use a machine for the purpose of exercising the function.
As far as we are aware, no Australian court has ruled on the effect of such a provision.
Our preliminary view is that such a simple provision may be of limited effect. This is because, at least in most cases, we do not think that it is necessary to expressly authorise a decision maker to use technology when exercising a function (see chapter 7). Therefore, a provision that states that they are authorised to do this may be doing little more than making explicit what is already implicit.
That said, adding such a provision may be useful, if only for the avoidance of doubt. However, there are some potential risks to be aware of with this approach:
1. Complacency
Amending the relevant legislation to ‘authorise machine technology use’ might have a tendency
to lead an agency to falsely believe that the issues we have posed in chapters 7-10 have been
fully dealt with and can be safely disregarded. That is not the case.
The Commonwealth Ombudsman’s guidance refers to the authority for making automated
decisions being put ‘beyond doubt’ if specifically enabled by legislation.290 This must be read
with caution. Merely authorising in general terms that machine technology may be used does
not necessarily mean that any specific use of that technology will be lawful. A general authority
to use technology would not, for example, mean that the technology has been authorised to be
used in a way that is biased, that results in a decision maker taking into account extraneous
considerations, or that breaches privacy or anti-discrimination laws.
2. Unintended consequences for other statutory provisions
A second risk is a potential ambiguity affecting other statutory provisions in the same or other
Acts.
For example, if an authorising provision is included in one Act but not in other Acts (or especially
if it is included in one part of an Act, but not in other parts of that same Act), then questions
might arise as to whether Parliament intended that machine technology is not authorised in
those other Acts or parts of the Act that have not also expressly authorised it. That is, if the
authority to use a machine is expressed in one place but not in another, was the omission in
that other place deliberate, and what does that omission mean?


3. Potential uncertainty in interpreting the legislation
The third risk is that it may not be obvious how the new provision (that authorises machine
technology use) can be interpreted in a way that is consistent with the function itself.
For example, if a statutory provision currently gives a named decision maker a very open
discretion, and then a further provision is added simply to authorise that person to use machine
technology in the exercise of that discretionary power, it may be unclear how the two
provisions should be read together consistently.
Does the express authority to use a machine mean that Parliament intended that the decision
maker could now do things that would otherwise involve an impermissible fetter of their
discretion? This seems to be a possibility suggested by the Commonwealth Ombudsman in its
Better Practice Guide. It suggests that, where legislation has expressly stated that the use of
automation technology is authorised, future courts might decide that it is then acceptable for
discretions to be automated in limited circumstances such as where the automatic output is set
only to apply beneficially to the person affected.291
Or did Parliament intend that the new provision, authorising the use of technology, is to be
limited in standard ways by the discretionary nature of the function? That is, is the authority
to use technology to be ‘read down’ so that a machine can be used, but only in ways that
are consistent with the decision maker retaining and personally exercising (and not fettering)
their discretion?
Ultimately the proper interpretation of the particular statute will only be able to be resolved by
a court. Unless and until that happens, there may be a great deal of uncertainty (and therefore
legal risk) about what the new provision actually authorises the decision maker to do.
Obviously, the only safe course in the interim is to assume that any authorising provision will be
interpreted narrowly.
4. Lost opportunity to give proper consideration to legal and policy issues
Simply authorising technology use is a simplistic approach that gives insufficient
attention to the kinds of issues that we have raised in this report.
If it is thought necessary to expressly authorise by legislation the use of machine technology for
a particular function, then in our view much more comprehensive consideration should be given
as to what that legislation should include beyond simply stating that technology use is
‘authorised’ in general terms (see below).

15.2 Attributing machine outputs to an administrator
Some Commonwealth legislation has gone further than simply authorising the use of machine technology. It also provides that the output of machine technology is or may be ‘taken to be’ the decision of the administrator.292
In some cases, the provision does not provide for the administrator to over-rule or substitute their own decision for the machine’s output, which could be problematic in practice.
Other Commonwealth legislation seeks to address that problem. For example, s 495B of the Migration
Act 1958 (Cth) expressly authorises the human decision maker (the Minister) to over-rule the machine technology’s deemed decision, but only following certification that ‘the computer program was not functioning correctly’ and if the substituted decision is more ‘favourable’ to the relevant person.293


‘Nearly identical’ under the Commonwealth Business
Names Registration Act 2011
The Business Names Registration Act 2011 (BNR Act) established a scheme for business name
registration specifically developed with the use of machine technology in mind.
One of the objects of the BNR Act is to avoid confusion by ensuring that business names that
are identical or nearly identical are not registered: BNR Act s 16(3)(a). The Act requires the
Australian Securities and Investments Commission (ASIC) to register the business name
submitted by an entity if, among other things, the name is ‘available to an entity’. A name is
available to an entity if it is ‘not identical or nearly identical’ to a range of names prescribed by
BNR Act s 25. The terms ‘identical’ and ‘nearly identical’ are defined in accordance with s 26,294
which allows the Minister to make rules determining ‘whether a name is identical or nearly
identical to another name’.
The current determination made under s 26 is the Business Names Registration
(Availability of Names) Determination 2015. It sets out the rules that must be applied
when determining whether a name is identical or nearly identical to company names
or other names on the Register.
Section 66 of the BNR Act provides that ASIC can use computer programs ‘for any purposes for
which ASIC may make decisions’ under the BNR Act, and that a ‘decision’ made by a computer
program ‘is taken to be a decision made by ASIC’. There are some 14 programmed system
rules that apply the business name availability rules set out in the BNR Act and the
Determination.295
It appears, however, that the system’s rigid rule-based concept of what is nearly identical
does not always square with human common sense. The names in the table below would
seem to be confusingly similar.

Available name Already registered name

Northern Beaches Tutoring Service Northern Beaches Private Tutoring Services296

Perth Martial Arts Centre Perth Martial Arts Academy297

Rainbow Beach Plumbing Rainbow Beach Plumbing Services Pty Limited298

Central Coast Surf Academy Central Coast Surf School299

Appaloosa Association of Australia Australian Appaloosa Association Ltd300

Cainscrete Plumbing Cairns Concrete Plumbing301

However, in each of the above cases, the computer program determined that the first name
was ‘available’ even though the second name was already registered. That is, the program did
not see the two names as being identical or nearly identical.
In each case, the Administrative Appeals Tribunal (albeit in some cases with obvious
reluctance) upheld the decisions on the basis that consideration of whether a business name is
nearly identical is to be determined solely by applying the rules set out in the Determination,
which are coded into the computer program.


These decisions were made even when the Tribunal acknowledged that the outcomes might
‘mislead and confuse’,302 give rise to ‘anomaly’,303 be ‘inconsistent and arbitrary’,304 be
‘counter intuitive’ and ‘neither pleasing nor sensible,’305 or could be ‘quite absurd’.306
However, in three other cases, including a recent case in October 2021,307 the Tribunal has
taken a different approach. In these cases the Tribunal has set aside ASIC’s decision to register
business names that the computer program decided were ‘available’. The Tribunal considered
(contrary to the decision of ASIC’s computer) that the following names were identical or nearly
identical:

Already registered name Name not available

Melbourne Children’s Psychology Clinic Melbourne Child Psychology308

Solar Repairs Perth Solar Repairs Pty Ltd309

Voices of Casey Voices of Casey Choir310

In these decisions, the Tribunal refused to accept that the coded rules left no room for human
discretion, and held that it was necessary for the concept of ‘nearly identical’ to be determined
having regard to ‘the ordinary meaning of that term having regard to its legislative purpose’311
– something that the computer program was not able to do, and which required human
intervention.
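The mismatch between coded rules and the ordinary meaning of ‘nearly identical’ illustrated by this case study can be sketched with a toy example. The rules below are a deliberately simplified invention, not the actual rules in the Determination: this toy program disregards a short list of prescribed words and then requires the remaining words to match exactly.

```python
# Simplified sketch of a rigid, rule-based 'nearly identical' test.
# The actual Determination is more detailed; this invented version is
# only meant to show how hard-coded rules can diverge from common-sense
# judgments of similarity.

DISREGARDED = {"pty", "ltd", "limited", "the", "a"}  # hypothetical list

def normalise(name: str) -> tuple:
    """Lower-case the name and drop the disregarded words."""
    return tuple(w for w in name.lower().split() if w not in DISREGARDED)

def nearly_identical(name_a: str, name_b: str) -> bool:
    """Under these toy rules, names are 'nearly identical' only if the
    remaining words match exactly."""
    return normalise(name_a) == normalise(name_b)

# Common sense says these are confusingly similar, but the rule sees
# 'Centre' and 'Academy' as different words, so the name is 'available'.
print(nearly_identical("Perth Martial Arts Centre",
                       "Perth Martial Arts Academy"))   # False

# The rule does catch names differing only by a disregarded element.
print(nearly_identical("Rainbow Beach Plumbing",
                       "Rainbow Beach Plumbing Ltd"))   # True
```

A program of this kind can only apply the comparisons it has been given; it has no capacity to ask whether two names would, in ordinary usage, confuse the public.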

15.3 More sophisticated authorisation provisions
To date, the approach taken (primarily in the Commonwealth) to authorising in legislation the use of machine technology has involved very simple provisions of the kind described above.
While we are now seeing some legislation take a slightly more sophisticated approach – for example, the Migration Act provision referred to above – such provisions remain high-level and focused on permitting, rather than regulating, the use of machine technology.
More refined approaches might include the following:

(a) Separating the discretionary and non-discretionary components of a decision
It may be possible for legislation to be amended to more clearly differentiate between those
elements of a function that are authorised and expected to be done by a machine, and those
that are reserved for the human administrator. This may require specifically identifying these
different elements in the legislation for the first time.
For example, amendments might be made to more clearly differentiate between bright-line
rules of eligibility and ineligibility (eg, a person is not eligible for a practising certificate as
a legal practitioner unless they have completed a certified course of study) and discretionary
issues on which judgment is required (eg, a practising certificate may only be granted if the
person is ‘fit and proper’). Machine technology might be authorised to determine the first
(eligibility according to rules) but not the second (eligibility according to broad discretionary
or evaluative issues).
Consideration might also be given to whether even those individual elements might be
amended or further broken down to facilitate processing by a machine. For example, if an
element is currently expressed in terms that confer a discretion, could the element be further
sub-divided into non-discretionary and discretionary components?
This approach may be challenging because, as currently drafted, few statutory provisions
expressly state whether, and in which respects, an administrator has ‘discretion’.

(b) Converting discretionary powers into non-discretionary rules
As we have seen, attempts to automate discretionary powers raise particular legal risks (see
chapter 7). It may be tempting then, to simply amend the relevant statutory provision to
remove discretion and thereby facilitate the adoption of machine technology.
That is, if the function currently involves a discretionary power to do something, the function
might be redrafted so that it is expressed instead as a non-discretionary duty to do that thing
whenever fixed and clear rules say that it must be done. These would be rules that a machine
can process, without any suggestion that it has fettered the discretion of the decision maker –
because the decision maker no longer has discretion.
This is essentially the kind of approach that has been taken in the Commonwealth Business
Names Registration Act (see above). As that example shows, however, even in circumstances
where it appears that clear and unbending rules would be appropriate, the removal of all
possible discretion can lead to results that, to human intuition, might seem absurd or defy
common sense.
We suggest that great care be taken before taking this approach. The prospect that machine
technology will create an incentive toward legislation that eliminates all discretion in favour of
fixed rules could raise concerns.
Discretion exists in the law for a reason, including to ensure that officials can provide
appropriately individualised solutions that take into account the unique context of the unique
human whose status, rights or interests may be affected by the exercise of functions on a
particular occasion.312 Discretion also exists because it is frequently impossible to precisely and
comprehensively detail in hard rules all possible situations that the law might need to deal with
in practice. Even where that might in theory be possible, it may be undesirable. The modern
trend toward overly complex and prescriptive legislative drafting has been criticised.313
Removing discretion could also mean that any right to seek merits review of the decision may
become ‘a meaningless and empty charade’ if the person conducting the review then lacks even
a residual discretion (for example, in the event of absurdity) to make a different decision.314

(c) Authorising the automation of discretion for ‘beneficial’ decisions only
An alternative is to retain the discretionary nature of the function, but for the legislation to be
amended to authorise the exercise of that discretion to be automated in limited cases. If this
authority is expressed clearly enough, it should override the usual presumption that when
Parliament confers a discretion on a person it intends for the discretion to be exercised by that
person and not to be fettered.
One suggestion sometimes made is that legislative authority could be given for discretion to be
automated in this way only in circumstances where ‘it is to apply beneficially to the person
affected’.315 However, this approach may have two limitations:
 First, the approach would seem feasible only for a function that is clearly expressed to be
binary in nature – that is, it involves a simple yes/no decision. In other cases, whether the
outcome is beneficial or not may be perceived differently by the agency and the person
affected. If, for example, a decision appears favourable but does not provide the person
with everything they wanted, would that count as beneficial?


 Second, the implicit suggestion in this proposal that no one can be harmed if the automated
process is only able to exercise powers beneficially may not be valid. Any automated
process can make two types of errors – false positives (type 1) and false negatives (type 2).
In this case, a false positive would mean that a machine has wrongly determined to exercise
the power beneficially in respect of a person.
The possibility of these kinds of errors may involve systemic discrimination and injustice.316
If, for example, one group of people (Group A) is systemically more likely to be the subject
of false positives than another group (Group B), then even though the machine is only
making beneficial determinations in any individual case, Group B may be said to be
indirectly harmed – in the sense that those in Group B will be systemically subject to less
favourable treatment.
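The arithmetic behind this second limitation can be made concrete with hypothetical numbers. In the sketch below, both groups are equally likely to be truly eligible for the beneficial exercise of the power, but the machine produces false positives at different rates for each group, so one group ends up receiving the benefit markedly more often.

```python
# Illustrative sketch with invented rates: even if a machine can only
# ever exercise a power beneficially, unequal false-positive rates mean
# one group systemically receives the benefit more often than another.

def grants_per_1000(eligible_rate, false_positive_rate):
    """Expected beneficial grants per 1,000 people in a group: correct
    grants to the eligible, plus wrong grants (false positives) to the
    ineligible."""
    eligible = 1000 * eligible_rate
    ineligible = 1000 - eligible
    return eligible + ineligible * false_positive_rate

# Both groups have the same true eligibility rate (30%), but the machine
# produces false positives for Group A far more often than for Group B.
group_a = grants_per_1000(eligible_rate=0.3, false_positive_rate=0.20)
group_b = grants_per_1000(eligible_rate=0.3, false_positive_rate=0.02)

print(group_a)  # 440.0 grants per 1,000 people
print(group_b)  # 314.0 grants per 1,000 people
```

Although no individual determination is unfavourable, Group B receives the benefit substantially less often than Group A, which is the systemic unfairness described above.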

(d) Authorising the automation of discretion with a right of review
Another similar approach is for legislation to expressly authorise the automation of a
discretionary power but subject to certain rights of objection and review.
For example, legislation could provide that a machine can make an initial determination as to
whether the discretionary power will be exercised, provided the person affected is provided
with advance notice of that proposed exercise and can request instead that a human make a
decision.317
In practice, of course, this would likely have much the same effect as the approach in (c) above
(as presumably people will only object to a determination made by the machine if it is
unfavourable to them).
This approach may, however, avoid the first limitation identified in (c) above, as it will be up to
the person affected to decide whether they consider the determination is beneficial or worth
objecting to. The second limitation in (c) above might still apply – that is, a provision of this kind
could still lead to systemic unfairness if the machine has a greater propensity to make
favourable determinations (including ‘false positives’) in respect of some groups than others.
This approach may also raise other concerns, including the risk that people who could and
should object to the machine’s determination will not do so. This may be because of the known
propensity for people to accept technology-generated outcomes as correct, or because some
people may be less able, through vulnerability, to exercise their right to object and request a
human-made decision.

(e) Beyond authorisation: regulation of machine technology
In the current legislative approaches we have seen, there has been little consideration given to
including in the legislation not merely a general authorisation to use a machine, but also specific
requirements to ensure that its use will be consistent with administrative law values of the kind
discussed in this report.
We see this as a missed opportunity. However, before we take up this issue further (in section
15.5 below), there is an important alternative approach to the legislative authorisation of
machine technology use that could be considered.


15.4 Transforming the substantive statutory function
An alternative approach to authorising machine technology for use in the exercise of a statutory function is to replace the relevant statutory function itself.
We are, of course, not suggesting that it is appropriate to amend legislation as a way of sidestepping the principles and concerns of administrative law. However, where a machine is appropriately designed to generate good public policy outcomes, then it may be appropriate to consider reframing the entire statutory function itself rather than seeking to simply overlay machine authorisation onto the existing function.
This approach requires a coordinated exercise of legislative and machine design. A simple example is as follows:
 Assume a relevant transport agency has the function of deciding whether or not a person should
be granted a driver licence. Currently this function is given to the Secretary of that agency, who
can delegate it to any officer of the agency. Such a delegation has been made to hundreds of
front line officers above a certain level of seniority.
 The legislation provides that a licence may only be granted if certain conditions apply (eg the
person is over a certain age and has been certified as having passed a driving test) and must
not be granted if certain other conditions apply (eg the person is subject to a current licence
disqualification period).
 Now assume the transport agency wants to automate the process of issuing licences. Following
the steps we set out in chapters 11 to 14, assume it designs a state-of-the-art machine that would
perform the function flawlessly.
 One approach the agency could take is simply to roll out the machine, and hope that its use is not
unlawful. However, this is a very high-risk approach, especially if the current function is expressed
in the legislation as involving some element of discretion (see chapter 7). Amending the Act to
include a simple provision stating that the Secretary (or delegate) is authorised to use a machine
in the exercise of the function may not completely remove the risk (see above).
 An alternative approach may be to design a wholly new legislative scheme to replace the current
driver licence issuing process. Under the new scheme, instead of the Secretary having a statutory
function (personally or through delegates) of deciding whether to issue driver licences, the
Secretary’s function would be to approve and authorise the operation of a machine that issues
such licences. (The legislation might also provide for the Secretary to retain a separate
discretionary power to issue licences outside the automated process.)
 In this way, one statutory function (issuing driver licences) is replaced by a new statutory function
(approving a machine that issues driver licences). And under the new legislative scheme, this
relevant function (approving the machine) is appropriately performed by a legally responsible
and accountable human administrator, the Secretary.

This approach could in some cases be preferable and produce a better public policy outcome than either alternative of attempting to automate without legislative authorisation or enacting an authority provision of the kind we discussed in the previous sections. Advantages to this approach may include:
1. First, and most obviously, there will be less doubt as to the legal efficacy of decisions made
by the machine.
2. Secondly, there will be transparency, as people will know that decisions are being made by
the machine, and the extent to which that is happening.


3. Thirdly, this approach does not circumvent administrative law, but it does change where it is
focused. The decisions of the Secretary in relation to approving and operating the machine will
be decisions that are subject to the usual requirements of administrative law and good
administrative practice.
This approach does have a major drawback, which is the potential for fewer or weakened mechanisms for legal redress for those who may be wronged by the process, or harmed by the outcomes of the new system.318
However, done right, the preparation, introduction and enactment of a new legislative scheme provides an opportunity for full public and Parliamentary debate about what legal redress avenues are required, and what other properties the machine-driven scheme must exhibit to ensure that it will uphold norms of good public administration.
In the hypothetical example above concerning proposed new legislation for a machine-operated driver licensing scheme, Parliament might also consider additional legislated elements of that scheme such as:
 mandating that the machine’s specifications (and any updates) be made public
 requiring the machine to be subject, prior to deployment and at regular intervals during its
operation, to external legal and technology audits, with findings to be made public
 introducing a clear right of full merits review to an appropriately senior (and human) officer
of the agency for any determination made by the machine that a person wishes to challenge
 ensuring that any person aggrieved by the machine outputs has access to some form
of external review or a right to complain to an appropriate oversight body.
In other words, designing legislation and machines together may provide an opportunity for better control of machine use – including by emphasising the primacy of legislation and by ensuring that the machine is fully visible to lawmakers, to the public, and to review bodies.

15.5 Mandating properties of machine technology
Whichever approach is taken, if legislation is to be drafted and debated to authorise machine technology, this also presents the opportunity to ensure that the technology has all of the properties necessary for its use to meet legal, Parliamentary and community expectations of good administrative practice.
This does not appear to be a naïve hope. When, for example, legislation was introduced for the use of machine technology for the detection of mobile phone offences while driving (see chapter 4), the legislation as introduced would have done no more than facilitate its use by reversing the onus of proof on drivers who wished to dispute infringement notices in court.319 However, the debate, both in the
Parliament and in a Parliamentary Committee,320 raised broader issues including the privacy and security of the personal data collected and the potential for algorithmic bias. Although the legislation now appears to have stalled in its entirety, a number of amendments had been proposed, including to expressly legislate rules for the proper destruction of images and personal data.321


The following is not intended as an exhaustive list, but provides an illustration of the kinds of properties that could be considered when legislating a new function for the approval of a machine technology system. The properties that are most important will differ depending on the context.
For example, in some contexts, having stronger properties in terms of reviewability may mean that weaker properties in terms of explainability could be acceptable. Where there is a possibility of algorithmic bias, having stronger properties relating to testing and auditing might be particularly important.

Properties – Examples of qualities that could be prescribed

Is it visible? What information does the public, and especially those directly affected, need to be told regarding the involvement of the machine, how it works, its assessed accuracy, testing schedule etc? Are the design specifications and source code publicly available – for example as ‘open access information’ under the GIPA Act? Is an impact assessment required to be prepared and published?322

Is it avoidable? Can an individual ‘opt out’ of the machine-led process and choose to have their case decided through a manual (human) process?

Is it subject to testing? What testing regime must be undertaken prior to operation, and at scheduled times thereafter? What are the purposes of testing (eg compliance with specifications, accuracy, identification of algorithmic bias)? Who is to undertake that testing? What standards are to apply (eg randomised control trials)? Are the results to be made public?

Is it explainable? What rights do those affected by the machine outputs have to be given reasons for those outcomes? Are reasons to be provided routinely or on request? In what form must those reasons be given and what information must they contain?

Is it accurate? To what extent must the predictions or inferences of the machine be demonstrated to be accurate? For example, is ‘better than chance’ sufficient, or is the tolerance for inaccuracy lower? How and when will accuracy be evaluated?

Is it subject to audit? What audit records must the machine maintain? What audits are to be conducted (internally and externally), by whom and for what purpose?

Is it replicable? Must the decision of the machine be replicable in the sense that, if exactly the same inputs were re-entered, the machine will consistently produce the same output, or can the machine improve or change over time? If the latter, must the machine be able to identify why the output now is different from what it was previously?

Is it internally reviewable? Are the outputs of the machine subject to internal review by a human decision maker? What is the nature of that review (eg full merits review)? Who has standing to seek such a review? Who has the ability to conduct that review and are they sufficiently senior and qualified to do so?

Is it externally reviewable? Are the outputs of the machine subject to external review or complaint to a human decision maker? What is the nature of that review (eg merits review or review for error only)? Who has standing to seek such a review? If reviewable for error, what records are available to the review body to enable it to thoroughly inspect records and detect error?

Is it compensable? Are those who suffer detriment by an erroneous action of the machine entitled to compensation, and how is that determined?

Is it privacy protective and data secure? What privacy and data security measures and standards are required to be adhered to? Is a privacy impact assessment required to be undertaken and published? Are there particular rules limiting the collection, use and retention of personal information?
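The properties above lend themselves to a simple structured checklist. The following sketch is illustrative only: the property names and the `SystemAssessment` class are our own shorthand for this example, not drawn from any statute, and show one way an assessing body might record findings against each property and flag those not yet addressed.

```python
# Illustrative checklist structure for the properties discussed above.
# All names here are hypothetical shorthand, not statutory terms.
from dataclasses import dataclass, field

PROPERTIES = [
    "visible", "avoidable", "tested", "explainable", "accurate",
    "auditable", "replicable", "internally_reviewable",
    "externally_reviewable", "compensable", "privacy_protective",
]

@dataclass
class SystemAssessment:
    system_name: str
    findings: dict = field(default_factory=dict)  # property -> recorded notes

    def record(self, prop: str, notes: str) -> None:
        # Only the named properties can be assessed.
        if prop not in PROPERTIES:
            raise ValueError(f"unknown property: {prop}")
        self.findings[prop] = notes

    def unaddressed(self) -> list:
        # Properties for which no finding has yet been recorded.
        return [p for p in PROPERTIES if p not in self.findings]

assessment = SystemAssessment("example decision system")
assessment.record("avoidable", "manual (human) process available on request")
print(assessment.unaddressed())
```

As the report notes, which properties matter most is context-dependent, so a checklist like this records findings rather than computing a pass/fail score.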


16. Coda – new laws for new technology?
Throughout this report we have focused on how existing laws and norms of public sector administrative decision-making may control the use of machine technology when used in that context.

Uncertainties and gaps in the existing legal framework
However, we have also observed that there are likely to be, at least initially, significant uncertainties and potentially significant gaps in the existing legal framework given what are likely to be rapid and revolutionary changes to the way government conducts itself in coming years.
One risk, for example, may be that machine technology will be capable of producing extremely large-scale systemic injustices that are not possible or likely under current technologies. The existing framework of administrative law, which is typically concerned with the protection of individual rights and interests, may be ill-equipped or at least too slow to respond.323
Indeed, the fact that administrative law is primarily developed through the decisions of courts, tribunals and other review bodies is one of its strengths, as it provides flexibility – including to accommodate changing technologies. However, it also means that any consideration and determinative rulings are inherently ‘after the fact’. The pace at which legal certainty is provided may be substantially slower than is desirable.
Because oversight decisions first require a challenge to be brought, courts and others must generally wait for opportunities to arise when they can consider and offer certainty about the application or extension of legal and ethical norms to new situations and new technologies.
Those opportunities may arise even less rapidly or frequently in the case of the new machine technologies, given the following:
(a) invisibility – currently, and despite the views we set out in chapter 13, government agencies are
not routinely publishing or informing those affected about their use of machine technology
(b) technical opacity – the complexity of the technology may make it harder for individuals
wronged by decisions to recognise error or maladministration, even if intelligible reasons for the
individual decision in their case are given324
(c) systematisation – errors introduced by the technology are more likely to be systemic in nature,
rather than just affecting a particular individual, which may make it less likely that any individual
will challenge the decision
(d) vulnerability – in the public sector context machine technology has more frequently been used
in ways that affect people in lower socio-economic groups or who are otherwise more
vulnerable, and who may accordingly have less capacity or resources to recognise and challenge
potentially unlawful decisions.
We finish this short final chapter, then, by asking whether existing laws and associated institutional frameworks are adequate, and whether new laws should be considered.

Modernising administrative law for the new machinery of government
In the previous chapter, we noted that if a statute is to be amended to specifically authorise a particular use of machine technology, this creates an opportunity for Parliament to consider scaffolding a governance framework around that technology. That could include stipulating certain properties the system must exhibit in terms of transparency, accuracy, auditability, reviewability, and so on.


However, is there a need to consider more generally applicable legal or institutional reform, particularly to ensure that machine technology is subject to appropriate governance, oversight and review when used in a government context?325
There may be precedent for this approach. The machinery of Australia’s modern administrative law – the administrative decisions tribunals, ombudsman institutions, privacy commissions, and (in some jurisdictions) codified judicial review legislation – was largely installed in a short period of intense legislative reform, responding to what was then the new technology of modern government.326
The Government of Canada has also recently taken steps in this direction, with its ‘Directive on Automated Decision-Making’. The Directive was issued in 2019 as part of the Government’s commitment to using artificial intelligence ‘in a manner that is compatible with core administrative law principles such as transparency, accountability, legality, and procedural fairness’. The Directive sets out requirements to increase the transparency of such systems, including public notice of the use of an automated decision system, the provision of reasons for decisions, and release of source code. Quality assurance requirements include testing and monitoring, ensuring data quality, and consultation with legal services to confirm the system is legally compliant.
The Directive aims to set core requirements for increased transparency and reduced risk for government use of machine technology. However, the Directive does not apply to all agencies and there are certain limitations of scope such as application only to automated decision systems developed or procured after 1 April 2020.327

As we come to understand better how machine technology will impact on government decision-making, consideration may need to be given to whether, and if so how, the legal and institutional framework might again need to be modernised to address the new challenges.
In the interim, we will also continue to consider the role and value the NSW Ombudsman can and should bring to this area, given our existing statutory functions and resources.

Ombudsman institutions328 have proven useful in many areas where traditional regulation and judicial enforcement are inadequate or inefficient. They seem particularly well placed to play an active role in the burgeoning field of machine technology given their independence, their ability to operate with greater agility and informality than judicial processes, and their powers to require agency co-operation and access.
Ombudsman institutions also have the ability to not only respond reactively to individual complaints but also to proactively inquire into potential systemic issues, and the ability to make public reports and recommendations to improve practices, policies and legislation.329 On the other hand, it must also be recognised that ombudsman institutions may be limited at present by a lack of the deep technical skills and resources needed for any sophisticated deconstruction and interrogation of data quality and modelling, which may, at least in some cases, be required for effective scrutiny and investigation of machine technology.330


Endnotes


1 See Australian Government, Digital Transformation Strategy 2018-2025, (Webpage) .
328 This is true also of bodies that may not necessarily bear the title of Ombudsman, but which perform similar and in some
cases more specialised roles, including for example Human Rights Commissions or Information and Privacy Commissions.
329 Cf Simon Chesterman, We, the Robots? Regulating Artificial Intelligence and the Limits of the Law (Cambridge University
Press, 2021) 220-222 (suggesting the establishment of ‘an AI Ombudsperson’).
330 Cf Coglianese and Lehr (n 91) 1190 (suggesting oversight approaches including ‘the establishment of a body of neutral
and independent statistical experts to provide oversight and review, or more likely a prior rule making process informed
by an expert advisory committee or subjected to a peer review process’).



NSW Ombudsman

Annexure A – Revenue NSW case study

The following case study is an annexure to the special report to Parliament under section 31 of the Ombudsman Act titled ‘The new machinery of government: using machine technology in administrative decision-making’ (29 November 2021).

Annexure A – Revenue NSW case study i
NSW Ombudsman

Contents

Annexure A – Revenue NSW case study
1 Overview of the Revenue NSW case study
   Complaints
   Legal advice
2 Statement of Facts – Revenue NSW’s system for issuing garnishee orders
   PART A: PRELIMINARY
   PART B: THE LEGISLATIVE CONTEXT
   PART C: REVENUE NSW’S GARNISHEE ORDER (GO) SYSTEM
   PART D: MODIFICATIONS TO THE GO SYSTEM
   PART E: IMPACT AND EFFECTIVENESS OF THE GO SYSTEM
3 Questions for Counsel – Revenue NSW’s use of automation technologies in administrative decision-making
4 Legal Opinion of James Emmett SC and Myles Pulsford


1 Overview of the Revenue NSW case study
Garnishee orders are one of a range of civil sanctions available under the Fines Act 1996 (Fines Act) to
recover outstanding fines debt. The orders can only be issued when a fine defaulter has not engaged with
Revenue NSW following several notifications of an outstanding debt. Under a garnishee order, a financial
institution (typically a bank) is ordered to transfer funds to Revenue NSW from an account held by the
fine defaulter to satisfy outstanding debt. Account holders are not given prior notice of the order.
Revenue NSW now uses automation to issue large volumes of garnishee orders to banks. There are two
core information technology applications used:
1. Fines Enforcement System (FES)
2. Debt Profile Report (DPR)
The FES is essentially a database of information about individual fine defaulters. The DPR is a business rule
engine that takes the data in the FES (inputs), applies analytics that reflect business and prioritisation rules
(analytics), and generates customer profiling and activity selection (outputs). Together, the FES and the
DPR manage the end-to-end lifecycle of an enforced fine. Most steps are undertaken without staff
involvement, by following pre-programmed business rules.
Revenue NSW issues an electronic file of garnishee orders to the major banks on a nightly basis. The file
includes contact details of thousands of fine defaulters and an order that the bank is to attempt to
garnishee funds if an account in the name of the fine defaulter is held with that bank.

Complaints
Some time ago we commenced an investigation into a rising number of complaints we were receiving
from individuals whose bank accounts had been the subject of garnishee orders by Revenue NSW.
When we first began receiving these complaints, neither we nor the complainants were aware that
Revenue NSW was using machine technology for its garnishee processes. We published several case
studies in our annual reports about the hardship caused by garnishee orders. In a number of cases,
the complainants had been left with a zero balance in their account. Some of the complainants were
welfare recipients, whose bank accounts had held the funds they were receiving from Centrelink as
their only source of income.
The number of garnishee orders issued by Revenue NSW increased over time – from 6,905 in the 2010-11
financial year to more than 1.6 million in 2018-19. As the number of garnishee orders issued increased,
we continued to receive a significant and increasing volume of complaints about their administration
and impact.
We made detailed inquiries with Revenue NSW into whether adequate protections were being afforded
to those who were at risk of hardship before, or as a result of, a garnishee order. We also made inquiries
into how Revenue NSW dealt with claims of hardship and requests for a refund after a garnishee order
had been actioned.
During this process we became aware of the extent of automation used in the issuing of garnishee orders,
and several changes were made by Revenue NSW including:
 In August 2016 Revenue NSW implemented a ‘minimum protected amount’ to garnishee orders
issued to banks. This meant that only amounts over a specified minimum – currently $523.10 –
could be subject to a garnishee order.


 In September 2018 Revenue NSW took steps to exclude ‘vulnerable persons’ from the making of
garnishee orders. It did this by implementing a new machine learning model within their systems
with the intention of identifying and excluding persons identified as vulnerable.
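The effect of a ‘minimum protected amount’ can be shown with simple arithmetic: the amount garnisheed is capped at the debt and can never take the account below the protected floor. The $523.10 figure is the amount cited above; the function itself is a hypothetical sketch, not Revenue NSW’s implementation.

```python
# Illustrative arithmetic only. The protected amount is the figure cited in
# the report; the function is a hypothetical sketch of the cap's effect.
MINIMUM_PROTECTED_AMOUNT = 523.10

def garnisheeable(balance: float, debt: float) -> float:
    """Amount that may be taken: never more than the debt, and never
    leaving the account below the minimum protected amount."""
    available = max(0.0, balance - MINIMUM_PROTECTED_AMOUNT)
    return round(min(available, debt), 2)

print(garnisheeable(balance=600.00, debt=1000.00))  # 76.9
print(garnisheeable(balance=500.00, debt=1000.00))  # 0.0
```

On this sketch, an account holding only $500 is untouched, which is the protection the 2016 change introduced for low-balance accounts.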
We also made a number of comments to Revenue NSW under s 31AC of the Ombudsman Act. Such comments are not findings of wrong conduct – they are a formal means of
informing an agency that we believe action is required to ensure it acts reasonably and lawfully. One of
our 31AC comments was that Revenue NSW should seek expert legal advice on the legality and design of
its automated system.
Revenue NSW agreed with most of the actions we suggested, including to develop and issue a
consolidated hardship policy, which was published on its website.
After we raised concerns about the ‘automation’ of garnishee orders, Revenue NSW in March 2019
introduced an additional manual step in the process of issuing garnishee orders. This ‘human-in-the-loop’
process required a Revenue NSW staff member to formally authorise the issuing of the proposed
garnishee orders. This is effected by way of a traffic light system that applies criteria developed from the
Fines Act and business rules to a bulk number of files selected by the technology systems for a garnishee
order. Where all lights are green, a Revenue NSW staff member approves the garnishee orders and the
electronic file is transmitted to the banks.
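The general shape of such a ‘traffic light’ gate can be sketched as follows. The individual checks are invented placeholders, not Revenue NSW’s actual criteria; the point is only that release of the file requires every light to be green plus a recorded human approval.

```python
# Hedged sketch of a 'traffic light' approval gate: each batch-level check
# reports green (True) or red (False), and a human operator's approval is
# required before the file is released. The checks are invented placeholders.

def check_summary(batch: list) -> dict:
    return {
        "all_debts_outstanding": all(r["debt"] > 0 for r in batch),
        "all_enforcement_authorised": all(r["authorised"] for r in batch),
        "no_vulnerability_flags": all(not r["vulnerable"] for r in batch),
    }

def release_file(batch: list, operator_approved: bool) -> bool:
    lights = check_summary(batch)
    # Transmit only if every light is green AND a human has approved.
    return all(lights.values()) and operator_approved

batch = [{"debt": 250.0, "authorised": True, "vulnerable": False}]
print(release_file(batch, operator_approved=True))   # True
print(release_file(batch, operator_approved=False))  # False
```

As Counsel’s later opinion suggests, a gate of this shape still concentrates the substantive assessment in the automated checks, with the human step reduced to confirming that all lights are green.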
Revenue NSW’s view was that this change would avoid any legal doubt as to the lawful exercise of
discretionary power under the Fines Act.
Although we were satisfied by the steps Revenue NSW was taking to address the particular vulnerability
and hardship issues raised in complaints (as a result of which we discontinued our investigation of those
complaints), we continued to hold doubts about the legality of the machine processes that it was
continuing to use to issue garnishee orders.
When we later followed up to check on the legal advice we suggested Revenue NSW obtain, we were
advised that no advice had been sought, either externally or from the legal branch of the Department of
Customer Service, of which Revenue NSW is part.
We decided to seek our own legal advice. We worked with Revenue NSW to develop a ‘statement of
facts’, which we agreed provided a comprehensive and accurate statement of how Revenue’s NSW
garnishee system was operating (section 2). We then provided that statement to our Senior Counsel
together with a series of questions (section 3). Counsel’s response to our questions is set out at
section 4 below.

Legal advice
The modification made by Revenue NSW in March 2019 to introduce a human-in-the-loop process meant
that the systems used before and after this time differed in significant ways. Where relevant, we asked
questions of Senior Counsel in relation to both systems.
Counsel’s opinion was that, to the extent a person authorised by the Fines Act to make garnishee orders
was not involved in the automated issue of garnishee orders under that Act between early 2016 and
March 2019, Revenue NSW’s processes were not lawful.
There are three relevant aspects of the Fines Act:
1. The power to make a garnishee order lies with the Commissioner of Fines Administration
(Commissioner), delegate or a person authorised to exercise that function by the Commissioner
(ss 73(1), 116A and 116B).


2. In order to make a garnishee order, the Commissioner (or delegate or authorised person) must be
‘satisfied’ that enforcement action is authorised (s 73(2)). That satisfaction is a condition
precedent to the making of a garnishee order.
3. The Commissioner (or delegate or authorised person) ‘may’ make a garnishee order (s 73(1)).
This is a discretionary power. However, the degree of discretion that is open to the decision-
maker differs depending on the situation.
In some situations (described in s 71(1)), the Commissioner is required to take some form of civil
enforcement action, and their discretion is confined to deciding which particular civil enforcement
action is to be taken. There are three forms of civil enforcement – property seizure orders,
garnishee orders, and the registration of charges on land. Within those forms there is further
optionality in terms of the particular land or property that is to be the subject of a seizure order or
land charge, or the particular person who is to be the subject of a garnishee order. For example, a
garnishee order could be directed to a person’s bank, a person’s employer, or any other person.
There are other situations (described in s 71(1A)) in which the Commissioner has a broader
discretion, including whether to take any civil enforcement action at all.

Counsel advised that Revenue NSW’s use of machine technology for the making of garnishee orders
between early 2016 and March 2019 was unlawful because no authorised person engaged in a mental
process of reasoning to reach the state of satisfaction required to issue a garnishee order, and because
the discretionary power was not being exercised by the authorised person.
As noted above, Revenue NSW implemented a change to the system in March 2019 whereby a designated
staff member was required to first review a ‘check summary report’ (essentially a traffic light system) and
formally authorise the issuing of garnishee orders.
Revenue NSW has also confirmed that its process (and the check summary report) only identifies
orders for fine defaulters whose circumstances fall under s 71(1) (and not s 71(1A)) of the Fines Act. That
is, the discretionary power of the Commissioner in these circumstances is a limited discretion – the
Commissioner ‘is to’ take civil enforcement action (s 71(1)), and their decision is limited to deciding what
particular form of action to take and, if they decide to issue a garnishee order, in what terms that order
will be issued and to whom.
Counsel’s opinion was that, although the modification of including a check summary report process
meant that the power to issue garnishee orders was formally being exercised by a person authorised
to exercise the power, there remained doubt that the person was either forming the required state of
satisfaction before making garnishee orders, or genuinely exercising the discretionary power to make
the orders.
While Counsel’s view was that it may be open to Revenue NSW to adopt a system under which
an authorised decision maker considers issuing garnishee orders for multiple fine defaulters
simultaneously, it was not sufficient for the decision maker to approve the issuing of those orders
simply on the basis of a green light generated by the traffic light report process.
Counsel advised that problems described with the lawfulness of the process could be addressed
by modification of the process or legislative amendment.
Counsel advised that there are two possible avenues to challenge a garnishee order issued by Revenue
NSW. The first is to the Local Court under Part 8 of the Civil Procedure Act 2005 (s 124A). The second is
that a fine defaulter may in certain circumstances challenge the legality of a garnishee order in the
Supreme Court.


2 Statement of Facts – Revenue NSW’s system for issuing
garnishee orders
This document is a description of Revenue NSW’s garnishee order systems and processes, including
key modifications made over time. The document was prepared by NSW Ombudsman and Revenue
NSW and formed the basis of the instructions for legal advice.

PART A: PRELIMINARY
Defined terms
In this document:

“Commissioner” means the Commissioner of Fines Administration.

“Fine defaulter” means a person who is, or who is alleged to be, liable to pay a fine under either a court
enforcement notice or a penalty notice enforcement order (within the meaning of the Fines Act).

“Fines debt” means an amount that a fine defaulter is liable to pay, but has not paid, under either a
court enforcement notice or a penalty notice enforcement order (within the meaning of the Fines Act).

“Garnishee Order” means a garnishee order made by the Commissioner under section 73 of the
Fines Act.

“Original Version” refers to the GO system used by Revenue NSW in the administration of Garnishee
Orders in early 2016.

“Current Version” refers to the GO system used by Revenue NSW in the administration of garnishee
orders today.

“Vulnerable Person” includes (but is not limited to) any person listed in sub-section 99B(1)(b) of the
Fines Act as a person in respect of whom a work and development order may be made in respect of a
fine, being person who: has a mental illness, has an intellectual disability or cognitive impairment, is
homeless, is experiencing acute economic hardship, or has a serious addiction to drugs, alcohol or
volatile substances. “Vulnerable” and “vulnerability” have corresponding meanings.

Acronyms and abbreviations

DPR Debt Profile Report

FES Fines Enforcement System

GO Garnishee Order

SOR System of Record

WDO Work and Development Order

List of legislation
Civil Procedure Act 2005 (NSW) (Civil Procedure Act)
Fines Act 1996 (NSW) (Fines Act)


Fines Regulation 2015 (NSW) (Fines Regulation)
Government Sector Employment Act 2013 (NSW) (Government Sector Employment Act)
Ombudsman Act 1974 (NSW) (Ombudsman Act)
State Debt Recovery Act 2018 (NSW) (State Debt Recovery Act)
Taxation Administration Act 1996 (NSW) (Taxation Administration Act)

Unless otherwise stated, a reference in this document to a legislative provision is a reference
to that provision of the Fines Act.


PART B: THE LEGISLATIVE CONTEXT

Revenue NSW and the Commissioner of Fines Administration

1. Revenue NSW is the administrative agency of the NSW Government responsible for collecting
revenues, administering grants and recovering fines and debts.

2. It is currently a division of the Department of Customer Service. The Department of Customer Service
is a public service department established under the Government Sector Employment Act. The staff
employed by the Department of Customer Service are public servants under that Act.

3. Revenue NSW was established on 31 July 2017, following a name change from the Office of State
Revenue and State Debt Recovery Office.

4. The head of Revenue NSW holds the senior executive public service role of “Deputy Secretary” (of
the Department of Customer Service). That person also holds the roles of “Commissioner of Fines
Administration” under section 113 of the Fines Act and “Chief Commissioner of State Revenue”
under section 60 of the Taxation Administration Act.

5. Functions relating to fines enforcement under the Fines Act are conferred on the Commissioner of
Fines Administration.

The statutory power to make Garnishee Orders

6. Under section 73(1) of the Fines Act, the Commissioner “may make an order [i.e. a Garnishee Order]
that all debts due and accruing to a fine defaulter from any person specified in the order are attached
for the purposes of satisfying the fine payable by the fine defaulter.”

7. The debts that can be enforced by way of a Garnishee Order are debts accruing in respect of:
• a fine imposed by a court following the making of a court enforcement order, and
• the amount payable under a penalty notice following a penalty notice enforcement order
(s 57).

8. Under s 73(4), a Garnishee Order operates as a garnishee order made by the Local Court under Part 8
of the Civil Procedure Act. For this purpose, the Commissioner is taken to be the ‘judgment creditor’
and the fine defaulter is the ‘judgment debtor’.

9. Section 117 of the Civil Procedure Act sets out how the order operates in relation to a bank:
“(1) Subject to the uniform rules, a garnishee order operates to attach, to the extent of the amount
outstanding under the judgment, all debts that are due or accruing from the garnishee to the
judgment debtor at the time of service of the order.
(2) For the purposes of this Division, any amount standing to the credit of the judgment debtor
in a financial institution is taken to be a debt owed to the judgment debtor by that institution.”

10. A Garnishee Order is one of a range of civil enforcement actions that may be taken by the
Commissioner to recover certain fines debt under Part 4, Division 4 of the Fines Act. Other possible
actions include property seizure orders, examination summons and notices, and charges on land.

11. Under s 73(2), the Commissioner “may make a garnishee order only if satisfied that enforcement
action is authorised against the fine defaulter under this Division [Part 4, Division 4].”


The statutory process leading to the making of a Garnishee Order

12. In respect of fines debt arising in respect of unpaid penalty notices, the standard process leading to
consideration of any civil enforcement action under the Fines Act is as follows:

(1) Penalty Notice
A ‘penalty notice’ is issued (Part 3, Division 2).

(2) Penalty Reminder Notice
If the amount payable under the penalty notice remains unpaid within the time period required by the
notice, a ‘penalty reminder notice’ is issued (Part 3, Division 3).

(3) Penalty Notice Enforcement Order
If the amount payable is still unpaid, the Commissioner may issue a ‘penalty notice enforcement
order’ (Part 3, Division 4).
From this point, the person owing the fine is referred to as a ‘fine defaulter’. Additional fees may
apply for the cost of enforcement action taken at this and subsequent stages of the process.

(4) RMS enforcement action
If the amount payable continues to be unpaid, the Commissioner may direct Roads and Maritime
Services (RMS) to take certain enforcement action, which may include suspending or cancelling the
driver licence or vehicle registration of a fine defaulter.
RMS sanctions are not to be applied in certain circumstances, such as where the fine defaulter
is under the age of 18 and the fine does not relate to a traffic offence (s 65(3)(b)).
RMS sanctions also need not be applied (before proceeding to civil sanctions) if the RMS sanctions are
unavailable or if the Commissioner is satisfied that they would be unlikely to be successful or would
have an excessively detrimental impact on the fine defaulter (ss 71(1) and 71(1A)).

(5) Civil enforcement action
If the amount payable remains unpaid and RMS enforcement action is either unavailable or
unsuccessful, civil enforcement action may be taken (s 71(2)), including the making of a Garnishee
Order (s 73).

Other relevant statutory provisions

13. (Notice) A Garnishee Order may be made without notice to the fine defaulter (s 73(3)).

14. (Service) A Garnishee Order can be served electronically by Revenue NSW using an information
system (s 73(5)).3

15. (Access to information) The Commissioner is authorised to access information for the purposes of
taking enforcement action including:
a. from police and government agencies, including Roads and Maritime Services – criminal
record, address, property, date of birth, driver licence number, details of bank account number
or employer of a fine defaulter (s 117)
b. information held by employers (s 117AA)


c. information held by credit-reporting bodies including the name of a person’s financial
institution and details of any account held (s 117AB).

16. (Delegation) The Commissioner may delegate any functions under the Fines Act (other than the
power of delegation itself) to “any person employed in the Public Service” (s 116A(1)). Enforcement
functions may be exercised by the Commissioner “or by any person employed in the Public Service
who is authorised by the Commissioner to exercise that function” (s 116B).

17. Under s 116A(2), the following functions may be delegated to “any person” (i.e. not just to a person
employed in the Public Service):
(a) The function of serving notice of a fine enforcement order (which includes a penalty notice
enforcement order) (s 59).
(b) The function of notifying a fine defaulter of certain RMS enforcement action, such as driver licence
suspension (s 66).
(c) The function of serving (but not issuing) an order for examination.

18. (Enforcement cost recovery) The Fines Regulation sets out the costs for enforcement action under the
Fines Act.

19. (Reviews) The Fines Act contains no right of review or statutory appeal right in respect of the making
of a Garnishee Order. However:
(a) “the Commissioner may, on application under section 46 or the Commissioner’s own
initiative, withdraw a penalty notice enforcement order” in certain circumstances including if
the Commissioner is “satisfied that there is other just cause why the application should be
granted, having regard to the circumstances of the case” (s 47(1)(i)).
(b) A person may apply to have the penalty notice enforcement order annulled by the
Commissioner (Part 3, Division 5).

20. (Refunds) Under s 77A of the Fines Act, the Commissioner may refund all or part of an amount paid
under a Garnishee Order on the ground of hardship experienced by the fine defaulter or their
dependant. The debt remains payable including any amount refunded to the fine defaulter (s 77A(2)).

PART C: REVENUE NSW’S GARNISHEE ORDER (GO) SYSTEM
21. The GO system described in this document is the one that has been used by Revenue NSW in the
administration of Garnishee Orders since at least January 2016.

22. Changes have been made to the system from time to time since then. However, despite those
changes, it is recognisably the same system.

23. In this document, ‘Original Version’ refers to the GO system as it was in early 2016 and ‘Current
Version’ refers to the system as it is today. The most significant changes that have been made
between the Original Version and the Current Version are noted in the sections below.

Revenue NSW’s published policy documents

24. Revenue NSW has no published policies specifically relating to the making of Garnishee Orders.

25. Revenue NSW has internally published business rules relating to the making of garnishee orders.

26. Other policies of relevance include:
(a) Hardship Policy, first published on the Revenue NSW website on 1 November 2019
and available here: https://www.revenue.nsw.gov.au/help-centre/resources-library/hardship-
policy
(b) Privacy Policy, most recent version published on the Revenue NSW website on
1 May 2020 and available here: https://www.revenue.nsw.gov.au/privacy

Revenue NSW’s instruments of delegation

27. The Revenue NSW instruments of delegation are at Attachment A.

Core technology elements of the GO system

28. There are two core information technology applications used in the GO system:
a. Fines Enforcement System (FES) – database and transaction processing
b. Debt Profile Report (DPR) – analytics

29. The FES contains the system of record (SOR), which is essentially a database of records that includes:
 names of ‘customers’4
 information about the debt (fine information)
 contact information
 record history (e.g. former addresses, former names)
 financial records of the customer.

30. The FES interfaces directly with SORs of other government agencies, including RMS.

31. The FES also handles the processing of transactions (including, in particular, civil enforcement action).
In relation to Garnishee Orders, the FES:
 records the Garnishee Order ‘transaction’

 transmits the Garnishee Order to the relevant financial institution or other recipient (either
electronically where that is possible or by generating an order that is sent by post where
electronic transmission is not possible)
 interprets the response from the recipient
 processes applicable payments and other transactions.

32. The DPR (Debt Profile Report) is a business rule engine that takes the data in the FES (inputs), applies
analytics that reflect business and prioritisation rules (analytics), and generates customer profiling
and activity selection (outputs). The main function of the DPR is to ‘select’ the next enforcement
action to be taken in respect of a file in the FES (e.g., SMS reminder message, data match request,
Garnishee Order, and so on).

33. Once selected by the DPR, a message is sent by the DPR to the FES instructing the FES to either
process the selected action (if it is an automated action) or to notify staff of the need to undertake
the selected action (if it is a manual action).
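
The selection step described above can be sketched, in highly simplified form, as follows. This is an illustrative Python sketch only: the attribute names and the rules shown are assumptions for the purpose of illustration, not Revenue NSW's actual business rules.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical customer attributes; the real DPR draws on far richer data in the FES.
@dataclass
class CustomerFile:
    order_open: bool            # enforcement order not yet 'closed'
    on_payment_plan: bool       # file under management
    days_since_printed: int     # days since the enforcement order was 'printed'
    rms_sanction_applied: bool  # RMS enforcement action already taken

def select_next_action(c: CustomerFile) -> Optional[str]:
    """Select a single 'next best action' for a file, or None (illustrative rules only)."""
    if not c.order_open or c.on_payment_plan:
        return None  # fine paid or under management: no further action selected
    if not c.rms_sanction_applied and c.days_since_printed >= 37:
        return "RMS_ENFORCEMENT"           # cf. Step 6 below
    if c.rms_sanction_applied:
        return "ASSESS_CIVIL_ENFORCEMENT"  # cf. Step 7 below
    return "SMS_REMINDER"
```

The key design point is that the function returns exactly one action (or none) per file, mirroring the DPR's 'one next action at a time' approach described at paragraphs 42–43.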

The standard process for enforcing an unpaid fine in the Original Version

34. Together, the FES and the DPR manage the end-to-end lifecycle of an enforced fine.

35. The following steps describe the standard process flow of a fine as it proceeds toward a Garnishee
Order. It is not exhaustive and does not describe all possible alternative processes and outcomes.

36. It is noted that from Step 2 below, except where staff involvement has been specifically indicated,
each step is undertaken as a result of Revenue NSW’s programmed business rules and core
technology systems which interface with external systems as indicated.

37. At any time during the below process, a customer may elect to:
 pay the fine debt in full,
 enter into a payment plan, or
 contact Revenue NSW for further options such as a work and development order, dispute or
write off.
The taking of any of those actions will cut short the process.

Step 1 – Fine loaded

The fine is ‘loaded’ from the issuing agency into the SOR (in the FES). That is, details of the
relevant penalty notice, court fine, electoral fine or sheriff office jury branch fine are transmitted
electronically to the FES.

Step 2 – Validation of details

The FES ‘validates’ the referred details, ensuring the minimum amount of customer details are
present (date of birth, name, address) and the offence details are present and in the right
format. Staff intervention may be required if the FES identifies a critical error.

Step 3 – Enforcement order generated

An enforcement order is automatically generated. In the case of a fine debt arising from a
penalty notice, this is a ‘penalty notice enforcement order’.
Either a new customer file is created in the SOR or the enforcement order is linked to an
existing customer.

Staff intervention is required if the FES identifies an error. This may occur if, for example, the
system is unable to verify whether an incoming fine requires a new customer record to be
created or should be matched to an existing customer record.

Step 4 – Data matching to confirm address details

If possible, a data match is conducted against RMS’s system to confirm that Revenue NSW has
the most up to date customer address and contact information.
Staff intervention is required when the RMS returns an error or anomaly.

Step 5 – ‘Printing’ the enforcement order

The enforcement order is ‘printed’. This means that the order is despatched to the customer by
post or, if the customer has previously consented to receiving such material electronically, by
email. At this point the due date for payment (+28 days) is set. If the enforcement order is
posted, the enforcement order is printed, enveloped and despatched with no staff involvement
other than as required for ordinary mail handling. If the enforcement order is emailed, the email
is generated and transmitted without staff involvement.
Before the due date the customer may receive a SMS message (if they have previously opted-in
to receive such messaging) advising them that an enforcement order has been issued and they
should expect it shortly.

Step 6 – RMS enforcement action

On ‘day +37’ (that is, thirty-seven days after the enforcement order was ‘printed’), a request
is automatically issued by the FES to the RMS to apply enforcement action under Part 4,
Division 3 of the Fines Act if:
 the enforcement order remains ‘open’ in the FES (e.g., it has not been ‘closed’ by reason
of the fine having been paid), and
 the enforcement order is not recorded as being subject to a payment plan or as otherwise
being under management.
If the RMS takes enforcement action, a message is sent by RMS to the FES, and the customer is
issued a ‘sanction application letter’ by Revenue NSW. Licence sanctions and vehicle sanctions
take effect 14 days after the sanction application letter is ‘printed’ (that is, despatched by
email, if the customer has previously consented to receive such materials by email, or by post).
During this time the customer (if opted-in to receive messages) may receive a SMS message
advising them that an RMS sanction has been applied.

Step 7 – Assessment for Garnishee Order or other civil enforcement action

At the expiration of the 14 day period (if an RMS sanction was applied, the enforcement order
remains ‘open’, and the enforcement order is not recorded as being subject to a payment plan
or as otherwise being under management) the customer is assessed to determine whether any
civil enforcement action, including any Garnishee Order (directed to a bank or an employer)
should be made.
The assessment is undertaken by the DPR (Debt Profile Report).

The Debt Profile Report (DPR)
38. The DPR effectively determines which potentially eligible civil enforcement actions are to be applied
to fine defaulters whose fines debt is recorded in the FES.
39. Actions may include Garnishee Orders (bank, employer and third party), property seizure orders,
examination summons and notices, referral of the debt to a private debt collector and/or various
data matching routines with both the RMS and credit reporting bureaus.
40. Revenue NSW’s analytics team maintains the DPR, which categorises all active fine defaulter records
in the Fine Enforcement System (FES) and determines the next best course of action for each of
them.
41. The development and creation of the DPR was the result of a long collaboration between the
operational areas of Revenue NSW and its analytics team. Originally created in 2013, the DPR has
continued to be enhanced over time and Revenue NSW advises that it “is continually improved and
updated to ensure it is providing the maximum benefit to all business areas”.
42. The DPR is a ‘centralised business rules’ engine. This means that customers are assessed for all
potentially applicable actions in one process. The DPR replaced previous approaches that had
involved ‘multiple business rules’ engines being applied in respect of different processes, which had
created problems where the same customer could be selected for multiple actions at the same time.
43. The DPR, by contrast, ensures that only one ‘next action’ for any file is selected at any time, being the
action that is considered most appropriate for that customer at that time. This ensures that
customers flow through a process one action at a time, before moving on to other actions.
44. Revenue NSW advises that, as well as avoiding the problem of multiple actions being selected for
implementation simultaneously, the DPR also improves on previous approaches by ensuring that any
actions, such as the selection of customers for Garnishee Orders, are taken in a consistent manner
according to pre-approved business rules.
45. Those business rules are coded into algorithms in the DPR. The DPR does not utilise machine learning
technology or other forms of ‘artificial intelligence’.
46. The DPR’s business rules are developed by subject matter experts in Revenue NSW’s business areas,
translated by its analysts into code-able instructions, and then incorporated by software coders into
the DPR code.
47. All business rules and changes to business rules require approval by a senior executive (Executive
Director). Once business rule amendments have been approved, changes to the DPR code are made
with oversight by another executive (Director). There is no formal delegation for these business rules.
The roles in the rules process have been approved by the Executive Director.
48. A more detailed description of how the DPR works is at Attachment B.

Further steps for enforcement by way of a Garnishee Order
49. Picking up from Step 7 above (that is, after RMS enforcement action has been attempted and if the
debt remains outstanding after 14 days) the next steps in the process toward enforcement by
Garnishee Order are as follows:

Step 8 – Queuing of customers for Garnishee Orders

The DPR applies its coded business rules to pool customers into categories based on the next
proposed enforcement action. The categorisation rules are generally aimed at assessing the
potential success of each potential type of enforcement action, having regard to various customer
attributes including the customer’s age, the debt type and their address (see Attachment B).

The business rules have generally been drafted and coded with a view to selecting as the next
action the one that is:
 available (i.e., permitted at the stage and time of the process under the legislation)
 likely to be successful in recovering the debt in a timely manner
 easy to administer and unlikely to incur significant cost for Revenue NSW.
Customers who are pooled into a category for a particular type of civil enforcement action (such as
a Garnishee Order) are then placed in the relevant queue for that action.
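
The pooling of customers into per-action queues can be sketched as follows. The grouping rule shown (`toy_rule`) and the queue names are hypothetical; the actual categorisation rules are described at Attachment B.

```python
from collections import defaultdict

def pool_customers(files, classify):
    """Group files into per-action queues using a classification rule."""
    queues = defaultdict(list)
    for f in files:
        action = classify(f)          # e.g. "GO_BANK", "GO_EMPLOYER", ...
        if action is not None:
            queues[action].append(f)
    return queues

# A toy rule (illustrative only): adults with known addresses go to the bank
# Garnishee Order queue; everyone else is queued for data matching.
def toy_rule(f):
    if 18 <= f["age"] < 70 and f["address_known"]:
        return "GO_BANK"
    return "DATA_MATCH"

queues = pool_customers(
    [{"age": 30, "address_known": True}, {"age": 75, "address_known": True}],
    toy_rule,
)
```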

Step 9 – Garnishee Orders made to the big four banks

The relevant enforcement action is then attempted using one of the following approaches,
depending on the particular type of enforcement action:
 ‘straight-through processing’ – where a particular action is done without the need for
manual intervention, although this does not necessarily cover an entire ‘end-to-end’ process.
 an automated workflow – where an entire ‘end-to-end’ function is undertaken wholly by an
information system, such as selecting customers for a garnishee order, issuing the garnishee
order and receiving the response back from a bank.
 a manual workflow – where one or more components of a particular process, action or
transaction require human intervention.
In the case of Garnishee Orders, Revenue NSW has in place direct electronic interfaces with the four
major banks – Commonwealth Bank of Australia (CBA), Australia and New Zealand Banking Group
(ANZ), Westpac Banking Corporation (WBC) and National Australia Bank (NAB). This allows it to adopt a
straight-through processing approach with those banks.
Accordingly, for customers in a GO queue for one of those banks, Revenue NSW serves the
Garnishee Order on the bank electronically. The orders are transmitted as an electronic file on a
nightly basis for bulk processing. The file contains a list of names of fine defaulters and the following
information in relation to each:
 Date of birth
 Full Name
 Address
 GO Number
 GO Amount
However, the capacity of each bank to accept and process Garnishee Orders at any time is limited.
This means that, typically, more fine defaulters are queued to be targeted for a Garnishee Order at
any time than can be processed on any given day. Where a file is queued for a Garnishee Order but
the order is not able to be issued on a given day, the file is held over in the queue to be re-assessed
by the DPR the following working day. The next day’s reassessment is undertaken afresh in
accordance with Step 7.

Step 10 – Attempted compliance by the big four banks

Once a Garnishee Order is made, the financial institution is required to comply with the order.
An exception is where the relevant account is one into which certain Commonwealth support
payments have been made. For example, under section 62 of the Social Security (Administration)
Act 1999 (Cth) (SSAA) a retrospective protected amount formula must be applied when a court

order in the nature of a Garnishee Order comes into effect, and social security payments have
been made into an account. Under the SSAA, the garnishee order does not apply to the saved
amount (if any) in an account. Similar provisions apply in relation to Commonwealth family
assistance payments.
Revenue NSW takes the view that it is the responsibility of the banks to ensure that there is
compliance with any relevant Commonwealth legislation. Revenue NSW takes no action to avoid
issuing a Garnishee Order that would, if fully actioned, have the effect of contravening the
Commonwealth legislation, and it does not otherwise take steps to verify that a contravention has
not occurred. Again, these are considered to be matters for the financial institutions to address.
Each financial institution is responsible for matching the Garnishee Order against its own customer
information.5 The banks also decide how to process the orders and the extent to which any of that
process is automated. It is understood that the process is almost entirely automated within all of
the four major banks.
If an account held by the relevant fine defaulter is identified by the bank, and if sufficient funds
(excluding any saved amount referred to above) are available in the account, then the amount of
the outstanding debt is transferred to Revenue NSW. If there are insufficient funds in the account to
satisfy the outstanding debt, then the entire amount held in the account is transferred (excluding
any saved amount). In general, this means that, where an outstanding debt is equal to or higher
than the balance of an account, a Garnishee Order results in a nil balance in that account.
If an account is located by the relevant bank, but there are no funds available at the time of the
Garnishee Order, the bank returns an ‘insufficient funds’ notification to Revenue NSW.
If no active account can be located for the relevant customer, the bank returns a ‘no account held’
or ‘account closed’ notification to Revenue NSW.
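
The bank-side transfer logic described in this step can be expressed, in simplified form, as follows. The function name is illustrative, and the ‘saved amount’ calculation is reduced to a single figure (the actual statutory formula is applied by the banks themselves).

```python
def amount_transferred(debt: float, balance: float, saved_amount: float = 0.0) -> float:
    """Funds remitted to Revenue NSW under a Garnishee Order (a sketch only)."""
    available = max(0.0, balance - saved_amount)  # any saved amount is excluded
    return min(debt, available)

# Where the outstanding debt equals or exceeds the (unprotected) balance,
# the order results in a nil balance in the account:
assert amount_transferred(debt=800.0, balance=500.0) == 500.0
assert amount_transferred(debt=300.0, balance=500.0) == 300.0
```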

Step 11 – Re-attempts if account identified, but less than full recovery

If, at Step 10, a bank has returned an ‘insufficient funds’ notification or only a partial remittance
of funds from a fine defaulter’s account, the DPR business rules apply a 14 day waiting period
before a follow-up Garnishee Order can be issued to the same bank. Up to three re-attempts can be
made to the same bank before the customer file is re-assessed for alternative enforcement
action (as per Step 7), such as a Garnishee Order to another of the four major banks, or to another
financial institution.
Under the DPR business rules, if an initial Garnishee Order results in an ‘insufficient funds’
notification or only partial recovery, the maximum number of further Garnishee Orders that can
be issued in respect of the fine defaulter through ‘straight-through processing’ to the big four banks
in the following 12-month period is limited to sixteen. However, additional Garnishee Orders can be
issued manually by staff to those or other banks.
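
The re-attempt limits described in this step can be sketched as a single eligibility check. The parameter names are assumptions; the waiting period and limits are those stated above.

```python
WAIT_DAYS = 14         # waiting period after 'insufficient funds' or partial recovery
MAX_PER_BANK = 3       # re-attempts at the same bank before re-assessment
MAX_STP_PER_YEAR = 16  # straight-through Garnishee Orders to the big four in 12 months

def may_reissue(days_since_last: int, reattempts_at_bank: int,
                stp_orders_past_year: int) -> bool:
    """Whether a follow-up Garnishee Order may issue via straight-through processing."""
    return (days_since_last >= WAIT_DAYS
            and reattempts_at_bank < MAX_PER_BANK
            and stp_orders_past_year < MAX_STP_PER_YEAR)
```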

Step 12 – Re-assessment for enforcement action

If a fine debt is not fully recovered by step 11 above, the customer is re-assessed by the DPR for
enforcement action in the same way as described at step 7 above.
However, if a bank returns a ‘no account held’ or ‘account closed’ notification, the DPR business
rules provide that a further Garnishee Order can be re-issued to that bank in respect of that
particular customer a maximum of once every three months (in the case of CBA and ANZ) or once
every six months (in the case of WBC, NAB and the non-major banks). This limit is in place to avoid
unnecessary administrative burden being placed on the banks.

If an account for a fine defaulter is not located at one of the four major banks, the DPR assesses
whether alternative enforcement action should be taken (as per Step 7), including an attempted
Garnishee Order directed to another of the four major banks, or to another financial institution.
Where Revenue NSW does not have an agreement with a bank or credit union to issue a Garnishee
Order electronically, a paper Garnishee Order may be issued. Unlike the ‘batch’ processing
undertaken with the big four banks, these orders are served manually on the relevant institution on
a customer-by-customer basis. They are processed manually by the institution, and generally this
includes remitting funds back for manual processing by Revenue NSW as well. Even in those cases,
however, the DPR is still the mechanism for selecting whether a Garnishee Order should be issued.
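
The ‘no account held’ cooldown rules described above can be sketched as follows. The cooldowns are approximated in days (the document states three or six months per bank); the function and dictionary names are illustrative.

```python
from datetime import date

# Approximate cooldowns after a 'no account held' / 'account closed' response.
COOLDOWN_DAYS = {"CBA": 91, "ANZ": 91, "WBC": 182, "NAB": 182}

def may_retry_bank(bank: str, last_no_account: date, today: date) -> bool:
    """Whether a Garnishee Order may be re-issued to a bank for the same customer."""
    cooldown = COOLDOWN_DAYS.get(bank, 182)  # non-major banks: six months
    return (today - last_no_account).days >= cooldown
```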

Notification to fine defaulters
50. Revenue NSW does not provide specific notice to the fine defaulter before the making of a Garnishee
Order, apart from earlier notices advising that this is one of the enforcement actions that may be taken
if the fine defaulter does not pay or otherwise engage with Revenue NSW. This means that a fine defaulter
will typically first become aware that a Garnishee Order has been successful when they notice funds
are missing from their bank account.
51. Revenue NSW does not provide any notice or reasons to the fine defaulter after the making of a
Garnishee Order, including after the successful recovery of a debt under a Garnishee Order.
52. Penalty reminder notices and penalty notice enforcement orders issued to fine defaulters include
specific information and a warning about the further enforcement actions that may be taken if there
is a failure to pay or take action.

Enforcement fees
53. Under the Fines Regulation, an enforcement fee of $65 may be applied by Revenue NSW once every six
months for Garnishee Order(s) issued during that period. Enforcement fees may also be applied for
the issuing of an enforcement order ($65) and applying RMS sanctions ($40).
54. Under the original version of the GO system, unless a fine defaulter had sought an internal review of
the original penalty notice, up to $170 in enforcement fees would be applied to a fine debt and
included in a Garnishee Order without any staff member having reviewed the matter. (See paragraph
[56] below, which notes changes made to the imposition of fees from late 2016.)

PART D: MODIFICATIONS TO THE GO SYSTEM
First modification – The introduction of a minimum protected amount
55. Following customer complaints and concerns raised by the NSW Ombudsman and others, in August
2016 Revenue NSW began applying a ‘minimum protected amount’ to bank-directed Garnishee
Orders.
56. That amount is currently $523.10 (indexed in line with CPI). Revenue NSW instructs banks that this
minimum balance must be left in any account that is otherwise subject to a Garnishee Order issued
by Revenue NSW.
57. The minimum protected amount is consistent with the minimum protected amount for court-issued
garnishee orders directed to employers and, since June 2018 court-issued garnishee orders directed
to banks, under the Civil Procedure Act.6
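
The effect of the minimum protected amount on a bank-directed Garnishee Order can be illustrated as follows. The function name is hypothetical; the $523.10 figure is the current indexed amount stated above.

```python
MIN_PROTECTED = 523.10  # current minimum protected amount (indexed to CPI)

def go_recovery(debt: float, balance: float) -> float:
    """Amount a bank may remit once the minimum protected amount applies (a sketch)."""
    available = max(0.0, balance - MIN_PROTECTED)  # protected balance stays in the account
    return round(min(debt, available), 2)

# An account holding $600.00 against a $1,000.00 debt yields only $76.90,
# leaving the protected $523.10 in place:
assert go_recovery(debt=1000.0, balance=600.0) == 76.90
```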
58. Additionally, at around the same time, Revenue NSW implemented a new policy providing that the
enforcement fee of $65 for Garnishee Orders is only to be applied once per customer, and only in
cases where the total debt exceeds $400.
59. This did not involve any change to a published policy, however it was reflected in the relevant
business rules maintained by Revenue NSW.

Second modification – The exclusion of Vulnerable Persons using a machine learning model
60. In September 2018 Revenue NSW agreed with the NSW Ombudsman that it should take steps to
exclude the making of Garnishee Orders in respect of Vulnerable Persons.
61. Revenue NSW advises that it had found that collection success rates were lower if the fine defaulter
was a Vulnerable Person. Further, when a Garnishee Order was issued on a Vulnerable Person there
was a greater likelihood that it would result in a request for a refund, the processing of which
imposed additional administrative costs for Revenue NSW. Consequently, Revenue NSW advises that
the exclusion of Vulnerable Persons assists Revenue NSW to better target its resources.
62. Revenue NSW did this by implementing a new machine learning model within the DPR with the
intention of identifying and excluding Vulnerable Persons from the application of Garnishee Order
processes.
63. The model seeks to find relationships between different variables and to make a prediction about the
likelihood of a person being Vulnerable.
64. Revenue NSW has around 4 million customer records, of which approximately 60,000 customers are
known to be Vulnerable Persons. The model was developed using machine learning algorithms that
compared all customer records with the 60,000 people already identified as Vulnerable in the system.
Overall, the model was trained to identify if a person was Vulnerable using 250,000 customer files,
and having regard to a list of potential variables. Those variables include:
 age
 amount of outstanding debt
 success of previous garnishee orders issued
 number of enforcement orders issued
 previous payment plans
 frequency of contact
 type of offence
 previous long-term hardship stay on enforcement
 data from the Office of the Sheriff

 known incarceration history
 previous Centrepay7 arrangements.
65. Revenue NSW also included externally-sourced data in the model, including the addresses of all
Family and Community Services (FACS) owned properties and the Australian Bureau of Statistics
socio-economic scores based on geographical location. This allowed the model to ‘learn’, for
example, whether there was a correlation between persons being vulnerable and the fact that their
address matched the address of FACS-owned property. If there was such a correlation, then the
model could use that correlation to predict that a fine defaulter whose address is the same as a
FACS-owned property is more likely to be a Vulnerable Person.
66. The model’s output is a ‘prediction’ as to the likelihood, expressed as a percentage, that the person is
vulnerable.
67. If the machine learning model makes a prediction of 51 per cent and above, then the person is
classified as a Vulnerable Person. Less than 5 per cent of all Revenue NSW customer files are
predicted by the model to fall within this vulnerable category.
68. Revenue NSW advises that the machine learning model demonstrated a 96 per cent accuracy rate in
identifying whether a person is a Vulnerable Person using this 51 per cent probability threshold.
69. Since the establishment of this machine learning model, the business rules of the DPR provide that a
Garnishee Order will not be issued if the model predicts a 35 per cent or more likelihood of a fine
defaulter being a Vulnerable Person.
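
The two thresholds described at paragraphs 67 and 69 can be expressed as a simple gate (a sketch only; the underlying model and its score are not reproduced here):

```python
CLASSIFY_THRESHOLD = 0.51  # at or above: classified as a Vulnerable Person
EXCLUDE_THRESHOLD = 0.35   # at or above: no Garnishee Order is issued

def assess(score: float) -> dict:
    """Apply the classification and exclusion thresholds to a model score."""
    return {
        "classified_vulnerable": score >= CLASSIFY_THRESHOLD,
        "excluded_from_go": score >= EXCLUDE_THRESHOLD,
    }

# A score of 0.40 falls below the classification threshold but is still
# high enough to exclude the customer from the Garnishee Order process.
```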
70. In the month of November 2018, following the adoption of the Vulnerable Person module, Revenue
NSW quarantined approximately 2,800 fine defaulters with up to $27 million in outstanding debt as
ineligible to be considered for a Garnishee Order. This meant that a Garnishee Order would not be
issued to those fine defaulters due to the likelihood they were Vulnerable and that a Garnishee Order
would cause hardship.
71. Customers who return a prediction of Vulnerability are removed by the DPR from the ‘GO’ (Garnishee
Order) process (as well as some other processes) and are instead diverted to a special tier within the
DPR. Actions applicable to this tier may include:
 phone calls, SMS messaging and mail out campaigns by the Hardship Team
 referral to the Interactive Voice Response (IVR) system for manual contact so they can be
routed to the Hardship Team.
The Hardship Team can put the customer in contact with WDO sponsors and/or can discuss other options
for debt resolution, such as low income payment plans or write-off of the debt, if appropriate.
72. The adoption of the Vulnerable Person Tool did not involve a change to any published policy and/or
any other public communication.

Third modification – A ‘human stop/go’ process step
73. In March 2019, Revenue NSW introduced an additional manual step in the process of issuing
Garnishee Orders.
74. Under the Current Version, before the electronic file is transmitted to the relevant banks for
action (that is, between Step 8 and Step 9 above), a designated staff member of Revenue NSW is
required to ‘authorise’ the issuing of the proposed Garnishee Order.
75. This change was made in response to questions raised by the NSW Ombudsman as to the legality of
Revenue NSW’s GO system, and in particular whether that system was consistent with the statutory
conferral of discretionary powers on the Commissioner under the Fines Act.

76. The manner in which this additional step is being applied in practice is as follows:

Step 8A – ‘Human stop/go’ (Staff member authorisation)

Once the DPR has selected the list of fine defaulters to be ‘pooled’ for the purpose of bulk processing
of Garnishee Orders, a ‘Garnishee Order Issue Check Summary Report’ is produced. An example of
such a report is set out in Attachment C.
A single consolidated report is prepared for all files selected for Garnishee Order. The example in
Attachment C shows a report for a single day (23 March 2020) in which 7,386 fine defaulters had
been selected by the DPR for the issuance of a Garnishee Order.
The report is accompanied by a spreadsheet of the raw data from all of the relevant files (not
included in Attachment C for privacy reasons).
The report sets out by way of red/green ‘traffic lights’ whether the files meet eleven ‘inclusion
criteria’ and do not meet sixteen ‘exclusion criteria’. These criteria reflect Revenue NSW’s business
rules, and include some criteria prescribed by legislation.
The inclusion criteria include things like: the age of the fine defaulter being over 18 and less than 70.
The exclusion criteria include things like: the customer is deceased, bankrupt or in custody. Another
exclusion criterion is: the machine learning model has reported a vulnerability score of more than 35
per cent.
Because these criteria are already coded into the DPR business rules, the Report should ordinarily
produce only ‘green traffic lights’.
The only circumstance in which a ‘red traffic light’ could appear would be if:
 there was some error in the coding of the business rules within the DPR (such that the DPR was
not properly applying an exclusion criterion), or
 there was an inconsistency between the business rules and the criteria for the Report.
If a traffic light does show red, the staff member may review any file that has been flagged and
exclude it from the Garnishee Order file.
In addition, if the Report generates a red traffic light, the file is sent to be reviewed by Revenue
NSW’s analytics team, as it may indicate a defect either with DPR coded business rules or with the
Report itself. A senior officer must then confirm that the impacted customer is excluded from the
daily file before approving.
If all traffic lights are green (or once any red traffic lighted files have been manually removed) the
staff member approves the Garnishee Orders and the files are transmitted to the relevant banks.
In the example report, the red traffic light relates to a company file which, although suitable for a
Garnishee Order, was blocked from the automated file. If a Garnishee Order were to be issued, it
would be manually generated by the Targeted Team. In practice, the case was removed from the
file and referred to the appropriate team to consider manually issuing a Garnishee Order.
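
The traffic-light check described in Step 8A can be sketched as follows. The criteria shown are limited to the examples given above (the real Report applies eleven inclusion and sixteen exclusion criteria), and the attribute names are assumptions.

```python
def traffic_light(record: dict) -> str:
    """Return 'GREEN' only if every inclusion criterion is met and no exclusion applies."""
    inclusions = [18 <= record["age"] < 70]
    exclusions = [record["deceased"], record["bankrupt"], record["in_custody"],
                  record["vulnerability_score"] > 0.35]
    return "GREEN" if all(inclusions) and not any(exclusions) else "RED"

def approve_batch(records: list) -> list:
    """Red-flagged files are removed before the batch is approved and transmitted."""
    return [r for r in records if traffic_light(r) == "GREEN"]
```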

PART E: IMPACT AND EFFECTIVENESS OF THE GO SYSTEM
Debt recovery under the GO system
77. The use of the GO system has resulted in a significant increase in the number of Garnishee Orders
issued by Revenue NSW.
78. In the 2010-2011 financial year, Revenue NSW issued 6,905 garnishee orders. In the 2018-2019
financial year it issued more than 1.6 million.
79. However, as noted above, the GO system typically operates with an iterative process (see Steps 10
and following above). That is, if Revenue NSW wishes to issue a Garnishee Order in respect of a fine
defaulter, it will generally first issue a Garnishee Order to one of the big four banks. The fine
defaulter might not hold an account with that bank. If the first Garnishee Order is unsuccessful in
recovering the debt, then further Garnishee Orders may be issued to different financial institutions.
This may continue successively until an account held by the fine defaulter is identified.
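The iterative process described in paragraph 79 can be sketched as a simple loop. The bank names aside, everything here (the lookup structure, the function names) is an assumed simplification, not Revenue NSW’s actual implementation.

```python
# Sketch of the iterative Garnishee Order process at paragraph 79: orders are
# issued bank by bank until an account held by the fine defaulter is found.
# `account_lookup` is a hypothetical stand-in for each bank's response.

BIG_FOUR = ["CBA", "ANZ", "Westpac", "NAB"]

def recover_debt(debt, account_lookup):
    """Try each bank in turn until an account is found or all are exhausted.
    `account_lookup` maps bank -> available funds (None if no account held).
    Returns (orders_issued, amount_recovered)."""
    orders_issued = 0
    for bank in BIG_FOUR:
        orders_issued += 1
        funds = account_lookup.get(bank)  # bank's response to the order
        if funds is not None:
            return orders_issued, min(debt, funds)
    return orders_issued, 0  # no account found at any of the big four
```

The sketch illustrates why the number of orders issued greatly exceeds the number of distinct fine defaulters: a single defaulter can attract several orders before an active account is located.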
80. For this reason, the number of Garnishee Orders issued in any period does not correspond with the
number of fine defaulters whose active accounts are the subject of such orders. The ~1.6 million
Garnishee Orders made by Revenue NSW in 2018-2019 applied to around 237,548 distinct customers.
81. Nevertheless, it is clearly the case that Garnishee Orders have become more prevalent over the past
decade through the use of the GO system. In 2012-2013, Revenue NSW recovered $10,126,428.15 by
way of Garnishee Orders. In 2019-2020 it recovered $11,529,744.39. The average recovery per
Garnishee Order is around $500.
82. Revenue NSW now issues significantly higher numbers of Garnishee Orders compared to other civil
sanctions available under the Fines Act. This reflects the fact that the business rules in the DPR have
been coded to prioritise Garnishee Orders, and Garnishee Orders directed to the big four banks in
particular, for selection as a preferred enforcement action.
83. Reasons for this include that Garnishee Orders issued to the big four banks tend to be a successful
means of recovering fine debt; Garnishee Orders to those banks are, through straight-through
processing, very cheap to administer; and they allow for an iterative approach to be taken to identify
an account held by the relevant fine defaulter if their account details are not already known.
84. Revenue NSW applied the following civil sanctions for the 2019-2020 financial year:
Sanction Number Attempted
Direction to RMS to take enforcement action 401,775
Bank garnishee order 1,069,597
Employer garnishee order 8,991
External debt collection referral 19,868
Property seizure order 12,826
Examination Notices 130,999
Charges on land ~100
Community service orders Nil
Imprisonment Nil


85. The table below shows the number of requests for refunds of Garnishee Orders issued in
each year since 2012:

Financial Year # Refund Requests
2012-2013 313
2013-2014 794
2014-2015 1236
2015-2016 1963
2016-2017 870
2017-2018 677
2018-2019 557
2019-2020 431

86. The visualisation below depicts how refund numbers have fallen significantly since the
introduction of the protected amount in 2016.


Attachment A: Revenue NSW Delegation Instruments

Not attached to this report.

Attachment B: Revenue NSW Debt Profile Report

This attachment describes, in lay terms, the way in which Revenue NSW’s DPR (Debt Profile Report)
works in terms of making the ‘selection’ of a Garnishee Order as the appropriate enforcement action
for a particular fine defaulter file.

1. The DPR captures over 120 individual data points about a fine defaulter from the FES. This
includes but is not limited to: the outstanding balance, fine defaulter age, debt age, debt type,
enforcement action already conducted (and its results), fine defaulter contact information and
data matching results.

2. Using this data, the DPR sorts the fine defaulters into ‘tiers’ within the DPR. Each tier is associated
with a different next action to be taken in respect of the fine defaulter.

3. The tiers themselves are generally grouped into one of the following six categories:

(a) Time to Pay
The fine defaulter is actively repaying the outstanding debt via an instalment plan.

(b) Collections Paused
The fine defaulter has been identified as ineligible for enforcement action at the present time, for
example, because the fine defaulter has been identified as a juvenile, has their financial affairs managed
by the NSW Trustee and Guardian, is deceased or is in custody.

(c) Remedial Action
The fine defaulter has been identified in a tier that requires manual follow-up by a Revenue NSW staff
member, for example due to data quality issues or because the file is the subject of a review. An
example of this would be where a Transport for NSW data match is returned as inconclusive, requiring
a person to investigate the file to determine the correct identification characteristics.

(d) Queued For Collections Process
The fine defaulter has been identified as eligible for a particular enforcement action; however, only a
limited number of actions of that type can be issued each day, and the fine defaulter has been queued
for the issue of that sanction type.

(e) In Collections Process
The outstanding debt on the fine defaulter record is currently subject to an enforcement process, for
example, there is an active bank garnishee order, a recently issued enforcement order, or a recently
applied RMS sanction.

(f) Write Off Consideration
Enforcement action is otherwise not feasible, for example because only a small balance of debt remains,
the client resides interstate (therefore enforcement options are limited) or the fine defaulter record has
been subject to repeated enforcement action and it has been unsuccessful in recovery of the full debt.


4. The placement of a fine defaulter in a tier is undertaken on the basis of the following:
 Eligibility for the relevant sanction
Algorithms, based on simple business rules, identify which fine defaulters meet relevant inclusion
criteria (and de-select fine defaulters who meet other exclusion criteria) for a particular sanction, and
who are therefore considered ‘eligible’ for that sanction.
 Potential success factor
Based on historical evidence of ‘like’ fine defaulters, the DPR makes an assessment of the likelihood of
a particular action being successful against the fine defaulter. In particular, the DPR has been configured to
apply an algorithm that utilises historical data stored within FES to determine a ‘potential success factor’
for each fine defaulter and each sanction for which they are eligible. This algorithm was developed
following a review of previous enforcement actions undertaken over a period of 12 months, which allows
the fine defaulter to be matched to a pool of ‘like’ fine defaulters who had enforcement action
undertaken. (Analysis undertaken by Revenue NSW identifies several factors that contribute to
determining the potential success of a sanction; these include the age of the fine defaulter, the type of
debt, recidivism of the fine defaulter, amount outstanding, previous instalment plans, previous
enforcement actions, address information and contact patterns.) This is a rules-based algorithm;
however, it is dynamic in that the algorithm is able to adjust as differences in the data are detected.

 Priority in the queue
The number of fine defaulters already queued for an enforcement action is taken into account. For
example, a fine defaulter’s file may be eligible for a Garnishee Order but if there is already a long queue
of proposed Garnishee Orders, and this particular fine defaulter’s file would have a low priority in that
queue, then it may be streamed into another enforcement action.
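In general terms, the three placement factors above might combine as in the following sketch. It is purely illustrative: the scoring, the queue cap and the function signature are invented for the example and do not reflect the DPR’s actual configuration.

```python
# Illustrative sketch of how the three tier-placement factors could combine:
# sanction eligibility, potential success factor, and queue congestion.
# All thresholds and values here are invented assumptions.

def place_in_tier(eligible, success_factor, queue_length, max_queue=1000):
    """Return the sanction selected for this fine defaulter, or None.

    eligible       -- set of sanctions whose inclusion/exclusion rules are met
    success_factor -- dict: sanction -> estimated likelihood of success (0..1),
                      derived from historical outcomes of 'like' defaulters
    queue_length   -- dict: sanction -> number of files already queued
    """
    candidates = []
    for sanction in eligible:
        if queue_length.get(sanction, 0) >= max_queue:
            continue  # queue full: stream the file to another action
        candidates.append((success_factor.get(sanction, 0.0), sanction))
    if not candidates:
        return None
    # pick the eligible, non-congested sanction most likely to succeed
    return max(candidates)[1]
```

A design point the sketch makes concrete: because queue length is an input, the same fine defaulter can be streamed to a different sanction on a different day even if nothing about their file has changed.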

5. In general terms, the following is the basic order of priority of tiers showing which enforcement
methods are selected in the DPR. (However, this is subject to variation for some fine defaulters based
on their own individual circumstances having regard to the matters described in paragraph 4 above):
a. The issue of the enforcement order and attempt at an RMS sanction completed in the FES
b. Targeted bank Garnishee Order (that is, a bank Garnishee Order that is issued to a specific
bank because of a previously successful Garnishee Order at that bank in respect of the
relevant fine defaulter, or because a fine defaulter’s bank details are known)
c. Employer garnishee order (if employer details known)
d. Bank Garnishee Order
e. Debt Partnerships Program
f. Examination Notice
g. Property Seizure Order.

6. Although the above suggests a linear process, the DPR applies its business rules against all fine
defaulters on a daily basis. Therefore, it is possible that a fine defaulter could return a ‘lower’ tier
allocation on day one but return a ‘higher’ tier on day two because of data changes within FES. For
example, if a fine defaulter’s file does not contain a date of birth then that fine defaulter will be
ineligible for a Garnishee Order to be issued (as the DPR cannot verify that the fine defaulter is not
within an excluded category, i.e., those under the age of 18). Therefore it will ‘pass over’ all of the
Garnishee Order tiers for that fine defaulter. However, if a date of birth is subsequently found and
entered into the FES, that fine defaulter may be allocated to a Garnishee Order tier based on this
data change.
7. The DPR executes over 130 individual business rules to determine how a fine defaulter should be
treated in the enforcement lifecycle.


8. Fine defaulters are allocated to a Garnishee Order tier based on the following general rules:
a. The fine defaulter has not been identified in a higher priority tier
b. The fine defaulter has at least one overdue enforcement order
c. The fine defaulter’s total overdue balance is $20 or greater
d. The fine defaulter has at least one enforcement order issued in the previous 7 years
e. The fine defaulter had all outstanding enforcement orders issued at least 38 days ago
f. The fine defaulter has not contacted Revenue NSW in the previous 14 days
g. The fine defaulter has not made a partial payment to Revenue NSW in the previous 14 days
h. The fine defaulter has not had a RMS sanction applied in the previous 14 days
i. The fine defaulter is aged between 18 and 70 (inclusive)
j. The fine defaulter has not had a letter advising the customer of a likely referral to an external
debt collector (debt partner) issued in the previous 40 days
k. If the fine defaulter has been previously referred to an external debt collection agency, that
referral must have been returned under an acceptable reason code i.e. not deceased
l. The fine defaulter has not already had previous Garnishee Orders issued to all major banks
that have previously been unsuccessful within a specific timeframe (CBA and ANZ in the last
three months and NAB and WBC in the last six months).
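A minimal sketch of how general rules such as those above might be expressed as a single predicate follows. Only a few of the twelve rules are shown, and the field names are hypothetical; the DPR executes over 130 such business rules in practice.

```python
from datetime import date, timedelta

# Sketch of a subset of the general Garnishee Order eligibility rules
# (rules b, c, f and i above). Field names are hypothetical assumptions.

def eligible_for_go(record, today=None):
    """Return True if the record passes this subset of the GO business rules."""
    today = today or date.today()
    return (
        record["overdue_enforcement_orders"] >= 1                  # rule (b)
        and record["total_overdue_balance"] >= 20                  # rule (c)
        and today - record["last_contact"] > timedelta(days=14)    # rule (f)
        and 18 <= record["age"] <= 70                              # rule (i)
    )
```

Because every rule is a conjunct, a single failed condition (for example, a missing or out-of-range date of birth) excludes the record from the Garnishee Order tiers entirely, consistent with the ‘pass over’ behaviour described at paragraph 6 of this attachment.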
9. Once the fine defaulter record passes the general GO business rules, the record is then prioritised and
placed in a queue with other fine defaulters in the same tier, for issue based on the fine defaulters’
individual circumstances. The priority is generally as follows (from highest to lowest):
a. A previous Garnishee Order was issued for this fine defaulter that identified an active account,
but returned only partial funds or insufficient funds
b. The fine defaulter recently defaulted on a Payment Plan arrangement
c. The fine defaulter’s bank details are known, which allows Revenue NSW to issue a targeted
Garnishee Order to that specific bank. (Bank details are obtained either voluntarily by the fine
defaulter or under some circumstances the financial institution can be identified if the fine
defaulter has made a previous payment to Revenue NSW)
d. The fine defaulter had recent debt re-activated from write off
e. All remaining fine defaulters are prioritised by the age of the debt, with the most recent given
the highest priority.
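The priority ordering in paragraph 9 can be sketched as a sort key. The flags and their encoding are assumptions introduced for illustration only.

```python
# Sketch of the queue prioritisation at paragraph 9. Each rule maps to a rank
# (lower rank = issued sooner); remaining files (rule e) are ordered by debt
# age, with the most recent debt given the highest priority. All record flags
# are hypothetical.

def priority_key(record):
    if record.get("previous_go_partial_funds"):    # rule (a)
        rank = 0
    elif record.get("recent_plan_default"):        # rule (b)
        rank = 1
    elif record.get("bank_details_known"):         # rule (c): targeted order
        rank = 2
    elif record.get("reactivated_from_write_off"): # rule (d)
        rank = 3
    else:                                          # rule (e): all remaining
        rank = 4
    # within rule (e), newer debt (smaller age) sorts first
    return (rank, record.get("debt_age_days", 0))

def build_queue(records):
    """Order the pooled records from highest to lowest priority."""
    return sorted(records, key=priority_key)
```

The tuple key mirrors the two-level ordering in the text: the lettered rules decide the rank, and debt age breaks ties only among the residual category.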


Attachment C: Garnishee Order Issue Check Summary Report Example
This attachment is an example ‘Garnishee Order Issue Check Summary Report’ showing a report for
23 March 2020 in which 7,386 fine defaulters had been selected by the DPR for the issuance of a
Garnishee Order.


3 Questions for Counsel – Revenue NSW’s use of automation
technologies in administrative decision-making
The NSW Ombudsman sought legal advice from Counsel (instructed by the Crown Solicitor’s Office) on
the following matters:

1. Was the process by which Garnishee Orders (GO) were issued:
a. in and around early 2016 using the Original Version of the GO system
b. from August 2016 following the First Modification to the GO system
c. from September 2018 following the Second Modification to the GO system
d. since March 2019 using the Current Version of the GO system
a lawfully permissible process for the making of such orders by the Commissioner8 in accordance
with section 73(1) of the Fines Act? If the answer to any of these is “no”, why not?

2. To the extent that the following questions are not answered in 1 above, please also answer:
a. What action must the Commissioner take before exercising his or her discretion under s
73(1) of the Fines Act to issue a Garnishee Order?
b. What action must the Commissioner take to satisfy himself or herself, for the purposes of
Fines Act s 71, that ‘civil enforcement action is preferable’ to enforcement under Fines Act
Part 4 Division 3?
c. In satisfying himself or herself of the matter referred to in s 71 and/or in exercising
discretion under s 73(1), what consideration or reliance may be given to the outputs of the
GO system? In particular, if the Commissioner may consider or rely upon the outputs of the
GO system then:
 must he or she nevertheless personally and actively consider those outputs in respect
of each particular proposed order and subsequently authorise a particular order to be
issued? If “yes” what “active consideration” is required?
or
 may he or she personally and actively consider those outputs in respect of a ‘batch’
of proposed orders and subsequently authorise that batch of orders to be issued?
If “yes” what “active consideration” is required?
or
 may he or she, in effect, pre-authorise the making of an order that is, in future,
subject to certain outputs of the GO system (without any further active consideration
or authorisation by him or her)?
d. Are there matters that must be considered by the Commissioner when deciding whether or
not to issue a Garnishee Order (mandatory considerations)? Were any of those mandatory
considerations not being considered under:
 the Original Version of the GO system,
 the Current Version of the GO system.
e. Are there any matters that may not be considered by the Commissioner (irrelevant
considerations) that were taken into account when a determination was made whether or
not to issue an order under:
 the Original Version of the GO system,
 the Current Version of the GO system.


f. Are any of the answers to the questions in 2 affected by the fact that a Garnishee Order
under s 73 operates as an order made by the Local Court under Part 8 of the Civil Procedure
Act 2005?
g. Does the fact that a Garnishee Order under s 73 operates as an order made by the Local
Court under Part 8 of the Civil Procedure Act 2005 mean that that order may be appealed
against or set aside by a Court in the same manner as enforcement action taken under Civil
Procedure Act Part 8?
h. Do any Constitutional issues arise in respect of the interaction or potential interaction
between the Fines Act 1996 s 73 (as it has been applied at any time using the GO system)
and relevant Commonwealth legislation, including the Social Security (Administration) Act
1999 s 62, the Bankruptcy Act 1966 or Income Tax Assessment Act 1936 (Cth), Income Tax
Assessment Act 1997 (Cth) or the Taxation Administration Act 1953 (Cth)?

3. If the answer to 1(d) above is “no”:
a. Are there any modifications that could be made to the GO system that would mean that the
process of issuing Garnishee Orders using that system would then be a lawfully permissible
process for the making of such orders by the Commissioner in accordance with section 73(1)
of the Fines Act? If the answer is “yes”, what would those modifications be?
b. Alternatively, could legislative amendments be made to the Fines Act to authorise the use of
the Current Version of the GO system such that the process by which Garnishee Orders are
issued using that system would be a lawfully permissible process for making such orders in
accordance with section 73(1) of the Fines Act (as amended)? If the answer is “yes”, what
amendments would be required?9


4 Legal Opinion of James Emmett SC and Myles Pulsford
The document beginning over the page is the joint opinion of James Emmett SC and Myles Pulsford,
instructed by the Crown Solicitor’s Office, 29 October 2020.

1 NSW Ombudsman, ‘Annual Report 2015-16’ (Report, October 2016) 62 <
https://www.ombo.nsw.gov.au/__data/assets/pdf_file/0005/38498/NSW-Ombudsman_Annual-Report_2015-16-plus-
errata.pdf>.
2 Indexed in line with CPI.
3 Under the Fines Act, an order served after 5 p.m. is taken to have been served on the next day that is not a Saturday,
Sunday or public holiday ss 73(6)(a)(b).
4 ‘Customer’ is the general term used by Revenue NSW to refer to all persons who interact with Revenue NSW including fine
defaulters. Under the Fines Act, a person does not become a ‘fine defaulter’ (as defined) in respect of an unpaid penalty
notice until they have been served with a penalty notice enforcement order (s 57(3)). In this document, the term ‘customer’
is used interchangeably with ‘fine defaulter’.
5 Complaints have been received by Revenue NSW and the NSW Ombudsman from time to time when a bank has identified
the wrong account to be garnisheed, such as from an account held by a person who shares the same full name as the fine
defaulter. Revenue NSW advises banks to ensure that they verify all provided data against account details (eg., names, date
of birth) before matching accounts to a Garnishee Order, but that the onus is ultimately on the bank to ensure that it
identifies and transmits funds only from an account to which the order relates.
6 s 118A of the Civil Procedure Act 2005 (NSW), commenced by proclamation on 30 June 2018. Under s 118A(1), ‘one or more
garnishee orders must not, in total, reduce the amount of the aggregate debt that is due and accruing from the garnishee
to the judgment debtor to less than $447.70.’ Under s 118A(2), the amount referred to in s 118A(1) is an ‘adjustable
amount’ for the purposes of Division 6 of Part 3 of the Workers Compensation Act 1987 (NSW).
7 A free and voluntary service to pay bills and expenses as regular deductions from Centrelink payments.
8 Reference to “Commissioner” includes reference to a person duly delegated to perform the functions of the Commissioner.
9 See, for example, s 6A of the Social Security (Administration) Act 1999 (Cth); s 6 of the Fines Enforcement and Debt Recovery Act 2017
(SA).

Legality of automated decision-making procedures

for the making of garnishee orders

Joint Opinion

1. Our instructing solicitors act for the NSW Ombudsman.

2. Our advice is sought to assist the NSW Ombudsman prepare a report on automated
decision-making. Our opinion is specifically sought in relation to:

a. The requirements for the lawful issue of a garnishee order under the Fines Act 1996
(NSW).

b. Whether the processes by which garnishee orders have been made by the
Commissioner of Fines Administration (Commissioner) under the Fines Act since
2016 have been lawful.

c. If the process by which the Commissioner presently makes garnishee orders is not
lawful, whether that defect could be cured by modification of the process or
legislative amendment.

3. In summary:

a. The Commissioner’s satisfaction that enforcement action is authorised under Pt 4
Div. 4 of the Fines Act (s 73(2)) is a subjective jurisdictional fact for the exercise of
the Commissioner’s power to make a garnishee order.

b. Section 73(1) of the Fines Act confers a discretionary power on the Commissioner,
although the extent of the discretion depends on the basis upon which enforcement
action is authorised under Pt 4, Div. 4 (see s 71). That discretionary power must be
exercised by the repository of the power or a person authorised or delegated the
function in accordance with ss 116A and 116B of the Fines Act. The power must be
exercised in accordance with the subject matter, scope and purpose of the Fines Act.
Any policy adopted to guide the discretion needs to be consistent with that Act.

c. Commonwealth laws, through s 109 of the Constitution (Cth), may, depending on the
relevant circumstances, operate to constrain the Commissioner’s ability to issue
garnishee orders.

d. To the extent that an individual, being the Commissioner, their delegate or an
authorised person, was not involved in the making of garnishee orders between
January 2016 and March 2019, the Commissioner’s process was not lawful because
the requisite discretion was not exercised by the repository of the power and orders
were not issued following satisfaction of the subjective jurisdictional fact.

e. While the interposition of an individual in the process for making garnishee orders
has resulted in orders being made by the repository of the power, it does not appear
to have addressed concerns about the establishment of the subjective jurisdictional
fact in s 73(2) or the manner in which the discretionary power is being exercised
under s 73(1).

f. The defects in the Commissioner’s process for the issue of garnishee orders could be
addressed either by modification of the process or by legislative amendment.

Background

4. We are instructed with a document titled “Statement of Facts – Revenue NSW’s System
for Issuing Garnishee Orders” (SOF), which we understand was prepared by the NSW
Ombudsman with input from Revenue NSW. For the purposes of this advice, we presume
that that document accurately represents the processes of the Commissioner and our advice
must be read with that limitation in mind.

5. Information technology has played a central role in the Commissioner’s process for making
garnishee orders since January 2016: SOF at [21]. There are two “core” information
technology applications in the process: the fines enforcement system (FES) and the debt
profile report (DPR): SOF at [28]. The FES comprises a database of records (referred to
as a system of records (SOR)) and transaction processing: SOF at [29] and [31]. The FES
records the garnishee order transaction, transmits the garnishee order, interprets the
response and processes applicable payments and other transactions: SOF at [31].

6. The DPR is a centralised business rule engine that takes the data in the FES, applies
business and prioritisation rules and generates customer profiling and activity selection:
SOF at [32]. The DPR is relevantly responsible for assessing fine defaulters for all
potentially applicable enforcement actions and selecting the next enforcement action: SOF
at [32]. We understand that the DPR has ordered tiers of enforcement actions and executes
over 130 individual business rules to determine how a fine defaulter should be treated: SOF,
Attachment B at [5] and [7]. We are instructed (see SOF, Attachment B at [7]) that fine
defaulters are allocated to a garnishee order tier “based on the following general rules”:

• The fine defaulter has not been identified in a higher priority tier.
• The fine defaulter has at least one overdue enforcement order.
• The fine defaulter’s total overdue balance is $20 or greater.
• The fine defaulter has at least one enforcement [order] issued in the previous 7 years.
• The fine defaulter had all outstanding enforcement orders issued at least 38 days
ago.
• The fine defaulter has not contacted Revenue NSW in the previous 14 days.
• The fine defaulter has not made a partial payment to Revenue NSW in the previous
14 days.
• The fine defaulter has not had a RMS sanction applied in the previous 14 days.
• The fine defaulter is aged between 18 and 70 (inclusive).
• The fine defaulter has not had a letter advising them of a likely referral to an external
debt collector issued in the previous 40 days.
• If the fine defaulter has been previously referred to an external debt collection
agency, that referral must have been returned under an acceptable reason code i.e.
not deceased.
• The fine defaulter has not already had previous garnishee orders issued to all major
banks that have previously been unsuccessful within a specific timeframe.
7. Once the next enforcement action is selected, a message is sent by the DPR to the FES
instructing the FES either to process the selected action (if an automated action) or to
notify staff of the need to undertake the selected action (if a manual action): SOF at [33].
No manual intervention is required for garnishee orders to the Commonwealth Bank, ANZ,
Westpac or NAB: see Step 9 below.

8. We are instructed (see SOF at [37], [49] and [76]) that the “standard process flow” from a fine
to a garnishee order is as follows:

Step 1: The fine is loaded into the SOR in the FES.
Step 2: The FES validates the referred details.
Step 3: An enforcement order is automatically generated by the FES.
Step 4: A data match is conducted between the FES and the system of Roads and Maritime
Services (RMS).
Step 5: An enforcement order is generated and transmitted by post or email, without any
staff involvement other than, in the case of post, as is involved in ordinary mail handling.
Step 6: Thirty-seven days after the enforcement order is printed, if the enforcement order
has not been closed (e.g. because it was paid or is under management), a request is
automatically issued by the FES to RMS to apply enforcement action under Pt 4, Div. 3.
Step 7: After 14 days, the DPR assesses whether any civil enforcement action should be
taken. See [6] above.
Step 8: In accordance with the process identified at [6] above, fine defaulters are pooled by
the DPR according to the next proposed enforcement action and fine defaulters are then
placed in the relevant queue, in accordance with rules of priority, for that action.
Step 9: Garnishee orders are made by FES, without human intervention, to one of the
Commonwealth Bank, ANZ, Westpac or the NAB. We understand that human
intervention may be required for garnishee orders to other recipients. If a file is queued for
a garnishee order but it is not able to be issued on a given day, the file is held over to be re-
assessed by the DPR the following working day.
Step 10: The garnishee order is complied with. The amount of the outstanding debt, to the
extent that there are funds in the fine defaulter’s account, is transferred to Revenue NSW.
The banks notify Revenue NSW if there are no funds available at the time or if the fine
defaulter does not hold an account with the bank.
Step 11: If no funds were available, or if only part of the debt was recovered, the DPR
applies a 14-day waiting period before a garnishee order may be re-issued to that bank.
Step 12: If the debt is not fully recovered after Step 11, the fine defaulter is re-assessed by
the DPR as set out at Step 7. The DPR places limits on re-issuing garnishee orders to a
bank if notified that the fine defaulter does not hold an account with that bank. If the fine
defaulter does not hold an account with the Commonwealth Bank, ANZ, Westpac or the
NAB, DPR assesses whether alternative enforcement action should be taken including
making garnishee orders to other banks and financial institutions.

9. We are instructed that there have been three alterations to this general process since 2016
(Original Version). First, since August 2016, a “minimum protected amount”, currently
in the sum of $523.10, was applied to garnishee orders made to banks (First Modification):
SOF at [55]-[56]. Banks are instructed that the “minimum protected amount” must be left
in any account subject to a garnishee order.

10. Second, since September 2018, a machine learning model within the DPR has been used
to identify and exclude “vulnerable persons” from the application of garnishee order
processes (Second Modification): SOF at [60] and [62].

11. Third, in March 2019, an additional manual step was added between Steps 8 and 9 (Current
Version). Before the electronic file is transmitted to the garnished banks for action, a
designated staff member of Revenue NSW is required to authorise the issuing of the
proposed garnishee order: SOF at [74]. After the pooling at Step 8, a Garnishee Order Issue
Check Summary Report (Check Summary Report) is produced: SOF at [76]. We
understand that the Check Summary Report is a single consolidated report for all the fine
defaulters selected for a garnishee order and that that report is accompanied by a
spreadsheet of the raw data from all relevant files: SOF at [76].

12. The Check Summary Report uses a traffic light system in respect of inclusionary and
exclusionary criteria: SOF at [76]. We are instructed that the criteria reflect the DPR’s
business rules and include some criteria prescribed by legislation: SOF at [76]. At least a
number of the criteria reflect the considerations referred to at [6] above that are used by
the DPR to select a garnishee order as the next enforcement action: SOF at [76]. We
understand that if the traffic lights are green, a staff member of Revenue NSW approves
the garnishee orders and the files are transmitted to the relevant banks: SOF at [76]. A red
traffic light results in the removal and review of the relevant fine defaulter’s file: SOF at [76].
For example, the Check Summary Report with which we have been briefed concerned
7,386 fine defaulters and we understand that, if all the traffic lights were green, the reviewer
would proceed to approve the making of the garnishee orders without giving any specific
consideration to the file of the underlying fine defaulters.

Relevant legislation

Fines Act

13. The Fines Act is an Act relating to fines and their enforcement: see the Long Title. There
are relevantly two species of fines under the Fines Act: fines imposed by courts (see Pt 2);
and penalty notices (see Pt 3). They may respectively be enforced by way of a “court fine
enforcement order” and a “penalty notice enforcement order”.

14. A court fine enforcement order is an order “made by the Commissioner for the
enforcement of a fine imposed by a court”: s 12. The Commissioner “may make” such an
order in the circumstances specified in s 14 of the Fines Act.

15. A penalty notice enforcement order is an order “made by the Commissioner for the
enforcement of the amount payable under a penalty notice”: s 40. The Commissioner
“may… make” such an order on application by an appropriate officer for a penalty notice
or on the Commissioner’s own initiative: s 41. The circumstances in which a penalty notice
enforcement order may be made are set out in s 42 of the Fines Act.

16. Part 4 of the Fines Act, headed “Fine enforcement action”, applies to court fine enforcement
orders and penalty notice enforcement orders. Such orders are referred to as “fine
enforcement order[s]” (s 57(2)) and the person liable to pay the fine is referred to as the
“fine defaulter”: s 57(3). Subject to limited exception, as soon as practicable after a fine
enforcement order is made, the Commissioner is required to serve notice of the order on
the fine defaulter: s 59(1). Part 4 provides a graduated series of enforcement options
including the suspension or cancellation of a fine defaulter’s driver licence or vehicle
registration (see Div. 3), civil enforcement (see Div. 4), community service (see Div. 5) and
imprisonment (see Div. 6). See the summary of the cascading enforcement procedure in s
58 of the Fines Act.

17. Divisions 3 and 4 are of present relevance. Section 65 provides that enforcement action “is
to be taken” against a fine defaulter under Div. 3 if they have not paid the fine as required
by the fine enforcement order notice or as arranged with the Commissioner. RMS is to take
that enforcement action when directed by the Commissioner to do so: s 65(2). Division 3
makes provision for the suspension or cancellation of a fine defaulter’s driver licence (see
s 66), the suspension of visitor driver privileges (see s 66A), and the cancellation of the
registration of motor vehicles of which the fine defaulter is a registered operator (see s 67).

18. Division 4 of Pt 4 of the Fines Act deals with civil enforcement, which encompasses
property seizure orders (see s 72), garnishee orders (see s 73) and the registration of charges
on land (see s 74). Enforcement action may be taken by one, all or any combination of
these means: s 71(2).

19. Section 71(1) provides that enforcement action “is to be taken” under Div. 4 if:

… the fine defaulter has not paid the fine as required by the notice of the fine
enforcement order served on the fine defaulter and—
(a) enforcement action is not available under Division 3 to suspend or cancel the
driver licence or vehicle registration of the fine defaulter, or
(b) the fine remains unpaid 21 days after the Commissioner directed Roads and
Maritime Services to take enforcement action under Division 3.
20. Section 71(1A), however, provides:

Enforcement action may be taken under this Division before or without taking action
under Division 3 if the fine defaulter is an individual and the Commissioner is
satisfied that civil enforcement action is preferable because, having regard to any
information known to the Commissioner about the personal circumstances of the
fine defaulter—
(a) enforcement action under Division 3 is unlikely to be successful in satisfying
the fine, or
(b) enforcement action under Division 3 would have an excessively detrimental
impact on the fine defaulter.
The Commissioner may decide that civil enforcement action is “preferable” in the absence
of, and without giving notice to or making inquiries of, the fine defaulter: s 71(1B).

21. Section 73 deals with civil enforcement by garnishee order. Section 73(1) relevantly
provides:

The Commissioner may make an order that all debts due and accruing to a fine
defaulter from any person specified in the order are attached for the purposes of
satisfying the fine payable by the fine defaulter (including an order expressed to be
for the continuous attachment of the wage or salary of the fine defaulter). ...

22. Section 73(2) provides that the Commissioner “may make a garnishee order only if satisfied
that enforcement action is authorised against the fine defaulter under” Div. 4.

23. The garnishee order may be “made in the absence of, and without notice to, the fine
defaulter”: s 73(3). The garnishee order “operates as a garnishee order made by the Local
Court under Pt 8 of the Civil Procedure Act 2005” (NSW): s 73(4). For that purpose, the
Commissioner is taken to be the judgment creditor: s 73(4)(a).

24. At the point in the fine enforcement process when the Commissioner makes a garnishee
order, the Commissioner is empowered to give fine defaulters time to pay the fine and to
write off the debt. The Fines Act provides that before a community correction or
community service order is issued under Div. 5, a fine defaulter may apply to the
Commissioner for time to pay a fine (s 100) or have the fine written off (s 101). The
Commissioner may allow further time to pay the fine and its payment in instalments

(s 100(2)-(3)) and may also write off, in whole or in part, the unpaid fine in the
circumstances specified in s 101(1A).

Civil Procedure Act

25. In Pt 8, Div. 3 of the Civil Procedure Act, s 117(1) provides that “[s]ubject to the uniform
rules, a garnishee order operates to attach, to the extent of the amount outstanding under
the judgment, all debts that are due or accruing from the garnishee to the judgment debtor
at the time of service of the order.”1 Section 117(2) provides that any amount standing to
the credit of the judgment debtor in a financial institution is taken to be a debt owed to the
judgment debtor by that institution. A payment under a garnishee order must be made in
accordance with, and to the judgment creditor specified in, the order: s 123(1) of the Civil
Procedure Act. Section 3 of the Civil Procedure Act defines a judgment debtor as the person by
whom a judgment debt is payable and a judgment creditor as the person to whom a judgment
debt is payable (ie the Commissioner). The “garnishee” is the person to whom a garnishee
order is addressed: s 102 of the Civil Procedure Act.

First question: Requirements for the lawful issue of a garnishee order

Pre-condition to the exercise of the power

26. By reason of ss 73(2) and 71 of the Fines Act, the Commissioner may only make a garnishee
order if the fine is unpaid and the Commissioner is “satisfied” of one of three matters:

a. Enforcement action is not available under Div. 3 to suspend or cancel the driver
licence or vehicle registration of the fine defaulter (s 71(1)(a)). This would occur
where the fine defaulter does not hold a driver licence, is not a visitor driver and is
not the registered operator of a vehicle: see the note to s 65; see also ss 66, 66A and
67.

b. Enforcement action has been taken under Div. 3 and the fine remains unpaid 21 days
after the Commissioner directed RMS to take the enforcement action: s 71(1)(b).

1 The Uniform Civil Procedure Rules 2005 (NSW) (UCPR) deal with garnishee orders in Pt 39, Div. 4.

c. If the fine defaulter is an individual, and without taking action under Div. 3, civil
enforcement action is “preferable” to enforcement action under Div. 3 because such
action:

i. is unlikely to be successful in satisfying the fine; or

ii. would have an excessively detrimental impact on the fine defaulter: s 71(1A).
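The structure of the preconditions summarised in sub-paragraphs (a) to (c) above can be restated schematically. The following sketch is illustrative only: the function and parameter names are our own, the boolean flags are assumed inputs standing in for the Commissioner's states of satisfaction, and the sketch is not a statement of the legal test.

```python
def civil_enforcement_authorised(fine_unpaid: bool,
                                 div3_available: bool,
                                 div3_directed_days_ago,
                                 is_individual: bool,
                                 div3_unlikely_to_succeed: bool,
                                 div3_excessively_detrimental: bool) -> bool:
    """Schematic restatement of ss 71(1) and 71(1A) of the Fines Act
    (illustrative only; parameter names are hypothetical)."""
    if not fine_unpaid:
        # No civil enforcement if the fine has been paid as required
        return False
    # s 71(1)(a): no Div 3 action available (no licence, visitor privileges
    # or registered vehicle to suspend or cancel)
    if not div3_available:
        return True
    # s 71(1)(b): fine still unpaid 21 days after the Commissioner directed
    # RMS to take Div 3 enforcement action
    if div3_directed_days_ago is not None and div3_directed_days_ago >= 21:
        return True
    # s 71(1A): for an individual, the Commissioner is satisfied civil
    # enforcement is preferable, having regard to personal circumstances
    if is_individual and (div3_unlikely_to_succeed
                          or div3_excessively_detrimental):
        return True
    return False

# Example: Div 3 action not available (s 71(1)(a)), so Div 4 is authorised
print(civil_enforcement_authorised(fine_unpaid=True, div3_available=False,
                                   div3_directed_days_ago=None,
                                   is_individual=False,
                                   div3_unlikely_to_succeed=False,
                                   div3_excessively_detrimental=False))  # True
```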

27. In explaining the insertion of s 71(1A) and (1B) by the Fines Amendment Bill 2017 (see
Hansard, Legislative Assembly, 14 February 2017 at 46-47), the Minister for Finance,
Services and Property said:

These amendments will allow the Office of State Revenue to better target different
fines enforcement actions in individual cases. At present, the first fines enforcement
action taken by the Office of State Revenue is to direct Roads and Maritime Services
[RMS] to impose licence, vehicle registration and business restrictions on the fine
defaulter. …

If available, these RMS sanctions must be attempted before the Office of State
Revenue can attempt any other enforcement action, such as a garnishee order. This
requirement limits the flexibility to take the most appropriate action, having regard
to the particular circumstances of the offender. In some cases, the imposition of RMS
sanctions such as driver licence suspension is unlikely to result in the recovery of
fines and may, in fact, be counterproductive in terms of an individual's employment
and access to services. This is particularly applicable to vulnerable members of the
community or people living in rural or remote locations.

The Office of State Revenue processes and systems have been designed to allow
identification of the most effective enforcement action for particular clients or
categories of clients. The bill therefore amends the Fines Act to provide the Office
of State Revenue with the discretion not to direct RMS to impose licence, vehicle
registration and business restrictions before civil sanctions are imposed, where the
Office of State Revenue is satisfied that, having regard to the individual's
circumstances, a better fine enforcement outcome would be achieved. This will allow
the Office of State Revenue to recover fines earlier than is currently permitted with
less negative impact on vulnerable members of the community.

28. The satisfaction of the Commissioner that enforcement action is authorised under Div. 4,
because of one of the matters in [26] above, is a condition precedent to the making of a
garnishee order under s 73(1) of the Fines Act and constitutes a jurisdictional fact for the
exercise of that power: see Minister for Immigration and Multicultural Affairs v Eshetu (1999) 197
CLR 611 at [130] per Gummow J; Minister for Immigration and Multicultural and Indigenous
Affairs v SGLB [2004] HCA 32; 78 ALJR 992 at [37] per Gummow and Hayne JJ.

Nature of power

29. While permissive statutory powers may, “in particular circumstances, be coupled with a
duty to exercise the power” (Cain v New South Wales Land and Housing Corporation (2014) 86
NSWLR 1 at [14] (citation omitted)), in our view, s 73 of the Fines Act confers a
discretionary power on the Commissioner.

30. Section 73(1) provides that the Commissioner “may” make a garnishee order. Subject to
contrary intention (s 5(2) of the Interpretation Act 1987 (NSW)), the use of that word
“indicates that the power may be exercised or not, at discretion”: s 9(1) of the Interpretation
Act. We do not think that any contrary intention can be discerned in the Fines Act in
circumstances where the Fines Act appears to use mandatory language where that is
intended: see the use of “is to be taken” in s 71(1).

31. Interpreting s 73(1) as conferring a discretion accords with the nature of power conferred
on, and available to, the Commissioner. A garnishee order is a compulsory exaction of
property held by third parties that is ordinarily ordered by a court; it would be surprising if
the making of such an order is compelled, without the scope for discretionary non-exercise,
by the Fines Act.2 This consideration is even more powerful when it is recognised that the
Commissioner’s powers to make orders requiring community service and imprisonment are
conferred in similar terms: “[t]he Commissioner may make…” – see s 79(1) and (3) and s
87(1).

32. The scope, however, of the Commissioner’s discretion under s 73(1) of the Fines Act is not
without some complexity. Given the provision’s mandatory language, in cases falling within
s 71(1) of the Fines Act, the Commissioner’s discretion would appear to be limited to
selecting whether a garnishee order is the civil enforcement action that should be imposed
rather than a property seizure order or a charge on land or, given s 71(2), is one of the civil
enforcement actions that should be imposed. See also s 58(1)(c) of the Fines Act (describing
Div. 4 as the part of the procedure where “civil action is taken to enforce the fine” (emphasis
added)).

33. Sections 100 and 101 (see [24] above), and potentially s 78(b) of the Fines Act, would appear
to provide the only bases for the Commissioner not to undertake any civil enforcement

2 It is noted that the issue of a garnishee order by a court is discretionary: see r 39.38 of the UCPR.

action in cases falling within s 71(1). Section 78(b) provides that enforcement action may
be taken under Div. 5 (community service) if “civil enforcement action has not been or is
unlikely to be successful in satisfying the fine” (emphasis added). While s 78(b) could be read as
indicating that the Commissioner is not compelled to take civil enforcement action (being
entitled to proceed directly to Div. 5 where action is unlikely to be successful), consistently
with the chapeau of s 71(1), it can be read as allowing the Commissioner to proceed under
Div. 5 where civil enforcement action has been taken but its outcome is not yet known and
is likely to be unsuccessful.

34. In contrast, in cases falling within s 71(1A), the Commissioner is not compelled to
undertake civil enforcement action. In such cases, the Commissioner “may” take
enforcement action under Div. 4: see s 71(1A) and 73(1). The Commissioner’s power is
clearly a true discretion.

Repository of the power

35. The Fines Act reposes the power to make a garnishee order in the Commissioner. Subject
to consideration of issues like agency (see Carltona Ltd v Commissioner of Works [1943] 2 All
ER 560) and delegation, to be validly exercised a discretionary power must be exercised by
the repository of that power. Justice Gibbs, for example, observed in Racecourse Co-operative
Sugar Association Ltd v Attorney-General (Qld) (1979) 142 CLR 460 at 481:

When a discretionary power is conferred by statute upon the Executive Government,
or indeed upon any public authority, the power can only be validly exercised by the
authority upon whom it was conferred. ...

See also Re Reference Under Section 11 of Ombudsman Act 1976 for an Advisory Opinion; ex parte
Director-General of Social Services (1979) 2 ALD 86 at 93.

36. For the reasons set out in the paragraphs that follow, the intention evident in the Fines Act
is that the power to make a garnishee order is to be exercised by an individual who is a
member of the Public Service, being either the Commissioner, their delegate appointed
under s 116A of the Fines Act, or a member of the Public Service authorised under s 116B.

37. Section 114 of the Fines Act provides that the Commissioner, who is to be employed in the
Public Service (s 113(2)), has the functions conferred or imposed on the Commissioner by
or under the Fines Act: s 114(1). A function includes a power, authority or duty (s 3(1)) and

would include the function of making a garnishee order under s 73. The reference in s
114(3)(b) to the Commissioner’s function “of administering… the taking of enforcement
action against fine defaulters” should not be understood as suggesting that the
Commissioner need only administer a process for enforcement action in circumstances
where that is inconsistent with the text employed by both s 73(1) and (2). It appears that
s 114(3)(b) is a holdover from when the State Debt Recovery Office was responsible for
issuing garnishee orders: see ss 73 and 114(2)(b) of the Fines Act prior to the Fines Amendment
Act 2013 (NSW).

38. If the Commissioner does not wish to exercise the power personally, the Commissioner
may utilise s 116A or s 116B. Section 116A(1) provides that “[t]he Commissioner may
delegate to any person employed in the Public Service any function of the Commissioner
under [the Fines Act], other than this power of delegation”.

39. Section 116B(1) also provides that “[a]n enforcement function may be exercised by the
Commissioner or by any person employed in the Public Service who is authorised by the
Commissioner to exercise that function”. Section 116B(4) defines an “enforcement
function” as a “function of the Commissioner of making or issuing an order or warrant
under this Act” and would include the making of a garnishee order pursuant to s 73 of the
Fines Act.

40. The need for the function to be exercised by a member of the Public Service is underlined
by s 116A(2), which identifies only three functions, of a procedural nature, which the
Commissioner may “delegate to any person” (emphasis added).

Considerations relevant to the discretion

41. Section 73(1) of the Fines Act does not specify what the Commissioner should, or should
not, consider in determining whether or not to exercise the power to make a garnishee
order.

42. The absence of express guidance about the considerations does not mean that the
discretion is unbounded. As French CJ explained in Minister for Immigration and Citizenship v
Li (2013) 249 CLR 332 (Li) at [23] (citations omitted):

Every statutory discretion is confined by the subject matter, scope and purpose of
the legislation under which it is conferred. Where the discretion is conferred on a

judicial or administrative officer without definition of the grounds upon which it is
to be exercised then:

“the real object of the legislature in such cases is to leave scope for the judicial
or other officer who is investigating the facts and considering the general
purpose of the enactment to give effect to his view of the justice of the case.”

That view, however, must be reached by a process of reasoning.

43. The scope of permissible considerations for the Commissioner under s 73(1) of the Fines
Act is, in our view, relatively broad.

44. While there is considerable scope for debate about this, when exercising the s 73(1)
discretion in respect of fine defaulters falling within s 71(1)(a) or (b), we consider that it
would be open to the Commissioner (or delegate or authorised decision-maker) to decide
that particular factual matters would not change their decision and therefore do not require
specific consideration. It would follow that it would not be necessary for the Commissioner
(or delegate or authorised decision-maker) to take the time to review the fine defaulter’s
file in relation to such matters.3 This would extend to considerations raised in applications
under ss 100 and 101 in the Fines Act, at least to the extent that they did not bear on the
selection of a garnishee order as the appropriate civil enforcement action vis-à-vis a
property seizure order or charge on land. The decision-maker would, of course, be entitled
to take such matters into account in exercising their discretion and, if so, would be expected
to review the file to consider such matters.

45. We note, however, that if the Commissioner proceeded in that fashion, there would be a
risk that the Commissioner might occasion a denial of procedural fairness. Unless clearly
displaced, procedural fairness is implied as a condition of the exercise of a statutory power:
see Minister for Immigration and Border Protection v SZSSJ (2016) 259 CLR 180 at [75] per
French CJ, Kiefel, Bell, Gageler, Keane, Nettle and Gordon JJ. While the obligation to
afford procedural fairness has been modified by s 73(3), it has not been abrogated.
Declining to consider all, or part, of a fine defaulter’s file would seem to us to carry the risk
that the Commissioner might make a garnishee order in circumstances which would be

3 A simple example might be the person’s age or even a person’s financial circumstances. These are matters that the
decision-maker could properly take into account, but it would also be open to the decision-maker to say to
themselves “I would exercise the discretion by making an order regardless of how old the person is or how
parlous their financial circumstances. I therefore do not need to inquire into those matters in order to take
them into account.”

considered procedurally unfair. Whether this was so would necessarily turn on the facts
of each case.4

46. In the case of fine defaulters falling within s 71(1A) of the Fines Act, in our view, it is not
open for the Commissioner to limit the inputs into the decision-making process in the same
fashion. The chapeau to s 71(1A) makes clear that fine defaulters fall within the purview of
Div. 4 based on an assessment of the Commissioner “having regard to any information
known to the Commissioner about the personal circumstances of the fine defaulter”: see
[20] and [26](c) above. While the same language is not employed in s 73, we do not consider
that, in exercising the discretion, the Commissioner could properly ignore, or put from the
Commissioner’s mind, considerations which the Commissioner was required to consider at
the anterior stage of exercising the function under s 71(1A) (ie, considerations arising from
those personal circumstances). The Commissioner may, however, decide to accord some
or all of such matters little or no weight in the exercise of the s 73(1) discretion.

47. Irrelevant considerations would be matters falling outside the proper scope of the
administration of the fines enforcement system and, in particular, civil enforcement action.
This might include, for example, the personal characteristics of the fine defaulter that are
unrelated to the fine and its enforcement under the Fines Act (eg the fine defaulter’s sex).

Policy

48. While the benefit of adopting policies to guide administrative discretion has been
recognised (see Plaintiff M64/2015 v Minister for Immigration and Border Protection (2015) 258
CLR 173 at [54]), the nature and application of such policies is constrained by
administrative law principles.

49. Any policy adopted must be consistent with the Fines Act: see Minister for Home Affairs v G
(2019) 266 FCR 569 at [58]; Drake v Minister for Immigration and Ethnic Affairs (No 2) (1979)
2 ALD 634 (Drake (No 2)) at 640. In Minister for Home Affairs v G, the Full Federal Court
(Murphy, Moshinsky and O’Callaghan JJ) explained at [58]-[59]:

It is established that an executive policy relating to the exercise of a statutory
discretion must be consistent with the relevant statute in the sense that: it must allow
the decision-maker to take into account relevant considerations; it must not require

4 This might include, for example, a garnishee order being made in circumstances that are inconsistent with any
representations made to the fine defaulter by Revenue NSW.

the decision-maker to take into account irrelevant considerations; and it must not
serve a purpose foreign to the purpose for which the discretionary power was created:
see Drake (No 2) at 640 per Brennan J; NEAT Domestic Trading Pty Ltd v AWB Ltd
(2003) 216 CLR 277 at [24] per Gleeson CJ; Cummeragunga at [159] per Jacobson J.

An executive policy will also be inconsistent with the relevant statute if it seeks to
preclude consideration of relevant arguments running counter to the policy that
might reasonably be advanced in particular cases: Drake (No 2) at 640. Thus, an
executive policy relating to the exercise of a statutory discretion must leave the
decision-maker “free to consider the unique circumstances of each case, and no part
of a lawful policy can determine in advance the decision which the [decision-maker]
will make in the circumstances of a given case”: Drake (No 2) at 641.

50. Care is required in applying these principles in different statutory contexts. Drake (No 2), at
640, was concerned with a Minister’s power to “determine whether or not to deport an
immigrant or alien whose criminal conviction exposes him to that jeopardy”.
Justice Brennan considered in Drake (No 2) that “[t]he discretions reposed in the Minister
by these sections cannot be exercised according to broad and binding rules (as some
discretions may be: see, eg, Schmidt v Secretary of State for Home Affairs [1969] 2 Ch 149)”. It
was in the specific statutory context of Drake (No 2) that Brennan J said that the Minister’s
policy had to leave him free to consider the individual circumstances of the case.

51. In respect of fine defaulters falling within s 71(1) of the Fines Act, having regard to the
limited nature of the decision-maker’s function, the modification of procedural fairness
effected by s 73(3) and the absence of any mechanism for fine defaulters to make
submissions with respect to the exercise of the power in s 73,5 we consider that it would be
open to the Commissioner to adopt a policy that the making of a garnishee order would
ordinarily be appropriate in identified circumstances.

52. Given the nature of the Commissioner’s discretion in respect of fine defaulters falling
within s 71(1A), and consistently with [46] above, any policy adopted by the Commissioner
in respect of fine defaulters falling within s 71(1A) would need to leave the Commissioner
free to consider the unique circumstances of each such case.

53. In either case, it would remain necessary that there be an individual, being the
Commissioner, their delegate or an authorised person, who reaches the relevant state of

5 The Commissioner’s power may be distinguished from cases where the decision-maker is required to “consider”
certain material, such as a submission, which would involve “an active intellectual process directed” to that
material: see Tickner v Chapman (1995) 57 FCR 451 at 462.

satisfaction and decides that this is how they will exercise their discretion in the case or
cases before them.

Amenability to challenge

54. A garnishee order is liable to be challenged in two ways. First, given that the Fines Act
provides that the order “operates as a garnishee order made by the Local Court under Pt 8
of the Civil Procedure Act 2005”, and subject to the applicable jurisdictional limit, we are
inclined to the view that the judgment debtor would be able to avail themselves of the
mechanism in Pt 8 to challenge a garnishee order.6 In this regard, s 124A of the Civil
Procedure Act provides that:

The court may, at any time on the application by a judgment debtor, vary or suspend
the making of payments by the judgment debtor under a garnishee order, or order
the total amount paid by the judgment debtor under the garnishee order to be repaid,
if the court is satisfied that it is appropriate to do so.

55. Secondly, a garnishee order is liable to be challenged in the supervisory jurisdiction of the
Supreme Court. The Supreme Court’s supervisory jurisdiction is “the mechanism for the
determination and the enforcement of the limits on the exercise of State executive and
judicial power by persons and bodies other than the Supreme Court”: Kirk v Industrial
Relations Commission of New South Wales (2010) 239 CLR 531 at [99]. An applicant would need
to establish jurisdictional error to enliven the Court’s jurisdiction. In Hossain v Minister for
Immigration and Border Protection (2018) 264 CLR 123 (Hossain) Kiefel CJ, Gageler and
Keane JJ explained, at [24], that “jurisdictional error”:

… refers to a failure to comply with one or more statutory preconditions or
conditions to an extent which results in a decision which has been made in fact
lacking characteristics necessary for it to be given force and effect by the statute
pursuant to which the decision-maker purported to make it.

56. It is important to recognise, particularly in the context of a discussion of the requirements
for the lawful issue of a garnishee order, that the Fines Act would be “interpreted as
incorporating a threshold of materiality in the event of non-compliance”: Hossain at [29] (ie
a breach of a statutory precondition/condition must be material in order to be a
jurisdictional error). In Hossain, Kiefel CJ, Gageler and Keane JJ, at [30], explained:

6 The Commissioner, as the judgment creditor, would equally be able to avail himself or herself of the enforcement
mechanism in s 124.

Whilst a statute on its proper construction might set a higher or lower threshold of
materiality, the threshold of materiality would not ordinarily be met in the event of a
failure to comply with a condition if complying with the condition could have made
no difference to the decision that was made in the circumstances in which that
decision was made.

57. Their Honours went on to observe, at [31], that “[o]rdinarily… breach of a condition
cannot be material unless compliance with the condition could have resulted in the making
of a different decision”: see also, Minister for Immigration and Border Protection v SZMTA (2019)
264 CLR 421 (SZMTA) at [2]-[3] and [45] per Bell, Gageler and Keane JJ.

58. Materiality “is a question of fact in respect of which the applicant for judicial review bears
the onus of proof”: SZMTA at [4] per Bell, Gageler and Keane JJ; see also at [46].

Constitutional limits

59. Commonwealth laws may, through s 109 of the Constitution (Cth), operate to constrain the
Commissioner’s ability to issue garnishee orders.

60. Section 109 of the Constitution provides that “[w]hen a law of a State is inconsistent with a
law of the Commonwealth, the latter shall prevail, and the former shall, to the extent of the
inconsistency, be invalid.”

61. The operation of s 109 of the Constitution was recently explained by the High Court in Work
Health Authority v Outback Ballooning Pty Ltd (2019) 93 ALJR 212 (Outback Ballooning) at
[29] and [31]-[35] per Kiefel CJ, Bell, Keane, Nettle and Gordon JJ. There are two general
types of inconsistency which will engage s 109: a direct inconsistency; and an indirect
inconsistency.

a. A direct inconsistency will arise where the “State law would ‘alter, impair or detract
from’ the operation of the Commonwealth law”: Outback Ballooning at [32].

b. An indirect inconsistency arises where the Commonwealth law “is to be read as
expressing an intention to say ‘completely, exhaustively, or exclusively, what shall be
the law governing the particular conduct or matter to which its attention is directed’”
and the State law deals with that conduct or matter: Outback Ballooning at [33].

Where there is an inconsistency, s 109 resolves the conflict by giving the Commonwealth
law paramountcy and rendering the State law invalid or inoperative to the extent of the
inconsistency: Outback Ballooning at [29].

62. Given the limited purpose for which our advice is sought, it is not necessary to attempt to
exhaustively identify all Commonwealth laws which might give rise to a s 109 issue for the
making of garnishee orders under the Fines Act. It is sufficient to demonstrate the operation
of s 109 by reference to two examples: the Social Security (Administration) Act 1999 (Cth); and
the Bankruptcy Act 1966 (Cth).

Social Security (Administration) Act

63. Division 5 of the Social Security (Administration) Act deals with the “[p]rotection of social
security payments”. Section 60 provides that, subject to exceptions which are not presently
relevant, “[a] social security payment is absolutely inalienable.”7 Section 62 deals with the
effect of a garnishee or attachment order, with subsection (1) providing:

If:
(a) a person has an account with a financial institution; and
(b) either or both of the following subparagraphs apply:
(i) instalments of a social security payment payable to the person
(whether on the person’s own behalf or not) are being paid to the
credit of the account;
(ii) an advance payment of a social security payment payable to the
person (whether on the person’s own behalf or not) has been paid
to the credit of the account; and
(c) a court order in the nature of a garnishee order comes into force in
respect of the account;
the court order does not apply to the saved amount (if any) in the account.

64. The “saved amount” is calculated by deducting the total amount withdrawn from an
account during the 4 week period immediately before the court order came into force from
the total amount of social security payments paid to the credit of the account during that
period: see s 62(2).
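By way of illustration only, the s 62(2) calculation can be sketched as follows. The figures are hypothetical, and the floor at zero is our assumption, consistent with the reference to the saved amount "(if any)" in s 62(1).

```python
def saved_amount(social_security_credits: float,
                 total_withdrawals: float) -> float:
    """Illustrative sketch of s 62(2) of the Social Security (Administration)
    Act: the social security payments credited to the account in the 4-week
    period immediately before the garnishee order came into force, less the
    total amount withdrawn from the account during that period. Flooring the
    result at zero is an assumption, reflecting "saved amount (if any)"."""
    return max(social_security_credits - total_withdrawals, 0.0)

# Hypothetical figures: $1,600 of payments credited and $900 withdrawn in the
# 4-week period; $700 would be protected from the garnishee order
print(saved_amount(1600.0, 900.0))  # 700.0
```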

65. There is no indirect inconsistency between the Fines Act and s 62 of the Social Security
(Administration) Act in circumstances where s 62 contemplates the attachment of garnishee
orders to any amounts in an account other than the “saved amount” (including amounts

7 A “social security payment” is defined in s 23 of the Social Security Act 1991 (Cth). It includes, for example, a
social security pension, a social security benefit and allowances under the Social Security Act.

arising from social security payments paid prior to the four week period by reference to
which the “saved amount” is calculated).

66. However, a state law that authorised the issue of garnishee orders for debts, by way of a
court order, that attached to a “saved amount” in an account with a financial institution
would alter, impair or detract from s 62 of the Social Security (Administration) Act. As
garnishee orders issued by the Commissioner pursuant to s 73(1) operate as an order of the
Local Court (s 73(4)), we accordingly consider that there is a direct inconsistency between
the Social Security (Administration) Act and s 73 of the Fines Act, and s 117 of the Civil Procedure
Act, to the extent that they purport to authorise the making of garnishee orders that attach
to a “saved amount”. Section 109 resolves that inconsistency in favour of the
Commonwealth law, and ss 73 and 117 would be rendered inoperative to the extent of the
inconsistency.

Bankruptcy Act

67. Part VI, Div. 4B, Subdiv. HA of the Bankruptcy Act establishes a supervised account regime.
The trustee of a bankrupt’s estate may determine that the supervised account regime applies
to the bankrupt in certain circumstances: s 139ZIC. The bankrupt is required to ensure all
monetary income actually received by the bankrupt after the opening of the account is
deposited to the account: see s 139ZIF. Unless specific circumstances exist, the bankrupt
is prohibited from making, or authorising, withdrawals from the account: see s 139ZIG(1)-
(7). Section 139ZIG(8) provides:

Garnishee powers not affected
(8) This section does not affect the exercise of powers conferred by:
(a) section 139ZL of this Act; or
(b) section 260-5 in Schedule 1 to the Taxation Administration Act 1953; or
(c) a similar provision in:
(i) any other law of the Commonwealth; or
(ii) a law of a State or a Territory.

68. Although there is a level of similarity to the “saved amount” concept in the Social Security
(Administration) Act, no s 109 inconsistency arises from s 139ZIG. Section 139ZIG places
the relevant prohibition on the bankrupt, not third parties in the position of the
Commissioner. Even if that were not the case, the Commissioner’s power under s 73 of
the Fines Act would not be affected by reason of s 139ZIG(8), whose evident purpose is to
avoid the provision limiting garnishee powers: see the Explanatory Memorandum of the
Bankruptcy and Family Law Legislation Amendment Act 2005 (Cth) at [141].

69. Other provisions of the Bankruptcy Act do, however, operate to constrain the
Commissioner’s ability to issue garnishee orders. Although it is beyond the scope of the
present advice to identify all the inconsistencies potentially arising between the Fines Act
and the Bankruptcy Act, it may be noted that the Bankruptcy Act prohibits a person entitled
under a law of the State, like the Commissioner, from retaining or deducting money in
particular circumstances: see ss 54H, 185F and 185K. In addition, it is to be noted that
where a bankrupt is discharged from bankruptcy, s 153 of the Bankruptcy Act provides that
the “discharge operates to release him or her from all debts (including secured debts)
provable in the bankruptcy”.8 As explained above, s 109 of the Constitution would operate
to render any inconsistent provisions in the Fines Act inoperative to the extent of the
inconsistency with the relevant provisions of the Bankruptcy Act.

70. We are happy to provide further advice about these matters if instructed to do so.

Second question: Validity of the Commissioner’s processes

71. In our view, the Commissioner’s processes for the issuing of garnishee orders since 2016
depart from the requirements of the Fines Act in a number of respects.

Original Version of the process

72. The Original Version of the Commissioner’s process was not lawful because human input
was wholly excluded from the process for issuing garnishee orders. As identified above,
once the DPR had selected a garnishee order as the next enforcement action, the garnishee
order was automatically generated and issued by the FES, at least with respect to orders
made to the Commonwealth Bank, ANZ, Westpac and NAB. Human interaction was
involved only to the extent that manual action was required to issue the order.

73. To the extent that the Commissioner, their delegate or an authorised person was not
involved in making the garnishee order under the Original Version of the process, the
absence of human involvement had two salient effects. First, at no point was the subjective
jurisdictional fact met; the Commissioner, their delegate or an authorised person did not
reach the state of satisfaction required by s 73(2), namely that civil enforcement action was
authorised against the fine defaulter.

8 Section 82(3) of the Bankruptcy Act provides that “[p]enalties or fines imposed by a court in respect of an offence
against a law, whether a law of the Commonwealth or not, are not provable in bankruptcy.” Section 82(3)
would accordingly operate to limit the extent to which court fine enforcement orders are discharged by
s 153 of the Bankruptcy Act.

74. Secondly, and relatedly, given that the Fines Act invests the power to make an order in the
Commissioner, their delegate or an authorised person, it could not be said that the garnishee
order had been made by the repository of the power. Indeed, it would not appear possible
to identify any human decision-maker for the decision to make a garnishee order under the
Original Version of the process.

Process following the First and Second Modification

75. So far as we understand them, the amendments to the Commissioner’s processes for
making garnishee orders in August 2016 and September 2018 (see [9]-[10] above) did not
change the fact that the Commissioner, their delegate or an authorised person was not
involved in the determination to make a garnishee order. Those amendments accordingly
do not alter our opinion as to the lawfulness of the Commissioner’s process for making
garnishee orders during that period.

Current Version of the process

76. Although the Current Version corrects at least one of the defects of the previous versions,
we maintain concerns about the lawfulness of the Commissioner’s process for making
garnishee orders under the Fines Act.

77. The Current Version, through the interposition of a staff member between the information
technology applications and the issue of the garnishee order, would appear to address the
issue concerning the source of the power to make the order. On the assumption that the
staff member involved in the Check Summary Report holds the relevant delegation under
s 116A or authorisation under s 116B,9 the amendment resulted in garnishee orders being
made by the repository of the power in circumstances where, without the approval of the
staff member, no garnishee orders would be made.

9 We have been instructed with instruments of delegation and authorisation dated 17 June 2016, 20 March 2017
and 29 October 2019. They indicate that specified staff in Revenue NSW are empowered to make garnishee
orders under s 73 of the Fines Act. The 2016 and 2017 delegation and authorisation is relevantly to persons
assigned to roles in Collections and Technical & Advisory Services. The 2019 delegation and authorisation
is to persons assigned to roles in Customer Service Fines & Debt and Technical & Advisory Services. The
2019 instrument also delegates and authorises the exercise of enforcement functions under the Fines Act to
persons assigned to certain roles in Service NSW.

78. It is not, however, possible to say that the interposition of the staff member has addressed
the issue relating to s 73(2) of the Fines Act. On the materials available to us, it is not
apparent that the Commissioner, their delegate or an authorised person forms, as part of
the Check Summary Report process, the state of satisfaction required by s 73(2).10

79. Nor is it apparent whether the Check Summary Report provides a basis for the
Commissioner, their delegate or an authorised person to form the requisite state of
satisfaction. The Check Summary Report, and the DPR system, appear to be directed only
to fine defaulters falling within Pt 4, Div. 4 of the Fines Act because the fine remains unpaid
after the Commissioner directed RMS to take enforcement action (ie persons falling within
s 71(1)(b) and not s 71(1)(a) or 71(1A)): see Steps 6 and 7 above. The Check Summary
Report does contain a rule check for “Period for Issue after EN” of 21 days, but we are
not aware whether this is a reference to the period after the Commissioner directed RMS
to take enforcement action and, more importantly, whether the Commissioner, their
delegate or an authorised person understands that that is what the reference is to.11

80. Even assuming that the threshold in s 73(2) is met, there would appear to be a question
about the lawfulness of the issue of garnishee orders under the Current Version of the
process. While we are of the view that the Commissioner (or delegate or authorised person)
may, as a general matter, consider the issue of garnishee orders to multiple fine defaulters
simultaneously (at least with respect to fine defaulters within s 71(1)) and that the matters
raised by the Check Summary Report are permissible considerations, for the reasons that
follow, we do not consider that it is sufficient for the purposes of s 73(1) of the Fines Act
for the staff member to simply give effect to the activity selection of the DPR (see [6]
above) or rely on the fact that the Check Summary Report showed green lights in order to
lawfully make a garnishee order. But we nevertheless think that the decision-maker might,
when dealing with a fine defaulter falling within s 71(1), properly follow a course of
reasoning that means they do not need to review each file, provided they have properly
considered the nature of the information that they are disregarding and formed the view,
on a reasonable or rational basis, that such information would not alter their decision.

10 Given that the function in s 73(2) has not been expressly delegated in the instruments of delegation with which
we have been briefed, we note that a delegate may exercise any function that is incidental to the delegated
function: s 49(4) of the Interpretation Act.
11 We note that, according to Step 7, the DPR begins assessing fine defaulters for civil enforcement action
only 14 days (rather than 21 days) after the Commissioner directed RMS to take enforcement action under
Pt 4, Div. 3.

81. In order for there to be a lawful exercise of a statutory discretion, we consider that generally
a human needs to consider the relevant factors and reason to the relevant outcome. In the
case of the Fines Act, the decision-maker is required to consider the relevant factors (see
[43]-[47] above) and decide, in fact, whether to make a garnishee order. In the case of fine
defaulters falling within s 71(1), the Commissioner is required to decide whether a garnishee
order is the civil enforcement action that should be imposed rather than, or in addition to,
a property seizure order or a charge on land. In the case of fine defaulters falling within
s 71(1A), the Commissioner is empowered to decide whether or not a garnishee order
should be made.

82. Although the response of administrative law to the use of information technology may be
nascent, ordinary administrative law principles require there to be a “process of reasoning”
for the exercise of discretions (Li at [23]). This can also be seen in our conceptions of what
it means to make a “decision”, with two members of the Full Federal Court (Moshinsky
and Derrington JJ) accepting that one of the elements generally involved in a “decision” is
“reaching a conclusion on a matter as a result of a mental process having been engaged in”:
Pintarich v Deputy Commissioner of Taxation (2018) 262 FCR 41 at [141] and [143], quoting
Semunigus v Minister for Immigration and Multicultural Affairs [1999] FCA 422 at [19].

83. Absent express statutory amendment (discussed below), we accordingly do not think that
a statutory discretion can be lawfully exercised by giving conclusive effect to the output of
an information technology application. We do not think that the unlawfulness is altered by
that output being broken down into component parts (ie the considerations raised in the
Check Summary Report) and the decision-maker proceeding, as a matter of course, to exercise
the power (ie issuing the garnishee orders because all the traffic lights were green) without
engaging in a mental process to justify that conclusion.

84. For similar reasons, we do not consider that statutory discretions can be lawfully exercised
by pre-authorising the making of an order if certain outputs are obtained.

85. On the materials available to us, it is not apparent whether the staff member involved in
the Check Summary Report is undertaking any process of reasoning or is issuing the
garnishee orders simply because the traffic lights are green. Given considerations of
materiality, this departure may not be of significance in the case of fine defaulters falling
within s 71(1)(a) or (b), in respect of whom civil enforcement action is effectively
mandatory under the Fines Act (subject of course to the operation of ss 100 and 101). Our
concern as to non-compliance would be more acute with respect to fine defaulters falling
within s 71(1A), in respect of whom the Commissioner has a true discretion whether or
not to issue a garnishee order. We repeat, however, our observation at [79] above that the
Commissioner’s automated process appears (at least on the materials with which we are
briefed) to be directed to fine defaulters falling within s 71(1)(b).

86. As to the operation of s 109 of the Constitution, our instructions do not allow us to say
whether garnishee orders issued by the Commissioner have in fact been issued in
circumstances contrary to s 62 of the Social Security (Administration) Act12 or the various
requirements in the Bankruptcy Act. As explained above, s 109 would render inoperative the
provisions of the Fines Act to the extent that they purported to authorise the Commissioner
to make garnishee orders in circumstances prohibited by the Commonwealth laws.

Third Question: Modification and/or statutory amendment

87. Modifications could be made to the Current Version of the process for issuing garnishee
orders to make it lawfully permissible. As identified above, the process would need to be
amended to require the Commissioner, their delegate or an authorised person to reach the
state of satisfaction required by s 73(2). Assuming that the staff-member is currently
proceeding automatically from the traffic lights to the issue of the garnishee orders (which
would not be permissible), the process could also be amended so as to ensure that the
decision-maker is actually reasoning, by reference to the applicable statutory test, from the
relevant inputs in the decision-making process to the output of whether or not to issue a
garnishee order in respect of the fine defaulter/s.

88. Alternatively, the Fines Act could be amended to make permissible the Commissioner’s
process for issuing garnishee orders. The subjective jurisdictional fact in s 73(2) could be
replaced by a jurisdictional fact (see Icon Co (NSW) Pty Ltd v Australia Avenue Developments
Pty Ltd [2018] NSWCA 339 at [13]), so as to avoid the Commissioner, their delegate or an

12 We note that the extent to which garnishee orders attached to “saved amount[s]” is likely to have been reduced
since Revenue NSW began applying a minimum protected amount to bank-directed garnishee orders.


Upload 2

Automated Transcription

Avoiding (and investigating) automated maladministration
Paul Miller PSM, NSW Ombudsman

(This is an edited version of a presentation given at the 13th National
Investigations Symposium, 25 May 2023, Four Seasons Hotel, Sydney)

Almost 20 years ago now, in 2006, the NSW Ombudsman investigated the use by NSW Police Force (NSWPF) of a non-human intelligence tool that had been procured and trained to detect certain criminal offences.
The tool was estimated to have a technical capability in the field of criminal offence detection that was at least 40 times more powerful than humans.
Over a two-year period, the tool was used in over 470 operations.
It profiled hundreds of thousands of people, and from them detected over
10,000 potential criminals.
Most of the people identified by the tool were then subject to Police action.

I hope by now that you have figured out what ‘non-human intelligence tools’ I am talking about…

Yes, back in 2006 the NSW Ombudsman looked into the use of drug detection dogs by the NSWPF.
Dogs are estimated to have an ability to smell – let’s call that olfactory intelligence – that is around 40 times superior to humans.
In our report, we noted that during the review period 17 dogs made 10,211 indications.
That number is of course a fraction of the total number of people ‘screened’ by the dogs.1
Of those detected, most were then subject to a search by Police.

NSW Ombudsman | Avoiding (and investigating) automated maladministration
This accorded with NSWPF policy, which stated that an indication by a drug detection dog, of itself, gives police reasonable suspicion to search a person.
Why, you might ask, am I talking about dogs when I am supposed to be talking about automated decision-making (ADM) and artificial intelligence (AI)?
Well, I’m going to leave you to ponder that as I proceed with my presentation.
But let me give you a bit of a hint by pointing out just some of the issues that were raised by the Ombudsman when investigating the use of drug detection dogs:

(a) Questions of lawful authority
Police in NSW had started using drug detection dogs many years prior to
the 2006 Ombudsman report.
In 2001, a Magistrate dismissed two drug charges against a man found in
possession of prohibited drugs during a drug dog operation. The
Magistrate held in that matter that the actions of the NSWPF using dogs
constituted an illegal search.2
In essence – the statute that gave Police the power to stop and search
people, did not implicitly carry an authorisation to search using a dog.
As a result of that court decision, legislation was introduced to expressly
authorise NSWPF to use drug detection dogs.
Now I’m going to get straight to the point here by noting that, as we’ll
soon see, in the field of ADM, quite often the very first question that
arises is whether it’s legally permissible to use an ADM tool, or whether
an amendment to the statute must or should be made to authorise that.
(b) Questions of accuracy
This was a key focus of the Ombudsman’s investigation, and led to its
perhaps most often quoted finding, which you still see quoted in the
media today.
The stated objective of drug detection dogs is to indicate people
currently in possession of illicit drugs.
So how good are they at doing that?
Well, perhaps not very. The Ombudsman’s report found that they were
successful in doing that just 26% of the time.
That means nearly three quarters of the people searched after being
indicated by a dog were found not to be in possession of any drugs.3

Of course, if a human police officer – no matter how well trained – was
tasked with sniffing people in a crowd to see if they could detect those
who might be in possession of drugs, I’m pretty sure the failure rate
would be a lot greater.
The question is this: is the mere fact that dogs are so much better than
humans at this task enough? Or should we require some higher standard
of accuracy before the outputs of a tool like this can be used as a basis
for branding a person as a suspected criminal and subjecting them to an
invasive search?
Related to this issue of accuracy, there is quite a bit of discussion in the
Ombudsman’s report about the testing and accreditation of drug
detection dogs – some dogs are obviously going to be better drug-
detection tools than others – and again I invite you to consider possible
parallels with AI tools.
There is, for example, an interesting discussion in the report about the
differences between the testing and training environments, and real-
world applications – such as how dogs tested in a controlled
environment translate to large crowd situations.
(c) Questions of potential bias and discriminatory impact
The Ombudsman observed that drug detection dogs indicated men far more frequently than women.
They also more frequently indicated young people – almost half of the
people indicated were under 25.
The report was unable, because the data wasn’t collected, to say much
about Aboriginal status or ethnicity of the people searched.
(d) Unintended harms
Some of the observations in the Ombudsman’s report concerned the
extent to which the use of drug dogs might cause unintended harms.
The things looked at included whether they might have resulted in:
• the consumption of larger amounts of drugs at once instead of taking
smaller amounts over a period of time,
• consuming drugs at home and then driving to venues,
• purchasing drugs from unknown sources at venues to avoid carrying
drugs,
• switching to potentially more harmful drugs in the belief that they
are less likely to be detected by dogs.

So, let’s turn now, more explicitly, to the topic of ‘avoiding and investigating automated maladministration’.

Let me very quickly outline what I mean by the two key concepts inherent in my presentation title:
(1) automation, or ADM, and
(2) maladministration.

First ADM.
In the interests of time, I will not spend too much time on this terminology. I’ll also probably jump around using terms like ADM, AI, or algorithmic decision-making, as if they all mean the same thing, which they don’t, necessarily.
But generally speaking, if you just accept that what I am talking about is the use of technologies to make or contribute to making government decisions – decisions that had previously been the exclusive domain of human beings – then we’ll be pretty much on the same page as to what I mean.
I will, however, highlight two very important points, which can sometimes be overlooked:
I referred to technologies that make, or contribute to making, decisions. That’s important. ADM does not just mean fully automated decision-making.
Take for example a technology that performs a filtering activity.
Mobile phone camera detection technology used in NSW is a great example.
Cameras placed at intersections take and then analyse photos of drivers, identifying through the use of machine learning software those images that are most likely to show a driver holding in their hand a mobile phone device – which is of course a criminal offence. The technology therefore filters out all of those images that appear not to show such an offence.
But a fine is not automatically sent when an image is selected. Rather – at least as we understand the system to work currently – every image that gets through the AI filter is then reviewed by a human, who makes the call as to whether a fine should be issued.4 So, it’s not fully automated decision making – but it is still ADM in the sense of contributing to a decision-making process and certainly it’s a kind of ADM that I am talking about.
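The filter-then-review arrangement described above can be sketched in a few lines of code. This is a purely illustrative Python sketch, not the actual NSW camera system; every name (the scoring function, the threshold, the review step) is a hypothetical stand-in:

```python
# Illustrative sketch of an AI-filter-plus-human-review pipeline.
# All names and thresholds are hypothetical; this is not the real system.

def ai_filter(images, score_fn, threshold=0.5):
    """Keep only the images the model scores as likely offences."""
    return [img for img in images if score_fn(img) >= threshold]

def review_pipeline(images, score_fn, human_review_fn):
    """The AI filters; a human makes the final call on every surviving image."""
    candidates = ai_filter(images, score_fn)
    # No fine issues automatically: every candidate goes to a person.
    return [img for img in candidates if human_review_fn(img)]

# Toy usage: fixed scores stand in for a machine-learning model's output.
images = ["a", "b", "c"]
scores = {"a": 0.9, "b": 0.2, "c": 0.7}
fines = review_pipeline(images, scores.get, human_review_fn=lambda img: img != "c")
# "b" is filtered out by the model; "c" is rejected by the human reviewer,
# so only "a" proceeds to a fine.
```

The point of the sketch is that the machine narrows the pool but the affirmative decision sits with the human step, which is why this counts as ADM that contributes to, rather than replaces, the decision.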

The second point is that although some of the technologies used in ADM are referred to as Artificial “Intelligence” – the technology does not have to be very intelligent at all, and quite frequently it is not.
ADM includes some very basic rules-based programs.
A clear example of this was some of the ADM used during COVID. Border crossing rules in various states required approval before entering a state during certain times. If you wanted approval, you filled in an online form, entering your relevant details and ticking boxes, for example to indicate your reason for wanting to enter the state. In some cases, the approval would pretty much come back instantaneously – not because there was a little human who had read your application and assessed your eligibility, but rather because a very simple rule-based algorithm had determined that you’d ticked all the right boxes (so to speak).
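A rules-based approval of that kind can be only a handful of lines of code. The following is a hypothetical sketch (the form fields and permitted reasons are invented, not any actual state's system):

```python
# Hypothetical sketch of a simple rules-based border-pass check.
# The field names and permitted reasons are invented for illustration.

PERMITTED_REASONS = {"essential work", "medical care", "compassionate grounds"}

def assess_application(form: dict) -> bool:
    """Approve instantly if every box is ticked correctly; no human reads it."""
    return (
        form.get("details_complete", False)
        and form.get("declaration_ticked", False)
        and form.get("reason") in PERMITTED_REASONS
    )

print(assess_application({"details_complete": True,
                          "declaration_ticked": True,
                          "reason": "medical care"}))   # True
print(assess_application({"details_complete": True,
                          "declaration_ticked": True,
                          "reason": "holiday"}))        # False
```

Nothing here is "intelligent" in any meaningful sense: the instantaneous approval is just a fixed conjunction of box-ticking conditions.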
Here’s another, more recent, example – a simple tool was implemented to automate the billing of ambulance and related medical services. It meant that, whenever details were entered to show that a certain ambulance or other medical service had been provided, a bill would be automatically generated and sent to the person who received the service.
Not surprisingly, those rescued from rooftops in recent years’ flooding found it a bit insensitive when they were automatically issued a bill for their helicopter rescues.5

Turning now to ‘maladministration’.
Maladministration is the central concern of all Parliamentary Ombudsmen and, being the Ombudsman, it is unapologetically what I’m focused on today.
Literally, maladministration means wrong or bad administration; it is the opposite of conduct that accords with administrative law and principles of good administrative practice.
It is critical to note that maladministration certainly includes, but is wider than, conduct that is unlawful, and certainly it is considerably wider than the kinds of conduct that would, if challenged in a court of law, be overturned or declared to be beyond power.
That said, administrative conduct that is contrary to law is the first kind of maladministration.
Illegality is the start of maladministration, but it is by no means the end.

As an Ombudsman, we have powers to make findings not just when government agencies and officials have acted unlawfully, but also when they have acted unreasonably or unjustly, or in an improperly discriminatory manner, or in some way that is otherwise wrong.
So bringing these two things together, ADM and maladministration – what can we say?
Well, the first thing we can and should say – even if it is obvious – is that the use of ADM technology is not always and inherently a form of maladministration.
Indeed, in some instances the technology has the potential to improve administrative conduct and help to address some of the things that an
Ombudsman might otherwise be quite concerned about – such as by improving the consistency of decision-making and mitigating the risk of individual decision-maker idiosyncrasy and bias.
However, if ADM technology is designed and used in a way that is not lawful, or does not otherwise accord with associated principles of good administrative practice, then its use could constitute or involve maladministration.
And in this regard, the main issue I want us to think about today is whether, and if so how, automated maladministration is different from any other kind of maladministration, both in terms of what maladministration might look like and – particularly for this audience – how we might go about investigating it.
So that requires us to think about what changes when ADM comes into play, and just as importantly, what doesn’t change.

First, let’s look at what doesn’t change.
In November 2021, we tabled a special report in Parliament titled, ‘The new machinery of government: Using machine technology in administrative decision-making’.6
One of the key messages we wanted to get across in that report was to emphasise what doesn’t change when technologies are used in administrative decision-making.
In particular, whenever new technology is introduced, it is introduced into an existing environment, including an existing environment of legal rules and norms of good practice.

The legal environment into which public sector ADM is being introduced is the one that is governed by public administrative law – the law which controls government decision-making.
Now existing legal environments may be more or less hospitable to new technologies. There can also be uncertainty, at least initially, about how the environment will accommodate and respond. There may, at least initially, be gaps and inconsistencies.
Administrative law, as it has developed over many centuries but rapidly during the last 50 or so years, is essentially principles-based. That means that it is, generally speaking, technology agnostic.
So, while the technology used in government decision making may change, the underlying concerns and the underlying norms that underpin administrative law remain unchanged.
This doesn’t mean that the laws won’t change, but the core principles and values about which the law is concerned are going to be familiar.
Let’s consider those principles briefly.

For simplicity, we group the requirements for good decision-making in our
2021 report as follows:
• proper authorisation,
• appropriate procedures,
• appropriate assessment,
• adequate documentation.
They are pretty self-explanatory, even commonsensical. Each of them is addressed in some detail in our report, so I won’t go through them now, but I will refer back to a couple of them to highlight how they might apply in the context of ADM.
The first principle is a pretty obvious one: there must be legal power to make the relevant decision.
This is where I need to very briefly mention Robodebt.7
What occurred in Robodebt would have been just as unlawful had it been undertaken without any automation, and the unlawful elements of the
Robodebt scheme weren’t dependent on the presence of automation. Rather, the core issue is that Robodebt involved doing something that there just wasn’t legal power to do under the relevant legislation – to generate a debt by way of income averaging.
Robodebt shows that familiar problems can arise under familiar laws and principles, but in the context of using new data-enabled technologies.
But it’s also the case that ADM is going to give rise to new problems, albeit they will still need to be considered against the already well-established and familiar framework of laws and principles that control agency conduct.
The point here is not whether or not those principles still apply – they most definitely do – but the issue is considering how they will apply in this new particular context of ADM.
I will refer here to our investigation in relation to Revenue NSW’s use of ADM in garnishee orders. I won’t go into detail, but the nub of that case was as follows:
There is a well-established principle of law that, if Parliament gives a particular person the function of making a discretionary decision – then that discretion cannot be fettered or abdicated.
A public servant can’t, for example, have someone else dictating to them what decision they should make.
Nor can a public service agency adopt a policy that says, without exception, in every case we are going to make this or that decision.
In other words, if you’re given discretion you have to be prepared to exercise it.
Now Revenue NSW adopted ADM technology for its garnishee system, under which it is able to sweep money directly out of people’s bank accounts to recover unpaid fines debts.
Relevantly, the legal decision to do so involves some discretionary aspects.
Initially, Revenue NSW’s machine was automatically issuing the garnishee orders. Following our concerns, they changed the process so that a relevant human being would, at the end of each day, hit “go” to issue the orders.
We obtained a legal opinion from Senior Counsel that both before and after this change, Revenue NSW’s conduct was unlawful, as it was not in compliance with the legislation.
It seems clear why it was not lawful before there was a human ‘in the loop’.
But even after a human had been ‘put on top’, Counsel still considered the system to be operating unlawfully. In essence, it was not consistent with having a discretionary function for a human to be simply adopting the output of a machine, without engaging themselves in an active mental process of deciding whether to issue the order – so that the decision could be said to be truly their decision and not just what the machine told them to do.8
It's a neat example, I think, of a very familiar rule regarding discretion being applied to a new situation presented by ADM technologies.
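The contrast Counsel drew can be sketched, purely for illustration, in code. Everything below is hypothetical (the record structure, the officer, the reasons); it is not Revenue NSW's system, just a way of seeing the difference between passing a machine's output through and making a decision that is attributable to a human:

```python
# Illustrative contrast between rubber-stamping a machine's output and a
# process that records an active human decision. Entirely hypothetical.

from dataclasses import dataclass

@dataclass
class Decision:
    issue_order: bool
    decided_by: str
    reasons: str

def rubber_stamp(machine_says_issue: bool) -> Decision:
    # The human merely passes the machine's output through: no reasons,
    # no independent consideration. This is the pattern viewed as unlawful.
    return Decision(machine_says_issue, decided_by="machine", reasons="")

def active_decision(machine_says_issue: bool, officer: str, reasons: str) -> Decision:
    # The officer treats the machine's output as one input, records their
    # own reasons, and the decision is attributable to them.
    if not reasons:
        raise ValueError("a discretionary decision needs recorded reasons")
    return Decision(machine_says_issue, decided_by=officer, reasons=reasons)
```

In the first function no human decision-maker can be identified; in the second, the machine's output informs, but does not constitute, the decision.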
There are other examples that raise similar questions as to how familiar law and familiar principles might apply to different technology:
• Right to be heard
Take the right to be heard. It is a long-standing principle of procedural
fairness that, generally speaking, a person who will be materially and
adversely affected by an administrative decision should have a right to
be heard before the decision is made. And for that right to be heard to
mean anything, the person has a right to be told the case against them,
so they can understand what they are responding to.
Now in the context of ADM, one question is whether that familiar right
means that a person has a right to be told whether and how ADM
technology has been involved in the making of the decision.
In our report we suggest that it does – we say that, in order to genuinely
have a right to be heard, a person needs to understand the proposed
decision and how it was made, including if it is the result of some output
from a machine.
• Algorithmic bias
I am not going to talk in any detail about algorithmic bias. But there are
three quick points to leave with you – especially for those of you who
are lawyers – in terms of considering how the existing principles of
administrative law will apply where systems might exhibit algorithmic
bias.
First, the traditional rule against bias might not be the key one to
consider – because that is primarily concerned with prejudgment or bias
in the mind of the individual decision-maker.
Second, laws prohibiting discrimination on certain protected grounds –
like sex and race – will clearly be relevant.
And third – and of most interest I think – the familiar rules concerning
proper decision-making, such as the requirement to have regard only to
relevant considerations and not to irrelevant ones, will likely
affect the legality of using systems to make decisions when those
systems may exhibit algorithmic bias. Likewise, the use of a system that
exhibits algorithmic bias may raise questions around compliance with the
requirement that decision-making be reasonable – and I mean
reasonable both in the legal sense in cases like Wednesbury and Li,9 but
also the broader sense of reasonable as used in the Ombudsman Act.10
• Obligation to provide reasons
I will say something quickly about the obligation to provide reasons.
Although it is not always a legal requirement, it is a norm of good
administrative practice that if a decision is made – particularly one that
adversely affects the rights or interests of an individual – reasons should
be able to be given, and should be given.
‘Computer says no’ is unlikely to be a reason.
The question of what reasons can and should be provided when using an
ADM system – particularly one that uses machine learning technology
– is a challenging one, and continues to be a matter of some debate, but
again the underlying principle that people deserve a meaningful reason
they can understand is familiar.

Let’s turn to some of the things that do change with ADM…
(a) Likelihood of error
One reason why errors, or at least certain types of errors, might be more
likely with ADM is the challenge of translating language. Laws
are currently drafted in natural language, while code is more precise,
with a narrower vocabulary.
Furthermore, those involved in the coding of an ADM system are
generally not the kinds of people who have experience and expertise in
interpreting and applying legislation.
Now this may change over time, but for now it seems that what we
might call translation risk remains a considerable challenge.
(b) Consequences of error
A key advantage of machines is their ability to process high volumes of
data and cases at very high speed. This also means though, that any
errors will be replicated at a rate exceeding that of any individual human
administrator.

Consequently, the number of people adversely affected by a single error
may be much more substantial.
(c) Speed
The speed and scale of technological advancement are increasing.
It is worth noting that generative AI applications such as ChatGPT were
not even publicly on the radar, so far as we can tell, when we published
our report less than 18 months ago. No doubt the technology was already
in development, and since ChatGPT was released there has been a
proliferation of generative AI use cases – a number of which are freely
available and come with significant risk. This is the challenge of keeping
up.
(d) Detection of error
Detecting errors in ADM may call for quality assurance
capabilities that administrators currently do not possess.
Those affected by erroneous decisions – particularly if they are already
vulnerable – may be less able to identify or effectively challenge a
machine error.
In both Robodebt and the Revenue NSW examples, the initial complaints
were not about unlawful ADM – people were just concerned by the
outcome of the process. In most cases they would have had no idea that
ADM was even being used.
(e) Reversibility and rectification of errors
If a human decision-maker makes an error, their conduct can usually be
easily corrected for future decisions.
Even a systemic error that is reflected generally in agency policy can
generally be remedied quickly and effectively.
However, if there is an error in ADM technology, fixing the error can be
difficult, costly and time consuming – and a tweak here or there to fix
one error may run the risk of creating others.
In our 2021 report we included an example relating to Transport for
NSW and its system DRIVES which is used for a range of functions
including driver licence suspensions. DRIVES was programmed in such a
way that a different process is followed depending on how long the
driver’s licence had left until expiry at the time of the suspension:

• if there are 35 or more days left to expiry, a notice will be
automatically issued.
• if there are fewer than 35 days left to expiry, no notice is issued.
Instead, when the driver applies to renew their licence they will be
denied a licence and given a licence suspension notice.
It appears that those coding the system made certain assumptions,
including that any driver whose licence was expiring would apply
promptly for a new licence. We handled a complaint from a driver
whose licence was due to expire, but who did not immediately apply to
renew their licence because they knew that they would need to serve a
licence suspension period due to an accrual of demerit points. This
meant, however, that a notice of suspension was not issued until the
person (many months later) did apply for a new licence – and
consequently it was only then that the suspension period commenced.
This meant that the driver was unable to drive for a much longer period
than necessary.
While TfNSW committed to fixing the glitch, it said it would not be
possible to do so until the next scheduled system update, and that it
would need to consider what interim measures it could put in place.11
(f) Dynamism and creeping complacency
Another challenge is dynamism, which sounds like a good thing, but just
means that the system will change over time.
Obviously, every time there is some change in the law governing the
relevant function, the system will need to be modified, and over many
years a system may end up looking quite different and certainly less
elegant than its original design.
A related challenge is the potential for change in the relationship
between the ADM and the humans who are working with it. Even if
initially the system is designed so that humans are actively playing their
role in a decision-making system, there’s a well-documented tendency
toward technology complacency – over time people just tend to place
increasing reliance and confidence in technological outputs.
This means that even a system that is designed to be lawful – for
example with humans properly exercising discretion – might degrade in
practice, and cease to operate lawfully, if those or future
decision-makers cease genuinely to perform that task.

(g) Cost
Another challenge is the expense associated with making changes and
even small tweaks to ADM systems.
I’ll just give you a sense of what we are talking about here. During the
election, the now Government made an election promise that drivers
with demerit points will be eligible to have one demerit point removed if
they do not incur any further penalties over a 12-month period. This
was promoted as an incentive for safe driving.
It sounds simple enough.
Like other election promises, this one was costed by the NSW
Parliamentary Budget Office.
The proposal was estimated to cost $5.66 million. The estimate includes:
• $2.81 million to implement and test changes to systems, and
• $2.85 million in staff resources, representing an increase of 21
full-time staff to support the policy, including by responding to customer
inquiries and manually investigating individual cases.12
(h) Ownership and control
Finally, I want to note that many technologies are owned and controlled
externally to agencies. This can have a range of impacts on an agency’s:
• ability to make changes to the system,
• essential understanding of how the technology works,
• ability to provide information about the operation of the system, for
example to members of the public.

What about the new challenges raised by ADM specifically for ombudsmen and those who will be involved in investigating maladministration? There are a few obvious ones:
(a) System complexity
There are different phases of ADM systems from design to
implementation and ongoing monitoring and review. In many cases, we
need to understand each of those phases in detail in order to
understand how the system works, and where there may be possible
maladministration.

Consider again Robodebt – which as far as ADM goes, is relatively
simple.
But consider how much work the Robodebt Royal Commission has
undertaken just to understand how the system was implemented and
how it worked.
Documenting the roles of people and the roles of machines becomes more
complex with handoffs between the two.
The Royal Commission sought expert advice from Deloitte in that case,
and the process flow charts they created are all available on the
Commission’s website.13
(b) Lawfulness
As I said before, a core question for maladministration investigators will
be whether agency use of ADM is lawful.
A finding that an agency has acted unlawfully is a serious claim to make,
and unsurprisingly agencies tend not to like it when we do.
It’s not something an ombudsman would do lightly or without being
pretty sure we are on firm ground.
However, particularly in this context, there is a challenge because:
• our findings are not authoritative, and it can’t be known whether a
court would come to the same conclusion, and
• issues relating to public sector use of ADM are still emerging,
and there is very little black-letter or case law to guide us.
We can’t go to court for an advisory ruling, and so in the Revenue NSW
matter we obtained eminent Senior Counsel advice on the legality of
the system.
(c) Visibility and transparency
Another challenge is visibility. In our 2021 report to Parliament, a key
observation was that NSW agencies do not have an obligation to
routinely disclose when they are using ADM.
It follows that we do not know how many agencies are using, or
developing, ADM to assist them in the exercise of their statutory
functions. If we, and complainants, do not know when ADM is being
used, that is a major obstacle to effective oversight by bodies such as ours.
I would also add here transparency, as distinct from mere visibility. It is
by now well-documented that many technologies are a black box in
terms of how exactly the algorithms and models work. In this case, it can
be difficult if not impossible to detect issues such as bias – which leads
me to the challenge of capability.
(d) Capability
Detecting bias and assessing accuracy will require, in some cases,
specialist skills and knowledge.
Integrity bodies face this capability challenge now when we investigate
ADM systems. We can see the need for:
• internal capability to ensure staff are alive to identifying and
scrutinising ADM, and
• technical capability to understand an ADM system’s technical
operation where required – this could either be embedded internally
or sourced externally.
Ideally a multidisciplinary team would investigate an ADM system to
consider it from a range of perspectives including legal, policy, user and
technical.
One of the interesting points here concerns external capability.
I referred a moment ago to our seeking a legal opinion about the
Revenue NSW garnishee system. We went to a Senior Counsel, who is
eminent in the field of administrative law. That person is not a judge and
is not infallible, and it is possible their opinion would differ from that of
a future court if the issue were to come before it. But that is unlikely, and
the clear structure and hierarchy of the legal profession mean that we can
reasonably rely on the opinion we have received.
At the moment, while there are organisations which are developing
capabilities in the space of analysing AI systems – for example on
detecting algorithmic bias – it may be less clear who we should go to and
who we can reasonably rely on for expert opinion on such matters.
In some respects, this is not a unique situation for investigators – who
often need to become instant experts themselves – but I think the fact
that this is still so new means that the related fields of forensic
accounting, auditing and the like are still only developing.
There is, at present, also no guidance in Australia for choosing a credible
expert or body to perform something like an algorithmic audit.

(e) Emerging standards
This leads me to the next point, which is that standards of conduct in
this area are only just starting to be developed.
What I mean is this – the role of an ombudsman would be more
straightforward, and investigation could be more streamlined if there
were in place very clear rules about what an agency must do when it
implements and uses an ADM system – for example, if there were
standards that said:
• it must commission an independent algorithmic audit by an expert
auditor accredited for that purpose, and
• it must obtain a comprehensive external legal certification that the
system is compliant with the laws governing the relevant function.
The more well-developed, comprehensive and well-accepted such
standards are, the more straightforward, in many ways, the
role of a maladministration investigator will be.
It would mean that we could look, at least as a first step, simply at
whether those requirements were met.
Rather than diving straight in to ask technically very difficult questions
like: ‘Is this ADM infected by algorithmic bias?’ we could start – and
possibly end – by asking a range of questions in relation to the design,
implementation and operation of the system. For example, we would
ask: ‘was this system properly tested for algorithmic bias, in accordance
with the requirements of the relevant standards?’.
Internationally, there are a range of AI and ADM governance initiatives
that anticipate requirements for the assessment and audit of a system’s
conformity with established requirements. These may provide some
guidance for us. Currently the European Parliament is considering an ‘AI
Act’ and we will be interested to see whether and how that legislation
clarifies how an audit should be conducted and by whom.
Standards for the development and implementation of AI by
Government agencies are also beginning to develop here. NSW has
adopted an ‘AI Assurance Framework’. Since March 2022, it has been
mandatory for agencies to apply the framework to projects that contain
or use non-off-the-shelf AI.14 However, the completed framework
assessment must be submitted for review to the AI Review Body in the
following circumstances: if the project uses AI and costs more than
$5 million or was funded from the State’s Digital Restart Fund; or if the
project uses AI and mid-range or higher risks (according to the
framework) remain present after mitigations.
The Framework’s aims are fairly modest at this stage – analysing and
documenting a project’s AI risks.
Nevertheless, a failure by an agency to properly consider and apply the
framework may be an important conduct issue for consideration in any
maladministration investigation that subsequently arises in respect of an
agency’s use of an AI system.

Now let me turn explicitly to the ‘avoiding maladministration’ part of my presentation.
In our report, we distilled five key proactive steps agencies should take when introducing or reviewing their use of ADM.
• Step 1: Assemble the right team
The first is about assembling the right team, which must involve lawyers – the statute is the source of the power, and agencies need people expert in that – as well as policymakers and operational and technical experts.
• Step 2: Determine the role of staff at the outset
Deciding how far a process can be automated is not an easy question. It needs to be assessed in the context of the agency’s functions and legislation. Merely placing a human on top of a process may not be sufficient to properly authorise automated decision-making.
• Step 3: Ensure transparency
We recommended agencies identify early in the project how they will be transparent about their ADM use including providing reasons for decisions made using ADM where required.
• Step 4: Test early and often
We highlighted that just like other tools that support administrative decision-making, ADM systems need to be tested before going live, and at regular stages once they are in operation, to ensure decisions are legal, accurate and unbiased.
• Step 5: Consider legislative amendment
Finally, we recommended that agencies consider seeking legislative amendment where necessary or prudent if it might otherwise be legally risky to proceed with ADM.

Seeking express legislative authorisation for the use of ADM not only reduces the risks for agencies, it also gives Parliament and the public visibility of what is being proposed, and an opportunity to consider what other regulation of the technology may be required.

*****

Let’s get back to our furry friends where we started.
Back when the Ombudsman examined the use of drug detection dogs, the methodology used looked something like this:
• analysis of records kept by police on their use of drug dogs,
• directly observing police using drug dogs,
• reviewing court documents about the warrants for drug dog operations,
• reviewing transcripts and judgments of cases where charges were
brought following drug dog detection,
• consulting with a range of community groups and police officers of
various ranks, and
• examining complaints about police utilising drug dogs.
It was also useful, no doubt, to have a fairly good understanding of dogs generally, as well as obtaining some information about the proper training and handling of dogs.
What I think is interesting is what’s not on this list – at no point did the
Ombudsman engage expert scientists or veterinarians to try to understand the underlying mechanism of a dog’s nose, to work out exactly how it might be detecting drugs.
Now investigating AI may be different, depending on the circumstances, and it may be necessary for investigators to really ‘get under the hood’.
And we shouldn’t take the animal intelligence/artificial intelligence analogy too far.
However, there is one key aspect of both cases that I think is the same and if there is one point to take away from what I have said today it is this: when an ombudsman is investigating maladministration – we are always, and without exception, really investigating the conduct of people.

When we looked at drug detection dogs, we were concerned about the conduct of Police – their decisions and actions: how they used drug dogs, how they trained them, how they relied on them, and so on.
We weren’t investigating the dogs. If a dog indicated someone who wasn’t in fact in possession of any drugs, we wouldn’t say that the dog had engaged in maladministration. The question is whether Police were wrong to rely on the dog’s indication.
It’s the same thing with AI. We are not investigating the AI as such. We are investigating the people who made the decision to design it, to test it, to deploy it, to train it, to use it, to consider its outputs, to rely on those outputs.
And when we look at those people’s conduct – the touchstones we turn to are the familiar and long-standing principles of administrative law and norms of good administrative practice – asking ourselves whether the conduct is lawful, reasonable, non-discriminatory, and just.
[END]

1 The report noted that no records of the actual number of people screened were available: ‘Review of the Police Powers (Drug Detection Dogs) Act 2001’, NSW Ombudsman (Report).



Make a general comment

We request that the personal contact details of Mr Clayton (email and phone number) be redacted from any published copy.

Do you agree with the definitions in this discussion paper? If not, what definitions do you prefer and why?

Refer to submission attached.