- The emergence of automated engines and AI to perform business valuation (in whole or part) presents both opportunities and risks for accounting firms.
- On the upside, these tools promise faster, more cost-effective valuation processes and automatic generation of consistent and professional valuation reports.
- However, these tools also come with a significant hidden risk – potential non-compliance with the mandatory Code of Ethics, and professional and engagement standards.
By Richard Stewart OAM FCA
The digital revolution is transforming the accounting profession – and business valuation is not immune. Valuation tools range from software that automates processes and calculations to artificial intelligence (AI) engines incorporating machine learning and/or large language models such as ChatGPT. The emergence of these automated engines to perform business valuations presents both opportunities and risks for accounting firms.
On the upside, these tools promise faster, more cost-effective valuation processes and automatic generation of consistent and professional valuation reports. Their ability to efficiently digest volumes of related information into a meaningful resource which leverages off, in some cases, global experience and expertise is enticing and may improve outcomes. For those without in-house valuation expertise, they might also offer less reliance on outsourced specialists.
However, automated third-party tools also come with a significant hidden risk that must be addressed – potential non-compliance with the mandatory Code of Ethics, and professional and engagement standards.
“The complexity and limited transparency of automated tools means members cannot unquestioningly rely on them for valuations and also comply with the obligations of the Code of Ethics and APES 225/AES-2.”
How automated valuation tools work
Automated valuation engines and AI tools aim to streamline valuations by combining real-time data, client-specific inputs and complex algorithms to generate valuations and client reports.
Understandably, where these tools are externally developed, the algorithms are not generally available for close examination. After all, they form a critical part of the tool provider’s intellectual property and commercial advantage. Even if they were openly available, their high level of complexity would mean that few would be able to confidently untangle how they work. In the case of AI engines and ChatGPT there’s also little to no visibility over the veracity of data training the algorithms.
This means that users of these tools are unlikely to have a full understanding of the calculations and assumptions being used to reach their valuation outcomes. Instead, they need to place their trust in the third-party provider getting it right. And herein lies the issue.
The issue with automation
In Australia, all members of CA ANZ, CPA Australia and the Institute of Public Accountants (IPA) are required to comply with the Code of Ethics and other pronouncements set by the Accounting Professional & Ethical Standards Board (APESB). Members residing or practising in New Zealand are subject to (amongst other things) the Code of Ethics and professional and engagement standards issued by the New Zealand Institute of Chartered Accountants (NZICA).
Professional competence and due care
The Code of Ethics (both APESB and NZICA) requires members to comply with the fundamental principle of professional competence and due care. This fundamental principle is reiterated and expanded on in the professional and engagement standards specific to business valuation: APES 225 Valuation Services, issued by APESB, and AES-2 Independent Business Valuation Engagements issued by NZICA. This obligation extends to the use of any third-party service or product, by assessing the third party’s professional competence and objectivity, as well as the appropriateness and reasonableness of the work performed.
The complexity and limited transparency of automated tools means members cannot unquestioningly rely on them for valuations and also comply with their mandatory obligations. Rather, members should use their professional judgement to critically evaluate the outputs of automated tools, and document how they have done this.
Confidentiality and privacy
There’s another significant risk when using third-party tools – confidentiality. This is particularly the case for non-proprietary engines: for example, it isn’t clear how data input to ChatGPT is stored, accessed and distributed. Members using third-party software must be very mindful of complying with the fundamental ethical principle of confidentiality, and with applicable privacy laws in their jurisdiction. Even proprietary AI software, such as that developed by large firms, is not without risk. It may reduce or eliminate the risk of not understanding the source of data, but firms still need to ensure they have their clients’ permission to use their data in this way, or risk falling foul of applicable standards and regulations.
Our recommendation to members
Does this mean that accounting professionals should avoid automated valuation and AI tools? Not at all – when leveraged appropriately they can be very beneficial. However, the scope of their use needs to be clearly considered, defined and critically evaluated; the confidentiality of client information and data must be maintained; and appropriate permissions must be sought.
So take care if you are considering using a third-party automated valuation tool. If it’s a non-proprietary tool, ensure confidentiality and privacy are appropriately addressed; if it is a proprietary tool, make sure you have permission to use your client’s data in this way. In all cases, ensure you have undertaken sufficient work to meet the ‘professional competence and due care’ test in the Code of Ethics and APES 225/AES-2. Test the reliability and accuracy of the tools regularly to ensure that the conclusions drawn are appropriate, and document the work you have done to satisfy yourself that reliance on the output of automated tools is appropriate in the circumstances.