Perceptions about analytics-aided audit quality
Audit firms are increasingly performing audits with data and analytics tools and techniques, and in recent years they have invested substantial resources in new technologies. However, auditors have expressed concerns about how external reviewers will scrutinize these approaches. New research supported by the AICPA’s Assurance Research Advisory Group provides insights on this topic.
The research was performed by:
Scott A. Emett, Ph.D., assistant professor of accounting at the W.P. Carey School of Business at Arizona State University.
Steven E. Kaplan, Ph.D., accounting professor at the W.P. Carey School of Business at Arizona State University.
Elaine G. Mauldin, CPA, Ph.D., accounting professor at the Trulaske College of Business at the University of Missouri and president-elect of the American Accounting Association.
Jeffrey S. Pickerd, CPA, Ph.D., assistant professor at the Patterson School of Accountancy at the University of Mississippi.
The researchers conducted two experiments with audit partners and managers who had experience as AICPA peer reviewers or firm quality control reviewers and were familiar with data and analytics approaches. In the experiments, participants conducted a hypothetical external engagement review in which the audit team employed either a data and analytics audit approach or a traditional audit approach.
Their results across the two experiments provided evidence that experienced auditors acting in the role of external reviewers judge data and analytics approaches to be lower in quality than traditional approaches, even though the level of audit assurance was constant. Additional analyses of their first experiment suggest the results were driven by external reviewers’ relying on the “effort heuristic,” which is the tendency to judge quality based on perceived effort. Their second experiment evaluated an intervention that successfully reduced reviewers’ reliance on the effort heuristic.
Overall, their evidence substantiates auditors’ concerns, identifies a specific cause for the concern, and introduces an intervention that addresses the concern.
The researchers answered questions about their research and findings in a Q&A session with the JofA. This is an edited version of that conversation.
What motivated you to do this study?
Emett: There has been an interesting dynamic within the audit profession over the past few years, as audit firms invest billions of dollars in new data and analytics technologies they hope will transform their audit approaches. At the same time, those same firms have expressed concerns that external reviewers in charge of evaluating audit quality, including PCAOB inspectors and AICPA peer reviewers, may be applying unwarranted scrutiny when evaluating audit approaches that incorporate data and analytics tools. One of the main goals of our study was to test how external reviewers perceive these technologies when they are incorporated into audits.
Why did your research focus on data and analytics audit approaches within audit firms?
Mauldin: We are all interested in audit quality, and all our backgrounds include involvement in audit quality in one fashion or another. It seemed a natural fit to consider how data and analytics affects perceptions of audit quality.
Pickerd: Although we have seen firms making such large investments, it’s still an open question whether these tools can be implemented in practice and how they will change the nature of audits. Because of the great potential for data and analytics to transform how audits are done, we wanted to explore the costs and benefits and how to avoid the pain points in data and analytics approaches.
How did you define audit quality for your research?
Pickerd: We know audit quality can be measured in different ways. We looked at participants’ perceptions of the quality of the audit procedure performed, whether they believed the procedure could detect misstatements, and whether it changed legal liability risks. Each measure gets at a different aspect of audit quality, and we found interesting results on each individual component as well as on the composite audit quality measure.
Can you describe the two experiments covered by your research?
Kaplan: We set up experiments in which highly experienced audit partners and managers took the role of an external reviewer. The reviewers were given the results of audit procedures that were identical in their statistical inferences but were presented using either a traditional approach or a data and analytics approach.
Relatedly, while data and analytics approaches are used to identify various exceptions in the population, they are often followed up with additional sampling. So using data and analytics changes, but doesn’t avoid, sampling. We were interested in how external reviewers look at sampling results when sampling is involved in both traditional and data and analytics approaches. Does their perception of quality change because the nature of the sampling changes in the two approaches?
Emett: The first experiment had two goals. The first was to answer the big-picture question: Do external reviewers perceive audit quality to be lower when the audit team uses data and analytics audit approaches? Auditors with an average of 17 years of experience assumed the role of an external reviewer performing an engagement review. Half evaluated an audit with data and analytics tools, and half evaluated an audit without these tools. It was important to keep both approaches on a level playing field. We didn’t want one approach to be objectively better or worse than the other, so we held constant the level of audit assurance provided by the two approaches.
Our second goal was to explore reasons why external reviewers might evaluate audits differently when audits incorporate data and analytics tools. We hypothesized that external reviewers might use the “effort heuristic,” meaning reviewers ask themselves how much manual effort the engagement team exerted on the audit and treat perceived effort as a cue to audit quality. Psychological research shows the effort heuristic is commonly used, but it can be problematic for external reviews of data and analytics approaches because those approaches provide a high level of audit assurance without high engagement team effort. Reviewers may be punishing data and analytics approaches that don’t require much manual effort from the engagement team.
Mauldin: Theories of psychology indicate that the effort heuristic is primarily unconscious. These are beliefs we bring with us when evaluating a task. I found it fascinating that firm investment in data and analytics tools was ignored. At a firm level, there’s a lot of cost and effort going into data and analytics tools. But as external reviewers review individual audits, they appear more focused on the manual effort in the audit procedures and not on the firm’s investment as a whole in data and analytics.
Emett: Experiment 2 had two goals. We wanted to provide additional, stronger evidence of the effort heuristic and to test an intervention that could potentially reduce reviewers’ reliance on it. A key research question was, “If we prime participants with the idea that high-quality audits can be done without extensive manual effort, will they stop relying on the effort heuristic when evaluating data and analytics audit approaches?”
The average audit experience in Experiment 2 was 24 years. We set up the second experiment the same as the first, with one exception. Before the participants evaluated the different audit approaches, we randomly assigned them to read one of two speeches: one emphasized the importance of audit effort, and the other emphasized audit execution. Then they evaluated either traditional or data and analytics audit approaches.
Mauldin: The audit execution speech didn’t mention data and analytics tools. It was much more subtle: the concept of working smarter, not harder. Neither speech mentioned data and analytics tools, because doing so might have affected participants’ perceptions.
What were the results of the experiments?
Emett: In Experiment 1, participants perceived the data and analytics audit approach to be lower in quality than the traditional audit, and we found evidence consistent with their use of an effort heuristic: they perceived that data and analytics approaches entailed lower effort, so they in turn perceived them to be lower in quality. To the extent we can generalize, this could discourage audit firms from applying data and analytics approaches if their work is judged to be of lower quality.
In Experiment 2, when participants read the audit effort speech, the results mirrored Experiment 1. Data and analytics audit approaches were evaluated as lower quality because participants were using an effort heuristic. When they read the audit execution speech, they no longer fell prey to the effort heuristic and evaluated the data and analytics approach to be of similar quality to the traditional audit approach.
Kaplan: The speech wasn’t meant to change perceptions of effort. It did change whether effort was a signal of audit quality. When they saw analytics, they perceived quality as lower. But with the intervention, reviewers were far less likely to use effort as a signal to judge audit quality.
What did you learn in terms of the results?
Kaplan: Firms invest in technology to create tools intended to be simultaneously more effective and more efficient. Data and analytics fits that mold because presumably it’s a more efficient, technology-enabled process requiring less human effort. Part of our interest was to see whether, when firms invest in technology, external reviewers penalize auditors for using technology-enabled approaches that allow the engagement team to be more efficient and apply less direct effort to the engagement. Does it change perceptions of effort and quality? Without the intervention, yes. Firms considering investing in technology have reason to be concerned that their audit quality may be judged lower. The question is what possible interventions firms can consider internally and when working with reviewers.
What do you mean by audit execution, and how is it different from effort?
Emett: The effort heuristic has two links: participants perceived data and analytics approaches as lower effort, and they believed that higher effort equals higher audit quality. The second experiment tried to break the second link and persuade participants that higher effort doesn’t necessarily lead to higher audit quality. The execution speech in Experiment 2 was given by a respected member of the audit community and argued that higher effort doesn’t necessarily mean higher audit quality but that better audit execution improves quality. There are a number of audit procedures that entail a lot of effort but don’t improve audit quality very much. If auditors perform the types of audit procedures that improve audit quality, the result will be a higher-quality audit regardless of the effort.
We made a conscious choice as we designed Experiment 2 to avoid talking about specific audit procedures. We didn’t want to recommend data and analytics approaches, but rather discussed philosophically the relationship between audit effort and quality.
Do you envision that the results of your research will have an impact on standard setters?
Mauldin: We are not saying the standards themselves are at fault or bad. We are trying to bring to light the need to recognize old-school thought processes and to find ways that firm leaders and AICPA training for external reviewers can encourage thinking about all of this. We want to highlight a problem and a potential solution so that practice can think through how to encourage use of these tools without reviewers being a drag on them.
Could your research influence the audit standards themselves?
Emett: Not the standards, per se. Our message to the profession relates to the old ways of thinking that a lot of manual effort is required for a quality audit. The audit profession as a whole may need to adjust to the idea that with new technology, high-quality audits can be performed with much less engagement team effort than in the past. We all believe the audit profession can adjust to this idea through training, guidance, awareness, etc. I’m not sure standard setting is the best way to do that.
Based on your research findings, what would your advice be to firm leaders?
Pickerd: There are a variety of ways to promote this messaging — group discussions, training, guidance, general awareness, and promotion of the idea that there can be high-quality audits with lower manual audit effort because of data and analytics approaches. Otherwise people will go back to their old way of thinking that manual audit effort is linked to audit quality.
Mauldin: One thing that occurs to me is that firm leaders might talk to external reviewers a little differently as they turn over their audit procedures to the reviewers: talk about execution rather than effort, and about the manual effort in the audit versus all the firm effort that went into designing the procedures.
What would your advice be for external reviewers?
Kaplan: It’s a difficult message because reviewers’ biases are grounded in experience. In general, effort does matter and leads to good outcomes. It may be hard for external reviewers to address this because it is so intuitive to them. It’s more about the infrastructure of the external review program and the role of education and training. The AICPA and PCAOB have training and operational guidance that could advise reviewers approaching an audit: If the firm you are reviewing is using data and analytics, you may need to reconsider your mindset, be more open-minded, and consider the firm’s investment in technology and the capabilities of technology-enabled approaches; effort may not be a good signal of audit quality.
Emett: Thinking about this strategically from the perspective of the audit firm, one implication is that if external reviewers continue to use an effort heuristic in the immediate future, firm leaders can’t, at least for now, use data and analytics audit procedures merely to hold audit quality constant and reduce the cost of an audit. They have to use data and analytics to improve audit quality. We held audit assurance constant across the two conditions in our experiment. But if we had set audit assurance higher in the data and analytics condition, maybe reviewers wouldn’t have judged it lower in audit quality. Audit firms need to use data and analytics approaches to go above and beyond what they’ve done in the past, or they can expect some pushback from external reviewers.
How can your findings affect audit quality going forward?
Emett: I hope our findings can help alleviate some of the tensions that arise between audit firms and external reviewers. There have been some great recent research papers providing evidence that there is quite a bit of tension between them specifically on the issue of technology adoption and data and analytics audit approaches. External reviewers should consider evaluating data and analytics tools based on their merits, rather than the effort heuristic, and then firms may accelerate their efforts in adopting these tools and ultimately improve audit quality.
Pickerd: As we help external reviewers evaluate data and analytics tools on their merits and move away from the effort heuristic, reviews will be more focused on audit quality, and audit quality will improve.
Do you plan to do additional research on this or a related topic?
Kaplan: We examined one type of intervention, which targeted the link between effort and perceived audit quality. We considered a different intervention that would change people’s perceptions of effort itself. There is effort by the engagement team, but there is also effort by the firm to develop data and analytics tools. Another potential intervention, which we thought about but didn’t look at explicitly, would help reviewers broaden the definition of audit effort to include the total effort by the firm and the engagement team related to data and analytics tools. There is definitely more work to be done in this area.
— Maria L. Murphy, CPA, is a freelance writer based in North Carolina. To comment on this article or to submit an idea for another article, contact Ken Tysiac, the JofA’s editorial director at Kenneth.Tysiac@aicpa-cima.com.