Literature Review
Behavioral challenges to professional skepticism in auditors’ data analytics journey
Xiaoxing Li
Vrije Universiteit Amsterdam, Amsterdam, Netherlands
Open Access

Abstract

The aims of this paper are to inform audit practice and academia about the potential behavioral challenges to the application of auditors’ professional skepticism when using audit data analytics (ADA) and to discuss future research opportunities. This is accomplished by reviewing relevant audit research and discussing the potential challenges from five perspectives: auditors’ attitudes toward ADA, data characteristics, anomalies identified by ADA, auditors’ mindsets, and the social contexts and interactions involved in ADA practice. Although ADA bring many benefits to audit practice, they simultaneously raise many challenges to the application of appropriate levels of auditor professional skepticism. Being aware of and prepared for these potential behavioral challenges is critical to maximizing the benefits of ADA to professional skepticism and, ultimately, audit quality.

Keywords

Audit data analytics (ADA), behavioral challenges, professional skepticism

Practical relevance

This paper is relevant for audit practice by highlighting and informing the audit profession about the potential behavioral challenges to the application of professional skepticism when using ADA. It also discusses possible mitigation methods that the academic literature offers to audit practice.

1. Introduction

Audit firms around the globe have invested heavily in a variety of audit technologies (e.g., Alles and Gray 2016; Deloitte 2016; KPMG 2016, 2019; EY 2018; PwC 2020; Eilifsen et al. 2020; Austin et al. 2021). Of these technological developments, audit data analytics (ADA) in particular are receiving increasing attention in auditing because they facilitate the incorporation of larger and more complex datasets as well as more diverse data sources into audit testing (e.g., FRC 2017). This paper adopts the ADA definition by the AICPA (2015, 2017) as being “the science and art of discovering and analyzing patterns, identifying anomalies, and extracting other useful information in data underlying or related to the subject matter of an audit through analysis, modeling, and visualization for the purpose of planning or performing the audit” (AICPA 2015, p. 92; 2017, p. 1).

ADA are expected to facilitate more effective and efficient auditor judgment (e.g., AICPA 2017). Instead of incorporating only a limited number of data sources, ADA enable auditors to simultaneously compare data from a wider variety of sources (e.g., prior year balances, budgets, industry data, data from related accounts, and non-financial measures) as suggested by auditing standards such as ISA 520 (IAASB 2018). Hence, the use of ADA will enable auditors to potentially gain deeper insights into their clients’ data and obtain a better understanding of the clients’ business and environment compared to traditional methods (e.g., Brown-Liburd et al. 2015; Cao et al. 2015; Krahel and Titera 2015; Yoon et al. 2015; FRC 2017; IAASB 2020c, 2021; Austin et al. 2021; PCAOB 2021a). The use of ADA is also expected to improve audit efficiency. Using ADA potentially reduces the audit time and cognitive effort needed for processing the information (e.g., Anderson et al. 2020). Therefore, the speed at which auditors can identify inconsistencies, anomalies, or red flags indicating higher risk of material misstatements will potentially increase substantially (e.g., AICPA 2017).

Of particular interest in the current paper, ADA are expected to improve the appropriate application of auditors’ professional skepticism, ultimately improving audit quality. Auditing standards such as ISQM 1 (IAASB 2020a) and ISA 220 (IAASB 2020b) define professional skepticism as an attitude that includes a questioning mind and a critical assessment of audit evidence. Using ADA potentially improves auditors’ judgment quality because it helps auditors to efficiently identify patterns and inconsistencies, thereby motivating greater application of professional skepticism and increasing the likelihood of identifying material misstatements (IAASB 2020c).

Besides potential benefits, research shows that ADA may also bring challenges to the appropriate application of professional skepticism (e.g., Appelbaum et al. 2017; Rose et al. 2017; Barr-Pulliam et al. 2020; Holmstrom 2020; Austin et al. 2021; Commerford et al. 2021; Holt and Loraas 2021). Being alert to these potential behavioral challenges and exploring approaches to mitigating them is critical for audit practice to fully realize the benefits of ADA. Therefore, this paper’s objectives are to discuss the potential behavioral challenges that ADA create for professional skepticism as identified in the literature, review possible approaches to mitigating those challenges, and suggest future research opportunities. Specifically, I discuss the potential behavioral challenges from the following five perspectives:

  • auditors’ attitudes toward ADA,
  • data characteristics,
  • anomalies identified by ADA,
  • auditors’ mindsets when using ADA, and
  • social contexts and interactions involved in ADA practice.

First, forming an appropriate attitude toward ADA is important, since both under-reliance and overreliance on ADA may impede the appropriate exercise of professional skepticism (section 2). Second, auditors should be aware of how data characteristics (e.g., data reliability and data relevance) may influence their interpretations of audit evidence obtained from ADA, potentially distorting the appropriate exercise of professional skepticism (section 3). Research further shows that the anomalies identified by ADA (e.g., their larger number, false positives, and false negatives) may influence auditors’ application of professional skepticism (section 4). Next, auditors need to adopt appropriate mindsets when using ADA, since mindsets can influence their judgment quality (section 5). Finally, the social contexts and interactions between auditors and other stakeholders in the ADA journey may also influence the exercise of professional skepticism (section 6). I conclude the paper with a summary and discussion in section 7.

2. Auditors’ attitudes toward ADA

Attitude is usually defined as “an evaluative integration of cognitions and affects experienced in relation to an object” (Crano and Prislin 2006, p. 347). The attitudes auditors form during an audit can influence their professional skepticism (e.g., Nolder and Kadous 2018). In fact, auditing standards define professional skepticism as a skeptical attitude comprising a questioning mind and a critical assessment of audit evidence (IAASB 2020a, 2020b). Therefore, adopting appropriate attitudes in the auditing process is critical to auditors’ skeptical judgment quality and hence predicts their subsequent skeptical intentions and behaviors, ultimately determining audit quality (e.g., Nolder and Kadous 2018). Auditors’ use of ADA adds complexity to this requirement, since auditors also need to form an appropriate attitude toward ADA in their cognitive and evaluative responses to the audit evidence it generates.

One example of a potentially inappropriate attitude in this regard is auditors exhibiting algorithm aversion in their use of ADA. Specifically, advanced ADA algorithms (e.g., artificial intelligence) employ machine learning techniques to integrate, model, and analyze large and diverse datasets, thereby assisting auditors with challenging tasks such as evaluating complex accounting estimates, assessing fraud risk, and estimating the financial distress relevant to going concern opinions (e.g., Murphy 2017; Gepp et al. 2018; Commerford et al. 2021). However, prior research suggests that auditors may suffer from an aversion to algorithms and, as a result, under-rely on the evidence provided by ADA (e.g., Önkal et al. 2009; Commerford et al. 2021). Algorithm aversion arises in part because humans perceive algorithms as incapable when a task involves a high level of subjectivity (Castelo et al. 2019; Yeomans et al. 2019). Commerford et al. (2021) find that auditors exhibit algorithm aversion when artificial intelligence is used in auditing complex accounting estimates. Specifically, auditors who receive evidence contradicting management’s position from an artificial intelligence system, compared with the same evidence from a human specialist, propose smaller audit adjustments to management’s complex accounting estimates. As such, auditors’ susceptibility to algorithm aversion may even lead to increased reliance on management’s evidence. Therefore, algorithm aversion may undermine auditors’ questioning mind and their subsequent skeptical behavior (e.g., Commerford et al. 2021).

While under-reliance on ADA can be problematic, it is equally important for auditors to avoid overreliance on ADA (IAASB 2021). Although there is little research on algorithm appreciation (Logg et al. 2019) in the auditing literature, regulators and standard-setters express concerns about the possibility of auditors’ overreliance on audit technologies and its potential damage to the critical assessment of audit evidence from ADA (IAASB 2021).

As a result, it is critical for auditors to adopt an appropriately balanced attitude toward ADA. The auditing literature has identified some possible solutions to increase auditors’ reliance on audit technologies in general or audit evidence from ADA, such as increasing transparency of those audit technologies (Holmstrom 2020), and increasing auditors’ control over and hence perceived ownership of those tools (Dietvorst et al. 2018; Holmstrom 2020).

Further research could examine behavioral interventions that can be used to enhance auditors’ appropriate reliance on ADA. Specifically, future research may explore what features can be added to current ADA tools to prime auditors’ appropriate level of reliance when using those tools.

3. Data characteristics

In this section, I discuss how auditors’ perceptions of the reliability and relevance of data inputs processed by ADA may influence their evaluations of audit evidence obtained from ADA and hence influence their application of professional skepticism.

3.1. Data reliability

This section discusses two data characteristics that may influence auditors’ perceptions of data reliability: data sources and data structure.

3.1.1. Data sources: internal versus external data

Client-internal data sources, such as client transaction records, ledger accounts, and general ledgers, will probably continue to be the primary data sources for audit testing under ADA approaches. However, external data from multiple sources, such as industry data from third-party data providers or big data from social media platforms, can also be processed in ADA testing to complement current internal data (e.g., Yoon et al. 2015; PCAOB 2021b). Incorporating data from multiple sources potentially increases the likelihood of obtaining contradictory evidence, hence reducing auditors’ tendency to simply confirm management numbers.

On the one hand, data obtained from sources outside the client entity may be perceived as more independent and reliable, and therefore the audit evidence obtained from external data may be regarded as more reliable (e.g., AICPA 2017; IAASB 2019; PCAOB 2021b). On the other hand, controls over external data may be insufficient or ineffective, potentially reducing the reliability of audit evidence obtained from external sources (e.g., AICPA 2017; IAASB 2019; PCAOB 2021b). Given these uncertainties about the varying reliability of external data, contradictory evidence obtained from external sources may be evaluated as insufficient to justify auditors’ positions against their clients. Therefore, even though ADA make it possible to incorporate external data from various sources into audit evidence, whether this can enhance the application of professional skepticism remains uncertain.

Future research could examine how auditors perceive data source reliability, given varying data sources, and how such perceived reliability influences their application of professional skepticism. In particular, further research could examine whether and how incorporating external data into analyses enhances or dampens professional skepticism.

3.1.2. Data structure: structured versus unstructured data

ADA allow auditors to incorporate both structured and unstructured data. Structured data, such as financial statements, journal entries, and general ledgers, have standardized forms of presentation and interpretation, while unstructured data, such as emails, usually lack such standardization. Some unstructured data can reveal rich information capturing nuances in personal emotions and motivations (e.g., Holton 2009; Moffitt and Vasarhelyi 2013; Cao et al. 2015). Therefore, the use of ADA may expand auditors’ information sets and their motivation to incorporate traditionally less obtainable data into audit testing, thereby enhancing their professional skepticism.

However, unstructured data are inherently more ambiguous and hence open to multiple interpretations, increasing the perceived difficulty of generating plausible explanations for the fluctuations in this type of data (e.g., Luippold and Kida 2012). Hence, incorporating unstructured data into ADA and interpreting the results may be a challenging task for auditors. This perceived difficulty may even increase auditors’ tendency to anchor on the management explanations and hence hinder their judgment quality and application of skepticism (e.g., Rose et al. 2020).

Holt and Loraas (2021) find that unstructured data lead to more conservative judgments than structured data. Specifically, auditors presented with unstructured data, compared with structured data, assess risk as higher and are more likely to recommend inventory write-downs (Holt and Loraas 2021). From this perspective, unstructured data seem to positively influence professional skepticism. Of note, in Holt and Loraas’s (2021) study, auditors are preassigned to conditions providing them with either structured or unstructured data; that is, auditors do not have the opportunity to choose which data they prefer. In audit practice, however, auditors have some discretion over selecting the information they intend to use as audit evidence. Hence, if auditors are intolerant of information ambiguity, they may actively avoid attending to unstructured data. Therefore, the information ambiguity of unstructured data, rather than enhancing auditors’ conservatism and judgment, may reduce auditors’ intention to incorporate those data, causing them to miss opportunities to gain useful insights.

Further research is recommended to examine auditors’ preferences when choosing between structured and unstructured data for their analyses. Research could also provide empirical evidence on auditors’ current practice of incorporating unstructured data into ADA testing.

3.2. Data relevance

Besides data reliability, data relevance is another critical determinant of the appropriateness of audit evidence (AICPA 2017). Although larger and more complex datasets can be incorporated into ADA testing, not all available data are highly relevant to the subject matter of an audit. Including irrelevant or weakly relevant data may actually impair auditor judgment and the application of professional skepticism. Auditors incorporating irrelevant information into their testing may suffer from the dilution effect, which occurs when irrelevant information negatively influences decision makers’ evaluation of relevant information (e.g., Nisbett et al. 1981; Hackenbrack 1992; Hoffman and Patton 1997; Waller and Zimbelman 2003). For example, Hackenbrack (1992) finds that auditors’ fraud risk assessments become less extreme in the presence (vs. absence) of nondiagnostic, irrelevant evidence. Incorporating weakly relevant information into audit testing can also distort auditors’ assessments of strongly relevant information, a phenomenon known as the averaging effect (e.g., Lambert and Peytcheva 2020).
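The dilution and averaging effects described above can be made concrete with a stylized numerical sketch: if an overall assessment behaves like an average of evidence-strength cues, nondiagnostic items drag the assessment toward less extreme values. The strength scores below are hypothetical and serve only to illustrate the mechanism, not to model any cited study.

```python
# Stylized illustration of the dilution/averaging effect. The "strength"
# scores assigned to evidence items are hypothetical.

def averaged_assessment(strengths):
    """Overall assessment modeled as a simple average of cue strengths."""
    return sum(strengths) / len(strengths)

diagnostic = [0.9, 0.8]        # two strongly relevant red flags
irrelevant = [0.1, 0.1, 0.1]   # nondiagnostic items with near-zero strength

without_noise = averaged_assessment(diagnostic)
with_noise = averaged_assessment(diagnostic + irrelevant)

# Adding irrelevant items dilutes the assessment toward the middle.
print(round(without_noise, 2), round(with_noise, 2))
```

Under this averaging model, the assessment falls from 0.85 to 0.40 once the three nondiagnostic items are included, mirroring the less extreme fraud risk assessments Hackenbrack (1992) reports.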

In conclusion, auditors need to pay close attention to distinguishing more relevant from less relevant data before inputting them into ADA tests. As the volume and complexity of data increase dramatically in the Big Data era, clearly discriminating levels of data relevance can be difficult. Further research is needed to investigate whether and how auditors distinguish relevant data from irrelevant or weakly relevant data when selecting inputs to ADA. It is also recommended to explore potential interventions to mitigate the potential dilution and averaging effects on auditor judgment when using ADA.

4. Anomalies identified by ADA

In this section, I discuss how the anomalies identified by ADA may influence auditor skepticism, focusing on the larger number of anomalies, false positives, and false negatives.

4.1. The larger number of anomalies

As the size and complexity of datasets included in ADA testing increase, the number of anomalies identified is also likely to increase dramatically (e.g., Brown-Liburd et al. 2015). For example, the number of cases violating controls is bound to increase when the testing is based on the full population rather than on a sample (e.g., Brown-Liburd et al. 2015). However, given budget constraints, auditors will not have the resources to investigate all the anomalies identified by ADA. Besides, investigating all anomalies is likely to induce information overload, which possibly leads to suboptimal decision making (e.g., Alles et al. 2006; Rose et al. 2017).

Therefore, auditors will need to employ prioritization procedures in which they exercise cognitive effort and skeptical judgment to determine which anomalies will be further investigated and in which order (e.g., Brown-Liburd et al. 2015). Research could examine what factors auditors consider when they prioritize anomalies for further investigation so as to exercise their professional skepticism both efficiently and effectively.
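A prioritization procedure of this kind can be sketched as a simple ranking exercise. The sketch below ranks flagged anomalies by a crude risk ordering and greedily fills a fixed investigation budget; the scoring criteria, record fields, and hour figures are illustrative assumptions, not prescribed practice.

```python
# Illustrative sketch: selecting which ADA-flagged anomalies to investigate
# under a fixed time budget. All fields and figures are hypothetical.

def prioritize(anomalies, budget_hours):
    """Rank anomalies (control violations first, then by amount) and
    greedily select those that fit within the investigation budget."""
    ranked = sorted(
        anomalies,
        key=lambda a: (a["violates_control"], a["amount"]),
        reverse=True,
    )
    selected, hours_used = [], 0
    for anomaly in ranked:
        if hours_used + anomaly["hours"] > budget_hours:
            continue  # skip anomalies that would overrun the budget
        selected.append(anomaly)
        hours_used += anomaly["hours"]
    return selected

flags = [
    {"id": 1, "amount": 120_000, "violates_control": True,  "hours": 4},
    {"id": 2, "amount": 5_000,   "violates_control": False, "hours": 1},
    {"id": 3, "amount": 80_000,  "violates_control": True,  "hours": 3},
    {"id": 4, "amount": 300_000, "violates_control": False, "hours": 6},
]

to_investigate = prioritize(flags, budget_hours=8)
print([a["id"] for a in to_investigate])  # prints [1, 3, 2]
```

A real procedure would of course weigh materiality, account-level risk, and qualitative factors rather than this two-key sort; the sketch only shows how budget constraints force an explicit ordering of skeptical effort.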

4.2. False positives

False positives (i.e., type I errors) are those items or relationships identified as potential anomalies that, after further investigation, are determined to be reasonable and explained variations in the data (e.g., AICPA 2017; Johnson and Wiley 2019; Barr-Pulliam et al. 2020). Along with the larger number of anomalies identified by ADA, the presence of a larger number of false positives brings additional challenges to the exercise of professional skepticism (e.g., Cao et al. 2015; Earley 2015; Krahel and Titera 2015; Vasarhelyi et al. 2015; Wang and Cuthbertson 2015; Yoon et al. 2015; Alles and Gray 2016; AICPA 2015, 2017; Richins et al. 2017; Salijeni et al. 2019; Barr-Pulliam et al. 2020; Kipp et al. 2020; Austin et al. 2021; Krieger et al. 2021). Higher false positive rates of ADA indicate lower calibration and hence make investigating anomalies potentially excessively costly (e.g., audit reporting delays, budget overages, and strained client relations). Given the tight budget constraints auditors usually face, higher false positive rates may reduce auditors’ motivation to investigate the anomalies identified by ADA, thereby reducing their motivation for skeptical behavior (Barr-Pulliam et al. 2020).
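The false positive rate at issue here is simply the share of flagged items that investigation resolves as explained variation. A minimal sketch, with hypothetical investigation outcomes:

```python
# Minimal sketch: false positive (type I error) rate of an ADA test.
# The investigation outcomes below are hypothetical.

# For each ADA-flagged item, True means the follow-up investigation found a
# genuine issue; False means it was a reasonable, explained variation.
outcomes = [False, False, True, False, False, True, False, False]

false_positives = sum(1 for genuine in outcomes if not genuine)
false_positive_rate = false_positives / len(outcomes)

print(f"{false_positives} of {len(outcomes)} flags were false positives "
      f"(rate = {false_positive_rate:.0%})")
```

At this (hypothetical) 75% rate, three of every four investigations consume budget without surfacing a misstatement, which is the cost dynamic that Barr-Pulliam et al. (2020) link to reduced skeptical behavior.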

Since the presence of a larger number of false positives has become a concern in ADA practice, it is critical to explore how to counter the negative effects of false positives on the application of professional skepticism (e.g., Salijeni et al. 2019; Barr-Pulliam et al. 2020; Austin et al. 2021). Indeed, research finds that consistently rewarding auditors for their professional skepticism helps to motivate its exercise, but only when false positive rates are low (Barr-Pulliam et al. 2020). Therefore, improving the calibration of ADA (i.e., reducing false positive rates) seems to be the primary solution to this issue, and further research could examine what measures audit firms can take to reduce the false positive rates of ADA (e.g., Baader and Krcmar 2018). However, since this technological improvement process is expected to be complex, further research may also explore behavioral interventions that can enhance auditor professional skepticism when using ADA with higher false positive rates.

4.3. False negatives

Besides false positives, ADA may also produce false negatives (i.e., type II errors), that is, red flags that ADA fail to identify (e.g., Banerjee et al. 2009). For example, ADA may fail to identify an unusual transaction in client data that is later found, in the review or inspection process, to indicate a material misstatement. Compared with false positives, false negatives more directly threaten auditor reputation, increase litigation risk, and, most importantly, harm audit quality.

Since ADA enable the incorporation of larger and more diverse datasets, they are expected to reduce the risk of missing important information. However, the size of a dataset is not necessarily positively related to its completeness or quality, and therefore some relevant data may still fail to be included in ADA testing. Hence, false negatives can still emerge in ADA testing even when larger datasets are incorporated.
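The point that full-population testing does not eliminate false negatives can be made concrete with a toy example: if the extract fed to the ADA is incomplete, misstatements outside the extract can never be flagged, no matter how exhaustively the available records are tested. All identifiers below are hypothetical.

```python
# Toy example: false negatives persisting despite full-population testing,
# because the dataset supplied to the ADA is incomplete. Data hypothetical.

# Ground truth: transactions that actually contain misstatements.
misstated_ids = {104, 207, 551}

# The ADA tests every record it receives, but the extract omits id 551
# (e.g., a side agreement recorded outside the main ledger system).
ada_extract = {101, 102, 103, 104, 205, 206, 207, 430}

# Even a perfectly calibrated test can only flag what it can see.
flagged = ada_extract & misstated_ids
false_negatives = misstated_ids - ada_extract

print(sorted(flagged))           # misstatements the ADA could catch
print(sorted(false_negatives))   # the misstatement the ADA never saw
```

Here the ADA catches two of the three misstatements despite testing 100% of the records it received; the third remains a false negative because completeness of the input, not test coverage, is the binding constraint.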

Importantly, ADA’s capability of incorporating larger and more diverse datasets may create an illusion that the output is free of missed information. Psychological research indeed finds that the amount of information available enhances decision makers’ confidence in their judgments more than it improves the accuracy of those judgments (Tsai et al. 2008). If this is the case, auditors using ADA may be overly satisfied with their current information sets, which could result in overconfidence about their current evidence, possibly reducing their sensitivity to new information and hence their motivation for further information seeking (e.g., Desender et al. 2018, 2019). Auditors may even be motivated to ignore other important information that is not currently included in the ADA testing (e.g., Williams 2021).

Further research on auditors’ responses to false negatives in ADA could examine whether and how the incorporation of larger datasets changes auditors’ expectations of false negative rates. It is important to learn more about auditors’ awareness of the false negative rates of ADA and to explore what interventions can maintain auditor skepticism, especially their sensitivity to new information, when they perceive lower false negative rates in ADA testing.

5. Auditors’ mindsets when using ADA

A mindset refers to “a set of judgmental criteria and cognitive processes and procedures that produce a disposition or readiness to respond in a certain manner” (Griffith et al. 2015a, p. 54). Mindsets can influence auditors’ judgment quality and application of professional skepticism (e.g., Griffith et al. 2015a; Brewster and Bucaro 2020; Saiewitz and Wang 2020). Therefore, adopting inappropriate mindsets may diminish the benefits of ADA to audit practice. Although there is limited research on mindsets during ADA use (Cao et al. 2021), implications can be drawn from research in related auditing areas (e.g., Griffith et al. 2015a; Plumlee et al. 2015; Backof et al. 2018; Austin et al. 2020; Saiewitz and Wang 2020).

Mindsets potentially influence auditors’ ADA adoption decisions. Cao et al. (2021) find that auditors with growth mindsets, compared to auditors with fixed mindsets, are more likely to adopt ADA when inspection risks are higher. People with a growth mindset believe that abilities and intelligence can be developed through effort and learning, and hence usually attribute their successes (failures) to (a lack of) effort. In contrast, those with a fixed mindset believe that abilities and intelligence are fixed attributes that cannot be developed through effort, and therefore usually attribute their successes (failures) to (a lack of) ability or intelligence (Dweck and Leggett 1988; Elliott and Dweck 1988; Dweck 2008). In a high-inspection-risk environment, auditors with a growth mindset are more likely to adopt ADA because they recognize the learning potential. In contrast, auditors with a fixed mindset tend to be concerned about the potential negative implications of performance failures, which decreases their motivation to adopt ADA. Although Cao et al. (2021) do not examine the effect of growth versus fixed mindsets on professional skepticism, it can be expected that auditors with a growth mindset, compared with a fixed mindset, are more likely to investigate the anomalies identified by ADA because they are likely to regard investigations as good opportunities to improve their knowledge and expertise.

In addition to ADA adoption decisions, future research could examine how mindsets influence auditors’ assessment and evaluation of contradictory evidence from ADA. Griffith et al. (2015a) find that auditors in a deliberative mindset, compared to those in an implemental mindset, make higher-quality judgments in audits of complex accounting estimates. According to the mindset theory of action phases (Gollwitzer 2012), when making goal decisions, decision makers tend to adopt a deliberative mindset to broadly process available information in an open-minded and balanced manner, which “facilitates a broad consideration of the pros and cons of various alternatives” (Griffith et al. 2015a, p. 55). Once the goal is determined, decision makers turn to an implemental mindset, which “facilitates planning how, rather than whether, to execute a task or reach a goal” (Griffith et al. 2015a, p. 55). Therefore, auditors employing a deliberative mindset, relative to an implemental mindset, are less likely to exhibit bias in their information evaluation, which motivates their examination of contradictory information and identification of inconsistencies (Griffith et al. 2015a). Similar implications can be drawn from related studies, such as Austin et al. (2020) and Saiewitz and Wang (2020). Overall, research implies that employing mindsets that improve auditors’ examination of contradictory evidence is critical to improving their skeptical judgment and application of professional skepticism.

Future research could further examine whether the implications from prior literature generalize to ADA practice. In particular, future research should explore whether there are certain types of mindsets specifically related to ADA that auditors can employ to improve professional skepticism when using ADA.

6. Social contexts and interactions involved in ADA practice

In this section, I discuss how the contextual environment around auditors and their interactions with key stakeholders in the ADA journey potentially influence their judgment quality and motivation for skeptical behavior when using ADA. The potential factors discussed in the current section include tone at the top, the work of data specialists, the audit committee’s attitude, sophistication of the client’s information technology (IT) systems, and regulations.

6.1. Tone at the top

Supervisors can play a significant role in auditors’ ADA journey (e.g., Nelson et al. 2016; Kim et al. 2017; Dennis and Johnstone 2018). Research finds that subordinate auditors’ judgments and decisions are influenced by their supervisors (e.g., Peecher et al. 2010; Kim et al. 2017). Therefore, establishing an appropriate tone at the top emphasizing the importance of applying professional skepticism in general or specifically in ADA practice may significantly influence auditors’ skeptical behavior when using ADA.

Future research could examine what leadership style or tone at the top (e.g., transformational, transactional, delegative, participative, or authoritarian) is appropriate to encourage auditor professional skepticism when using ADA. Research could also investigate whether and how the way the tone at the top is expressed (e.g., explicitly vs. implicitly) influences auditors’ skepticism in ADA practice.

6.2. The work of data specialists

Since auditors usually lack the expertise to fully interact with emerging technologies (e.g., Walker and Brown-Liburd 2019), data specialists are likely to play an important role in auditors’ ADA journey. At the development and initial adoption stage, auditors may rely on IT auditors or data specialists to develop the ADA tests and therefore rely on the visual outputs or exception reports from ADA (e.g., Austin et al. 2021). In subsequent years of applying ADA, auditors are expected to appropriately reduce their reliance on data specialists’ work by developing their own data skills. This not only enhances their independence but also improves their understanding of the data analysis process, both of which are expected to benefit their application of professional skepticism (e.g., Holmstrom 2020).

Further research could provide empirical findings about auditors’ coordination and communication with data specialists in their ADA practice. Research could also examine how different forms of data specialists’ help (e.g., providing systematic training vs. helping on request, centralized vs. decentralized) influence the effectiveness of using ADA and professional skepticism.

6.3. The audit committee’s attitude

An audit committee’s attitude can also influence auditors’ ADA practice (FRC 2017). The FRC (2017) reports that auditors may feel pressure to adopt ADA to satisfy the audit committee’s expectations for applying ADA in the engagement. Research also finds that audit committee support can improve auditors’ application of professional skepticism (e.g., Brazel et al. 2021). Therefore, the audit committee’s expectations of and support for auditors’ use of ADA may motivate high-quality ADA practice. However, this motivating effect may be limited, since the de facto power within the client usually lies in the hands of management rather than the audit committee (e.g., Gold et al. 2018). Therefore, when the audit committee and management have conflicting expectations about auditors’ use of ADA, auditors are likely to engage in motivated reasoning toward management’s preference instead of the audit committee’s (e.g., Kadous et al. 2003).

Further research may examine which characteristics of the audit committee (e.g., expertise) influence auditors’ ADA practice. Research could provide evidence on what support audit committees can provide to auditors for applying ADA in their engagements. Research could further examine how auditors respond when management and the audit committee have contradictory attitudes toward and expectations of the ADA practice.

6.4. Sophistication of the client’s IT systems

The sophistication of the client’s IT systems largely influences auditors’ ADA adoption decisions and perhaps their judgment quality when using ADA. When the client has more sophisticated IT systems, auditors are more likely to use ADA in their audit testing (e.g., Eilifsen et al. 2020). Using ADA potentially improves auditors’ understanding of the client’s systems and hence their judgment quality. However, when the client’s IT systems are too advanced, auditors may experience increased information disadvantages compared to the client’s management and staff. In addition, if auditors lack the technical knowledge to develop an independent understanding of the client’s IT systems and related controls, they may become motivated to rely heavily on management’s explanations (e.g., Griffith et al. 2015b, 2021). In this circumstance, auditors may be unable to apply appropriate ADA techniques to test the related controls, potentially impairing their risk assessments and subsequent application of professional skepticism.

Further research could examine whether auditors rely inappropriately on information provided by the client when the client entity has more advanced IT systems and, if so, what interventions could reduce this potential overreliance.

6.5. Regulation

Regulators’ attitudes toward ADA can be especially important to auditors (e.g., Wang and Cuthbertson 2015; Salijeni et al. 2019). Auditors have been shown to respond to inspection risks by allocating more effort to areas with higher inspection risks (Detzen et al. 2020). Especially in the absence of a specific standard guiding ADA practice (e.g., Wang and Cuthbertson 2015; Kipp et al. 2020), auditors may worry about regulators second-guessing their application of ADA (e.g., Cao et al. 2021). Therefore, setting specific standards and guidance for ADA practice can be important to motivate auditors’ adoption of ADA and improve their exercise of skepticism when using ADA (e.g., AICPA 2015, 2017). However, the form of such standards may need to be carefully considered, since an inappropriate form could backfire (e.g., Peecher et al. 2013; Kang et al. 2020). For example, one proposed way to encourage auditors’ use of innovative audit procedures (e.g., ADA) is to implement an Audit Judgment Rule (AJR), which requires auditors to defend the rigor, thoughtfulness, and deliberateness of their judgments and thereby aims to protect them from being second-guessed when using those innovative procedures. However, Kang et al. (2020) find that this specific requirement of the AJR may unintentionally reinforce auditors’ preference for more traditional procedures, because it potentially activates auditors’ defensibility goal, which is more likely to be achieved by maintaining their current audit approaches than by employing innovative procedures.

Future research could examine how the characteristics of standards influence auditors’ adoption decisions and audit practice when using ADA, and what unintended consequences proposed ADA standards may have for auditors’ judgment and application of professional skepticism.

7. Conclusion

Many stakeholders believe that the time has come for auditors to embrace technology (e.g., Alles and Gray 2016; Deloitte 2016; KPMG 2016, 2019; EY 2018; Eilifsen et al. 2020; PwC 2020; Austin et al. 2021). Although the adoption and use of ADA bring many potential benefits to audit practice, they also create many challenges. This paper discusses the behavioral challenges to the appropriate application of professional skepticism from five perspectives: auditors’ attitudes toward ADA, data characteristics, anomalies identified by ADA, auditors’ mindsets, and social contexts and interactions involved in ADA practice. First, an inappropriate attitude toward ADA may negatively influence auditors’ assessment of audit evidence from ADA and therefore their application of professional skepticism. Second, auditors’ evaluation of audit evidence from ADA may be influenced by the perceived reliability (e.g., sources and structure) and relevance of data inputs. Since auditors’ exercise of professional skepticism can be negatively influenced by unreliable and irrelevant data inputs, judgment and effort may be necessary to evaluate and screen the data before analysis. Third, auditors’ judgment can also be influenced by the anomalies identified by ADA, such as the large number of anomalies, false positives, and false negatives. Fourth, adopting appropriate mindsets is expected to be critical to enhancing the exercise of professional skepticism when using ADA. Finally, auditors should be aware that their social contexts and interactions with others (e.g., tone at the top, the work of data specialists, the audit committee’s attitude, sophistication of the client’s IT systems, and regulations) in their ADA journey can also influence their exercise of professional skepticism.

This paper has recommended future research linking ADA and auditors’ application of professional skepticism from these five perspectives. For example, further research could examine potential behavioral interventions to prime auditors’ appropriate reliance on ADA. Future research is also recommended to investigate the potential effects of data characteristics on auditors’ assessment of evidence obtained from ADA. Researchers could continue to explore measures to mitigate the potential negative influences of ADA (e.g., the large number of anomalies and false positives) on auditors’ application of professional skepticism. Future research investigating the appropriate mindsets that auditors should adopt when using ADA is also encouraged. Finally, research can explore social, contextual, and environmental factors that motivate better ADA practice and application of professional skepticism by auditors.

Overall, this paper informs academia, the audit profession, standard-setters, and regulators about the potential challenges to the appropriate application of professional skepticism when using ADA, so that stakeholders can be alert to and prepared for those potential issues. In conclusion, multiple efforts are needed to address these challenges in auditors’ ADA journey and to motivate the appropriate application of professional skepticism.

Xiaoxing Li MSc is a PhD candidate at the Department of Accounting, School of Business and Economics, Vrije Universiteit Amsterdam. She is currently working on her PhD dissertation in the area of audit data analytics and professional skepticism.

Acknowledgements

The author thanks the Foundation for Auditing Research for providing financial support through their research grant 2021B01. The views expressed in this document are those of the author and not necessarily those of the FAR. The author also thanks the editor, Anna Gold, and two anonymous reviewers for their comments and feedback.

References

  • Alles M, Brennan G, Kogan A, Vasarhelyi MA (2006) Continuous monitoring of business process controls: A pilot implementation of a continuous auditing system at Siemens. International Journal of Accounting Information Systems 7(2): 137–161. https://doi.org/10.1016/j.accinf.2005.10.004
  • Alles M, Gray GL (2016) Incorporating big data in audits: Identifying inhibitors and a research agenda to address those inhibitors. International Journal of Accounting Information Systems 22: 44–59. https://doi.org/10.1016/j.accinf.2016.07.004
  • Anderson SB, Hobson JL, Peecher ME (2020) The joint effects of rich data visualization and audit procedure categorization on auditor judgment. Working paper. https://doi.org/10.2139/ssrn.3737234
  • Appelbaum D, Kogan A, Vasarhelyi MA (2017) Big Data and analytics in the modern audit engagement: Research needs. Auditing: A Journal of Practice & Theory 36(4): 1–27. https://doi.org/10.2308/ajpt-51684
  • Austin AA, Carpenter TD, Christ MH, Nielson CS (2021) The data analytics journey: Interactions among auditors, managers, regulation, and technology. Contemporary Accounting Research 38(3): 1888–1924. https://doi.org/10.1111/1911-3846.12680
  • Austin AA, Hammersley JS, Ricci MA (2020) Improving auditors’ consideration of evidence contradicting management’s estimate assumptions. Contemporary Accounting Research 37(2): 696–716. https://doi.org/10.1111/1911-3846.12540
  • Baader G, Krcmar H (2018) Reducing false positives in fraud detection: Combining the red flag approach with process mining. International Journal of Accounting Information Systems 31: 1–16. https://doi.org/10.1016/j.accinf.2018.03.004
  • Backof AG, Carpenter TD, Thayer J (2018) Auditing complex estimates: How do construal level and evidence formatting impact auditors’ consideration of inconsistent evidence? Contemporary Accounting Research 35(4): 1798–1815. https://doi.org/10.1111/1911-3846.12368
  • Banerjee A, Chitnis UB, Jadhav SL, Bhawalkar JS, Chaudhury S (2009) Hypothesis testing, type I and type II errors. Industrial Psychiatry Journal 18(2): 127–131. https://doi.org/10.4103/0972-6748.62274
  • Barr-Pulliam D, Brazel JF, McCallen J, Walker K (2020) Data analytics and skeptical actions: The countervailing effects of false positives and consistent rewards for skepticism. Working paper, 44 pp. https://doi.org/10.2139/ssrn.3537180
  • Brown-Liburd H, Issa H, Lombardi D (2015) Behavioral implications of Big Data’s impact on audit judgment and decision making and future research directions. Accounting Horizons 29(2): 451–468. https://doi.org/10.2308/acch-51023
  • Cao T, Duh RR, Tan HT, Xu T (2021) Enhancing auditors’ reliance on data analytics under inspection risk using fixed and growth mindsets. The Accounting Review (forthcoming). https://doi.org/10.2308/TAR-2020-0457
  • Commerford BP, Dennis SA, Joe JR, Ulla JW (2021) Man versus machine: Complex estimates and auditor reliance on artificial intelligence. Journal of Accounting Research (forthcoming). https://doi.org/10.1111/1475-679X.12407
  • Dennis SA, Johnstone KM (2018) A natural field experiment examining the joint role of audit partner leadership and subordinates’ knowledge in fraud brainstorming. Accounting, Organizations and Society 66: 14–28. https://doi.org/10.1016/j.aos.2018.02.001
  • Desender K, Murphy P, Boldt A, Verguts T, Yeung N (2019) A postdecisional neural marker of confidence predicts information-seeking in decision-making. Journal of Neuroscience 39(17): 3309–3319. https://doi.org/10.1523/JNEUROSCI.2620-18.2019
  • Dietvorst BJ, Simmons JP, Massey C (2018) Overcoming algorithm aversion: People will use imperfect algorithms if they can (even slightly) modify them. Management Science 64(3): 1155–1170. https://doi.org/10.1287/mnsc.2016.2643
  • Dweck CS (2008) Mindset: The new psychology of success. Random House Digital, Inc (New York, NY).
  • Eilifsen A, Kinserdal F, Messier WF, McKee TE (2020) An exploratory study into the use of audit data analytics on audit engagements. Accounting Horizons 34(4): 75–103. https://doi.org/10.2308/HORIZONS-19-121
  • Gepp A, Linnenluecke MK, O’Neill TJ, Smith T (2018) Big data techniques in auditing research and practice: Current trends and future opportunities. Journal of Accounting Literature 40: 102–115. https://doi.org/10.1016/j.acclit.2017.05.003
  • Gold A, Klynsmit P, Wallage P, Wright AM (2018) The impact of the auditor selection process and audit committee appointment power on investment recommendations. Auditing: A Journal of Practice & Theory 37(1): 69–87. https://doi.org/10.2308/ajpt-51808
  • Gollwitzer PM (2012) Mindset theory of action phases. In: Van Lange PAM, Kruglanski AW, Higgins ET (Eds) Handbook of theories of social psychology. SAGE Publications Ltd (Los Angeles): 526–545. https://doi.org/10.4135/9781446249215.n26
  • Griffith EE, Hammersley JS, Kadous K (2015b) Audits of complex estimates as verification of management numbers: How institutional pressures shape practice. Contemporary Accounting Research 32(3): 833–863. https://doi.org/10.1111/1911-3846.12104
  • Hackenbrack K (1992) Implications of seemingly irrelevant evidence in audit judgment. Journal of Accounting Research 30(1): 126–136. https://doi.org/10.2307/2491095
  • Hoffman VB, Patton JM (1997) Accountability, the dilution effect, and conservatism in auditors’ fraud judgments. Journal of Accounting Research 35(2): 227–237. https://doi.org/10.2307/2491362
  • Holt T, Loraas TM (2021) A potential unintended consequence of Big Data: Does information structure lead to suboptimal auditor judgment and decision-making? Accounting Horizons 35(3): 161–186. https://doi.org/10.2308/HORIZONS-19-123
  • Holton C (2009) Identifying disgruntled employee systems fraud risk through text mining: A simple solution for a multi-billion dollar problem. Decision Support Systems 46(4): 853–864. https://doi.org/10.1016/j.dss.2008.11.013
  • Johnson R, Wiley L (2019) Auditing: A practical approach with data analytics (1st edn.). John Wiley and Sons, Inc (Hoboken).
  • Kadous K, Kennedy SJ, Peecher ME (2003) The effect of quality assessment and directional goal commitment on auditors’ acceptance of client‐preferred accounting methods. The Accounting Review 78(3): 759–778. https://doi.org/10.2308/accr.2003.78.3.759
  • Kang YJ, Piercey MD, Trotman A (2020) Does an audit judgment rule increase or decrease auditors’ use of innovative audit procedures? Contemporary Accounting Research 37(1): 297–321. https://doi.org/10.1111/1911-3846.12509
  • Kim S, Mayorga DM, Harding N (2017) Can I interrupt you? Understanding and minimizing the negative effects of brief interruptions on audit judgment quality. International Journal of Auditing 21(2): 198–211. https://doi.org/10.1111/ijau.12089
  • Kipp P, Olvera R, Robertson JC, Vinson J (2020) Audit data analytics and jurors’ assessment of auditor negligence: The effects of follow-up procedures and the lack of a standard. Working paper. https://doi.org/10.2139/ssrn.3775740
  • Krahel JP, Titera WR (2015) Consequences of Big Data and formalization on accounting and auditing standards. Accounting Horizons 29(2): 409–422. https://doi.org/10.2308/acch-51065
  • Krieger F, Drews P, Velte P (2021) Explaining the (non-) adoption of advanced data analytics in auditing: A process theory. International Journal of Accounting Information Systems 41: 100511. https://doi.org/10.1016/j.accinf.2021.100511
  • Logg JM, Minson JA, Moore DA (2019) Algorithm appreciation: People prefer algorithmic to human judgment. Organizational Behavior and Human Decision Processes 151: 90–103. https://doi.org/10.1016/j.obhdp.2018.12.005
  • Luippold BL, Kida TE (2012) The impact of initial information ambiguity on the accuracy of analytical review judgments. Auditing: A Journal of Practice & Theory 31(2): 113–129. https://doi.org/10.2308/ajpt-10259
  • Nelson MW, Proell CA, Randel AE (2016) Team-oriented leadership and auditors’ willingness to raise audit issues. The Accounting Review 91(6): 1781–1805. https://doi.org/10.2308/accr-51399
  • Nisbett RE, Zukier H, Lemley RE (1981) The dilution effect: Nondiagnostic information weakens the implications of diagnostic information. Cognitive Psychology 13(2): 248–277. https://doi.org/10.1016/0010-0285(81)90010-4
  • Nolder CJ, Kadous K (2018) Grounding the professional skepticism construct in mindset and attitude theory: A way forward. Accounting, Organizations and Society 67: 1–14. https://doi.org/10.1016/j.aos.2018.03.010
  • Önkal D, Goodwin P, Thomson M, Gönül S, Pollock A (2009) The relative influence of advice from human experts and statistical methods on forecast adjustments. Journal of Behavioral Decision Making 22(4): 390–409. https://doi.org/10.1002/bdm.637
  • Peecher ME, Piercey MD, Rich JS, Tubbs RM (2010) The effects of a supervisor’s active intervention in subordinates’ judgments, directional goals, and perceived technical knowledge advantage on audit team judgments. The Accounting Review 85(5): 1763–1786. https://doi.org/10.2308/accr.2010.85.5.1763
  • Peecher ME, Solomon I, Trotman KT (2013) An accountability framework for financial statement auditors and related research questions. Accounting, Organizations and Society 38(8): 596–620. https://doi.org/10.1016/j.aos.2013.07.002
  • Plumlee RD, Rixom BA, Rosman AJ (2015) Training auditors to perform analytical procedures using metacognitive skills. The Accounting Review 90(1): 351–369. https://doi.org/10.2308/accr-50856
  • Richins G, Stapleton A, Stratopoulos TC, Wong C (2017) Big data analytics: Opportunity or threat for the accounting profession? Journal of Information Systems 31(3): 63–79. https://doi.org/10.2308/isys-51805
  • Rose AM, Rose JM, Sanderson KA, Thibodeau JC (2017) When should audit firms introduce analyses of big data into the audit process? Journal of Information Systems 31(3): 81–99. https://doi.org/10.2308/isys-51837
  • Rose AM, Rose JM, Suh I, Thibodeau JC (2020) Analytical procedures: Are more good ideas always better for audit quality? Behavioral Research in Accounting 32(1): 37–49. https://doi.org/10.2308/bria-52512
  • Saiewitz A, Wang EY (2020) Using cultural mindsets to reduce cross‐national auditor judgment differences. Contemporary Accounting Research 37(3): 1854–1881. https://doi.org/10.1111/1911-3846.12566
  • Tsai CI, Klayman J, Hastie R (2008) Effects of amount of information on judgment accuracy and confidence. Organizational Behavior and Human Decision Processes 107(2): 97–105. https://doi.org/10.1016/j.obhdp.2008.01.005
  • Waller WS, Zimbelman MF (2003) A cognitive footprint in archival data: Generalizing the dilution effect from laboratory to field settings. Organizational Behavior and Human Decision Processes 91(2): 254–268. https://doi.org/10.1016/S0749-5978(03)00024-4
  • Wang T, Cuthbertson R (2015) Eight issues on audit data analytics we would like researched. Journal of Information Systems 29(1): 155–162. https://doi.org/10.2308/isys-50955
  • Yeomans M, Shah A, Mullainathan S, Kleinberg J (2019) Making sense of recommendations. Journal of Behavioral Decision Making 32(4): 403–414. https://doi.org/10.1002/bdm.2118