Research Article
Internal audit and complex systems: emerging approaches
Edo Roos Lindgreen‡, Peter Hartog§, Paul Hamaker|
‡ Universiteit van Amsterdam, Amsterdam, Netherlands
§ NL Institute of Internal Auditors, Amsterdam, Netherlands
| Amsterdam Business School, Amsterdam, Netherlands
Open Access

Abstract

When planning and performing their audits, internal auditors usually apply a practical reductionist approach in order to facilitate the timely and efficient allocation of finite audit resources and to manage the expectations of audit sponsors and auditees. Though practical and widely accepted, this approach runs the risk of oversimplification and underestimation of the complexity of the audit object. This may lead to incomplete and/or erroneous audit outcomes, which in turn may influence the decisions made by stakeholders using the auditor’s report. In this paper, we explore the audit of complex systems, argue that a different approach is necessary to tackle such systems, and explore the contours of a practically applicable approach loosely based on the Cynefin framework.

Keywords

Complex systems, internal audit, Cynefin framework

Relevance to practice

This paper discusses a practical challenge in the audit profession: planning and executing audits while taking the complexity of the audit object into account. The paper presents the contours of a practical approach for probing and sensing in the engagement planning based on the Cynefin framework.

1. Introduction

With an estimated 250,000 professionals worldwide, most of whom are united in the global Institute of Internal Auditors (IIA), internal auditing has become an essential function in medium-sized to large organizations. Internal auditing is no longer the domain of large banks, listed companies and multinationals: most large organizations in the private and public sector include an internal audit function. Having an internal audit function is considered good practice or even mandatory in leading corporate governance codes, for example in the United Kingdom and The Netherlands (UKGC 2018; NLGC 2025). In general, internal audit reports directly to the board and to the audit committee. As such, internal audit is considered an essential source of information on operational, compliance, financial and strategic risks and on the effectiveness of risk management processes. The auditor’s reports and accompanying presentations are used by, among others, the board, the audit committee, (senior) management, the external auditor and regulators.

The roots of internal auditing as we know it today lie in the practice of financial auditing. The finance-centric internal auditors of the past were primarily concerned with financial reporting risks. Their objective was to establish that the numbers reported – such as the statement of profit and loss and the balance sheet – reliably and truthfully represented the financial position of the company. The internal auditors of today, on the other hand, are also – and even more – concerned with compliance risks, operational risks, management risks, commercial risks, strategic risks and so on, and typically report on the effectiveness of the governance, risk management and control processes that the company has put in place to mitigate those risks.

When preparing an audit, one of the first challenges the internal auditor faces is specifying its scope and depth: demarcating the audit object, determining the boundaries of what is and is not audited, and deciding which aspects should be incorporated in the audit and to what degree. This process is important for many reasons: to be relevant for the organization, to manage the expectations of the audit sponsor and audit users, to inform the auditee so they can give access to and/or prepare the necessary information, and to efficiently allocate the time and resources that are necessary to plan and perform the audit accordingly.

In financial auditing, this process is usually relatively straightforward. Typically, the auditor performs a risk analysis to gain an understanding of the business and determine which elements of the financial report should be the focus of the auditor’s attention. Based on this risk analysis, the auditor will then test the controls that are put in place to ensure the reliability of the financial reporting. If the auditor decides that these controls leave room for improvement and cannot be relied on within acceptable margins, they may resort to substantive procedures, where the financial information is hard-checked against data representing the business, such as invoices, transaction logs, inventory or contracts.

In internal auditing, determining scope and depth is usually more complicated. As stated, the internal auditor is not so much concerned with financial reporting risks as with risks associated with strategy execution, management, operations and so on. Rather than focusing on accounts and journal entries, the internal auditor wants to investigate, for example, entire business processes, governance structures, organizational culture and behaviour, information systems, operating companies, supply chains or production plants.

To determine the object of their audit, internal auditors usually follow a reductionist approach: decomposing reality into smaller entities and creating “workable” audit objects by imposing artificial boundaries on a subset of the audit object, which is in turn an artificial demarcation of the audit universe, defined by the IIA (2024a, b) as “the collection of all the auditable entities, processes, systems, functions, or areas within an organization that could be subject to an internal audit”. The resulting audit object is then treated as an isolated entity which is investigated, tested and reported on. It is not always obvious where to place the boundaries and which elements of the audit universe to include in the audit object; many planning exercises involve a fair amount of professional judgment, assumptions and educated guesses.

For reasons that will be explained below, the audit universe qualifies as a complex system, the components of which are complex systems in themselves, interrelated and interacting in complex ways (West 2017; Snowden and Boone 2007). Unfortunately, cutting through complexity comes at a price. Carving out parts of the audit universe invariably leads to a loss of information, information that might have influenced the outcome of the audit and thus the decisions made by the users of the audit report. To quote Le Coze (2005): “Decomposing a complex system can lead us to lose our understanding of it, especially when it is the interactions between its parts and therefore its organizational nature that defines its behaviour as a whole.” The same holds for the preparation of the audit engagement itself, where implicit and explicit choices regarding scope and depth are made, and where complexity may be observed as well.

Complex systems have been the subject of extensive study for decades in such diverse fields as biology, physics, sociology, virology, urban planning and logistics. One thing these studies have taught us is that complex systems are the rule rather than the exception. Most real-life systems are complex and should be studied and treated as such. Curiously, the relation between complex systems and internal auditing has been left largely untouched in academic and professional literature, with a few exceptions, e.g. Hartog and Paape (2020), Kolk et al. (2022). In this paper, we argue that the standard internal audit practice of demarcating and delineating audit objects with a “clear cut”, i.e. without taking the complexity of a system into account, can impair the quality and effectiveness of an audit and can lead to wrong decisions. New planning and audit approaches that take the complexity of the system under study into consideration can lead to more relevant, valid and reliable audit results and hence better decisions.

The structure of this paper is as follows. After a description of the general process of internal auditing, we describe the basics of complex systems and their most important properties, including (un)predictability, nonlinear behaviour, emergence, scale invariance and unexpected breakdowns. We then argue that the oversimplification inherent to reductionist approaches in auditing may pose serious risks to stakeholders using the auditor’s report and give some examples. After that, we propose the contours of an approach for auditing complex systems by applying probing and sensing techniques in the engagement planning phase. The article ends with some conclusions and directions for future research.

2. A closer look at internal auditing

Internal audit is defined by its dominant professional organization, the Institute of Internal Auditors (IIA), as “an independent, objective assurance and advisory service designed to add value and improve an organization’s operations. It helps an organization accomplish its objectives by bringing a systematic, disciplined approach to evaluate and improve the effectiveness of risk management, control, and governance processes” (IIA 2024a, b).

Typically, internal auditing is the domain of a specialized department in the organization; for the sake of clarity and adhering to common nomenclature, in this paper we will refer to this department as the Internal Audit Function (IAF). The IAF is led by a Chief Audit Executive (CAE) who, in a two-tier structure, usually reports directly to the board and in many cases also to the Audit Committee of the supervisory board, although the latter is not prescribed by the IIA guidelines.

There are many standards and guidelines to organize, plan, perform and evaluate internal audits, the most important of which are contained in the International Professional Practice Framework (IPPF) of the IIA (IIA 2024a, b). The actual form, shape and way of working of IAFs may differ substantially in practice, resulting from the large diversity in industry sectors, business models, governance models, regulatory conditions and specific characteristics of different organizations. For the purpose of this paper, we take a generic standards-based approach to internal auditing as the basis for our discussion.

In such an approach, the risk-based audit plan has a central role, as specified in standard 9.4 of the Global Internal Audit Standards (GIAS) (IIA 2024a, b). To draw up the audit plan, the IAF obtains knowledge of what is sometimes called the “audit universe”, assesses and prioritizes the risks for the organization and plans its activities for the coming period – at least one year – based on this assessment. In this way, the IAF ensures that its activities are aligned with the organization’s strategies, objectives and risks and that its resources are allocated as efficiently as possible.

Typically, the annual audit plan contains a list of audits to be performed in the coming period, which may run in the dozens for larger organizations. For each audit identified, the plan lists the audit objectives and scope of the audit, timeframe, audit team, reporting, and so on. After finalization of the plan and approval by the board – and, in many practical situations, also by the supervisory board, which is in turn advised by the audit committee – the IAF will perform each audit as planned.

The GIAS distinguish four phases in an audit engagement: planning, execution/fieldwork, communication and follow-up.

  • In the engagement planning phase, the auditor obtains knowledge of the audit object, engages stakeholders, performs a risk assessment, defines the audit objectives, scope (including the depth of the audit) and the evaluation criteria to be used, and translates this into an engagement plan including resources and timelines. The resulting plan is discussed with management.
  • In the execution phase, the auditor collects information by performing desk research and carrying out the field work, which may consist of conducting interviews, analysing data and observing processes. In this phase, the auditor will also test the controls that should be in place to mitigate the risks; if necessary, additional substantive work is performed. In this phase, the auditor collects and documents audit evidence that supports the findings and conclusions.
  • In the communication or reporting phase, the auditor prepares a draft report containing key findings, weaknesses and deficiencies in the controls tested, and often recommendations for improvement. The report is discussed with the auditee and the auditee is invited to give a management response and an action plan, upon which the final report is issued and shared with the board and the supervisory board.
  • In the follow-up phase, management will implement the corrective actions; the audit team will monitor this process and review the effectiveness of the measures taken, again reporting to the board and the supervisory board.

This is a logical and sound approach, and following it should provide the necessary guidance to carry out audits in a structured and effective manner. In practice, however, its limitations may become visible when the auditor finds that the real audit object is less clearly defined than the scope and depth specified in the planning phase suggest, and that factors other than those captured in the defined evaluation criteria are decisive for control and for the achievement of the objectives.

In a round-table discussion with CAEs of leading organisations in The Netherlands (EMAS 2025), a majority of participants confirmed that during many audits, unexpected connections are revealed; examples include the involvement of other departments, the use of hitherto unknown IT-systems, or the influence of culture and behaviour. Auditors often lack the time to adjust for these loose ends; they work according to their planning and may end up largely ignoring these connections or saving them for a later audit. In the final audit report, the audit object is then treated as an isolated entity, which it is not.

Summarizing, the audit object demarcated in the scoping phase is not an isolated entity, but part of a complex system. The question now arises how internal auditors should deal with complex systems in practice. In the next section, we provide an introductory exploration of complex systems.

3. Basics of complex systems

Complex systems were first researched in physics and mathematics and have since been the subject of study in various academic disciplines (Marro 2016). As a consequence, many different definitions and interpretations coexist, some overlapping, some disjoint, some entangled, and some even contradictory.

Examples of complex systems studied in their respective fields range from fluid dynamics to galaxy formation (physics), from living organisms to biological ecosystems (biology), from human behaviour to societies (social sciences), and from supply chains to multinational organizations (economics); for more examples, the reader is referred to West (2017), widely considered a standard work in this field.

The key characteristic of a complex system is that it consists of many smaller interconnected, interacting entities (Ladyman et al. 2013), which may be identical or have their own pattern and uniqueness. The interactions between these entities can take many forms, depending on the system at hand. For example, entities:

  • can compete for resources;
  • can attract or repel other entities;
  • can exchange information or energy;
  • can trade or exchange goods and services for money;
  • can eat, or be eaten;
  • can work together or in isolation;
  • can lead or be led.

Typically, complex systems include feedback loops where interactions influence subsequent interactions. Feedback loops can be reinforcing, dampening, or take other forms that change the behaviour of the system and its constituent components. Complex systems are also typically non-ergodic: the behaviour of the system cannot be inferred from the behaviour of its components, and vice versa. Complex systems can be fractal in nature, i.e. built up from structures that are self-similar at different levels of scale (West 2017).
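To make the distinction between reinforcing and dampening feedback concrete, here is a minimal, purely illustrative sketch (not drawn from the audit literature) using a one-variable difference equation; the parameter values are arbitrary assumptions:

```python
# Minimal illustrative sketch: reinforcing vs. dampening feedback in a
# one-variable system x[t+1] = x[t] + r * x[t].
# r > 0 reinforces (growth feeds on itself); -1 < r < 0 dampens (decay towards zero).

def simulate(r, x0=1.0, steps=10):
    """Iterate a simple linear feedback loop and return the trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(xs[-1] + r * xs[-1])
    return xs

if __name__ == "__main__":
    print("reinforcing (r=+0.2):", [round(x, 2) for x in simulate(+0.2)])
    print("dampening   (r=-0.2):", [round(x, 2) for x in simulate(-0.2)])
```

In a real system, many such loops interact simultaneously, which is precisely what makes the aggregate behaviour hard to infer from any single component.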

Complex systems have generated substantial interest in other fields as well. Heino et al. (2021) discuss the analysis of behaviour change mechanisms in complex environments. A useful approach to address complex systems in a managerial context is presented in the Cynefin framework (Snowden and Boone 2007). This framework recognizes that different situations in different contexts require different responses. It was created to provide executives with a practical tool to assess situations and conceive an adequate response. The authors discern four categories of situations, in increasing order of difficulty: clear (known knowns), complicated (known unknowns), complex (unknown unknowns), and chaotic (unknowables, such as “black swans” (Taleb 2007)). A fifth category, called disorder, applies when the category is uncertain. According to the authors, a complex situation is characterized by the fact that the relationship between cause and effect cannot be established beforehand, but only in hindsight; complex systems adapt and evolve in time. Accordingly, complex systems require careful analysis and repeated probing and sensing to clarify their intricacies before issuing a response.

4. Properties of complex systems

The sheer number of components of complex systems, the number and nature of the interactions between these components, the feedback loops present in such systems and their sometimes fractal structure give rise to a number of interesting behavioural properties, of which we name a few below (Marro 2016; West 2017).

Nonlinear behaviour

Complex systems behave in a nonlinear way, meaning that the relationship between the input and the output is not proportional or additive. In some cases, that relationship can be described by a nonlinear mathematical function, such as a power law or a set of differential equations. In other cases, the system’s behaviour is highly sensitive to very small variations in its input and is so unstable – including unexpected breakdowns – that it cannot be modelled or predicted in any meaningful way.
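As a minimal sketch of such sensitivity, consider the logistic map, a standard textbook example of a nonlinear system rather than an audit-specific model; all parameter values below are illustrative assumptions:

```python
# Minimal illustrative sketch: the logistic map x[t+1] = r * x[t] * (1 - x[t]).
# For r = 4, two almost identical starting values diverge completely after a
# few dozen iterations ("sensitive dependence on initial conditions").

def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.20000000)
b = logistic_trajectory(0.20000001)  # perturbation of one part in ten million
for t in (0, 10, 30, 50):
    print(f"t={t:2d}  a={a[t]:.6f}  b={b[t]:.6f}  diff={abs(a[t] - b[t]):.6f}")
```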

Emergence

Due to the interactions and feedback loops, a complex system can exhibit emergent behaviour that is larger than the linear sum of its parts and that cannot easily be predicted from the behaviour of its components. Sometimes, a complex system’s behaviour is so unexpected that it surprises even those who have full knowledge of its components. It is as if the system has a life of its own. But the reverse is also true – some complex systems may, under the right circumstances, produce behaviour that is predictable to a certain degree; see next point.
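A classic, domain-neutral illustration of emergence is a one-dimensional cellular automaton: each cell follows a trivially simple local rule, yet the row as a whole develops intricate patterns that are hard to predict from the rule alone. The sketch below (Rule 30, with an arbitrary grid size) is purely illustrative:

```python
# Minimal illustrative sketch: one-dimensional cellular automaton (Rule 30).
# Each cell looks only at itself and its two neighbours, yet complex global
# patterns emerge from a single "on" cell.

RULE = 30
WIDTH, STEPS = 64, 32

def step(cells):
    new = []
    for i in range(len(cells)):
        left, centre, right = cells[i - 1], cells[i], cells[(i + 1) % len(cells)]
        pattern = (left << 2) | (centre << 1) | right  # neighbourhood as 3-bit number
        new.append((RULE >> pattern) & 1)              # look up the rule bit
    return new

row = [0] * WIDTH
row[WIDTH // 2] = 1  # start with a single "on" cell in the middle
for _ in range(STEPS):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```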

(Un)predictability

The presence of nonlinear behaviour and emergence does not mean the behaviour of a complex system can never be predicted – because, in many cases, it can; not by analysing its components and trying to infer the system’s behaviour, but rather by zooming out and studying the behaviour of the system at a larger scale and trying to detect regularities or patterns that show a consistent development over time. Mathematical functions, sets of differential equations or computer simulations can then be used to model the system and predict its long-term behaviour.
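A minimal, domain-neutral sketch of this effect: each individual random walk below is unpredictable, but the average over many walkers closely follows the deterministic drift; all parameters are illustrative assumptions:

```python
# Minimal illustrative sketch: individual trajectories are noisy and
# unpredictable, but the population-level behaviour is regular.
import random

random.seed(42)
N_WALKERS, STEPS, DRIFT = 10_000, 100, 0.05

positions = [0.0] * N_WALKERS
for _ in range(STEPS):
    positions = [x + DRIFT + random.gauss(0, 1) for x in positions]

print("one walker (unpredictable):   ", round(positions[0], 2))
print("population mean (predictable):", round(sum(positions) / N_WALKERS, 2))
print("deterministic prediction:     ", round(STEPS * DRIFT, 2))
```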

Scale invariance

When the behaviour of the system can be described by a power law (say, x to the power of n), it will be scale invariant, meaning that the behaviour of the system does not change when space, time or other relevant variables are rescaled. In other words, regardless of its scale, the system will always behave in the same predictable way.
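As a short worked illustration: if the system’s behaviour follows a power law of the form f(x) = c·x^n, rescaling the input by a factor λ only multiplies the output by the constant λ^n:

```latex
f(\lambda x) = c\,(\lambda x)^{n} = \lambda^{n}\, c\, x^{n} = \lambda^{n} f(x)
```

Only the prefactor changes under rescaling; the shape of the relationship is the same at every scale.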

Resilience

Due to the large number of interacting components, complex systems are well-known for their resilience in the face of disruptions or perturbations. This holds especially for systems with a high fractal dimension. Such systems exhibit self-similarity and redundancy at different levels of abstraction; examples include tropical rainforests, distributed computer systems, vascular systems and organizational structures. These fractal structures enable an efficient distribution of both resources and damage or stress (West 2017). Systems with a high diversity among their components are also known to be more resilient than systems whose components form a monoculture.

Unexpected breakdowns

In contrast to resilience, complex systems are known to be susceptible to unexpected breakdowns. Their nonlinear, emergent behaviour may give rise to spurious errors or sudden collapses that cannot be predicted from the system’s previous behaviour. Especially in highly interconnected systems, cascading failures lead to a higher vulnerability to breakdowns or extinction events (Ormerod 2007). In network research, the combination of high resilience and high vulnerability to breakdowns is known as “robust yet fragile” (Watts 2002).
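A minimal sketch, loosely inspired by threshold-cascade models such as Watts (2002) but not taken from that paper, illustrates how the size of a cascade depends strongly on where the first failure occurs and on the failure threshold chosen; all parameters are illustrative assumptions:

```python
# Minimal illustrative sketch: threshold cascade on a random network.
# A node fails once more than THRESHOLD of its neighbours has failed; the
# resulting cascade may stay local or sweep through much of the network.
import random

random.seed(7)
N, P_EDGE, THRESHOLD = 120, 0.05, 0.30

# undirected random graph stored as adjacency sets
adj = {i: set() for i in range(N)}
for i in range(N):
    for j in range(i + 1, N):
        if random.random() < P_EDGE:
            adj[i].add(j)
            adj[j].add(i)

def cascade_size(seed):
    failed = {seed}
    changed = True
    while changed:
        changed = False
        for node in range(N):
            if node in failed or not adj[node]:
                continue
            if len(adj[node] & failed) / len(adj[node]) > THRESHOLD:
                failed.add(node)
                changed = True
    return len(failed)

sizes = sorted(cascade_size(seed) for seed in range(N))
print("median cascade size :", sizes[N // 2])
print("largest cascade size:", sizes[-1], "of", N, "nodes")
```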

5. Internal audit and complex systems

As stated earlier, the audit universe as a whole and most of its individual objects have many – if not all – of the characteristics and behavioural properties of complex systems as described above. To quote West (2017), “Organizations are organisms … not just the linear sum of replaceable parts”. Basically every system encountered by an internal auditor nowadays is a complex system or part thereof. One could argue that it is precisely this complexity that is preventing the definition and global adoption of a single view on good governance and control, as illustrated by the myriad of existing models, theories, frameworks and standards.

As noted earlier, the interrelation between internal auditing and complex systems has received little attention in academic literature. Hartog and Paape (2020) and Kolk et al. (2022) do address complexity challenges, focusing on the audit aspects and evaluation criteria to be used.

Most experienced auditors will be able to recall moments in their career where reality turned out to be more complex than was assumed in the planning phase, in terms of scope and/or depth. For example, regarding scope: during or after the audit, the auditor may learn that the audit object is in fact much larger than expected or has unforeseen branches and connections to other components of the audit universe. Or regarding depth: the auditor who focuses on the design rather than the operational effectiveness of controls may find that the former is by no means a guarantee for the latter nor for the achievement of the objectives of the organization; or the auditor who investigates a potential fraud case may learn that the audit approach is more focused on symptoms than on underlying mechanisms or root causes.

Given the confidential nature of internal audits, such situations are not widely publicized in academic or professional literature. This is different for external financial audits. Over the past decades, there have been many published cases of external financial audits gone wrong; the timeline from Enron and WorldCom in 2002 (Bratton 2002) to Carillion in 2018 (Bhaskar and Flower 2019) and Wirecard in 2020 (Großeastroth and Koch 2023) is littered with examples of bankruptcies and fraud cases where the auditor failed to detect and raise red flags – at least, before the fact. There exists ample academic and professional literature on the possible reasons for this phenomenon. Asare et al. (2015) list a number of key obstacles, many of which – but not all – can be traced back to the underestimation of complexity by the auditor: the inherent limitations in the scope and depth of audits, the failure to detect complex fraud schemes, underestimating the significance of identified risks, the application of standard audit procedures that are not sufficient to detect fraud, not understanding the business, underestimating the influence of organizational culture, and failure to understand the impact of digitalization and the use of information technology. In other words, many audit failures appear to be caused by an oversimplification of reality and by not taking the complexity of the real world into account, deliberately or not. Comparable examples exist in the field of information systems auditing or IT-auditing. An infamous example in The Netherlands is the Diginotar case (Van der Meulen 2013): the auditor tested the design of the controls of this Certificate Authority according to official guidelines, but neglected to test their operating effectiveness and thus overlooked the fact that the controls in this complex system were not working – their operating effectiveness was zero – and that the system had been hacked, with dire consequences.

In our opinion, if financial auditors underestimate the complexity of their audit objects, there is no reason to assume that internal auditors are immune to this phenomenon. In terms of the Cynefin framework mentioned earlier: it is safe to assume that internal auditors, too, will often prepare to tackle a presumably clear or complicated audit object, only to find that it falls in the complex category.

6. Contours of a new approach

The above logically leads to the search for an approach that enables the auditor to recognize a complex environment and to conduct an audit that takes this complexity into account. The “probing and sensing” approach advocated by the Cynefin framework offers a promising direction (Snowden and Boone 2007). In the Cynefin framework, probing means: subjecting a system to interventions or tests, small or large, to see how it behaves. Sensing means: observing the behaviour of a system and its environment on different levels of abstraction. Although they can be viewed as separate actions, the combination of probing and sensing (“trial and error”) is especially powerful (Snowden and Boone 2007).

In the context of internal auditing, probing and sensing can be applied in both the engagement planning and the execution phase. Sensing techniques have always been essential instruments in the toolbox of the auditor. They include interviews, data analysis, behavioural analysis and continuous auditing, and are commonly applied in the execution phase. Probing techniques, less so. Examples of probing techniques include stress tests, penetration tests, reperformances, software tests, transaction insertion and fire drills. It should be noted that sensing without probing is common practice, but probing without sensing is useless.

When applied during fieldwork, probing and sensing may produce new information that likely leads to additional work. Unfortunately, audits are often carried out in a business environment that does not offer much flexibility in terms of deadlines or resource usage. Since time and resources are rationed, field work should be carried out according to the engagement plan; exploring all the nooks and crannies of a complex system during the field work in the execution phase will lead to unpredictable turnaround times, which both the audit sponsor and the auditee generally consider undesirable. So, rather than in the execution phase, probing and sensing should first and foremost be applied in the engagement planning phase, so that a realistic, deterministic engagement plan can be drawn up. In this phase, essential information on the audit object is collected and analysed before the actual engagement plan is finalized; this early exploration can be considered the first step in a probing and sensing approach and provides the basis for the scoping and work program in the engagement plan. To obtain as much information as possible, the engagement planning should start early and be finalized as late as possible, preferably just before the start of the field work. This also suggests a justification for increasing the share of hours spent on planning relative to the share spent on execution.

We would like to mention two other approaches that might be appropriate when the auditor is confronted with complex systems. In the engagement planning phase, the auditor gathers the information needed to determine the scope, standards framework and approach, given the objective of the audit. The auditor should evaluate whether that information is sufficient and how great the risk is that the wrong scope or approach will be chosen, given the complexity of the system to be examined. If the auditor concludes that this risk is real, an approach other than the traditional audit with predefined evaluation criteria may have to be chosen. One alternative to cope with this uncertainty is an agile-like approach, in which the audit is carried out in smaller parts and the most meaningful next step is re-examined after each part; these smaller parts can be viewed as a form of probing and sensing. Another alternative is described in Schuiten (2022): an approach without predefined norms, which assumes truly complex situations in which multiple realities occur and which leads to intersubjective conclusions. As the author notes, this is a very different approach, incongruent with the usual objectivist worldview of internal audit. We add that its exploratory nature will create uncertainty in the expectations of the auditee and in the execution phase of the audit in terms of resource usage and execution time, which often does not fit a business environment that values adherence to deadlines and resource allocations. On the other hand, the relevance of the audit is paramount, and there must therefore be sufficient certainty that the predefined engagement plan will lead to the desired result.

Concluding: when encountering a complex environment, auditors should tackle it by applying probing and sensing techniques in the engagement planning phase, perhaps taking more time than usual, starting early and finishing late. Below, we briefly list five possible techniques that have been applied successfully in other contexts. Please note that this list is not exhaustive; the authors welcome further suggestions or elaborations. Also note that specific situations may require a specific selection of the techniques below; a characteristic of complex systems is that there is no simple recipe to follow.

Collaborative risk assessment

The benefits of diverse teams have been extensively researched and described in academic literature, see e.g. Page (2007). Analogously, to obtain a more complete picture of a system, the auditor can engage in a collaborative analysis, together with stakeholders and subject matter experts, of the audit object itself, its context, and the risks associated with it. Are there communication bottlenecks or risk concentrations? Are there emergent risks? The analysis can take the form of a brainstorm session, physical or online, but the active participation of a diverse team of analysts is key to obtaining the desired results. By collaborating with a diverse group of professionals, the auditor can extend their scope and prevent tunnel vision.

Stakeholder analysis

Stakeholder analysis is a venerable and well-developed discipline with origins in medicine, politics and management science (Brugha and Varvasovszky 2000). The purpose of stakeholder analysis is to gain insight into the entities that influence decision-making processes or are otherwise involved in an ecosystem. Auditors, too, should perform a thorough stakeholder analysis in the engagement planning phase: which entities inside and outside the organization have an interest related to the audit object? What are their respective interests, and how do they influence each other?

Interface analysis

Every audit object interacts with other entities inside and outside the organization. In the engagement planning phase, the auditor should analyse these touch points or interfaces; see for example Maier (2009). How does the interaction take place? Is additional work necessary to determine the impact of the interaction at those touch points? In this step, the auditor can identify feedback loops or connections inside the system and in its interaction with the outside world – first qualitatively and, where possible, quantitatively.
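As a purely hypothetical illustration of how such an interface inventory could be made explicit, the sketch below records interfaces as directed links between fictitious entities and flags feedback loops as cycles in the resulting graph; all names are assumptions:

```python
# Illustrative sketch only; entity names and interfaces are hypothetical.
# Interfaces are recorded as directed edges ("A passes something to B");
# a feedback loop then shows up as a cycle in the graph.

interfaces = [
    ("Sales", "Order processing"),
    ("Order processing", "Warehouse"),
    ("Warehouse", "Finance"),
    ("Finance", "Sales"),            # closes a loop back to Sales
    ("Order processing", "External logistics provider"),
]

graph = {}
for src, dst in interfaces:
    graph.setdefault(src, []).append(dst)
    graph.setdefault(dst, [])

def find_cycle(graph):
    """Return one cycle (as a list of nodes) if the interface graph contains any."""
    visiting, visited = set(), set()

    def dfs(node, path):
        visiting.add(node)
        path.append(node)
        for nxt in graph[node]:
            if nxt in visiting:
                return path[path.index(nxt):] + [nxt]
            if nxt not in visited:
                cycle = dfs(nxt, path)
                if cycle:
                    return cycle
        visiting.discard(node)
        visited.add(node)
        path.pop()
        return None

    for node in graph:
        if node not in visited:
            cycle = dfs(node, [])
            if cycle:
                return cycle
    return None

cycle = find_cycle(graph)
print("feedback loop found:", " -> ".join(cycle) if cycle else "none")
```

Even such a toy inventory forces the auditor to state explicitly which interactions are in scope and which are not.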

Behaviour analysis

This step can include sentiment analysis, for example by analysing emails or complaints, and an analysis of the control environment to evaluate policy enforcement, for example by studying audit reports or incident reports. The auditor can apply elements from behavioural auditing to gain insight into the underlying motives of behaviour and to observe informal practices or undocumented workarounds (Van der Meulen and Otten 2014).
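A deliberately naive, purely illustrative sketch of a first sentiment signal on such texts; a real analysis would rely on purpose-built NLP tooling, and the keyword list and sample snippets below are assumptions:

```python
# Toy sketch only: a coarse keyword-based negativity score for complaint or
# e-mail texts, as a first probe before any serious text analysis.

NEGATIVE = {"delay", "error", "complaint", "escalation", "workaround", "failure"}

def negativity_score(text):
    words = [w.strip(".,!?").lower() for w in text.split()]
    return sum(w in NEGATIVE for w in words) / max(len(words), 1)

samples = [  # hypothetical snippets
    "Another delay in month-end closing, second escalation this quarter.",
    "Thanks, the new procedure works fine for our team.",
]
for s in samples:
    print(f"{negativity_score(s):.2f}  {s}")
```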

Digital twins

The auditor should make extensive use of generative AI to analyse the audit object and its interactions with other entities and to gain insight into potential risks. AI is evolving at breakneck speed. In the near future, the auditor will probably be able to create a toy model or “digital twin” of the environment using AI (Singh et al. 2021); starting from a coarse model, further refinement can provide increasing insight into the audit object’s complexity and potential risk areas.
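As a purely hypothetical illustration of such a coarse starting model, the sketch below treats an assumed invoice-handling process as a single queue with limited daily capacity; every parameter is an assumption, and refinement would add process steps, teams and controls:

```python
# Purely illustrative "toy model" sketch: a single-queue view of a hypothetical
# invoice-handling process. Arrivals slightly exceed capacity on average, so the
# coarse model already reveals a structural backlog risk worth probing further.
import random

random.seed(1)
MEAN_ARRIVALS, CAPACITY_PER_DAY, DAYS = 105, 100, 250

backlog, max_backlog = 0, 0
for _ in range(DAYS):
    arrivals = random.randint(MEAN_ARRIVALS - 30, MEAN_ARRIVALS + 30)
    processed = min(backlog + arrivals, CAPACITY_PER_DAY)
    backlog = backlog + arrivals - processed
    max_backlog = max(max_backlog, backlog)

print("end-of-period backlog:", backlog, "invoices; peak backlog:", max_backlog)
```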

7. Conclusions and directions for future research

We have argued that many objects encountered by auditors can be classified as complex systems, and that such systems require a tailored audit approach. When auditing a complex system, internal auditors should spend more time in the engagement planning phase, applying probing and sensing techniques that have proven to be successful in other contexts. Practical audit experience with probing and sensing techniques appears to be limited, and it would be valuable to investigate if applying these techniques leads to better audits and hence better decisions. We invite auditors who have applied probing and sensing in the engagement planning and execution phases to contact us and share their experiences.

Prof. dr. E.E.O. Roos Lindgreen RE is Professor of Data Science in Auditing at the University of Amsterdam. He is program director of the Executive Programme of Digital Auditing and the Executive M.Sc. of Internal Auditing and also directs the Institute of Executive Programmes at Amsterdam Business School.

Drs. P. Hartog, CIA – Peter is Director of Professional Practices at IIA Netherlands and a lecturer at the Erasmus School of Accounting & Assurance (ESAA). By developing and sharing knowledge and best practices, he hopes to contribute to the further professionalization of the profession.

Drs. P.V. Hamaker – Paul is Director Global Data Management at Heineken and Lecturer / Track Coordinator of the Executive Programme of Digital Auditing at the Amsterdam Business School.

Disclaimer: The authors have used generative AI (ChatGPT 4o) to test and refine ideas and to find relevant information and sources on the internet.

References

  • Asare SK, Wright AM, Zimbelman MF (2015) Challenges Facing Auditors in Detecting Financial Statement Fraud: Insights from Fraud Investigations. Journal of Forensic & Investigative Accounting 7(2): 63–112.
  • Bratton WW (2002) Does Corporate Law Protect the Interests of Shareholders and Other Stakeholders? Enron and the Dark Side of Shareholder Value. Tulane Law Review 76: 1275. https://doi.org/10.2139/ssrn.301475
  • EMAS (2025) CAE round-table, Programme Board Executive MSc of Auditing Studies, Amsterdam, 3 February 2025.
  • Gleick J (2011) Chaos: Making a New Science. Open Road Media.
  • Hartog P, Paape L (2020) De bril van de internal auditor; oogklep of varifocus? Maandblad voor Accountancy en Bedrijfseconomie 94(3/4): 177–180. https://doi.org/10.5117/mab.94.51285
  • Heino MTJ, Knittle K, Noone C, Hasselman F, Hankonen N (2021) Studying Behaviour Change Mechanisms under Complexity. Behavioral Sciences 11(5): 77. https://doi.org/10.3390/bs11050077
  • IIA [Institute of Internal Auditors] (2024a) International Professional Practices Framework - IPPF - 2024 Edition.
  • Kolk W, Paape L, Nikolic I, de Korte R (2022) Theorizing Participatory Control Systems: an organizational control concept for enabling and guiding adaptivity in complex situations. Maandblad voor Accountancy en Bedrijfseconomie 96(7/8): 267–277. https://doi.org/10.5117/mab.96.90745
  • Le Coze JC (2005) Are organisations too complex to be integrated in technical risk assessment and current safety auditing? Safety Science 43(8): 613–638. https://doi.org/10.1016/j.ssci.2005.06.005
  • Marro J (2016) Physics, nature and society. Springer International Publishing.
  • Ormerod P (2007) Cascades of failure and extinction in dynamically evolving complex systems. In: Kertész J, Bornholdt S, Mantegna RN (Eds) Noise and Stochastics in Complex Systems and Finance, Vol. 6601, p. 66010Q. SPIE. https://doi.org/10.1117/12.727668
  • Singh M, Fuenmayor E, Hinchy EP, Qiao Y, Murray N, Devine D (2021) Digital Twin: Origin to Future. Applied System Innovation 4(2): 36. https://doi.org/10.3390/asi4020036
  • Snowden DJ, Boone ME (2007) A Leader’s Framework for Decision Making. Harvard Business Review 85(11): 68–76.
  • Taleb N (2007) The Black Swan: The Impact of the Highly Improbable. Random House.
  • West G (2017) Scale – The Universal Laws of Life, Growth, and Death in Organisms, Cities, and Companies. Penguin.
  • Zang C, Cui P, Faloutsos C, Zhu W (2018) On Power Law Growth of Social Networks. IEEE Transactions on Knowledge and Data Engineering 30(9): 1727–1740. https://doi.org/10.1109/TKDE.2018.2801844