Bridging the knowledge gap between academia and practice: how research can help develop the auditing profession (and vice versa)
Olof Bik‡, Jan Bouwens§
‡ Nyenrode Business University, Breukelen, Netherlands
§ University of Cambridge Judge Business School, Cambridge, United Kingdom

1. Introduction

Conferences are an excellent opportunity to bridge the gap in knowledge between academics and practitioners, says Steven Salterio (Queen’s University, Canada). During the third international conference of the Foundation for Auditing Research, Salterio was one of the keynote speakers, speaking about overcoming barriers in communication between academia and practice in a keynote entitled ‘Moving beyond the Lab’.

Salterio drew on three papers about knowledge transfer between academia and practice in auditing research, which he recently wrote with several co-authors (Gondowijoyo, Hoang, Luo and Sylph; see the reference list). They are essential reading for everyone in auditing who is trying to answer the question: how can we strengthen the bridge between practice and academia in order to further improve audit quality? After all, the aim of academic research is to provide policy-makers in and around the profession with sound scientific evidence.

Salterio’s papers and his contribution to the conference lead us to reflect on the following questions:

  1. How can we actually and effectively connect academics with audit practice (knowledge transfer)?
  2. How can we actually and effectively connect audit practice with academics (data gathering)?
  3. What are the challenges and best practices when it comes to strengthening the bridge between audit practice and academia?

We use this as a basis to sketch the status quo. We then examine four means of knowledge transfer and their effectiveness, and ask what auditing research can learn from evidence-based medicine. Finally, we present our own practical case: the field research carried out by the Foundation for Auditing Research, in which academics collaborate intensively with the auditing profession.

Professor Steven Salterio is the Stephen J.R. Smith Chair of Accounting and Auditing at Smith School of Business at Queen’s University in Kingston, Ontario, Canada. His research investigates, among other areas, corporate governance with special attention to the role of the audit committee and external auditor; negotiations between auditor and client management on financial reporting issues and the effects of enhanced disclosure on the quality of corporate governance; and judgemental effects of performance measurement systems. Salterio has five years of direct practical audit experience and 20 plus years of consulting to audit firms, from Big 4 to regional. He is also an enthusiastic blogger, see his blog Musings on accounting research by Steve.

2. Knowledge transfer: status quo

According to Salterio’s figures, 24,000 academic articles have been published about auditing since 1970. However, the findings of these papers have only scarcely trickled down to practice, public debate and related (public) policy-making. Is this the fault of academics? Yes, partly, says Salterio. In one of his papers, he states: “Currently, based on its apparent low influence on policy makers it appears that auditing academic research is seen as neither a relevant nor a reliable source of knowledge” (Hoang et al. 2017, p. 12). Salterio added during the FAR conference: “Academic researchers often say: it is our fault that knowledge transfer does not take place. But they also use an excuse to not communicate more with, for example, policy-makers: auditors working in practice would not regard their research as being all that relevant: They do not care what we do”. Academics thus admit their ‘guilt’, but also hide behind an excuse. Salterio says he explicitly rejects both views. He has also used the papers to put the ‘guilt’ of academics into perspective and to point to the role of audit practice when it comes to a smooth knowledge transfer: “Rather than taking the natural academic position that the limited success of attempts to transfer research knowledge to policy-makers is ‘all our fault’ as researchers (…), we examine how such efforts of both parties can facilitate knowledge transfer” (Hoang et al. 2017, p. 1).

Communication gap between academics and practitioners

According to Salterio, poor knowledge transfer cannot only be attributed to academics; audit practice also plays a role. It is fair enough to say that audit practice may be overloaded with academic research (how does one choose from many sources of information?), which is not always practically worded either (“research is in a form that is unfamiliar to the potential user, thus creating a barrier to his (her) knowledge acquisition”; Hoang et al. 2017, p. 12) or is difficult to judge in terms of its merit or value (“it is not clear whether a potential user (i.e. a policy-maker) can distinguish about the quality of the research evidence among the various researchers”; Hoang et al. 2017, p. 12). However, this should be no reason for practitioners to ignore knowledge identified by academics: “in particular the process surrounding how to incorporate the transferred knowledge into best practice guidance” (Hoang et al. 2017, p. 15). Leading academics often have no practical experience. In turn, auditors and policy-makers often only have limited knowledge about conducting and interpreting academic research. This combination results in “absorptive capacity to be limited even if an academic makes a good faith attempt to communicate” (Hoang et al. 2017, p. 14).

3. How should things be done?

What is the solution to this communication problem between academics and practice? Hoang et al. (2017, pp. 12–13) say the following about this in their papers:

“The solution (e.g. Szulanski 2000) is direct exchanges of information between the recipient (i.e. the policy-maker) and the knowledge source (i.e. the researcher). However, the incentives of the source to compete or collaborate with the recipient and the amount of effort required from the source to support the transfer need to align in order for this to occur. Differences in the amount of overlapping understanding of the meaning of the conveyed information including its tacit elements necessary for use and interpretation determines the extent of support required”.

Academic research is sticky

During the conference, Salterio emphasised that communication is not a one-way transmission, but a two-way process between sender and recipient. Both the sender (the academic) and the recipient (policy-makers and practitioners) have a responsibility to share knowledge. A lot can go wrong during this communication process: the sender encodes his or her message, and the recipient must decode it again, and vice versa. A lot of academic research is sticky, says Salterio: the recipient must first translate the scientific research findings into practical recommendations before they can be used for problem solving.

Codified versus tacit knowledge

In this regard, Salterio identified two types of knowledge. Codified or explicit knowledge is easy to formulate, document and distribute, like accounting and auditing standards. Tacit or implicit knowledge is concealed within people’s skills, ideas and experiences. This implicit knowledge is particularly crucial when transferring knowledge from academic research: practitioners need a lot of tacit knowledge to interpret research and translate it into practical applications. This requires becoming aware of the value of this knowledge, good personal contact in practical situations, and mutual trust to create a proper learning climate.

Knowledge gap: production versus action

Knowledge transfer (or, more precisely, its absence) not only involves two types of knowledge, but also two (closed) circles, which do not touch each other (see figure 1). Academics primarily focus on production: collection and synthesis of knowledge. Practitioners primarily focus on action: knowledge as problem-solving. According to Salterio, the gap in knowledge is created by differing interests.

Figure 1.

The knowledge gap between academics and practice (source: Salterio’s presentation at the 3rd International FAR conference, June 2018, and Graham et al. (2006)).

Comparison of four strategies for knowledge transfer

A purposeful strategy is needed to bridge this gap in knowledge. Salterio et al. (2018, p. 2) compared a few knowledge transfer strategies and assessed their effectiveness. “These approaches range from audit academics writing traditional research articles, to including audit academics on standard setting boards and task forces, to producing ‘literature reviews’ by teams of volunteer audit academics in loose concert with standard setters”.

Articles in journals: publish and forget

Firstly, Salterio et al. (2018) are highly critical of the traditional academic means of communication – certainly when it comes to effectively informing practitioners. The “publish and forget approach” (Salterio et al. 2018, p. 5) refers to publishing research in the most prestigious academic journals. Anyone with a subscription to these journals or access to a high-quality university library can familiarise themselves with this research. However, despite this accessibility and the often in-depth focus of such papers, few audit firms and professional organisations actually have structured access to these journals. ‘Practice notes’, like those published by the FAR, could be a solution: a summary of the paper (or several papers) written for practice. But, because this often involves translating ‘single academic studies’, this method is also seen to lack effectiveness and completeness - and carries the connotation of ‘translate and forget’ (Salterio et al. 2018, p. 6) - because “predicting other’s comprehension of messages is very difficult to do” (Salterio et al. 2018, p. 13).

Committee for unlocking knowledge

In order to move away from this academic stickiness, Salterio believes things would be more effective if academics were to play a role within the profession or audit firms as members of a board or committee (‘board academic’, Salterio et al. 2018, p. 6) - possibly as an ‘academic fellow’ (Salterio et al. 2018, p. 6) - with the aim of unlocking academic knowledge for the concerned organisation. While these academics cannot, of course, be experts in all relevant academic areas, they can help to unlock academic knowledge and differentiate between “good and bad studies (Teixeira 2014)” (Salterio et al. 2018, p. 7). In this regard, Hoang et al. (2017, p. 40) emphasise a “general agreement on an evidence quality hierarchy”, “so as to be able to assess the quality of the research evidence used to answer the well-defined research questions”. “It does require that the individual is highly competent in research methods to be able to evaluate the quality of the research literature” (Salterio et al. 2018, p. 15).

Conferences and master classes

Conferences and master classes (the ‘academic-standard setter/practitioner conference’) can also play an important role in closing the knowledge gap: this interaction can help to create a bridge between academics and practitioners. However, their effectiveness is largely determined by how well people speak one another’s language and the extent to which both ‘sides’ are prepared to learn about each other. Once again, Hoang et al. (2017, p. 14) first point the finger at academics: “At such conferences, the focus is not on synthesizing what information from across studies can be transferred, but rather on the validity of and the contribution to knowledge of the individual research papers that are presented”. In other words, academics do what they are good at and accustomed to: debating with one another about the validity of research - not about what it means for practice or about specific recommendations.

‘Social experiments’

The first two conferences of the Foundation for Auditing Research could, for instance, be regarded as ‘social experiments’, where academics and auditors had to become accustomed to one another. It will take time to gradually find one another after decades of limited interaction. As Salterio et al. (2018, p. 18) acknowledge, this can be a source of frustration for conference participants: “What is known is that academic conferences can challenge the [conference participants] to the point of being uncivil in their discourse (…), not necessarily an approach that will lead to knowledge transfer especially to standard setters”. As a result, people regularly lose sight of the primary objective: offering academic information to practice and finding potential uses for research findings there.

Isolated worlds

How did things go at the third FAR conference? Did academics and practitioners explore opportunities to interact with one another in order to bridge the gap in knowledge? Salterio demonstrated that both worlds are still rather isolated. He believes presenting individual research during the conference only reinforces focus on production among academics. As a result, they do not pay enough attention to explicitly explaining their research to practice. But practitioners also failed to fully exploit the opportunity to actively interact and ask about implications for their daily practice. On the first day of the conference, Salterio counted the number of questions the audience asked in the formal question & answer sessions during and after the presentations. Almost half of the questions (13) came from other academics and related to methodology. The other half (14) came from practitioners and related to how the research could be applied. And what about informal interaction: did practitioners approach academics during the break and in the corridors to extract information about the practical application of academic research? Barely or not at all, says Salterio after talking to speakers at the conference: when asked, they said they had received few questions from practitioners in informal settings.

Inclusive call and supply side response

Finally, Salterio’s papers refer to the ‘inclusive call’ (Salterio et al. 2018, p. 7) and the ‘supply side response’ (Salterio et al. 2018, p. 8) as knowledge-sharing strategies. These are in keeping with the idea of specifically asking academics to provide input on a particular issue within audit practice, or to write a consultation response to draft standards or policy proposals. “Academic researchers or others with appropriate research background are motivated to identify, evaluate, summarize, and meaningfully synthesize existing research that is relevant to standards-setting priorities and communicate the synthesized findings to those who make the critical policy decisions regarding future standards” (Salterio et al. 2018, p. 9). Their effectiveness is determined by ‘utility’ (“the usefulness, relevance, timeliness, accessibility and ease-of-use of information or of a source”, Salterio et al. 2018, p. 20) and ‘credibility’ (“the perceived trustworthiness, authority, reliability and lack of bias [of an] information provider who draws on many sources of academic information including their own research”, Salterio et al. 2018, pp. 19-20).

4. Evidence-based medicine: what can auditing research learn from it?

When it comes to effectively informing the profession, Hoang et al. (2017) make a comparison with evidence-based medicine (EBM) and draw on knowledge transfer theory (including Nonaka and Takeuchi 1995; Zander and Kogut 1995) to arrive at several knowledge translation approaches for academically informed (evidence-based) policy-making and standard setting in auditing. “The facet most relevant to our examination is literature on developing evidence-based ‘best practice’ guidelines and standard operating procedures (…). This area of EBM research (…) carefully examines how guidelines can be developed that are well-informed based on the evidence from research while accepting that such research cannot speak for itself (…) and must be translated into understandable and implementable guidance (…)” (Hoang et al. 2017, p. 16).

Widely supported knowledge cycle

The auditing profession thus does not have to work miracles; it merely has to examine the evidence-based approach adopted in medical science. In this approach there are no closed circles in the production and use of knowledge, but collaboration and close interaction between academics and practice. Knowledge is developed in a widely supported cycle, as demonstrated in the model that Salterio presented during the conference (see figure 2):

Figure 2.

Evidence-based knowledge transfer (source: Salterio’s presentation at the 3rd International FAR conference, June 2018).

Systematic literature synthesis

EBM is based on identifying already available academic evidence in a systematic, balanced and comprehensive manner: “EBM knowledge translation practices requires “a systematic review of all pertinent evidence (not just the evidence that supported a particular position), a critical analysis of the quality of the evidence, a synthesis of the evidence, a balancing of benefits and harms, an assessment of feasibility and practicality, a clear statement of the recommendation, and a detailed rationale” (Eddy 2005, p. 12)” (Hoang et al. 2017, p. 18). Hoang et al. then give several examples of how such a systematic literature synthesis should be put together.

Iterative process

An important step in this involves defining a good research question based on intensive interaction between practice and academics (“there needs to be ongoing involvement with the policy-makers by the researchers so that underlying tacit knowledge about research and standard setting is transferred as part of an iterative process that allows each to understand the other’s concerns at a deep level” (Hoang et al. 2017, p. 26)), so the involved academics know exactly which questions must be central in their research (and thus also which answers are already available and which have yet to be found). The result is a research question developed “with the advice of a practice-based committee that helps the researchers refine and understand what is the exact question to be answered” (Hoang et al. 2017, p. 41). In short, it is an iterative process in which researchers and policy-makers inform one another during the research: “In particular, the generation of specific questions that research may be able to provide evidence on for policy-makers requires an iterative process of well-specified question development followed by academic-authored research syntheses (systematic reviews), where specific questions are answered in light of the best available evidence that is critically evaluated” (Hoang et al. 2017, p. 26). The systematically collected evidence must then be translated into specific recommendations, covering advantages and disadvantages, feasibility (people and resources) and, in the medical setting, compatibility with the values and preferences of patients.

Joint working groups

Salterio et al. (2018, p. 21) make a comparison with an important American experience to demonstrate that academics realise better results when working “not in isolation, but through engagement with the information user”: the cooperation between the PCAOB and the Auditing Section of the American Accounting Association. This cooperation consisted of “joint working groups”, like those of the FAR, which resulted in “intensive weekend meetings where each [research] project had a specific PCAOB staff member assigned to it” (Salterio et al. 2018, p. 22). But day-to-day interaction is also important: “Our interaction with the PCAOB staff has been fairly steady throughout the project, and often by email, with the goal of making sure our team understood some of the key issues of interest at the PCAOB. Direct phone calls with staff and at least two rounds of interaction with staff occurred during the creation of the review (Hermanson 2005)” (Salterio et al. 2018, p. 21).

From evidence-based to evidence-informed

The evidence-based approach is informative, but not all-powerful. As Salterio stated during the conference: “Academic research is unable to answer every question. To do this, it is sometimes necessary to draw upon other disciplines and sources, such as knowledge from practice, supervisory bodies or other policy-makers. That is when evidence-based knowledge development evolves towards evidence-informed knowledge development”.

Salterio describes this as follows in the papers: “The transfer of knowledge from research to policy-makers will only rarely decide an issue” (Hoang et al. 2017, p. 2). In other words, policy-making requires more than just research: although research offers important evidence-based insights, it is only one source of relevant information. Salterio continues: “Evidence-based means that the policy makers make an informed decision explicitly including evidence that comes from underlying academic research in addition to inputs from practitioners, other regulators, and parties that have traditionally been engaged in the policy-making process” (Hoang et al. 2017, p. 3). In addition, he says “the term ‘evidence-based’ can be better described as ‘evidence-informed’ policy-making (…), as rarely will research evidence lead to selecting only one right way”. Academics can add academic information to the public debate, so that “the best available basis for action at present” (Hoang et al. 2017, p. 5) exists.

More research ‘on demand’

The evidence-based approach can yield a much greater wealth of information than ‘traditional’ research methods in the auditing profession, concludes Salterio. The profession therefore also needs new ways of doing research, he states: more contact between sender and recipient, more research ‘on demand’, and more attention to synthesising research, so that more comprehensive evidence and more alternative sources can be drawn upon.

5. A field case: current FAR experiences

Therefore, in order to effectively offer research information to practitioners, one clearly needs to do research in and with the practitioners concerned, in and around the profession. And this requires intensive interaction. Salterio and Gondowijoyo (2017, pp. 22-23) expressed this as follows:

“The researcher’s extensive engagement with the practice community enables them to identify key issues that practitioners are grappling with. (…) Their awareness of the institutional contexts that their informants are embedded in helps the researchers contextualize their informants’ responses. This means that they will be much quicker in systematizing what they hear and observe in the field than researchers that do not have similar sensitivity to the accounting context. (…) The behavioural accounting researchers’ competence in recruiting practitioner participants also helps them develop trust and rapport with their potential informants in a qualitative field study. This ability also aids them in securing (the often elusive) access to the field research site(s) and managing the prolonged engagement with the field”.

Three experiences

The Foundation for Auditing Research finds itself in the midst of these knowledge transfer challenges when trying to improve the interaction between academic research and the auditing profession. To show how this can work in practice and what is needed, we would like to highlight the following three experiences of the FAR thus far:

  1. Firstly, a robust information security and legal infrastructure had to be established so that data could be shared with research teams without it being possible to trace them back to the underlying client, account, employees, or audit firm. This took the best part of a year; ‘legal clearance’ was obtained from all affiliated audit firms just before Christmas 2016. A lot of time and hard work was needed to coordinate everything between all parties: the firms, the research teams and the FAR. A milestone was reached at the end of 2016: that is when the FAR was given access to the audit firms.
  2. Secondly, several security measures were agreed to realise a research data set that can no longer be traced back to a particular firm, specific client files, individual auditors, or employees. In large-scale quantitative research, such identities are not always important: researchers look for patterns in large volumes of data. The measures are the following (a minimal illustration of the anonymisation step follows this list):
     • Firms primarily anonymise their own data; an encryption application ensures the data can no longer be traced.
     • Strict data management procedures, such as the transformation and calculation of derived research variables, allow the FAR to eliminate indirect traceability as well.
     • Remote access for academics: researchers often do not physically receive the data, but can only analyse them in a secure FAR environment to which they must log in (and from which no data can be copied or exported).
     • Confidentiality checks by the FAR on all draft publications prior to release.
     • And, possibly most importantly, non-disclosure agreements (NDAs) as a baseline with all researchers involved. Under normal circumstances the latter is often sufficient but, in the case of the FAR, where ten firms collaborate with dozens of researchers, a great deal of structuring and organisation is necessary.
  3. Thirdly, structured and reliable access to the research data itself: the best part of 2017 was needed to work out and learn from this data-gathering process, in close collaboration between the firms, the research teams and the FAR. Besides questionnaires, experiments, case studies and interviews, the FAR also explicitly focuses on ‘archival data’: research data from audit files, the (financial) records of the audited firms themselves, personnel files, and quality assurance systems. All parties realise that many cases are involved (for example, 500 audit files per year, spread pro rata across the ten firms) and that many variables/information points are needed. An important finding in relation to the required audit files is that a lot of data is heavily secured and difficult to access due to confidentiality-related regulations. An important finding in relation to the requested information points is that they must be defined unambiguously; even so, major measurement differences can still be encountered between the ten firms, which the researchers must resolve. These data often also have to be collected from several systems (which must be connected to each other), or obtained from audit teams via an additional information request. It was necessary to learn how to collect such rich and in-depth data in a structured manner, and to arrange the internal organisation accordingly. This was already a challenge for structured, digitally available data, let alone for unstructured information, which often has to be collected and registered via requests to audit teams or even manually.
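To make the first two measures in the list above a little more concrete, the sketch below shows one possible way a firm could pseudonymise engagement identifiers before data leave the firm. It is a minimal illustration under our own assumptions (the field names, the use of a keyed HMAC, and the secret key are hypothetical), not a description of the encryption application the firms actually use.

```python
import hashlib
import hmac

# Secret key held by the firm and never shared with researchers (hypothetical value).
FIRM_SECRET_KEY = b"replace-with-a-randomly-generated-firm-key"

def pseudonymise(identifier: str) -> str:
    """Return a stable, non-reversible research ID for a client, engagement or employee ID."""
    digest = hmac.new(FIRM_SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]

# Hypothetical engagement records as they might leave a firm: direct identifiers are
# replaced by pseudonyms, while the research variables themselves are kept intact.
engagements = [
    {"client_id": "C-1001", "partner_id": "P-17", "audit_hours": 950, "finding_count": 2},
    {"client_id": "C-2002", "partner_id": "P-23", "audit_hours": 1210, "finding_count": 0},
]

research_set = [
    {
        "client_ref": pseudonymise(row["client_id"]),
        "partner_ref": pseudonymise(row["partner_id"]),
        "audit_hours": row["audit_hours"],
        "finding_count": row["finding_count"],
    }
    for row in engagements
]

for row in research_set:
    print(row)
```

Because the same input always maps to the same pseudonym, researchers can still link observations across years and systems, which is exactly what the pattern-oriented quantitative studies described above require, without ever seeing the underlying names.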

Exchanging ideas

Academics and practitioners can benefit from each other by doing research together. For instance, discussions took place in 2017 between the FAR and the audit firms about which types of data the FAR would request for its research. For example, FAR research groups examine how and when audit-file-related comments from individual audit team members have an impact on activities. The exact details of discussions and ‘negotiations’ about audit findings (with the audited organisation) are also being examined. In this regard, we asked the firms how they registered comments and discussions. One of the audit leaders involved responded as follows: “We do not always register this in our systems but, come to think of it, it would offer major advantages if we were to do so. At this moment in time, we have to dig very deep if we want to perform a root cause or error analysis. Such registration could help to improve and speed up this process”.

Academics have their own language

Academics also learn a lot from such discussions. Just as Salterio is critical of his own profession and says that academics speak a language that few practising auditors understand, the FAR finds that its research fellows use academic measures, for example, to gauge the quality of audit teams, with the result that practising auditors have difficulty recognising their own work in them. For example, academics measure ‘discretionary accruals’, treating unexplained abnormalities in financial statements as indications of quality problems in the audit. Auditors, however, argue that these findings say nothing about the work they actually perform. If academics use such measures of quality, practising auditors are unlikely to have much sympathy for their approach.
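To illustrate what such an academic measure looks like, the sketch below estimates discretionary accruals as the residual of a modified Jones-type accruals regression, one common choice in the literature. The data are simulated and the variable names are our own illustrative assumptions; the point is only to show that the measure is a statistical residual at the financial-statement level.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # simulated firm-year observations

# All variables are scaled by lagged total assets, as is conventional in this literature.
inv_assets = rng.uniform(1e-4, 1e-3, n)          # 1 / lagged total assets
d_rev = rng.normal(0.05, 0.10, n)                # change in revenue
d_rec = rng.normal(0.01, 0.05, n)                # change in receivables
ppe = rng.uniform(0.2, 0.8, n)                   # gross property, plant and equipment
total_accruals = -0.05 * ppe + 0.10 * (d_rev - d_rec) + rng.normal(0, 0.03, n)

# Modified Jones model: regress total accruals on their "non-discretionary" drivers.
X = np.column_stack([inv_assets, d_rev - d_rec, ppe])
coef, *_ = np.linalg.lstsq(X, total_accruals, rcond=None)

# Discretionary accruals are the part of accruals the model cannot explain (the residual).
# In audit research, large absolute residuals are often read as a sign of lower quality.
discretionary_accruals = total_accruals - X @ coef
print("Mean |discretionary accruals|:", round(float(np.abs(discretionary_accruals).mean()), 4))
```

Seen this way, it is understandable that auditors do not recognise their own work in the measure: it says something about the audited numbers, not directly about the audit procedures performed.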

Learn from each other

In short, the FAR finds that academics and practising auditors can learn a lot from each other on this front. Here is an example. An audit firm sets up an engagement registration system that compares, in real time, the audit activities performed with the audit plan (including hours worked). In case of deviations, the audit team is asked to explain them (for which there may, of course, be a good reason). The team concludes that audit quality has improved since the engagement registration system was introduced. In this case, an academic would ask: what is cause and what is effect? Can the improvement really be attributed to the introduction of the engagement registration system? Academics can help to answer this question because they possess research techniques that can establish such a causal relationship. For instance, an academic would ask the firm to perform some audits with the new registration system and some without it. This allows the researcher to determine whether the improvement in audit quality can be attributed to the introduction of the engagement registration system. Without this research, the improvement in quality might wrongly be attributed to the introduction of the new system and, consequently, the wrong policy measures might be taken.
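As a minimal sketch of the kind of comparison an academic might set up here (the quality scores, group sizes and numbers below are purely hypothetical), engagements could be randomly assigned to run with or without the registration system, after which one tests whether the observed difference in quality is larger than chance alone would produce:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical quality scores (e.g. internal file-review ratings) for engagements that
# were randomly assigned to use the new registration system or the old way of working.
with_system = rng.normal(loc=7.4, scale=1.0, size=60)
without_system = rng.normal(loc=7.0, scale=1.0, size=60)

# Because the assignment was random, a difference in mean quality can be attributed to
# the system itself rather than to, say, stronger teams adopting the tool first.
difference = with_system.mean() - without_system.mean()
t_stat, p_value = stats.ttest_ind(with_system, without_system, equal_var=False)

print(f"Difference in mean quality score: {difference:.2f}")
print(f"Welch t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```

Without such a comparison group, the same improvement could simply reflect which teams or clients happened to be audited after the system was introduced, which is exactly the attribution error described above.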

First hurdle cleared

In total, over two years were invested in creating robust foundations for the collaboration between FAR researchers and the audit firms. As of spring 2018, the first useful (archival) research data are being delivered to the research teams, which can now start their studies. A lot has been learned and achieved in the meantime. One example is the enrichment of the firms’ own management information systems, prompted by the FAR’s requests for certain information. This has got the firms thinking: should we not know this ourselves when managing and monitoring quality? As a result, it is quite possible that information that is currently collected manually will soon be more readily available from a central information system.

Initial interim or provisional results from several early research projects were shared at the FAR’s 3rd international conference (see the articles elsewhere in this special MAB edition). Does this mean the FAR has now passed the ‘tipping point’ of the learning curve? In any case, the first major hurdle has been cleared. To be fair, however, joint audit research will only remain possible in the future if there is clear commitment from, and close interaction between, the researchers involved and the audit firms concerned. Academic research can only be made less sticky by working together, as Salterio argued at the FAR conference.

A discussion with Steven Salterio during the FAR conference: a brief impression

After Steven Salterio’s introduction, Michael de Ridder (FAR board member representing PwC) led the discussion with the audience that followed. He first focused on practitioners: how can they be more involved in academic research? Salterio: “There must be a two-way exchange between academics, on the one hand, and practice, supervisory bodies and standard-setting bodies, on the other hand. They know the real problems and vexed issues. At the moment, academics are often asked: tell me everything you know about an issue, but the research question must be a lot more specific and question-based. At the start of the research, work more intensively with one another to specify the question. Do not start with a literature review, but examine what is going on in practice.”

Foreseeing the future means going back to the past. A conference participant asked what role Salterio foresees for research into the impact of new technologies on the auditing profession, and how the resulting insights can be integrated into day-to-day practice. “This involves better understanding the lessons from the past”, stated Salterio. He looked around the auditorium: “Does anyone still know what EDP is? Many years ago, people said that Electronic Data Processing would change the profession forever. We cannot foresee the future. For example, I do not know how blockchain will influence the auditing sector. Will it, for instance, take over valuation activities? Who knows. However, we can examine patterns of change by better understanding the past”.

De Ridder concluded by saying that he was impressed by the comparison with the medical sector, as a means of closing the gap between academics and practitioners: one of the FAR’s main objectives. “Let’s see how we can get this process started”.

Prof. dr. J.F.M.G. Bouwens is a professor of management accounting at the University of Amsterdam and Managing Director of the Foundation for Auditing Research.

Prof. dr. Olof Bik RA is a professor of Behavioral Research in Auditing at Nyenrode Business University and Managing Director of the Foundation for Auditing Research.

References

  • Boland A, Cherry MG, Dickson R (2017) Carrying out a systematic review as a master’s thesis. In: Boland A, Cherry MG, Dickson R (eds.) Doing a Systematic Review. A student’s guide, 2nd edition. Sage.
  • Graham ID, Logan J, Harrison MB, Straus SE, Tetroe J, Caswell W, Robinson N (2006) Lost in knowledge translation: Time for a map? The Journal of Continuing Education in the Health Professions 26(1): 13–24. https://doi.org/10.1002/chp.47
  • Hermanson DR (2005) Performing a literature synthesis project for the PCAOB. Auditor’s Report 29(1): 1–5.
  • Hoang KJ, Salterio S, Sylph J (2017) Barriers to transferring accounting and auditing research to standard setters. Available on SSRN: https://ssrn.com/abstract=2928450
  • Nonaka I, Takeuchi H (1995) The knowledge creating company. Oxford University Press (Oxford).
  • Petticrew M, Roberts H (2006) Systematic reviews in the social sciences: A practical guide. Blackwell.
  • Salterio S, Gondowijoyo PM (2017) Moving beyond the lab: Building on experimental accounting researchers’ core competencies to expand methodological diversity in behavioral accounting research. In: Libby T, Thorne L (eds.) The Routledge Companion to Behavioural Research in Accounting, chapter 12. Routledge: 149–174. Also available on SSRN: https://ssrn.com/abstract=2792439
  • Salterio S, Hoang KJ, Luo Y (2018) Communication is a two-way street: Analyzing approaches to enhance effective audit research knowledge transfer to policymakers. Working paper. Available at SSRN: https://ssrn.com/abstract=3224709
  • Szulanski G (2000) The process of knowledge transfer: a diachronic analysis of stickiness. Organizational Behavior and Human Decision Processes 82(1): 9–27. https://doi.org/10.1006/obhd.2000.2884
  • Zander U, Kogut B (1995) Knowledge and the speed of the transfer and imitation of organizational capabilities: An empirical test. Organization Science 6(1): 76–92. https://doi.org/10.1287/orsc.6.1.76