Big Data and Analytics in the Modern Audit Engagement: Research Needs
All content following this page was uploaded by Deniz A Appelbaum on 16 November 2017.
The DOI for this manuscript and the correct format for citing the paper are given at the top
of the online (html) abstract.
Once the final published version of this paper is posted online, it will replace this
preliminary version at the specified DOI.
Deniz Appelbaum
PhD Candidate
Rutgers, the State University of New Jersey, Newark
[email protected]
cell: 202-425-3633
Director of CARLab
Rutgers, the State University of New Jersey, Newark
[email protected]
cell: 201-454-4377
Abstract:
Modern audit engagements often involve examination of clients that are using big data and
analytics to remain competitive and relevant in today’s business environment. Client systems
now are integrated with the cloud, the Internet of Things, and external data sources such as
social media. Furthermore, many engagement clients are now integrating this big data with new
and complex business analytical approaches to generate intelligence for decision making. This
scenario provides almost limitless opportunities and also the urgency for the external auditor to
utilize advanced analytics. This paper first positions the need for the external audit profession to
move towards big data and audit analytics. It then reviews the regulations regarding audit
evidence and analytical procedures, in contrast to the emerging environment of big data and
advanced analytics. In a big data environment, the audit profession has the potential to
undertake more advanced predictive and prescriptive oriented analytics. The next section
proposes and discusses six key research questions and ideas followed with particular emphasis
on the research needs of quantification of measurement and reporting. This paper provides a
synthesis and review of the concerns facing the audit community with the growing use of big data
and complex analytics by their clients. It contributes to the literature by expanding upon these emerging concerns and by suggesting opportunities for future research.
There is an increasing recognition in the audit profession that the emergence of big data
(Vasarhelyi, Kogan, and Tuttle 2015) as well as growing use of data analytics in business
processes has brought a set of new concerns to the audit community. Accountants1, large audit firms2, standard setters3, and academics4 have been progressively raising many issues, among
which we find:
2. Which of these methods are the most promising?
These concerns have emerged even though analytical procedures in general have been
addressed by the American Institute of Certified Public Accountants (AICPA) guidelines of 1972
and in numerous academic papers since 1955. Statement on Auditing Standards (SAS) No. 1 states:
“The evidential matter required by the third standard (of field work) is obtained through two
general classes of auditing procedures: (a) tests of details of transactions and balances, and
(b) analytical review procedures applied to financial information (AICPA 1972 par.
320.70).”
There is a fine balance in every audit engagement between detailed evidence collection and
analytical procedures (Yoon 2016). Detailed evidence collection can be quite costly yet deemed
more reliable according to the standards, while analytical procedures are widely viewed as being
less costly and believed less reliable by regulators (Daroca and Holder 1985; Tabor and Willis
1985). Both processes are allowed by the standards; their degree of utilization depends on
auditor professional judgment. While the requirement of tests of details of transactions and
completely undefined, except that it should be applied to financial data (Tabor and Willis 1985).
More recently, according to AU-C Section 520 about Analytical Procedures (AICPA 2012a),
• evaluate the reliability of the data from which these ratios are developed;
• develop an expectation of recorded amounts and ratios and whether these are accurate; and finally
• determine the amount of difference (if any) between the recorded amounts and the expectations.

5 The PCAOB issued Release No. 2016-003 on May 11, 2016, re-proposing new standards for the audit report, in which, in addition to the traditional pass/fail model, "critical audit matters" (CAM) would be disclosed.
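For illustration only, the expectation-and-difference steps above might be sketched in a few lines of code. The naive trend model, the account figures, and the 5% tolerance are invented for the example and are not drawn from the standards.

```python
# Hypothetical illustration of the steps above: develop an expectation,
# compare it with the recorded amount, and measure the difference.
# Trend model, figures, and 5% tolerance are all invented.

def expected_from_trend(history):
    """Naive expectation: last value plus the average period-over-period change."""
    changes = [b - a for a, b in zip(history, history[1:])]
    return history[-1] + sum(changes) / len(changes)

def analytical_difference(history, recorded, tolerance=0.05):
    """Return (expected, difference, flagged); flagged=True means the recorded
    amount deviates from the expectation by more than the tolerance."""
    expected = expected_from_trend(history)
    diff = recorded - expected
    return expected, diff, abs(diff) > tolerance * expected

# Four prior quarters of (invented) revenue, versus the recorded fifth quarter.
history = [100.0, 110.0, 120.0, 130.0]
expected, diff, flagged = analytical_difference(history, recorded=160.0)
```

A flagged difference would then be investigated further rather than accepted, in line with the judgment the standards require of the auditor.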
The lack of detailed recommendations in this age of automation and big data regarding which
analytical procedures to undertake in the external audit engagement has inspired considerable
discussion. Although the internal audit environment is increasingly using analytics (Vasarhelyi
et al. 2015; Perols and Lougee 2011; Dilla et al. 2010; Yue et al. 2007; Alles et al. 2006; Church
et al. 2001), the external audit field has not responded to the same degree. The regulations, such
as the guidance for sampling, have remained unchanged even though many audit clients
automate the collection and analysis of 100% of their transactions (Schneider et al. 2015; Zhang
et al. 2015).
This paper provides a synthesis and review of the concerns facing the audit community with
the growing use of big data and complex analytics by their clients. It contributes to the literature
by clarifying and expanding upon these emerging concerns and by suggesting opportunities for
future research. This paper first reviews the current standards regarding evidence collection and
analytical procedures as currently understood in the profession, before discussing big data and
business analytics. The role of big data and business analytics and their implications for the audit
profession should first be understood in the context of current practice. Then this paper broadly
reviews each of these six concerns emerging in the profession as a result of the use of big data
and analytics by engagement clients. These concerns are subsequently followed with an
elaboration of additional forward-looking research issues, with special emphasis on the research needs of quantification of measurement and reporting.
It is essential to understand the current scope and constraints of the public audit profession
before envisioning the role of more complex analytics and big data in the engagement. Since
auditing is largely a regulation driven profession, the expectations regarding evidence collection
and analytical procedures should be considered. The auditor still needs to test for basic assertions
to make sure that the objectives of the audit are fulfilled regardless of the nature of the evidence
and the way the evidence is being collected. The tests for certain assertions may change in the
current new environment with its different nature of evidence and the way this evidence is
collected and analyzed. However, even if the tests of assertions were to be altered, the assertions
themselves would not change and neither would the fundamental objective of the public auditor: to provide an opinion on the financial statements as to whether they represent the financial position
of the client in accordance with the generally accepted accounting principles.
The main purpose of the work conducted by an auditor in an external engagement is to obtain
reasonable assurance that the client’s financial statements are free from material misstatements
and to subsequently express an opinion regarding these financial statements and the client’s
internal controls in the auditor’s report. To accomplish this task, the auditor must design and
perform audit procedures to obtain sufficient appropriate evidence; furthermore, the Audit
Standards require auditors to examine physical evidence as part of the risk assessment process
(PCAOB 2010, AS 1105; AICPA 2012, SAS 122; IAASB 2009, ISA 500). Audit evidence is all
the information (whether obtained from audit procedures or other sources) that either confirms or
contradicts or is neutral about management’s assertions on the financial statements or internal
controls.
Additionally, the Sarbanes-Oxley Act (SOX) demands that auditors verify the accuracy of the
information or evidence that forms the basis of their audit opinion. Since SOX, audit firms have
relied more heavily on detailed audit examination and scanning for substantive tests as these are
regarded to be “harder” audit evidence formats than regression and other “softer” analytical
techniques (Glover et al. 2014). The impact of this legislation on the profession’s analytical
procedures choices should not be ignored. However, as mentioned and footnoted in the
Introduction, every one of the “Big Four” has recently publicly announced efforts in the domain of big data and audit analytics.
Since audit evidence is all the information used by the auditors to form the audit opinion
(PCAOB, 2010, AS 1105), it should be both sufficient and appropriate. Basically, if the
underlying information is not reliable or strong enough and its origin isn’t verifiable, then more
evidence will need to be collected and reviewed (Appelbaum, 2016). Poor quality evidence
cannot be compensated for by collecting a larger amount of data (PCAOB 2010, AS 1105).
However, in today’s complex IT and big data environment, the nature and competence of this
audit evidence has changed (Brown-Liburd and Vasarhelyi 2015; Warren et al. 2015; Nearon
2005). With big data, quantity of evidence is hardly an attribute with which to be concerned.
However, quality of electronic evidence becomes even more dominant in the equation and may
be more challenging to verify. Most stages of a transaction can be computer generated and
recorded and can only be verified electronically. For example, with additional information
available from external big data, intangible assets might be partially valued by the client from
information derived from text analysis of aggregated tweets and web scraping of social media.
Unfortunately, the reliability of these tweets and social media is hard to verify (Appelbaum
2016).
The issues for electronic accounting data and electronic audit evidence are drastically
different from that of manual and paper-based examination. Many of the characteristics that are
strengths with paper-based evidence pose issues for electronic evidence. Where paper
documentation is regarded as not easily altered, electronic data may be easily changed and these
alterations might not be detected, absent the appropriate controls. In paper-based evidence
collection, sources that are verified external to the client are considered to be highly reliable
(PCAOB 2010, AS 1105), whereas external electronic evidence is difficult to verify for origin
and reliability. Paper-based evidence is easy to evaluate and understand, whereas electronic data
and evidence may require a high level of technical expertise of the auditor. Since big data is
electronic data, big data presents a scenario where these complexities are magnified greatly.
Furthermore, the types of tests that should be undertaken by auditors to examine basic assertions
Analytical procedures are required by the Public Company Accounting Oversight Board
(PCAOB) in the planning phase (PCAOB 2010, AS No. 2110) and review phase (PCAOB 2010, AS No. 2810), but are undertaken according to auditor judgement in the substantive testing phase.
The purpose of analytical procedures is different for each audit phase. In the risk assessment phase, analytical procedures are used to identify matters that may pose risks to the audit. The auditor is expected to perform analytical procedures for the revenue accounts, and the auditor should also use his or her knowledge of the client and its industry to develop
expectations. The standards admit that the data may be at a more aggregated level and result in a less precise expectation.
Accordingly, in AS No. 2305.04, analytical procedures are used in the substantive testing
phase to obtain evidence about certain assertions related to certain accounts or business cycles.
Analytical procedures may be more effective than tests of details in some circumstances (Yoon
2016). In AS No. 2305.09, the PCAOB states that “the decision about which procedure or
procedures to use to achieve a particular audit objective is based on the auditor’s judgement on
the expected effectiveness and efficiency of the available procedures.” The main limitations
appear to be the “availability” of certain procedures and the auditor’s judgement on the expected
effectiveness of certain analytical methods. The latter condition would appear to reflect the
auditor’s level of familiarity with certain analytical methods.
For the review phase of the audit engagement, analytical procedures are required to evaluate
the auditor’s conclusions regarding significant accounts and to assist in the formation of the audit
opinion (PCAOB 2010, AS No. 2810.05-.10). Similarly, as in the planning phase, the auditor is required to perform analytical procedures related to revenue during the relevant period. In this
section, there is no mention of any one analytical approach, except that this phase typically is
similar to the planning phase. As such, it is expected that the more complex exploratory or
Auditors are required to conduct the audit engagement within the parameters of the
regulations, regardless of the IT or accounting complexity of the client. It is highly probable that
the client may be undergoing processes with advanced analytical techniques and new sources of
data. The newest challenges facing the auditor are the increasing use of big data and the adoption of advanced analytics by engagement clients. Given this current audit environment of big data and advanced analytics, what follows are immediate
research questions that should be addressed if the profession is to integrate itself within this new
business paradigm.
Big Data
Many client systems now are increasingly integrated with the cloud, the Internet of Things,
and external data sources such as social media. Client data may exhibit large variety, high
velocity, and enormous volume – big data (Cukier and Mayer-Schoenberger 2013). This data
may originate from sensors, videos, audio files, tweets and other textual social media – all data
types typically unfamiliar to an auditor (Warren et al. 2015). However, this big data provides
almost limitless opportunities to the external auditor to utilize advanced analytics. According to
extant analytics research (Holsapple, Lee-Post, and Pakath 2014; Lee et al. 2014; Delen and
Demirkan 2012), big data should provide auditors the opportunity to conduct prescriptive
analytics – that is, to apply techniques that computationally determine available actions and their
consequences and/or alternatives, given the engagement’s complexities, rules, and constraints
devices and the Internet of Things (IoT) (Atzori, Lera, and Morabito 2010; Domingos 2011; Dai
and Vasarhelyi 2016) is progressively interconnecting with corporate systems.6 The economics
of hardware and software development are of a very different nature than those of traditional systems. It is
not inconceivable that analytic methods such as regression may be built into chips, including
powerful explanatory software7 that would provide interpretations of the results and recommend actions.
Advances in text interpretation, voice recognition, and video (picture) recognition would further expand the evidence available to the audit. On another dimension, the latency of information and its processing systems is progressively reduced,
mainly as the result of faster chips, interconnected devices, and automatic sensing of
information. The traditional annual audit, or even quarterly report evaluation, would have limited value in such a low-latency environment. With new forms of evidence available for the audit profession to include in the examination, the standards regarding audit evidence may need
to be discussed and possibly re-examined in the context of big data. Regardless of the source,
the data should be reliable and verifiable. Table One outlines the challenges that big data poses to the auditor.
How can the availability of big data sets, both internally and externally to the enterprise, be
utilized to enhance analytics? Can the extremely large amounts of data compensate for uncertain
or, at times, lower quality of such data? Some argue that big data is meant to be messy (Cukier and Mayer-Schoenberger 2013). In cases where big data is of dubious origins or
lacking audit trails (Appelbaum 2016), the standards currently would indicate that no amount of additional data can compensate for its deficiencies.
Consider, for example, the Jans et al. (2014) application of process mining. That paper details the use of process mining on a 100% test of the transactions to find the anomalies where controls fail in the processing of 26,185 purchase orders (POs); essentially, cases where the audit trails are problematic. A series of process mining tests (a type of ADA) narrows the sample of anomalies
down to the highest risk scenarios, which exemplify high rates of violations among individuals and small networks of people working together. This seems a perfect example of how ADAs
can be used for more efficient audit testing.
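A toy sketch, loosely in the spirit of the Jans et al. (2014) approach described above (the event log, field names, and the segregation-of-duties rule are all invented): scan 100% of the PO events for control violations, then rank the individuals involved.

```python
from collections import Counter

# Invented event log: who created, approved, and paid each purchase order.
purchase_orders = [
    {"po": 1, "created_by": "ann", "approved_by": "bob", "paid_by": "cat"},
    {"po": 2, "created_by": "ann", "approved_by": "ann", "paid_by": "cat"},  # violation
    {"po": 3, "created_by": "dan", "approved_by": "dan", "paid_by": "dan"},  # violation
    {"po": 4, "created_by": "bob", "approved_by": "cat", "paid_by": "ann"},
]

def segregation_violations(events):
    """Full-population control test: flag POs where one person performed
    more than one of the create/approve/pay steps."""
    flagged = []
    for e in events:
        roles = [e["created_by"], e["approved_by"], e["paid_by"]]
        if len(set(roles)) < len(roles):
            flagged.append(e)
    return flagged

def duplicated_persons(e):
    """The individuals who performed more than one role on a flagged PO."""
    roles = [e["created_by"], e["approved_by"], e["paid_by"]]
    return [p for p in set(roles) if roles.count(p) > 1]

violations = segregation_violations(purchase_orders)
# Rank individuals by violation count, highest risk first.
risk_rank = Counter(p for e in violations for p in duplicated_persons(e)).most_common()
```

Real process mining discovers and compares entire process models from event logs; this sketch only shows the narrowing-to-highest-risk idea on a single invented control rule.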
Fraud-related issues may be as challenging, if not more so, to the audit team in a big data
environment. More data does not necessarily equal more effective information, and the added
complexity of the big data could complicate the assessment of audit evidence for fraud
(Srivastava et al. 2009; Srivastava et al. 2011; and Fukukawa et al. 2014). Fraud detection also
focuses on the assessment of internal controls, regardless of whether the analytics are based on
sampling or on processing 100% of the population. It is important to point out that no matter how
strong the internal control system, management can still perpetrate fraud by over-riding the
internal controls. In a big data environment, it is quite possible that the volume and complexity
of the data might actually hinder what is already a troublesome task for many engagement teams.
Furthermore, how can the amount of audit evidence provided by analytics in a big data
context be measured? How can this evidence be aggregated with other types of audit evidence in
a methodologically sound way? How can such quantitative measures be used to provide support
for the auditor’s judgement about the sufficiency of audit evidence? The entire standards of audit
evidence may need to be reassessed and subsequently revised in this age of electronic and big
data evidence (Appelbaum 2016; Brown-Liburd and Vasarhelyi 2015). Electronic and big data
evidence often raise issues opposite of those assumed by the standards for paper-based
documentation. As business processes now are very infrequently paper-driven, the standards on
reliable evidence, which are derived from quality evidence of sufficient amount, may need to be
revised to provide a more quantitative measure of quality vs. quantity in an IT audit.
in the public audit engagement context because the two terms might not be completely
important part of the audit process and mainly consists of an analysis of financial information
made by a study of believable or plausible relationships among both financial and non-financial
data. These analytical procedures could be as basic as scanning (viewing the data for abnormal
events or items for further examination) to more complex approaches (not clarified by the
standards, except that the approach should enable the auditor to appropriately develop an expectation). Business analytics (BA) may be defined as “the use of data, information technology, statistical analysis, quantitative methods,
and mathematical or computer-based models to help managers gain improved insight about their
operations, and make better, fact-based decisions” (Davenport and Harris 2007). BA may be
further conceptualized with the three dimensions of Domain, Orientation, and Technique (Holsapple et al. 2014). Domain represents the context or environment for the analytics.
Orientation describes the vision or focus of the analysis – either descriptive, predictive, or
prescriptive. Descriptive orientation answers what happened and is backward looking. Its
techniques convert this analysis into useful information via visualization, graphs, and descriptive
statistics. Predictive orientation then takes the descriptive information of what happened and
hypothesizes what could happen. Predictive analysis is the process of developing expectation
models, with which auditors are quite familiar. Basically, predictive analysis uses data from the
past and the present to generate relevant predictions (many logical, statistical, machine learning
approaches). Prescriptive orientations take predictions further. Based on what happened and
using experimental design, this mode presents an optimization analysis to identify the best
possible alternative. The techniques define the actual method or approach for analysis.
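The three orientations might be contrasted in a small sketch (the sales figures, the least-squares trend model, and the ordering rule are invented): descriptive statistics summarize what happened, a trend line predicts what could happen, and a toy optimization picks the best available action.

```python
# Illustrative contrast of the three orientations on invented monthly sales.
sales = [10.0, 12.0, 11.0, 13.0, 14.0, 16.0]

# Descriptive: what happened.
n = len(sales)
mean = sum(sales) / n

# Predictive: what could happen next, via a least-squares trend line.
xs = range(n)
x_bar, y_bar = (n - 1) / 2, mean
slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, sales))
         / sum((x - x_bar) ** 2 for x in xs))
forecast = y_bar + slope * (n - x_bar)  # expectation for the next period

# Prescriptive: given the forecast, choose the best available action under
# a toy rule (smallest order quantity that covers predicted demand).
order_options = [10, 15, 20]
best_order = min(o for o in order_options if o >= forecast)
```

The point is the progression, not the models: each orientation consumes the output of the previous one, which is why prescriptive analytics presupposes reliable descriptive and predictive layers.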
The focus or context of BA for management would be somewhat different from that of the
auditor. Management accountants are seeking to extract and develop insightful knowledge to support management decision-making. Internal auditors are seeking to verify the effectiveness and
accuracy of this information. External auditors are concerned with BA as they relate to
verification of the veracity of the financial statements. However, both audit tasks involve
generating expectation models as well as confirmatory models. Since auditors examine business
Techniques are the analytical approaches that can be described as either descriptive,
predictive, or prescriptive, depending on the task of the analysis and the type of data. The more
forward looking the task and the more varied and voluminous the data (big data), the more likely
the analysis will be prescriptive or at the very least, predictive. Advanced or more complex BA
may be defined as “Any solution that supports the identification of meaningful patterns and
correlations among variables in complex, structured and unstructured, historical, and potential
future data sets for the purposes of predicting future events and assessing the attractiveness of
various courses of action. Advanced analytics typically incorporate such functionality as data
mining, descriptive modeling, econometrics, forecasting, operations research, optimization,
conducting an effective and efficient engagement by utilizing ratio and trend analysis and
scanning, which are the techniques typically used and with which the auditor is comfortable
(Glover et al. 2014)? When would the auditor rely more on analytical procedures over
substantive detailed testing? Or, is there room in the current understanding and regulations of
analytical procedures for these more complex approaches? Can analytical procedures be
Stewart (2015) defines: “Audit Data Analytics (ADA) is the analysis of data underlying
financial statements, together with related financial or non-financial information, for the purpose
of identifying potential misstatements or risks of material misstatement.” This definition is
illustrated by linking analytical procedures with traditional data procedures (Figure One). ADA
encompasses both the traditional file interrogation with which auditors are quite familiar as well
as analytical procedures and analytics, some of which auditors may be less acquainted with. Both
may be more easily understood by obtaining an understanding of the modes of ADA. Traditional
file interrogation and analytical procedures are subsets of the larger field of ADA. If ADA is
understood as exploratory or confirmatory in task, this task oriented approach “allows” the
Liu (2014) has proposed the use of Exploratory Data Analysis (EDA) (Tukey 1977, 1980) in
the audit process to generate more directed and risk sensitive audit assertions for their ensuing
usage through Confirmatory Data Analysis (CDA). Furthermore, Liu (2014) examined where
these applications could be used in the audit process as well as their placement in extant audit
standards (see Appendix A). Liu (2014) and Stewart (2015) placed EDA and CDA into the
definition, Stewart (2015) and Liu (2014) add that ADA can be exploratory and confirmatory
and illustrate its functionalities. Although new or more complex methods can be proposed and
even adopted by firms, it does not mean that these methods are being promoted by the standards
– instead, these new methods are simply not precluded. For instance, while regression was
incorporated in the Deloitte, Haskins and Sells methodology (Stringer and Stewart 1966), its use never became widespread in practice.
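The exploratory-then-confirmatory (EDA/CDA) idea can be illustrated with a minimal sketch; the journal-entry amounts and the two-standard-deviation cutoff are invented.

```python
# Toy EDA step: scan journal entries for statistical outliers; the flagged
# items then become targets for confirmatory (substantive) testing.
# Amounts and the 2-sigma cutoff are illustrative assumptions.

def z_scores(values):
    """Standardize each value against the population mean and std. dev."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    sd = var ** 0.5
    return [(v - mean) / sd for v in values]

entries = [100, 105, 98, 102, 99, 101, 500]  # one unusual entry
flags = [i for i, z in enumerate(z_scores(entries)) if abs(z) > 2]
# CDA step (not shown): vouch each flagged entry to supporting documents
# and test the assertion it bears on.
```

The exploratory scan does not itself provide assurance; it directs the confirmatory effort, which is the division of labor Liu (2014) and Stewart (2015) describe.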
Since that time, audit researchers have been revisiting Bayesian (Dutta and Srivastava 1993; Srivastava 1996; Srivastava 2011; Srivastava et al. 2012; and Srivastava, Wright, and Mock 2012) and Dempster-Shafer (Gordon and Shortliffe 1985; Pearl 1986; Shafer and Srivastava 1990;
Srivastava 1995; Srivastava and Shafer 1992; Sun, Srivastava, and Mock 2006; and Srivastava
2011) Frameworks of Belief Functions to assist with analysis of audit evidence uncertainties.
The strength of evidence that supports various assertions should be measured and aggregated to
finally determine if the assertions are true. This requirement holds true even in a big data
environment.
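As one hypothetical illustration of such aggregation, Dempster's rule of combination can be applied to two independent items of evidence bearing on a single assertion; the mass assignments below are invented, and the frame is reduced to "assertion holds" ("a"), "does not hold" ("na"), and ignorance ("theta").

```python
# Minimal sketch of Dempster's rule of combination for two independent
# items of audit evidence on one assertion. Mass values are invented.

def combine(m1, m2):
    """Dempster's rule on the frame {a, na} with ignorance 'theta'.
    (Total conflict, 1 - conflict == 0, is not handled in this toy.)"""
    def meet(x, y):
        if x == "theta":
            return y
        if y == "theta":
            return x
        return x if x == y else None  # None = empty intersection (conflict)
    raw = {"a": 0.0, "na": 0.0, "theta": 0.0}
    conflict = 0.0
    for x, px in m1.items():
        for y, py in m2.items():
            z = meet(x, y)
            if z is None:
                conflict += px * py
            else:
                raw[z] += px * py
    return {k: v / (1.0 - conflict) for k, v in raw.items()}

m1 = {"a": 0.6, "na": 0.1, "theta": 0.3}  # e.g., analytics moderately support
m2 = {"a": 0.4, "na": 0.2, "theta": 0.4}  # e.g., confirmations weakly support
combined = combine(m1, m2)
belief_a = combined["a"]  # aggregated support for the assertion
```

Note that the combined belief (about 0.71 here) exceeds either item alone, which is the aggregation property that makes belief functions attractive for evidence measurement.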
While measurement is one issue, the structure of evidence is another concern because some
items of evidence support one assertion or one account while others may provide support to more than one assertion or account. Thus, audit judgment is basically reasoning with evidence in a network of variables, the variables being the assertions and accounts, which is called “Evidential Reasoning” in the Artificial Intelligence literature. Gordon and Shortliffe
(1985) discuss this approach of the Dempster–Shafer theory of evidence in Rule-Based Expert
Systems and Pearl (1986) uses this approach for analyzing causal models under the Bayesian
framework, while more recently Srivastava (2011) applies this technique for aggregating audit
evidence both under Bayesian and Dempster-Shafer Theory.
More recently, Dempster-Shafer theory has been applied to assist auditors with the aggregation of evidence to obtain judgements about measuring risks and strengths (Fukukawa and Mock 2011;
and Fukukawa, Mock, and Srivastava 2014). However, it is not clear to what degree the
profession feels comfortable and confident with implementing these approaches in the
engagement given the prevailing competitive and regulatory pressures. If the PCAOB were to
issue guidelines and best practices for applying Belief Functions and Dempster Shafer
Probability Theories for the risk assessment phase, then perhaps the engagement team, if
phases, but are non-committal about which techniques auditors should undertake to achieve these
objectives. Hence, whether an auditor employs more complex analytics such as Belief Functions
or “traditional analytical procedure” techniques such as ratio analysis would seem to depend on
the auditor’s own knowledge and less so on the standards. It has also been proposed that any
adoption by the external audit profession of either advanced analytics or big data would be due to
market or business forces exogenous to the firms (Alles 2015). The recent revival of interest in evidential frameworks for the audit engagement provides many areas for future debate and research. These areas are broadly discussed in the sections that follow.
Six Concerns Relative to Advanced Analytics in the Modern Engagement
The advent of computers, large storage systems, and integrated software has transformed business processes in the first wave of the information age. Their availability has brought to the fore the potential of a large number of analytic methods progressively being used in business
but still emerging in the external audit domain. The six questions enumerated in the Introduction are discussed in turn below.
Perhaps this research question could be rephrased as: Should auditors expand their use of
analytical procedures beyond that of scanning, ratio and time series analysis, and detailed
examination? Are these techniques effective and efficient in a big data context? Basically, these
questions emerge and are summarized in Table Two: Should there be more guidance regarding
analytic methods in the audit? Do we know enough about these methods that this guidance can
be issued? What are the tradeoffs between 100% population tests, sampling, and ad hoc
analytics? The standards (PCAOB 2010, AS 1105) suggest that 100% testing would only apply
in certain situations, such as: the population consists of a small number of high value elements;
the audit procedure that is designed to respond to a significant risk and other means of testing do
not provide sufficient evidence; and finally, the audit procedure can be automated effectively and
applied to the entire population. The last condition is noteworthy, as current technologies can
support automation of basic audit tests such as three-way matching and sampling, in addition to
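Three-way matching, noted above as readily automatable, might be sketched over a full (toy) population; the record layouts and values are invented.

```python
# Toy full-population three-way match: every invoice must agree exactly
# with a purchase order and a receiving report. Records are invented.

pos      = {("PO1", "widget", 10), ("PO2", "gadget", 5)}
receipts = {("PO1", "widget", 10), ("PO2", "gadget", 4)}
invoices = [("PO1", "widget", 10), ("PO2", "gadget", 5), ("PO3", "gizmo", 2)]

def three_way_exceptions(pos, receipts, invoices):
    """Return invoices lacking an exact PO and receipt match."""
    return [inv for inv in invoices
            if inv not in pos or inv not in receipts]

exceptions = three_way_exceptions(pos, receipts, invoices)
```

Because the test touches every invoice rather than a sample, its output is an exception population for follow-up, which is exactly the condition the standard attaches to 100% testing: the procedure must be effectively automatable.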
The strong emphasis on judgment that exists in auditing is justified by the enormous variety
of situations that complex businesses, different industries, international locations, and data
structures present to the engagement team, limiting their ability to narrowly pre-set audit rules.
Do modern statistical and machine learning methodologies make it possible to automate pre-set audit procedures while reserving auditor attention for matters of larger judgment? Can audit findings and judgments be disclosed in a more disaggregate manner
with the usage of drill-down technologies where the opinion would be rendered and broken down
into sub-opinions and quantified in terms of probabilistic estimates (Chesley 1977, 1978)? Can
the above be stated in terms of rules implementable in automated audit systems to continuously
monitor and drive Audit by Exception (ABE) (Vasarhelyi and Halper 1991)?
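A minimal sketch of such rule-driven, exception-triggered monitoring; the rules, thresholds, and transactions are invented.

```python
# Toy Audit-by-Exception monitor: pre-set rules scan a transaction
# stream and raise alarms only for exceptions. Rules are invented.

RULES = [
    ("amount_over_limit", lambda t: t["amount"] > 10_000),
    ("weekend_posting",   lambda t: t["weekday"] in ("Sat", "Sun")),
]

def monitor(stream):
    """Yield (rule_name, transaction) alarms; silence means no exception."""
    for t in stream:
        for name, rule in RULES:
            if rule(t):
                yield name, t

stream = [
    {"id": 1, "amount": 500,    "weekday": "Mon"},
    {"id": 2, "amount": 25_000, "weekday": "Tue"},
    {"id": 3, "amount": 900,    "weekday": "Sun"},
]
alarms = list(monitor(stream))
```

Only the alarmed transactions reach the auditor, which is the inversion of effort that Audit by Exception implies: attention is allocated by the rules, not by sampling.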
These methods suggest different staging of the audit (audit re-modularization), changed organization
(separate analytic function), changed sequencing, changed tasks, changed timing (continuous,
agent driven, exception driven) (Vasarhelyi and Halper 1991) and changed personnel (more
literate in IT and data; specialized) making it difficult to evaluate the literature in the context of
the external audit. Appelbaum, Kogan, and Vasarhelyi (2016) have recently organized, examined
and categorized this body of external audit literature. That study covers more than 300 papers
published since the mid-1950s that discuss analytics in at least one phase of the audit. Due to the
standards requiring analytical procedures in both the planning and review stages, these two
phases are the predominant focus in the literature, as are substantive testing and sampling
(Appelbaum et al. 2016). Many different analytical techniques are utilized at all phases of the
audit, but in an inconsistent manner. Methods that are most promising are categorized as follows:
1) Audit Examinations: transaction tests, ratio analysis, sampling, confirmations, re-performance, CAATS automation;
2) Unsupervised9: Clustering, Text Mining, Visualizations, and Process Mining (discovery
models);
Expert Systems, Decision Aids, Bagging, Boosting, C4.5 classifiers, Bayesian Theory,
9 Unsupervised approaches are those techniques that draw inferences from unlabeled or unknown datasets, since there is minimal hypothesis of the results based on labeled responses.
10 Supervised approaches are those techniques that draw inferences from labeled or known dataset types, otherwise known as training data.
5) Other Statistics: Multi-Criteria Decision Aid, Benford’s Law, Descriptive Statistics,
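Benford's first-digit test, listed above, is straightforward to sketch; the ledger amounts below are invented (and far too few for a meaningful test), with positive integer amounts assumed for simplicity.

```python
import math
from collections import Counter

def benford_expected(d):
    """Expected proportion of leading digit d (1-9) under Benford's Law."""
    return math.log10(1 + 1 / d)

def leading_digit_freq(amounts):
    """Observed leading-digit proportions for positive integer amounts."""
    digits = [int(str(a)[0]) for a in amounts]
    counts = Counter(digits)
    n = len(digits)
    return {d: counts.get(d, 0) / n for d in range(1, 10)}

# Invented ledger amounts; in practice the test runs on a full population.
amounts = [123, 1890, 145, 234, 1020, 310, 172, 2600, 141, 190]
observed = leading_digit_freq(amounts)
deviation = {d: observed[d] - benford_expected(d) for d in range(1, 10)}
```

Large positive deviations for particular digits would direct further testing toward the amounts beginning with those digits.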
These analytical models range from very simple substantive tests and routines to more
complex and predictive techniques requiring significant auditor judgement. The auditor will need
to determine what type of analysis gives the best quality and most efficient audit, given the audit
task, the assessed audit risk, and the available data. Ideally, the auditor should be able to perform
most if not all procedures to more exacting standards in a big data and continuous auditing
environment; auditors would spend less time navigating through insufficient samples and instead
identify and examine exceptions.
Auditors selecting these more complex techniques need to understand them in terms of their
benefits and limitations. Furthermore, the tasks of risk assessment, substantive procedures, and
tests of controls may be different when 100% of the data is examined (Yoon 2016). For example,
if auditors are examining 100% of items in the population (PCAOB 2010, AS No. #1105.24),
the emphasis and reason for testing internal controls should change. Internal control testing has
been prescribed in the regulations (American Institute of Certified Public Accountants [AICPA]
1997, SAS No. #80) to supplement substantive testing for validating sampling results when
auditors have limited access to data. It has been suggested (IAAE 2016, p. 18) that internal
controls testing in an Audit by Exception type of environment could provide some assurance.
To summarize the issues of which methods are the most promising (Table Three) given the
audit task as defined by the standards: a new environment of assurance is emerging where
automation of controls, full population testing, and analytic methods will interplay. Research is
needed on modern analytic methods to establish their applicability in different instances and their benefits and limitations.
A set of questions arises with the application of analytics that must be tested in the field.
Would a safe harbor experimentation (a la XBRL) process be needed for the testing of
new analytic methods? The new proposal provides information on audit results in at least five areas where needed. The
audit will be affected in many ways by the emerging environment and its disruptive technologies. If some
form of Audit by Exception (ABE) (Vasarhelyi and Halper 1991) emerges whereby the audit
process is activated by alarms triggered in data streams, and a plethora of new analytics emerge,
clearly the sequence of events will be transformed and the applicability of analytic methods
expanded. Furthermore, there will be ubiquitous use of techniques such as visualization, and
multi-complementary use of many analytic methods. Visualizations are used heavily in business
management to explain the results of analysis (Dilla et al. 2010; Kohavi et al. 2004). Many
techniques exhibit varying strengths and weaknesses and are more beneficial when applied in
combination rather than separately.11
11 The AICPA has created the Audit Data Standard (Zhang et al. 2012) to guide in the formalization of data to be
received in the audit, its classification (into cycles), and its measurement.
The sequencing (or simultaneity) of events will change as
automated use of data analytics will precede or coincide with the more traditional audit
examination, which may progressively be reduced. For example, today the audit engagement
typically progresses as shown in Figure Two but is envisioned to eventually innovate to a more
data driven approach. The above process, which drives most current engagements, is sample driven; in a more data
driven environment the engagement would progress differently (Figure Three).
However, in this ABE approach the auditors may face a different challenge: testing all of the
transactions may produce thousands of exceptions (Dohrer, McCullough, and Vasarhelyi 2015)
if the threshold definition of a material deviation is set too high. That is, the threshold approach
for sampling most likely will not work in ABE; the threshold should be more precise to eliminate
the “false positive” exceptions. The standards require that all exceptions should be examined
(PCAOB 2010, AS No. #2305, AS No. #2315), but this was mandated for sampling (IAAE 2016,
p. 17). In an ABE context, if the tests were not configured correctly, there could be an
overwhelming number of exceptions; auditors would need additional tests to “explain away” many of the
exceptions and categorize the resulting few as “Exceptional Exceptions” (Issa et al. 2016).
Clearly auditors will need to possess a broad and deep skill set.
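The two-stage filtering described here — flag full-population deviations beyond a threshold, then "explain away" benign patterns so that only exceptional exceptions remain for review — can be sketched as follows. This is an illustrative sketch under assumed data structures (the Transaction fields and rule format are ours), not a prescribed ABE implementation:

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    txn_id: str
    amount: float
    expected: float  # expectation-model value for this transaction

def flag_exceptions(txns, threshold):
    """First pass over 100% of the population: flag every transaction
    whose deviation from the expectation exceeds the threshold."""
    return [t for t in txns if abs(t.amount - t.expected) > threshold]

def exceptional_exceptions(flagged, explain_rules):
    """Second pass: 'explain away' flagged items matching a known benign
    pattern, keeping only the exceptional exceptions for auditor review."""
    return [t for t in flagged
            if not any(rule(t) for rule in explain_rules)]
```

The calibration of `threshold` and the library of explanatory rules are exactly where the configuration problem discussed above arises: a loose configuration floods the auditor with false positives, while an overly aggressive rule set may explain away genuine exceptions.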
The level of automation of the audit, and as discussed before, the availability and comfort
with analytical techniques, the competences of the auditor, and the circumstances and assertions
of the specific audit process will guide the locus of the application. As such, ABE is a more
advanced audit approach, reflecting the confluence of automation, advanced analytics, and
revised regulations. Issues that may emerge during this process could be as follows (Table Four):
How different are the objectives of Internal Audit and External Audit in the current context (Li et
al. 2016)? Isn’t there a substantive overlap between business monitoring and real time assurance?
Considering that there is substantive overlap in data analytic needs, are the traditional three
lines of defense (Freeman 2015; Chambers 2014) still relevant12? Traditional auditing has a
retrospective approach, as traditional technologies did not allow for other approaches - can the
current environment allow for a prospective look, and to what extent? What parts/procedures of
the audit are fully or partially automatable? Will the disruptive changes (Christensen 2013) be
formalized? In the same line, but extending expanded testing and reporting, should quantitative
guidelines be issued for ABE and its structures, and should within period results be disclosed as
part of the auditor’s report? The succinctness of the traditional report is not necessary any more,
12 There should be effective risk management functions within a company. These monitoring and assurance
functions have been modeled as the “Three Lines of Defense” by the IIA. This model serves as an example, where:
1) the first line of defense represents functions that own or manage the risk; 2) the second line of defense, where
there are functions that specialize in risk management and compliance; and 3) the third line of defense, where
there are functions that provide assurance.
and drill downs on the results of Critical Audit Matters (CAM) examination, their details, and their disposition could be provided.
In general, the aforementioned meetings between the AICPA’s ASB and the ASEC committee
have concluded that the standards do not forbid the usage of analytics, but it can be argued that
the standards, and the economics of external audit, make analytics more difficult or in some
instances impractical if not nearly impossible to use. For example, audits of financial and
insurance industry clients are quite complex and the engagement team may find it impractical
within the budgeted hours to conduct any additional analytical techniques beyond the acceptable
ratio analysis and sampling. The lack of a more detailed discussion of appropriate analytical
techniques within the standards, when placed in the context of a highly competitive business
environment, does not encourage the profession to explore new techniques even in the face of
big data and automation. The use of more automation and analytics in the engagement,
particularly in a big data environment, generates these additional issues (Table Five):
are still being enforced by the PCAOB. Consequently, the pricing of the audit, in a
competitive environment, leaves little space for additional analytics even if these give
stronger assurance of fair representation. Furthermore, what would be the cost versus
benefit trade-off with the usage of analytics? Or, would there be a point where the cost of
conducting a sample driven audit exceeds that of ABE audit? When would the additional
assurance derived from the analytic results justify the cost of their application? Even
further, if a certain analytics method is more powerful and uncovers issues that were not
previously detected, what would be the liability of the accounting firm, particularly if
these issues were also present in the prior years? (Krahel and Titera 2015, p. 418)
and reviewing them (Dohrer, McCullough, and Vasarhelyi 2015). Do any areas of the
modern audit exist where these small judgmental samples still make sense (Elder et al.
2013)? In juxtaposition to the current requirements, would the auditor then need to
justify the use of sampling in circumstances where 100% of the data would be available
for testing?
The audit research literature itself has been scant regarding auditors’ sampling
decisions and statistical sampling, as well as how to effectively extract meaningful results from
sampling (Elder et al. 2013, p. 103). Auditing standards (PCAOB 2010, AS No. #2315)
define sampling as “the application of an audit procedure to less than 100% of the items
in an account balance or class of transactions for the purpose of evaluating some
characteristic of the balance or class.” The auditor may choose to select all items for
testing if the level of sample risk from possible erroneous decisions is too high (AS No.
#2315.07).
There is little guidance as to when 100 percent testing would be more appropriate than
selecting specific items. In the standards about Audit Evidence (PCAOB 2010, AS No.
#1105.22-.29), sampling is not recommended when the data population is small and/or
not homogeneous, when there appears to be significant risk, when there are key items that
should be examined, when threshold tests should be applied, nor is it suggested when
audit procedures can be automated effectively and applied to the whole population. In the
standards regarding sampling (PCAOB 2010, AS No. #2315.07), the auditor should
weigh the cost and time to examine all of the data versus the perceived degree of
uncertainty from sampling and non-sampling risks, and judge accordingly. Consequently,
the practice of sampling has become embedded in basic public auditing practice. PCAOB
examinations have been very strict, favoring sampling over analytical methods.
• Furthermore, Elder et al. (2013) were unaware of any literature that addresses the
auditor’s decision to use audit sampling of any type (Elder et al. 2013, p. 111) and
suggested that future research should address the issues of when sampling would be
appropriate and when other types of tests would negate the need for sampling. In
response, Yoon (2016) discussed how substantive analytical procedures (SAPs) applied
to 100 percent of the data (with the use of computer assisted auditing techniques) could
potentially provide more efficient and effective audit evidence than sampling,
particularly in a big data environment. Perhaps for audit engagements where the client is
collecting or analyzing all of the transactions and the auditor is using automated audit
software, the standards could more clearly establish that 100 percent tests using
such software provide sufficient audit evidence.
For example, three way matches used to be performed manually and reviewed
manually. Now advanced accounting systems and ERPs perform these automatically. Given such
automation, how do the audit standards take this into consideration? Is there a difference
between automation and analytic methods? (Dohrer, McCullough, and Vasarhelyi 2015)
If such automation is viewed as a preventive internal control, then how does it change the
balance between control testing and substantive testing in auditing the modern, highly
automated enterprise environments? Furthermore, the situation will change if fraud is
suspected. Simply automating a process does not necessarily mean that the transactions
have been correctly processed and that internal controls are operating effectively. The
auditor may still need to test the automated system for its reliability by using test data.
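The automated three-way match discussed above can be sketched as follows; the field names (`po_number`, `qty_ordered`, etc.) are our assumptions for illustration, not an actual ERP schema. An auditor's test data for checking such a control would be records with known, seeded discrepancies:

```python
def three_way_match(po, receipt, invoice, qty_tol=0, price_tol=0.0):
    """Match a purchase order, receiving report, and vendor invoice.
    Returns a list of discrepancy descriptions (empty list = clean match)."""
    issues = []
    if not (po["po_number"] == receipt["po_number"] == invoice["po_number"]):
        issues.append("PO numbers do not agree")
    if abs(po["qty_ordered"] - receipt["qty_received"]) > qty_tol:
        issues.append("received quantity differs from ordered quantity")
    if abs(receipt["qty_received"] - invoice["qty_billed"]) > qty_tol:
        issues.append("billed quantity differs from received quantity")
    if abs(po["unit_price"] - invoice["unit_price"]) > price_tol:
        issues.append("invoice unit price differs from PO price")
    return issues
```

If a seeded price discrepancy passes through the client's automated match without being reported, the control is not operating effectively, regardless of how much of the process is automated.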
depend on some form of “audit data standard” (Zhang et al. 2012)13. These apps will run
frequently or constantly (Vasarhelyi and Hoitash 2005). This form of evidence may use
external and internal data (Brown-Liburd and Vasarhelyi, 2015) potentially from external
sources like social media, thus providing valuable tertiary audit evidence that may be
used to complement / replace current tests. Would these need new guidance? Are the
current guidelines for traditional audit evidence the same for external or internal big data,
particularly social media? What qualities should these data possess in order to provide
reliable audit evidence? Expectation models could be significantly improved if they
incorporate contemporaneous peer company data, which are valuable sources of
information for obtaining an understanding of the relevant industry and the
client’s position, as outlined in the standards for risk assessment and review (PCAOB
2010, AS No. #2110, AS No. #2810). Large public accounting firms typically audit
multiple peers in the same industry, and they could create large internal data warehouses
to share such data among the engagement teams during the audit. The current strict
interpretation of audit client confidentiality rules causes the firms to err on the side of
caution and disallow any sharing of data even though such data would never leave the
confines of the firms. New guidance interpreting client data confidentiality as being
safeguarded within a firm (and not within an engagement team) and specifically allowing
audit client data sharing among different engagement teams would greatly enhance the
performance of the audit.
13 The AICPA has published online a series of voluntary suggested audit data standards:
http://www.aicpa.org/InterestAreas/FRC/AssuranceAdvisoryServices/Pages/AuditDataStandardWorkingGroup.aspx
PCAOB Release No. 2016-003 proposes, concerning an unqualified opinion, that the audit report
disclose “Critical Audit Matters” (if any) in areas such as estimates, audit judgments, areas of
special risk, unusual transactions, and other significant changes in the financial statements. This
proposal14 poses a series of interesting questions worthwhile of research (Table 6): Is the level of
disclosure falling back into the comfort zone of the traditional auditor? Do Critical
Audit Matters (CAMs) provide disclosures that are more disaggregate, or more informative than
the current report? Should there be schemata to describe estimates,
audit judgments, areas of special risk, unusual transactions, or other significant changes in the
financial statements? Should these schemata be defined by the standard setters? On a longer
14 See also Lynne Turner’s comments (https://pcaobus.org//Rulemaking/Docket034/ps_Turner.pdf).
15 PCAOB Release No. 2013-005, August 13, 2013, Docket Matter No. 034, The Auditor’s Report on an Audit of
Financial Statements When the Auditor Expresses an Unqualified Opinion. This release discusses the auditor’s
responsibilities regarding certain other information in certain documents containing audited financial statements
and the related auditor’s reports, and related amendments to the PCAOB standards.
range, if the auditor is using or relying on ABE, should there be a real-time seal or similar device
that would allow investors to know on an immediate basis that auditors are monitoring systems16?
As mentioned above, the application of analytics in the external audit is attracting substantial
attention from practice and academia. EY17 and the AAA18 among several others have brought
together these two groups for constructive dialogues. Auditor education and familiarity with
analytics has been positioned by the standards as a limiting factor regarding which techniques to
apply in the engagement (PCAOB 2010, AS No. #2305). Papers such as Tschakert, Kokina,
Kozlowski, and Vasarhelyi (2016) and Appelbaum, Schowalter, Sun, and Vasarhelyi (2015)
have discussed the issues facing audit education. In general, some conclusions could be drawn:
• Accounting faculties tend not to be prepared to teach analytics.
analytics (however, the feeling is not pervasive – there are some anecdotal reports
to the contrary).
• The accounting curriculum is too full to add more IT, statistics, and modeling.
• As the CPA exam does not include these topics, there is little motivation by
faculty or students to cover them.
• Firms will tend to / or already have hired specialist groups from non-accounting
domains; these specialists are external to the audit team and brought in if the
engagement setting requires.
• Practitioners are also not prepared, and their internal audit practices have not
adopted analytics pervasively.
• Firms have been developing software to improve their processes but feel curtailed
by the standards.
16 This type of continuous assurance would work better with some form of more frequent/continuous reporting.
17 EYARC 2015, June 17/18 2015, Dallas, Texas.
18 AAA, Accounting is Big Data, September 3/4 2015, New York, New York.
These factors lead to a series of educational research questions and potential projects that are
paradigm changing (Table Seven): If the curriculum is too full, if memorization in the age of
Google is a different consideration, and if the domain of coverage is too large, then what
should the accounting curriculum contain?
Should the CPA profession expand competencies or progressively rely more and more on
specialists from other domains, potentially using other (non-CPA) firms to provide these
competencies? Should the set of CPE requirements of the profession be reformulated in terms of
a life-long-learning approach where new required skills are defined and progressively required in
the accountant's learning/competency profile? Who should manage this learning profile, and who
should set the requirements? Should there be a much wider set of accounting specializations
with different sets of accountant skills? And should some of these be acquired through on-the-job activities and related
experience?
“It has also been shown that many internal audit procedures can be automated, thus saving costs,
allowing for more frequent audits and freeing up the audit staff for tasks that require human
judgment.” (AICPA, 2015)
It has been proposed in other technology adoption settings that such automation changes are
best considered as evolutionary instead of revolutionary (Kuhn and Sutton 2010). The topics and
suggestions mentioned in this paper may seem extensive in scope and massive in undertaking.
These issues could serve as either motivators or impediments to the use of big data and audit data analytics (BD/ADA).
Ideally, it would seem that the goal for BD/ADA adoption by the profession would be to save
costs and attain greater efficiencies and effectiveness in the audit process. However, it is
conceivable that impediments exist that would dampen enthusiasm for BD/ADA adoption, and
these conflicts may be similar to those of other technology initiatives. Here are just a few of the
issues that are proposed as being relevant to BD/ADA adoption (Table Eight):
(Insert Table Eight here)
The literature regarding technology adoption is extensive in the audit, accounting, and AIS
disciplines. This paper does not attempt to synthesize this literature in support of this discussion;
instead, a few select papers are highlighted and a brief outline for BD/ADA adoption is
suggested for future research. For instance, the Information Fusion process that Perols and
Murthy (2012) propose could be applicable here in the context of BD/ADA adoption. Kuhn and
Sutton (2010) present research challenges that could correspond with BD/ADA adoption.
It has been suggested (Alles et al. 2008; Geerts et al. 2013) that the transformation of manual
processes to automated ones is best undertaken incrementally.
Dzuranin and Malaescu (2016) provide a framework based on Design Science for such an
integration. Vasarhelyi (2013) proposes a four-step process based on the work of Parasuraman et
al. (2000). According to Parasuraman et al. (2000), human information processing and its evolution
from man to machine can be divided into four phases: 1) information acquisition; 2) information
analysis; 3) decision selection; and 4) action implementation. In the Alles et al. (2008) proposal,
each such successive step should be undertaken methodically once benefits from the previous
step are realized;
successful change is more likely to occur if the manual process is re-engineered first to support
the eventual automation. In the Alles et al. (2008) proposal, the first step of the process cycle is
the consideration of the drivers of change and endorsement by management; the second step in
the process is the development and the actual implementation of the components that would
enable this change; the third step consists of management, or baseline measurement and
evaluation of the solution. This process cycle is repeated for every level of automation
transformation in an incremental fashion. Such a process cycle approach could also apply as an
incremental use of analytics and big data by the public audit profession.
The initial drivers for the use of analytics and big data by external auditors are already in
place, with the increasing complexity of client transactions, analytics, and data sources and the
subsequent increase of audit risk to the engagement team if analytical procedures are manual and
overly simplistic (Alles 2015; Bedard et al. 2008). Firms are already embracing diverse
descriptive approaches (Dilla et al. 2010); it could be argued that some practitioners are about to
embark on the next phase, the adoption of more predictive analytics. Basically, firms are
discovering that manual and simplistic analytical procedures and data sources create an audit
which is more likely than not to be inefficient and ineffective in a big data context. Many firms
are investigating ways to integrate more advanced analytics in their engagements, but this
initiative is progressing cautiously (Alles 2015). It is suggested that many of the research issues
discussed here in this paper will need to be examined in the context of an incremental approach,
as illustrated in Figure Five. Figure Five illustrates how the process flow as depicted in Figure
Four could be integrated incrementally to incorporate advanced analytics and big data into
practice.
(Insert Figure Five about here)
This incremental approach may already be observed to some degree in the audit process –
while some manual procedures have been automated, other audit procedures have not. Many
audit tests may be conducted on 100% of the test population using Computer Assisted Auditing
Techniques (CAATs) software packages (Wang and Cuthbertson 2015). These CAATs can
perform analytics very efficiently and quickly and can interface and link easily to the client’s
system. Although not all CAATs software packages are equipped to handle big data, this
limitation will eventually be solved. CAATs are used by auditors on many engagements for GL
tests, three way matches, detail tests, and sampling, for example. However, these tests do not run
automatically but are manually selected by the engagement team. The auditor selects which
analytical procedures or tests to run and which attributes to examine in the tests of assertions for a
given account.
Modern audit engagements often involve examination of clients that are using big data and
analytics to remain competitive and relevant in today’s business environment. Client systems
now create and acquire big data and apply advanced analytics to generate intelligence for
decision making. However, the public accounting profession is still bound by regulations that
may have been applicable years ago but whose relevance should be re-examined today in this
modern business environment. There are numerous issues surrounding the standards, practice,
and theory of audit data analytics that have emerged from these rapidly evolving
corporate systems and which have not been addressed. This paper highlights six general areas of
such concerns and now provides a broad review and collection of additional critical ADA issues
that challenge the public auditing profession today.
Research Questions
Many of the issues and sections reiterated similar research questions. Additional research
questions that also seem important to answer for audit data analytics to succeed in gaining
widespread practical acceptance are now presented. Quantification of many audit processes
and judgements may also be called for with the heightened use of advanced analytics and big data.
1. How can analytics methods be used to create accurate expectation models for
advanced analytics? These more advanced approaches, combined with big data, may
improve the precision of audit expectations.
2. What properties make a particular ADA technique more or less appropriate for a
particular audit function? There is a wide range of techniques appropriate for each
audit phase, given the client particularities, environment, and industry. The assurance
function is broader than that of financial statement auditing. Since assurance services
should improve the quality of information for decision makers, the quality (relevance
and reliability) of data is still paramount. The assurance function may be reorganized
in a broader format than the engagement, but standards must continue to be issued.
5. How should audit standards and processes be modified to enable and encourage the
utilization of ADA? The standards should be modified to suggest techniques that are
acceptable for each phase of the audit, given certain engagement contexts. For
19 A “suspicion function” is a linear multivariate equation that gives weights to characteristics of variables and
analytical evidence to estimate its probability of being fallacious.
20 Bumgarner and Vasarhelyi (2015) break down audit as retroactive and predictive. A predictive audit may be
preventive (when a suspicion score is large, a transaction is blocked for review), or just predictive to set up a
standard of comparison.
example, perhaps sampling should be modified for client engagements where 100%
of the data is electronically collected and available to the auditor. In this context,
the standards regarding data as audit evidence should also be examined in the context
of electronic data and big data – external evidence may not be as reliable in this case.
6. What is the proper way of validating expectation models for ADA? Should this
validation be carried out for each audit client separately, or can it be extrapolated
from one client to all the other clients in the same industry? Validation of models may
be established over time by auditors for continuing clients and also for the auditors’
own industry expertise. As part of interim activities, updated information could be fed
into prescriptive analytical models that over time attain greater accuracy. The
standards could also feasibly provide guidance specific for certain industries.
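As a hedged illustration of the validation question, an expectation model for a continuing client could be fit on prior periods and then checked against held-out periods, flagging any that breach a tolerable difference. The single hypothetical driver, the field layout, and the tolerance here are our assumptions, not a method from the standards:

```python
def ols_fit(xs, ys):
    """Fit y = a + b*x by ordinary least squares (stdlib only)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

def validate_expectations(model, holdout, tolerable_diff):
    """Flag held-out (driver, actual) pairs whose actual value deviates
    from the model expectation by more than the tolerable difference.
    Returns (driver, actual, expected) triples for flagged periods."""
    a, b = model
    return [(x, y, a + b * x) for x, y in holdout
            if abs(y - (a + b * x)) > tolerable_diff]
```

Refitting the model each period on accumulated client history is one way such a model could "over time attain greater accuracy"; whether validation can be extrapolated across clients in the same industry remains the open research question posed above.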
7. What additional verification processes would be desirable with the extant analytic
technology? Verification processes and validation remain as open issues with ADA
integration in the engagement. Over time, with continuing audit clients, it is likely
that verification processes will become better established. How can the standards
encourage the use of substantive audit analytics? The concept of accuracy21 may be
formally and quantitatively defined with the use of ADA. Auditor judgement is still
required.
21 Acceptable relative error in engineering, materiality in accounting.
Evolution Towards Quantification of the Audit
Modern technologies have allowed assurance that can be continuous (Vasarhelyi and Halper 1991),
predictive (Kuenkaikaew and Vasarhelyi 2013), prescriptive (Holsapple et al. 2014), and even
facilitate automatic data correction (Kogan et al. 2014). These techniques are intrusive, create
transparency and perhaps some competitive impairment if all the details are disclosed, and
generate substantive concerns for the auditee. The public good tradeoff of increased information
disclosure versus the economic interest of agents is a complex issue and its equilibrium may take
time to emerge.
The increased amount of data available and the progressive ability to discover variances,
understand aggregate content, and predict trends has clearly created an equilibrium imbalance
that is becoming larger and larger. Quantification can increase the value of information both
internally and externally, but it decreases information asymmetry, which is very threatening for
agents (managers) and principals. A common thread of research questions relative to
quantification was raised throughout this paper and is elaborated upon here:
• Do modern disclosure and statistical methodologies make it possible, in certain cases, to
automate pre-set rules in order to perform procedures, derive results, and integrate these
in a larger judgment? Such an approach is necessary for “close to the event continuous
auditing” (Vasarhelyi and Halper, 1991) that is progressively being made necessary by the real-time nature of modern business.
Traditional audit is backward looking due to the limitations of manual review and storage
procedures. These modern analytic methods allow for the detection and prevention of
propagation of potential faults along downstream systems (Kogan et al., 2014). These
allow possibilities of automatic data correction that do not exist in extant systems. These emerging procedures will be difficult
to conceptualize from the point of view of “lines of defense” (IIA, 201323; Freeman 2015; Chambers 2014):
o If a midstream process detects faults and activates an error correction process that
corrects them, is this an operational process or an audit process? Does this distinction make sense in the modern world of automation?
o If a continuous audit layer detects “serious faults” (Vasarhelyi and Halper, 1991)
and stops a system, is this layer a part of operations, control, or audit?
• Can audit findings and judgments be disclosed in a more disaggregate manner with the use
of drill-down technologies where the opinion would be rendered and broken down into
its components22? This possibility is raised by the new PCAOB proposal, which does not directly address the type of precision that
disaggregation would allow24. Turner (2014, p. 5), in the aforementioned comments to the
22 The AICPA has created the Audit Data Standard (Zhang, Yang, and Appelbaum 2015) to guide in the formalization
of data to be received in the audit, its classification (into cycles), and its measurement.
23 “The three lines of defense in effective risk management and control”, White paper, The Institute of Internal
Auditors, January 2013.
24 More detailed and quantitative audit reports are being progressively disclosed. For example, in the Netherlands
(annual report of Aegon NV, 2015, p. 309) there is disclosure of the threshold of materiality of EUR 65 million and the
statement that “We agree with the audit committee that we would report to the misstatements identified during
the audit about EUR 4 million (2014: EUR 4 million) as well as misstatements below that amount that, in our view,
warranted reporting for qualitative reasons.” Quantitative assessments are also made of coverage and other
variables, as well as a much more detailed discussion of governance controls and procedures.
PCAOB states “it is clear that some oppose any disclosure of information not previously
disclosed by management. But such an approach defies common sense and is intended to
obfuscate and avoid disclosing the information investors want. I urge the Board to reject
such an approach as it will result in disclosures that are not worth the time or cost…
Disclosure that does not create illusory comfort for the readers may be the solution for this dilemma; research is needed here.
• Should quantitative guidelines be issued for ABE and its structures, and should within
period results be disclosed as part of the auditor’s report? The continuous
audit allows for continuous monitoring, and remarkable (not necessarily material) events could
be linked to smart contracts (Kosba et al. 2015) that automatically would execute a pre-
agreed (e.g. covenant condition) action. A continuous assurance environment requires
that events of substance, that can be predicted, be diagnosed and some action executed.
As the combinatorics of these events is almost infinite, progressively more and more
complex audit (and operational) judgments will be necessary, occupying auditors in new ways.
Conclusion
This paper contributes to the literature by discussing the concerns facing the external audit
profession as business moves towards big data and advanced analytics for many aspects of
operations and decision making. These suggested research issues, along with various proposals
towards greater use of big data and analytics will hopefully encourage and inspire ideas and
research that is useful for professionals, regulators, and researchers. Although many concerns are
reviewed, many are also not mentioned. It is expected that as research and findings evolve in this
domain, some concerns will become less important while others many unexpectedly gain
urgency. However, the emerging overall importance that big data and advanced analytics present
Most of the research discussion is focused on audit standards setting, audit practice issues, and the development of better audit data analytics. While these areas all lend themselves to empirical research in auditing, this paper has been oriented more towards theory and practice. Theoretical proposals and questions as to how analytics and big data will affect the external audit have been discussed. Future empirical research is required to validate these proposals.
In part, this paper is motivated by a vision as to how audit data analytics could enhance or replace certain auditor-conducted procedures. But perhaps there are other views that could be regarded as more research friendly and more realistic with respect to real-time use of audit data analytics. Two specific areas seem to present easily integrated opportunities.
First, in the background discussion of analytical procedures and the standards, AS No. 2305.04 mentions how analytical procedures are used in the substantive testing phase to obtain evidence. The discussion focuses on how ADA might replace substantive testing; an alternative view is that perhaps ADAs are better used to focus auditors’ substantive testing.
Consider, for example, the Jans et al. (2014) paper and its application of process mining. This paper details the use of process mining on a 100% test of the transactions to find the anomalies where controls fail in the processing of 26,185 POs. A series of process mining tests (a type of ADA) narrows the set of anomalies down to the highest risk scenarios, which exemplify high rates of violations among individuals and small networks of people working together. Possibly this could be regarded as the perfect example of how ADAs can be used for more focused audit testing. This is but a small example of how archival researchers may be able to contribute to the research stream through analysis of big data sets with statistical procedures and/or machine learning techniques to improve the efficiency of targeting substantive audit tests to better identify high risk areas.
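A toy illustration of this idea, and not the actual Jans et al. (2014) procedure: a full-population control test over an invented purchase-order event log, with violations ranked so that substantive testing can be focused on the highest-risk individuals. The log, the control rule (approver must differ from creator), and all names are assumptions:

```python
# Process-mining-style filtering: test 100% of the population against a
# control pattern, then rank violators to target substantive testing.
from collections import Counter

event_log = [
    {"po": "PO1", "created_by": "ann", "approved_by": "bob"},
    {"po": "PO2", "created_by": "cal", "approved_by": "cal"},  # self-approval
    {"po": "PO3", "created_by": "cal", "approved_by": "cal"},  # self-approval
    {"po": "PO4", "created_by": "dee", "approved_by": "ann"},
]

# Full-population control test: flag segregation-of-duties violations.
violations = [e for e in event_log if e["created_by"] == e["approved_by"]]

# Rank individuals by violation count to focus the auditor's follow-up work.
by_person = Counter(e["created_by"] for e in violations)
high_risk = [person for person, _ in by_person.most_common()]

print(len(violations), high_risk)  # 2 violations, all traced to "cal"
```

The 100% test replaces nothing by itself; its output is a short, risk-ranked list that tells the auditor where substantive procedures are most likely to pay off.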
The second area presents another aspect of the general discussion on education issues. Possibly additional attention should be focused on what competencies auditors need in this new environment and how the auditor potentially can be a valuable partner in the use of ADA/BA. There is a rich body of literature on industry knowledge, auditors’ abilities to recognize patterns, and professional judgment. It seems that a major research thrust should perhaps be how this expertise and professional judgment can be leveraged to develop and use more effective ADA/BA strategies during an audit and to keep the auditor relevant in the tailoring of ADA processes to a given client’s business processes. Ultimately the research focus would be more on the development of audit experts that are both good auditors and good data scientists. Is this possible? Can a good audit-focused data scientist produce better results than standardized ADAs? These ideas may perhaps provide fruitful avenues for future research.
In conclusion, big data and business analytics are dramatically changing the business environment and the capabilities of business processes. Business functions are changing, business capabilities are being added, anachronistic business functions are being eliminated, and, most of all, processes are being substantially accelerated. The same should occur within the external audit or assurance function: its rules need to be changed, its steps evolved, automation integrated into its basic processes, and its timing should become almost instantaneous.
References
Alles, M. G. 2015. Drivers of the use and facilitators and obstacles of the evolution of Big Data by the
audit profession. Accounting Horizons 29 (2): 439-449.
Alles, M., G. Brennan, A. Kogan, and M.A. Vasarhelyi. 2006. Continuous monitoring of business process controls: A pilot implementation of a continuous auditing system at Siemens. International Journal of Accounting Information Systems 7 (2): 137-161.
Alles, M. G., A. Kogan, and M.A. Vasarhelyi. 2008. Audit Automation for Implementing Continuous
Auditing: Principles and Problems. Working paper, Rutgers University. Newark, N.J. USA.
American Institute of Certified Public Accountants (AICPA). 1972. Responsibilities and Functions of the
Independent Auditor. SAS No. 1, Section 110. New York, NY. Available at:
http://www.aicpa.org/Research/Standards/AuditAttest/DownloadableDocuments/AU-00110.pdf
American Institute of Certified Public Accountants (AICPA). 1997. Amendment to SAS no. 31, Evidential
Matter. SAS No. 80. New York, NY: AICPA.
American Institute of Certified Public Accountants (AICPA). 2012a. Analytical Procedures. AU-C Section 520, Source: SAS No. 122. New York, NY. Available at: http://www.aicpa.org/Research/Standards/AuditAttest/DownloadableDocuments/AU-C-00520.pdf
American Institute of Certified Public Accountants (AICPA), Audit Sampling Committee. 2012b. Audit Sampling: Audit Guide. New York, NY: AICPA.
American Institute of Certified Public Accountants (AICPA). 2015. Audit Analytics and Continuous Audit: Looking Toward the Future. New York, NY: AICPA.
Appelbaum, D., D.S. Showalter, T. Sun, and M.A. Vasarhelyi. 2015. Analytics Knowledge Required of a Modern CPA in this Real-Time Economy: A Normative Position. Presented at the Accounting Information Systems Educator Conference, June 25-28, 2015, Colorado Springs.
Appelbaum, D. 2016. Securing Big Data Provenance for Auditors: The Big Data Provenance Black Box as Reliable Evidence. Working paper, Rutgers Business School, Newark, NJ.
Big Data and Analytics in the Modern Audit Engagement: Research Needs
Atzori, L., A. Lera, and G. Morabito. 2010. The Internet of things: A survey. Computer Networks 54
(15): 2787–2805.
Bedard, J.C., D.R. Deis, M.B. Curtis, and J.G. Jenkins. 2008. Risk monitoring and control in audit firms: A research synthesis. Auditing: A Journal of Practice & Theory 27 (1): 187-218.
Brazel, J., K. Jones, and M. Zimbelman. 2009. Using Nonfinancial Measures to Assess Fraud Risk. Journal of Accounting Research (December): 1135-1166.
Brown-Liburd, H., and M.A. Vasarhelyi. 2015. Big Data and Audit Analytics. Journal of Emerging Technologies in Accounting, forthcoming.
Bumgarner, N., and M.A. Vasarhelyi. 2015. Auditing—A New View. In Audit Analytics and Continuous Audit: Looking Toward the Future, p. 3. New York, NY: AICPA.
Byrnes, P. 2014. Developing Automated Applications for Clustering and Outlier Detection: Data Mining Implications for Auditing Practice. PhD dissertation, Rutgers Business School, Continuous Audit and Reporting Lab, Newark, NJ.
Chambers, A. 2014. New guidance on internal audit: An analysis and appraisal of recent developments. Managerial Auditing Journal 29 (2): 196-218.
Chesley, G.R. 1975. Elicitation of subjective probabilities: A review. The Accounting Review: 325-337.
Chesley, G.R. 1977. Subjective probability elicitation: The effect of congruity of datum and response mode on performance. Journal of Accounting Research: 1-11.
Christensen, C. 2013. The Innovator's Dilemma: When New Technologies Cause Great Firms to Fail. Harvard Business Review Press.
Church, B.K., J.J. McMillan, and A. Schneider. 2001. Factors affecting internal auditors' consideration of fraudulent financial reporting during analytical procedures. Auditing: A Journal of Practice & Theory 20 (1): 65-80.
Cukier, K., and V. Mayer-Schoenberger. 2013. The Rise of Big Data: How It's Changing the Way We Think About the World. Foreign Affairs 92: 28.
Daroca, F.P., and W.W. Holder. 1985. The use of analytical procedures in review and audit engagements. Auditing: A Journal of Practice & Theory 4 (2): 80-92.
Davenport, T. H., and L. K. Johnson. 2008. Competing on Analytics: The New Science of Winning.
Harvard Business School Press.
Davenport, T. H., and J. Kim. 2013. Keeping Up with the Quants. Harvard Business Review Press, USA.
Delen, D., and H. Demirkan. 2013. Data, information and analytics as services. Decision Support Systems 55 (1): 359-363.
Dilla, W., D.J. Janvrin, and R. Raschke. 2010. Interactive data visualization: New directions for accounting information systems research. Journal of Information Systems 24 (2): 1-37.
Dohrer, R., M.A. Vasarhelyi, and P. McCollough. 2015. Audit Data Analytics. Presentation delivered to the IAASB, September 23.
Domingos, P. 2012. A few useful things to know about machine learning. Communications of the ACM 55 (10): 78-87.
Dutta, S.K., and R.P. Srivastava. 1993. Aggregation of evidence in auditing: A likelihood perspective. Auditing: A Journal of Practice & Theory 12 (Supplement): 137-160.
Dzuranin, A.C., and I. Malaescu. 2016. The current state and future direction of IT audit: Challenges and opportunities. Journal of Information Systems 30 (1): 7-20.
Elder, R. J., A.D. Akresh, S.M. Glover, J.L. Higgs, and J. Liljegren. 2013. Audit sampling research: A
synthesis and implications for future research. Auditing: A Journal of Practice & Theory, 32(sp1),
99-129.
Evans, J. R., and C. H. Lindner. 2012. Business Analytics: The Next Frontier for Decision Sciences.
Decision Line, 43 (2), pp. 4-6.
Freeman, S. 2015. Special report: Engaging lines of defense. Freeman, 31(4), 21.
Fukukawa, H. and T. J. Mock. 2011. Audit Risk Assessments Using Belief versus Probability. Auditing:
A Journal of Practice & Theory, Vol. 30, No. 1, February 2011: pp. 75–99.
Fukukawa, H, T. J. Mock, and R. P. Srivastava. 2014. Assessing the Risk of Fraud at Olympus and
Identifying an Effective Audit Plan. The Japanese Accounting Review, Vol. 4, pp. 1-27 (Invited).
Glover, S.M., D.F. Prawitt, and M.S. Drake. 2014. Between a rock and a hard place: A path forward for using substantive analytical procedures in auditing large P&L accounts: Commentary and analysis. Auditing: A Journal of Practice & Theory 34 (3): 161-179.
Gordon, J., and E.H. Shortliffe. 1985. A method for managing evidential reasoning in a hierarchical hypothesis space. Artificial Intelligence: 323-357.
Hardy, C.A., and G. Laslett. 2014. Continuous auditing and monitoring in practice: Lessons from Metcash's Business Assurance Group. Journal of Information Systems 29 (2): 183-194.
Hoitash, R., A. Kogan, and M.A. Vasarhelyi. 2006. Peer-based approach for analytical procedures. Auditing: A Journal of Practice & Theory 25 (2): 53-84.
Holsapple, C., A. Lee-Post, and R. Pakath. 2014. A unified foundation for business analytics. Decision Support Systems 64: 130-141.
Issa, H., H. Brown-Liburd, and A. Kogan. 2016. Identifying and Prioritizing Control Deviations Using a Model Derived from Experts' Knowledge. Working paper, Rutgers University, Newark, NJ.
Jans, M., M.G. Alles, and M.A. Vasarhelyi. 2014. A field study on the use of process mining of event logs as an analytical procedure in auditing. The Accounting Review 89 (5): 1751-1773.
Kobelius, J. 2010. The Forrester Wave: Predictive Analytics and Data Mining Solutions. Forrester
Research, Inc. USA.
Kogan, A., M.G. Alles, M.A. Vasarhelyi, and J. Wu. 2014. Design and evaluation of a continuous data level auditing system. Auditing: A Journal of Practice & Theory 33 (4): 221-245.
Kohavi, R., L. Mason, R. Parekh, and Z. Zheng. 2004. Lessons and challenges from mining e-commerce data. Machine Learning 57 (1-2): 83-113.
Kosba, A., A. Miller, E. Shi, Z. Wen, and C. Papamanthou. 2015. Hawk: The blockchain model of cryptography and privacy-preserving smart contracts. Cryptology ePrint Archive, Report 2015/675. Available at: http://eprint.iacr.org.
Krahel, J.P., and W.R. Titera. 2015. Consequences of Big Data and formalization on accounting and auditing standards. Accounting Horizons 29 (2): 409-422.
Kuenkaikaew, S., and M.A. Vasarhelyi. 2013. The predictive audit framework. International Journal of Digital Accounting Research 13: 37-71.
Kuhn, J.R., and S.G. Sutton. 2010. Continuous auditing in ERP system environments: The current state and future directions. Journal of Information Systems 24 (1): 91-112.
Lee, M., M. Cho, J. Gim, D.H. Jeong, and H. Jung. 2014. Prescriptive analytics system for scholar research performance enhancement. In HCI International 2014 - Posters' Extended Abstracts, 186-190. Springer International Publishing.
Li, H., J. Dai, T. Gershberg, and M.A. Vasarhelyi. 2015. Understanding Usage and Value of Audit Analytics in the Internal Audit: An Organizational Approach. Working paper, Continuous Auditing and Reporting Laboratory, Rutgers Business School.
Liu, Q. 2014. The Application of Exploratory Data Analysis in Auditing. PhD dissertation, Rutgers Business School, Continuous Audit and Reporting Lab, Newark, NJ.
Montgomery, R.H. 1919. Auditing Theory and Practice. 2nd edition. New York, NY: The Ronald Press.
Nearon, B.H. 2005. Foundations in auditing and digital evidence. The CPA Journal 75 (1): 32.
Nelson, K.M., A. Kogan, R.P. Srivastava, M.A. Vasarhelyi, and H. Lu. 2000. Virtual auditing agents: The EDGAR Agent challenge. Decision Support Systems 28 (3): 241-253.
Pearl, J. 1986. Evidential reasoning using stochastic simulation of causal models. Artificial Intelligence: 245-257.
Perols, J., and B. Lougee. 2011. The relation between earnings management and financial statement fraud. Advances in Accounting, incorporating Advances in International Accounting 27: 19-53.
Perols, J.L., and U.S. Murthy. 2012. Information fusion in continuous assurance. Journal of Information Systems 26 (2): 35-52.
Public Company Accounting Oversight Board (PCAOB). 2010. Audit Evidence. Auditing Standard (AS)
No. 1105. Washington, D.C: PCAOB.
Public Company Accounting Oversight Board (PCAOB). 2010. Identifying and Assessing Risks of
Material Misstatement. Auditing Standard (AS) No. 2110, Washington D.C.: PCAOB.
Public Company Accounting Oversight Board (PCAOB). 2010. Substantive Analytical Procedures.
Auditing Standard (AS) No. 2305. Washington D.C.: PCAOB.
Public Company Accounting Oversight Board (PCAOB). 2010. Audit Sampling. Auditing Standard (AS)
No. 2315. Washington D.C.: PCAOB.
Public Company Accounting Oversight Board (PCAOB). 2010. Evaluating Audit Results. Auditing
Standard (AS) No. 2810. Washington, D.C.: PCAOB.
Schneider, G., J. Dai, D. Janvrin, K. Ajayi, and R.L. Raschke. 2015. Infer, predict, and assure: Accounting opportunities in data analytics. Accounting Horizons 29 (3): 719-742.
Shafer, G., and R.P. Srivastava. 1990. The Bayesian and belief-function formalisms: A general perspective for auditing. Auditing: A Journal of Practice & Theory (Supplement): 110-148.
Song, S.K., D.J. Kim, M. Hwang, J. Kim, D.H. Jeong, S. Lee, and W. Sung. 2013. Prescriptive analytics system for improving research power. In 2013 IEEE 16th International Conference on Computational Science and Engineering (CSE), 1144-1145. IEEE.
Song, S.K., D.H. Jeong, J. Kim, M. Hwang, J. Gim, and H. Jung. 2014. Research advising system based on prescriptive analytics. In Future Information Technology, 569-574. Springer Berlin Heidelberg.
Srivastava, R. P. 2011. An Introduction to Evidential Reasoning for Decision Making under Uncertainty:
Bayesian and Belief Functions Perspectives. International Journal of Accounting Information
Systems, Vol. 12: 126–135.
Srivastava, R. P., and G. Shafer. 1992. Belief-Function Formulas for Audit Risk. The Accounting Review,
April: 249-283.
Srivastava, R. P., T. J. Mock, and L. Gao. 2011. The Dempster-Shafer Theory of Belief Functions for
Managing Uncertainties: An Introduction and Fraud Risk Assessment Illustration. Australian
Accounting Review, Volume 21, Issue 3, pp. 282–291.
Srivastava, R. P., T. J. Mock, and J. Turner. 2009. Bayesian Fraud Risk Formula for Financial Statement
Audits. ABACUS, Vol. 45, No. 1, pp. 66-87
Srivastava, R. P., T. J. Mock, K. Pincus, and A. Wright. 2012. Causal inference in auditing: A
framework. Auditing: A Journal of Practice and Theory, Vol. 31, No. 3, pp. 177-201.
Srivastava, R.P., A. Wright, and T. Mock. 2002. Multiple Hypothesis Evaluation in Auditing. Journal of
Accounting and Finance (Australian), Volume 42, No. 3, November: 251-277.
Stewart, T. 2015. Data analytics for financial-statement audits. Chapter 5 in Audit Analytics and Continuous Audit: Looking Toward the Future. New York, NY: AICPA.
Stringer, K.W., and T.R. Stewart. 1986. Statistical Techniques for Analytical Review in Auditing. Ronald Press.
Sun, L., R. P. Srivastava, and T. Mock. 2006. An Information Systems Security Risk Assessment Model
under Dempster-Shafer Theory of Belief Functions. Journal of Management Information Systems,
Vol. 22, No. 4: 109-142.
Tabor, R.H., and J.T. Willis. 1985. Empirical evidence on the changing role of analytical review procedures. Auditing: A Journal of Practice & Theory 4 (2): 93-109.
Tschakert, N., J. Kokina, S. Kozlowski, and M.A. Vasarhelyi. 2016. CPAs and Data Analytics. The Journal of Accountancy, forthcoming.
Tukey, J.W. 1980. We need both exploratory and confirmatory. The American Statistician 34: 23-25.
Vasarhelyi, M.A., and F.B. Halper. 1991. The continuous audit of online systems. Auditing: A Journal of Practice & Theory 10 (1): 110-125.
Vasarhelyi, M.A. 2013. Formalization of standards, automation, robots, and IT governance. Journal of Information Systems 27 (1): 1-11.
Vasarhelyi, M.A., and S. Romero. 2014. Technology in audit engagements: A case study. Managerial Auditing Journal 29 (4): 350-365.
Vasarhelyi, M.A. 2015. The new scenario of business processes and applications on the digital world. Working paper, CARLab, Rutgers Business School.
Vasarhelyi, M.A., A. Kogan, and B.M. Tuttle. 2015. Big data in accounting: An overview. Accounting
Horizons, 29 (2): 381-396.
Wang, T., and R. Cuthbertson. 2015. Eight Issues on audit data analytics we would like researched.
Journal of Information Systems 29 (1): 155-162.
Warren Jr., J.D., K.C. Moffitt, and P. Byrnes. 2015. How Big Data will change accounting. Accounting Horizons 29 (2): 397-407.
Yoon, K. 2016. Big Data as Audit Evidence: Utilizing Weather Indicators. Chapter 3 of the dissertation Three Essays on Unorthodox Audit Evidence, Rutgers University, Newark, NJ.
Yue, D., X. Wu, Y. Wang, Y. Li, and C.H. Chu. 2007. A review of data mining-based financial fraud detection research. In 2007 International Conference on Wireless Communications, Networking and Mobile Computing (WiCom), 5519-5522. IEEE.
Zhang, L., A.R. Pawlicki, D. McQuilken, and W.R. Titera. 2012. The AICPA Assurance Services Executive Committee Emerging Assurance Technologies Task Force: The Audit Data Standards (ADS) initiative. Journal of Information Systems 26 (1): 199-205.
Zhang, J., X. Yang, and D. Appelbaum. 2015. Toward effective Big Data analysis in continuous auditing. Accounting Horizons 29 (2): 469-476.
APPENDIX A:
.27 The auditor should treat those assessed risks of material misstatement
due to fraud as significant risks and, accordingly, to the extent not already
done so, the auditor should obtain an understanding of the entity's related
controls, including control activities, relevant to such risks, including the
evaluation of whether such controls have been suitably designed and
implemented to mitigate such fraud risks. (Ref: par.A36–.A37)
.32 Even if specific risks of material misstatement due to fraud are not
identified by the auditor, a possibility exists that management override of
controls could occur. Accordingly, the auditor should address the risk of
management override of controls apart from any conclusions regarding the
existence of more specifically identifiable risks by designing and
performing audit procedures to, etc.
a. test the appropriateness of journal entries recorded in the general
ledger and other adjustments made in the preparation of the financial
statements, including entries posted directly to financial statement
drafts. In designing and performing audit procedures for such tests, the
auditor should (Ref: par. .A47–.A50 and .A55)
i. obtain an understanding of the entity's financial reporting
process and controls over journal entries and other adjustments,
and the suitability of design and implementation of such
controls;
ii. make inquiries of individuals involved in the financial reporting
process about inappropriate or unusual activity relating to the
processing of journal entries and other adjustments; etc.
c. evaluate, for significant transactions that are outside the normal
course of business for the entity or that otherwise appear to be unusual
given the auditor's understanding of the entity and its environment and
other information obtained during the audit, whether the business
rationale (or the lack thereof) of the transactions suggests that they may
have been entered into to engage in fraudulent financial reporting or to
conceal misappropriation of assets. (Ref: par. .A54)
.A49 When identifying and selecting journal entries and other adjustments
for testing and determining the appropriate method of examining the
underlying support for the items selected, the following matters may be
relevant:
• The characteristics of fraudulent journal entries or other adjustments.
Inappropriate journal entries or other adjustments often have unique
identifying characteristics. Such characteristics may include entries (a)
made to unrelated, unusual, or seldom-used accounts; (b) made by
individuals who typically do not make journal entries; (c) recorded at
the end of the period or as post closing entries that have little or no
explanation or description; (d) made either before or during the
preparation of the financial statements that do not have account
numbers; or (e) containing round numbers or consistent ending
numbers.
• The nature and complexity of the accounts. Inappropriate journal
entries or adjustments may be applied to accounts that (a) contain
transactions that are complex or unusual in nature, (b) contain
significant estimates and period-end adjustments, (c) have been prone
to misstatements in the past, (d) have not been reconciled on a timely
basis or contain unreconciled differences, (e) contain intercompany
transactions, or (f) are otherwise associated with an identified risk of
material misstatement due to fraud. In audits of entities that have
several locations or components, consideration is given to the need to
select journal entries from multiple locations.
Table One:

Issue: How can big data evidence be aggregated with other types of audit evidence in a methodologically sound way?
Recommendation: This research question can be integrated with that of the data measurement system.

Issue: How can quantitative measures be used to provide support for the auditor's judgment about the sufficiency of audit evidence?
Recommendation: This research question can be integrated with that of the data measurement system.

Issue: Alterability: How can the auditor be assured that the data has not been altered?
Recommendation: Research examining various tests for the assertion of accuracy in a big data context should be conducted.

Issue: Credibility: How can the auditor be assured of the controls surrounding the generation of big data external to the client?
Recommendation: Research examining/suggesting certain verifications of controls should be undertaken.

Issue: Completeness: How can the auditor verify that the big data is complete?
Recommendation: Research should be undertaken that can provide suggestions as to the verification of big data for the assertion of completeness.

Issue: Approvals: Should big data provide evidence of approvals/controls validations? Is this viable?
Recommendation: Studies of controls measurements of big data at all levels of generation and extraction should be conducted.
Figure One:
O
LINKING
G ANALYTIICAL PROC
CEDURES TO TRADITIIONAL FILE
E INTERRO
OGATION
preprint
accepted
manuscript
Table Two:

Issue: … more quantified and probabilistic manner?
Recommendation: Recommendations provided later in this paper.

Issue: Can the above be stated in terms of rules implementable in automated audit systems to continuously monitor and drive audit by exception (ABE)?
Recommendation: A framework for an automated ABE system should be proposed which takes advantage of the big data processing and business analytics capacities of modern enterprise systems.

Table Two: Summary of the Issues regarding New Analytics in the Audit and Recommendations for Future Research
Table Three:

Issue: … and data structures for defined audit tasks)?
Recommendation: … with academics and practitioners.

Issue: How would these approaches be quantified?
Recommendation: A quantification framework could be proposed and demonstrated.

Issue: How would these approaches be tested in the field? Sand box approaches accompanied with successive levels of adoption? Would these be provided a safe harbor?
Recommendation: This could be part of the AICPA initiative with firm support and academic input.

Issue: Again, how would this affect the audit opinions? Could these modern analytical methods facilitate more transparent and quantitative disclosure?
Recommendation: A framework or guidance for a more detailed and quantitative opinion disclosure should be developed and proposed.

Table Three: Summary of issues regarding which methods are most promising
Figure Two: THE CURRENT TYPICAL AUDIT PLAN
Figure Three: AUDIT BY EXCEPTION
Table Four:

Issue: … auditor judgment eventually be replaced with prescriptive analytical algorithms?
Recommendation: … automation, factoring such variables as judgement and interim testing.

Issue: Would leading audit firms allow such disruptive changes in engagement practice, absent regulation changes?
Recommendation: Would these firms be willing to be key innovators in the assurance side? (Perhaps if they were to be allowed a sandbox or safe harbor?)

Issue: Can the key contingencies in the audit be formalized?
Recommendation: These should be examined and articulated with frameworks/guidelines embedded in an expert system.

Issue: If the annual audit opinion can become more informative, as per recent CAM reviews, why stop there? Why not issue CAM level quarterly reports and reports on demand?
Recommendation: The recommendations regarding this issue are discussed later in this paper. CAM reviews could serve as the foundation of a more quantitative opinion report. Other possibilities evolve for an immutable real-time seal of the data and its assurance.

Table Four: Issues regarding where in the audit these methods would be applicable
Table Five:

Issue: … particularly if these issues have been on-going?
Recommendation: … "safe harbor" questions.

Issue: If the auditor has access and ability to test 100% of the dataset, would there still be justification for the use of sampling?
Recommendation: This is an issue that research should address, allowing for time, accuracy, and cost calculations for sampling versus 100% tests.

Issue: Is there a way to quantify the evaluation of the cost and time to run 100% tests versus the perceived liability of sampling risk and judge accordingly?
Recommendation: This is an issue that the regulators should address as part of the preceding question.

Issue: Are 100% tests a new type of audit evidence or just automation?
Recommendation: This question could be examined along with other issues relevant to big data.

Issue: If these tests are considered automation, how do the standards take this into consideration? Should the current solution of greater reliance on internal controls be quantified?
Recommendation: This is an issue that the regulators should address, with input from the firms and researchers. The controls testing and verification process as it relates to an IT audit and the reliability of information generated within a system may need clarification/quantification.

Issue: Is there a difference between automation and analytic methods? Isn't automation basically the automated application of analytics?
Recommendation: This is an issue to be considered in future research efforts by academics, as part of a scoring framework for audit evidence.

Issue: If such an automation is viewed as a preventative internal control, then how does …
Recommendation: This is an issue that the regulators should address, with input from the firms and …
Table Six:
Table Seven: AUDITOR COMPETENCIES

Table Seven: What are the competencies needed by auditors in this environment?
Table Eight:

Issue: … BD/ADA vs. not using and by what metrics?
Recommendation: … evaluation results may differ between stakeholders. Process of measurement metrics and expectations should be developed.

Issue: How would BD/ADA adoption take place at the firm level and regulatory level?
Recommendation: This question ties in with the process development (third) question.

Issue: Would audit procedures need to be re-aligned to fit this new engagement environment?
Recommendation: Should current audit procedures and regulations be changed prior to use of BD/ADA?

Issue: How would auditors best prepare for these tasks that require more judgment and less routine work?
Recommendation: How would firms and regulators go about best preparing practitioners to transition to more judgement based and analytical approaches?

Table Eight: Issues that might impact BD/ADA adoption
Figure Four: A six-stage cycle: 1. Drivers, 2. Management, 3. Development, 4. Implementation, 5. Measurement, 6. Evaluation.
Figure Five: POSSIBLE CYCLES OF ADOPTION

Each cycle (drivers, management, development, implementation, measurement, evaluation) is applied in turn to the use of more descriptive analytics, then to predictive analytics with more data, then to prescriptive analytics with big data.

Figure Five: Three possible cycles of adoption for the use of more advanced analytics and big data by the public audit profession