16

AIS auditing

Audit tools for a continuous auditing approach

Maria Céu Ribeiro

Introduction

This chapter addresses a topic which has profoundly affected auditing1 research in recent years: the links between information and communication technology, particularly Accounting Information Systems (AIS), and auditing. It covers the advent and subsequent development of continuous auditing (CA), a new approach to monitoring and auditing information, following the transformative impact of technological advances on business practices. The spread of routines and interfaces led to new business models for enterprise architectures and, consequently, new AIS. Today’s process of recording and storing business transactions (data and process workflows) in those integrated information systems handles millions of transactions in a real-time approach to conducting business. Within the accounting domain, auditing has also been affected by the computerization of the financial area of business processes, from accounting recording and ledger posting to system reports (Vasarhelyi et al., 2010). Scanners, card readers, databases and networks have made much of today’s transaction processing an electronic activity with less paper documentation (Brown et al., 2007), where manual examinations may not be good enough for assurance testing.

The contrast between contemporary AIS and traditional auditing tools is notable. While business systems are leveraged for cross-application integration and intertwining sequential business processes (e.g., internet-based electronic data interchange (EDI), manufacturing, inventories, sales), allowing continuous information collection, traditional auditing tools are limited to data extraction, cumbersome spreadsheets, manual manipulation and limited automation (Chan and Vasarhelyi, 2011). Given the resulting effects on auditing, both researchers and practitioners are progressively giving more attention to the demands and opportunities for audit tasks to be performed automatically, continuously and even in real time (Chiu et al., 2014), entailing the concept of CA and a new auditor role. While the traditional auditor works to extract sample data from information systems, present in the auditee’s facility and using spreadsheets and basic sampling and analytical techniques, the future auditor will, in contrast, periodically (daily, monthly, etc.) look remotely at a dashboard of the automated audit system to check whether any audit status indicators have been flagged for further investigation or whether statistical reports produced by the system indicate any unusual trend (Byrnes et al., 2014).

This chapter is organized as follows. The first section provides a review of traditional auditing issues, minor improvements and continuing limitations in the context of the new real-time enterprise systems. Section two describes the framework of CA and section three explores CA’s state of the art, followed by section four, which analyzes the impact of the recent phenomenon of big data on CA. Section five focuses on the development of continuous monitoring activities, one of the components of CA, while the last section provides concluding remarks and highlights still unanswered questions.

Traditional auditing concerns in today’s accounting systems

The traditional auditor focuses on time-intensive manual procedures, auditing around the computer (Rezaee et al., 2002; Chan and Vasarhelyi, 2011). All the work is performed several months after the occurrence of relevant events, and anomalies detected are investigated only at the end of the audit period. Auditors dealt with the introduction of technology and current complex AIS by tailoring some computer programs to perform traditional audit procedures and by developing generalized audit software to provide information on data files. However, globally and as experienced by the author during her career of more than 20 years at a Big 4 firm, auditors still depend on traditional tools to support the audit process. Those minor changes and improvements in audit tools, still within the traditional approach described above, are briefly analyzed below.

Information technology and traditional auditing

With changes to business systems’ architecture, such as internet/cloud-based applications and enterprise resource planning (ERP) systems, and more automated controls, the IT audit function gained a more critical role. Remote audit procedures were an opportunity to innovate the audit process by shifting the audit team from being entirely on-site to becoming virtual teams, increasingly coordinating auditing activities among auditors physically present at the auditee site and others in other locations (Teeter et al., 2010). During internal control evaluations and transactional testing, auditors interview process owners via videoconferencing, connect to the client system over the network, periodically check audit logs, run analytical tests through a terminal and test for anomalies by pulling sample transactions over the network through macros (Teeter et al., 2010). As complex AIS became ubiquitous, there was a need to adapt auditing to the computerized environment, and Computer-Assisted Audit Tools (CAATs) were designed to aid in automating the audit process and obtaining data from the ERP systems.

Computer-assisted auditing techniques in traditional auditing

Nowadays, auditors use CAATs for retrieving data, analyzing transactional data to detect anomalies or verifying system controls, such as checking who performed a control, since the documentary evidence may not exist. Apart from choosing statistical samples and using the computer’s speed to help with massive volumes of data, those tools are mainly extraction tools to perform data analysis, after data has been processed in the system, using queries activated only periodically (Byrnes et al., 2014). In substance, this is still the once-a-year, traditional, backward-looking audit on a sample basis with no continuous setting, and the analytical tools remain based on basic statistical techniques, such as ratio or trend analysis – even though completed remotely or by applying CAATs (Alles et al., 2008).

As traditional paper trail audits became impossible because documents are electronically stored and transaction processing is now a real-time activity, a different auditing stream has emerged. In the following section of this chapter we examine the continuous auditing methodology and its recent framework developments, a fundamental redesign of audit processes using today’s technology.

Continuous auditing for a real-time approach

In the current business environment, information is processed, collected and reported so that it can provide near immediate feedback to stakeholders, supported by a strong infrastructure of automation. What distinguishes CA from traditional auditing performed today is CA’s real-time approach. Presently, there is no agreed standard definition of CA (Brannen, 2006), but recently the primary definition of CA by CICA/AICPA (1999) has been restated and expanded in Bumgarner and Vasarhelyi (2015) as:

a methodology that enables [independent] auditors to provide [written] assurance on a subject matter for which an entity’s management is responsible, using a continuous opinion schema [series of auditors’ reports] issued virtually simultaneously with, or a short period of time after, the occurrence of events underlying the subject matter.

(p. 48)

We could add the following to the above definition: continuous audit may entail predictive modules and may supplement organizational controls. The continuous audit environment will be progressively automated, with auditors taking on progressively higher judgment functions. The audit will be analytic, by exception, adaptive, and will cover financial and non-financial functions. Analyzing the above definition and the proposed extension, it is clear that CA is a new conceptualization of the whole auditing process, its ultimate point being to bring audit results closer to the occurrence of the relevant events and, as such, a significant improvement in reacting to problems as they occur so that they can be resolved immediately.

Overview of the continuous auditing components

In a CA environment, data flowing through the system are analyzed continuously (e.g. daily). The characteristics of the whole population of transactions being monitored are compared to expected results by software continuously running as an analytical review technique (Chiu et al., 2014). In this “future-oriented” audit, historical data help to model the future data. When an anomaly is detected, the auditor will be notified for further investigation through emails, other notifications or system reports (Kuhn and Sutton, 2010).
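The cycle described above can be pictured with a minimal sketch, assuming a simple threshold-style benchmark derived from audited history; the function names, record fields and figures below are hypothetical and serve only to illustrate comparing the full population against expected results and notifying the auditor of anomalies.

```python
# A minimal sketch of the daily CA cycle described above (all names hypothetical):
# the whole day's transactions are compared against an expected benchmark and the
# auditor is notified of any anomaly for follow-up.
from dataclasses import dataclass
from statistics import mean, stdev

@dataclass
class Transaction:
    doc_id: str
    amount: float

def expected_range(historical_daily_totals, k=3.0):
    """Benchmark derived from audited history: mean +/- k standard deviations."""
    mu, sigma = mean(historical_daily_totals), stdev(historical_daily_totals)
    return mu - k * sigma, mu + k * sigma

def daily_ca_run(todays_transactions, historical_daily_totals, notify):
    """Compare today's (full-population) total against the benchmark and alert."""
    total = sum(t.amount for t in todays_transactions)
    low, high = expected_range(historical_daily_totals)
    if not (low <= total <= high):
        notify(f"Daily total {total:.2f} outside expected range [{low:.2f}, {high:.2f}]")

# The notification could equally be an email, a dashboard flag or a system report.
daily_ca_run([Transaction("D001", 120.0), Transaction("D002", 5400.0)],
             [900.0, 1100.0, 950.0, 1020.0, 980.0],
             notify=print)
```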

Although initially targeted at external auditors, CA has matured from a pilot data analysis addressing the issue of auditing large paperless database systems (Vasarhelyi and Halper, 1991) into a reality increasingly affecting internal auditors as well (Vasarhelyi et al., 2010). The concept was first expanded in an implementation for controls testing at Siemens as a reaction to Sarbanes-Oxley (SOX).2 The CA framework evolved to a composite model with two primary components: procedures for monitoring business process controls (continuous controls monitoring, CCM) and procedures for detailed transactions testing (continuous data audit, CDA) (Alles et al., 2006). Incorporating the concept of CCM into the original CA conceptualization led to the renaming of the original CA to CDA. More recently, Vasarhelyi et al. (2010) proposed a third element in the CA methodology, continuous risk monitoring and assessment (CRMA), which has been expanded by Bumgarner and Vasarhelyi (2015) with continuous compliance monitoring (COMO). These four primary components of the CA framework are analyzed next.

Continuous controls monitoring and continuous data audit

Globally, the end result of monitoring is to obtain information about the performance of a process, system or data. CCM consists of a set of procedures used for monitoring internal controls and helps to ensure that procedures and business processes are operating effectively (Alles et al., 2006). The validation of the implemented controls is rooted in the conversion of manual controls assessment to automated platforms. CCM procedures include, for instance, continuously monitoring user access controls, user account authorizations and workflows related to business processes (Vasarhelyi et al., 2010). Simultaneously with CCM, CDA verifies the integrity of data flowing within and between systems to ensure that errors in the data are minimized (Chan and Vasarhelyi, 2011). CDA includes procedures for verifying underlying master data (for instance, the pricing master file against invoices), transactional data flows and key process metrics using analytics (Vasarhelyi et al., 2010), through which transactional data are continuously tested for anomalies.
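As an illustration of the CDA master data verification mentioned above (invoices against the pricing master file), a hedged sketch might flag invoice lines whose prices deviate from the approved master beyond a tolerance; the identifiers, field names and tolerance below are invented for illustration.

```python
# Hedged illustration of a CDA check: invoice prices reconciled against the
# pricing master file; deviations beyond a tolerance become exceptions.
price_master = {"SKU-100": 25.00, "SKU-200": 40.00}   # approved prices (hypothetical)
invoices = [
    {"invoice": "INV-1", "sku": "SKU-100", "unit_price": 25.00},
    {"invoice": "INV-2", "sku": "SKU-200", "unit_price": 44.00},
    {"invoice": "INV-3", "sku": "SKU-999", "unit_price": 10.00},
]

def cda_price_check(invoices, price_master, tolerance=0.01):
    exceptions = []
    for line in invoices:
        approved = price_master.get(line["sku"])
        if approved is None:
            exceptions.append((line["invoice"], "SKU not in pricing master"))
        elif abs(line["unit_price"] - approved) / approved > tolerance:
            exceptions.append((line["invoice"], f"price {line['unit_price']} vs master {approved}"))
    return exceptions

print(cda_price_check(invoices, price_master))
# [('INV-2', 'price 44.0 vs master 40.0'), ('INV-3', 'SKU not in pricing master')]
```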

CCM is a part of the wider set of continuous monitoring (CM) activities. The terms CM and CA are often used interchangeably. However, they are separate concepts and activities, although both consist of the analysis of data on a real-time basis against benchmarks. The key difference relates to the ownership of the process. While CM is viewed as a management function to ensure processes are working as defined and approved, auditors may likewise have a process in place that continually tests controls implemented by management based upon criteria defined by auditors (CCM) (Alles et al., 2006). As such, in the context of technology, auditors have regularly seen CM as revolving around testing controls (Brannen, 2006). With the need to issue opinions on the adequacy of internal controls (e.g., a SOX requirement), it became clear that CCM insights and analytics would be of interest to management in assessing the effectiveness of such controls. Hence, CA is a technological innovation of the traditional audit process being used by a variety of actors to continually gather data to support not only auditing, but also management objectives and activities (Alles et al., 2006; Chan and Vasarhelyi, 2011), both for the issuance of the mandated annual audit opinion and for business process reviews.

Other continuous auditing components: continuous risk monitoring and assessment and continuous compliance monitoring

The recent sub-prime crisis made it obvious that the enterprise risk management (ERM) in place was not adequate to assess business risks (Bumgarner and Vasarhelyi, 2015). Auditors have their own systematized approaches, heavily reliant on unstructured and periodic assessment of risk and on judgment (Vasarhelyi et al., 2010). Having real-time information on changes in the business and audit environments is critical in CA. Technological advances also allow for closer and more realistic measurements of risk and for continuous risk monitoring and assessment (CRMA), using algorithms and probability models to assess judgments and risk evaluations and to monitor operational and environmental risks (Vasarhelyi et al., 2010). Hence, the aim of CRMA is to make CA dynamic by reflecting risk management practices in the audit itself; therefore, new CCM and CDA resources may be updated as entity risks change.

Although much of the traditional world of compliance is qualitative, monitoring organizational compliance with regulation is progressively being implemented using information technology (continuous compliance monitoring – COMO), in particular by financial institutions. Implementing CRMA and COMO is still a slow-moving work in progress, as it first requires the formalization of the practice and a solution for how it can be automated (Bumgarner and Vasarhelyi, 2015) – a prerequisite whose fulfilment is still in its infancy.

Continuous auditing and audit automation

CA has to be based upon computer-assisted tools and techniques, being feasible only if implemented as an automated process with full access to relevant events (Kogan et al., 1999); this explains why the terms “audit automation” and “continuous auditing” might be confused or used interchangeably (e.g., Chiu et al., 2014). Identification of exceptions, analysis of numeric patterns, review of trends and testing of controls are all automated, so IT plays a key role in CA activities. However, CA is not a simple automation of traditional audit procedures. All auditors have some kind of tools and audit automation to support their work, such as electronic working papers in customized audit databases and data analysis tools; however, this is not the concept of the CA methodology. Automation requires formalized audit procedures programmed into an automated audit system that can run continuously. The automated tools are used to determine whether organizational data is accurately maintained and internal controls function properly (Vasarhelyi et al., 2010). However, if organizational data are strictly manual, traditional auditing should be maintained. Hence, before implementing a CA approach, it is necessary to evaluate the extent to which data, controls and key processes are, or should be, formalized and automated.

As continuous monitoring of internal controls and testing of transactions are automated, the main role of the auditor will be investigating exceptions raised by the audit system and focusing on high-level judgmental audit areas. Therefore, automation is an essential ingredient of CA, though manual involvement remains important, particularly in situations where extensive judgment is required and where exceptions and outliers are identified (Vasarhelyi et al., 2004). Human factors will, thus, continue to be integrated in the audit process, although not in the foreground as in the traditional audit.

Continuous auditing as a proactive audit

Extensive research (e.g., Kogan et al., 1999; Brown et al., 2007; Chiu et al., 2014) has been conducted regarding the functionality, benefits and challenges of CA. In theory, the technological feasibility of CA appears simple as accounting information is now recorded electronically and computer networks allow remote access to that information. However, in practice, the great variety of software systems used in organizations makes it difficult for auditors to develop auditing tools, and furthermore, to make the implementation of these audit tools economically feasible (Kogan et al., 1999). CA is a capital-intensive technology, which requires sizable investments not only in hardware, but also in software and networking. Application development experiences are addressed in detail in the following section.

CA has been defined both as a process and as a technology (Bumgarner and Vasarhelyi, 2015). As a process, CA is a rethink of auditing, from the way data is made available to the auditor, to the kinds of tests the auditor conducts, how alarms are dealt with, what kinds of reports are issued, and how often and to whom they are issued for follow-up (Vasarhelyi et al., 2010). It allows the auditor to actively detect and investigate exceptions as they occur. Depending on system capabilities, transactions involving internal control violations and transaction anomalies can even be aborted or suspended in real time until investigated. Hence, CA can be considered a proactive audit, rather than the reactive audit that auditing has been since its inception (Chan and Vasarhelyi, 2011). When predictive, the auditor relies on models to predict results in an account or transaction, which are compared with actuals in near real time to detect substantive variances (Kogan et al., 2014). While conceptually it seems simple, actual adoption of CA in practice has been limited, as discussed in the next section.

Continuous auditing in practice – state of the art

The real-time environment generated by advances in AIS gave birth to the CA process, and some implementations of CA technologies have progressively been prototyped (Vasarhelyi and Halper, 1991; Vasarhelyi et al., 2004; Alles et al., 2006; Kogan et al., 2014; Singh and Best, 2015). Since the pilot implementation of CCM as a proof of concept in a large transnational company, internal auditors have increased their use of technology with the goal of automating the internal audit process (Alles et al., 2006). However, although the concept of CA has been researched for many years and some applications have been documented, organizations and auditors, particularly external auditors, have been generally unable to turn this concept into practice. The ensuing discussion presents the alternative architectures for supporting CA and the development of predictive models to define benchmarks, to better understand why CDA and CCM appear to be struggling for acceptance (Byrnes et al., 2014).

Technical architecture of continuous auditing

The CA cycle starts with the auditor connecting into the processing system and ends when the auditor disconnects (Chan and Vasarhelyi, 2011). Major issues regarding this connection relate to access to the processing system and data (direct access, either to the transactional database or to the application layer, or intermediated access through a data warehouse), as well as its access security.

CA and auditing system architectures to capture the data are based on two main designs: Embedded Audit Modules (EAM) (Vasarhelyi and Halper, 1991) and the Monitoring and Control Layer (MCL), introduced by Vasarhelyi et al. (2004). With EAM, audit programs are integrated directly within the application to provide CM of the system’s processing of transactions, through examination of each transaction as it enters the system, using the language of the application itself (Rezaee et al., 2002). However, those modules running in the background of the system may reduce the transactional processing capability and efficiency of the system. Moreover, as the audit application is permanently resident within the processing system, possible manipulation by the auditee’s personnel and dependency on its IT department to make changes also create some concerns about the integrity of the EAM approach. Alternatively, MCL is an external software module operating independently of the information system to be monitored or audited, but linked to that system (Vasarhelyi et al., 2004). Unlike with EAM, the CA system receives periodic interfaces of data, as determined by the auditor, that are processed against pre-defined rule-sets of audit programs located inside the auditing application, which is physically and virtually outside the processing system. Alles et al. (2006) have documented a system prototype based on MCL for controls testing at Siemens. As pointed out by those authors, in contrast with EAM, this approach has fewer issues related to software maintenance, client independence and reliance on IT personnel.
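Under simplifying assumptions, the MCL design can be pictured as an external module that receives periodic data extracts and runs them against a pre-defined rule-set kept outside the processing system, as in the sketch below; the rule names, record fields and extract format are invented only to illustrate the separation between the audited application and the audit logic.

```python
# A simplified sketch of the MCL design described above: the audit rules live
# outside the processing system and are applied to periodic data extracts.
from typing import Callable, Dict, List

Rule = Callable[[dict], bool]   # returns True when the record violates the rule

RULESET: Dict[str, Rule] = {
    "negative_amount": lambda r: r.get("amount", 0) < 0,
    "missing_approver": lambda r: r.get("approver") in (None, ""),
}

def process_extract(records: List[dict], ruleset: Dict[str, Rule]) -> List[dict]:
    """Run every rule over the periodic extract and collect exceptions."""
    exceptions = []
    for rec in records:
        for name, rule in ruleset.items():
            if rule(rec):
                exceptions.append({"record": rec.get("id"), "rule": name})
    return exceptions

# The processing system only pushes (or lets the MCL pull) the extract; the
# rules themselves never reside inside the audited application.
extract = [{"id": "PO-1", "amount": -50, "approver": "jdoe"},
           {"id": "PO-2", "amount": 200, "approver": ""}]
print(process_extract(extract, RULESET))
```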

To make CA possible and cost-effective, it was expected that many of the controls would be integrated controls (CICA/AICPA, 1999). However, today’s organizations have not yet implemented the end-to-end centralized and automated controls that CCM depends upon. Adopting EAM solutions would require implementing several modules, one in each application, and the existence of a variety of systems might challenge the required connections with the MCL. Rezaee et al. (2002) proposed a conceptual technical architecture for building CA systems that combines the use of audit data warehouses (which integrate data from all application systems throughout the organization) and audit data marts (smaller warehouses that focus on only one functional area, such as accounting). Each data mart loads the appropriate data from the data warehouse, and audit tests residing in the data mart run periodically and generate exception reports. Kogan et al. (2014) also designed a data-oriented system for organizations in which data derived from multiple legacy systems are deployed in a single data warehouse. In terms of the platform for the audit software, an audit data warehouse, linked with the disparate systems, that integrates the relevant data being generated has been seen as a viable technical solution (Rezaee et al., 2002).

Securing CA is also crucial for its architecture. Moving data over the network for remote processing and opening new channels between auditors and auditees using the networking infrastructure of the Internet has to be supported by security technologies and policies to ensure that the audit applications are protected against unauthorized alteration; furthermore, reliable and efficient electronic communication methods need to be in place. The next subsection discusses how to build standardized audit tests, resident in the EAM or in audit data warehouses, running continuously and generating exception reports.

Business processes modeling and data analytics

Before the stages of data modeling, benchmark development and data analytics, audit procedures should be automated at a relatively low level, down to the level of individual processes (Alles et al., 2006; Vasarhelyi et al., 2004). Audit systems only detect anomalies that the auditor anticipates and, even more specifically, those anomalies that the applications are programmed to seek; therefore, the irregularities to be monitored should be defined in advance. In developing a CA system for verifying key process metrics, the assumption is that access to transaction-level data will enable auditors to design expectation models for analytical procedures at the business process level, as opposed to the traditional practice of relying on ratio or trend analysis at a higher level of aggregation (Vasarhelyi et al., 2010). Performing analytical procedures requires the determination of what can be expected and of a level of precision (i.e., how accurate the auditor wants the model to be, according to the auditor’s perception).

Data modeling and data analytics techniques, developed from statistics and data mining, are used in analytical procedures for monitoring and testing transaction details (controls exceptions and transactions verifications) and account balances (Chan and Vasarhelyi, 2011). For CCM, internal control policies serve as the benchmark against which employees’ actions are compared, and any violation is flagged for verification. For CDA at the account level, data analytics help to understand the evolution of the activity. Data modeling involves the use of historical audited transaction data and account balances to generate a prediction of data through empirical models of expected behavior, such as linear regressions. Based on the assumption that future transaction data and their behavior characteristics should be similar to the past, data analytics are used to compare present unaudited transactions and account balances (metrics) against the benchmarks created by data modeling, considering an acceptable range (Chan and Vasarhelyi, 2011). Estimates of the coefficients of the variables considered in the models should be statistically significant in order to achieve better precision in the metrics generated (Kogan et al., 2014). Variances from these metrics are treated as an alert. Hence, imperfect models will generate false positive errors (a false alarm) and false negative errors (an anomaly not detected by the system).
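The data modeling and comparison step can be illustrated with a hedged sketch: a linear regression is fitted on audited history and current unaudited observations falling outside an acceptable range around the prediction raise alerts. The metrics (shipment counts predicting sales), the figures and the three-standard-deviation range below are assumptions chosen only to make the idea concrete.

```python
# Minimal sketch of regression-based data modeling for CDA: fit a benchmark on
# audited history, then flag unaudited observations outside the acceptable range.
import numpy as np

# Audited history: weekly shipment counts (x) and recorded sales (y), both invented.
x_hist = np.array([100, 120, 90, 150, 130], dtype=float)
y_hist = np.array([51000, 60500, 45200, 76000, 65800], dtype=float)

# Fit y = a + b*x by least squares and estimate the residual spread.
b, a = np.polyfit(x_hist, y_hist, 1)
residual_sd = np.std(y_hist - (a + b * x_hist), ddof=2)

def cda_alert(x_new, y_new, k=3.0):
    """Flag the observation if it deviates from the model by more than k standard deviations."""
    expected = a + b * x_new
    return abs(y_new - expected) > k * residual_sd, expected

# Current unaudited week: 140 shipments but 80,000 in recorded sales.
is_alert, expected = cda_alert(140, 80000)
print(is_alert, round(expected))
# A tighter k reduces false negatives but increases false alarms, and vice versa.
```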

Aggregation and benchmarks for continuous auditing analytics

Much of the recent research on CA has focused on developing improved models for actual and more reasonable comparisons (Chiu et al., 2014; Kogan et al., 2014). Creating a metric that will prove effective in detecting exceptions is not a trivial task, since it must be based on what is “usual” for an observation (Kogan et al., 2014). In an environment where disaggregated data is available (in contrast with the traditional audit), financial and/or non-financial metrics can be used, such as document counts or number of transactions. Auditing on different metrics would enable auditors to have a more diverse set of patterns and benchmarks (Kogan et al., 2014). There is, however, a trade-off regarding aggregation of data. The more disaggregated the metrics are, the more variability is observed among individual transactions; this is more likely to lead to unstable analytical models and to generate more errors. Depending on the accuracy of those analytical criteria, problems may emerge both from the flow of false positives (exceptions detected that are not exceptions) and alarm floods, which might generate a substantial information overload for the auditor, and, conversely, from failure to detect exceptions (Kuhn and Sutton, 2010). However, on the positive side, using disaggregated metrics will narrow down the scope of the auditor’s follow-up.
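The aggregation trade-off can be made concrete with a small, entirely invented illustration: the relative variability (coefficient of variation) of individual transaction amounts is far higher than that of daily totals, so a model built at the transaction level faces a noisier and less stable series, while an aggregated series smooths away the individual transaction the auditor would ultimately need to examine.

```python
# Illustrative only: compare relative variability at transaction level versus
# after daily aggregation (all amounts are invented).
import numpy as np

transactions = {                      # amounts per day
    "d1": [20, 500, 35, 60],
    "d2": [25, 480, 40, 55],
    "d3": [30, 510, 45, 70],
}

tx_level = np.array([a for day in transactions.values() for a in day], dtype=float)
daily_level = np.array([sum(day) for day in transactions.values()], dtype=float)

cv = lambda x: np.std(x) / np.mean(x)                 # coefficient of variation
print(f"transaction-level CV: {cv(tx_level):.2f}")    # high variability
print(f"daily-aggregate CV:   {cv(daily_level):.2f}") # much smoother series
```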

A major issue in this new research is the feasibility of using statistics in practice. How willing auditors are to model CA applications in practice is an open question (Kuhn and Sutton, 2010). Academics have a clear competitive advantage in innovating the stages of data modeling and data analytics to fit reality into a benchmark, but that research will be fruitless without implementation and validation in practice (Chan and Vasarhelyi, 2011; Kogan et al., 2014).

While not yet an established methodology, interest in exploiting the CA process has advanced over recent years, particularly in internal audit (Vasarhelyi et al., 2012). The drivers and constraints of CA have proven to be economic and regulatory, given that auditing is a business practice and not a piece of software (Alles et al., 2008). Is the absence of exceptions, or the estimates of the predictive models being slightly below/over actual data, enough to conclude that controls are effective and transactions and balances accurate (Titera, 2013)? How can organizations integrate into the audit working papers CA alerts, weaknesses of automatic controls, exceptions, the basis of predictive models or other evidence from the latest advances in technology, such as camera devices, which might monitor a warehouse and at the same time be used to confirm deliveries of materials (Chiu et al., 2014)? The emergence of big data has changed the landscape of CA, as big data becomes an important source for analytics. The next section of this chapter discusses the impact of the recent phenomenon of big data on the CA approach.

Big data in a continuous auditing environment

Big data3 originates from traditional transaction systems and from many new exogenous sources, such as emails, telephone calls, social media and security videos. Since much of this big data informs and affects corporate decisions that are important to both internal and external stakeholders, auditors will need to expand from traditional analysis of structured financial data (e.g., general ledger or transaction data) to other non-structured data outside the system, and to non-financial data, such as social network logs, company emails and newspaper articles, to identify potential transactional anomalies and trends (Brown-Liburd et al., 2015; Cao et al., 2015).

Big data in the audit environment

The advent of big data means that there is extensive relevant audit evidence outside the organization in the form of non-financial data. However, traditional analytical tools, such as Excel and Access, require structured data to perform effectively, and existing CAATs, due to their limited use of advanced statistical techniques, do not have the capability to import such information (Brown-Liburd et al., 2015). Incorporating big data into the audit process is, overall, value-added for auditors, but this does not come without challenges. As opposed to explaining causation, which is a critical aspect in auditing, the use of big data limits analyses to correlations, looking for patterns that might help in determining expectations in analytical procedures (Cao et al., 2015). This focus is problematic because correlations simply identify anomalies that direct the auditor’s attention to investigate their causes and, alone, do not provide sufficient and appropriate audit evidence (Brown-Liburd et al., 2015).

Big data is a powerful predictor for auditor expectations of financial data. However, the so-called four “Vs” of big data (massive volume, high velocity, variety and uncertain veracity) present challenges beyond the capability of current CA methods (Zhang et al., 2015). Therefore, an effective development of the CA methodology to accommodate big data analytics requires updating the infrastructure for accessing and retrieving data with diverse formats – a particularly challenging task in a context where CA is neither fully implemented nor an established technology yet, as previously discussed.

How big data are transforming the continuous audit

While the collection of big data4 is relatively easy, the same cannot be said about processing and extracting useful information from large amounts of data (Brown-Liburd et al., 2015). A major concern is data quality, as noise in big data leads to an overload of false positive alarms (Cao et al., 2015). Data consistency, data identification, data integrity and data aggregation are some sources of concerns for the current CA architecture for the layer dealing with data provisioning, filtering and diagnostics (Zhang et al., 2015).

The new CA approach needs to verify the relationship among data sources and manage data inconsistencies, such as data formats and, most importantly, any contradictions between data from different sources. The unstructured nature of the data that comes in many formats, such as text, image or video, complicates the data management and processing software, as well as the data identification (Brown-Liburd et al., 2015). For example, the revenue amount for a given sale can be easily identified by the CA system, but it can be challenging to automatically connect this information with the associated sales terms and conditions, which are in an unstructured textual format (Zhang et al., 2015). Furthermore, as volume and types of big data are so expansive, it becomes more difficult to identify data that has been modified or deleted in order not to lose reliable data for audit analysis purposes. Current methods of verifying data integrity, such as reasonability, edit checks, and comparison with other sources, may not be practical for big data audit applications. How to integrate techniques of data inconsistency checks in the audit data warehouse or MCL without losing efficiency, and the suitability of current methods to address the data identification issue for the CA with big data, are open research questions (Zhang et al., 2015). Modified and incomplete-data detecting and repairing techniques are also imperative in CA systems (Zhang et al., 2015).
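One of the cross-source consistency checks alluded to above might, under simplifying assumptions, look like the sketch below, which reconciles ERP records against a second, non-ERP source and reports both contradictions and records lacking corroboration; the sources, fields and identifiers are purely illustrative.

```python
# Hedged sketch of a cross-source consistency check: quantities recorded in the
# ERP are reconciled against quantities confirmed by an external source (e.g.
# structured information extracted from delivery confirmations).
erp_invoices = {"SO-1": {"qty": 10}, "SO-2": {"qty": 5}, "SO-3": {"qty": 7}}
external_confirmations = {"SO-1": {"qty": 10}, "SO-2": {"qty": 4}}

def cross_source_check(erp, external):
    issues = []
    for order, rec in erp.items():
        ext = external.get(order)
        if ext is None:
            issues.append((order, "no corroborating record in external source"))
        elif ext["qty"] != rec["qty"]:
            issues.append((order, f"qty mismatch: ERP {rec['qty']} vs external {ext['qty']}"))
    return issues

print(cross_source_check(erp_invoices, external_confirmations))
```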

Finally, as big data is most likely coming from different sources, CA using big data needs to aggregate data to meaningfully summarize and simplify it. However, as discussed in the previous section, there is a trade-off in data aggregation. The challenges introduced by the aggregation become more evident in a CA of big data and present topics for much future research (Zhang et al., 2015). Besides the greater availability of such non-financial data, technological advancements have also increased the importance placed on internal controls and CCM. The next section describes the recent development of CM techniques.

Development of continuous monitoring activities

Today, thousands of data flows are captured in the different business processes, and hundreds of controls are used through ERPs to generate transactions and reports. As stated before, CA can be defined as a process that continually tests controls based upon criteria defined by the auditor, and data analytics models may also serve as a direct test of controls. We now turn to CCM to further explore how controls can be monitored on a continuous basis.

Monitoring of control settings in the CA conceptual model

In a traditional audit, controls testing is performed on a sample basis, through inquiries, observation, inspection or re-performance, and is generally phased so that part is performed at an interim date and the remaining portion at period end. As the documentation of business events is increasingly conducted through computer-based processes, which automatically collect data, and businesses progressively implement electronic documents and signatures, traditional manual audit activities, such as observation and inspection, are becoming less applicable or even impossible within the current environment (Chiu et al., 2014). To detect control violations, the CCM audit software looks, for instance, at master data tables to check the approved business partners (e.g., customers, suppliers).
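A minimal sketch of such a check, with hypothetical identifiers and an invented master table, is shown below: each posting is screened against the approved business partner master data and violations are reported as exceptions.

```python
# Minimal sketch of a CCM master data check: flag postings to business partners
# that are not in the approved master table (all identifiers hypothetical).
approved_partners = {"C-001", "C-002", "S-010"}    # approved customers/suppliers

def ccm_partner_check(transaction):
    """Return an exception message if the partner is not in the approved master data."""
    partner = transaction.get("partner_id")
    if partner not in approved_partners:
        return f"{transaction['doc']}: partner {partner} not in approved master data"
    return None

postings = [{"doc": "AR-77", "partner_id": "C-001"},
            {"doc": "AP-12", "partner_id": "S-999"}]
exceptions = [e for e in map(ccm_partner_check, postings) if e]
print(exceptions)   # ['AP-12: partner S-999 not in approved master data']
```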

The CCM application was first used by internal auditors to restrain the headcount demands of their SOX tasks following the requirement for assurance over financial reporting controls (Alles et al., 2008). However, even in the internal audit context, its practical implementation, whether by using MCL or EAM, is still lower than envisaged a few years ago (Vasarhelyi et al., 2012). One possible reason is that the validation of the effectiveness of any manual control through a CCM methodology should be formalized by the conversion of the manual control to automated platforms (Vasarhelyi et al., 2010). Using process mining (PM) to identify transactions that do not follow an approved workflow has been seen as an alternative for monitoring transactions and investigating them in detail, instead of using formalized audit programs in a computer executable format integrated into the CCM software.

Process mining as an audit tool

PM is used to evaluate ERP log files and to gather insight into what steps people actually take when performing their tasks (Jan et al., 2013).5 Information is extracted from an event log, which is a chronological record of computerized activities saved into a file in the system. PM is distinctive as an audit tool because it focuses on the path of transactions, not directly on the validation of the values in the related process, and it uses the full population of data instead of a sample. It is, thus, a powerful tool for tests of controls, contrasting with the traditional approach. The data recorded by an ERP system include not only entries made by users of that system, the input-data, but also meta-data, which is information automatically recorded by the system about that input and, as such, of particular interest to the auditor (Jan et al., 2014). To create an event log, those data are extracted from various tables throughout the ERP system database and assembled into a structured database to allow an adequate analysis of the input (activity) and of other information about the actual operation of the process. However, the major challenge is capturing the meta-data located across numerous tables of the ERP system into a structured and usable event log (Jan et al., 2014).
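How such an event log might be assembled can be sketched as follows, assuming three hypothetical ERP tables for a purchase-to-pay process; the table contents, field names and activities are invented, and a real extraction would have to cope with far more tables and far larger volumes.

```python
# Simplified sketch: join user entries (input-data) and system-recorded meta-data
# from separate ERP tables into one chronological event log keyed by the case
# (here a purchase order), ready for process mining.
import csv
from operator import itemgetter

# Fragments as they might sit in separate ERP tables (hypothetical).
order_headers = [{"po": "PO-1", "created_by": "anna", "created_at": "2016-03-01T09:12"}]
goods_receipts = [{"po": "PO-1", "entered_by": "ben", "entered_at": "2016-03-03T14:05"}]
invoice_posts = [{"po": "PO-1", "entered_by": "carl", "entered_at": "2016-03-04T08:40"}]

def build_event_log():
    events = []
    for h in order_headers:
        events.append({"case": h["po"], "activity": "Create PO", "user": h["created_by"], "timestamp": h["created_at"]})
    for g in goods_receipts:
        events.append({"case": g["po"], "activity": "Goods Receipt", "user": g["entered_by"], "timestamp": g["entered_at"]})
    for i in invoice_posts:
        events.append({"case": i["po"], "activity": "Invoice Posting", "user": i["entered_by"], "timestamp": i["entered_at"]})
    return sorted(events, key=itemgetter("case", "timestamp"))

# The resulting log (case, activity, user, timestamp) can be exported and mined
# to compare the actual paths of transactions with the approved workflow.
with open("event_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["case", "activity", "user", "timestamp"])
    writer.writeheader()
    writer.writerows(build_event_log())
```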

Besides obtaining meta-information about individual data entries, PM provides the ability to detect patterns across transactions and the users entering that data, such as whether certain transactions are regularly associated with a third party, at a certain time, or in a certain order (Jan et al., 2013). Hence, it can be used as a complementary analytical procedure tool for CA, and particularly, in the CCM context.

Contribution of process mining in continuous auditing

As already discussed, one major issue with analytics is the number of false positives. As a follow up procedure, PM may be of great value to explore in depth the circumstances that gave rise to the anomalies resulting from the analytical testing, to either identify a control failure or, alternatively, to improve the models to reduce future false positives (Jan et al., 2013). Singh and Best (2015) developed a prototype continuous monitoring system that relies exclusively on recorded transaction activity of profile users to recreate transaction histories and relationships among individuals as soon as events occur. The authors demonstrate that it is feasible to implement CM in practice using the full population of transaction data and meta-data from a SAP system to enrich the audit process.

PM can be used in conjunction with other analytical procedures to narrow down the auditor’s investigation, but it may itself be used as a primary analytical procedure, instead of business process modeling (Jan et al., 2014). Whether PM may complement rather than replace CCM analytics still needs to be researched. Given the difficulties and high cost of applying process mining to the entirety of an organization’s data on a timely basis, there might be some advantages in confining PM to the event logs of the anomalous transactions to be checked and investigated (Jan et al., 2014).

Conclusion

The traditional auditing paradigm based on sampling is still dominant nowadays (Byrnes et al., 2014). Technology has moved on, but auditing has not. The development of ERP systems provides the necessary infrastructure for the effective shift of auditing from a periodic review to a real-time (or near real-time) process through CA and PM applications. Design issues notwithstanding, there has been some applicability, particularly by internal auditors. The lesson learned from some implementations in past years is simple: organizations and auditors should start small (Chan and Vasarhelyi, 2011). Moving to automation of data, processes and controls is essential for CA development; consequently, organizations already possessing strongly automated processes and controls are better suited to a CA approach.

Some questions, however, remain, as continuously advancing technologies, ERP systems and AIS in general will most likely look different in the near future. Moreover, CA systems need to be adapted to the big data phenomenon and to ensure data quality processing. Will the existing CA architecture, not yet broadly implemented, be effective for those future ERPs and adaptable to big data challenges? Would a hybridization of continuous and traditional auditing procedures be the way to a more effective dynamic of the CA approach in the current and forthcoming environment? Expectation models have to be developed for each business process and might vary between processes and times of the year (Kogan et al., 2014). Would PM be a better CCM methodology? Business risks are changing dynamically, and CDA and CCM procedures might also have to be constantly adapted. But how can technology be used to continuously monitor and assess those risks, and organizational compliance, in order to redirect audit procedures? These are relevant questions to which answers are not yet available.

Finally, system architecture and software components are important cornerstones; however, auditor skill sets are also fundamental for a successful application of a CA approach. The data analytics environment will result in auditor judgment playing a much more significant role, due to the potentially large numbers of exceptions to evaluate and metrics to be continuously reviewed (Vasarhelyi et al., 2010). Are students ready to shift from the traditional auditor profile and knowledge to the skills required by CA? The evolution toward CA may take time and implementing it may be complex, but it is not an insurmountable challenge in the continuous improvement – indeed, in the inevitable adaptation to a new business and IT context – of the auditing profession.

Acknowledgement

The author is thankful to Professor João Oliveira for his useful suggestions.

Notes

1  Auditing, for purposes of this chapter, includes internal and external auditing, except when otherwise detailed.

2  SOX 404 mandates that all publicly traded companies must establish, document, test and maintain internal controls procedures to ensure their effectiveness.

3  Chapters 12 and 13 expand on this topic.

4  The role of data analytics coupled with big data in auditing has been discussed in several papers (Alles, 2015; Brown-Liburd et al., 2015; Cao et al., 2015; Zhang et al., 2015).

5  Jan et al. (2013) explored the use of PM in auditing by identifying the sources of value-added PM when applied to auditing.

References

Alles, M., Brennan, G., Kogan, A. and Vasarhelyi, M. A. (2006). Continuous monitoring of business process controls: a pilot implementation of a continuous auditing system at Siemens. International Journal of Accounting Information Systems, 7(2), 137–161.

Alles, M., Kogan, A. and Vasarhelyi, M. A. (2008). Putting continuous auditing theory into practice: lessons from two pilot implementations. Journal of Information Systems, 22(2), 195–214.

Alles, M. (2015). Drivers of the use and facilitators of the evolution of Big Data by the audit profession. Accounting Horizons, 29(2), 439–449.

Brannen, L. (2006). Demystifying continuous audit. Business Finance, 12(3), 4–4.

Brown, C., Wong, J. and Baldwin, A. (2007). A review and analysis of the existing research streams in continuous auditing. Journal of Emerging Technologies in Accounting, 4, 1–28.

Brown-Liburd, H., Hussein, I. and Lombardi, D. (2015). Behavioral implications of Big Data’s impact on audit judgment and decision making and future research directions. Accounting Horizons, 29(2), 451–468.

Byrnes, P., Criste, T., Stewart, T. and Vasarhelyi, M. A. (2014). Reimagining Auditing in a Wired World. New York: AICPA.

Bumgarner, N. and Vasarhelyi, M. A. (2015). Continuous Auditing: A New View. Audit Analytics and Continuous Audit: Looking Toward the Future. New York: AICPA.

CICA/AICPA (Canadian Institute of Chartered Accountants/American Institute of Certified Public Accountants). (1999). Continuous Auditing, Research Report. Toronto, Canada.

Cao, M., Chychyla, R. and Stewart, T. (2015). Big Data analytics in financial statement audits. Accounting Horizons, 29(2), 423–429.

Chan, D. Y. and Vasarhelyi, M. A. (2011). Innovation and practice of continuous auditing. International Journal of Accounting Information Systems, 12(2), 152–160.

Chiu, V., Liu, Q. and Vasarhelyi, M. A. (2014). The development and intellectual structure of continuous auditing research. Journal of Accounting Literature, 33(1), 37–57.

Kogan, A., Sudit, E. F. and Vasarhelyi, M. A. (1999). Continuous online auditing: a program of research. Journal of Information Systems, 13(2), 87–103.

Kogan, A., Alles, M., Vasarhelyi, M. A. and Wu, J. (2014). Design and evaluation of a continuous data level auditing system. Auditing: A Journal of Practice & Theory, 33(4), 221–245.

Kuhn Jr., J. R. and Sutton, S. G. (2010). Continuous auditing in ERP system environments: the current state and future directions. Journal of Information Systems, 24(1), 91–112.

Jan, M., Alles, M. and Vasarhelyi, M. A. (2013). The case for process mining in auditing: sources of value added and areas of application. International Journal of Accounting Information Systems, 14, 1–20.

Jan, M., Alles, M. and Vasarhelyi, M. A. (2014). A field study on the use of process mining of event logs as an analytical procedure in auditing. The Accounting Review, 89(5), 1751–1773.

Rezaee, Z., Sharbatoghlie, A., Elam, R. and McMickle, P. L. (2002). Continuous auditing: building automated auditing capability. Auditing: A Journal of Practice & Theory, 21(1), 147–163.

Singh, K. and Best, P. J. (2015). Design and implementation of continuous monitoring and auditing in SAP enterprise resource planning. International Journal of Auditing, 19, 307–317.

Teeter, R. A., Alles, M. G. and Vasarhelyi, M. A. (2010). The remote audit. Journal of Emerging Technologies in Accounting, 7, 73–88.

Titera, W. R. (2013). Updating audit standard – enabling audit data analysis. Journal of Information Systems, 27(1), 325–331.

Vasarhelyi, M. A. and Halper, F. B. (1991). The continuous audit of online systems. Auditing: A Journal of Practice & Theory, 10(1), 110–125.

Vasarhelyi, M. A., Alles, M. and Kogan, A. (2004). Principles of analytic monitoring for continuous assurance. Journal of Emerging Technologies in Accounting, 1, 1–21.

Vasarhelyi, M. A., Alles, M. and Williams, K. T. (2010). Continuous Assurance for the Now Economy: A Thought Leadership Paper for the Institute of Chartered Accountants in Australia. Queensland, Australia: Institute of Chartered Accountants.

Vasarhelyi, M. A., Alles, M., Kuenkaikaew, S. and Littley, J. (2012). The acceptance and adoption of continuous auditing by internal auditors. International Journal of Accounting Information Systems, 13, 267–281.

Zhang, J., Yang, X. and Appelbaum, D. (2015). Toward effective Big Data analysis in continuous auditing. Accounting Horizons, 29(2), 469–476.