The enduring importance of professional skepticism

Accounting Today - Assurance - By Sara Lord

When you think of the audit of the future, what words come to mind? Big data, artificial intelligence, virtual workforce, blockchain, professional skepticism? If professional skepticism didn't make your short list, it should.

Professional skepticism is a foundation of the auditing profession that we need to maintain and evolve to support the audit of the future.

Professional skepticism has always been used to validate information through probing questions, critical assessment of evidence, and attention to red flags and inconsistencies. The auditor's use of professional skepticism will need to evolve alongside the adoption of technological advancements by the profession and by clients. Skepticism will need to be applied at every stage of the audit process, and auditors will need to be trained to find risks and potential errors that technology-based tools have missed.

For decades, auditors have developed professional skepticism by starting with simple relationships and expectations. For example, agreeing changes in balance sheet accounts to the amounts reported in the cash flow statement taught the auditor how financial statement information fits together and flows. From this basis, auditors learned to identify anomalies as they progressed through their careers.

When executing an audit, auditors transitioned from predictable analytical procedures, such as those for depreciation expense, to more complex analyses, such as how changes in revenue tie to external indicators. These experiences fostered professional skepticism by teaching auditors to evaluate expectations and to identify risks and results that don't make sense in the context of all the relevant information.

But as we continue the evolution into data-driven technology, how does the auditor know that the results are reasonable? In the world of artificial intelligence, auditors will need to understand, and be skeptical about, how the software is working and learning. They will also need to understand why the outputs from an artificial intelligence tool were possible, and interpret what they mean in the context of individual and unique client situations.

Firms will differ in how they tackle this need to understand and be skeptical about technology. Some firms will build teams of data scientists who analyze data provided by auditors, leaving the auditors to interpret the results. Some firms will train their auditors to run the applications and tools and to understand how to evaluate the data.

In these firms, the auditor will need more technology training and an interest in the processing of data. Still other firms will combine data scientists dedicated to the use of technology with auditors who can use the tools.


Regardless of where the manipulation of data lies, the auditor of the future will need to understand how the data was manipulated and what the results mean, and then be able to apply professional skepticism to identify whether the results reflect the unique client situation.

This will require auditor training to evolve. For example, in the future, auditors will spend less time learning about how a nonstatistical sample works, and more time learning about the evaluation of the sample inputs and outputs, being skeptical about what is identified and what is not identified.

Accounting and auditing knowledge will continue to be baseline knowledge, but there will be a need for more training on critical thinking, relationships, and processing multiple outcomes. These changes will impact auditors of all experience levels.

Many companies and firms are experimenting with or using optical character recognition to read and evaluate leases for adoption of the new lease accounting standard. This is a perfect example of where professional skepticism is required.

When using OCR tools, the auditor needs to not just accept the results the OCR tool provides, but instead apply professional skepticism in evaluating multiple factors. Auditors need to ask questions, such as: What key terms is the OCR searching for? Do those data queries represent all of the risks of the client's unique circumstances? What do identified outliers mean, and why were they identified as outliers? How do I know the results are complete and that a unique provision with accounting implications wasn't overlooked?

While we may be confident that the OCR tool is more accurate at identifying specific terms or items in a contract than a human may be, we must still think about what may have been missed. What might have tripped a skeptical thought in a reader that something was unexpected and needed to be evaluated further?

While we often focus on the challenges of building professional skepticism in a data- and technology-driven world, there are benefits to the changes our technology-driven culture brings that our profession can build on. Because of the vast amounts of data at their fingertips, our children have learned skepticism at much younger ages.

I grew up with bound encyclopedias that were available in libraries and revered as holders of truthful knowledge about any topic between their covers. I had no skepticism about the accuracy of that information. In contrast, my children have grown up with a variety of websites they can search and phones or virtual assistants they can ask for answers. I've had to teach them at a young age to check references and verify that the information shown came from a reputable source. Skepticism is learned as a life skill, and this skill set provides a foundation to build the professional skepticism needed in the auditing profession.

It has never been more critical to invest in people and develop the processes to drive audit quality through appropriate skeptical behavior. Professional skepticism is foundational to the audit profession and a significant part of what makes us relevant to protect financial statement users and the capital markets today and into the future.


Sara Lord is a partner and the national director of audit services for RSM US LLP.
