In our latest Weekend Patch and Vulnerability Report, we warned readers that significant vulnerabilities had been discovered in mobile banking applications from USAA, Chase, Wells Fargo, Bank of America, and TD Ameritrade. According to The Wall Street Journal and Yahoo News, the vulnerabilities discovered by viaForensics could potentially allow a hacker to learn your username, password, and financial information. A user's information could be stolen simply by visiting a malicious website.
The report that critical vulnerabilities had been found in mobile banking applications brought to mind an interview from last September, when I discussed the wisdom of mobile online banking with my friend, Biz Coach Terry Corbell, on his blog. In that interview I said, “I recommend that consumers ignore any and all attempts to induce them to use their phones for online banking.”
Needless to say, Terry received a scathing comment on that blog post from a marketing representative in the mobile banking industry. The commenter was absolutely, positively certain that mobile banking was secure, that the software had been thoroughly tested and vetted, and that I didn't know what I was talking about.
With this week's story, it turns out that I was the one who knew what he was talking about, not the mobile banking guy. But this blog isn't about who's right and who's wrong; it's about learning from experience, particularly that, when it comes to cyber security, we all need to be a lot more intellectually humble when we talk about how secure something is.
Right now, the cyber criminals are winning. They are winning in part because too many people have a false sense of their own security. They have this false sense of security because they haven't "been there, done that." I have.
For me it was a no-brainer that significant security vulnerabilities were going to be found in mobile banking applications. I had worked for several years in the aerospace industry securing critical national security software. Before that I had been a research mathematician studying the logic of computer programs. And, as Yogi Berra said, "You can observe a lot just by watching."
I can remember the day we found a critical vulnerability in cruise missile software that might have kept us from successfully responding to a nuclear attack. I know the managerial, political, and especially intellectual challenges we went through to be in a position to catch that mistake. And that's just one example of how experience has taught me that writing high quality software is incredibly challenging (and expensive).
We're taught that pride goeth before a fall. That is certainly true in the battle against cyber crime. That's why perhaps the most important thing I learned in trying to prevent, find and fix critical logic errors in complex software is intellectual humility.
Intellectual humility is the ability to suspend belief in something we would otherwise take for granted, like the attorney who hires another attorney to find weaknesses in his argument, or the doctor who seeks a second opinion to look for holes in his diagnosis.
Most of us develop a normal amount of intellectual humility in our areas of greatest expertise. We understand and appreciate just how hard it is to do the things we are accustomed to doing, and we learn through experience to pay careful attention to the details our work demands.
The challenge is that, human nature being what it seems to be, our intellectual humility doesn't easily carry over to domains where we lack firsthand knowledge and experience. We tend to oversimplify in areas we know little about. This isn't usually a problem: any intellectual humility I might lack about how dangerous lions are is mitigated by the fact that I am under no threat from a lion. Unfortunately, when it comes to cyber security, because we're all on the Internet, it's as if the lion is right next door. And he's hungry.
We can't expect a marketing representative in the mobile banking industry to have tested communications software controlling our nuclear missiles any more than we can expect the CEO of a bank to have written cyber security software requirements for an advanced military intelligence system. Nor can we expect the people who run our business IT networks to have the same sensitivity to security that we had 25 years ago when we designed a secure network for the Strategic Air Command.
You can see the danger here: these are the same people who influence (and often make) buying decisions about the software we use to manage money and sensitive information, software that has to be adequately secure to protect the money and information it touches. And, lacking the experience, these otherwise well-meaning men and women don't understand the necessity of being intellectually humble in the presence of complex software.
That's why people who have to make decisions about cyber security management must maintain their own healthy skepticism, resisting any temptation to believe cyber security claims, whether from marketing people, their banks, or their own internal IT staff. Ronald Reagan is famous for saying, "Trust, but verify." Do him one better: drop the trust.
© Copyright 2010. Citadel Information Group. All Rights Reserved.