By Chris Trytten
Data breaches have plagued companies for more than a decade and show no signs of abating; in fact, they increased 29 percent in 2015 over the prior year. The number of incidents exposing one million or more records increased 40 percent, and three mega events occurred in the third quarter of last year, each exposing more than 10 million records.
Alarmingly, the costs associated with data breaches are climbing fast. In finance and banking, the average cost per stolen record is $215 (2015 Ponemon report) and customers churn at a rate of 3.1 percent after a data breach. Research shows that more than three quarters of breaches are due to compromised or stolen credentials.
In short, we continue to be breached, we are painfully aware of the costs and we know the central cause of data breaches. So why hasn’t this problem been solved?
It turns out that solving this problem is not trivial.
To understand why the data breach problem seems so intractable, let’s look at the current security context that allows breaches to persist.
The New Adversary
Our understanding of the nature of our adversary needs to be revised significantly upward. It is no longer a teenager in his mom’s basement at 2 a.m. trying to impress his friends by splashing humorous animated gifs across corporate monitors. The new cyber criminal is a professional IT veteran who knows how we operate, knows our weaknesses and patiently architects and launches sophisticated attacks that leave us aghast.
Moreover, our new adversary isn’t in it for the glory but for the cold, hard cash — and lots of it. The sheer number and scope of data breaches underscore the reality that much of the corporate data we store and protect has a high market value and can be monetized online, quickly and anonymously.
These are the means and motives that guarantee continued data breaches, unless we get smarter, fast.
The New Environment
Many banking and financial institutions still rely on an aging perimeter security model to safeguard digital assets. In this model, a security perimeter, replete with firewalls, intrusion detection systems and anti-virus software, draws a neat demarcation between a trusted internal zone and an untrusted external zone. The working assumptions are that 1) all actors and assets inside the perimeter are safe and pose no threat to corporate security, while those outside are unsafe and potential threats; and 2) the perimeter effectively prevents hostile outside parties from penetrating it and attacking corporate IT assets.
Given the continued onslaught of data breaches that handily bypass our perimeter security, it’s a safe bet that these assumptions are no longer true.
Disruptors Change Everything
The key trends largely responsible for the dissolution of the perimeter are the advent of cloud computing, the explosion of mobile devices and the greatly expanded access to IT resources by non-employee actors, such as vendors, partners, service providers and even customers. Together, these developments allow unfettered access to applications, many of which exist outside the traditional security perimeter, using uncontrolled mobile platforms by just about anyone, anytime, anywhere. What could go wrong?
The Perimeter Isn’t Dead, It’s Just Not Enough
All the while this new and nebulous corporate computing model is upsetting the security apple cart, IT still needs to care for traditional computing platforms, applications and data that sit quietly behind the security perimeter but are in even more need of an updated security model. Remember that a large number of breaches over the past five years have targeted core IT systems, such as Active Directory (AD) servers, as a central component of the attack. These systems were safely ensconced inside the security perimeter. Despite this, cyber crooks were able to penetrate the network and steal or create admin credentials for systems such as AD servers.
The reality is that being “inside the perimeter” or “outside the perimeter” is no longer inclusive enough to be useful to security practitioners. “Identities” can no longer be defined based on a location in the topology. Authenticating and granting access to IT assets needs to be based on an expanded definition of “identity” and become the central point of control, the “new perimeter” for the distributed IT environment.
What Is an Identity?
An “identity” is a set of attributes that defines a person in relationship to his or her organization, such as job role, department, hire date and compensation. In addition, an identity has a defined set of entitlements, such as access rights to corporate systems, applications and data. Historically, the identity and its entitlements have been mediated by credentials used to authenticate an identity assertion, that is, to prove that a person is who they say they are. This is the heart of the challenge facing IT.
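The idea above can be sketched in code. This is a minimal, hypothetical illustration (the class and field names are assumptions, not any vendor's schema): an identity is a bundle of attributes plus a set of entitlements, and access decisions consult the entitlements rather than network location.

```python
from dataclasses import dataclass, field

@dataclass
class Identity:
    """Hypothetical sketch of an identity: attributes plus entitlements."""
    user_id: str
    job_role: str
    department: str
    entitlements: set = field(default_factory=set)  # access rights to systems/apps/data

    def is_entitled(self, resource: str) -> bool:
        # The access decision is based on the identity's entitlements,
        # not on whether the request comes from "inside the perimeter."
        return resource in self.entitlements

alice = Identity("alice", "teller", "retail-banking",
                 entitlements={"core-banking-app", "crm"})
print(alice.is_entitled("core-banking-app"))      # True
print(alice.is_entitled("wire-transfer-admin"))   # False
```

In practice these attributes and entitlements live in a directory or identity store, and credentials prove that the person presenting the identity assertion actually owns it.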
We have depended and continue to depend on passwords as the primary authentication credential; however, passwords are being stolen and used to attack corporate IT assets with increasing frequency and ease. To combat the wholesale theft of passwords, we have tried to counter their security shortcomings by imposing draconian password policies on users, with little to show for our efforts. People write down anything they can’t remember. If they don’t write it down, they forget it and call for a password reset. Further, even strong passwords can be stolen through a myriad of attacks ranging from social engineering to malware to simple shoulder surfing.
Multifactor Will Save Us All
Exhausted by our inability to tame passwords, we have moved on to multifactor authentication, using something you know, something you have and something you are in various combinations, according to an assessment of risk. It seems like such a good idea, but its implementation is fraught with obstacles.
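The combination logic can be made concrete with a small sketch. This is an illustration only (the factor names and mapping are assumptions): a request counts as multifactor when the presented evidence spans at least two of the three categories — know, have, are.

```python
# Illustrative mapping of authenticators to factor categories (assumed names).
FACTOR_CATEGORIES = {
    "password": "know",       # something you know
    "otp_token": "have",      # something you have
    "fingerprint": "are",     # something you are
}

def is_multifactor(presented_factors):
    """True if the presented factors span at least two distinct categories."""
    categories = {FACTOR_CATEGORIES[f] for f in presented_factors}
    return len(categories) >= 2

print(is_multifactor(["password", "otp_token"]))  # True  (know + have)
print(is_multifactor(["password"]))               # False (one category only)
```

The point of requiring distinct categories is that stealing a password does not also hand the attacker the token or the fingerprint.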
Multifactor authentication solutions have been costly to provision, deploy and administer. They also have not always provided full coverage, securing all systems, data and applications. Leaving just one system behind because of integration barriers leaves a door wide open to attack. Cyber criminals are opportunistic, targeting the weakest link.
And, as we are painfully aware, cyber criminals don’t sit still. The threats aligned against IT are constantly shifting and changing. Any authentication solution needs to be easily extensible to incorporate new security threats as they arise.
Finally, it is no longer acceptable to trade ease of use for security. Unfortunately, that trade-off has defined most multifactor implementations. We need new authentication security models that fit the complex IT environment we live in and that do not sacrifice security for ease of use, or vice versa.
The Rise of the Smartphone
Even as the deluge of mobile devices has turned IT security on its head, the mobile device also holds great promise as a security platform. To be a viable security platform, a device has to be ubiquitous and not an “extra-thing-I-have-to-haul-around.”
A few statistics will make the case for the ubiquity of smartphones:
- The global installed base of smartphones was projected to reach 3 billion in 2016 (Strategy Analytics).
- In 2016, 350 million employees worldwide will use smartphones; 200 million will bring their own (Forrester).
- Worldwide, 85 percent of handset subscribers are expected to use smartphones by 2020 (Forrester).
People are now more concerned about leaving the house with their smartphone than with their car keys, making the smartphone an essential device rather than an optional one that is easily forgotten or lost.
In addition to their wide adoption, smartphones also have attributes that make them valuable as a security platform. These include sensors, network interfaces and biometrics. Smartphones provide a dizzying array of authentication factors, most of them capable of operating completely transparently, satisfying a critical requirement for both security and ease of use.
In spite of the highly distributed, variable IT environment, users do show patterns of behavior that can be captured and used to assess risk during authentication transactions. Factors to be interrogated can include devices commonly used for access, where and at what time they normally access resources and recent history.
With this knowledge, IT can implement dynamic, risk-based policies providing variable levels of identity assurance. In other words, context allows risk assessment and the resultant risk factor determines the level of authentication. Risk-based access and authentication policies make it possible to deliver increased security without negatively impacting the end-user experience.
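The risk-based flow described above can be sketched as follows. Every signal, weight and threshold here is illustrative (a real policy engine would use richer telemetry and tuned scores): the context of a request is compared against the user's observed patterns, the resulting risk score selects the required authentication strength, and low-risk requests stay frictionless.

```python
def risk_score(request, profile):
    """Score a request against the user's observed behavior (illustrative weights)."""
    score = 0
    if request["device_id"] not in profile["known_devices"]:
        score += 2  # unfamiliar device
    if request["location"] != profile["usual_location"]:
        score += 2  # unusual location
    low, high = profile["usual_hours"]
    if not (low <= request["hour"] <= high):
        score += 1  # outside normal access hours
    return score

def required_authentication(score):
    """Map the risk score to a level of identity assurance (assumed thresholds)."""
    if score == 0:
        return "password"             # low risk: single factor, no added friction
    if score <= 2:
        return "password+otp"         # medium risk: step up to a second factor
    return "password+otp+biometric"   # high risk: strongest available step-up

profile = {"known_devices": {"laptop-1"}, "usual_location": "US",
           "usual_hours": (8, 18)}
request = {"device_id": "phone-9", "location": "US", "hour": 14}
print(required_authentication(risk_score(request, profile)))  # password+otp
```

A request from a known device, usual location and normal hours scores zero and sails through on a single factor; only anomalous context triggers stronger authentication, which is how risk-based policy raises security without degrading the everyday user experience.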
Barriers to Strong Authentication
Many authentication technologies coming to market could potentially slow the data breach juggernaut. Every piece of good news comes with a cautionary note, however. To be effective, security solutions must be easy to deploy, administer and use.
Unfortunately, many identity and authentication solutions are expensive, complex and require long deployment times, often involving application code changes. Furthermore, many authentication solutions only provide partial coverage, offering a limited set of authenticators and leaving many systems and applications unprotected. More than ever, the market needs solutions that provide full coverage and are easy to deploy and use.
Chris Trytten is the market solutions manager at Crossmatch (www.crossmatch.com). In his current position, he is using his experience serving the financial and retail markets by guiding the product and market teams to address the security needs of these industries. Trytten is the author of multiple security white papers and articles.