Yes, the title reads correctly. And if you are a parent, there is a high likelihood you will relate to the story I am about to tell.
Imagine my surprise when I logged in to my online banking account and discovered a $395 pending charge from the Apple App Store. I thought for sure it had to be fraud. It couldn't have been my son again; I had already been through this once, when he made unauthorized in-app purchases because I hadn't set up my phone to require a password before approving any transactions. I learned my lesson and set up the appropriate controls so that my son could not buy anything without my permission. Well, the little smarty pants did it again, buying $395 worth of gems in Hay Day only days later. When I asked him how he managed it with a password on the phone, he said, “I watched you type it in and remembered it.”
This happened a few years ago, well before Face ID and other biometrics became widely available on mobile phones. While mobile device security has been transformed by built-in biometric features such as facial recognition, those features are still limited in the world of mobile applications (and nearly unusable in the online channel). The use of physical biometrics to authenticate identities and transactions is on the rise in some cases; the use of facial recognition to confirm in-app purchases, for example, will prevent future swindles by my son. In the broader world of banking and payments, however, there have been doubts about whether physical biometrics can provide the necessary levels of trust where money is involved, and its use is still limited mostly to mobile devices. In addition, there have been ethical concerns about racial bias and high false-positive rates in some physical biometrics solutions.
Behavioral biometrics, on the other hand, continues to change the world of fraud prevention across a wide range of use cases and digital channels for financial institutions and payment providers, without the limitations of physical biometrics. Analyzing user behavior yields a wealth of insight that helps establish trust in the authenticity of an identity or a transaction. It also preserves the user's privacy by looking at how personal data is entered rather than what is entered.
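To make the "how, not what" idea concrete, here is a minimal Python sketch of the kind of data such an approach might retain. The field name and function are hypothetical, not BioCatch's implementation; the point is simply that only keystroke timings are kept, while the typed characters are thrown away.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TypingSample:
    field_name: str            # which form field was typed into, never its value
    inter_key_ms: List[float]  # milliseconds between consecutive key presses

def extract_timing_features(field_name, key_events):
    """key_events: list of (key, timestamp_ms) tuples captured on the client.
    Only the gaps between presses are kept; the keys themselves are discarded."""
    gaps = [t2 - t1 for (_, t1), (_, t2) in zip(key_events, key_events[1:])]
    return TypingSample(field_name=field_name, inter_key_ms=gaps)

sample = extract_timing_features(
    "annual_income", [("5", 0.0), ("4", 140.0), ("0", 310.0), ("0", 430.0)]
)
print(sample)  # TypingSample(field_name='annual_income', inter_key_ms=[140.0, 170.0, 120.0])
```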
Behavioral biometrics also looks beyond the specific behavioral attributes of a single user, comparing them against a population of user profiles to determine whether the behavior matches known genuine or fraudulent patterns. Data science, combined with machine learning models, calls out behavioral anomalies that are not associated with the majority of genuine users in the larger population. This ongoing analysis is what helps organizations keep up with the rapid evolution of cybercrime attacks.
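As an illustration only (not BioCatch's actual model), the sketch below trains an off-the-shelf anomaly detector on a synthetic population of genuine sessions and then scores a new session against it; the features and numbers are assumptions chosen to show the pattern.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Synthetic stand-in for a population of genuine sessions; columns are
# hypothetical features: mean inter-key gap (ms), swipe speed (px/s),
# device-orientation changes per minute.
genuine_sessions = np.column_stack([
    rng.normal(120, 15, 5000),
    rng.normal(800, 90, 5000),
    rng.normal(1.2, 0.4, 5000),
])

model = IsolationForest(contamination=0.01, random_state=0).fit(genuine_sessions)

# A new session far from the genuine population: very slow typing, little swiping.
new_session = np.array([[480.0, 120.0, 0.0]])
if model.predict(new_session)[0] == -1:  # -1 marks an outlier
    print("Behavioral anomaly: escalate for manual review")
```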
A recent innovation in behavioral biometrics is age analysis, which looks at digital behavior patterns to predict the relative age of a user. It was developed as a way to enhance fraud detection in the account opening process. In creating the solution, we set out to answer a simple question: Can keystroke dynamics, swipe patterns, and other behavioral activities predict a user's age group with statistically significant accuracy?
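One way to picture that question is as a supervised-learning problem. The sketch below uses entirely synthetic data with an age effect deliberately built in, so it only demonstrates the framing: testing whether behavioral features predict an age band better than chance. It is not the production pipeline or the real finding.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 4000
# Hypothetical age bands: 0 = under 30, 1 = 30-55, 2 = over 55.
age_band = rng.integers(0, 3, n)
# Synthetic behavioral features with an assumed age effect baked in.
shift_to_letter_ms = 180 + 25 * age_band + rng.normal(0, 20, n)
swipe_length_px    = 300 - 40 * age_band + rng.normal(0, 50, n)
typing_speed_kps   = 5.5 - 0.8 * age_band + rng.normal(0, 1.0, n)
X = np.column_stack([shift_to_letter_ms, swipe_length_px, typing_speed_kps])

scores = cross_val_score(RandomForestClassifier(random_state=0), X, age_band, cv=5)
print(f"Mean accuracy {scores.mean():.2f} vs. ~0.33 chance for three classes")
```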
The first thing that came to mind when I heard about age analysis was my opening story, and it got me wondering what types of digital behaviors could differentiate me from my son. There are, in fact, behavioral patterns that shift with age, such as the time it takes to move from the Shift key to a letter key during data input, mobile device orientation, and swiping patterns. For example, once you hit 40, your shift-to-letter key press time slows by roughly 2-3 milliseconds per year. One confirmed fraud case illustrates these anomalies: a user was applying for a new credit card with a declared age of 72, but their behavior, specifically the time it took to input their annual income, suggested they were much younger than the rest of that age group. BioCatch was the only solution in the credit issuer's fraud technology stack to declare the application high-risk.
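A rough sketch of the kind of check involved, with invented baseline numbers: compare the applicant's observed shift-to-letter press time against a baseline for the declared age band, and flag a large gap.

```python
# Baselines are invented for illustration: (mean_ms, std_ms) of
# shift-to-letter press time for each declared age band.
AGE_BAND_BASELINES = {
    "18-39": (160.0, 25.0),
    "40-59": (200.0, 30.0),
    "60+":   (250.0, 35.0),
}

def age_mismatch_score(declared_band: str, observed_latency_ms: float) -> float:
    """Standard-score distance of the observed latency from the declared band's
    baseline; large negative values mean much faster than expected for that age."""
    mean, std = AGE_BAND_BASELINES[declared_band]
    return (observed_latency_ms - mean) / std

# An applicant declares age 72 but types like a much younger user.
score = age_mismatch_score("60+", 155.0)
if score < -2.0:
    print(f"High risk: behavior is {abs(score):.1f} standard deviations younger than declared")
```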
Behavioral biometrics has also become a technological trailblazer in solving many other complex fraud use cases, including social engineering scams, automated attacks, and mule account detection.
Learn more about the BioCatch age analysis feature and how one card issuer is using it to protect vulnerable users and uncover fraud risk in digital account opening.