Multifactor Authentication

Site: Saylor Academy
Course: CS406: Information Security
Book: Multifactor Authentication

Description

Authentication can be accomplished with one factor, two factors, or multiple factors. Which one is the weakest level of authentication and which is the most secure and why? When would a more secure system be required? Be able to explain these multifactor authentication methods: password protection, token presence, voice biometrics, facial recognition, ocular-based methodology, hand geometry, vein recognition, fingerprint scanner, thermal image recognition, and geographical location. What are some challenges of multiple factor authentication when using biometrics? There is a lot of interesting information covered in this article that you do not need to memorize, but that you should be aware of.

Abstract

Today, digitalization decisively penetrates all the sides of the modern society. One of the key enablers to maintain this process secure is authentication. It covers many different areas of a hyper-connected world, including online payments, communications, access right management, etc. This work sheds light on the evolution of authentication systems towards Multi-Factor Authentication (MFA) starting from Single-Factor Authentication (SFA) and through Two-Factor Authentication (2FA). Particularly, MFA is expected to be utilized for human-to-everything interactions by enabling fast, user-friendly, and reliable authentication when accessing a service. This paper surveys the already available and emerging sensors (factor providers) that allow for authenticating a user with the system directly or by involving the cloud. The corresponding challenges from the user as well as the service provider perspective are also reviewed. The MFA system based on reversed Lagrange polynomial within Shamir's Secret Sharing (SSS) scheme is further proposed to enable more flexible authentication. This solution covers the cases of authenticating the user even if some of the factors are mismatched or absent. Our framework allows for qualifying the missing factors by authenticating the user without disclosing sensitive biometric data to the verification entity. Finally, a vision of the future trends in MFA is discussed.


Source: Aleksandr Ometov, Niko Mäkitalo, Sergey Andreev, Tommi Mikkonen, and Yevgeni Koucheryavy, https://www.mdpi.com/2410-387X/2/1/1/htm
Creative Commons License This work is licensed under a Creative Commons Attribution 4.0 License.

1. Introduction

The continuous growth in the number of smart devices and the related connectivity loads has impacted mobile services, which are now seamlessly offered anywhere around the globe. In such a connected world, the enabler keeping the transmitted data secure is, in the first place, authentication.
According to early fundamental work, authentication is a process where a "user identifies himself by sending x to the system; the system authenticates his identity by computing F(x) and checking that it equals the stored value y". This definition has not changed significantly over time, despite the fact that a simple password is no longer the only factor for validating the user from the information technology perspective.

Authentication remains a fundamental safeguard against illegitimate access to a device or any other sensitive application, whether offline or online (see Figure 1). Historically, transactions were authenticated primarily by physical presence, for example, by applying a wax seal. Closer to the present day, and with the advancement of our civilization, it was realized that validation based on sender identification alone is not always adequate on a global scale.

Figure 1. Conceptual authentication examples.


Initially, only one factor was utilized to authenticate the subject. At that time, Single-Factor Authentication (SFA) was mostly adopted by the community due to its simplicity and user friendliness. As an example, consider the use of a password (or a PIN) to confirm the ownership of the user ID. This is clearly the weakest level of authentication: by sharing the password, one can compromise the account immediately, and an unauthorized user can also attempt to gain access by utilizing dictionary attacks, rainbow tables, or social engineering techniques. Commonly, a minimum password complexity requirement is imposed when utilizing this type of authentication.

Further, it was realized that authentication with just a single factor is not reliable enough to provide adequate protection due to a number of security threats. As an intuitive step forward, Two-Factor Authentication (2FA) was proposed, which couples the representative data (a username/password combination) with a factor of personal ownership, such as a smartcard or a phone.
Today, three types of factor groups are available to connect an individual with the established credentials:

  1. Knowledge factor – something the user knows, such as a password or, simply, a "secret";
  2. Ownership factor – something the user has, such as cards, smartphones, or other tokens;
  3. Biometric factor – something the user is, i.e., biometric data or behavior pattern.
Subsequently, Multi-Factor Authentication (MFA) was proposed to provide a higher level of safety and facilitate continuous protection of computing devices as well as other critical services from unauthorized access by using more than two categories of credentials. For the most part, MFA is based on biometrics, which is the automated recognition of individuals based on their behavioral and biological characteristics. This step offered an improved level of security, as users were required to present evidence of their identity relying on two or more different factors. The discussed evolution of authentication methods is shown in Figure 2.

Figure 2. Evolution of authentication methods from SFA to MFA.


Today, MFA is expected to be utilized in scenarios where safety requirements are higher than usual. According to SC Media UK, 68 percent of Europeans are willing to use biometric authentication for payments. Consider the daily routine of ATM cash withdrawal: the user has to provide a physical token (a card), representing the ownership factor, and support it with a PIN code, representing the knowledge factor, in order to access a personal account and withdraw money.
This system could easily be made more complex by adding a second channel, for example, a one-time password to be entered after both the card and the user password have been presented. In a more interesting scenario, this could be done with facial recognition methods. Moreover, a recent survey discovered that 30 percent of enterprises planned to implement an MFA solution in 2017, with 51 percent claiming that they already utilize MFA and 38 percent saying that they utilize it in "some areas" of operation. This evidence supports MFA as an extremely promising direction of the authentication evolution.
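As an illustration of such a one-time-password channel, the sketch below derives a time-based code the way the standard TOTP construction (RFC 6238) does, on both the server and the user's device; the base32 secret, the 30-second window, and the 6-digit length are illustrative defaults rather than values taken from this article.

```python
# Minimal TOTP sketch (RFC 6238): both sides share a secret and derive a
# short-lived numeric code from the current 30-second time window.
import base64
import hashlib
import hmac
import struct
import time

def totp(shared_secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(shared_secret_b32, casefold=True)
    counter = int(time.time()) // interval           # current time step
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Example with a placeholder secret; a real deployment provisions this per user.
print(totp("JBSWY3DPEHPK3PXP"))
```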

As one of the interesting future trends, authentication between a vehicle and its owner or a temporary user may be considered. Based on the statistics, a vehicle is stolen every 45 seconds in the U.S. The current authentication method that allows for starting and using the vehicle is still an immobilizer key. MFA may significantly improve access to most electronic devices from both the security and the user experience perspectives.

Generally, MFA applications can be divided into three market-related groups: commercial applications, i.e., account login, e-commerce, ATM, physical access control, etc.; governmental applications, i.e., identity documents, government ID, passport, driver's license, social security, border control, etc.; and forensic applications, i.e., criminal investigation, missing children, corpse identification, etc. Overall, the number of scenarios related to authentication is indeed large. Today, MFA is becoming an extremely critical factor for:

  • Validating the identity of the user and the electronic device (or its system);
  • Validating the infrastructure connection;
  • Validating the interconnected IoT devices, such as a smartphone, tablet, wearable device, or any other digital token (key dongle).
Presently, one of the main MFA challenges is the absence of correlation between the user identity and the identities of smart sensors within the electronic device/system. Regarding security, this relationship must be established so that only the legitimate operator, e.g., the one whose identity is authenticated in advance, can gain the access rights. At the same time, the MFA process should be as user-friendly as possible, for example:

  1. Customers first register and authenticate with the service provider to activate and manage services they are willing to access;
  2. Once accessing the service, the user is required to pass a simple SFA with the fingerprint/token signed in advance by the service provider;
  3. Once initially accepted by the system, the customer authenticates by logging in with the same username and password as set up previously in the customer portal (or via social login). For additional security, the managing platform can enable secondary authentication factors. Once the user has successfully passed all the tests, the framework automatically authenticates to the service platform;
  4. The secondary authentication occurs automatically based on the biometric MFA, so the user would be requested to enter an additional code or provide a token password only in case the MFA fails.
Biometrics indeed significantly contribute to the MFA scheme and can dramatically improve identity proofing by pairing the knowledge factor with the multimodal biometric factors, thus making it much more difficult for a criminal to eavesdrop on a system while pretending to be another person. However, the utilization of biological factors has its challenges mainly related to the ease of use, which largely impacts the MFA system usability.

From the user experience perspective, the fingerprint scanner already provides the most widely integrated biometric interface, mainly due to its adoption by smartphone vendors on the market. On the other hand, it is not recommended to be utilized as a standalone authentication method. Moreover, the use of any biometrics often requires a set of separate sensing devices; utilizing already integrated ones reduces the authentication system costs and facilitates adoption by end users. A fundamental trade-off between usability and security is one of the critical drivers when considering the authentication systems of today.

Another challenge is that the use of biometrics relies on a binary decision mechanism, which has been well studied over the past decades in classical statistical decision theory from the authentication perspective. There are various possible solutions to control a slight mismatch between the actual "measured" biometrics and the previously captured samples stored in the system. The two widely utilized metrics are the false acceptance rate (FAR) and the false rejection rate (FRR). Manipulating the decision criteria allows adjusting the authentication framework based on the predefined costs, risks, and benefits. The MFA operation is highly dependent on FAR and FRR, since obtaining zero values for both metrics is almost infeasible. The evaluation of more than one biometric feature to establish the identity of an individual can improve the operation of the MFA system dramatically.
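To make the two metrics concrete, the toy sketch below estimates FAR and FRR at a single decision threshold from labeled match scores; the score lists and the 0.5 threshold are synthetic values used purely for illustration.

```python
# Toy FAR/FRR estimate at one threshold; the scores are synthetic placeholders.
def far_frr(impostor_scores, genuine_scores, threshold):
    false_accepts = sum(s >= threshold for s in impostor_scores)
    false_rejects = sum(s < threshold for s in genuine_scores)
    far = false_accepts / len(impostor_scores)   # impostors wrongly accepted
    frr = false_rejects / len(genuine_scores)    # genuine users wrongly rejected
    return far, frr

impostors = [0.12, 0.35, 0.48, 0.52, 0.20]
genuines = [0.61, 0.74, 0.55, 0.92, 0.40]
print(far_frr(impostors, genuines, threshold=0.5))   # -> (0.2, 0.2)
```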
Since the currently available literature lacks a detailed MFA analysis suitable for non-specialists in the field, the main contributions of this work are as follows:

  1. This work provides a detailed analysis of factors that are presently utilized for MFA with their corresponding operational requirements. Potential sensors to be utilized are surveyed based on the academic and industrial sources (Section 2);
  2. The survey is followed by the challenges related to MFA adoption from both the user experience and the technological perspectives (Section 3);
  3. Further, the framework based on the reversed Lagrange polynomial is proposed to allow for utilizing MFA in cases where some of the factors are missing (Section 4). A discussion on the potential evaluation methodology is also provided;
  4. Finally, the vision of the future of MFA is discussed (Section 5).

2. State-of-the-Art and Potential MFA Sources

Presently, authentication systems already utilize an enormous number of sensors that enable identification of a user. In this section, we elaborate on the MFA-suitable factors, the corresponding market-available sensors, and the related challenges. Furthermore, we provide additional details on the ones that are expected to be deployed in the near future.

2.1. Widely Deployed MFA Sensors/Sources

Today, identification and authentication for accessing sensitive data are among the primary use cases for MFA. We further list the factors already available for MFA utilization without acquiring additional specialized equipment.

2.1.1. Password Protection

The conventional way to authenticate a user is to request a PIN code, password, etc. The secret pass-phrase traditionally represents a knowledge factor. It requires only a simple input device (at least one button) to authenticate the user.

2.1.2. Token Presence

The password could then be supplemented with a physical token, for example, a card, which is recommended as a second factor group, the ownership factor. From the hardware perspective, a user may present a smartcard, phone, wearable device, etc., which are more complicated to delegate. In this case, the system should be equipped with a radio interface allowing for two-way communication with the token. On the other hand, the most widely known software token is the one-time software-generated password. The main drawback of the above is the problem of uncontrollable duplication.

2.1.3. Voice Biometrics

Most contemporary smart electronic devices are equipped with a microphone that allows utilizing voice recognition as a factor for MFA. At the same time, the technology of tomorrow may allow special agencies not only to recognize speakers but also to mimic their voices, including the intonation, timbre, etc., which is a serious drawback of utilizing voice as a primary authentication method.

2.1.4. Facial Recognition

As the next step, facial recognition could be considered. At the beginning of its development, the technology was based on landmark picture analysis, which was relatively simple to defeat by supplying the system with a photo. The next phase enabled three-dimensional face recognition, i.e., asking the user to move their head during the authentication process in a specific manner. Finally, the advancement of this system reached the point of recognizing the actual expressions of the user. To enable facial recognition, the system must be equipped with at least one output device and a camera.

2.1.5. Ocular-Based Methodology

Iris recognition techniques have been on the market for more than 20 years. This approach does not require the user to be close to the capture device while the color pattern of the human eye is analyzed. Retina analysis is another attractive technique: a thin tissue composed of neural cells located in the posterior portion of the eye is captured and analyzed. Because of the complex structure of the capillaries that supply the retina with blood, each person's retina is unique. The most prominent challenges of these methods are the need for a high-quality capture device and a robust mathematical technique to analyze the image.

2.1.6. Hand Geometry

Some systems employ the analysis of the physical shape of the hand to authenticate the user. Initially, pegs were utilized to validate the subject, but the usability of such methods was low. Later, a flatbed scanner was used to obtain the image without the need to fix the user's hand in one specific position. Today, some systems utilize conventional cameras that do not require close contact with the capture surface; this approach is, however, not very robust to the environment. Some vendors apply so-called photoplethysmography (PPG) to determine whether a wearable device (e.g., a smartwatch) is currently on its user's wrist; the process is similar to the one followed when measuring heart rate.

2.1.7. Vein Recognition

Advances in fingerprint scanners offer an opportunity to collect the vein image of the finger as well. More complicated devices utilize palm print recognition to acquire and store the shape/movement of the entire hand. At the current stage of development, vein biometrics are still vulnerable to spoofing attacks.


2.1.8. Fingerprint Scanner

Utilizing the fingerprint scanner as the primary authentication mechanism is currently being pushed by the majority of smartphone and personal computer vendors. This solution is intuitive to use but remains extremely simple to fabricate, mainly because our fingerprints can be obtained from almost anything we touch. The integration potential of this method is indeed high, even though it is also not recommended as a standalone authentication approach. Most smartphone vendors install an additional camera to obtain the fingerprint instead of adopting the safer vein recognition.

2.1.9. Thermal Image Recognition

Similarly to vein recognition, a thermal sensor is utilized to reconstruct the unique thermal image of the blood flow in one's body in proximity. Many challenges with this authentication method may arise due to the user's condition: sickness or emotion may significantly influence the captured readings.

2.1.10. Geographical Location

Utilizing the device's and the user's geographical location to validate whether access to the device/service should be granted is a special case of location-based authentication. Importantly, the GPS signal can be easily jammed or rendered faulty due to propagation properties; thus, it is recommended to utilize at least two location sources, for example, GPS and the wireless network cell ID. A smartphone can be used to support MFA from the location acquisition perspective.
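A minimal sketch of such a two-source cross-check is given below: the location factor is accepted only when the GPS fix and the network-based estimate agree within a tolerance. The haversine distance, the sample coordinates, and the 2 km tolerance are assumptions made for illustration.

```python
# Sketch: accept the location factor only when two independent sources
# (e.g., a GPS fix and a cell-ID estimate) agree within a tolerance.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def location_factor_ok(gps_fix, cell_fix, tolerance_km=2.0):
    return haversine_km(*gps_fix, *cell_fix) <= tolerance_km

print(location_factor_ok((60.4518, 22.2666), (60.4601, 22.2800)))   # True
```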

2.2. Future of MFA Integration

Accelerated adoption across many industries, as well as the increased availability of biometric services in a wide range of readily available consumer products, is pushing the concept of tight MFA integration. Currently, researchers and early technology adopters are attempting to integrate new sensors into MFA systems.

2.2.1. Behavior Detection

Historically, behavior recognition was utilized to analyze a military telegraph operator's typing rhythm in order to track the movement of troops. Today, gestures used for authentication purposes may range from conventional to "hard-to-mimic" ones, since a motor-programmed skill results in the movement being organized before the actual execution.

A modern example of such identification is the process of tapping the smartphone screen. This approach could be easily combined with any text-input authentication methods as a typing pattern is unique for each person. In case the MFA system is specifically developed for predefined gesture analysis, the user is required to replicate a previously learned movement while holding or wearing the sensing device.

A natural authentication step for widely used handheld and wearable devices is the utilization of accelerometer fingerprinting. For instance, each smartphone holder could be verified based on the gait pattern by continuously monitoring the accelerometer data, which is almost impossible for another individual to fake.

For in-vehicle authentication, the integral system is expected to monitor driver-specific features, which could be analyzed from two perspectives: vehicle-specific behavior (steering angle sensor, speed sensor, brake pressure sensor, etc.) and human factors (music played, calls made, presence of people in the car, etc.). Another important blocking factor is an alcohol sensor: the engine start function could be blocked when the level of alcohol in the cabin is above the acceptable legal limit.

2.2.3. Occupant Classification Systems (OCS)

Some vehicular systems already have the OCS solutions integrated in consumer cars. A system of sensors can detect who is currently in the passenger/driver seat by utilizing, for example, weight or posture and automatically adjusting the vehicle to personal needs.

2.2.4. Electrocardiographic (ECG) Recognition

ECG data could be collected from the user's smartwatch or activity tracker and compared with an individually stored pattern. The main benefit of using this factor for authentication is that ECG signals emerge as a potential biometric modality with the advantage of being difficult (or close to impossible) to mimic; the only way is by utilizing an existing personal recording.

2.2.5. Electroencephalographic (EEG) Recognition

This solution is based on brain wave analysis and could be considered from the fundamental philosophical proposition "Cogito ergo sum" by R. Descartes, or "I think, therefore I am". It allows for obtaining a unique sample of the person's brain activity pattern. Formerly, EEG data capture could only be performed in clinical settings by using invasive probes under the skull or wet-gel electrodes arrayed over the scalp. Today, simple EEG collection is possible by utilizing market-available devices the size of a headset.

2.2.6. DNA Recognition

Human cell lines are an essential resource for research, most frequently used in reverse genetic approaches or as in vitro models of human diseases. They are also a source of unique DNA fingerprinting information. Even though the process is time-consuming and expensive, it may potentially be utilized to pre-authorize the user to a highly secure facility along with other factors.
Subsequently, a comparison of the main indicators for the already deployed and emerging factors is given in Table 1, where the factors/sensors are evaluated based on a set of common parameters.

Table 1. Comparison of suitable factors for MFA: H – high; M – medium; L – low; n/a – unavailable.

3. MFA Operation Challenges

The integration of novel solutions has always been a major challenge for both developers and managers. The key challenges are presented in Figure 3. In the first place, user acceptance is a critical aspect for the adoption of strong identity and multi-factor authentication. While adopting and deploying MFA solutions, it is necessary to follow a careful and thorough approach, where most challenges come alongside opportunities and potential benefits.


Figure 3. Main operational challenges of MFA.




3.1. Usability

The main usability challenges emerging in the authentication process could be characterized from three perspectives:

  • Task efficiency – time to register and time to authenticate with the system;
  • Task effectiveness – the number of login attempts needed to authenticate with the system;
  • User preference – whether the user prefers a particular authentication scheme over another.
In addition to the approaches discussed previously, researchers have already started investigating more specific effects in authentication procedures based on a variety of human factors. One study examined how user age affects task efficiency in the cases of PIN and graphical access mechanisms, concluding that the younger generation can spend up to 50 percent less time passing the authentication procedure in both cases. Interestingly, it was also shown that gender, in the same setting, does not affect the results.

Another direction in authentication usability is related to the cognitive properties of the user. One work offered an overview of how to make passwords memorable while keeping them relatively usable and secure at the same time. A paper by Belk et al. delivered research on task completion efficiency and effectiveness for conventional textual passwords versus graphical ones. The results revealed that, for most of the participants, the utilization of graphical passwords requires more time than textual ones. However, cognitive differences between users, i.e., being a Verbal or an Imager, affect task completion significantly: Verbals complete the text-based tasks faster than Imagers and vice versa. A work by Ma et al. studied the impact of disability (Down syndrome) in the same two scenarios; it was once again confirmed that textual passwords perform better compared to graphical ones.

In addition, the properties of the authentication device play a major role in this process. One study investigated the usability of textual passwords on mobile devices and showed that using a smartphone or other keyboardless equipment for creating a password suffers from poor usability as compared to conventional personal computers. Another work confirmed the same finding from a task efficiency perspective.

Today, most online authentication services are knowledge-based, i.e., they depend on a username and password combination. More complex systems require the user to interact with additional tokens (one-time passwords, code generators, phones, etc.). Complementing traditional authentication strategies, MFA is not feasible without biometrics. From this perspective, one work analyzed how gamification and joy can positively impact the adoption of new technology. Gesture-related user experience research showed that security and user experience do not necessarily need to contradict one another; this work also promoted pleasure as the best way towards fast technology adoption. Another reference addressed the usability of an ECG solution for authentication and concluded that the application of ECG is not yet suitable for dynamic real-life scenarios.

Many researchers promoted the utilization of personal handheld devices during the MFA procedure. Michelin et al. proposed using the smartphone's camera for facial and iris recognition while keeping the decision-making in the cloud. Another work on biometric authentication for an Android device demonstrated an increased level of satisfaction related to the higher task efficiency achieved with the MFA solution. A further reference studied the usability and practicality of biometric authentication in the workplace and concluded that the ease of technology utilization and its environmental context play a vital role: the integration and the adoption will always incur additional and unexpected resource costs.

An extremely important problem of MFA usability is rooted in the fact that "not all users can use any given biometric system". People who have lost a limb due to an accident may not be able to authenticate using a fingerprint, and visually impaired people may have difficulties using iris-based authentication techniques.

Biometric authentication requires the integration of new services and devices, which results in the need for additional education during adoption; this becomes more complicated for seniors due to related understandability concerns. One fact is clear: user experience plays a prominent role in successful MFA adoption; some say, "the user comes first". Today, research in usable security for knowledge-based user authentication is in the process of finding a viable compromise between usability and security; many challenges remain to be addressed, and more will arise soon.

3.2. Integration

Even if all the usability challenges are resolved during the development phase, integration brings further problems from both technological and human perspectives.

Most of the consumer MFA solutions remain hardware-based. Generally, "integrating physical and IT security can reap considerable benefits for an organization, including enhanced efficiency and compliance plus improved security". However, convergence is not so simple. Related challenges include bringing the physical and the IT security teams together, combining heterogeneous system components, and upgrading the physical access systems.

While developing an MFA system, biometrics independence should be considered carefully, i.e., the assurance of interoperability criteria should be met. The framework needs to have functionality to handle biometric data from sensors other than the initially deployed ones. The utilization of multi-biometrics, that is, the simultaneous usage of more than one factor, should also be taken into account.

Another major interoperability concern is vendor dependency. Enterprise solutions are commonly developed as stand-alone isolated systems that offer an extremely low level of flexibility. The integration of sensors newly introduced to the market would require complicated and costly updates, which most probably will not be implemented soon.

Further, it should be noted that most of the currently available MFA solutions are not fully or even partially open source. This raises questions about the trustworthiness and reliability of third-party service providers. The level of transparency delivered by both hardware and software vendors should be taken into consideration when selecting the MFA framework in the first place.

3.3. Security and Privacy

Any MFA framework is a digital system composed of critical components, such as sensors, data storage, processing devices, and communication channels. All of these are typically vulnerable to a variety of attacks at entirely different levels, ranging from replay attempts to adversarial attacks. Security is thus a necessary tool to enable and maintain privacy. We therefore begin with the attacks executed on the input device itself; failing to let only the legitimate controller access and process sensitive personal data exposes the community to the main risks related to MFA security, which are listed below.

The first key risk is related to spoofed data being successfully accepted by the MFA system. Notably, since biometrics are used by a variety of MFA frameworks, the attacker has a glaring opportunity to analyze both the technology underlying the sensor and the sensor itself, thereby revealing the most suitable spoofing materials. The main goal of the system and hardware architects is to provide either a secure environment or, where this is not possible, to consider the related spoofing possibilities in advance. The risk of capturing either physical or electronic patterns and reproducing them within the MFA system should be addressed carefully.

Conventionally, the safeguard against electronic replay attacks requires the utilization of a timestamp. Unfortunately, a biometric spoofing attack is fairly simple to execute: even though biometrics can improve the performance of the MFA system, they can also increase the number of vulnerabilities that can be exploited by an intruder. A further risk is sensitive data theft during transmission between the sensor and the processing/storage unit. Such theft may primarily occur due to insecure transmission from the input device through the extraction and matching blocks to the database, each of which offers potential for an attack. The required levels of data safety should be guaranteed to resist this risk type.

Another opportunity to attack the MFA system is by capturing the secret data sample. For knowledge factors, the system would be immediately compromised if zero-knowledge solutions are not utilized. Specific attention is dedicated to capturing a biometric sample, which cannot be updated or changed over time. Hence, the protection of biometric data requires a higher level of security during the capture, transmission, storage, and processing phases.

The following risk is related to theft from the data storage. Conventionally, databases are stored in a centralized manner, which presents a single point of failure. At the same time, some of the remote systems contacting the database are not always legitimately authorized to access the personal data stored there. A high level of isolation is required to protect the data from theft, in addition to utilizing irreversible encryption. A subsequent risk is related to location-based attacks: the GPS signal can be vulnerable to position lock (jamming) or to feeding the receiver with false information so that it computes an erroneous time or location (spoofing). Similar techniques may be applied to cellular- and WLAN-based location services.

Finally, being an information technology system, MFA framework should deliver relatively high levels of "throughput", which reflects the capability of a system to meet the needs of its users in terms of the number of input attempts per time period. Even if the biometrics are considered suitable in every other aspect, but the system can only perform, e.g., one biometrics-based match per hour, whereas it is required to operate at 100 samples per hour, such a solution should not be considered as feasible. The recommendation here is to select appropriate processing hardware for the server/capture side.

The MFA security framework should also support a penetration testing panel to assess its potential weaknesses. Today, developers often conduct external audits to evaluate the risks and act on such evaluations for more careful planning. The MFA system should thus be assessed in order to deliver a more secure environment.

3.4. Robustness to Operating Environment

Even when the security and privacy aspects are fully resolved, biometric systems, mainly fingerprinting, have fallen short of fulfilling the "robustness" requirement since the very beginning of their journey. This was mainly due to the operational trials being conducted in a laboratory environment instead of field tests. One distinct example is voice recognition, which was highly reliable in a silent room but failed to verify the user in urban landscapes.

A similar problem applies to early facial recognition techniques, which failed to operate without adequate lighting, a quality camera, etc. The flip side of the coin was the need for continuous supervision of the examined subject; even today, there are either bits of advice on where to look or place fingers, or visual aid is available during the security check. The lack of experience in machine-to-human interaction is commonly analyzed via Failure to Enroll (FTE) and Failure to Acquire (FTA) rates, both of which depend on the users themselves as well as the additive environmental noise.

Since a significant part of MFA is highly dependent on biometry, it can be classified as inherently probabilistic. The basis of biometric authentication lies in the field of pattern matching, which in turn relies on approximation. Approximate matching is a critical consideration in any MFA system, since the difference between users could be crucial due to a variety of factors and uncertainties. The image captured during a fingerprint scan will be different every time it is observed because of the presentation angle, pressure, dirt, moisture, or variation between sensors, even if taken of the same person.

Two important error rates used to quantify the performance of a biometric authentication system are FAR and FRR. FAR is the percentage of impostors inaccurately accepted as genuine users; it is defined as the ratio of the number of false matches to the total number of impostor match attempts. FRR is the rate at which genuine users are rejected by the system; it is defined as the ratio of the number of false rejections to the total number of genuine match attempts.

The literature further recommends the utilization of the Crossover Error Rate (CER) in addition to the previously discussed metrics. This parameter is defined as the probability of the system being in a state where FAR equals FRR; the lower this value is, the better the system performs. According to the literature, "Higher FAR is preferred in systems where security is not of prime importance, whereas higher FRR is preferred in high-security applications". The point of equality between FAR and FRR is also referred to as the Equal Error Rate (EER). Based on the above, it can once again be concluded that a system utilizing solely biometrics may not be considered a preferred MFA framework.
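The sketch below illustrates how the EER/CER point can be located empirically: the decision threshold is swept over synthetic genuine and impostor match scores until FAR and FRR are as close as possible. All score values are made up for the example.

```python
# Sweep the threshold over synthetic match scores and report the point where
# FAR and FRR are closest, i.e., an empirical EER/CER estimate.
def error_rates(impostor_scores, genuine_scores, threshold):
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr

def empirical_eer(impostor_scores, genuine_scores, steps=1000):
    best = None
    for i in range(steps + 1):
        t = i / steps
        far, frr = error_rates(impostor_scores, genuine_scores, t)
        if best is None or abs(far - frr) < best[0]:
            best = (abs(far - frr), t, far, frr)
    _, threshold, far, frr = best
    return threshold, far, frr

impostors = [0.10, 0.22, 0.35, 0.41, 0.58, 0.63]
genuines = [0.37, 0.52, 0.66, 0.71, 0.83, 0.95]
print(empirical_eer(impostors, genuines))   # threshold where FAR ~= FRR
```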

By analyzing the above listed challenges, it is possible to evaluate and assess the entire MFA system. In what follows, we propose an approach to enable MFA for vehicular integration based on the availability of a large number of sensors in modern vehicles.

4. Enabling Flexible MFA Operation

In this work, we offer a new authentication scheme that focuses on the vehicle-to-everything (V2X) scenarios, since cars of today are already equipped with multiple sensors that could potentially be utilized for MFA. Conventionally, the user has a username/password/PIN/token and will additionally be asked to utilize a biometric factor, such as facial features or fingerprints. The general overview supported by a follow-up discussion is given in Figure 4. If the authentication procedure fails to establish trust by using this combination of factors, then the user will be prompted to authenticate by utilizing another previously registered factor or a set of those. This MFA system may not only verify the accuracy of the user input but also determine how the user interacts with the devices, i.e., analyze the behavior. The more the user interacts with the biometric system, the more accurate its operation becomes.

Figure 4. Current and emerging MFA sensors for vehicles.


Another feature of the discussed scenario is the actual sensor usability in the case of interaction with a car. If a sensor (e.g., a fingerprint reader) is being utilized but that device is not available from where the user is attempting to log in or gain access, the user experience becomes inadequate. Having a dual-purpose device, such as a smartphone or smartwatch (suitable for executing the information security primitives), which the user already has in his or her possession, as an additional MFA factor (not only as a token) makes both the system costs and the usability much more reasonable.

The presence of large amounts of sensor data brings us to the logical next step of its application in MFA. We further envision the potential utilization of the corresponding factors to authenticate the user without implementing a dedicated "verifier" holding the actual biometric data, except for the data collected in real time.

4.1. Conventional Approach

One of the approaches considered within the scope of this work is based on utilizing Lagrange polynomials for secret sharing. The system secret S is usually "split" and distributed among a set of key holders. It can be recovered later on, as described in numerous works, as

\begin{array}{l} f(x)=S+a_{1} x+a_{2} x^{2}+\cdots+a_{l-1} x^{l-1} \\ f(0)=S \end{array}

where a_{i} are the generated polynomial coefficients and x is a unique identification factor F_{i}. In such systems, every key holder with a factor ID obtains its own unique key share S_{ID}=f(ID).

In conventional systems, it is required to collect any l shares \left\{S_{ID_{1}}, S_{ID_{2}}, \ldots, S_{ID_{l}}\right\} of the initial secret to unlock the system, while the curve may offer n>l points, as shown in Figure 5. The basic principle behind this approach is to specify the secret S and use the generated curve based on the random coefficients a_{i} to produce the secret shares S_{i}. This methodology is successfully utilized in many secret sharing systems that employ the Lagrange interpolation formula.


Figure 5. Lagrange secret sharing scheme.
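A minimal sketch of this conventional scheme is shown below: the dealer fixes f(0)=S, generates random coefficients over a prime field, hands out shares (ID, f(ID)), and any l shares recover S by Lagrange interpolation at x = 0. The prime, the secret, and the (l, n) = (3, 5) parameters are toy choices for illustration only.

```python
# Toy Shamir's Secret Sharing over a prime field: split a secret into n shares
# so that any l of them recover it via Lagrange interpolation at x = 0.
import random

P = 2 ** 127 - 1   # toy prime field

def split_secret(secret, l, n):
    coeffs = [secret] + [random.randrange(P) for _ in range(l - 1)]
    def f(x):
        return sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def recover_secret(shares):
    secret = 0
    for i, (x_i, y_i) in enumerate(shares):
        num, den = 1, 1
        for j, (x_j, _) in enumerate(shares):
            if i != j:
                num = num * (-x_j) % P
                den = den * (x_i - x_j) % P
        secret = (secret + y_i * num * pow(den, -1, P)) % P   # Lagrange basis at x = 0
    return secret

shares = split_secret(secret=123456789, l=3, n=5)
print(recover_secret(shares[:3]))   # -> 123456789, from any 3 of the 5 shares
```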

Unfortunately, this approach cannot be applied to the MFA scenario directly, since the biometric parameters are already in place, i.e., we can neither assign a new S_{i} to a user nor modify them. On the one hand, the user may set some of the personal factors, such as a password, PIN code, etc. On the other hand, some of them are unchangeable (biometric parameters and behavior attributes). In this case, an inverse task has to be solved, where the shares of the secret S_{ID_{i}} are known as the factor values S_{i}. Basically, the S_{i} are fixed and become the unique set \left\{S_{1}, S_{2}, \ldots, S_{l}\right\} once set for a user. In this case, S is the secret for accessing the system and should be recovered from the user factor values. A possible solution based on the reversed Lagrange interpolation formula is proposed in the following subsection.

4.2. Proposed Reversed Methodology

In this work, we consider the MFA system with l explicit factors F. Each factor F_{i} has a unique secret S_{i} obtained with the corresponding procedure (PIN, fingerprint, etc.) from the user. In the worst case, it is related to the biometric data, for which the probability of change over time is low. The corresponding factors and secrets can then be represented as

\begin{array}{l} F_{1}: S_{1} \\ F_{2}: S_{2} \\ \cdots \\ F_{l}: S_{l} \\ F_{l+1}: T \end{array}

where S_{i} is the secret value obtained from the sensor (factor), l is the number of factors required to reconstruct the secret, and F_{l+1} is a timestamp collected at time instant T. It is important to note that providing the actual secrets to the verifier is not an option, especially in the case of sensitive biometric data, because a fingerprint is typically an unchangeable factor; hence, letting even a trusted instance obtain the corresponding data is a questionable step to make. Conversely, compared to the method considered in Section 4.1, the modified algorithm implies that the S_{i} are obtained from the factors (only one polynomial describes the corresponding curve), as shown in Figure 5. In other words, the proposed methodology produces the system secret \bar{S} based on the collected factor values S_{i} instead of assigning them in the first place. The system of equations connecting the Lagrange interpolation formula with the factors, their values, and the secret for the system access is

\left\{\begin{array}{l} S_{1}=\bar{S}+a_{1} F_{1}+a_{2} F_{1}^{2}+\cdots+a_{l-1} F_{1}^{l-1}+a_{l} F_{1}^{l} \\ S_{2}=\bar{S}+a_{1} F_{2}+a_{2} F_{2}^{2}+\cdots+a_{l-1} F_{2}^{l-1}+a_{l} F_{2}^{l} \\ \cdots \\ S_{l}=\bar{S}+a_{1} F_{l}+a_{2} F_{l}^{2}+\cdots+a_{l-1} F_{l}^{l-1}+a_{l} F_{l}^{l} \\ T=\bar{S}+a_{1} T+a_{2} T^{2}+\cdots+a_{l-1} T^{l-1}+a_{l} T^{l} \end{array}\right.

where a_{i} are the corresponding generated coefficients of the polynomial f(x)=\bar{S}+a_{1} x+a_{2} x^{2}+\cdots+a_{l} x^{l}, with f(0)=\bar{S}. This system has only one solution for \bar{S}, which is well known from the Lagrange interpolation formula.

Lemma 1. One and only one polynomial curve f(x) of degree l-1 can be described by l points on the plane \left(x_{1}, y_{1}\right),\left(x_{2}, y_{2}\right), \ldots,\left(x_{l}, y_{l}\right):

f(x)=a_{0}+a_{1} x+\cdots+a_{l-1} x^{l-1}, \quad\left\{f\left(x_{i}\right)=y_{i}\right\}_{i=1}^{l}

Hence, the system secret \bar{S} may be recovered from the collected shares by the conventional Lagrange interpolation formula, without the need to transfer the original factor secrets S_{i} to the verifier; the sensitive person-related data is thus kept private, as

\bar{S}=(-1)^{l} \sum_{i=1}^{l+1} S_{i} \prod_{j=1, j \neq i}^{l+1} \frac{F_{j}}{F_{i}-F_{j}}

where F_{l+1}=T. The proposed modifications are required to assure the uniqueness of the acquired data; see Figure 6.

Figure 6. Reversed method based on the Lagrange polynomial.


Due to the properties of the Lagrange formulation, there can only be one curve described by the corresponding polynomial (Lemma 1); therefore, each set of pairs \left\{F_{i}: S_{i}\right\} will produce its own unique \bar{S}. However, if the biometric data collected by the MFA system has not changed over time, the secret will always remain the same, which is an obvious vulnerability of the considered system. On the other hand, a simple addition of the timestamp will always produce a unique curve, as shown in Figure 6 for T, T_{1}, and T_{2}.

The proposed solution provides robustness against the case where all S_{i} remain unchanged over time. This is achieved by adding a unique factor of time T, which introduces the extra point F_{l+1} with the corresponding secret. It is necessary to mention that the considered threshold scheme based on the Lagrange interpolation formula utilizes the Rivest-Shamir-Adleman (RSA) mechanism or the ElGamal encryption/decryption algorithm for authentication during the final step. In this case, it has been proven that we obtain a secure threshold scheme with respect to the secrets S_{i}.
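A sketch of the reversed direction is given below, under the simplifying assumptions that each factor value S_i has already been quantized into a stable field element (this article does not prescribe how) and that the timestamp contributes the extra point (T, T), as in the system of equations above. The system secret is whatever the unique interpolating polynomial through these fixed points evaluates to at x = 0, so the raw S_i never need to be handed to the verifier.

```python
# Reversed use of Lagrange interpolation: the points (F_i, S_i) are fixed by the
# user's factors plus a timestamp point (T, T); the derived system secret is the
# value of the unique interpolating polynomial at x = 0. Field and values are toys.
import time

P = 2 ** 127 - 1   # toy prime field

def interpolate_at_zero(points):
    result = 0
    for i, (x_i, y_i) in enumerate(points):
        num, den = 1, 1
        for j, (x_j, _) in enumerate(points):
            if i != j:
                num = num * (-x_j) % P
                den = den * (x_i - x_j) % P
        result = (result + y_i * num * pow(den, -1, P)) % P
    return result

# Factor IDs F_i paired with quantized factor values S_i (illustrative numbers).
factors = [(1, 884211), (2, 102030), (3, 556677)]    # e.g., PIN, fingerprint, gait
t = int(time.time())
system_secret = interpolate_at_zero(factors + [(t, t)])   # changes with every T
```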

4.3. Proposed MFA Solution for V2X Applications

Indeed, our proposed solution may operate out of the box in the case where all l factors are present. The system may thus provide a possibility to identify and report any outdated factor information, for example, weight fluctuation. Access to a service could still be granted when some of the factors are not present. We further elaborate on this feature in the current subsection.

4.3.1. Factor Mismatch

Assuming that the number of factors in our system is l=4, the system secret S can be represented in a simplified way as a combination of four factors:

S \leftarrow\left[\begin{array}{llll} F_{1} & F_{2} & F_{3} & F_{4} \end{array}\right]

Here, if any of the S_{i} are modified, the secret recovery mechanism will fail. An improvement to this algorithm is delivered by providing separate system secrets \bar{S}_{i} for a lower number of collected factors. Basically, for \bar{l}=3, the number of possible combinations of factors with one missing is equal to four, as follows:

\begin{array}{l} \overline{S_{1}} \leftarrow\left[\begin{array}{lll} F_{1} & F_{2} & F_{3} \end{array}\right] \\ \overline{S_{2}} \leftarrow\left[\begin{array}{lll} F_{1} & F_{3} & F_{4} \end{array}\right] \\ \overline{S_{3}} \leftarrow\left[\begin{array}{lll} F_{1} & F_{2} & F_{4} \end{array}\right] \\ \overline{S_{4}} \leftarrow\left[\begin{array}{lll} F_{2} & F_{3} & F_{4} \end{array}\right] \end{array}

The device may thus grant access based on a predefined risk function policy. As a second benefit, it can inform the user (or the authority) that a particular factor F_{i} has to be updated, based on which combination \bar{S}_{i} failed. Indeed, this modification brings only marginal transmission overheads but, on the other hand, enables higher flexibility in authentication and missing factor validation.
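The bookkeeping behind this idea is sketched below: every (l-1)-subset of factors is tried, and if exactly one subset verifies, the left-out factor is the one that needs updating. The verification callable stands in for the Lagrange/threshold check described above and is a hypothetical placeholder.

```python
# Sketch: try every (l-1)-subset of factors; if exactly one subset verifies,
# the factor left out of it is the stale or missing one.
from itertools import combinations

def identify_stale_factor(factor_ids, verify_subset):
    """`verify_subset` is a caller-supplied check, e.g., the Lagrange-based one."""
    l = len(factor_ids)
    passing = [set(s) for s in combinations(factor_ids, l - 1) if verify_subset(s)]
    if len(passing) == 1:
        (stale,) = set(factor_ids) - passing[0]
        return stale
    return None   # zero or several subsets passed: defer to the risk policy

# Toy usage: pretend factor 3 is the one whose stored value no longer matches.
print(identify_stale_factor((1, 2, 3, 4), lambda s: 3 not in s))   # -> 3
```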

4.3.2. Cloud Assistance

Another important scenario for MFA is the potential assistance of a trusted authority in the case of an F_{i}:S_{i} mismatch or loss. When the user fails to present a sufficient number of factors, the trusted authority can be requested to provide temporary factor keys, as demonstrated in Figure 7.

Figure 7. Trusted authority assistance in authentication when user is missing two factors.

For example, assume that the user forgot or lost two factors F_{2} and F_{3} with the corresponding keys S_{2}=f\left(F_{2}\right) and S_{3}=f\left(F_{3}\right). The trusted authority is willing to assist in authentication: two temporary keys S_{\Phi_{1}}=f\left(\Phi_{1}\right) and S_{\Phi_{2}}=f\left(\Phi_{2}\right) are thus generated and sent to the user via a secure channel. Obtaining these keys and applying the Lagrange interpolation formula with the RSA- or ElGamal-based threshold authentication procedure involves the following factors and keys:

\begin{array}{l} F_{1}: S_{1} \\ F_{2}: S_{2} \\ \cdots \\ F_{l}: S_{l} \\ F_{l+1}: T \\ \Phi_{1}: S_{\Phi_{1}} \\ \Phi_{2}: S_{\Phi_{2}} \end{array}

This allows for gaining access to the device. The proposed solution is designed explicitly to complete the MFA step of the authentication, that is, its usage for SFA and 2FA is not recommended. This is mainly due to the features of the Lagrange interpolation formula. Basically, in the SFA case and without the F_{l+1}: T factor, the equation at hand can simply be represented as S_{1}=S+b_{1} F_{1}, i.e., it will degenerate into 'a point'. Even adding a random timestamp factor will not provide any valuable level of biometric data protection, since an eavesdropper would be able to immediately recover the factor secret.

The above is not suitable for 2FA either, since providing two factors only allows the curve to have linear behavior, i.e., the eavesdropper requires just two attempts to recover the secrets. However, adding a timestamp factor here allows for providing the necessary level of safety with three actual factors, as discussed below.

4.4. Potential Evaluation Techniques

Conventionally, authentication systems utilizing only knowledge or ownership factors operate in a pass/fail mode, i.e., the input data is either correct or incorrect. When it comes to using biometrics, the system faces potential errors during the biometric sample capturing, as discussed previously in Section 3.4. We further elaborate on our proposed methodology from the crucial FAR/FRR perspective.

Typically, the FAR/FRR parameters of a sensor are provided by vendors based on statistically collected data. For the MFA framework, we assume two possible decisions made during the user authentication phase, as displayed in Figure 8: H_{0}, the user is not legitimate; or H_{1}, the user is legitimate. These form the entire sample space, with P\left(H_{0}\right)+P\left(H_{1}\right)=1. The risk policy is assumed to be handled by the authentication system owner, who also sets up the distributions of P\left(H_{0}\right) and P\left(H_{1}\right).


Figure 8. MFA system model. P_{TH} is the selected threshold.

Generalizing, there might be n biometric sensors collecting the user input data. Each individual sensor measurement from the set Z=\left\{z_{1}, \ldots, z_{n}\right\} is distributed within [0,1], and this set is further analyzed under the conditions of two previously considered hypotheses. The measurements delivered from the sensors could be processed in two different ways as introduced in the sequel.

4.4.1. Strict Decision Methodology

Each sensor decides whether the user is legitimate or not by returning either accept or reject. The MFA system then combines the collected results and provides a group decision based on the resulting vector. Hence, it is possible to utilize threshold decision functions or weighted threshold functions, depending on the reliability of the sensors. In the first case, the sensor returns a value z_{i} \in\{0,1\}, which can be interpreted as either YES or NO. Then, the conditional probabilities P\left(z_{i} \mid H_{0}\right) and P\left(z_{i} \mid H_{1}\right) are defined by the FAR_{i} and FRR_{i} values, respectively, for the i-th sensor. Here, FAR_{i} and FRR_{i} are taken at the CER/EER point, i.e., the threshold for z_{i} is selected at the point where FAR_{i}=FRR_{i}. Generally, this methodology reflects the scenarios of ownership or knowledge factors from the biometric perspective.

4.4.2. Probabilistic Decision Methodology

The sensor responds with the result of its measurements as well as its probabilistic characteristics. Further, the data is merged before the final decision is made. Therefore, the entire set of measured data can be utilized when making a group decision and, accordingly, a common result may be established based on the set collected from all sensors.

In the second case, the sensor returns the result of the measurement as well as the template comparison in the form of a match score z_{i} (0 \leq z_{i} \leq 1). For each of the values z_{i}, the conditional probability P\left(z_{i} \mid H_{0}\right) is calculated based on the FAR_{i} value at z_{i}. In addition, the conditional probability P\left(z_{i} \mid H_{1}\right) is determined by the FRR_{i} value at z_{i}.

This approach offers an opportunity to consider the strict decision methodology as a simplified model of the probabilistic one for the case where FAR_{i} and FRR_{i} are given at only one point. Here, the measurement result can only take two values, i.e., higher or lower than the selected threshold.

4.4.3. Evaluation

In this work, we consider the more general case of the probabilistic decision-making methodology, while a combination of the measurement results from the individual sensors is made similarly to previous works by using the Bayes estimator. Since the outcomes of measurements have a probabilistic nature, the decision function is suited to the maximum a posteriori probability solution.

In more detail, the decision function may be described as follows. At the input, it requires the conditional probabilities of the measured value from each sensor, P\left(z_{i} \mid H_{0}\right) and P\left(z_{i} \mid H_{1}\right), together with the a priori probabilities of the hypotheses, P\left(H_{0}\right) and P\left(H_{1}\right). The latter values could be a part of the company's risk policy, as they determine the degree of confidence for specific users. Then, the decision function evaluates the a posteriori probability of the hypothesis, P\left(H_{1} \mid Z\right), and validates that the corresponding probability is higher than a given threshold P_{TH}.

The measurement-related conditional probabilities can be considered as independent random variables; hence, the general conditional probability is as follows:

P\left(Z \mid H_{J}\right)=\prod_{z_{i} \in Z} P\left(z_{i} \mid H_{J}\right), J \in\{0 ; 1\}

Further, the total probability P(Z) is calculated as

P(Z)=\prod_{z_{i} \in Z} P\left(z_{i} \mid H_{0}\right) P\left(H_{0}\right)+\prod_{z_{i} \in Z} P\left(z_{i} \mid H_{1}\right) P\left(H_{1}\right)

where P\left(z_{i} \mid H_{J}\right), J \in\{0 ; 1\} are known from the sensor characteristics, while P\left(H_{0}\right) and P\left(H_{1}\right) are a priori probabilities of the hypotheses (a part of the company's risk policy).

Based on the obtained results, the posterior probability for each hypothesis H_{J}, J \in\{0 ; 1\} can be produced as

P\left(H_{1} \mid Z\right)=\frac{\prod_{z_{i} \in Z} P\left(z_{i} \mid H_{1}\right) P\left(H_{1}\right)}{P(Z)}

For a comprehensive decision over the entire set of sensors, the following rule applies

P\left(H_{1} \mid Z\right)>P_{T H} \Rightarrow\{\text {Accept}\}, \text { else }\{\text {Reject}\}

As a result, the decision may be correct or may lead to an error. The FAR and FRR values can then be utilized for selecting the appropriate threshold P_{TH} based on all of the involved sensors.
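The sketch below puts the above decision rule together: per-sensor likelihoods P(z_i | H_0) and P(z_i | H_1), read off each sensor's FAR/FRR characteristics, are multiplied, normalized by the total probability, and the posterior P(H_1 | Z) is compared against the policy threshold. The priors, the threshold, and the three sensor readings are illustrative policy values, not figures from this article.

```python
# Probabilistic (Bayes) decision sketch: combine per-sensor likelihoods into the
# posterior P(H1 | Z) and compare it with a policy threshold P_TH. Toy numbers.
def posterior_h1(likelihoods, p_h0, p_h1):
    """likelihoods: list of (P(z_i|H0), P(z_i|H1)) pairs, one per sensor."""
    p_z_h0, p_z_h1 = 1.0, 1.0
    for l0, l1 in likelihoods:
        p_z_h0 *= l0
        p_z_h1 *= l1
    p_z = p_z_h0 * p_h0 + p_z_h1 * p_h1          # total probability P(Z)
    return p_z_h1 * p_h1 / p_z                   # Bayes rule for P(H1 | Z)

def decide(likelihoods, p_h0=0.5, p_h1=0.5, p_th=0.95):
    return "Accept" if posterior_h1(likelihoods, p_h0, p_h1) > p_th else "Reject"

# Three sensors, each reporting (P(z_i|H0), P(z_i|H1)) derived from its FAR/FRR.
sensors = [(0.05, 0.90), (0.10, 0.85), (0.20, 0.70)]
print(decide(sensors))   # posterior ~0.998 -> "Accept" at P_TH = 0.95
```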

5. Discussion and Future Prospects

Today, authentication matters more than ever before. In the digital era, most users will rely on biometrics to complement conventional passwords in matters concerning system security and authorization. Even though privacy, security, usability, and accuracy concerns are still in place, MFA is becoming a system that promises the security and ease of use needed by modern users when accessing sensitive data.

Without a doubt, biometrics are one of the key layers enabling the future of MFA. This functionality is often regarded not as standalone but as a supplement to traditional authentication approaches like passwords, smart cards, and PINs. Combining two or more authentication mechanisms is expected to provide a higher level of security when verifying the user. The expected evolution towards MFA is rooted in synergistic biometric systems that allow for significantly improved user experience and MFA system throughput, which would be beneficial for various applications (see Figure 9). Such systems will intelligently couple all three factor types, namely, knowledge, biometrics, and ownership.


Figure 9. Biometric MFA for the airport scenario.

Since conventional single-factor systems of today are based on only one parameter (the unimodality property), if its acquisition is affected in any way (be it noise or disruption), the overall accuracy will degrade. As a reminder, collecting a single type of non-knowledge-related data, e.g., biometrics, could exclude part of the user population when particular disabilities are present. Moreover, spoofing this single factor is a relatively simple task.

One of the most promising directions in MFA is behavior-based biometrics, providing entirely new ways of authenticating users. Solutions based on muscular memory, e.g., writing or gestures, coupled with machine learning are becoming more prominent examples. Already today, software can extrapolate user handwriting and reach confidence levels above 99.97 percent. More forward-looking MFA sources to be utilized in the near future are the heart and the brain: the attractive areas of ECG and EEG analysis are also expected to provide unique identification samples for each subject.

Another military-inspired research activity already shows the capability to identify users based on the way they interact with a computer. This approach takes into consideration typing speed, typical spelling mistakes, writing rhythm, and other factors. The appropriate terminology is not settled yet: some call this methodology Passive Biometrics, while others name it Continuous Authentication. It results in a unique fingerprint of the user-computer interaction pattern, which is extremely difficult to replicate.

All of the discussed MFA scenarios require significant memory resources to statistically analyze the input data and store the biometric samples, even when different optimization techniques are utilized. A very promising direction of MFA development is therefore in the area of neural networks and Big Data. Here, many successful applications have been known to the community for more than a decade; examples can be found in the literature, where conventional factors, such as iris, retina, fingerprints, etc., are considered. Utilizing neural networks for next-generation biometrics is the most likely way to proceed due to the presently high levels of analysis complexity.

In summary, biometric technology is a prominent direction driven by the mobile device market. The number of smartphones sold in the US alone is expected to reach 175 million units by 2018, with the corresponding market expected to exceed $50.6B in revenues by 2022. It is believed that a strong push towards the utilization of biometrics in many areas of life is imminent, since most flagship devices are already equipped with a fingerprint scanner and facial recognition technology in addition to conventional PIN codes.

This work provided a systematic overview of the state of the art in both technical and usability issues, as well as the major challenges of currently available MFA systems. In this study, we discussed the evolution of authentication from single- through two- and towards multi-factor systems. Primarily, we focused on the MFA factors constituting the state of the art, possible future directions, the respective challenges, and promising solutions. We also proposed an MFA solution based on the reversed Lagrange polynomial, as an extension of Shamir's Secret Sharing scheme, which covers the cases of authenticating the user even if some of the factors are mismatched or absent. It also helps qualify the missing factors without disclosing the sensitive data to the verifier.