Multifactor Authentication
Site: Saylor Academy
Course: CS406: Information Security
Book: Multifactor Authentication
Description
Authentication can be accomplished with one factor, two factors, or multiple factors. Which one is the weakest level of authentication and which is the most secure and why? When would a more secure system be required? Be able to explain these multifactor authentication methods: password protection, token presence, voice biometrics, facial recognition, ocular-based methodology, hand geometry, vein recognition, fingerprint scanner, thermal image recognition, and geographical location. What are some challenges of multiple factor authentication when using biometrics? There is a lot of interesting information covered in this article that you do not need to memorize, but that you should be aware of.
Table of contents
- Abstract
- 1. Introduction
- 2. State-of-the-Art and Potential MFA Sources
- 2.1. Widely Deployed MFA Sensors/Sources
- 2.1.1. Password Protection
- 2.1.2. Token Presence
- 2.1.3. Voice Biometrics
- 2.1.4. Facial Recognition
- 2.1.5. Ocular-Based Methodology
- 2.1.6. Hand Geometry
- 2.1.7. Vein Recognition
- 2.1.8. Fingerprint Scanner
- 2.1.9. Thermal Image Recognition
- 2.1.10. Geographical Location
- 2.2. Future of MFA Integration
- 2.2.1. Behavior Detection
- 2.2.3. Occupant Classification Systems (OCS)
- 2.2.4. Electrocardiographic (ECG) Recognition
- 2.2.5. Electroencephalographic (EEG) Recognition
- 2.2.6. DNA Recognition
- 3. MFA Operation Challenges
- 4. Enabling Flexible MFA Operation
- 4.3. Proposed MFA Solution for V2X Applications
- 5. Discussion and Future Prospects
Abstract
Today, digitalization decisively penetrates all sides of modern society. One of the key enablers to keep this process secure is authentication. It covers many different areas of a hyper-connected world, including online payments, communications,
access right management, etc. This work sheds light on the evolution of authentication systems towards Multi-Factor Authentication (MFA) starting from Single-Factor Authentication (SFA) and through Two-Factor Authentication (2FA). Particularly, MFA
is expected to be utilized for human-to-everything interactions by enabling fast, user-friendly, and reliable authentication when accessing a service. This paper surveys the already available and emerging sensors (factor providers) that allow for
authenticating a user with the system directly or by involving the cloud. The corresponding challenges from the user as well as the service provider perspective are also reviewed. The MFA system based on reversed Lagrange polynomial within
Shamir's Secret Sharing (SSS) scheme is further proposed to enable more flexible authentication. This solution covers the cases of authenticating the user even if some of the factors are mismatched or absent. Our framework allows for qualifying the
missing factors by authenticating the user without disclosing sensitive biometric data to the verification entity. Finally, a vision of the future trends in MFA is discussed.
Source: Aleksandr Ometov, Niko Mäkitalo, Sergey Andreev, Tommi Mikkonen, and Yevgeni Koucheryavy, https://www.mdpi.com/2410-387X/2/1/1/htm
This work is licensed under a Creative Commons Attribution 4.0 License.
1. Introduction
Figure 1. Conceptual authentication examples.
- Knowledge factor – something the user knows, such as a password or, simply, a "secret";
- Ownership factor – something the user has, such as cards, smartphones, or other tokens;
- Biometric factor – something the user is, i.e., biometric data or behavior pattern.
Figure 2. Evolution of authentication methods from SFA to MFA.
- Validating the identity of the user and the electronic device (or its system);
- Validating the infrastructure connection;
- Validating the interconnected IoT devices, such as a smartphone, tablet, wearable device, or any other digital token (key dongle).
- Customers first register and authenticate with the service provider to activate and manage services they are willing to access;
- Once accessing the service, the user is required to pass a simple SFA with the fingerprint/token signed in advance by the service provider;
- Once initially accepted by the system, the customer authenticates by logging in with the same username and password as set up previously in the customer portal (or via social login). For additional security, the managing platform can enable secondary authentication factors. Once the user has successfully passed all the tests, the framework automatically authenticates to the service platform;
- The secondary authentication occurs automatically based on the biometric MFA, so the user would be requested to enter an additional code or provide a token password only in case the MFA fails.
- This work provides a detailed analysis of factors that are presently utilized for MFA with their corresponding operational requirements. Potential sensors to be utilized are surveyed based on the academic and industrial sources (Section 2);
- The survey is followed by the challenges related to MFA adoption from both the user experience and the technological perspectives (Section 3);
- Further, the framework based on the reversed Lagrange polynomial is proposed to allow for utilizing MFA in cases where some of the factors are missing (Section 4). A discussion on the potential evaluation methodology is also provided;
- Finally, the vision of the future of MFA is discussed (Section 5).
2. State-of-the-Art and Potential MFA Sources
Presently, authentication systems already utilize an enormous number of sensors that enable identification of a user. In this section, we elaborate on the MFA-suitable factors, the corresponding market-available sensors, and related challenges. Furthermore, we provide additional details on the ones that are likely to be deployed in the near future.
2.1. Widely Deployed MFA Sensors/Sources
Today, identification and authentication for accessing sensitive data are among the primary use cases for MFA. We further list the factors already available for MFA utilization without acquiring additional specialized equipment.
2.1.1. Password Protection
The conventional way to authenticate a user is to request a PIN code, password, etc. The secret pass-phrase traditionally represents a knowledge factor. It requires only a simple input device (at least one button) to authenticate the user.
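As a side note on implementation hygiene, a verifier should store only a salted, slow hash of the pass-phrase rather than the secret itself. A minimal standard-library sketch (the iteration count and salt length are illustrative choices, not values from the article):

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt=None):
    """Derive a salted hash so the plaintext secret is never stored."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, stored: bytes) -> bool:
    # Constant-time comparison avoids leaking information via timing.
    _, candidate = hash_password(password, salt)
    return hmac.compare_digest(candidate, stored)

salt, stored = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, stored)
assert not verify_password("wrong guess", salt, stored)
```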
2.1.2. Token Presence
The password could then be supplemented with a physical token – for example, a card – which represents the second factor group: ownership. From the hardware perspective, a user may present a smartcard, phone, wearable device, etc., which are more complicated to delegate. In this case, the system should be equipped with a radio interface allowing for two-way communication with the token. On the other hand, the most widely known software token is the one-time software-generated password. The main drawback of the above is the problem of uncontrollable duplication.
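A one-time software token of the kind mentioned above is typically generated in the style of RFC 6238 (TOTP): an HMAC over a moving time counter, truncated to a short numeric code. A minimal standard-library sketch; production systems should rely on a vetted library rather than this illustration:

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, timestep: int = 30, digits: int = 6, now=None) -> str:
    """One-time password: HMAC-SHA1 over the current time counter (RFC 6238 style)."""
    counter = int((now if now is not None else time.time()) // timestep)
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)
```

Both the token and the verifier derive the same code from a shared secret, which is exactly why uncontrolled duplication of that secret defeats the factor.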
2.1.3. Voice Biometrics
Most contemporary smart electronic devices are equipped with a microphone, which allows utilizing voice recognition as a factor for MFA. At the same time, the technology of tomorrow may allow special agencies not only to recognize speakers but also to mimic their voices, including intonation, timbre, etc., which is a serious drawback of utilizing voice as a primary authentication method.
2.1.4. Facial Recognition
As the next step, facial recognition could be considered. At the beginning of its development, the technology was based on landmark picture analysis, which was relatively simple to trick by supplying the system with a photo. The next phase enabled three-dimensional face recognition, i.e., asking the user to move the head in a specific manner during the authentication process. Finally, the advancement of this system reached the point of recognizing the actual expressions of the user. To enable facial recognition, it is required to equip the system with at least one output device and a camera.
2.1.5. Ocular-Based Methodology
Iris recognition techniques have been on the market for more than 20 years. This approach does not require the user to be close to the capture device while it analyzes the color pattern of the human eye. Retina analysis is another attractive technique. Here, a thin tissue of neural cells located in the posterior portion of the eye is captured and analyzed. Because of the complex structure of the capillaries that supply the retina with blood, each person's retina is unique. The most prominent challenges of these methods are the need for a high-quality capture device and a robust mathematical technique to analyze the image.
2.1.6. Hand Geometry
Some systems employ the analysis of the physical shape of a hand to authenticate the user. Initially, pegs were utilized to validate the subject, but the usability of such methods was low. Further on, the flatbed scanner was used to obtain the image without the need to fix the user’s hand in one specific position. Today, some systems utilize conventional cameras not requiring close contact with the capture surface. This approach is, however, not very robust to the environment. Some vendors apply so-called photoplethysmography (PPG) to determine whether a wearable device (e.g., a smartwatch) is currently on its user’s wrist or not. The process is similar to the one followed when measuring heart rate.
2.1.7. Vein Recognition
2.1.8. Fingerprint Scanner
Utilizing a fingerprint scanner as the primary authentication mechanism is currently being pushed by the majority of smartphone/personal computer vendors. This solution is intuitive to use, but fingerprints remain extremely simple to fabricate, mainly because they can be obtained from almost anything we touch. The integration potential of this method is indeed high, even though it is not recommended as a standalone authentication approach either. Most smartphone vendors install an additional sensor to obtain the fingerprint instead of the safer vein recognition.
2.1.9. Thermal Image Recognition
Similarly to vein recognition, a thermal sensor is utilized to reconstruct the unique thermal image of the body's blood flow in close proximity. Many challenges with this authentication method may arise from the user's condition: sickness or emotion may significantly influence the captured readings.
2.1.10. Geographical Location
Utilizing the device's and user's geographical location to validate whether access to the device/service could be granted is a special case of location-based authentication. Importantly, the GPS signal could be easily jammed or considered faulty due to its propagation properties; thus, it is recommended to utilize at least two location sources, for example, GPS and the wireless network cell ID. A smartphone could be used to support MFA from the location acquisition perspective.
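Cross-checking two independent location sources can be sketched as follows; the (lat, lon) fixes and the 5 km divergence threshold are illustrative assumptions, not values from the article:

```python
import math

def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) points in kilometres."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(h))

def location_factor_ok(gps_fix, cell_fix, max_divergence_km=5.0):
    # Trust the location factor only when both sources agree within a radius,
    # which mitigates a single jammed or spoofed source.
    return haversine_km(gps_fix, cell_fix) <= max_divergence_km

assert location_factor_ok((60.17, 24.94), (60.19, 24.96))      # a few km apart: accept
assert not location_factor_ok((60.17, 24.94), (59.33, 18.07))  # hundreds of km: reject
```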
2.2. Future of MFA Integration
Accelerated adoption across many industries as well as increased availability of biometric services in a wide range of readily-available consumer products is pushing the concept of tight MFA integration. Currently, researchers and early technology adopters attempt to integrate new sensors to be utilized in MFA systems.
2.2.1. Behavior Detection
Historically, behavior recognition was utilized to analyze a military telegraph operator's typing rhythm in order to track the movement of troops. Today, gestures for authentication purposes may range from conventional to "hard-to-mimic" ones, since a motor-programmed skill results in the movement being organized before the actual execution.
2.2.3. Occupant Classification Systems (OCS)
Some vehicular systems already have the OCS solutions integrated in consumer cars. A system of sensors can detect who is currently in the passenger/driver seat by utilizing, for example, weight or posture and automatically adjusting the vehicle to personal needs.
2.2.4. Electrocardiographic (ECG) Recognition
ECG data could be collected from the user's smart watch or activity tracker and compared with an individually stored pattern. The main benefit of using this factor for authentication is that ECG signals emerge as a potential biometric modality with the advantage of being difficult (or close to impossible) to mimic; the only way is to replay an existing personal recording.
2.2.5. Electroencephalographic (EEG) Recognition
This solution is based on brain-wave analysis and could be considered in light of the fundamental philosophical proposition by R. Descartes, "Cogito ergo sum", or "I think, therefore I am". It allows for obtaining a unique sample of a person's brain activity pattern. Formerly, EEG data capture could be performed only in clinical settings by using invasive probes under the skull or wet-gel electrodes arrayed over the scalp. Today, simple EEG collection is possible with market-available devices the size of a headset.
2.2.6. DNA Recognition
Table 1. Comparison of suitable factors for MFA: H – high; M – medium; L – low; n/a – unavailable.
| Factor | Universality | Uniqueness | Collectability | Performance | Acceptability | Spoofing |
| --- | --- | --- | --- | --- | --- | --- |
| Password | n/a | L | H | H | H | H |
| Token | n/a | M | H | H | H | H |
| Voice | M | L | M | L | H | H |
| Facial | H | L | M | L | H | M |
| Ocular-based | H | H | M | M | L | H |
| Fingerprint | M | H | M | H | M | H |
| Hand geometry | M | M | M | M | M | M |
| Location | n/a | L | M | H | M | H |
| Vein | M | M | M | M | M | M |
| Thermal image | H | H | L | M | H | H |
| Behavior | H | H | L | L | L | L |
| Beam-forming | n/a | M | L | L | L | H |
| OCS | n/a | L | L | L | L | M |
| ECG | L | H | L | M | M | L |
| EEG | L | H | L | M | L | L |
| DNA | H | H | L | H | L | L |
- Universality stands for the presence of the factor in each person;
- Uniqueness indicates how well the factor differentiates one person from another;
- Collectability measures how easy it is to acquire the data for processing;
- Performance indicates the achievable accuracy, speed, and robustness;
- Acceptability stands for the degree of acceptance of the technology by people in their daily life;
- Spoofing indicates the level of difficulty to capture and spoof the sample.
3. MFA Operation Challenges
An integration of novel solutions has always been a major challenge for both developers and managers. The key challenges are presented in Figure 3. In the first place, user acceptance is a critical aspect for the adoption of strong identity and multi-factor authentication. While adopting and deploying MFA solutions, it is required to follow a careful and thorough approach, where most challenges arise along with the opportunities and potential benefits.
Figure 3. Main operational challenges of MFA.
3.1. Usability
- Task efficiency – time to register and time to authenticate with the system;
- Task effectiveness – the number of login attempts needed to authenticate with the system;
- User preference – whether the user prefers a particular authentication scheme over another.
3.2. Integration
3.3. Security and Privacy
3.4. Robustness to Operating Environment
4. Enabling Flexible MFA Operation
In this work, we offer a new authentication scheme that focuses on the vehicle-to-everything (V2X) scenarios, since cars of today are already equipped with multiple sensors that could potentially be utilized for MFA. Conventionally, the user has a username/password/PIN/token and will additionally be asked to utilize a biometric factor, such as facial features or fingerprints. The general overview supported by a follow-up discussion is given in Figure 4. If the authentication procedure fails to establish trust by using this combination of factors, then the user will be prompted to authenticate by utilizing another previously registered factor or a set of those. This MFA system may not only verify the accuracy of the user input but also determine how the user interacts with the devices, i.e., analyze the behavior. The more the user interacts with the biometric system, the more accurate its operation becomes.
Figure 4. Current and emerging MFA sensors for vehicles.
4.1. Conventional Approach
One of the approaches considered within the scope of this work is based on utilizing Lagrange polynomials for secret sharing. The system secret S is usually "split" and distributed among a set of key holders. It could be recovered later on, as described in numerous works, as

S = f(0) = Σ_{i=1}^{k} K_i · Π_{j≠i} x_j / (x_j − x_i),

where x_i are the generated polynomial indexes and ID_i is a unique identification factor. In such systems, every key holder with a factor ID_i obtains its own unique key share K_i. In conventional systems, it is required to collect any k shares of the initial secret to unlock the system, while the curve may offer N ≥ k points, as shown in Figure 5. The basic principle behind this approach is to specify the secret S = f(0), generate a curve f(x) of degree k − 1 with random coefficients, and use it to produce the secret shares K_i = f(x_i). This methodology is successfully utilized in many secret sharing systems that employ the Lagrange interpolation formula.
Figure 5. Lagrange secret sharing scheme.
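The conventional scheme can be illustrated with a short sketch of Shamir's Secret Sharing over a prime field; the field modulus and demo secret below are arbitrary illustrative choices, not values from the article:

```python
import random

P = 2 ** 127 - 1  # a Mersenne prime large enough for the demo secret

def make_shares(secret: int, k: int, n: int):
    """Split `secret` into n shares, any k of which recover it (Shamir)."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    f = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 recovers f(0) = secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P          # product of (0 - x_j)
                den = den * (xi - xj) % P      # product of (x_i - x_j)
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(123456789, k=3, n=5)
assert recover(shares[:3]) == 123456789        # any 3 of the 5 shares suffice
assert recover(shares[1:4]) == 123456789
```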
Unfortunately, this approach may not be applied to the MFA scenario directly, since the biometric parameters are already in place, i.e., we can neither assign a new value to a factor that is fixed by nature nor freely regenerate the corresponding shares.
4.2. Proposed Reversed Methodology
In this work, we consider the MFA system with explicit l factors, each represented as a point (x_i, F_i), i = 1, …, l, where x_i is the factor index and F_i is the digitized value provided by the corresponding sensor.

Lemma 1. One and only one polynomial curve f(x) of degree l − 1 passes through l points with distinct indexes x_i.

Hence, the system secret is derived from the factors themselves as

S = f(0) = Σ_{i=1}^{l} F_i · Π_{j≠i} x_j / (x_j − x_i),

where the product is taken over j = 1, …, l, j ≠ i.

Figure 6. Reversed method based on the Lagrange polynomial.
Due to the properties of the Lagrange formulation, there can only be one curve described by the corresponding polynomial (Lemma 1); therefore, each set of factors F_1, …, F_l will produce its unique secret S. However, if the biometric data collected by MFA has not changed over time, the secret will always remain the same, which is an obvious vulnerability of the considered system. On the other hand, a simple addition of a timestamp factor should always produce a unique curve, as shown in Figure 6 for different timestamp values.
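The reversed idea can be sketched numerically: instead of splitting a chosen secret, the curve is interpolated through the already existing factor readings, and f(0) serves as the derived secret. A minimal sketch with hypothetical integer factor values (exact rationals stand in for the finite-field arithmetic a real system would use):

```python
from fractions import Fraction

def reversed_secret(points):
    """Interpolate the unique curve through the given (index, value) points
    and take f(0) as the derived secret (reversed Lagrange approach)."""
    s = Fraction(0)
    for xi, yi in points:
        term = Fraction(yi)
        for xj, _ in points:
            if xj != xi:
                term *= Fraction(-xj, xi - xj)   # Lagrange basis evaluated at 0
        s += term
    return s

factors = [(1, 7041), (2, 1993), (3, 8824)]      # hypothetical digitized factor readings
t1 = reversed_secret(factors + [(4, 100)])       # timestamp factor at time 100
t2 = reversed_secret(factors + [(4, 101)])       # same biometrics, later time
assert reversed_secret(factors) == reversed_secret(list(reversed(factors)))
assert t1 != t2   # the timestamp makes the derived secret time-varying
```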
The proposed solution provides robustness against the case where all factors F_i remain unchanged over time. This is achieved by adding a unique factor of time, which guarantees a fresh point on the curve and, hence, a fresh secret. It is necessary to mention that the considered threshold scheme based on the Lagrange interpolation formula utilizes the Rivest–Shamir–Adleman (RSA) mechanism or the ElGamal encryption/decryption algorithm for authentication during the final step. In this case, it is proven that we obtain a secure threshold scheme related to the secrets.
4.3. Proposed MFA Solution for V2X Applications
Indeed, our proposed solution may operate out-of-the-box in the case where all l factors are present. The system may thus provide a possibility to identify and report any outdated factor information – for example, weight fluctuation. Access to a service could be automated when some of the factors are not present. We further elaborate on this feature in the current subsection.
4.3.1. Factor Mismatch
Assuming that the number of factors in our system is l = 4, the system secret S can be represented in a simplified way as a group of factors {F_1, F_2, F_3, F_4}. Here, if any of the F_i are modified, the secret recovery mechanism would fail. An improvement to this algorithm is delivered by providing separate system solutions for a lower number of collected factors. Basically, for l = 4, the number of possible combinations of factors with one missing is equal to four:

{F_2, F_3, F_4}, {F_1, F_3, F_4}, {F_1, F_2, F_4}, {F_1, F_2, F_3}.
The device may thus grant access based on a predefined risk function policy. As the second benefit, it can inform the user (or the authority) that a particular factor has to be updated based on the failed combination. Indeed, this modification brings only marginal transmission overheads but, on the other hand, enables higher flexibility in authentication and missing factor validation.
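The enumeration of one-missing subsets and the diagnosis of the failed factor can be sketched as follows; the factor names are placeholders:

```python
from itertools import combinations

factors = ["F1", "F2", "F3", "F4"]          # l = 4 registered factors

# A separate secret-sharing instance can be pre-computed for every
# (l-1)-subset, so authentication may still succeed with one factor missing.
fallback_sets = list(combinations(factors, len(factors) - 1))
assert len(fallback_sets) == 4              # C(4, 3) = 4, as in the text

def diagnose_missing(presented):
    """Return which registered factor failed, given a matching (l-1)-subset."""
    missing = set(factors) - set(presented)
    return missing.pop() if len(missing) == 1 else None

assert diagnose_missing(("F1", "F2", "F4")) == "F3"
```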
4.3.2. Cloud Assistance
Another important scenario for MFA is the potential assistance of a trusted authority in the authentication process.
Figure 7. Trusted authority assistance in authentication when the user is missing two factors.
For example, assume that the user forgot or lost two factors F_i and F_j with the corresponding keys K_i and K_j. The trusted authority is willing to assist in authentication: two temporary keys K'_i and K'_j are thus generated and sent to the user via a secure channel. After obtaining these keys, the Lagrange interpolation formula with the RSA or ElGamal encryption/decryption-based threshold authentication procedure involves the remaining factors together with the temporary keys.
This allows for gaining access to the device. The proposed solution is designed explicitly to complete the MFA step of the authentication; that is, its usage for SFA and 2FA is not recommended. This is mainly due to the features of the Lagrange interpolation formula. Basically, in the SFA case and without the timestamp factor, the equation at hand reduces to a single point, i.e., the curve degenerates. Even adding a random timestamp factor will not provide any valuable level of biometric data protection, since an eavesdropper could immediately recover the factor secret.
The above is not suitable for 2FA either, since providing two factors makes the curve linear, i.e., the eavesdropper requires only two attempts to recover the secrets. However, adding a timestamp factor here allows for providing the necessary level of safety with three actual factors, as discussed below.
4.4. Potential Evaluation Techniques
Conventionally, authentication systems utilizing only the knowledge or ownership factors operate in pass/fail mode, i.e., the input data is either correct or incorrect. When it comes to using biometrics, the system faces potential errors during biometric sample capturing, as discussed previously in Section 3.4. We further elaborate on our proposed methodology from the crucial FAR/FRR perspective.
Typically, the FAR/FRR parameters of a sensor are provided by vendors based on statistically collected data. For the MFA framework, we assume two possible decisions made during the user authentication phase, as displayed in Figure 8: H_0 – the user is not legitimate; or H_1 – the user is legitimate. These two hypotheses form the entire sample space. The risk policy is assumed to be handled by the authentication system owner, who also sets up the distributions of the a priori probabilities P(H_0) and P(H_1).
Figure 8. MFA system mode with the selected decision threshold.
4.4.1. Strict Decision Methodology
Each sensor decides whether the user is legitimate or not by returning either accept or reject. The MFA system then combines the collected results and provides a group decision based on the resulting vector. Hence, it is possible to utilize threshold decision functions or weighted threshold functions, depending on the reliability of the sensor. In the first case, the sensor returns a binary value, which could be interpreted as either YES or NO. Then, the conditional probabilities of this answer under the two hypotheses are defined by the FAR_i and FRR_i values, respectively, for the i-th sensor. Here, FAR_i and FRR_i are taken at the CER/EER point, i.e., the operating threshold is selected at the point where FAR_i = FRR_i. Generally, this methodology reflects the scenarios of ownership or knowledge factors from the biometric perspective.
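The (weighted) threshold voting described above can be sketched as follows; the weights and threshold are illustrative assumptions, not values from the article:

```python
def strict_decision(votes, weights=None, threshold=0.5):
    """Accept the user when the (weighted) fraction of YES votes exceeds
    the threshold; weights can encode per-sensor reliability."""
    weights = weights or [1.0] * len(votes)
    score = sum(w for v, w in zip(votes, weights) if v)
    return score / sum(weights) > threshold

# Three sensors vote YES/NO; the first sensor is trusted more in the last case.
assert strict_decision([True, True, False]) is True                       # 2/3 > 0.5
assert strict_decision([True, False, False]) is False                     # 1/3 < 0.5
assert strict_decision([True, False, False], weights=[3, 1, 1]) is True   # 3/5 > 0.5
```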
4.4.2. Probabilistic Decision Methodology
The sensor responds with a result of its measurements as well as a probabilistic characteristics. Further, the data is merged before the final decision is made. Therefore, the entire set of the measured data could be utilized when making a group decision and, accordingly, a common result might be established based on the set collected from all sensors.
In the second case, the sensor returns the result of the measurements as well as the template comparison in the form of a match score s. For each of the values s, the conditional probability P(s | H_1) is calculated based on the FRR values at score s. In addition, the conditional probability P(s | H_0) is determined by the FAR values at score s.
This approach offers an opportunity to consider the strict decision methodology as a simplified model of the probabilistic one for the case where P(s | H_0) and P(s | H_1) are given in only one point. Here, the measurement result can only take two values, i.e., higher or lower than the selected threshold.
4.4.3. Evaluation
In this work, we consider the more general case of the probabilistic decision-making methodology, while a combination of the measurement results for the individual sensors is made similarly to previous works by using the Bayes estimator. Since the outcomes of measurements have a probabilistic nature, a maximum a posteriori probability decision function is suitable.
In more detail, the decision function may be described as follows. At the input, it requires the conditional probabilities of the measured value from each sensor, P(s_i | H_0) and P(s_i | H_1), together with the a priori probabilities of the hypotheses P(H_0) and P(H_1). The latter values could be a part of the company's risk policy, as they determine the degree of confidence for specific users. Then, the decision function evaluates the a posteriori probability of the hypothesis H_1 and validates that the corresponding probability is higher than a given threshold.
The measurement-related conditional probabilities can be considered as independent random variables; hence, the general conditional probability is as follows:

P(s_1, …, s_n | H_k) = Π_{i=1}^{n} P(s_i | H_k), k ∈ {0, 1}.
Further, the total probability is calculated as

P(s_1, …, s_n) = P(s_1, …, s_n | H_0) P(H_0) + P(s_1, …, s_n | H_1) P(H_1),

where P(s_i | H_k) are known from the sensor characteristics, while P(H_0) and P(H_1) are the a priori probabilities of the hypotheses (a part of the company's risk policy).
Based on the obtained results, the posterior probability for each hypothesis can be produced as

P(H_k | s_1, …, s_n) = P(s_1, …, s_n | H_k) P(H_k) / P(s_1, …, s_n).
For a comprehensive decision over the entire set of sensors, the following rule applies: accept the user if P(H_1 | s_1, …, s_n) exceeds the selected threshold; otherwise, reject.
As a result, the decision may be correct or may lead to an error. The FAR and FRR values could then be utilized for selecting the appropriate threshold based on all of the involved sensors.
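The Bayes fusion above can be sketched as follows, with toy per-sensor likelihood models standing in for the vendor-provided FAR/FRR characteristics (the linear models and scores are illustrative assumptions):

```python
def posterior_legitimate(scores, p_given_h1, p_given_h0, prior_h1=0.5):
    """Fuse independent per-sensor match scores with the Bayes rule:
    P(H1 | s) = P(s | H1) P(H1) / (P(s | H1) P(H1) + P(s | H0) P(H0))."""
    like_h1, like_h0 = 1.0, 1.0
    for s in scores:
        like_h1 *= p_given_h1(s)   # sensor model under "user is legitimate"
        like_h0 *= p_given_h0(s)   # sensor model under "user is not legitimate"
    num = like_h1 * prior_h1
    return num / (num + like_h0 * (1.0 - prior_h1))

# Toy sensor models: a high match score is more likely for a legitimate user.
p1 = lambda s: s            # P(s | H1) grows with the score on [0, 1]
p0 = lambda s: 1.0 - s      # P(s | H0) shrinks with the score on [0, 1]

post = posterior_legitimate([0.9, 0.8, 0.7], p1, p0)
assert post > 0.9           # three strong scores yield a confident acceptance
```

The final accept/reject decision then compares this posterior against the threshold chosen from the risk policy.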
5. Discussion and Future Prospects
