151 results for "Marc Fischlin"
Search Results
2. Post-quantum Security for the Extended Access Control Protocol
- Author
- Marc Fischlin, Jonas von der Heyden, Marian Margraf, Frank Morgner, Andreas Wallner, and Holger Bock
- Published
- 2023
3. Nostradamus Goes Quantum
- Author
- Barbara Jiabao Benedikt, Marc Fischlin, and Moritz Huppert
- Published
- 2022
4. A Random Oracle for All of Us
- Author
- Marc Fischlin, Felix Rohrbach, and Tobias Schmalz
- Published
- 2022
5. Towards Post-Quantum Security for Signal's X3DH Handshake
- Author
- Marc Fischlin, Christian Janson, Jacqueline Brendel, Douglas Stebila, Felix Günther, Orr Dunkelman, Michael J. Jacobson Jr., and Colin O’Flynn
- Subjects
- Isogeny, Theoretical computer science, Handshake, Computer science, Key (cryptography), Post-quantum, Key encapsulation mechanisms, Key exchange, Signal protocol, X3DH, Key reuse, Quantum
- Abstract
Modern key exchange protocols are usually based on the Diffie–Hellman (DH) primitive. The beauty of this primitive, among other things, is its potential reuse of key shares: DH shares can be used either a single time or in multiple runs. Since DH-based protocols are insecure against quantum adversaries, alternative solutions have to be found when moving to the post-quantum setting. However, most post-quantum candidates, including schemes based on lattices and even supersingular isogeny DH, are not known to be secure under key reuse. In particular, this means that they cannot necessarily be deployed as an immediate DH substitute in protocols. In this paper, we introduce the notion of a split key encapsulation mechanism (split KEM) to translate the desired key reusability of a DH-based protocol to a KEM-based flow. We provide the relevant security notions of split KEMs and show how the formalism lends itself to lifting Signal’s X3DH handshake to the post-quantum KEM setting without additional message flows. Although the proposed framework conceptually solves the raised issues, instantiating it securely from post-quantum assumptions proved to be non-trivial. We give passively secure instantiations from (R)LWE, yet overcoming the above-mentioned insecurities under key reuse in the presence of active adversaries remains an open problem. Approaching one-sided key reuse, we provide a split KEM instantiation that allows such reuse based on the KEM introduced by Kiltz (PKC 2007), which may serve as a post-quantum blueprint if the underlying hardness assumption (gap hashed Diffie–Hellman) holds for the commutative group action of CSIDH (Asiacrypt 2018). The intention of this paper hence is to raise awareness of the challenges arising when moving to KEM-based key exchange protocols with key reusability, and to propose split KEMs as a specific target for instantiation in future research.
ISSN:0302-9743, ISSN:1611-3349
- Published
- 2021
6. BUFFing signature schemes beyond unforgeability and the case of post-quantum signatures
- Author
- Cas Cremers, Rune Fiedler, Samed Duzlu, Marc Fischlin, and Christian Janson
- Subjects
- Security properties, Theoretical computer science, Transformation, Digital signature, Computer science, NIST, Quantum, Formal description, Signature
- Abstract
Modern digital signature schemes can provide more guarantees than the standard notion of (strong) unforgeability, such as offering security even in the presence of maliciously generated keys, or requiring knowledge of a message to produce a signature for it. The use of signature schemes that lack these properties has previously enabled attacks on real-world protocols. In this work we revisit several of these notions beyond unforgeability, establish relations among them, provide the first formal definition of non-re-signability, and give a transformation that can provide these properties for a given signature scheme in a provable and efficient way. Our results are not only relevant for established schemes: for example, the ongoing NIST PQC competition towards standardizing post-quantum signature schemes has six finalists in its third round. We perform an in-depth analysis of the candidates with respect to their security properties beyond unforgeability. We show that many of them do not yet offer these stronger guarantees, which implies that the security guarantees of these post-quantum schemes are not strictly stronger than, but instead incomparable to, those of classical signature schemes. We show how applying our transformation would efficiently solve this, paving the way for the standardized schemes to provide these additional guarantees and thereby making them harder to misuse.
- Published
- 2021
7. Constructing Random Oracles—UCEs
- Author
- Marc Fischlin and Arno Mittelbach
- Subjects
- Computer science, Compression function, Hash function, Function (mathematics), Algorithm, Block cipher
- Abstract
Indifferentiability provides us with a framework to analyze and sanity-check hash function constructions that are based on a simpler primitive such as a compression function or a block cipher.
- Published
- 2021
8. The Full Power of Random Oracles
- Author
- Arno Mittelbach and Marc Fischlin
- Subjects
- Theoretical computer science, Computer science, Cryptography, Mathematical proofs
- Abstract
We have already seen a first glimpse of how random oracles simplify cryptographic proofs when we showed how to use random oracles to construct commitment schemes. Over the course of this book we will see many additional examples of how we can prove the security of cryptographic schemes with the help of random oracles, and this chapter, in particular, is dedicated to the study of how random oracles facilitate such security proofs.
- Published
- 2021
9. Collision Resistance
- Author
- Arno Mittelbach and Marc Fischlin
- Published
- 2021
10. The Random Oracle Model
- Author
- Arno Mittelbach and Marc Fischlin
- Subjects
- Pairwise independence, Theoretical computer science, Computer science, Hash function, Random oracle
- Abstract
In the previous chapter we looked at dedicated forms of hash functions that we categorized as non-cryptographic hash functions. Their common denominator is that we can prove the existence of constructions that fulfill the properties (e.g., pairwise independence) without having to rely on unproven assumptions.
- Published
- 2021
11. Constructions of Keyed Hash Functions
- Author
- Marc Fischlin and Arno Mittelbach
- Subjects
- Security properties, Collision resistance, Theoretical computer science, Computer science, Hash function
- Abstract
In the previous chapters we studied constructions of hash functions in the unkeyed (resp. publicly keyed) setting for security properties such as collision resistance or second-preimage resistance.
- Published
- 2021
12. Foundations
- Author
- Arno Mittelbach and Marc Fischlin
- Published
- 2021
13. A Cryptographic Analysis of the TLS 1.3 Handshake Protocol
- Author
- Marc Fischlin, Douglas Stebila, Felix Günther, and Benjamin Dowling
- Subjects
- Authentication, Authenticated key exchange, Transport Layer Security (TLS), Handshake protocol, Computer science, Applied Mathematics, Cryptography, Computer security model, Forward secrecy, Software, Key exchange, Computer network
- Abstract
We analyze the handshake protocol of the Transport Layer Security (TLS) protocol, version 1.3. We address both the full TLS 1.3 handshake (the one round-trip time mode, with signatures for authentication and (elliptic curve) Diffie–Hellman ephemeral ((EC)DHE) key exchange), and the abbreviated resumption/“PSK” mode which uses a pre-shared key for authentication (with optional (EC)DHE key exchange and zero round-trip time key establishment). Our analysis in the reductionist security framework uses a multi-stage key exchange security model, where each of the many session keys derived in a single TLS 1.3 handshake is tagged with various properties (such as unauthenticated versus unilaterally authenticated versus mutually authenticated, whether it is intended to provide forward security, how it is used in the protocol, and whether the key is protected against replay attacks). We show that these TLS 1.3 handshake protocol modes establish session keys with their desired security properties under standard cryptographic assumptions.
Journal of Cryptology, 34 (4), ISSN:1432-1378, ISSN:0933-2790
- Published
- 2021
14. Limitations of Random Oracles
- Author
- Arno Mittelbach and Marc Fischlin
- Subjects
- Pseudorandom number generator, Theoretical computer science, Symmetric-key algorithm, Computer science, Hash function
- Abstract
Random oracles are a very powerful tool. As we have seen, they simultaneously give rise to one-way functions, collision-resistant hash functions, pseudorandom generators, symmetric encryption schemes, and more.
- Published
- 2021
15. Constructing Compression Functions
- Author
- Arno Mittelbach and Marc Fischlin
- Subjects
- Transformation, Computer science, Compression function, Process (computing), Cryptographic hash function, Algorithm
- Abstract
We have seen that cryptographic hash functions that can process arbitrarily long inputs can be built from fixed-input-length compression functions via the Merkle–Damgård transformation (Chapter 13).
- Published
- 2021
16. The Theory of Hash Functions and Random Oracles
- Author
- Arno Mittelbach and Marc Fischlin
- Published
- 2021
17. Iterated Hash Functions
- Author
- Arno Mittelbach and Marc Fischlin
- Subjects
- Public-key cryptography, Theoretical computer science, Iterated hash functions, Computer science, Hash function, Collision
- Abstract
So far we have studied various hash function properties as well as use cases of hash functions. When unkeyed or used with a public key, hash functions can be collision resistant, second-preimage resistant, or one-way.
- Published
- 2021
18. Multipath TLS 1.3
- Author
- Lars Porth, Jean-Pierre Münch, Sven-André Müller, and Marc Fischlin
- Subjects
- Authentication, Computer science, Key (cryptography), Cryptography, Adversary, Certificate, Protocol, Key exchange, Multipath, Computer network
- Abstract
In a multipath key exchange protocol (Costea et al., CCS’18) the parties communicate over multiple connection lines, implemented for example with the multipath extension of TCP. Costea et al. show that, if one assumes that an adversary cannot attack all communication paths in an active and synchronized way, then one can securely establish a shared key under mild cryptographic assumptions. This holds even if classical authentication methods like certificate-based signatures fail. They show how to slightly modify TLS to achieve this security level.
- Published
- 2021
19. Pseudorandomness and Computational Indistinguishability
- Author
- Marc Fischlin and Arno Mittelbach
- Subjects
- Theoretical computer science, Computer science, Pseudorandomness, Hash function, Computational indistinguishability, Computational security, Cryptography
- Abstract
In the previous chapter we began our study of hash function properties with the introduction of computational security and one-wayness. While one-way functions are at the core of computational security and modern cryptography, the security guarantees given by a function that is merely one-way are relatively few.
- Published
- 2021
20. Encryption Schemes
- Author
- Arno Mittelbach and Marc Fischlin
- Published
- 2021
21. Computational Security
- Author
- Arno Mittelbach and Marc Fischlin
- Published
- 2021
22. Constructing Random Oracles—Indifferentiability
- Author
- Marc Fischlin and Arno Mittelbach
- Subjects
- Theoretical computer science, Computer science, Hash function, Random oracle
- Abstract
The holy grail of hash function design is to construct a hash function which behaves like a random oracle. This is, of course, impossible (see Chapter 12). Nevertheless, while we know that we cannot construct an actual random oracle the goal should still be to come as close as possible.
- Published
- 2021
23. Cryptographic Analysis of the Bluetooth Secure Connection Protocol Suite
- Author
- Marc Fischlin and Olga Sanina
- Subjects
- Authentication, Computer science, Cryptography, Computer security, Authenticated key exchange, Bluetooth, Identification, Secrecy, Key (cryptography), Protocol
- Abstract
We give a cryptographic analysis of the Bluetooth Secure Connections Protocol Suite. Bluetooth supports several subprotocols, such as Numeric Comparison, Passkey Entry, and Just Works, in order to match the devices’ different input/output capabilities. Previous analyses (e.g., Lindell, CT-RSA’09, or Troncoso and Hale, NDSS’21) often considered (and confirmed) the security of single subprotocols only. Recent practically verified attacks, however, such as the Method Confusion Attack (von Tschirschnitz et al., S&P 21), against Bluetooth’s authentication and key secrecy property often exploit the bad interplay of different subprotocols. Even worse, some of these attacks demonstrate that one cannot prove the Bluetooth protocol suite to be a secure authenticated key exchange protocol. We therefore aim at the best we can hope for and show that the protocol still matches the common key secrecy requirements of a key-exchange protocol if one assumes a trust-on-first-use (TOFU) relationship. This means that the adversary needs to mount an active attack during the initial connection, otherwise the subsequent reconnections remain secure. Investigating the cryptographic strength of the Bluetooth protocol, we also look into the privacy mechanism of address randomization in Bluetooth (which is only available in the Low Energy version). We show that the cryptography indeed provides a decent level of address privacy, although this does not rule out identification of devices via other means, such as physical characteristics.
- Published
- 2021
24. Signature Schemes
- Author
- Arno Mittelbach and Marc Fischlin
- Published
- 2021
25. Random Oracle Schemes in Practice
- Author
- Marc Fischlin and Arno Mittelbach
- Subjects
- Theoretical computer science, Computer science, Cryptography, Trapdoor function, Random oracle
- Abstract
In the following we give an overview of cryptographic schemes used in practice and standards that rely on the random oracle methodology. In all cases the power of random oracles facilitates the design of very efficient solutions, usually combined with suitable number-theoretic primitives such as the discrete-logarithm-based one-way function or the RSA trapdoor function.
- Published
- 2021
26. Single-to-Multi-theorem Transformations for Non-interactive Statistical Zero-Knowledge
- Author
- Marc Fischlin and Felix Rohrbach
- Subjects
- Soundness, Statement (computer science), Computer science, String (computer science), Prover, Mathematical proof, Transformation, Zero-knowledge proof
- Abstract
Non-interactive zero-knowledge proofs or arguments allow a prover to show validity of a statement without further interaction. For non-trivial statements such protocols require a setup assumption in form of a common random or reference string (CRS). Generally, the CRS can only be used for one statement (single-theorem zero-knowledge), so that a fresh CRS would need to be generated for each proof. Fortunately, Feige, Lapidot, and Shamir (FOCS 1990) presented a transformation for any non-interactive zero-knowledge proof system that allows the CRS to be reused any polynomial number of times (multi-theorem zero-knowledge). This FLS transformation, however, is only known to work for either computational zero-knowledge or requires a structured, non-uniform common reference string.
- Published
- 2021
27. Non-cryptographic Hashing
- Author
- Arno Mittelbach and Marc Fischlin
- Subjects
- Theoretical computer science, Computer science, Hash function, Cryptographic hash function, Cryptography, Data structure
- Abstract
There are notions of hash functions originating from the area of fast storage and retrieval in data structures which have also found many applications in cryptography and other areas of computer science.
- Published
- 2021
28. Iterated Hash Functions in Practice
- Author
- Arno Mittelbach and Marc Fischlin
- Subjects
- Theoretical computer science, Iterated hash functions, Computer science, Compression function, Hash function
- Abstract
In the previous chapters we discussed how hash functions can be constructed by iterating fixed-input-length primitives such as compression functions. We now take a brief detour away from the theory of hash functions and instead discuss a few of the most important hash functions used in practice.
- Published
- 2021
29. The Random Oracle Controversy
- Author
- Arno Mittelbach and Marc Fischlin
- Subjects
- Theoretical computer science, Computer science, Standard model (cryptography), Random oracle
- Abstract
Over the course of the previous chapters we have seen how random oracles allow for the creation of elegant and efficient schemes which can furthermore be proven secure in the random oracle model. In this chapter we have a closer look at what it means to have a security proof in the random oracle model rather than in the standard model.
- Published
- 2021
30. Information-Theoretic Security of Cryptographic Channels
- Author
- Marc Fischlin, Felix Günther, Philipp Muth, Weizhi Meng, Dieter Gollmann, Christian D. Jensen, and Jianying Zhou
- Subjects
- Computer science, Cryptography, Adversary, Computer security, Information-theoretic security, Confidentiality, Secure channel, Communication channel
- Abstract
We discuss the setting of information-theoretically secure channel protocols where confidentiality of transmitted data should hold against unbounded adversaries. We argue that there are two possible scenarios: One is that the adversary is currently bounded, but stores today's communication and tries to break confidentiality later when obtaining more computational power or time. We call channel protocols protecting against such attacks future-secure. The other scenario is that the adversary already has extremely strong computational powers and may try to use that power to break current executions. We call channels withstanding such stronger attacks unconditionally-secure. We discuss how to instantiate both future-secure and unconditionally-secure channels. To this end we first establish the corresponding confidentiality and integrity notions, then prove the well-known composition theorem to also hold in the information-theoretic setting: Chosen-plaintext security of the channel protocol, together with ciphertext integrity, implies the stronger chosen-ciphertext notion. We discuss how to build future-secure channel protocols by combining computational message authentication schemes like HMAC with one-time pad encryption. Chosen-ciphertext security follows easily from the generalized composition theorem. We also show that using one-time pad encryption with the unconditionally-secure Carter–Wegman MACs we obtain an unconditionally-secure channel protocol.
Information and Communications Security, 22nd International Conference, ICICS 2020, Copenhagen, Denmark, August 24–26, 2020, Proceedings, ISBN:978-3-030-61078-4, ISBN:978-3-030-61077-7
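The future-secure construction this abstract describes, one-time pad encryption composed with a computational MAC such as HMAC in encrypt-then-MAC order, can be sketched as follows. This is an illustrative sketch, not the paper's formalization: the function names and parameters are our own, and a real channel protocol would additionally bind sequence numbers and associated data to prevent reordering and replay.

```python
import hashlib
import hmac
import secrets

TAG_LEN = 32  # HMAC-SHA256 output length


def channel_send(message: bytes, otp_key: bytes, mac_key: bytes) -> bytes:
    """Encrypt-then-MAC: one-time pad for confidentiality, HMAC for integrity."""
    # The pad must be exactly as long as the message and must never be reused.
    assert len(otp_key) == len(message)
    ciphertext = bytes(m ^ k for m, k in zip(message, otp_key))
    tag = hmac.new(mac_key, ciphertext, hashlib.sha256).digest()
    return ciphertext + tag


def channel_receive(blob: bytes, otp_key: bytes, mac_key: bytes) -> bytes:
    """Verify the tag before decrypting; reject tampered ciphertexts."""
    ciphertext, tag = blob[:-TAG_LEN], blob[-TAG_LEN:]
    expected = hmac.new(mac_key, ciphertext, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("integrity check failed")
    return bytes(c ^ k for c, k in zip(ciphertext, otp_key))


# Example round trip; in practice both keys would be pre-shared, not sampled here.
msg = b"attack at dawn"
otp = secrets.token_bytes(len(msg))
mk = secrets.token_bytes(32)
blob = channel_send(msg, otp, mk)
assert channel_receive(blob, otp, mk) == msg
```

Swapping HMAC for an information-theoretically secure Carter–Wegman MAC (with a one-time MAC key) would correspond to the unconditionally-secure variant discussed in the abstract.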
- Published
- 2020
31. On the Memory Fault Resilience of TLS 1.3
- Author
- Robin Leander Schröder, Michael Yonli, Lukas Brandstetter, and Marc Fischlin
- Subjects
- Computer science, Memory faults, Computer security model, Resilience, Computer security, Fault, Signature, Key exchange
- Abstract
Recently, Aranha et al. (Eurocrypt 2020) as well as Fischlin and Günther (CT-RSA 2020) investigated the possibility to model memory fault attacks like Rowhammer in security games, and to deduce statements about the (in)security of schemes against such attacks. They looked into the fault resistance of signature and AEAD schemes. Here, we extend the approach to the TLS 1.3 key exchange protocol.
- Published
- 2020
32. Signatures from Sequential-OR Proofs
- Author
- Christian Janson, Patrick Harasser, and Marc Fischlin
- Subjects
- Security properties, Statement (computer science), Theoretical computer science, Computer science, Group (mathematics), Prover, Mathematical proof, Witness, Ring signature, Zero-knowledge proof
- Abstract
OR-proofs enable a prover to show that it knows the witness for one of many statements, or that one out of many statements is true. OR-proofs are a remarkably versatile tool, used to strengthen security properties, design group and ring signature schemes, and achieve tight security. The common technique to build OR-proofs is based on an approach introduced by Cramer, Damgård, and Schoenmakers (CRYPTO’94), where the prover splits the verifier’s challenge into random shares and computes proofs for each statement in parallel.
- Published
- 2020
33. Modeling Memory Faults in Signature and Authenticated Encryption Schemes
- Author
- Felix Günther, Marc Fischlin, and Stanislaw Jarecki
- Subjects
- Authenticated encryption, Computer science, Memory faults, Cryptography, Computer security model, Computer security, Software, Malware, Host machine, Randomness
- Abstract
Memory fault attacks, inducing errors in computations, have been an ever-evolving threat to cryptographic schemes since their discovery for cryptography by Boneh et al. (Eurocrypt 1997). Initially requiring physical tampering with hardware, the software-based Rowhammer attack put forward by Kim et al. (ISCA 2014) enabled fault attacks also through malicious software running on the same host machine. This led to concerning novel attack vectors, for example on deterministic signature schemes, whose approach to avoid dependency on (good) randomness renders them vulnerable to fault attacks. This has been demonstrated in realistic adversarial settings in a series of recent works. However, a unified formalism of different memory fault attacks, enabling also to argue the security of countermeasures, is still missing. In this work, we suggest a generic extension for existing security models that enables a game-based treatment of cryptographic fault resilience. Our modeling specifies exemplary memory fault attack types of different strength, ranging from random bit-flip faults to differential (Rowhammer-style) faults to full adversarial control over indicated memory variables. We apply our model first to deterministic signatures to revisit known fault attacks as well as to establish provable guarantees of fault resilience for proposed fault-attack countermeasures. In a second application to nonce-misuse-resistant authenticated encryption, we provide the first fault-attack treatment of the SIV mode of operation and give a provably secure fault-resilient variant.
Lecture Notes in Computer Science, 12006, ISSN:0302-9743, ISSN:1611-3349, Topics in Cryptology – CT-RSA 2020, ISBN:978-3-030-40185-6, ISBN:978-3-030-40186-3
- Published
- 2020
34. Security Reductions for White-Box Key-Storage in Mobile Payments
- Author
- Wil Michiels, Estuardo Alpirez Bock, Marc Fischlin, Christian Janson, and Chris Brzuska
- Subjects
- Computer science, Cryptography, Certification, Computer security, Payment, Mastercard, Mobile payment, Key derivation function, White-box cryptography, Implementation
- Abstract
The goal of white-box cryptography is to provide security even when the cryptographic implementation is executed in adversarially controlled environments. White-box implementations nowadays appear in commercial products such as mobile payment applications, e.g., those certified by Mastercard. Interestingly, there, white-box cryptography is championed as a tool for secure storage of payment tokens, and importantly, the white-boxed storage functionality is bound to a hardware functionality to prevent code-lifting attacks.
- Published
- 2020
35. Intercept-Resend Emulation Attacks against a Continuous-Variable Quantum Authentication Protocol with Physical Unclonable Keys
- Author
- Gernot Alber, Lukas Fladung, Georgios M. Nikolopoulos, and Marc Fischlin
- Subjects
- Physical unclonable functions, Physical unclonable keys, Emulation attack, Square-root measurement, Minimum-error discrimination, Unambiguous state discrimination, Dual homodyne detection, Continuous-variable quantum authentication, Authentication protocol, Coherent states, Robustness, Quantum physics
- Abstract
Optical physical unclonable keys are currently considered to be rather promising candidates for the development of entity authentication protocols, which offer security against both classical and quantum adversaries. In this work we investigate the robustness of a continuous-variable protocol, which relies on the scattering of coherent states of light from the key, against three different types of intercept-resend emulation attacks. The performance of the protocol is analysed for a broad range of physical parameters, and our results are compared to existing security bounds.
Comment: close to the version to be published in Cryptography (Special Issue “Quantum Cryptography and Cyber Security”)
- Published
- 2019
36. Breakdown Resilience of Key Exchange Protocols: NewHope, TLS 1.3, and Hybrids
- Author
- Jacqueline Brendel, Marc Fischlin, and Felix Günther
- Subjects
- Computer science, Hash function, Cryptography, Resilience, Computer security, Key exchange, Quantum computer
- Abstract
Broken cryptographic algorithms and hardness assumptions are a constant threat to real-world protocols. Prominent examples are hash functions for which collisions become known, or number-theoretic assumptions which are threatened by advances in quantum computing. Especially when it comes to key exchange protocols, the switch to quantum-resistant primitives has begun and aims to protect today’s secrets against future developments, moving from common Diffie–Hellman-based solutions to Learning-With-Errors-based approaches, often via intermediate hybrid designs.
- Published
- 2019
37. How to Sign with White-Boxed AES
- Author
- Marc Fischlin and Helene Haagh
- Subjects
- Stateless protocol, Computer science, Advanced Encryption Standard, Cryptographic protocol, Computer security, Signature, Message authentication code
- Abstract
We investigate the possibility of using obfuscated implementations of the Advanced Encryption Standard AES (“white-boxed AES”) to devise secure signature schemes. We show that the intuitive idea to use AES-based message authentication codes to sign, and the white-boxed implementation to verify, fails in general. This underlines that providing a secure white-box implementation is only the first step, and that using it securely as a component in cryptographic protocols may be harder than originally thought. We therefore provide secure signature schemes based on white-boxed AES and on random oracles, as well as stateful and stateless constructions without random oracles. All our solutions are shown to be secure for reasonable parameters.
- Published
- 2019
38. Hybrid Key Encapsulation Mechanisms and Authenticated Key Exchange
- Author
- Marc Fischlin, Jacqueline Brendel, Brian Goncalves, Nina Bindel, and Douglas Stebila
- Subjects
- Cryptographic primitive, Transport Layer Security, Computer science, Computer security, Authenticated key exchange, Public-key cryptography, Key encapsulation, Quantum, Key exchange, Quantum computer
- Abstract
Concerns about the impact of quantum computers on currently deployed public-key cryptography have instigated research into not only quantum-resistant cryptographic primitives but also how to transition applications from classical to quantum-resistant solutions. One approach to mitigate the risk of quantum attacks and to preserve common security guarantees is hybrid schemes, which combine classically secure and quantum-resistant schemes. Various academic and industry experiments and draft standards related to the Transport Layer Security (TLS) protocol already use some form of hybrid key exchange; however, sound theoretical approaches to substantiate the design and security of such hybrid key exchange protocols have so far been missing.
- Published
- 2019
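The hybrid idea described in the abstract above — deriving one session key from a classical and a quantum-resistant shared secret so that the result stays secure as long as either component is secure — can be sketched with a simple concatenation combiner. This is an illustrative sketch, not the paper's actual construction: the HKDF-style extract/expand structure and the function names are our own choices.

```python
import hashlib
import hmac
import os

def combine_shared_secrets(k_classical: bytes, k_pq: bytes, context: bytes) -> bytes:
    """Illustrative hybrid KEM combiner (not the paper's construction).

    Concatenates both shared secrets and extracts a single session key,
    binding in the protocol context. If either input secret is uniformly
    random and unknown to the adversary, the output key is pseudorandom
    (modelling HMAC-SHA256 as a suitable key-derivation function).
    """
    # Extract step (HKDF-style, RFC 5869): compress both secrets into a PRK.
    salt = b"\x00" * 32
    prk = hmac.new(salt, k_classical + k_pq, hashlib.sha256).digest()
    # Expand step: derive the session key, bound to the context/transcript.
    return hmac.new(prk, context + b"\x01", hashlib.sha256).digest()
```

Changing either input secret, or the bound context, yields an independent-looking session key.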
39. Information-Theoretically Secure Data Origin Authentication with Quantum and Classical Resources
- Author
-
Georgios M. Nikolopoulos and Marc Fischlin
- Subjects
Hash function, Cryptography, Data origin authentication, Message authentication code, Quantum physics, Authentication, Message authentication, Data integrity, Key (cryptography), Quantum cryptography - Abstract
In conventional cryptography, information-theoretically secure message authentication can be achieved by means of universal hash functions, and requires that the two legitimate users share a random secret key which is twice as long as the message. We address the question of whether quantum resources can offer any advantage over classical unconditionally secure message authentication codes. It is shown that passive prepare-and-measure quantum message-authentication schemes cannot do better than their classical counterparts. Subsequently, we present an interactive entanglement-assisted scheme which ideally allows for the authentication of classical messages with a classical key that is as long as the message.
- Published
- 2020
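The classical baseline referenced in the abstract above — information-theoretically secure authentication from universal hashing with a short one-time key — can be sketched as a Wegman-Carter-style polynomial MAC. A hedged illustration: the prime, the block encoding, and the function names are our own; the sketch authenticates a fixed number of message blocks, and the key (a, b) must be uniformly random and used for a single message only.

```python
# One-time MAC from an almost-universal polynomial hash over GF(P).
# Security is information-theoretic: a forger's success probability is
# bounded by roughly (number of blocks)/P, regardless of computing power.

P = 2**127 - 1  # Mersenne prime; arithmetic is over the field GF(P)

def poly_mac(key: tuple, blocks: list) -> int:
    """Tag a message (a list of blocks in [0, P)) under one-time key (a, b)."""
    a, b = key
    acc = 0
    for m in blocks:              # Horner evaluation of the message polynomial at a
        acc = (acc * a + m) % P
    return (acc * a + b) % P      # final blinding with the one-time value b

def poly_verify(key: tuple, blocks: list, tag: int) -> bool:
    return poly_mac(key, blocks) == tag
```

Note how the key (a, b) consists of two field elements regardless of message length, while the unconditional security bound degrades only linearly in the number of blocks.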
40. Unpicking PLAID: a cryptographic analysis of an ISO-standards-track authentication protocol
- Author
-
Felix Günther, Victoria Fehr, Tommaso Gagliardoni, Giorgia Azzurra Marson, Jean Paul Degabriele, Arno Mittelbach, Kenneth G. Paterson, and Marc Fischlin
- Subjects
Authentication, Standardization, Fingerprinting, Logical access control, Protocol analysis, Cryptography, Computer security, Authentication protocol, Smart card - Abstract
The Protocol for Lightweight Authentication of Identity (PLAID) aims at secure and private authentication between a smart card and a terminal. Originally developed by a unit of the Australian Department of Human Services for physical and logical access control, PLAID has now been standardized as Australian standard AS-5185-2010 and is currently in the fast-track standardization process for ISO/IEC 25185-1. We present a cryptographic evaluation of PLAID. As well as reporting a number of undesirable cryptographic features of the protocol, we show that the privacy properties of PLAID are significantly weaker than claimed: using a variety of techniques, we can fingerprint and then later identify cards. These techniques involve a novel application of standard statistical and data analysis techniques in cryptography. We discuss potential countermeasures to our attacks and comment on our experiences with the standardization process of PLAID.
- Published
- 2016
41. Self-Guarding Cryptographic Protocols against Algorithm Substitution Attacks
- Author
-
Sogol Mazaheri and Marc Fischlin
- Subjects
Initialization, Cryptographic protocol, Encryption, Public-key cryptography, Key exchange - Abstract
We put forward the notion of self-guarding cryptographic protocols as a countermeasure to algorithm substitution attacks. Such self-guarding protocols can prevent undesirable leakage by subverted algorithms if one has the guarantee that the system was working properly in an initialization phase. Unlike detection-based solutions, they thus proactively thwart attacks, and unlike reverse firewalls, they do not assume an online external party. We present constructions of basic primitives for (public-key and private-key) encryption and for signatures. We also argue that the model captures attacks with malicious hardware tokens, and show how to self-guard a PUF-based key exchange protocol.
- Published
- 2018
42. Simulatable Channels: Extended Security that is Universally Composable and Easier to Prove
- Author
-
Jean Paul Degabriele and Marc Fischlin
- Subjects
Cryptography, Mathematical proofs, Ciphertext, Ciphertext fragmentation, Universal composability, Secure channel, Communication channel - Abstract
Ever since the foundational work of Goldwasser and Micali, simulation has proven to be a powerful and versatile construct for formulating security in various areas of cryptography. However, security definitions based on simulation are generally harder to work with than game-based definitions, often resulting in more complicated proofs. In this work we challenge this viewpoint by proposing new simulation-based security definitions for secure channels that in many cases lead to simpler proofs of security. We are particularly interested in definitions of secure channels which reflect real-world requirements, such as protecting against the replay and reordering of ciphertexts, accounting for leakage from the decryption of invalid ciphertexts, and retaining security in the presence of ciphertext fragmentation. Furthermore, we show that our proposed notion of channel simulatability implies a secure channel functionality that is universally composable. To the best of our knowledge, we are the first to study universally composable secure channels supporting these extended security goals. We conclude by showing that the Dropbear implementation of SSH-CTR is channel-simulatable in the presence of ciphertext fragmentation, and therefore also realises a universally composable secure channel. This is intended, in part, to highlight the merits of our approach over prior ones in admitting simpler security proofs in comparable settings.
- Published
- 2018
43. Invisible Sanitizable Signatures and Public-Key Encryption are Equivalent
- Author
-
Marc Fischlin and Patrick Harasser
- Subjects
One-way function, Encryption, Public-key cryptography, Adaptive chosen-ciphertext attack, Digital signature, Signature scheme, Equivalence - Abstract
Sanitizable signature schemes are signature schemes which support the delegation of modification rights. The signer can allow a sanitizer to perform a set of admissible operations on the original message and then to update the signature, in such a way that basic security properties like unforgeability or accountability are preserved. Recently, Camenisch et al. (PKC 2017) devised new schemes with the previously unattained invisibility property. This property says that the set of admissible operations for the sanitizer remains hidden from outsiders. Subsequently, Beck et al. (ACISP 2017) gave an even stronger version of this notion and constructions achieving it. Here we characterize the invisibility property in both forms by showing that invisible sanitizable signatures are equivalent to IND-CPA-secure encryption schemes, and strongly invisible signatures are equivalent to IND-CCA2-secure encryption schemes. The equivalence is established by proving that invisible (resp. strongly invisible) sanitizable signature schemes yield IND-CPA-secure (resp. IND-CCA2-secure) public-key encryption schemes and that, vice versa, we can build (strongly) invisible sanitizable signatures given a corresponding public-key encryption scheme.
- Published
- 2018
44. Replay Attacks on Zero Round-Trip Time: The Case of the TLS 1.3 Handshake Candidates
- Author
-
Felix Günther and Marc Fischlin
- Subjects
Handshake, QUIC, Cryptography, Computer security model, Server, Replay attack, Key exchange - Abstract
We investigate the security of key exchange protocols supporting so-called zero round-trip time (0-RTT), enabling a client to establish a fresh provisional key without interaction, based only on cryptographic material obtained in previous connections. This key can then already be used to protect early application data, transmitted to the server before both parties interact further to switch to fully secure keys. Two recent prominent examples supporting such 0-RTT modes are Google's QUIC protocol and the latest drafts of the upcoming TLS version 1.3. We are especially interested in the question of how replay attacks, enabled through the lack of contribution from the server, affect security in the 0-RTT case. Whereas the first proposal of QUIC uses state on the server side to thwart such attacks, the latest version of QUIC and TLS 1.3 rather accept them as inevitable. We analyze what this means for the key secrecy of both the pre-shared-key-based 0-RTT handshake in draft-14 of TLS 1.3 and the Diffie-Hellman-based 0-RTT handshake in TLS 1.3 draft-12. As part of this we extend previous security models to capture such cases, also shedding light on the limitations and options for 0-RTT security under replay attacks.
- Published
- 2017
45. Zero Round-Trip Time for the Extended Access Control Protocol
- Author
-
Jacqueline Brendel and Marc Fischlin
- Subjects
Cryptography, Access control, Computer security model, Extended Access Control, Key (cryptography), Smart card - Abstract
The Extended Access Control (EAC) protocol allows a client and a server to establish a shared cryptographic key. While originally used in the context of identity card systems and machine-readable travel documents, the EAC protocol is increasingly adopted as a universal solution to secure transactions or for attribute-based access control with smart cards. Here we discuss how to enhance the EAC protocol with a so-called zero round-trip time (0RTT) mode. Through this mode the client can, without further interaction, immediately derive a new key from cryptographic material exchanged in previous executions. This makes the 0RTT mode attractive from an efficiency viewpoint, and the upcoming TLS 1.3 standard, for instance, will include its own 0RTT mode. Here we show that the EAC protocol, too, can be augmented to support a 0RTT mode. Our proposed EAC+0RTT protocol is compliant with the basic EAC protocol and adds the 0RTT mode smoothly on top. We also prove the security of our proposal in the common security model of Bellare and Rogaway in the multi-stage setting.
- Published
- 2017
46. PRF-ODH: Relations, Instantiations, and Impossibility Results
- Author
-
Felix Günther, Christian Janson, Jacqueline Brendel, and Marc Fischlin
- Subjects
Theoretical computer science, Cryptography, Random oracle, Extended Access Control, Impossibility results, Key exchange, Standard model (cryptography) - Abstract
The pseudorandom-function oracle-Diffie–Hellman (PRF-ODH) assumption has recently been introduced to analyze a variety of DH-based key exchange protocols, including TLS 1.2 and the TLS 1.3 candidates, as well as the extended access control (EAC) protocol. Remarkably, the assumption comes in different flavors in these settings, and none of them has been scrutinized comprehensively yet. In this paper we therefore present a systematic study of the different PRF-ODH variants in the literature. In particular, we analyze their strengths relative to each other, carving out that the variants form a hierarchy. We further investigate the boundaries between instantiating the assumptions in the standard model and the random oracle model. While we show that even the strongest variant is achievable in the random oracle model under the strong Diffie–Hellman assumption, we provide a negative result showing that it is implausible to instantiate even the weaker variants in the standard model via algebraic black-box reductions to common cryptographic problems.
- Published
- 2017
47. Abstreitbarkeit bei eID-Lösungen
- Author
-
Marc Fischlin
- Subjects
Philosophy, Humanities - Abstract
Electronic identity (eID) schemes are intended to protect personal data and to prevent misuse in Internet traffic. Protecting the users' privacy is an important aspect in this regard. Here we specifically consider so-called deniability ("Abstreitbarkeit") as one part of the privacy aspects of eID solutions.
- Published
- 2014
48. Robust Multi-Property Combiners for Hash Functions
- Author
-
Krzysztof Pietrzak, Anja Lehmann, and Marc Fischlin
- Subjects
Hash function, SWIFFT, MDC-2, Collision resistance, SHA-2, Hash chain, Cryptographic hash function, Double hashing - Abstract
A robust combiner for hash functions takes two candidate implementations and constructs a hash function which is secure as long as at least one of the candidates is secure. So far, hash function combiners only aim at preserving a single property such as collision resistance or pseudorandomness. However, when hash functions are used in protocols like TLS they are often required to provide several properties simultaneously. We therefore put forward the notion of robust multi-property combiners and elaborate on different definitions for such combiners. We then propose a combiner that provably preserves (target) collision resistance, pseudorandomness, and being a secure message authentication code. This combiner satisfies the strongest notion we propose, which requires that the combined function satisfies every security property that is satisfied by at least one of the underlying hash functions. If the underlying hash functions have output length n, the combiner has output length 2n. This basically matches a known lower bound for black-box combiners for collision resistance only, so the other properties can be achieved without penalizing the length of the hash values. We then propose a combiner which also preserves the property of being indifferentiable from a random oracle, slightly increasing the output length to 2n + ω(log n). Moreover, we show how to augment our constructions in order to make them also robust for the one-wayness property, although in this case we require an a priori upper bound on the input length.
- Published
- 2013
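The simplest combiner underlying the output-length discussion in the abstract above is the classical concatenation combiner, which preserves collision resistance: a collision for the combined function is simultaneously a collision for both components. A minimal sketch — using SHA-256 and SHA3-256 merely as stand-ins for the two candidate functions, and illustrating only the collision-resistance-preserving part, not the full multi-property construction of the paper:

```python
import hashlib

def concat_combiner(data: bytes) -> bytes:
    """Concatenation combiner C(x) = H1(x) || H2(x).

    Collision-resistant as long as at least one of H1, H2 is
    collision-resistant: any pair x != x' with C(x) == C(x') collides
    under both component functions. For n-byte component outputs the
    combined output is 2n bytes, reflecting the known lower bound for
    black-box collision-resistance combiners.
    """
    return hashlib.sha256(data).digest() + hashlib.sha3_256(data).digest()
```

The price of robustness here is purely the doubled output length; each component is evaluated once per call.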
49. Less is more: relaxed yet composable security notions for key exchange
- Author
-
Nigel P. Smart, Bogdan Warinschi, Christina Brzuska, Stephen C. Williams, and Marc Fischlin
- Subjects
Key-agreement protocol, Key distribution, Cryptography, Computer security model, Composability, Universal composability, Key management, Key exchange - Abstract
Although they do not suffer from clear attacks, various key agreement protocols (for example, the one used within the TLS protocol) are deemed insecure by existing security models for key exchange. The reason is that the derived keys are used within the key exchange step, violating the usual key-indistinguishability requirement. In this paper, we propose a new security definition for key exchange protocols that offers two important benefits. Our notion is weaker than the more established ones and thus allows the analysis of a larger class of protocols. Furthermore, security in the sense that we define enjoys rather general composability properties. In addition, our composability properties are derived within game-based formalisms and do not appeal to any simulation-based paradigm. Specifically, we show that protocols whose security relies exclusively on some underlying symmetric primitive can be securely composed with key exchange protocols provided that two main requirements hold: (1) no adversary can break the underlying primitive, even when the primitive uses keys obtained from executions of the key exchange protocol in the presence of the adversary (this is essentially the security requirement that we introduce and formalize in this paper), and (2) the security of the protocol can be reduced to that of the primitive, no matter how the keys for the primitive are distributed. Proving that the two conditions are satisfied, and then applying our generic theorem, should be simpler than performing a monolithic analysis of the composed protocol. We exemplify our results in the case of a profile of the TLS protocol.
- Published
- 2013
50. Key Confirmation in Key Exchange: A Formal Treatment and Implications for TLS 1.3
- Author
-
Bogdan Warinschi, Benedikt Schmidt, Marc Fischlin, and Felix Günther
- Subjects
Computer science - Abstract
- 2016