When executing cryptographic protocols, we usually assume we know how our devices operate, and the success of the protocol relies on this. However, ensuring that devices really operate as intended is far from easy and devices that behave badly may be exploitable by an adversary. This is a particular problem for the task of generating random numbers using quantum mechanics.
In this tutorial, I will discuss protocols that can certify randomness generation based only on the input-output behavior of any devices used, without needing to model how they produce their outputs (other than that they obey the laws of physics). I will explain the model in detail before discussing the ideas that go into security proofs.
Zachary Dutton, Raytheon BBN
Michele Mosca, University of Waterloo and EvolutionQ
Andreas Poppe, Huawei Technologies Duesseldorf
Gregoire Ribordy, ID Quantique
Tuesday, 1:40 p.m. – Video
Tutorial: “Challenges to Physical Security of Today’s Quantum Technologies”
Vadim Makarov, University of Waterloo
Wednesday, 9 a.m. – Video
I will discuss security threats at the optical implementation layer of quantum communications. Examples of side-channel attacks, countermeasures, and testing the quality of countermeasures will be given.
At our present level of technology, the security-critical part of a quantum communication system is essentially an analog optoelectronic system connected to the optical channel, and is easily accessible by
an adversary. Today’s implementations sport a surprisingly rich set of imperfections and vulnerabilities, presenting challenges to standardization efforts. From a historical perspective this is not surprising: today’s quantum technology is about as rudimentary as electronic communication and computing were 70 years ago. History also hints that the technology will improve.
“From the First Loophole-Free Bell Test to a Quantum Internet”
Ronald Hanson, Delft University of Technology
The realization of a highly connected network of qubit registers is a central challenge for quantum information processing and long-distance quantum communication. Diamond spins associated with NV centers are promising building blocks for such a network, as they combine a coherent optical interface (similar to that of trapped atomic qubits) with a local register of robust and well-controlled nuclear spin qubits.
Here we present our latest progress towards scalable quantum networks, which includes the first loophole-free violation of Bell’s inequalities [3,4] and the realization of a robust quantum network memory with nuclear spin qubits using decoherence-protected subspaces.
[1] W. Pfaff et al., Science 345, 532 (2014).
[2] J. Cramer et al., Nature Commun. 7, 11526 (2016).
[3] B. Hensen et al., Nature 526, 682 (2015).
[4] B. Hensen et al., Scientific Reports (in press); see also arXiv:1603.05705.
[5] A. Reiserer et al., Phys. Rev. X 6, 021040 (2016).
“A Strong Loophole-Free Test of Local Realism and Applications to Randomness”
Krister Shalm, National Institute of Standards and Technology (NIST)
Eighty-one years ago, Einstein, Podolsky, and Rosen published a paper with the aim of showing that the wave function in quantum mechanics does not provide a complete description of reality. The Gedankenexperiment showed that quantum theory, as interpreted by Niels Bohr, leads to situations where distant particles, each with their own “elements of reality,” could instantaneously affect one another.
Such action at a distance seemingly conflicts with relativity. The hope was that a local theory of quantum mechanics could be developed where individual particles are governed by elements of reality, even if these elements are hidden from us. This concept is known as local realism.
In 1964, John Bell, continuing Einstein’s line of investigation, showed that the predictions of quantum mechanics are fundamentally incompatible with any local realistic theory. Bell’s theorem has profoundly shaped our modern understanding of quantum mechanics, and lies at the heart of quantum information theory. However, all experimental tests of Bell’s theorem have had to make assumptions that lead to loopholes.
This past year, a loophole-free violation of Bell’s 1964 inequalities, a ‘holy grail’ in the study of the foundations of quantum mechanics for half a century, was finally achieved by three different groups.
Here, we present the loophole-free Bell experiment carried out at the National Institute of Standards and Technology (NIST) that requires the minimal set of assumptions possible. We obtain a statistically significant violation of Bell’s inequality using photons that are space-like separated, and therefore forbidden by relativity from communicating. We find that local realism is not compatible with our experimental results. Specifically, we use rigorous statistical methods to reject the null hypothesis that nature obeys local realism with a p-value on the order of 10^-9.
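The space-like separation condition amounts to simple arithmetic: the light travel time between the two measurement stations must exceed the time each station needs to choose a setting and complete its measurement. A minimal sketch (the distance and timing below are illustrative placeholders, not the actual NIST parameters):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def spacelike_separated(distance_m, event_duration_s):
    """True if two measurement events separated by `distance_m`,
    each completed within `event_duration_s`, cannot influence one
    another by any signal traveling at or below light speed."""
    return distance_m / C > event_duration_s

# Illustrative numbers only: stations 185 m apart, measurements
# completed within 500 ns. Light needs ~617 ns to cross 185 m,
# so the events are space-like separated.
spacelike_separated(185.0, 500e-9)
```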
Besides testing local realism, a loophole-free Bell test can be used in a device-independent configuration to extract randomness. I’ll also briefly discuss our work at NIST using our loophole-free Bell test setup to extract randomness, as well as our plans to incorporate this source into the NIST randomness beacon.
Local realism is the worldview in which physical properties of objects exist independently of measurement and where physical influences cannot travel faster than the speed of light. Bell’s theorem states that this worldview is incompatible with the predictions of quantum mechanics, as is expressed in Bell’s inequalities. Previous experiments convincingly supported the quantum predictions. Yet, every experiment requires assumptions that provide loopholes for a local realist explanation.
Here, we report a Bell test that closes the most significant of these loopholes simultaneously. Using a well-optimized source of entangled photons, rapid setting generation, and highly efficient superconducting detectors, we observe a violation of a Bell inequality with high statistical significance.
“Event-Ready Loophole Free Bell Test Using Heralded Atom-Atom Entanglement”
Harald Weinfurter, LMU Munich
Atom-photon entanglement together with entanglement swapping enables an event-ready Bell experiment closing the detection as well as the locality loophole.
Atomic states offer clear advantages for Bell experiments due to their high detection efficiency. In our experiment, two entangled atom-photon pairs, separated by 400 m line of sight, are combined using entanglement swapping. Despite the comparatively low collection efficiency of the photons, we obtain about 1-2 entangled atom pairs per minute. The Bell-state measurement of the entanglement swapping protocol, implemented to detect both |Ψ+⟩ and |Ψ−⟩, heralds each measurement run. It serves as a signal for the observers to start their measurements and to report their respective results for every run. In this case, limited detection efficiency is no longer an issue and enters only into the noise of the experiment. Thus the well-known Clauser-Horne-Shimony-Holt (CHSH) Bell inequality can be used to obtain high significance with a modest number of events.
Space-like separated observation of the states of the two atoms is warranted by a state-dependent ionisation scheme that detects the fragments with efficiency above 95 percent in less than 800 ns. The random numbers for the setting choices are generated by sampling a telegraph signal and, without any post-processing, exhibit no bias. No correlations are observable for times longer than 100 ns, which altogether makes the two observers truly independent of each other.
In a measurement with a predefined number of 5000 events, an overall CHSH S-parameter of S = 2.35 ± 0.047 was obtained, with a p-value of p = 6.73 × 10^-7, indicating significant disagreement with local hidden variable (LHV) theories. The question arises whether this experiment can be improved toward device-independent key distribution.
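As a rough illustration of how a measured S-parameter relates to a significance figure, one can sketch a naive one-sided Gaussian test against the local-realist bound S ≤ 2. This is only a back-of-the-envelope approximation; the published p-value comes from a more rigorous analysis that avoids Gaussian assumptions, so the numbers will not coincide:

```python
from math import erfc, sqrt

def chsh_p_value(s_measured, s_err, s_bound=2.0):
    """Naive one-sided Gaussian p-value for exceeding the
    local-realist CHSH bound S <= 2, given a measured S and its
    standard error. Illustrative only: rigorous Bell-test analyses
    use assumption-free (e.g. martingale-based) statistics."""
    z = (s_measured - s_bound) / s_err
    return 0.5 * erfc(z / sqrt(2.0))

# With S = 2.35 and an error of 0.047, the bound is exceeded by
# roughly 7.4 standard deviations, so the Gaussian p-value is tiny.
chsh_p_value(2.35, 0.047)
```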
Tutorial: “Lattices, Rings, and Cryptography: Theory and Practice”
Chris Peikert, University of Michigan
Friday, 9 a.m. – Video
Point lattices provide one of the most attractive potential foundations for post-quantum cryptography, i.e., classical systems that are secure against quantum attacks. In addition to powerful objects like fully homomorphic encryption, lattices yield solutions to “everyday” tasks like key exchange and digital signatures. In order to be efficient enough for practical use, such systems typically need to use “algebraically structured” lattices defined over certain polynomial rings.
This tutorial will survey the state of the art in lattice- and ring-based cryptography, with a particular focus on theoretical foundations like the (Ring-)SIS/LWE problems and their “worst-case hardness” theorems, classical and quantum cryptanalysis, recent practical implementations, and important open questions and research directions.
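To give a flavor of the LWE problem that such systems build on, here is a deliberately toy-sized (and completely insecure) sketch of Regev-style single-bit encryption from LWE. The parameters and error distribution are chosen only so the arithmetic is easy to follow; real schemes use far larger dimensions and careful (e.g. discrete Gaussian) error sampling:

```python
import random

# Toy LWE parameters: modulus q, secret dimension n, sample count m.
q, n, m = 97, 8, 16

def keygen():
    """Public key: (A, b = A*s + e mod q) with small errors e.
    Secret key: s. Recovering s from (A, b) is the LWE problem."""
    s = [random.randrange(q) for _ in range(n)]
    A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
    e = [random.choice([-1, 0, 1]) for _ in range(m)]
    b = [(sum(a * si for a, si in zip(row, s)) + ei) % q
         for row, ei in zip(A, e)]
    return (A, b), s

def encrypt(pk, bit):
    """Sum a random subset of LWE samples; shift v by q/2 for bit 1."""
    A, b = pk
    subset = [i for i in range(m) if random.random() < 0.5]
    u = [sum(A[i][j] for i in subset) % q for j in range(n)]
    v = (sum(b[i] for i in subset) + bit * (q // 2)) % q
    return u, v

def decrypt(sk, ct):
    """Strip <u, s>; the residue is near 0 for bit 0, near q/2 for 1."""
    u, v = ct
    d = (v - sum(uj * sj for uj, sj in zip(u, sk))) % q
    return 1 if q // 4 < d < 3 * q // 4 else 0
```

With these parameters the accumulated error is at most 16 in absolute value, well below q/4, so decryption always recovers the encrypted bit; the security of real schemes rests on the hardness of distinguishing (A, b) from uniform.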