QUANTUM COMPUTING By Sandeep Neeli.


What is a Quantum Computer? A quantum computer is a device for computation that makes direct use of quantum mechanical phenomena, such as superposition and entanglement, to perform operations on data. A theoretical model is the quantum Turing machine, also known as the universal quantum computer. 

Past and Present In 1965, Intel co-founder Gordon Moore observed that the number of transistors on a chip, and with it the speed of computer chips, was doubling about every 18 months. If technology continued to follow Moore's Law, the shrinking circuitry packed into silicon chips would eventually reach a point where individual elements were no larger than a few atoms. At that point a problem arises: at the atomic scale, the physical laws that govern the behavior and properties of the circuit are inherently quantum in nature, not classical. The limits of classical computers and their computations gave rise to the idea of computers based on quantum mechanics. 1970s and 1980s: Theorists proposed the idea of quantum computers. 1985: David Deutsch of Oxford University wrote a paper on quantum computers that went largely unnoticed. No one doubted that a quantum computer could work in principle, but there seemed to be no point to such a difficult and expensive undertaking.

1994: A computer scientist at AT&T Bell Labs, Peter Shor, suggested that the strange, almost spooky way a quantum computer could go about its business made it the perfect code-breaking machine. 1996: First quantum computer by IBM's Isaac Chuang. This computer and others to follow looked more like chemistry experiments than computers, but then, they are! 2001: First working 7-qubit NMR computer demonstrated at IBM's Almaden Research Center, along with the first execution of Shor's algorithm: the number 15 was factored using 10^18 identical molecules, each containing 7 atoms. 2009: Researchers at Yale University created the first rudimentary solid-state quantum processor. The two-qubit superconducting chip was able to run elementary algorithms. Each of the two artificial atoms (qubits) was made up of a billion aluminum atoms, but they acted like a single atom that could occupy two different energy states. 2011: D-Wave Systems announced the first commercial quantum annealer on the market, the D-Wave One. The company claims this system uses a 128-qubit processor chipset.

Classical vs. Quantum A classical computer has a memory made up of bits, where each bit represents either a one or a zero. A quantum computer maintains a sequence of qubits. A single qubit can represent a one, a zero, or, crucially, any quantum superposition of these. Put another way, a traditional memory register with eight bits can store only one of 256 possible digital "words", but a quantum register with eight qubits can represent all 256 words at once. In traditional computer architecture, the logic operations AND, OR, and NOT are embodied in the electrical circuits. To manipulate qubits, quantum circuits instead use techniques such as nuclear magnetic resonance (NMR) or laser pulses to realize these operations.
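The contrast above can be made concrete with a small classical simulation (this is only bookkeeping with NumPy on an ordinary computer, not real quantum hardware): an 8-qubit register is described by 2^8 = 256 complex amplitudes, one per basis "word", and a uniform superposition spreads weight over all of them at once.

```python
import numpy as np

# A classical 8-bit register holds exactly one of 256 values at a time.
classical_register = 0b01100101

# An 8-qubit register is described by 2**8 = 256 complex amplitudes.
# A uniform superposition assigns equal weight to every basis state.
n_qubits = 8
dim = 2 ** n_qubits
state = np.full(dim, 1 / np.sqrt(dim), dtype=complex)

# Squared magnitudes are measurement probabilities and must sum to 1.
probabilities = np.abs(state) ** 2
print(dim)                             # 256
print(round(probabilities.sum(), 10))  # 1.0
```

Note the cost of this classical bookkeeping: the state vector doubles with every added qubit, which is exactly why simulating many qubits classically is infeasible.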

 In general, a quantum computer with n qubits can be in an arbitrary superposition of up to 2^n different states simultaneously (compare this with a normal computer, which can only be in one of these 2^n states at any one time).

Shor’s Algorithm Shor’s Algorithm is one of the best-known quantum algorithms, developed in 1994 and named after mathematician Peter Shor. It is used for integer factorization: given an integer N, it finds its prime factors. On a quantum computer, Shor's algorithm factors an integer N in polynomial time. Specifically, it takes time O((log N)^3), demonstrating that the integer factorization problem can be efficiently solved on a quantum computer. Given a quantum computer with a sufficient number of qubits, Shor's algorithm can be used to break public-key cryptography schemes such as the widely used RSA scheme. http://www.doc.ic.ac.uk/~nd/surprise_97/journal/vol4/spb3/

How likely is this to happen in the near future? With classical computers gradually approaching their limits, the quantum computer promises to deliver a new level of computational power. With it comes a whole new theory of computation, one that incorporates the strange effects of quantum mechanics and considers every physical object to be some kind of quantum computer. This power can only be unleashed with the correct type of algorithm, and such algorithms are extremely difficult to formulate. Some algorithms have already been invented, and they are proving to have huge implications for the world of cryptography, because they would enable the most commonly used cryptographic techniques to be broken in a matter of seconds. http://www.bbc.co.uk/news/science-environment-12811199

Quantum Computing vs. Cryptography We all use cryptography every day, and most of us do so without knowing it. If we rely on it so much, is there any danger that the security of current cryptosystems could be compromised? Although still many years away, such a threat does exist. Current cryptographic techniques are all based on advanced mathematics, but the cryptography of the future is likely to pass into the realm of the physicists. Physicists have come up with the theoretical notion of a quantum computer that could break virtually all known cryptographic algorithms. It turns out that most of the mathematical problems at the heart of our current cryptographic systems are perfectly suited to being tackled by quantum computers.

The first, Shor's algorithm, can be adapted to crack virtually all existing public-key cryptographic algorithms that are considered secure today. Shor's algorithm would allow a quantum computer to solve factoring problems (underlying the RSA algorithm), the discrete logarithm problem (underlying the El Gamal algorithm), and even certain versions of the elliptic curve discrete logarithm problem (used in elliptic curve cryptography). A second algorithm, known as Grover's algorithm, provides a way for a quantum computer to search an unsorted list quadratically faster than any classical method, a speedup that could be used to accelerate brute-force attacks on symmetric ciphers.
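Grover's algorithm can be simulated classically on a small state vector (again, just NumPy bookkeeping, not real hardware). Each iteration applies an oracle that flips the sign of the marked item's amplitude, then "inverts about the mean", steadily concentrating probability on the marked item. After about (π/4)·√N iterations a measurement finds the item with near certainty, versus N/2 classical lookups on average.

```python
import numpy as np

def grover_search(n_qubits, marked):
    """Simulate Grover's algorithm: amplify the amplitude of one
    marked item among N = 2**n_qubits unsorted entries."""
    N = 2 ** n_qubits
    state = np.full(N, 1 / np.sqrt(N))        # uniform superposition
    iterations = int(round(np.pi / 4 * np.sqrt(N)))
    for _ in range(iterations):
        state[marked] *= -1                   # oracle: flip marked sign
        state = 2 * state.mean() - state      # diffusion: invert about mean
    return int(np.argmax(state ** 2))         # most probable outcome

print(grover_search(8, marked=200))           # 200, after ~13 oracle calls
```

For N = 256 the simulation uses only about 13 oracle calls, and the measurement probability of the marked item exceeds 98%; this quadratic speedup is why Grover's algorithm effectively halves the key length of a symmetric cipher rather than breaking it outright.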

Conclusion Much effort is being expended on building a working quantum computer, but it is likely to be many years before one becomes a reality. Although the advent of quantum computers would be the nail in the coffin for many of the cryptographic algorithms in use today, other algorithms and technologies are ready to take their place. I believe we can look forward to an age of quantum computers rather than needing to fear that they will make our computing insecure.

References
http://en.wikipedia.org/wiki/Quantum_computer
http://www.bbc.co.uk/news/science-environment-12811199
http://computer.howstuffworks.com/quantum-computer1.htm
http://alumni.imsa.edu/~matth/quant/299/paper/node21.html