Author: S. Barry Cooper
Release Date: 2013-03-18
In this 2013 winner of the prestigious R.R. Hawkins Award from the Association of American Publishers, as well as the 2013 PROSE Awards for Mathematics and Best in Physical Sciences & Mathematics, also from the AAP, readers will find many of the most significant contributions from the four-volume set of the Collected Works of A. M. Turing. These contributions, together with commentaries from current experts in a wide spectrum of fields and backgrounds, provide insight into the significance and contemporary impact of Alan Turing's work. Offering a more modern perspective than anything currently available, Alan Turing: His Work and Impact gives wide coverage of the many ways in which Turing's scientific endeavors have shaped current research and our understanding of the world. His pivotal writings on subjects including computing, artificial intelligence, cryptography, and morphogenesis remain relevant and continue to offer insight into today's scientific and technological landscape. This collection is a great service to researchers, but it is also an approachable entry point for readers who have limited training in the science yet want to learn more about the details of Turing's work.
- 2013 winner of the prestigious R.R. Hawkins Award from the Association of American Publishers, as well as the 2013 PROSE Awards for Mathematics and Best in Physical Sciences & Mathematics, also from the AAP
- Named a 2013 Notable Computer Book in Computing Milieux by Computing Reviews
- Affordable, key collection of the most significant papers by A. M. Turing
- Commentary by preeminent leaders in the field explaining the significance of each seminal paper
- Additional resources available online
Author: Alan Mathison Turing
Publisher: Oxford University Press
Release Date: 2004-09-09
"Lectures, scientific papers, top-secret wartime material, correspondence, and broadcasts are introduced and set in context by Jack Copeland, Director of the Turing Archive for the History of Computing."--Jacket.
Documents the innovations of a group of eccentric geniuses who, in the mid-20th century, developed computer code building on mathematician Alan Turing's idea of a theoretical universal machine, exploring how their work led to such developments as digital television, modern genetics, and the hydrogen bomb.
Author: Martin Davis
Publisher: CRC Press
Release Date: 2018-02-28
The breathtakingly rapid pace of change in computing makes it easy to overlook the pioneers who began it all. Written by Martin Davis, respected logician and researcher in the theory of computation, The Universal Computer: The Road from Leibniz to Turing explores the fascinating lives, ideas, and discoveries of seven remarkable mathematicians. It tells the stories of the unsung heroes of the computer age – the logicians.
English mathematician and scientist Alan Turing (1912–1954) is credited with many of the foundational principles of contemporary computer science. The Imitation Game presents a historically accurate graphic novel biography of Turing’s life, including his groundbreaking work on the fundamentals of cryptography and artificial intelligence. His code-breaking efforts led to the cracking of the German Enigma during World War II, work that saved countless lives and accelerated the Allied defeat of the Nazis. While Turing’s achievements remain relevant decades after his death, the story of his life in post-war Europe continues to fascinate audiences today. Award-winning duo Jim Ottaviani (the #1 New York Times bestselling author of Feynman and Primates) and artist Leland Purvis (an Eisner and Ignatz Award nominee and occasional reviewer for the Comics Journal) present a factually detailed account of Turing’s life and groundbreaking research—as an unconventional genius who was arrested, tried, convicted, and punished for being openly gay, and whose innovative work still fuels the computing and communication systems that define our modern world. Computer science buffs, comics fans, and history aficionados will be captivated by this riveting and tragic story of one of the 20th century’s most unsung heroes.
Author: Andrew Hodges
Publisher: Princeton University Press
Release Date: 2014-11-10
Genre: Biography & Autobiography
A NEW YORK TIMES BESTSELLER The official book behind the Academy Award-winning film The Imitation Game, starring Benedict Cumberbatch and Keira Knightley It is only a slight exaggeration to say that the British mathematician Alan Turing (1912-1954) saved the Allies from the Nazis, invented the computer and artificial intelligence, and anticipated gay liberation by decades--all before his suicide at age forty-one. This New York Times–bestselling biography of the founder of computer science, with a new preface by the author that addresses Turing's royal pardon in 2013, is the definitive account of an extraordinary mind and life. Capturing both the inner and outer drama of Turing’s life, Andrew Hodges tells how Turing’s revolutionary idea of 1936--the concept of a universal machine--laid the foundation for the modern computer and how Turing brought the idea to practical realization in 1945 with his electronic design. The book also tells how this work was directly related to Turing’s leading role in breaking the German Enigma ciphers during World War II, a scientific triumph that was critical to Allied victory in the Atlantic. At the same time, this is the tragic account of a man who, despite his wartime service, was eventually arrested, stripped of his security clearance, and forced to undergo a humiliating treatment program--all for trying to live honestly in a society that defined homosexuality as a crime. The inspiration for a major motion picture starring Benedict Cumberbatch and Keira Knightley, Alan Turing: The Enigma is a gripping story of mathematics, computers, cryptography, and homosexual persecution.
Author: Chris Bernhardt
Publisher: MIT Press
Release Date: 2016-05-13
Genre: Biography & Autobiography
In 1936, when he was just twenty-four years old, Alan Turing wrote a remarkable paper in which he outlined the theory of computation, laying out the ideas that underlie all modern computers. This groundbreaking and powerful theory now forms the basis of computer science. In Turing's Vision, Chris Bernhardt explains the theory, Turing's most important contribution, for the general reader. Bernhardt argues that the strength of Turing's theory is its simplicity, and that, explained in a straightforward manner, it is eminently understandable by the nonspecialist. As Marvin Minsky writes, "The sheer simplicity of the theory's foundation and extraordinarily short path from this foundation to its logical and surprising conclusions give the theory a mathematical beauty that alone guarantees it a permanent place in computer theory." Bernhardt begins with the foundation and systematically builds to the surprising conclusions. He also views Turing's theory in the context of mathematical history, other views of computation (including those of Alonzo Church), Turing's later work, and the birth of the modern computer. In the paper, "On Computable Numbers, with an Application to the Entscheidungsproblem," Turing thinks carefully about how humans perform computation, breaking it down into a sequence of steps, and then constructs theoretical machines capable of performing each step. Turing wanted to show that there were problems that were beyond any computer's ability to solve; in particular, he wanted to find a decision problem that he could prove was undecidable. To explain Turing's ideas, Bernhardt examines three well-known decision problems to explore the concept of undecidability; investigates theoretical computing machines, including Turing machines; explains universal machines; and proves that certain problems are undecidable, including Turing's problem concerning computable numbers.
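The blurb's description of Turing's machines (a head reading and writing symbols on a tape, driven by a finite table of rules) is easy to make concrete. Below is a minimal, self-contained sketch, not taken from Bernhardt's book; the rule-table format and the `run_turing_machine` helper are illustrative assumptions.

```python
from collections import defaultdict

def run_turing_machine(transitions, tape, state="start", blank="_", max_steps=10_000):
    """Simulate a single-tape Turing machine.

    `transitions` maps (state, symbol) -> (new_state, symbol_to_write, move),
    with move being -1 (left) or +1 (right). The machine halts when no rule
    applies to the current (state, symbol) pair.
    """
    cells = defaultdict(lambda: blank, enumerate(tape))  # tape as a sparse dict
    head = 0
    for _ in range(max_steps):
        key = (state, cells[head])
        if key not in transitions:
            break  # halt: no applicable rule
        state, cells[head], move = transitions[key]
        head += move
    lo, hi = min(cells), max(cells)
    return "".join(cells[i] for i in range(lo, hi + 1)).strip(blank)

# A tiny example machine: flip every bit, halting at the first blank cell.
flip_bits = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
}
print(run_turing_machine(flip_bits, "10110"))  # prints 01001
```

The universal machine the book explains is then the observation that a simulator like `run_turing_machine`, with a rule table encoded on the tape itself, can be carried out by a single fixed machine.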
This volume presents a historical and philosophical revisiting of the foundational character of Turing’s conceptual contributions and assesses the impact of the work of Alan Turing on the history and philosophy of science. Written by experts from a variety of disciplines, the book draws out the continuing significance of Turing’s work. The centennial of Turing’s birth in 2012 led to the highly celebrated “Alan Turing Year”, which stimulated a world-wide cooperative, interdisciplinary revisiting of his life and work. Turing is widely regarded as one of the most important scientists of the twentieth century: He is the father of artificial intelligence, the resolver of Hilbert’s famous Entscheidungsproblem, and a code breaker who helped break the Enigma code. His work revolutionized the very architecture of science by way of the results he obtained in logic, probability and recursion theory, morphogenesis, the foundations of cognitive psychology, mathematics, and cryptography. Many of Turing’s breakthroughs were stimulated by his deep reflections on fundamental philosophical issues. Hence it is fitting that there be a volume dedicated to the philosophical impact of his work. One important strand of Turing’s work is his analysis of the concept of computability, which has unquestionably come to play a central conceptual role in nearly every branch of knowledge and engineering.
Author: John MacCormick
Publisher: Princeton University Press
Release Date: 2018-05-15
An accessible and rigorous textbook for introducing undergraduates to computer science theory.
What Can Be Computed? is a uniquely accessible yet rigorous introduction to the most profound ideas at the heart of computer science. Crafted specifically for undergraduates who are studying the subject for the first time, and requiring minimal prerequisites, the book focuses on the essential fundamentals of computer science theory and features a practical approach that uses real computer programs (Python and Java) and encourages active experimentation. It is also ideal for self-study and reference. The book covers the standard topics in the theory of computation, including Turing machines and finite automata, universal computation, nondeterminism, Turing and Karp reductions, undecidability, time-complexity classes such as P and NP, and NP-completeness, including the Cook-Levin Theorem. But the book also provides a broader view of computer science and its historical development, with discussions of Turing's original 1936 computing machines, the connections between undecidability and Gödel's incompleteness theorem, and Karp's famous set of twenty-one NP-complete problems. Throughout, the book recasts traditional computer science concepts by considering how computer programs are used to solve real problems. Standard theorems are stated and proven with full mathematical rigor, but motivation and understanding are enhanced by considering concrete implementations. The book's examples and other content allow readers to view demonstrations of—and to experiment with—a wide selection of the topics it covers. The result is an ideal text for an introduction to the theory of computation.
- An accessible and rigorous introduction to the essential fundamentals of computer science theory, written specifically for undergraduates taking an introduction to the theory of computation
- Features a practical, interactive approach using real computer programs (Python in the text, with forthcoming Java alternatives online) to enhance motivation and understanding
- Gives equal emphasis to computability and complexity
- Includes special topics that demonstrate the profound nature of key ideas in the theory of computation
- Lecture slides and Python programs are available at whatcanbecomputed.com
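The undecidability material pairs naturally with this program-based style. As one illustration (a sketch built around a hypothetical decider, not code from the book or from whatcanbecomputed.com; the names `halts` and `diagonal` are illustrative inventions), the classic halting-problem argument can be phrased directly in Python:

```python
def halts(program_source: str, input_data: str) -> bool:
    """Hypothetical decider: True iff the program halts on the given input.

    Turing proved no such total, always-correct function can exist; it is
    declared here only to set up the contradiction below.
    """
    raise NotImplementedError("no general halting decider can exist")

def diagonal(program_source: str) -> None:
    # Do the opposite of whatever `halts` predicts about running the
    # program on its own source text.
    if halts(program_source, program_source):
        while True:  # predicted to halt, so loop forever
            pass
    # predicted to loop forever, so halt immediately

# Applying `diagonal` to its own source is contradictory either way:
# if halts(...) returns True, diagonal loops (so halts was wrong);
# if it returns False, diagonal returns at once (wrong again).
```

The self-reference carries the whole argument: any claimed implementation of `halts` is defeated by `diagonal` applied to its own source.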
Author: Paul J. Nahin
Publisher: Springer Science & Business Media
Release Date: 2014-04-09
Can a computer have a soul? Are religion and science mutually exclusive? Is there really such a thing as free will? If you could time travel to visit Jesus, would you (and should you)? For hundreds of years, philosophers, scientists and science fiction writers have pondered these questions and many more. In Holy Sci-Fi!, popular writer Paul Nahin explores the fertile and sometimes uneasy relationship between science fiction and religion. With a scope spanning the history of religion, philosophy and literature, Nahin follows religious themes in science fiction from Feynman to Foucault and from Asimov to Aristotle. An intriguing journey through popular and well-loved books and stories, Holy Sci-Fi! shows how sci-fi has informed humanity's attitudes towards our faiths, our future and ourselves.
Handbook of the History of Logic brings to the development of logic the best in modern techniques of historical and interpretative scholarship. Computational logic was born in the twentieth century and evolved in close symbiosis with the advent of the first electronic computers and the growing importance of computer science, informatics, and artificial intelligence. More than ten thousand people now work in research and development of logic and logic-related methods; several dozen international conferences and several times as many workshops address the growing richness and diversity of the field; and these methods now play a foundational role in mathematics, computer science, artificial intelligence, cognitive science, linguistics, law, and many engineering fields, where logic-related techniques are used, inter alia, to state and settle correctness issues. The field has diversified in ways that even the pure logicians working in the early decades of the twentieth century could hardly have anticipated. Logical calculi, which capture an important aspect of human thought, are now amenable to investigation with mathematical rigour and computational support, fulfilling the early dream of mechanised reasoning: “Calculemus!” The Dartmouth Conference in 1956 – generally considered the birthplace of artificial intelligence – explicitly raised hopes for the new possibilities offered by the advent of electronic computing machinery: logical statements could now be executed on a machine, with all the far-reaching consequences that ultimately led to logic programming, deduction systems for mathematics and engineering, logical design and verification of computer software and hardware, deductive databases, and software synthesis, as well as logical techniques for analysis in the field of mechanical engineering. This volume covers some of the main subareas of computational logic and its applications.
- Chapters by leading authorities in the field
- Provides a forum where philosophers and scientists interact
- Comprehensive reference source on the history of logic