Alan Turing
| Alan Turing | |
|---|---|
| Description | English mathematician, logician, and computer scientist |
| Fields | Mathematics, Computer science, Logic, Cryptography |
| Influenced by | Kurt Gödel, Alonzo Church, David Hilbert |
| Lifespan | 1912–1954 |
| Notable ideas | Turing machine; foundations of computer science; Artificial intelligence (Turing Test); contributions to codebreaking in World War II; work on morphogenesis |
| Occupation | Mathematician, Logician, Computer scientist, Cryptanalyst |
| Influenced | John von Neumann, Claude Shannon, Artificial intelligence, Computability theory, Philosophy of mind |
| Wikidata | Q7251 |
Alan Mathison Turing (1912–1954) was a British mathematician, logician and codebreaker who is widely regarded as a founding figure of computer science and artificial intelligence. His 1936 paper on computable numbers introduced the mathematical concept of a “Turing machine,” laying out the limits of computation and foreshadowing the modern digital computer. During World War II, Turing led crucial work at Britain’s Bletchley Park to crack Nazi Germany’s Enigma ciphers. After the war he helped design early computers and posed “the imitation game” as a test of machine intelligence. Turing’s life was also marked by tragedy: in 1952 he was convicted of “gross indecency” under laws that then criminalized homosexuality in Britain and was forced to undergo chemical castration. He died in 1954 at age 41. In the decades since, his achievements have been celebrated with honors ranging from the ACM Turing Award in computing to a posthumous royal pardon and many memorials. His ideas remain central to computer science and our understanding of computation and intelligence.
Early Life and Education
Turing was born on June 23, 1912, in London, the second son of an upper-middle-class British couple. His father, Julius Turing, was a civil servant working in India; Alan and his older brother John were often cared for by relatives and family friends during their parents’ absences. As a boy, Turing showed an intense curiosity and talent for mathematics and science, often preferring the company of objects and puzzles to that of people. He attended Sherborne School in Dorset (from 1926 to 1931), a boarding school steeped in traditional education. There he developed a passion for number puzzles and science, reading books on cryptography and mathematics well beyond the set curriculum. A well-known anecdote recounts that as a teenager Turing discovered a book on mathematical recreations (by W. W. Rouse Ball) that included material on codes and ciphers – a discovery that sparked his lifelong fascination with encrypting and decrypting messages.
In 1931 Turing won a scholarship to King’s College, Cambridge, to read mathematics. He excelled: he earned first-class honors in the Cambridge Mathematical Tripos in 1934 and was elected a Fellow of King’s College in 1935, a rare honor for a graduate in his early twenties. His fellowship was based on research in probability theory, but Cambridge collegial life also put him in touch with the leading mathematicians of the day. In particular, he attended Max Newman’s 1935 lectures on the foundations of mathematics, which introduced him to the work of Kurt Gödel on the limits of formal logic. By 1935 Turing was immersed in mathematical logic, the branch of mathematics that studies algorithms and formal reasoning.
After Cambridge, Turing traveled to the United States in 1936–1938 to study for a Ph.D. at Princeton University under Alonzo Church, a leading American logician. At Princeton he worked on the foundations of mathematics, completing a doctoral thesis titled Systems of Logic Based on Ordinals (published 1939). The thesis asked whether the limits on formal systems exposed by Gödel could be partly overcome by chaining together ever-stronger logics indexed by ordinals. This foundational work prepared Turing for his wartime and postwar research. He earned his Ph.D. in 1938 and promptly returned to Britain, declining an offer to stay on at Princeton.
Computability and the Turing Machine
While still at Cambridge in 1935–36, Turing had tackled one of the most famous problems in mathematics: the Entscheidungsproblem (German for “decision problem”), posed by David Hilbert. The Entscheidungsproblem asked whether there exists a single mechanical procedure (an algorithm) to determine, for any mathematical statement, whether it is provable. In his landmark 1936 paper “On Computable Numbers, with an Application to the Entscheidungsproblem,” Turing gave a striking answer: no such universal procedure can exist.
Turing’s key idea was to define a simple theoretical device – now called the Turing machine – that operates on an infinite strip of tape divided into cells. Each cell can hold a symbol (such as 0 or 1 or a blank), and the machine has a “head” that reads and writes symbols and can move the tape one cell left or right. A finite set of rules (the “program” of the Turing machine) tells the head, based on the symbol it sees and the machine’s internal state, which symbol to write, how to move, and what new state to adopt. Though elementary, this model captures the essential idea of mechanized calculation: step-by-step symbol manipulation under fixed rules. Turing proved that any mathematical calculation or algorithm could be carried out by some such machine (or by an equivalent set of instructions).
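The model described above is simple enough to make concrete in a few lines of code. The following is a minimal sketch of a Turing machine simulator in Python; the rule-table format, function name, and the bit-flipping example machine are illustrative choices of ours, not anything from Turing’s paper:

```python
def run_turing_machine(rules, tape, state="start", steps=1000):
    """Simulate a one-tape Turing machine.

    rules maps (state, symbol) -> (symbol_to_write, head_move, next_state),
    where head_move is -1 (left), 0 (stay), or +1 (right). The `steps`
    bound keeps the sketch from looping forever on a non-halting machine.
    Returns the visited tape contents once the 'halt' state is reached.
    """
    cells = dict(enumerate(tape))  # sparse tape; unvisited cells read as blank
    head = 0
    for _ in range(steps):
        if state == "halt":
            break
        symbol = cells.get(head, "_")          # read the cell under the head
        write, move, state = rules[(state, symbol)]
        cells[head] = write                    # write, then move the head
        head += move
    return "".join(cells[i] for i in sorted(cells))

# Example machine: flip every bit, then halt on the first blank cell.
flip = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", "_"): ("_", 0, "halt"),
}
```

The rule table is the machine’s entire “program”: each entry maps a (state, symbol) pair to a write/move/next-state action, exactly the step-by-step symbol manipulation under fixed rules that the paragraph above describes.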
In his paper, Turing also introduced the concept of a universal Turing machine – a single machine that could simulate any other Turing machine if fed with a description (encoding) of that machine and its input. This is analogous to the way a modern general-purpose computer can run any program given enough memory. From this work emerged the Church–Turing thesis, which says that any problem that can be solved by a human using a step-by-step procedure (an “effective” method) can also be solved by a Turing machine. While unprovable (it is an identification of two intuitively similar ideas), the Church–Turing thesis set a firm boundary on what is algorithmically possible. It implies, for example, that certain problems – like determining in all cases whether an arbitrary program will eventually stop (the halting problem) – have no algorithmic solution.
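The undecidability of the halting problem rests on a diagonal argument that can itself be sketched in code: any claimed halting-decider can be turned into a “contrary” program that defeats it. The Python below is an informal illustration of that argument (the names `diagonalize`, `contrary`, and `never` are ours), not a formal proof:

```python
def diagonalize(halts):
    """Given a claimed decider halts(f) -> True iff f() would halt,
    build a program that does the opposite of whatever is predicted."""
    def contrary():
        if halts(contrary):
            while True:      # predicted to halt, so loop forever
                pass
        return None          # predicted to loop, so halt at once
    return contrary

# Any fixed decider is refuted by its own contrary program. For the
# decider that always answers "does not halt", the contrary program
# halts immediately, contradicting the answer:
never = lambda f: False
contrary = diagonalize(never)
contrary()  # halts, although `never` claimed it would not
```

Whatever `halts` answers about `contrary`, the program does the opposite, so no decider can be correct on every input: this is the sense in which the halting problem has no algorithmic solution.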
Turing’s analysis was deeply influential. It, along with Alonzo Church’s contemporaneous work in lambda calculus, shattered the hope of finding a single procedure to settle all mathematical questions. Instead, they showed that some questions are in principle undecidable. Turing’s reformulation of this result in terms of a simple machine made it especially clear and concrete, and both Gödel and von Neumann recognized the importance of his approach. Indeed, later accounts note that von Neumann kept a well-worn copy of Turing’s 1936 paper in his library, having studied it carefully.
In computer science today, the Turing machine is not a practical computer design but remains a fundamental theoretical tool. Courses in theoretical computer science use it to teach about algorithms, complexity, and decidability. A system or programming language is said to be Turing-complete if it can simulate a universal Turing machine, meaning it can compute anything that is computable given enough time and memory (modern general-purpose programming languages, for example, are Turing-complete). Simply put, Turing’s model underpins the mathematical definition of “algorithm” and shows the ultimate power and limits of mechanical computation.
Codebreaking and World War II
When World War II broke out in 1939, Turing’s career took a secretive turn. That summer he was recruited to the Government Code and Cypher School, Britain’s codebreaking agency. He soon moved to the school’s wartime headquarters at Bletchley Park, Buckinghamshire. There, his work would be pivotal to the Allied war effort.
Germany encrypted most of its military communications with the Enigma machine, an electromechanical cipher device. The Allies had some help: Polish cryptanalysts had broken early versions of Enigma in the 1930s and passed on key insights before Poland fell, including the design of a codebreaking device called the bomba. However, the Germans changed the machine’s settings (such as rotor order and start positions) daily, making Enigma a moving target.
Turing built on the Polish groundwork. He developed methods, by hand and by machine, to speed up the search for the daily Enigma settings. A critical idea was the use of cribs: guessed or known snippets of plaintext likely to appear in a message (weather reports often began with the word “WETTER,” for example, and “HEIL HITLER” appeared frequently). The Enigma machine never enciphered a letter to itself, so any alignment of a crib in which one of its letters coincides with the ciphertext letter above it can be ruled out; alignments with no such coincidence are the plausible ones. The Bombe – an electromechanical machine designed by Turing and colleague Gordon Welchman – automated the next step: it ran through many possible Enigma rotor settings against a crib to find internally consistent matches. By 1940–41, upgraded Bombes were running around the clock at Bletchley, rapidly uncovering the daily keys for Enigma messages. (Later, another cipher – the Lorenz machine used by the German high command, known at Bletchley as “Tunny” – was tackled by Turing and others using statistical methods, leading to the development of the electronic Colossus computer under Tommy Flowers.)
Turing became a leading figure among the codebreakers. He took particular responsibility for the vital U-boat communications that threatened Allied shipping in the Atlantic. Winston Churchill later remarked that the U-boat peril was Britain’s gravest threat, and Turing’s unit helped turn the tide. By 1942, thanks to the Bombes and other techniques, the cryptanalytic teams were decoding tens of thousands of German naval messages each month. In her memoir, Turing’s mother later wrote that London would sometimes know of sinkings days before the public headlines.
Though Turing worked under strict secrecy, he demonstrated remarkable creativity. In 1942, for example, he devised a hand method (later dubbed “Turingery”) for breaking the Tunny cipher, helping Bletchley Park read Germany’s most secret communications (traffic known at the time only as “Fish” and later identified with the Lorenz SZ40/42 machines). When the European war ended in 1945, Turing’s contributions were formally honored by Britain’s government: he was made an Officer of the Order of the British Empire (OBE) for his codebreaking work. However, details of that work stayed classified for many years, and within Britain his wartime achievements were little known outside military and intelligence circles until much later.
*Example (codebreaking by cribbing):* Imagine an intercepted Enigma message that begins “XQZHVWOS...” and suppose you suspect it contains the word “BABY” near the start (your crib). Because Enigma never enciphered a letter to itself, any alignment in which a letter of “BABY” coincides with the ciphertext letter directly above it is impossible, so you slide the crib along until no letters coincide. The Bombe could then mechanically test many rotor configurations against such an alignment, quickly finding settings under which the crib could have produced the ciphertext. Once a correct setting was found, the Allies could read all messages enciphered with that day’s key.
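The letter-coincidence test in this example is simple enough to sketch in a few lines of Python. This is a toy illustration of the pre-Bombe filtering step only, not of the Bombe’s electrical search itself, and the function name is ours:

```python
def plausible_crib_positions(ciphertext, crib):
    """Return the offsets at which the crib could sit, exploiting the
    fact that Enigma never enciphered a letter to itself: any offset
    where a crib letter coincides with the ciphertext is impossible."""
    positions = []
    for start in range(len(ciphertext) - len(crib) + 1):
        window = ciphertext[start:start + len(crib)]
        # keep the offset only if no letter lines up with itself
        if all(c != p for c, p in zip(window, crib)):
            positions.append(start)
    return positions

# Every offset of "BABY" against this short intercept happens to be
# coincidence-free, so all alignments survive the filter:
print(plausible_crib_positions("XQZHVWOS", "BABY"))  # → [0, 1, 2, 3, 4]
```

In practice the cryptanalysts combined this filter with knowledge of message formats (such as weather-report openings) to pick the most likely alignment before committing scarce Bombe time to it.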
Postwar Computing and Early Computers
After World War II, Turing turned his experience toward building real computers. In 1945 he joined the National Physical Laboratory (NPL) in Teddington, near London, where he set out to design an electronic computer called the Automatic Computing Engine (ACE). In a 1946 report (titled “Proposed Electronic Calculator”), Turing laid out an ambitious design for a high-speed machine with stored programs. His design was one of the first detailed proposals for what we now call a stored-program computer – a single machine that could perform many tasks by running different programs held in memory. Unfortunately, NPL officials deemed the full ACE design too complex and costly. The lab instead built a smaller Pilot ACE prototype, which ran its first program in 1950 (after Turing had moved on). The Pilot ACE was one of the first working computers in Britain and was, for a short time, among the fastest computers in the world.
Frustrated by delays at NPL, Turing accepted a position at the University of Manchester in late 1948. There he worked with the Manchester Mark I, one of Britain’s earliest electronic stored-program machines, built at the university by the team of F. C. Williams and Tom Kilburn (Ferranti later produced a commercial version). Turing’s role was mainly on software and algorithms: he and others wrote programs for the Mark I, demonstrating its capabilities in mathematical calculation, and he continued his theoretical work. Interestingly, during this period he even experimented with early computer game concepts: with David Champernowne he drafted a chess-playing program called “Turochamp.” Lacking a machine powerful enough to run it, Turing simulated the program by hand in a 1952 game, taking about half an hour per move. The program could indeed “play” legally, although Turing’s friend Alick Glennie ultimately won the game.
Overall, Turing was a pioneer at the dawn of the computer age. He had been one of the first to outline a universal computer in theory, and now he helped bring real machines into being. His move to Manchester also placed him among the engineers who would produce the first commercially available general-purpose computer, the Ferranti Mark 1 (delivered in 1951), and later machines such as Atlas. The stored-program design he championed – in which instructions and data share memory – became the engineering foundation for nearly all later computers (an arrangement now usually called the von Neumann architecture).
Artificial Intelligence and the Turing Test
By 1950, Turing’s interests had broadened to the question of machine intelligence. In a famous paper “Computing Machinery and Intelligence” (published in the journal Mind), he posed the provocative question: “Can machines think?” Rather than argue directly about mental processes, he proposed an operational test. He asked readers to imagine an “imitation game” involving a human, a machine, and an interrogator communicating by written questions. If the machine could convince the interrogator that it was human as often as a real person could, then Turing argued that we should say the machine is intelligent. This idea soon became known as the Turing Test.
The test was not meant to be a final verdict on machine minds, but a way to sidestep metaphysical disputes about consciousness. Turing acknowledged his critics (such as those who insisted that machines could never have souls or be creative) and addressed their objections one by one. He predicted that by the year 2000 machines with roughly 10^9 bits of storage would play the imitation game well enough that an average interrogator would have no more than a 70 percent chance of making a correct identification after five minutes of questioning. He also suggested teaching machines by reward and punishment (an early notion of machine learning), so that they could learn in the manner of a human child.
Turing’s framing – asking for behavioral evidence of intelligence rather than inspecting inner workings – was highly influential. It essentially founded the field of artificial intelligence by giving researchers a concrete goal. In the decades that followed, scientists and philosophers debated and expanded on his ideas. Some AI researchers, for instance, developed algorithms for pattern recognition, logic, and game playing, inspired by Turing’s vision of general intelligence; others interpreted “thinking” in different ways.
*Defining terms:* In this context, “machine intelligence” refers to a computer’s ability to exhibit behavior that a human would consider intelligent, such as understanding language or solving problems. The Turing Test specifically asks whether a person interacting with an unseen respondent can reliably tell whether it is a human or a program.
However, Turing’s test also invited criticisms. Some argued that merely imitating human conversation is a poor measure of understanding. Philosopher John Searle famously proposed the “Chinese Room” thought experiment, suggesting that a computer could pass the Turing Test (by following syntax rules) without actually understanding any Chinese (semantics). Others have remarked that the test is anthropocentric – it judges “intelligence” by human standards alone, ignoring whether non-human intelligences (or different forms of reasoning) might exist. In modern AI research, the Turing Test is rarely used as a literal benchmark; instead, it remains a provocative philosophical idea about machine cognition. Turing himself treated his imitation game more as a challenge and definition than a practical exam – it opened questions rather than closing them.
Nonetheless, the 1950 paper was a landmark. It made “artificial intelligence” a thinkable problem and introduced concepts (like machine learning, which he called “child machines” to be taught) that previewed later developments. Turing is often cited for his early optimism that computer technology could one day achieve tasks like human thinking, long before computing power made that a reality.
Mathematical Biology and Morphogenesis
In addition to logic and machines, Turing had a wide-ranging curiosity that extended into biology. In 1952 he published “The Chemical Basis of Morphogenesis,” a pioneering paper in mathematical biology. He asked: how do the complex patterns of nature arise from simple ingredients? For example, how do animals get striped fur or spotted skin?
Turing’s insight was to apply mathematics to chemical reactions. He theorized that if two or more chemicals (morphogens) interact and spread (diffuse) through a medium at different rates, they could destabilize an initially uniform state and form patterns. Using differential equations, he showed how certain reaction-diffusion processes could lead to stable stripes, spots, or spirals. This work was remarkably ahead of its time. It suggested that simple “non-linear” chemical interactions could create the diversity of form seen in nature. Turing also performed the first computer simulations of these equations, making him an early adopter of using electronic computers to model scientific problems.
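In modern notation, the class of models Turing analyzed can be written as a pair of coupled reaction-diffusion equations for two morphogen concentrations $u$ and $v$ (this is a standard contemporary rendering, not Turing’s original notation):

```latex
\frac{\partial u}{\partial t} = f(u, v) + D_u \nabla^2 u, \qquad
\frac{\partial v}{\partial t} = g(u, v) + D_v \nabla^2 v
```

Here $f$ and $g$ describe the chemical reactions and $D_u$, $D_v$ the diffusion rates. Patterns arise through what is now called diffusion-driven instability: the reaction terms alone admit a stable uniform state, but sufficiently unequal diffusion rates ($D_u \neq D_v$) can amplify small spatial fluctuations into stable stripes or spots.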
This morphogenesis theory turned out to have long-lasting influence. In developmental biology, it provided one of the first mathematical explanations for pattern formation on animal skins, seashells, and plant leaves. From the 1970s onward, scientists found that Turing’s equations appear in many models of biological growth, and some trace modern ideas in artificial life and chaos theory back to his work, since it demonstrated how complex order can emerge from simple rules. Turing had been drawn to patterns in nature since childhood (his mother later recalled his fascination with growing plants), and biographers note the influence of D’Arcy Thompson’s On Growth and Form (1917), a classic text on mathematical biology. Turing took the subject further, however, by formulating specific equations and putting them on a computer. His chess-playing experiments and his morphogenesis research illustrate the same habit of mind: applying formal, computational reasoning to new fields.
Turing’s biology work attracted little attention in his short lifetime (it appeared in 1952, the year of his conviction, which overshadowed his academic activities). It is now recognized, however, as one of the first attempts to build a mathematical theory of biological development. In his collected works it fills an entire volume, testimony to his habit of choosing fundamental problems wherever he turned.
Influence and Reception
Over time, Alan Turing’s early insights proved remarkably prescient and foundational. In computer science, he is often hailed as one of the “fathers” of the field. (This is not in any formal sense to exclude others, but rather an acknowledgment that his 1936 abstraction of computing and the notion of a universal machine are cornerstones of the discipline.) Textbooks in computing routinely begin chapters on algorithms or computability by referencing the “Turing machine model.” Key terms such as “algorithm,” “computability,” and “decision problems” trace back to the framework he established. Courses on theory of computation still use Turing machines as a teaching tool. His ideas directly led to the concept of a programmable computer: by defining a machine that could simulate any other, he anticipated the stored-program computers built in the 1940s and 1950s. Many computer scientists see the Church–Turing thesis as clarifying what problems can ever be solved by software, influencing areas like cryptography, verification, and complexity theory.
In cryptography and military history, Turing’s wartime role made him a British hero once the secret of Bletchley Park was revealed. Popular accounts often cite Churchill’s praise for the codebreakers and historians’ estimates that their work shortened the war by as much as two years. Historians note, however, that breaking Enigma was a collective success involving many teams (Polish, British, American, and others). Today, visitors to Bletchley Park can see a reconstruction of the Bombe machine Turing helped design, and museums often present him as the man who broke Enigma (while also naming colleagues such as Gordon Welchman). His recognition was limited immediately after the war, largely because of secrecy: he was appointed an OBE in 1946, but his role received no public mention until decades later.
In artificial intelligence and cognitive science, Turing’s legacy is also immense. He posed practical questions that AI researchers have pursued ever since. The term “Turing test” (though he called it an “imitation game”) is now a standard reference in discussions of machine intelligence. Many AI textbooks and histories begin with Turing’s 1950 paper, describing it as an early test for conversational AI. Modern AI research may not focus on fooling chat judges as he imagined, but experts still debate the questions about intelligence he introduced. His suggestion that a properly programmed machine could “rival the brain” inspired generations of scientists. Some credit Turing’s influence on early AI pioneers such as Allen Newell and Herbert Simon, on John McCarthy (who coined the term “artificial intelligence” in 1956), and on later figures such as Marvin Minsky. The Turing Award, established by the Association for Computing Machinery (ACM) in 1966 and often called “the Nobel Prize of Computer Science,” was explicitly named in his honor. The award’s citation describes Turing as the mathematician who “articulated the mathematical foundation and limits of computing” and was a key codebreaker during WWII.
More broadly, Turing became a cultural icon. His biography by Andrew Hodges (“Alan Turing: The Enigma,” 1983) helped bring his story to public attention. Plays (like “Breaking the Code” by Hugh Whitemore) and films (“The Imitation Game,” 2014) dramatized his life for wide audiences. In academic circles he is respected for bridging disciplines: he showed that questions of mind, mathematics, and physics could be explored through computation. The Stanford Encyclopedia of Philosophy notes that his 1950 paper gave “a fresh approach to the traditional mind-body problem” by framing it in terms of algorithms ([plato.stanford.edu](https://plato.stanford.edu/entries/turing/#:~:text=Alan%20Turing%20,of%20the%20artificial%20intelligence%20program)). In mathematics, his 1936 work is considered one of the “founding works of modern computer science,” providing an “absolute limitation on what computation could achieve” ([plato.stanford.edu](https://plato.stanford.edu/entries/turing/#:~:text=His%20first%20true%20home%20was,work%20in%20logic%20and%20other)).
By the 21st century, institutions around the world were celebrating Turing’s legacy. The Alan Turing Institute was established in London as the UK’s national centre for data science and artificial intelligence research. The University of Manchester, where he worked, has hosted international conferences on Turing centenary themes, and in 2012 (the centenary of his birth) special events were held globally. A bronze statue of Turing, seated on a bench and holding an apple, stands in Sackville Park, Manchester (unveiled in 2001, on what would have been his 89th birthday). English Heritage placed a blue plaque on his London birthplace in Maida Vale in 1998, and another plaque, unveiled in 2004, marks his former home in Wilmslow, Cheshire.
Today, in the wake of Turing’s theories, fundamental computer science concepts and practical computing alike carry his mark. When students study algorithms or the limits of computation, and when engineers design computers and software, Turing’s work forms part of the foundation. His face is also on one of Britain’s banknotes: he is featured on the £50 polymer note introduced in 2021. Overall, Turing’s reception in science has grown steadily from relative obscurity at mid-century to iconic recognition in recent decades.
Critiques and Debates
While Turing is widely revered, scholars have debated aspects of his legacy and ideas. One line of debate concerns credit and context in computing history. Popular accounts sometimes call Turing “the father of the computer” or “the father of computer science.” Critics point out that behind every famous figure stands a community of contributors. Charles Babbage (1791–1871), for example, designed his mechanical Analytical Engine in the 19th century, and his collaborator Ada Lovelace wrote what many consider the first computer program. In the 20th century, separate teams independently built early computers: Konrad Zuse completed a working programmable computer, the Z3, in Germany in 1941, and in the U.S. the ENIAC was completed in 1945 by Eckert and Mauchly. John von Neumann formulated the practical stored-program architecture (often called the von Neumann architecture) that influenced computer makers from the late 1940s onward.
From one perspective, Turing himself neither built the largest or fastest machines nor coined the term “computer.” His supporters note, however, that his influence was conceptual and theoretical. As historian George Dyson has observed, von Neumann regarded Turing’s 1936 paper on universal machines as the fundamental conception underlying all later computers ([news.ycombinator.com](https://news.ycombinator.com/item?id=30610083#:~:text=development%20of%20early%20working%20computers,1%5D.Dyson%20also)). Dyson has also noted that researchers at the Institute for Advanced Study (von Neumann’s institution) had worn out multiple copies of Turing’s paper by the late 1940s. Thus some argue that Turing’s impact lay more in theory than in hardware, but was equally profound: while Zuse and the ENIAC engineers showed that such machines could be built, Turing showed what they are capable of in principle. The ACM’s choice of Turing’s name for its top award reflects the view that computer science (especially its theoretical side) rightly honors the mathematician behind modern computing ideas.
Another debate concerns the Turing Test and the nature of artificial intelligence. Over the decades, many thinkers have questioned whether human-like conversation is a good measure of intelligence. Philosopher John Searle’s Chinese Room argument, for instance, suggests that a program could pass the Turing Test without any “understanding” – like a person following English instructions to manipulate Chinese symbols without knowing Chinese. Other critics argue the test is too focused on social imitation: it judges intelligence by whether a computer can fool a person in conversation, which may not capture problem-solving ability, visual-spatial reasoning, or other forms of intelligence. Indeed, by the 2000s some AI researchers regarded chasing the Turing Test as a distraction from practical progress. AI systems today often excel at tasks (e.g. image recognition, playing complex games, or large-scale data processing) far beyond the kind of dialogue Turing envisioned, yet might still fail a simple chat-based test.
Nevertheless, Turing’s defenders point out that he proposed the imitation game not as a strict final definition but as an accessible way to think about machine ability. He explicitly raised and answered many objections in his essay, showing he did not consider it the last word. The debate the Turing Test inspired has shaped much of AI philosophy: it has kept the question of “machine thinking” at the forefront of discussion, even as alternative tests (such as solving mathematical problems or passing a visual Turing Test) have been proposed.
Critics have also looked at Turing’s work in context of philosophy of mind. Some thinkers, like physicist Roger Penrose, later advanced the idea that human consciousness might involve non-algorithmic processes (Penrose posits uncomputable aspects of quantum physics in the brain). Such views challenge the universal applicability of Turing’s computability, suggesting there might be more to thought than symbol manipulation. However, these ideas remain speculative and not part of mainstream computing theory; most computer scientists implicitly accept the Church–Turing thesis as defining the formal limits of computable tasks.
There is also discussion of the historical narrative around his personal story. For decades after the war, Turing’s role at Bletchley Park was an official secret (due to the Official Secrets Act), so he did not receive public credit. Some have argued that in the Cold War era, secrecy meant Britain downplayed or even ignored his achievements. Only with declassification in the 1970s and publication of biographies did the public learn the full extent of his wartime work. In recent years some writers have critiqued the tendency to overly mythologize him – for example, by attributing every Allied success to Turing alone – and emphasize the contributions of his colleagues (such as codebreakers Gordon Welchman, Dilly Knox, and others) and earlier Polish achievements. Historians strive today for a balanced view: Turing’s inventions (like the Bombe) and ideas were crucial, but they sat within a large collaborative effort. A fair assessment notes both the individual genius and the teamwork at Bletchley Park.
Finally, society’s treatment of Turing’s personal life has been debated. His 1952 conviction and the abuse he suffered under the law are now seen as a grave injustice. Some discussions in bioethics and legal history treat Turing’s case as emblematic of the clash between scientific brilliance and social prejudice, and it has prompted debate about how many other historical figures were unjustly treated or undercredited because of societal taboos. In Turing’s case those debates have largely converged on a consensus: he was the victim of unjust laws, and this should not diminish his scientific legacy but rather highlight the tragedy of the years that were lost.
Legacy and Honors
Alan Turing’s legacy has grown in magnitude and visibility in the decades after his death. He is commemorated by numerous honors and institutions worldwide. Most prominently, the ACM A.M. Turing Award, often called the Nobel Prize of Computing, is named for him (even though he never received it himself). This award, given annually since 1966 to outstanding computer scientists, explicitly cites Turing’s articulation of the mathematical foundations of computing and his cryptologic achievements as inspiration for its name ([awards.acm.org](https://awards.acm.org/about/2018-turing#:~:text=is%20named%20for%20Alan%20M,foundation%20and%20limits%20of%20computing)).
In Britain, Turing has become a cultural icon for both science and social progress. He was included on the Bank of England’s £50 polymer note (unveiled in 2021). In science and education, the Alan Turing Institute (founded in 2015) serves as the UK’s national AI and data science centre, a sign of his continuing relevance to current technology. Universities and research groups worldwide put his name on lecture halls and memorial lectures: the University of Manchester’s School of Computer Science hosts an annual Turing lecture in his honor, and the Science and Industry Museum in Manchester features exhibits on his life and work.
Public memorials also abound. In Manchester, a bronze statue of Turing seated on a bench (holding an apple) stands in a public park, symbolizing both his genius and the manner of his death. In London, an English Heritage plaque on his former home marks his contributions. Bletchley Park has exhibitions on Turing’s Bombe and on the breaking of Enigma, often accompanied by educational programs for students. Popular media continue to retell his story: books, documentaries, and articles explore his life each year, and he is cited in contexts ranging from AI development to LGBT history.
Turing’s story has also had a tangible legal and social aftermath. In 2009 the British Prime Minister formally apologized for “the appalling way” Turing was treated after the war. In 2013, Queen Elizabeth II granted him a posthumous royal pardon for the 1952 conviction. More broadly, in 2017 Parliament enacted the so-called “Alan Turing Law,” which pardoned thousands of men convicted under old laws criminalizing homosexuality. These events have tied Turing’s name not only to science but also to human rights and social change.
In the scientific community, newcomers to the field typically encounter Turing machines, the Turing test, or Turing-completeness early in their studies. His image appears in conferences, articles, and textbooks as a shorthand for theoretical depth, and scholars often invoke his work when discussing the essence of algorithms or the nature of mind. In short, Turing left a dual legacy: he is a hero of science for his intellectual breakthroughs, and a hero of conscience for the posthumous recognition of his mistreatment.
Selected Works
- On Computable Numbers, with an Application to the Entscheidungsproblem (1936–1937) – Introduces the Turing machine concept and proves the impossibility of a general algorithmic decision procedure.
- Systems of Logic Based on Ordinals (1939) – Turing’s Ph.D. dissertation on formal logic, extending earlier work on Gödel’s incompleteness by introducing “ordinal logics.”
- Proposed Electronic Calculator (1946) – Report to Britain’s National Physical Laboratory outlining the design of the Automatic Computing Engine (ACE).
- Computing Machinery and Intelligence (1950) – Essay in the journal Mind posing the question “Can machines think?” and proposing the imitation game (Turing Test) as a criterion for machine intelligence.
- The Chemical Basis of Morphogenesis (1952) – Paper in the Philosophical Transactions of the Royal Society modeling how patterns in biological growth can arise from reaction-diffusion processes.
(Other notable reports and papers by Turing include an unpublished 1948 “Intelligent Machinery” report on early AI ideas, and work on the ACE at NPL. Many of his wartime lectures and records on codebreaking remain classified or summarized in later histories.)
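The abstract machine introduced in "On Computable Numbers" can be made concrete with a short simulation. The sketch below is purely illustrative (the state names and the binary-increment program are a hypothetical modern example, not Turing's own notation): a finite control reads one tape cell at a time, writes a symbol, moves left or right, and changes state according to a fixed rule table.

```python
# Minimal Turing machine simulator (illustrative sketch; the rule table
# below is a hypothetical example program, not Turing's original notation).

def run_turing_machine(tape, rules, state="start", blank="_", max_steps=1000):
    """Apply rules of the form (state, symbol) -> (write, move, new_state)
    until the machine enters the 'halt' state; return the final tape."""
    tape = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    cells = [tape.get(i, blank) for i in range(min(tape), max(tape) + 1)]
    return "".join(cells).strip(blank)

# Example program: increment a binary number (head starts on its leftmost digit).
rules = {
    ("start", "0"): ("0", "R", "start"),  # scan right to the end of the number
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),  # step back onto the last digit
    ("carry", "1"): ("0", "L", "carry"),  # 1 plus carry -> 0, propagate carry
    ("carry", "0"): ("1", "L", "done"),   # absorb the carry
    ("carry", "_"): ("1", "L", "done"),   # overflow: write a new leading 1
    ("done",  "0"): ("0", "L", "done"),   # rewind to the left edge
    ("done",  "1"): ("1", "L", "done"),
    ("done",  "_"): ("_", "R", "halt"),
}

print(run_turing_machine("1011", rules))  # 1011 (eleven) + 1 -> 1100 (twelve)
```

Despite its simplicity, this table-driven loop captures the full model: Turing's 1936 argument shows that any effectively calculable function can be computed by some such rule table, and that no single procedure can decide, for every table and input, whether the machine halts.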
Timeline
- 1912 – Alan Turing is born on June 23 in London.
- 1926–1931 – Attends Sherborne School; develops interest in mathematics and cryptography.
- 1931–1934 – Studies mathematics at King’s College, Cambridge; earns first-class honors.
- 1935 – Elected a Fellow of King’s College, Cambridge.
- 1936 – Publishes “On Computable Numbers,” introducing the Turing machine; proves the Entscheidungsproblem unsolvable (a result also reached independently by Alonzo Church).
- 1936–1938 – Pursues Ph.D. at Princeton University under Alonzo Church.
- 1938 – Ph.D. granted; Turing returns to England.
- 1939 – Joins Britain’s codebreaking efforts at Bletchley Park as World War II begins.
- 1940–1941 – Develops the electromechanical Bombe machine to decipher German Enigma ciphers.
- 1942 – Devises a method to break the Tunny (Lorenz) cipher.
- 1945 – War ends; Turing appointed Officer of the Order of the British Empire (OBE).
- 1946 – Plans for the ACE computer presented to the National Physical Laboratory.
- 1948 – Moves to University of Manchester; begins work on software for early computers.
- 1950 – Publishes “Computing Machinery and Intelligence” and writes one of the first computer chess programs.
- 1951 – Elected Fellow of the Royal Society (FRS) for his theoretical achievements.
- 1952 – Convicted of “gross indecency” for a homosexual relationship; accepts chemical castration (hormone treatment) in lieu of imprisonment. Publishes his morphogenesis paper.
- 1954 – Dies on June 7 in Wilmslow, Cheshire, at age 41 (cause ruled cyanide poisoning).
- 1966 – The ACM establishes the A.M. Turing Award, named in his honor.
- 1983 – Andrew Hodges’s biography “Alan Turing: The Enigma” is published, bringing his story to wider recognition.
- 1998–2004 – English Heritage erects blue plaques in his memory (London home and Wilmslow residence).
- 2009 – UK Prime Minister formally apologizes for Turing’s treatment under past laws.
- 2012 – Worldwide symposiums and memorials mark the centenary of Turing’s birth.
- 2013 – Turing is posthumously pardoned by the Queen.
- 2017 – “Alan Turing Law” pardons thousands of men convicted under historical anti-homosexuality laws in the UK.
- 2021 – Turing is featured on the new Bank of England £50 note.
Each of these milestones reflects Turing’s multifaceted legacy: as a mathematician, a war hero, a victim of injustice, and an icon of science and culture.