Claude Shannon
Using electrical switches to implement logic is the fundamental concept that underlies all [[Computer|electronic digital computers]]. Shannon's work became the foundation of [[digital circuit]] design, as it became widely known in the electrical engineering community during and after [[World War II]]. The theoretical rigor of Shannon's work superseded the ''ad hoc'' methods that had prevailed previously. [[Howard Gardner]] hailed Shannon's thesis as "possibly the most important, and also the most noted, master's thesis of the century."<ref>{{cite book |title=The Mind's New Science: A History of the Cognitive Revolution |first=Howard |last=Gardner |author-link=Howard Gardner |publisher=Basic Books |year=1987 |isbn=978-0-465-04635-5 |page=[https://archive.org/details/mindsnewscience00howa/page/144 144] |url=https://archive.org/details/mindsnewscience00howa/page/144 }}</ref>
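Shannon's correspondence between switching circuits and Boolean algebra can be illustrated with a minimal sketch (an editorial illustration, not part of the cited sources; the function names are chosen here for clarity): switches wired in series conduct only when both are closed, behaving as logical AND, while switches wired in parallel behave as logical OR.

<syntaxhighlight lang="python">
# Minimal sketch of Shannon's observation: series and parallel
# connections of switches realize Boolean AND and OR.

def series(a: bool, b: bool) -> bool:
    """Two switches in series conduct only if both are closed (AND)."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """Two switches in parallel conduct if either is closed (OR)."""
    return a or b

def complement(a: bool) -> bool:
    """A normally closed contact conducts when its switch is open (NOT)."""
    return not a

# A circuit that conducts when (x closed AND y open) OR z closed.
for x in (False, True):
    for y in (False, True):
        for z in (False, True):
            conducts = parallel(series(x, complement(y)), z)
            print(f"x={x}, y={y}, z={z} -> conducts: {conducts}")
</syntaxhighlight>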
 
Shannon received his PhD in mathematics from MIT in 1940.<ref name="MIT obituary"/> Vannevar Bush had suggested that Shannon work on his dissertation at the [[Cold Spring Harbor Laboratory]] in order to develop a mathematical formulation for [[Gregor Mendel|Mendelian]] [[genetics]]. This research resulted in Shannon's PhD thesis, titled ''An Algebra for Theoretical Genetics''.<ref>{{cite thesis|hdl=1721.1/11174|title=An Algebra for Theoretical Genetics|year=1940|publisher=Massachusetts Institute of Technology|type=Thesis|last1=Shannon|first1=Claude Elwood}} — Contains a biography on pp. 64–65.</ref> The thesis went unpublished after Shannon lost interest, but it contained important results.<ref name=":11" /> Notably, he was one of the first to apply an algebraic framework to the study of theoretical population genetics.<ref>{{Cite journal |last=Chalub |first=Fabio A. C. C. |last2=Souza |first2=Max O. |date=2017-12-01 |title=On the stochastic evolution of finite populations |url=https://doi.org/10.1007/s00285-017-1135-4 |journal=Journal of Mathematical Biology |language=en |volume=75 |issue=6 |pages=1735–1774 |doi=10.1007/s00285-017-1135-4 |issn=1432-1416}}</ref> In addition, Shannon devised a general expression for the distribution of several linked traits in a population after multiple generations under a random mating system, which was original at the time,<ref>{{Cite journal |last=Hanus |first=Pavol |last2=Goebel |first2=Bernhard |last3=Dingel |first3=Janis |last4=Weindl |first4=Johanna |last5=Zech |first5=Juergen |last6=Dawy |first6=Zaher |last7=Hagenauer |first7=Joachim |last8=Mueller |first8=Jakob C. |date=2007-11-27 |title=Information and communication theory in molecular biology |url=http://link.springer.com/10.1007/s00202-007-0062-6 |journal=Electrical Engineering |language=en |volume=90 |issue=2 |pages=161–173 |doi=10.1007/s00202-007-0062-6 |issn=0948-7921}}</ref> and he also derived a new theorem that had not been worked out by other [[Population genetics|population geneticists]] of the time.<ref>{{Cite web |last=Pachter |first=Lior |author-link=Lior Pachter |date=2013-11-06 |title=Claude Shannon, population geneticist |url=https://liorpachter.wordpress.com/2013/11/05/claude-shannon-population-geneticist/ |access-date=2024-07-29 |website=Bits of DNA |language=en}}</ref>
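As a hedged illustration of the simplest special case only (a single biallelic locus, not Shannon's general multi-locus expression for linked traits), the genotype distribution after one generation of random mating follows the familiar Hardy–Weinberg proportions:

<syntaxhighlight lang="python">
# Simplest special case for illustration (not Shannon's general result):
# after one generation of random mating at a single biallelic locus,
# genotype frequencies are p^2 (AA), 2pq (Aa), and q^2 (aa).

def hardy_weinberg(p: float) -> dict[str, float]:
    """Genotype distribution under random mating, given allele A frequency p."""
    q = 1.0 - p  # frequency of allele a
    return {"AA": p * p, "Aa": 2 * p * q, "aa": q * q}

print(hardy_weinberg(0.7))  # {'AA': 0.49, 'Aa': 0.42, 'aa': 0.09}
</syntaxhighlight>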
 
In 1940, Shannon became a National Research Fellow at the [[Institute for Advanced Study]] in [[Princeton, New Jersey]]. In Princeton, Shannon had the opportunity to discuss his ideas with influential scientists and [[mathematician]]s such as [[Hermann Weyl]] and [[John von Neumann]], and he also had occasional encounters with [[Albert Einstein]] and [[Kurt Gödel]]. Shannon worked freely across disciplines, and this ability may have contributed to his later development of mathematical [[information theory]].<ref>{{cite thesis|hdl=1721.1/39429|title=The Essential Message: Claude Shannon and the Making of Information Theory|year=2003|publisher=Massachusetts Institute of Technology|type=Thesis|last1=Guizzo|first1=Erico Marui}}</ref>