What does a JPEG have to do with economics and quantum gravity? All of them are about what happens when you simplify world-descriptions. A JPEG compresses an image by throwing out fine structure in ways a casual glance won't detect. Economists produce theories of human behavior that gloss over the details of individual psychology. Meanwhile, even our most sophisticated physics experiments can't show us the most fundamental building-blocks of matter, and so our theories have to make do with descriptions that blur out the smallest scales. The study of how theories change as we move to more or less detailed descriptions is known as renormalization.
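The coarse-graining described above can be sketched in a few lines of Python. This is an illustrative toy, not part of the tutorial itself: block averaging replaces each small patch of an "image" with its mean, discarding fine structure while preserving large-scale features, much as a lossy compressor or a renormalization step does.

```python
import numpy as np

# A toy "image": a smooth large-scale gradient plus fine-scale noise.
rng = np.random.default_rng(0)
fine = np.linspace(0, 1, 64)[:, None] + 0.1 * rng.standard_normal((64, 64))

def coarse_grain(image, block=4):
    """Replace each block x block patch with its mean, discarding fine detail."""
    h, w = image.shape
    return image.reshape(h // block, block, w // block, block).mean(axis=(1, 3))

coarse = coarse_grain(fine, block=4)
print(fine.shape, "->", coarse.shape)  # a 64x64 image becomes 16x16
# The gradient (large-scale structure) survives; the noise is averaged away,
# and block averaging exactly preserves the overall mean.
print(np.allclose(fine.mean(), coarse.mean()))
```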
This tutorial provides a modern introduction to renormalization from a complex systems point of view. Simon DeDeo will take students from basic concepts in information theory and image processing to some of the most important concepts in complexity, including emergence, coarse-graining, and effective theories. Only basic comfort with the use of probabilities is required for the majority of the material; some more advanced modules rely on more sophisticated algebra and basic calculus, but can be skipped. Solution sets include Python and Mathematica code to give more advanced learners hands-on experience with both mathematics and applications to data.
We'll introduce, in an elementary fashion, explicit examples of model-building including Markov Chains and Cellular Automata. We'll cover some new ideas for the description of complex systems including the Krohn-Rhodes theorem and State-Space Compression. And we'll show the connections between classic problems in physics, including the Ising model and plasma physics, and cutting-edge questions in machine learning and artificial intelligence.
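As a taste of the model-building mentioned above, here is a minimal elementary cellular automaton in plain Python (a sketch for orientation, not material from the tutorial). Rule 110 updates each cell from its three-cell neighborhood, yet generates intricate, non-repeating structure from a single live cell.

```python
def ca_step(cells, rule=110):
    """One update of an elementary cellular automaton with periodic boundaries."""
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        neighborhood = (left << 2) | (center << 1) | right  # 0..7
        out.append((rule >> neighborhood) & 1)              # look up the rule bit
    return out

# Start from a single live cell and iterate; complex patterns emerge
# from this eight-entry lookup table.
row = [0] * 31
row[15] = 1
for _ in range(15):
    print("".join(".#"[c] for c in row))
    row = ca_step(row)
```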
What is artificial intelligence? Could unintended consequences arise from increased use of this technology? How will the role of humans change with AI? How will AI evolve in the next 10 years?
In this episode, Haley interviews Melanie Mitchell, a leading complex systems scientist, Professor of Computer Science at Portland State University, and external professor at the Santa Fe Institute. Professor Mitchell answers many profound questions about the field of artificial intelligence and gives specific examples of how this technology is being used today. She also provides some insights to help us navigate our relationship with AI as it becomes more popular in the coming years.
About the Tutorial:
This tutorial will present you with the basics of how to use NetLogo to create an agent-based model. During the tutorial, we will briefly discuss what agent-based modeling is, and then dive into hands-on work using the NetLogo programming language, which is developed and supported at Northwestern University by Uri Wilensky. No programming background or knowledge is required, and the methods examined will be usable in any number of different fields.
About the Instructor(s):
Bill Rand is an assistant professor of Business Management at the Poole College of Management at North Carolina State University and a computer scientist by training. He has co-authored a textbook on agent-based modeling with Uri Wilensky, the author of the NetLogo programming language. He is also the author of over 50 scholarly papers, many of which use agent-based modeling as their core methodology. He received his doctorate in computer science in 2005 from the University of Michigan, and was awarded a postdoctoral fellowship at Northwestern University, where he worked directly with Uri Wilensky as part of the NetLogo development team.
Syllabus
- Introduction to ABM
- Tabs, Turtles, Patches, and Links
- Code, Control, and Collections
- Putting It All Together
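To give a flavor of what agent-based modeling involves, here is a minimal sketch in plain Python rather than NetLogo (NetLogo has its own language, so this is only an analogy): "turtle"-like agents take random steps on a grid, and we observe an aggregate, population-level property rather than any single agent.

```python
import random

random.seed(42)

class Walker:
    """A minimal agent: a position and a rule for one step of behavior."""
    def __init__(self):
        self.x, self.y = 0, 0

    def step(self):
        dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        self.x += dx
        self.y += dy

# Create a population of agents and advance the model tick by tick,
# as NetLogo's "go" procedure would.
agents = [Walker() for _ in range(100)]
for tick in range(50):
    for agent in agents:
        agent.step()

# A population-level observable: average distance from the origin.
spread = sum(abs(a.x) + abs(a.y) for a in agents) / len(agents)
print(f"mean distance from origin after 50 ticks: {spread:.1f}")
```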
WE’RE LAUNCHING A NEW TUTORIAL!
Fundamentals of NetLogo, a primer on the most used agent-based modeling software, will be available tomorrow.
Stay tuned for our launch announcement, and check out all our tutorials at https://t.co/APIkME07y5
— ComplexityExplorer (@ComplexExplorer) April 2, 2018
Introduction to Dynamical Systems and Chaos (Summer, 2016)
About the Course:
In this course you’ll gain an introduction to the modern study of dynamical systems, the interdisciplinary field of applied mathematics that studies systems that change over time.
Topics to be covered include: phase space, bifurcations, chaos, the butterfly effect, strange attractors, and pattern formation. The course will focus on some of the realizations from the study of dynamical systems that are of particular relevance to complex systems:
- Dynamical systems undergo bifurcations, where a small change in a system parameter such as the temperature or the harvest rate in a fishery leads to a large and qualitative change in the system’s behavior.
- Deterministic dynamical systems can behave randomly. This property, known as sensitive dependence or the butterfly effect, places strong limits on our ability to predict some phenomena.
- Disordered behavior can be stable. Non-periodic systems with the butterfly effect can have stable average properties. So the average or statistical properties of a system can be predictable, even if its details are not.
- Complex behavior can arise from simple rules. Simple dynamical systems do not necessarily lead to simple results. In particular, we will see that simple rules can produce patterns and structures of surprising complexity.
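The last two points can be illustrated with the logistic map, which the course covers. In this sketch (not course material), two trajectories of the map x → r·x·(1−x) at r = 4.0 start 10⁻¹⁰ apart and diverge until they are effectively unrelated: a simple deterministic rule producing the butterfly effect.

```python
def logistic(x, r=4.0):
    """One step of the logistic map, a one-line rule with chaotic dynamics."""
    return r * x * (1.0 - x)

# Two trajectories with almost identical initial conditions.
x, y = 0.2, 0.2 + 1e-10
for n in range(60):
    if n % 15 == 0:
        print(f"n={n:2d}  |x - y| = {abs(x - y):.3e}")
    x, y = logistic(x), logistic(y)
# The tiny initial separation grows roughly geometrically until it
# saturates, so long-term prediction of either trajectory is hopeless.
```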
About the Instructor:
David Feldman is Professor of Physics and Mathematics at College of the Atlantic. From 2004-2009 he was a faculty member in the Santa Fe Institute’s Complex Systems Summer School in Beijing, China. He served as the school’s co-director from 2006-2009. Dave is the author of Chaos and Fractals: An Elementary Introduction (Oxford University Press, 2012), a textbook on chaos and fractals for students with a background in high school algebra. Dave was a U.S. Fulbright Lecturer in Rwanda in 2011-12.
5 Jul 2016 9am PDT to
20 Sep 2016 3pm PDT
A familiarity with basic high school algebra. There will be optional lessons for those with stronger math backgrounds.
- Introduction I: Iterated Functions
- Introduction II: Differential Equations
- Chaos and the Butterfly Effect
- Bifurcations: Part I (Differential Equations)
- Bifurcations: Part II (Logistic Map)
- Phase Space
- Strange Attractors
- Pattern Formation
- Summary and Conclusions
Source: Complexity Explorer
- Brit Cruise (Khan Academy) Information Theory
- Seth Lloyd (Complexity Explorer/YouTube) Introduction to Information Theory
- Thomas Cover (Stanford | YouTube) Information Theory
- Raymond Yeung (Chinese University of Hong Kong | Coursera) Information Theory (May require account to see 3 or more archived versions)
- David MacKay (University of Cambridge) Information Theory, Inference, and Learning Algorithms
- Andrew Eckford (York University | YouTube) Coding and Information Theory
- S.N. Merchant (IIT Bombay | NPTEL :: Electronics & Communication Engineering) Introduction to Information Theory and Coding
Fortunately, most are pretty reasonable, though they vary in their coverage of topics. The introductory lectures don’t require as much mathematics and can probably be understood by those at the high school level with just a small amount of basic probability theory and an understanding of the logarithm.
The top three in the advanced section (they generally presume a prior undergraduate-level class in probability theory and some amount of mathematical sophistication) are from professors who’ve written some of the most commonly used college textbooks on the subject. If I recall correctly, a first edition of the Yeung text was available for download through his course interface. MacKay’s text is available for free download from his site as well.
Feel free to post other video lectures or resources you may be aware of in the comments below.
Editor’s Update: With sadness, I’ll note that David MacKay died just days after this was originally posted.
Introduction to Information Theory
About the Tutorial:
This tutorial introduces fundamental concepts in information theory. Information theory has had a considerable impact on the study of complex systems, and has in part co-evolved with complexity science. Research areas ranging from ecology and biology to aerospace and information technology have all seen benefits from the growth of information theory.
In this tutorial, students will follow the development of information theory from bits to modern applications in computing and communication. Along the way, Seth Lloyd introduces valuable topics in information theory such as mutual information, Boolean logic, channel capacity, and the natural relationship between information and entropy.
Lloyd coherently covers a substantial amount of material while limiting discussion of the mathematics involved. When formulas or derivations are considered, Lloyd describes the mathematics such that less advanced math students will find the tutorial accessible. Prerequisites for this tutorial are an understanding of logarithms, and at least a year of high-school algebra.
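Two of the quantities named above, entropy and mutual information, can be computed directly from their defining formulas. The following sketch (an illustration, not tutorial material; the joint distribution is an invented example of a noisy channel where the output copies the input 90% of the time) stays within the stated prerequisites of logarithms and algebra.

```python
from math import log2

def entropy(p):
    """Shannon entropy H = -sum p(x) log2 p(x), measured in bits."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

# A fair coin carries one bit per flip; a biased coin carries less.
print(entropy([0.5, 0.5]))  # 1.0 bit
print(entropy([0.9, 0.1]))  # about 0.469 bits

# Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint
# distribution where Y copies X with probability 0.9.
joint = [[0.45, 0.05],   # p(x=0, y=0), p(x=0, y=1)
         [0.05, 0.45]]   # p(x=1, y=0), p(x=1, y=1)
px = [sum(row) for row in joint]
py = [sum(col) for col in zip(*joint)]
hxy = entropy([p for row in joint for p in row])
print(entropy(px) + entropy(py) - hxy)  # mutual information in bits
```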
About the Instructor(s):
Professor Seth Lloyd is a principal investigator in the Research Laboratory of Electronics (RLE) at the Massachusetts Institute of Technology (MIT). He received his A.B. from Harvard College in 1982, the Certificate of Advanced Study in Mathematics (Part III) and an M. Phil. in Philosophy of Science from Cambridge University in 1983 and 1984 under a Marshall Fellowship, and a Ph.D. in Physics in 1988 from Rockefeller University under the supervision of Professor Heinz Pagels.
From 1988 to 1991, Professor Lloyd was a postdoctoral fellow in the High Energy Physics Department at the California Institute of Technology, where he worked with Professor Murray Gell-Mann on applications of information to quantum-mechanical systems. From 1991 to 1994, he was a postdoctoral fellow at Los Alamos National Laboratory, where he worked at the Center for Nonlinear Systems on quantum computation. In 1994, he joined the faculty of the Department of Mechanical Engineering at MIT. Since 1988, Professor Lloyd has also been an adjunct faculty member at the Santa Fe Institute.
Professor Lloyd has performed seminal work in the fields of quantum computation and quantum communications, including proposing the first technologically feasible design for a quantum computer, demonstrating the viability of quantum analog computation, proving quantum analogs of Shannon’s noisy channel theorem, and designing novel methods for quantum error correction and noise reduction.
Professor Lloyd is a member of the American Physical Society and the American Society of Mechanical Engineers.
Yoav Kallus is an Omidyar Fellow at the Santa Fe Institute. His research at the boundary of statistical physics and geometry looks at how and when simple interactions lead to the formation of complex order in materials and when preferred local order leads to system-wide disorder. Yoav holds a B.Sc. in physics from Rice University and a Ph.D. in physics from Cornell University. Before joining the Santa Fe Institute, Yoav was a postdoctoral fellow at the Princeton Center for Theoretical Science in Princeton University.
- Forms of Information
- Information and Probability
- Fundamental Formula of Information
- Computation and Logic: Information Processing
- Mutual Information
- Communication Capacity
- Shannon’s Coding Theorem
- The Manifold Things Information Measures