This collection of essays explores the authors’ work in, inquiry into, and critique of online learning, educational technology, and the trends, techniques, hopes, fears, and possibilities of digital pedagogy. For more information, visit urgencyofteachers.com.
Very slick! Greg McVerry, a professor, can post all of the readings, assignments, etc. for his EDU522 online course on his own website. Meanwhile, on my own website I can indicate that I’ve read the pieces or watched the videos, and post my responses to assignments and other classwork (as well as to fellow classmates’ work and questions), sending notifications via Webmention of all of the above to the original posts on their sites.
When I’m done with the course I’ll have my own archive of everything I did for the entire course (as well as copies on the Internet Archive, since I ping it as I go). His class website and my responses there could be used for the purposes of grading.
I can subscribe to his feed of posts for the class (or an aggregated one he’s made, sometimes known as a planet) and use the feed reader of my choice to consume the content (and that of my peers) at my own pace as I work my way through the course.
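For the technically curious, the Webmention notification mentioned above boils down to a simple form-encoded POST of a `source` and `target` URL to the receiver’s advertised endpoint. Here’s a minimal Python sketch using only the standard library; the URLs are hypothetical examples, not real course pages:

```python
# Minimal sketch of a Webmention notification (per the W3C Webmention spec):
# the sender POSTs form-encoded `source` and `target` URLs to the endpoint
# advertised by the target page. The URLs below are hypothetical.
from urllib.parse import urlencode

def build_webmention_payload(source: str, target: str) -> bytes:
    """Build the form-encoded body for a Webmention POST request."""
    return urlencode({"source": source, "target": target}).encode("utf-8")

payload = build_webmention_payload(
    "https://student.example/replies/edu522-week1",   # my response post
    "https://professor.example/edu522/assignment-1",  # the original assignment
)
# This payload would then be POSTed to the discovered endpoint, e.g. via
# urllib.request.Request(endpoint, data=payload,
#     headers={"Content-Type": "application/x-www-form-urlencoded"})
```

Most CMSes with Webmention support (WordPress with the Webmention plugin, WithKnown, etc.) handle the endpoint discovery and POST for you automatically.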
This is a lot closer to what I think online pedagogy or even the use of a Domain of One’s Own in an educational setting could and should be. I hope other educators might follow suit based on our examples. As an added bonus, if you’d like to try it out, Greg’s three-week course is, in fact, an open course for using IndieWeb and DoOO technologies for teaching. It’s just started, so I hope more will join us.
He’s focusing primarily on WordPress as the platform of choice in the course, but one could just as easily participate using other Webmention-enabled CMSes like WithKnown, Grav, Perch, Drupal, et al.
About the Course:
Probability and statistics have long helped scientists make sense of data about the natural world — to find meaningful signals in the noise. But classical statistics prove a little threadbare in today’s landscape of large datasets, which are driving new insights in disciplines ranging from biology to ecology to economics. It's as true in biology, with the advent of genome sequencing, as it is in astronomy, with telescope surveys charting the entire sky.
The data have changed. Maybe it's time our data analysis tools did, too.
During this three-month online course, starting June 11th, instructors Hector Zenil and Narsis Kiani will introduce students to concepts from the exciting new field of Algorithmic Information Dynamics to search for solutions to fundamental questions about causality — that is, why a particular set of circumstances leads to a particular outcome.
Algorithmic Information Dynamics (or Algorithmic Dynamics for short) is a new type of discrete calculus based on computer programming. It studies causation by generating mechanistic models that help find first principles of physical phenomena, building toward the next generation of machine learning.
The course covers key aspects from graph theory and network science, information theory, dynamical systems and algorithmic complexity. It will venture into ongoing research in fundamental science and its applications to behavioral, evolutionary and molecular biology.
Students should have basic knowledge of college-level math or physics, though optional sessions will help students with more technical concepts. Basic computer programming skills are also desirable, though not required. The course does not require students to adopt any particular programming language, but the Wolfram Language will be the one mostly used, and the instructors will share a great deal of code written in it that students will be able to use, study, and adapt for their own purposes.
- The course will begin with a conceptual overview of the field.
- Then it will review foundational theories like basic concepts of statistics and probability, notions of computability and algorithmic complexity, and brief introductions to graph theory and dynamical systems.
- Finally, the course explores new measures and tools related to reprogramming artificial and biological systems. It will showcase the tools and framework in applications to systems biology, genetic networks and cognition by way of behavioral sequences.
- Students will be able to apply the tools to their own data and problems. The instructors will explain in detail how to do this, and will provide all the tools and code to do so.
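For a rough taste of the algorithmic complexity ideas in the outline above: true algorithmic (Kolmogorov) complexity is uncomputable, but the length of a string’s compressed form is a crude, commonly used upper bound on it. This little sketch (my own illustration, not the course’s methodology) shows a structured string compressing far better than a random-looking one:

```python
# A crude stand-in for algorithmic (Kolmogorov) complexity: the length of a
# string's compressed form. True algorithmic complexity is uncomputable;
# compression length gives a rough upper bound.
import random
import zlib

def compressed_length(s: str) -> int:
    """Length in bytes of the zlib-compressed UTF-8 encoding of s."""
    return len(zlib.compress(s.encode("utf-8"), level=9))

regular = "01" * 500  # highly structured: 1000 characters, trivially describable
random.seed(42)
noisy = "".join(random.choice("01") for _ in range(1000))  # same alphabet, no pattern

# The structured string compresses to far fewer bytes than the noisy one,
# reflecting its much shorter "description".
```

The course’s own measures are more refined than raw compression, but the intuition — simpler objects have shorter descriptions — is the same.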
The course runs 11 June through 03 September 2018.
Tuition is $50 and is required for access to the course materials during the course and a certificate at the end. The course is free to watch, but if no fee is paid the materials will not be available until the course closes. Donations are highly encouraged and appreciated to support SFI’s Complexity Explorer in continuing to offer new courses.
In addition to all course materials, tuition includes:
- Six-month access to the Wolfram|One platform (potentially renewable for another six), worth $150 to $300.
- A free digital copy of the course textbook, to be published by Cambridge University Press.
- Several gifts will be given away to the top students finishing the course; check the FAQ page for more details.
Students with the best final projects will be invited to expand their results and submit them to the journal Complex Systems, the first journal in the field, founded by Stephen Wolfram in 1987.
About the Instructor(s):
Hector Zenil has a PhD in Computer Science from the University of Lille 1 and a PhD in Philosophy and Epistemology from the Pantheon-Sorbonne University of Paris. He co-leads the Algorithmic Dynamics Lab at the Science for Life Laboratory (SciLifeLab), Unit of Computational Medicine, Center for Molecular Medicine at the Karolinska Institute in Stockholm, Sweden. He is also the head of the Algorithmic Nature Group at LABoRES, the Paris-based lab that started the Online Algorithmic Complexity Calculator and the Human Randomness Perception and Generation Project. Previously, he was a Research Associate at the Behavioural and Evolutionary Theory Lab at the Department of Computer Science at the University of Sheffield in the UK before joining the Department of Computer Science, University of Oxford as a faculty member and senior researcher.
Narsis Kiani has a PhD in Mathematics and has been a postdoctoral researcher at Dresden University of Technology and at the University of Heidelberg in Germany. She has been a VINNOVA Marie Curie Fellow and Assistant Professor in Sweden. She co-leads the Algorithmic Dynamics Lab at the Science for Life Laboratory (SciLifeLab), Unit of Computational Medicine, Center for Molecular Medicine at the Karolinska Institute in Stockholm, Sweden. Narsis is also a member of the Algorithmic Nature Group, LABoRES.
Hector and Narsis are the leaders of the Algorithmic Dynamics Lab at the Unit of Computational Medicine at Karolinska Institute.
Alyssa Adams has a PhD in Physics from Arizona State University and studies what makes living systems different from non-living ones. She currently works at Veda Data Solutions as a data scientist and researcher in social complex systems that are represented by large datasets. She completed an internship at Microsoft Research, Cambridge, UK studying machine learning agents in Minecraft, which is an excellent arena for simple and advanced tasks related to living and social activity. Alyssa is also a member of the Algorithmic Nature Group, LABoRES.
The development of the course and material offered has been supported by:
- The Foundational Questions Institute (FQXi)
- Wolfram Research
- John Templeton Foundation
- Santa Fe Institute
- Swedish Research Council (Vetenskapsrådet)
- Algorithmic Nature Group, LABoRES for the Natural and Digital Sciences
- Living Systems Lab, King Abdullah University of Science and Technology.
- Department of Computer Science, Oxford University
- Cambridge University Press
- London Mathematical Society
- Springer Verlag
- ItBit for the Natural and Computational Sciences and, of course,
- the Algorithmic Dynamics lab, Unit of Computational Medicine, SciLifeLab, Center for Molecular Medicine, The Karolinska Institute
Course dates: 11 Jun 2018 9pm PDT to 03 Sep 2018 10pm PDT
- A Computational Approach to Causality
- A Brief Introduction to Graph Theory and Biological Networks
- Elements of Information Theory and Computability
- Randomness and Algorithmic Complexity
- Dynamical Systems as Models of the World
- Practice, Technical Skills and Selected Topics
- Algorithmic Information Dynamics and Reprogrammability
- Applications to Behavioural, Evolutionary and Molecular Biology
Another interesting course from the SFI. Looks like a great way to spend the summer.
About the Tutorial:
This tutorial will present you with the basics of how to use NetLogo to create an agent-based model. During the tutorial, we will briefly discuss what agent-based modeling is, and then dive in to hands-on work using the NetLogo programming language, which is developed and supported at Northwestern University by Uri Wilensky. No programming background or knowledge is required, and the methods examined will be usable in any number of different fields.
About the Instructor(s):
Bill Rand is an assistant professor of Business Management at the Poole College of Management at North Carolina State University and a computer scientist by training. He has co-authored a textbook on agent-based modeling with Uri Wilensky, the author of the NetLogo programming language. He is also the author of over 50 scholarly papers, many of which use agent-based modeling as their core methodology. He received his doctorate in computer science in 2005 from the University of Michigan, and was also awarded a postdoctoral fellowship at Northwestern University, where he worked directly with Uri Wilensky as part of the NetLogo development team.
Syllabus
- Introduction to ABM
- Tabs, Turtles, Patches, and Links
- Code, Control, and Collections
- Putting It All Together
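The turtles-taking-steps-each-tick structure the tutorial teaches can be sketched in a few lines of plain Python. This is my own illustrative toy (the class and function names are mine, not NetLogo’s API): a population of agents, each doing a random walk, updated once per tick:

```python
# A minimal agent-based model sketch, mirroring NetLogo's idea of "turtles"
# that each take a step every tick. Names here are illustrative only.
import math
import random

class Turtle:
    def __init__(self):
        self.x, self.y = 0.0, 0.0

    def step(self):
        # Move one unit in a uniformly random direction (a basic random walk).
        heading = random.uniform(0, 360)
        self.x += math.cos(math.radians(heading))
        self.y += math.sin(math.radians(heading))

def run_model(num_turtles=25, ticks=100, seed=1):
    """Create the agents, then update every agent once per tick."""
    random.seed(seed)
    turtles = [Turtle() for _ in range(num_turtles)]
    for _ in range(ticks):
        for t in turtles:
            t.step()
    return turtles

turtles = run_model()
```

NetLogo wraps this same loop — setup, then repeated `go` ticks over all agents — in its own language plus built-in visualization, which is what makes it so approachable for non-programmers.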
h/t to ComplexExplorer
WE’RE LAUNCHING A NEW TUTORIAL!
Fundamentals of NetLogo, a primer on the most used agent-based modeling software, will be available tomorrow.
Stay tuned for our launch announcement, and check out all our tutorials at https://t.co/APIkME07y5 pic.twitter.com/M8qIJp1R6x
— ComplexityExplorer (@ComplexExplorer) April 2, 2018
An introductory course in statistical mechanics.
Recommended textbook Thermal Physics by Charles Kittel and Herbert Kroemer
There’s also a corresponding video lecture series available on YouTube
Download 1200 free online courses from the world's top universities -- Stanford, Yale, MIT, & more. Over 30,000 hours of free audio & video lectures.
Learn about quantum computation and quantum information in this advanced graduate level course from MIT.
About this course
Already know something about quantum mechanics, quantum bits and quantum logic gates, but want to design new quantum algorithms, and explore multi-party quantum protocols? This is the course for you!
In this advanced graduate physics course on quantum computation and quantum information, we will cover:
- The formalism of quantum errors (density matrices, operator sum representations)
- Quantum error correction codes (stabilizers, graph states)
- Fault-tolerant quantum computation (normalizers, Clifford group operations, the Gottesman-Knill Theorem)
- Models of quantum computation (teleportation, cluster, measurement-based)
- Quantum Fourier transform-based algorithms (factoring, simulation)
- Quantum communication (noiseless and noisy coding)
- Quantum protocols (games, communication complexity)
Research problem ideas are presented along the journey.
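To give a flavor of the operator-sum representation the syllabus mentions: a quantum error channel maps a density matrix ρ to Σₖ EₖρEₖ†, where the Eₖ are Kraus operators. Here’s a small self-contained sketch (my own illustration, using plain 2×2 complex lists rather than any quantum library) of the standard single-qubit depolarizing channel:

```python
# Sketch of the operator-sum (Kraus) representation of a quantum error
# channel, applied to a single-qubit density matrix stored as a 2x2 list.
# Illustrates the depolarizing channel, a standard error model.

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def dagger(m):
    return [[m[j][i].conjugate() for j in range(2)] for i in range(2)]

def scale(c, m):
    return [[c * m[i][j] for j in range(2)] for i in range(2)]

def mat_add(a, b):
    return [[a[i][j] + b[i][j] for j in range(2)] for i in range(2)]

I = [[1, 0], [0, 1]]
X = [[0, 1], [1, 0]]
Y = [[0, -1j], [1j, 0]]
Z = [[1, 0], [0, -1]]

def depolarize(rho, p):
    """rho -> sum_k E_k rho E_k^dagger with E_0 = sqrt(1-p) I and
    E_k = sqrt(p/3) {X, Y, Z}; the sqrt factors appear squared below."""
    out = scale(1 - p, mat_mul(I, mat_mul(rho, dagger(I))))
    for pauli in (X, Y, Z):
        out = mat_add(out, scale(p / 3, mat_mul(pauli, mat_mul(rho, dagger(pauli)))))
    return out

rho0 = [[1, 0], [0, 0]]        # the pure state |0><0|
rho = depolarize(rho0, 0.3)
trace = rho[0][0] + rho[1][1]  # a valid channel preserves the trace (= 1)
```

Because Σₖ Eₖ†Eₖ = I, the channel is trace-preserving: the output is still a legitimate density matrix, just a noisier (more mixed) one.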
What you’ll learn
- Formalisms for describing errors in quantum states and systems
- Quantum error correction theory
- Fault-tolerant quantum procedure constructions
- Models of quantum computation beyond gates
- Structures of exponentially-fast quantum algorithms
- Multi-party quantum communication protocols
Meet the instructor
Isaac Chuang, Professor of Electrical Engineering and Computer Science and Professor of Physics, MIT
A very beautiful classical theory on field extensions of a certain type (Galois extensions) initiated by Galois in the 19th century. It explains, in particular, why it is not possible to solve an equation of degree 5 or more in the same way as we solve quadratic or cubic equations. You will learn to compute Galois groups and (before that) study the properties of various field extensions.

We first shall survey the basic notions and properties of field extensions: algebraic, transcendental, finite field extensions, degree of an extension, algebraic closure, decomposition field of a polynomial. Then we shall do a bit of commutative algebra (finite algebras over a field, base change via tensor product) and apply this to study the notion of separability in some detail. After that we shall discuss Galois extensions and the Galois correspondence and give many examples (cyclotomic extensions, finite fields, Kummer extensions, Artin-Schreier extensions, etc.). We shall address the question of solvability of equations by radicals (Abel’s theorem). We shall also try to explain the relation to representations and to topological coverings. Finally, we shall briefly discuss extensions of rings (integral elements, norms, traces, etc.) and explain how to use reduction modulo primes to compute Galois groups.
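For readers who haven’t met Galois theory before, the standard first example gives a taste of what the course description above is talking about:

```latex
% The splitting field of x^2 - 2 over Q: Q(sqrt(2))/Q is a Galois
% extension of degree 2, and its Galois group is generated by the
% conjugation automorphism sigma.
\[
  \mathrm{Gal}\bigl(\mathbb{Q}(\sqrt{2})/\mathbb{Q}\bigr)
    = \{\,\mathrm{id},\ \sigma\,\} \cong \mathbb{Z}/2\mathbb{Z},
  \qquad \sigma\bigl(a + b\sqrt{2}\bigr) = a - b\sqrt{2}.
\]
```

The punchline the course builds toward: the Galois group of a general quintic is S₅, which is not a solvable group, and that is precisely why no general formula in radicals exists for degree-5 equations.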
I’ve been watching MOOCs for several years and this is one of the few I’ve come across that covers some more advanced mathematical topics. I’m curious to see how it turns out and what type of interest/results it returns.
It’s being offered by the National Research University – Higher School of Economics (HSE) in Russia.
Introduction to Dynamical Systems and Chaos (Summer, 2016)
About the Course:
In this course you’ll gain an introduction to the modern study of dynamical systems, the interdisciplinary field of applied mathematics that studies systems that change over time.
Topics to be covered include: phase space, bifurcations, chaos, the butterfly effect, strange attractors, and pattern formation. The course will focus on some of the realizations from the study of dynamical systems that are of particular relevance to complex systems:
- Dynamical systems undergo bifurcations, where a small change in a system parameter, such as the temperature or the harvest rate in a fishery, leads to a large and qualitative change in the system’s behavior.
- Deterministic dynamical systems can behave randomly. This property, known as sensitive dependence or the butterfly effect, places strong limits on our ability to predict some phenomena.
- Disordered behavior can be stable. Non-periodic systems with the butterfly effect can have stable average properties. So the average or statistical properties of a system can be predictable, even if its details are not.
- Complex behavior can arise from simple rules. Simple dynamical systems do not necessarily lead to simple results. In particular, we will see that simple rules can produce patterns and structures of surprising complexity.
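The second and fourth points above can be seen numerically with the logistic map, which the syllabus below covers. In this little sketch of my own, two orbits of xₙ₊₁ = r·xₙ·(1 − xₙ) at r = 4 (a chaotic parameter value) start one part in 10¹⁰ apart and end up completely different:

```python
# Numerical illustration of sensitive dependence (the butterfly effect)
# using the logistic map x_{n+1} = r * x_n * (1 - x_n) at r = 4.0,
# a parameter value in the chaotic regime.

def logistic_orbit(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0, returning the whole orbit."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_orbit(0.2)
b = logistic_orbit(0.2 + 1e-10)  # perturbed by one part in ten billion

early_gap = abs(a[1] - b[1])                           # still tiny
late_gap = max(abs(x - y) for x, y in zip(a[30:], b[30:]))  # order one
```

The tiny initial difference roughly doubles each iteration, so after a few dozen steps the two orbits are unrelated, even though the rule itself is simple and fully deterministic.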
About the Instructor:
David Feldman is Professor of Physics and Mathematics at College of the Atlantic. From 2004-2009 he was a faculty member in the Santa Fe Institute’s Complex Systems Summer School in Beijing, China. He served as the school’s co-director from 2006-2009. Dave is the author of Chaos and Fractals: An Elementary Introduction (Oxford University Press, 2012), a textbook on chaos and fractals for students with a background in high school algebra. Dave was a U.S. Fulbright Lecturer in Rwanda in 2011-12.
Course dates: 5 Jul 2016 9am PDT to 20 Sep 2016 3pm PDT
Prerequisites: a familiarity with basic high school algebra. There will be optional lessons for those with stronger math backgrounds.
- Introduction I: Iterated Functions
- Introduction II: Differential Equations
- Chaos and the Butterfly Effect
- Bifurcations: Part I (Differential Equations)
- Bifurcations: Part II (Logistic Map)
- Phase Space
- Strange Attractors
- Pattern Formation
- Summary and Conclusions
Source: Complexity Explorer
There aren’t a lot of available online lectures on the subject of information theory, but here are the ones I’m currently aware of:
- Brit Cruise (Khan Academy) Information Theory
- Seth Lloyd (Complexity Explorer/YouTube) Introduction to Information Theory
- Thomas Cover (Stanford | YouTube) Information Theory
- Raymond Yeung (Chinese University of Hong Kong | Coursera) Information Theory (May require account to see 3 or more archived versions)
- David MacKay (University of Cambridge) Information Theory, Inference, and Learning Algorithms
- Andrew Eckford (York University | YouTube) Coding and Information Theory
- S.N. Merchant (IIT Bombay | NPTEL :: Electronics & Communication Engineering) Introduction to Information Theory and Coding
Fortunately, most are pretty reasonable, though they vary in their coverage of topics. The introductory lectures don’t require as much mathematics and can probably be understood by those at the high school level with just a small amount of basic probability theory and an understanding of the logarithm.
The top three in the advanced section (which generally presume a prior undergraduate-level class in probability theory and some amount of mathematical sophistication) are from professors who’ve written some of the most commonly used college textbooks on the subject. If I recall correctly, a first edition of the Yeung text was available for download through his course interface. MacKay’s text is available for free download from his site as well.
Feel free to post other video lectures or resources you may be aware of in the comments below.
Editor’s Update: With sadness, I’ll note that David MacKay died just days after this was originally posted.
Many readers often ask me for resources for delving into the basics of information theory. I hadn’t posted it before, but the Santa Fe Institute recently had an online course Introduction to Information Theory through their Complexity Explorer, which has some other excellent offerings. It included videos, fora, and other resources and was taught by the esteemed physicist and professor Seth Lloyd. There are a number of currently active students still learning and posting there.
Introduction to Information Theory
About the Tutorial:
This tutorial introduces fundamental concepts in information theory. Information theory has made considerable impact in complex systems, and has in part co-evolved with complexity science. Research areas ranging from ecology and biology to aerospace and information technology have all seen benefits from the growth of information theory.
In this tutorial, students will follow the development of information theory from bits to modern application in computing and communication. Along the way Seth Lloyd introduces valuable topics in information theory such as mutual information, boolean logic, channel capacity, and the natural relationship between information and entropy.
Lloyd coherently covers a substantial amount of material while limiting discussion of the mathematics involved. When formulas or derivations are considered, Lloyd describes the mathematics such that less advanced math students will find the tutorial accessible. Prerequisites for this tutorial are an understanding of logarithms, and at least a year of high-school algebra.
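The “natural relationship between information and entropy” that Lloyd develops centers on Shannon’s formula H(X) = −Σᵢ pᵢ log₂ pᵢ, measured in bits. A tiny worked sketch (my own, not from the tutorial) shows the idea:

```python
# Shannon entropy H(X) = -sum_i p_i * log2(p_i), in bits: the average
# uncertainty (or information content) of a discrete random variable.
from math import log2

def shannon_entropy(probs):
    """Entropy in bits of a distribution given as probabilities summing to 1."""
    return -sum(p * log2(p) for p in probs if p > 0)

fair_coin = shannon_entropy([0.5, 0.5])       # maximally uncertain: 1 bit
biased_coin = shannon_entropy([0.9, 0.1])     # more predictable: < 1 bit
four_sided_die = shannon_entropy([0.25] * 4)  # four equal outcomes: 2 bits
```

A fair coin flip carries exactly one bit; a biased coin carries less, because its outcome is partly predictable. This is the quantity that Shannon’s coding theorem (in the syllabus below) ties to the limits of compression and communication.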
About the Instructor(s):
Professor Seth Lloyd is a principal investigator in the Research Laboratory of Electronics (RLE) at the Massachusetts Institute of Technology (MIT). He received his A.B. from Harvard College in 1982, the Certificate of Advanced Study in Mathematics (Part III) and an M. Phil. in Philosophy of Science from Cambridge University in 1983 and 1984 under a Marshall Fellowship, and a Ph.D. in Physics in 1988 from Rockefeller University under the supervision of Professor Heinz Pagels.
From 1988 to 1991, Professor Lloyd was a postdoctoral fellow in the High Energy Physics Department at the California Institute of Technology, where he worked with Professor Murray Gell-Mann on applications of information to quantum-mechanical systems. From 1991 to 1994, he was a postdoctoral fellow at Los Alamos National Laboratory, where he worked at the Center for Nonlinear Systems on quantum computation. In 1994, he joined the faculty of the Department of Mechanical Engineering at MIT. Since 1988, Professor Lloyd has also been an adjunct faculty member at the Santa Fe Institute.
Professor Lloyd has performed seminal work in the fields of quantum computation and quantum communications, including proposing the first technologically feasible design for a quantum computer, demonstrating the viability of quantum analog computation, proving quantum analogs of Shannon’s noisy channel theorem, and designing novel methods for quantum error correction and noise reduction.
Professor Lloyd is a member of the American Physical Society and the American Society of Mechanical Engineers.
Yoav Kallus is an Omidyar Fellow at the Santa Fe Institute. His research at the boundary of statistical physics and geometry looks at how and when simple interactions lead to the formation of complex order in materials and when preferred local order leads to system-wide disorder. Yoav holds a B.Sc. in physics from Rice University and a Ph.D. in physics from Cornell University. Before joining the Santa Fe Institute, Yoav was a postdoctoral fellow at the Princeton Center for Theoretical Science in Princeton University.
- Forms of Information
- Information and Probability
- Fundamental Formula of Information
- Computation and Logic: Information Processing
- Mutual Information
- Communication Capacity
- Shannon’s Coding Theorem
- The Manifold Things Information Measures
Two of my favorite topics: Food and Science!
The University of East Anglia in the UK, in association with the Institute of Food Research, is offering a free four-week course, Identifying Food Fraud. It’s an introduction to modern analytical science techniques and how they can be used to uncover food fraud.
I know many people who could identify a fake Louis Vuitton (LVMH) purse, a knock-off Christian Louboutin, or a sham Rolex, but who are simultaneously overly religious about their food brands and topics like organic food, yet couldn’t similarly identify the fakes they’re eating, the result of fraudulent food labeling, misdirection, and legerdemain within the food supply chain. Finally there’s a course to help everyone become a smarter consumer.
The food industry is one of the most important commercial sectors in the world. Everyone uses it, but how many people abuse it? As we witness the increasing globalisation of the supply chain, a growing challenge is verifying the questionable identity of raw materials in the food we eat.
In this course we will look at topical issues concerning ‘food fraud’ and explore ways in which analytical chemistry can help in its identification and prevention. We’ll share fascinating examples, such as the history of white bread and a surprising ingredient once found in bitter beer.
The University of East Anglia has joined forces with the world-renowned Institute of Food Research (IFR) to bring you this unique course. You’ll be led by Kate Kemsley, a specialist in the use of advanced instrumentation for measuring the chemical composition of food materials. Course content is linked with UEA’s MChem postgraduate programme, which supports final-year students’ practical research projects in this area of science.
For those interested, the course starts on October 26, 2015.
Robert Greenberg recently wrote a Facebook post relating to a New York Times review article entitled “For This Class, Professors Pass Screen Test”. It’s substantively about The Teaching Company and their series The Great Courses (TGC); for convenience I’ll excerpt his comments in their entirety below:
My response to his post with some thoughts of my own follows:
This is an interesting and very germane review. As someone who’s both worked in the entertainment industry and followed the MOOC (massively open online courseware) revolution over the past decade, I very often consider the physical production value of TGC’s offerings and have been generally pleased at their steady improvement over time. Not only do they offer some generally excellent content, but they’re entertaining and pleasing to watch. From a multimedia perspective, I’m always amazed at what they offer and that generally the difference between the video versus the audio-only versions isn’t as drastic as one might otherwise expect. Though there are times that I think that TGC might include some additional graphics, maps, etc. either in the course itself or in the booklets, I’m impressed that they still function exceptionally well without them.
Within the MOOC revolution, Sue Alcott’s Coursera course Archaeology’s Dirty Little Secrets is still by far the best produced multi-media course I’ve come across. It’s going to take a lot of serious effort for other courses to come up to this level of production however. It’s one of the few courses which I think rivals that of The Teaching Company’s offerings thus far. Unfortunately, the increased competition in the MOOC space is going to eventually encroach on the business model of TGC, and I’m curious to see how that will evolve and how it will benefit students. Will TGC be forced to offer online fora for students to interact with each other the way most MOOCs do? Will MOOCs be forced to drastically increase their production quality to the level of TGC? Will certificates or diplomas be offered for courseware? Will the subsequent models be free (like most MOOCs now), paid like TGC, or some mixture of the two?
One area which neither platform seems to be doing very well at present is offering more advanced coursework. Naturally the primary difficulty is in having enough audience to justify the production effort. The audience for a graduate level topology class is simply far smaller than introductory courses in history or music appreciation, but those types of courses will eventually have to exist to make the enterprises sustainable – in addition to the fact that they add real value to society. Another difficulty is that advanced coursework usually requires some significant work outside of the lecture environment – readings, homework, etc. MOOCs seem to have a slight upper hand here while TGC has generally relied on all of the significant material being offered in a lecture with the suggestion of reading their accompanying booklets and possibly offering supplementary bibliographies. When are we going to start seeing course work at the upper-level undergraduate or graduate level?
The nice part is that with evolving technology and capabilities, there are potentially new pedagogic methods that will allow easier teaching of some material that may not have been possible previously. (For some brief examples, see this post I wrote last week on Latin and the digital humanities.) In particular, I’m sure many of us have been astounded and pleased at how Dr. Greenberg managed the supreme gymnastics of offering of “Understanding the Fundamentals of Music” without delving into traditional music theory and written notation, but will he be able to actually offer that in new and exciting ways to increase our levels of understanding of music and then spawn off another 618 lectures that take us all further and deeper into his exciting world? Perhaps it comes in the form of a multimedia mobile app? We’re all waiting with bated breath, because regardless of how he pulls it off, we know it’s going to be educational, entertaining and truly awe inspiring.
Following my commentary, Scott Ableman, the Chief Marketing Officer for TGC, responded with the following, which I find very interesting: