If you go the older route, one of the best planet-like sites I’ve seen was http://connectedcourses.net/, which, if I recall correctly, was built by Alan Levine. If you poke around a bit or ask @cogdog on Twitter, I think there are some details or a recipe somewhere for how he put it together.
New York Times Columnist Nicholas Kristof thinks understanding the basics of the economy will help you get far in life.
College students receive any number of recommendations for introductory courses. But according to New York Times columnist Nicholas Kristof, one of the key classes you need to take to succeed in life might be an introductory economics course.
As part of the Marketplace Morning Report’s “Econ Extra Credit” project, host David Brancaccio spoke with Kristof about how an Econ 101 class can provide a student with a robust toolbox that could be used later in life to both understand and address larger issues like rent control or how to fund a tax cut.
“We’ve repeatedly mangled macroeconomic policy in the U.S.,” Kristof said. “It’s pretty obvious that even lawmakers kind of have no clue about really basic issues, like you know, what a fiscal stimulus is.”
Click on the player above to hear their conversation on the merits of Econ 101, as well as Kristof’s thoughts on how introductory economics has adapted to better reflect real world economic issues.
This interview is part of our “Econ Extra Credit” project, where we read a new introductory economics textbook provided by the non-profit Core-Econ together with our listeners. If you’d like to join us in this project, email MorningReport@marketplace.org and let us know you’re reading along with Marketplace through the end of Spring.
Naturally I worry that the participation rates will start high and end low, but the fact that they’re encouraging their listeners to expand themselves and delve a bit deeper than just listening to their show is fantastic.
And honestly, who couldn’t use an ECON refresher from time to time–particularly one that takes a dramatically different approach to the subject than the one many of us took?
Just finished some major updates to the @OpenScienceMOOC, including how to set up your @ORCID_Org, @Impactstory, @OSFramework, and @Publons profiles. Whew! Available in PDF, HTML, Jupyter notebook, and markdown formats. Feedback appreciated! https://t.co/HPCkcbDLQO pic.twitter.com/3FdVeHXBLl— Jon Tennant (@Protohedgehog) November 17, 2018
This collection of essays explores the authors’ work in, inquiry into, and critique of online learning, educational technology, and the trends, techniques, hopes, fears, and possibilities of digital pedagogy. For more information, visit urgencyofteachers.com.
When I’m done with the course I’ll have my own archive of everything I did for the entire course (as well as copies on the Internet Archive, since I ping it as I go). His class website and my responses there could be used for the purposes of grading.
I can subscribe to his feed of posts for the class (or an aggregated one he’s made, sometimes known as a planet) and use the feed reader of my choice to consume the content (and that of my peers) at my own pace to work my way through the course.
This is a lot closer to what I think online pedagogy or even the use of a Domain of One’s Own in an educational setting could and should be. I hope other educators might follow suit based on our examples. As an added bonus, if you’d like to try it out, Greg’s three week course is, in fact, an open course for using IndieWeb and DoOO technologies for teaching. It’s just started, so I hope more will join us.
He’s focusing primarily on using WordPress as the platform of choice in the course, but one could just as easily use other Webmention enabled CMSes like WithKnown, Grav, Perch, Drupal, et al. to participate.
In its quest to find a sustainable business model, online course provider edX will test charging users for access to previously free content. Observers say the move was inevitable.
About the Course:
Probability and statistics have long helped scientists make sense of data about the natural world — to find meaningful signals in the noise. But classical statistics prove a little threadbare in today’s landscape of large datasets, which are driving new insights in disciplines ranging from biology to ecology to economics. It's as true in biology, with the advent of genome sequencing, as it is in astronomy, with telescope surveys charting the entire sky.
The data have changed. Maybe it's time our data analysis tools did, too.
During this three-month online course, starting June 11th, instructors Hector Zenil and Narsis Kiani will introduce students to concepts from the exciting new field of Algorithmic Information Dynamics and the search for solutions to fundamental questions about causality: that is, why a particular set of circumstances leads to a particular outcome.
Algorithmic Information Dynamics (or Algorithmic Dynamics for short) is a new type of discrete calculus, based on computer programming, that studies causation by generating mechanistic models. The aim is to help find first principles of physical phenomena and, in doing so, build up the next generation of machine learning.
The course covers key aspects from graph theory and network science, information theory, dynamical systems and algorithmic complexity. It will venture into ongoing research in fundamental science and its applications to behavioral, evolutionary and molecular biology.
Students should have basic knowledge of college-level math or physics, though optional sessions will help students with the more technical concepts. Basic computer programming skills are also desirable, though not required. The course does not require students to adopt any particular programming language, but the Wolfram Language will be used most heavily, and the instructors will share a great deal of code written in it that students will be able to use, study, and adapt for their own purposes.
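The course's central quantity, algorithmic (Kolmogorov) complexity, is uncomputable in general; the instructors' own estimators (available through their Online Algorithmic Complexity Calculator, mentioned below) are far more refined than anything shown here. Still, a common rough stand-in is compressed length, which gives a feel for the idea. This is only an illustrative sketch, not the course's actual method; `compression_complexity` is a name I've made up for it.

```python
import random
import zlib

def compression_complexity(s: str) -> int:
    """Length in bytes of the zlib-compressed string: a crude
    upper-bound proxy for (uncomputable) algorithmic complexity."""
    return len(zlib.compress(s.encode("utf-8"), level=9))

# A highly regular string compresses far better than a pseudo-random
# one of the same length, hinting at its lower algorithmic complexity.
random.seed(42)
regular = "01" * 500
noisy = "".join(random.choice("01") for _ in range(1000))

print(compression_complexity(regular))  # small
print(compression_complexity(noisy))    # much larger
```

Compression-based proxies like this are known to be blunt instruments for short strings, which is precisely the gap the course's methods aim to address.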
- The course will begin with a conceptual overview of the field.
- Then it will review foundational theories like basic concepts of statistics and probability, notions of computability and algorithmic complexity, and brief introductions to graph theory and dynamical systems.
- Finally, the course explores new measures and tools related to reprogramming artificial and biological systems. It will showcase the tools and framework in applications to systems biology, genetic networks and cognition by way of behavioral sequences.
- Students will be able to apply the tools to their own data and problems. The instructors will explain in detail how to do this, and will provide all the tools and code to do so.
The course runs 11 June through 03 September 2018.
Tuition is $50, which is required to access the course materials during the course and to receive a certificate at the end. The course is free to watch, but if no fee is paid, the materials will not be available until the course closes. Donations are highly encouraged and appreciated in support of SFI's Complexity Explorer, so it can continue offering new courses.
In addition to all course materials, tuition includes:
- Six-month access to the Wolfram|One platform (potentially renewable for another six months), worth $150 to $300.
- Free digital copy of the course textbook to be published by Cambridge University Press.
- Several gifts will be given away to the top students finishing the course; check the FAQ page for more details.
Students with the best final projects will be invited to expand their results and submit them to the journal Complex Systems, the first journal in the field, founded by Stephen Wolfram in 1987.
About the Instructor(s):
Hector Zenil has a PhD in Computer Science from the University of Lille 1 and a PhD in Philosophy and Epistemology from the Pantheon-Sorbonne University of Paris. He co-leads the Algorithmic Dynamics Lab at the Science for Life Laboratory (SciLifeLab), Unit of Computational Medicine, Center for Molecular Medicine at the Karolinska Institute in Stockholm, Sweden. He is also the head of the Algorithmic Nature Group at LABoRES, the Paris-based lab that started the Online Algorithmic Complexity Calculator and the Human Randomness Perception and Generation Project. Previously, he was a Research Associate at the Behavioural and Evolutionary Theory Lab at the Department of Computer Science at the University of Sheffield in the UK before joining the Department of Computer Science, University of Oxford as a faculty member and senior researcher.
Narsis Kiani has a PhD in Mathematics and has been a postdoctoral researcher at Dresden University of Technology and at the University of Heidelberg in Germany. She has been a VINNOVA Marie Curie Fellow and Assistant Professor in Sweden. She co-leads the Algorithmic Dynamics Lab at the Science for Life Laboratory (SciLifeLab), Unit of Computational Medicine, Center for Molecular Medicine at the Karolinska Institute in Stockholm, Sweden. Narsis is also a member of the Algorithmic Nature Group, LABoRES.
Hector and Narsis are the leaders of the Algorithmic Dynamics Lab at the Unit of Computational Medicine at Karolinska Institute.
Alyssa Adams has a PhD in Physics from Arizona State University and studies what makes living systems different from non-living ones. She currently works at Veda Data Solutions as a data scientist and researcher in social complex systems that are represented by large datasets. She completed an internship at Microsoft Research, Cambridge, UK studying machine learning agents in Minecraft, which is an excellent arena for simple and advanced tasks related to living and social activity. Alyssa is also a member of the Algorithmic Nature Group, LABoRES.
The development of the course and material offered has been supported by:
- The Foundational Questions Institute (FQXi)
- Wolfram Research
- John Templeton Foundation
- Santa Fe Institute
- Swedish Research Council (Vetenskapsrådet)
- Algorithmic Nature Group, LABoRES for the Natural and Digital Sciences
- Living Systems Lab, King Abdullah University of Science and Technology.
- Department of Computer Science, Oxford University
- Cambridge University Press
- London Mathematical Society
- Springer Verlag
- ItBit for the Natural and Computational Sciences and, of course,
- the Algorithmic Dynamics lab, Unit of Computational Medicine, SciLifeLab, Center for Molecular Medicine, The Karolinska Institute
Course dates: 11 Jun 2018 9pm PDT to 03 Sep 2018 10pm PDT
- A Computational Approach to Causality
- A Brief Introduction to Graph Theory and Biological Networks
- Elements of Information Theory and Computability
- Randomness and Algorithmic Complexity
- Dynamical Systems as Models of the World
- Practice, Technical Skills and Selected Topics
- Algorithmic Information Dynamics and Reprogrammability
- Applications to Behavioural, Evolutionary and Molecular Biology
About the Tutorial:
This tutorial will present you with the basics of how to use NetLogo to create an agent-based model. During the tutorial, we will briefly discuss what agent-based modeling is, and then dive into hands-on work using the NetLogo programming language, which is developed and supported at Northwestern University by Uri Wilensky. No programming background or knowledge is required, and the methods examined will be usable in any number of different fields.

About the Instructor(s):
Bill Rand is an assistant professor of Business Management at the Poole College of Management at North Carolina State University and a computer scientist by training. He has co-authored a textbook on agent-based modeling with Uri Wilensky, the author of the NetLogo programming language. He is also the author of over 50 scholarly papers, many of which use agent-based modeling as their core methodology. He received his doctorate in computer science in 2005 from the University of Michigan, and was also awarded a postdoctoral fellowship at Northwestern University, where he worked directly with Uri Wilensky as part of the NetLogo development team.

Syllabus
- Introduction to ABM
- Tabs, Turtles, Patches, and Links
- Code, Control, and Collections
- Putting It All Together
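The "turtles" and "go loop" structure the syllabus describes translates readily to other languages. As a rough Python analogue (my own sketch, not tutorial material, and no substitute for NetLogo's built-in world and visualization), a minimal agent-based model is just a set of agents updated in a loop:

```python
import random

class Turtle:
    """A minimal NetLogo-style agent: a grid position plus a random walk."""
    def __init__(self, world_size: int):
        self.world_size = world_size
        self.x = random.randrange(world_size)
        self.y = random.randrange(world_size)

    def step(self):
        # Move one patch in a random direction, wrapping at the edges
        # the way NetLogo's default toroidal world does.
        dx, dy = random.choice([(0, 1), (0, -1), (1, 0), (-1, 0)])
        self.x = (self.x + dx) % self.world_size
        self.y = (self.y + dy) % self.world_size

def run_model(num_turtles: int = 10, world_size: int = 20, ticks: int = 100):
    """Create the agents ("setup"), then advance the clock ("go")."""
    random.seed(1)
    turtles = [Turtle(world_size) for _ in range(num_turtles)]
    for _ in range(ticks):
        for t in turtles:
            t.step()
    return turtles

turtles = run_model()
print(len(turtles), "turtles after the run")
```

Interesting models differ from this skeleton mainly in the `step` rule, where agents react to their patch and to each other rather than wandering at random.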
WE’RE LAUNCHING A NEW TUTORIAL!
Fundamentals of NetLogo, a primer on the most used agent-based modeling software, will be available tomorrow.
Stay tuned for our launch announcement, and check out all our tutorials at https://t.co/APIkME07y5 pic.twitter.com/M8qIJp1R6x
— ComplexityExplorer (@ComplexExplorer) April 2, 2018
One overhyped technology fades as another surges.
An introductory course in statistical mechanics.
There’s also a corresponding video lecture series available on YouTube.
Download 1200 free online courses from the world's top universities -- Stanford, Yale, MIT, & more. Over 30,000 hours of free audio & video lectures.
Learn about quantum computation and quantum information in this advanced graduate level course from MIT.
About this course
Already know something about quantum mechanics, quantum bits and quantum logic gates, but want to design new quantum algorithms, and explore multi-party quantum protocols? This is the course for you!
In this advanced graduate physics course on quantum computation and quantum information, we will cover:
- The formalism of quantum errors (density matrices, operator sum representations)
- Quantum error correction codes (stabilizers, graph states)
- Fault-tolerant quantum computation (normalizers, Clifford group operations, the Gottesman-Knill Theorem)
- Models of quantum computation (teleportation, cluster, measurement-based)
- Quantum Fourier transform-based algorithms (factoring, simulation)
- Quantum communication (noiseless and noisy coding)
- Quantum protocols (games, communication complexity)
Research problem ideas are presented along the journey.
What you’ll learn
- Formalisms for describing errors in quantum states and systems
- Quantum error correction theory
- Fault-tolerant quantum procedure constructions
- Models of quantum computation beyond gates
- Structures of exponentially-fast quantum algorithms
- Multi-party quantum communication protocols
Meet the instructor
Isaac Chuang Professor of Electrical Engineering and Computer Science, and Professor of Physics MIT
Galois theory is a very beautiful classical theory on field extensions of a certain type (Galois extensions), initiated by Galois in the 19th century. It explains, in particular, why it is not possible to solve an equation of degree 5 or more in the same way as we solve quadratic or cubic equations. You will learn to compute Galois groups and (before that) study the properties of various field extensions. We first shall survey the basic notions and properties of field extensions: algebraic, transcendental, and finite field extensions; the degree of an extension; algebraic closure; and the decomposition field of a polynomial. Then we shall do a bit of commutative algebra (finite algebras over a field, base change via tensor product) and apply this to study the notion of separability in some detail. After that we shall discuss Galois extensions and the Galois correspondence and give many examples (cyclotomic extensions, finite fields, Kummer extensions, Artin-Schreier extensions, etc.). We shall address the question of solvability of equations by radicals (the Abel theorem). We shall also try to explain the relation to representations and to topological coverings. Finally, we shall briefly discuss extensions of rings (integral elements, norms, traces, etc.) and explain how to use reduction modulo primes to compute Galois groups.
It’s being offered by National Research University – Higher School of Economics (HSE) in Russia.
Introduction to Dynamical Systems and Chaos (Summer, 2016)
About the Course:
In this course you’ll gain an introduction to the modern study of dynamical systems, the interdisciplinary field of applied mathematics that studies systems that change over time.
Topics to be covered include: phase space, bifurcations, chaos, the butterfly effect, strange attractors, and pattern formation. The course will focus on some of the realizations from the study of dynamical systems that are of particular relevance to complex systems:
- Dynamical systems undergo bifurcations, where a small change in a system parameter such as the temperature or the harvest rate in a fishery leads to a large and qualitative change in the system’s behavior.
- Deterministic dynamical systems can behave randomly. This property, known as sensitive dependence or the butterfly effect, places strong limits on our ability to predict some phenomena.
- Disordered behavior can be stable. Non-periodic systems with the butterfly effect can have stable average properties. So the average or statistical properties of a system can be predictable, even if its details are not.
- Complex behavior can arise from simple rules. Simple dynamical systems do not necessarily lead to simple results. In particular, we will see that simple rules can produce patterns and structures of surprising complexity.
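The second and fourth points above can both be seen in the logistic map, x_{n+1} = r·x_n·(1 − x_n), which the syllabus below covers in its bifurcations unit. A minimal sketch (my own illustration, not course material): in the chaotic regime, two starting points differing by one part in a billion end up on completely different trajectories.

```python
def logistic_trajectory(r: float, x0: float, n: int) -> list:
    """Return [x_0, x_1, ..., x_n] for the logistic map
    x_{k+1} = r * x_k * (1 - x_k)."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Sensitive dependence (the butterfly effect): at r = 4 the map is
# chaotic, so a 1e-9 difference in initial conditions is amplified
# until the two trajectories bear no resemblance to each other.
a = logistic_trajectory(4.0, 0.400000000, 60)
b = logistic_trajectory(4.0, 0.400000001, 60)
divergence = max(abs(x - y) for x, y in zip(a, b))
print(divergence)  # of order 1, despite the tiny initial difference
```

Note how simple the rule is: one multiplication-and-subtraction per step, yet the output is effectively unpredictable, which is exactly the "complex behavior from simple rules" point.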
About the Instructor:
David Feldman is Professor of Physics and Mathematics at College of the Atlantic. From 2004-2009 he was a faculty member in the Santa Fe Institute’s Complex Systems Summer School in Beijing, China. He served as the school’s co-director from 2006-2009. Dave is the author of Chaos and Fractals: An Elementary Introduction (Oxford University Press, 2012), a textbook on chaos and fractals for students with a background in high school algebra. Dave was a U.S. Fulbright Lecturer in Rwanda in 2011-12.
Course dates: 5 Jul 2016 9am PDT to 20 Sep 2016 3pm PDT
Prerequisites: a familiarity with basic high school algebra. There will be optional lessons for those with stronger math backgrounds.
- Introduction I: Iterated Functions
- Introduction II: Differential Equations
- Chaos and the Butterfly Effect
- Bifurcations: Part I (Differential Equations)
- Bifurcations: Part II (Logistic Map)
- Phase Space
- Strange Attractors
- Pattern Formation
- Summary and Conclusions
Source: Complexity Explorer
- Brit Cruise (Khan Academy) Information Theory
- Seth Lloyd (Complexity Explorer/YouTube) Introduction to Information Theory
- Thomas Cover (Stanford | YouTube) Information Theory
- Raymond Yeung (Chinese University of Hong Kong | Coursera) Information Theory (May require account to see 3 or more archived versions)
- David MacKay (University of Cambridge) Information Theory, Inference, and Learning Algorithms
- Andrew Eckford (York University | YouTube) Coding and Information Theory
- S.N. Merchant (IIT Bombay | NPTEL :: Electronics & Communication Engineering) Introduction to Information Theory and Coding
Fortunately, most are pretty reasonable, though they vary in their coverage of topics. The introductory lectures don’t require as much mathematics and can probably be understood by those at the high school level with just a small amount of basic probability theory and an understanding of the logarithm.
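That "small amount of basic probability theory and an understanding of the logarithm" really is all it takes to compute the field's central quantity, Shannon entropy, H = −Σ p·log₂(p). A quick sketch of my own, not drawn from any of the lectures above:

```python
from math import log2

def shannon_entropy(probs) -> float:
    """H = -sum(p * log2(p)) in bits, skipping zero-probability outcomes."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin carries one full bit of information per toss...
print(shannon_entropy([0.5, 0.5]))  # 1.0 bit
# ...while a biased coin is more predictable, hence less informative.
print(shannon_entropy([0.9, 0.1]))  # ~0.469 bits
```

Entropy is maximized by the uniform distribution and shrinks as outcomes become more predictable, which is the intuition most of the introductory lectures build on.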
The top three in the advanced section (they generally presume a prior undergraduate-level class in probability theory and some amount of mathematical sophistication) are from professors who’ve written some of the most commonly used college textbooks on the subject. If I recall correctly, a first edition of the Yeung text was available for download through his course interface. MacKay’s text is available for free download from his site as well.
Feel free to post other video lectures or resources you may be aware of in the comments below.
Editor’s Update: With sadness, I’ll note that David MacKay died just days after this was originally posted.