This book provides an introduction to statistical learning methods. It is aimed at upper-level undergraduate students, master's students, and Ph.D. students in the non-mathematical sciences. The book also contains a number of R labs with detailed explanations of how to implement the various methods in real-life settings, and it should be a valuable resource for practicing data scientists.

Slides and videos for the Statistical Learning MOOC by Hastie and Tibshirani are available separately here. Slides and video tutorials related to this book by Abass Al Sharif can be downloaded here.

I have been teaching mathematics courses at the University of Toronto Mississauga and Seneca College since 2016, when my family moved to Canada from Singapore. Before that, I taught full-time at Singapore Polytechnic as a math lecturer from 2012 to 2016. This is the space where I explore and share my journey of teaching mathematics, conducting education research projects, and learning about OER.

This is a collection of introductory, expository notes on applied category theory, inspired by the 2018 Applied Category Theory Workshop. In these notes we take a leisurely stroll through two themes (functorial semantics and compositionality), two constructions (monoidal categories and decorated cospans), and two examples (chemical reaction networks and natural language processing) within the field. [PDF]

hat tip:

Friends! I am so happy to share that my little booklet “What is Applied Category Theory?” is now available on the arXiv. It’s a collection of introductory, expository notes inspired by the ACT workshop that took place earlier this year. Enjoy! https://t.co/EPYP19z14x pic.twitter.com/O4uVhj401s

A decades-old method called the “bootstrap” is enabling new discoveries about the geometry underlying all quantum theories.

In the 1960s, the charismatic physicist Geoffrey Chew espoused a radical vision of the universe, and with it, a new way of doing physics. Theorists of the era were struggling to find order in an unruly zoo of newfound particles. They wanted to know which ones were the fundamental building blocks of nature and which were composites. But Chew, a professor at the University of California, Berkeley, argued against such a distinction. “Nature is as it is because this is the only possible nature consistent with itself,” he wrote at the time. He believed he could deduce nature’s laws solely from the demand that they be self-consistent. Continue reading 👓 Physicists Uncover Geometric ‘Theory Space’ | Quanta Magazine

Remarkable progress in quantum information theory (QIT) has made it possible to formulate mathematical theorems for the conditions under which data transmission or data processing occurs with a non-negative entropy gain. However, the relation of these results, formulated in terms of entropy gain in quantum channels, to the temporal evolution of real physical systems is not thoroughly understood. Here we build on the mathematical formalism provided by QIT to formulate the quantum H-theorem in terms of physical observables. We discuss the manifestation of the second law of thermodynamics in quantum physics and uncover special situations where the second law can be violated. We further demonstrate that the typical evolution of energy-isolated quantum systems occurs with non-diminishing entropy. [1]

Footnotes

[1]

G. B. Lesovik, A. V. Lebedev, I. A. Sadovskyy, M. V. Suslov, and V. M. Vinokur, “H-theorem in quantum physics,” Scientific Reports, vol. 6, p. 32815, 12 Sep. 2016 [Online]. Available: http://dx.doi.org/10.1038/srep32815
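To make the abstract’s notion of non-negative entropy gain concrete, here is a toy sketch (mine, not the paper’s formalism): a pure qubit sent through a depolarizing channel of strength p, a standard textbook channel whose output spectrum for pure input is (1 − p/2, p/2), can only gain von Neumann entropy.

```python
import math

def entropy(spectrum):
    # Von Neumann entropy in nats: S = -sum(p * log p) over nonzero eigenvalues.
    return -sum(p * math.log(p) for p in spectrum if p > 1e-12)

# A pure qubit state has spectrum (1, 0), hence zero entropy.
pure = (1.0, 0.0)

# The depolarizing channel of strength p mixes in the maximally mixed state:
# rho -> (1 - p) * rho + p * I/2, giving spectrum (1 - p/2, p/2) for pure input.
p = 0.5
depolarized = (1 - p / 2, p / 2)

print(entropy(pure) == 0.0)  # True
print(entropy(depolarized))  # ~0.562 nats: a non-negative entropy gain
```

Varying p from 0 to 1 interpolates between zero entropy gain and the maximal log 2 nats, which is the kind of channel-by-channel bookkeeping the QIT theorems formalize.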

This tutorial will review the basics of theory in the field of evolutionary quantitative genetics and its connections to evolution observed at various time scales. Quantitative genetics deals with the inheritance of measurements of traits that are affected by many genes. Quantitative genetic theory for natural populations was developed considerably from 1970 through 1990 and continues to be developed today, and it has been applied to a wide range of phenomena including the evolution of differences between the sexes, sexual preferences, life history traits, plasticity of traits, as well as the evolution of body size and other morphological measurements. Textbooks have not kept pace with these developments, and currently few universities offer courses in this subject aimed at evolutionary biologists. There is a need for evolutionary biologists to understand this field because of the ability to collect large amounts of data by computer, the development of statistical methods for changes of traits on evolutionary trees and for changes in a single species through time, and the realization that quantitative characters will not soon be fully explained by genomics. This tutorial aims to fill this need by reviewing basic aspects of theory and illustrating how that theory can be tested with data, both from single species and with multiple-species phylogenies. Participants will learn to use R, an open-source statistical programming language, to build and test evolutionary models. The intended participants for this tutorial are graduate students, postdocs, and junior faculty members in evolutionary biology.
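The Brownian-motion model of trait change that underlies much of this theory can be sketched in a few lines. The tutorial itself uses R, so this Python version is only an illustration, and the function name and parameters are mine, not the tutorial’s:

```python
import random

def simulate_trait(generations, sigma, z0=0.0, seed=42):
    # Brownian-motion model of a quantitative trait: each generation adds
    # an independent Gaussian increment with mean 0 and std. dev. sigma.
    rng = random.Random(seed)
    z = z0
    trajectory = [z]
    for _ in range(generations):
        z += rng.gauss(0.0, sigma)
        trajectory.append(z)
    return trajectory

# Under this model the variance among replicate lineages grows
# linearly with time: Var(z_t) = sigma**2 * t.
traj = simulate_trait(generations=100, sigma=0.5)
print(len(traj), traj[0])  # 101 0.0
```

Running many replicate lineages and comparing the spread of final trait values against sigma**2 * t is a quick check of the linear-variance prediction, the same kind of model-versus-data test the tutorial builds up for phylogenies.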

For decades the study of networks has been divided between the efforts of social scientists and natural scientists, two groups of scholars who often do not see eye to eye. In this review I present an effort to mutually translate the work conducted by scholars from both of these academic fronts, hoping to continue to unify what has become a diverging body of literature. I argue that social and natural scientists fail to see eye to eye because they have diverging academic goals. Social scientists focus on explaining how context-specific social and economic mechanisms drive the structure of networks and on how networks shape social and economic outcomes. By contrast, natural scientists focus primarily on modeling network characteristics that are independent of context, since their aim is to identify universal characteristics of systems rather than context-specific mechanisms. In the following pages I discuss the differences between these two literatures by summarizing the parallel theories advanced to explain link formation and the applications used by scholars in each field to justify their approach to network science. I conclude by providing an outlook on how these literatures can be further unified.
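As an example of the context-independent style of link-formation model the review attributes to natural scientists, here is a minimal sketch of preferential attachment (the Barabasi-Albert mechanism); the function and parameter names are mine, chosen for illustration:

```python
import random

def preferential_attachment(n, m=2, seed=1):
    # Grow a graph where each new node attaches to m existing nodes chosen
    # with probability proportional to their current degree: the canonical
    # "rich get richer" model of link formation.
    rng = random.Random(seed)
    # Start from a small complete core of m + 1 nodes.
    edges = [(i, j) for i in range(m + 1) for j in range(i)]
    # Listing every edge endpoint makes degree-proportional sampling easy:
    # a uniform draw from this list hits node v with probability deg(v)/sum.
    endpoints = [v for e in edges for v in e]
    for new in range(m + 1, n):
        targets = set()
        while len(targets) < m:
            targets.add(rng.choice(endpoints))
        for t in targets:
            edges.append((new, t))
            endpoints.extend([new, t])
    return edges

edges = preferential_attachment(200)
print(len(edges))  # 3 + 197*2 = 397 edges for n=200, m=2
```

The resulting degree distribution is heavily skewed toward a few hubs, independent of any social or economic context, which is precisely the kind of universal regularity this modeling tradition is after.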

Dr. Michael Miller has announced his Autumn mathematics course, and it is…

Introduction to Complex Analysis

Course Description

Complex analysis is one of the most beautiful and useful disciplines of mathematics, with applications in engineering, physics, and astronomy, as well as other branches of mathematics. This introductory course reviews the basic algebra and geometry of complex numbers; develops the theory of complex differential and integral calculus; and concludes by discussing a number of elegant theorems, including many, such as the fundamental theorem of algebra, that are consequences of Cauchy’s integral formula. Other topics include De Moivre’s theorem, Euler’s formula, Riemann surfaces, Cauchy-Riemann equations, harmonic functions, residues, and meromorphic functions. The course should appeal to those whose work involves the application of mathematics to engineering problems as well as individuals who are interested in how complex analysis helps explain the structure and behavior of the more familiar real number system and real-variable calculus.

Prerequisites

Basic calculus or familiarity with differentiation and integration of real-valued functions.
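For newcomers wondering what a couple of the listed topics look like in practice, Euler’s formula and De Moivre’s theorem are easy to check numerically with Python’s standard cmath module (a throwaway illustration of mine, not course material):

```python
import cmath
import math

theta = 0.7  # any real angle works

# Euler's formula: e^{i*theta} = cos(theta) + i*sin(theta)
lhs = cmath.exp(1j * theta)
rhs = complex(math.cos(theta), math.sin(theta))
print(abs(lhs - rhs))  # ~0 (floating-point noise)

# De Moivre's theorem: (cos t + i sin t)^n = cos(nt) + i sin(nt)
n = 5
de_moivre_error = abs(rhs**n - complex(math.cos(n * theta), math.sin(n * theta)))
print(de_moivre_error)  # ~0
```

De Moivre’s theorem is just Euler’s formula applied to (e^{i*theta})^n, which is the sort of connection the course draws out properly.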

For many who will register, this certainly won’t be their first course with Dr. Miller — yes, he’s that good! But for the newcomers, I’ve written some thoughts and tips to help them more easily and quickly settle in and adjust: Dr. Michael Miller Math Class Hints and Tips | UCLA Extension

I often recommend that people join Mike’s classes, and more often than not I hear the refrain: “I’ve been away from math too long,” or “I don’t have the prerequisites to even begin to think about taking that course.” For people in those categories, you’re in luck! If you’ve had even a soupçon of calculus, you’ll be able to keep up here. In fact, it was a similar class exactly a decade ago, taught by Mike Miller, that got me back into mathematics. (Happy 10th math anniversary to me!)

(Note that there’s another introductory complex analysis textbook from Silverman that’s offered through Dover, so be sure to choose the correct one.)

As always in Dr. Miller’s classes, the text is just recommended (read: not required) and in-class notes are more than adequate. To quote him directly, “We will be using as a basic guide, but, as always, supplemented by additional material and alternate ways of looking at things.”

The bonus surprise of his email: He’s doing two quarters of Complex Analysis! So we’ll be doing both the Fall and Winter Quarters to really get some depth in the subject!

Alternate textbooks

If you’re like me, you’ll probably take a look at some of the other common (and some more advanced) textbooks in the area. Since I’ve already compiled a list, I’ll share it:

[1]

R. A. Silverman, Complex Analysis with Applications, 1st ed. Dover Publications, 2010, 304 pp. [Online]. Available: http://amzn.to/2c7KaQy

[2]

J. Bak and D. J. Newman, Complex Analysis, 3rd ed. Springer, 2010, 328 pp. [Online]. Available: http://amzn.to/2bLPW89

[3]

T. Gamelin, Complex Analysis. Springer, 2003, 478 pp. [Online]. Available: http://amzn.to/2bGNQct

[4]

J. Brown and R. V. Churchill, Complex Variables and Applications, 8th ed. McGraw-Hill, 2008, 468 pp. [Online]. Available: http://amzn.to/2bLQWcu

[5]

E. B. Saff and A. D. Snider, Fundamentals of Complex Analysis with Applications to Engineering, Science, and Mathematics, 3rd ed. Pearson, 2003, 563 pp. [Online]. Available: http://amzn.to/2f3Nyj6

[6]

L. V. Ahlfors, Complex Analysis, 3rd ed. McGraw-Hill, 1979, 336 pp. [Online]. Available: http://amzn.to/2bMXrxm

[7]

S. Lang, Complex Analysis, 4th ed. Springer, 2003, 489 pp. [Online]. Available: http://amzn.to/2c7OaR0

[8]

J. B. Conway, Functions of One Complex Variable, 2nd ed. Springer, 1978, 330 pp. [Online]. Available: http://amzn.to/2cggbF1

[9]

E. M. Stein and R. Shakarchi, Complex Analysis. Princeton University Press, 2003, 400 pp. [Online]. Available: http://amzn.to/2bGOG9c

Tom M. Apostol, emeritus professor of mathematics at the California Institute of Technology, passed away on May 8, 2016. He was 92.

My proverbial mathematical great-grandfather passed away yesterday.

As many know, for over a decade I’ve been studying a variety of areas of advanced abstract mathematics with Michael Miller. Mike Miller received his Ph.D. in 1974 (UCLA) under the supervision of Basil Gordon, who in turn received his Ph.D. in 1956 (Caltech) under the supervision of Tom M. Apostol.

Incidentally, going back three more generations in this lineage we find Markov, before him Chebyshev, and two generations before that, Lobachevsky.

Sadly, I never got to have Tom as a teacher directly myself, though I did get to meet him several times in (what mathematicians might call) social situations. I did have the advantage of delving into his two volumes of Calculus as well as referring to his book on Analytic Number Theory. If it’s been a while since you’ve looked at calculus, I highly recommend an evening or two by the fire with a glass of wine while you revel in Calculus, Vol 1 or Calculus, Vol 2.

It’s useful to take a moment to remember our intellectual antecedents, so in honor of Tom’s passing, I recommend the bookmarked very short obituary (I’m sure more will follow), this obituary of Basil, and this issue of the Notices of the AMS celebrating Basil as well. I also came across a copy of Fascinating Mathematical People, which has a great section on Tom and incidentally includes some rare younger photos of Sol Golomb, who suddenly passed away last Sunday. (It’s obviously been a tough week for me and for math in Southern California.)

I was getting concerned that I hadn’t heard back from Sol for a while, particularly after emailing him late last week, and then I ran across this notice through ITSOC & the IEEE:

Solomon W. Golomb (May 30, 1932 – May 1, 2016)

Shannon Award winner and long-time ITSOC member Solomon W. Golomb passed away on May 1, 2016.
Solomon W. Golomb was the Andrew Viterbi Chair in Electrical Engineering at the University of Southern California (USC) and was at USC since 1963, rising to the rank of University and Distinguished Professor. He was a member of the National Academies of Engineering and Science, and was awarded the National Medal of Science, the Shannon Award, the Hamming Medal, and numerous other accolades. As USC Dean Yiannis C. Yortsos wrote, “With unparalleled scholarly contributions and distinction to the field of engineering and mathematics, Sol’s impact has been extraordinary, transformative and impossible to measure. His academic and scholarly work on the theory of communications built the pillars upon which our modern technological life rests.”

In addition to his many contributions to coding and information theory, Professor Golomb was one of the great innovators in recreational mathematics, contributing many articles to Scientific American and other publications. More recent Information Theory Society members may be most familiar with his mathematics puzzles that appeared in the Society Newsletter, which will publish a full remembrance later.

A quick search a moment later revealed this sad confirmation along with some great photos from an award Sol received just a week ago:

A sad day 4 @USC @USCViterbi @USCMingHsiehEE with the loss of beloved Sol Golomb. Was only last week we celebrated his Franklin medal.

As is common in academia, I’m sure it will take a few days for the news to drip out, but the world has certainly lost one of its greatest thinkers, and many of us have lost a dear friend, colleague, and mentor.

I’ll try to touch base with his family and pass along what information (sniff) I can. I’ll post forthcoming obituaries as I see them, and will surely post some additional thoughts and reminiscences of my own in the coming days.
