“Every interesting mathematical concept, if it’s truly deep, has something to say in a variety of applied domains,” said Jordan Ellenberg, a mathematician from the University of Wisconsin-Madison who studies number theory and algebraic geometry. He has been writing about math concepts for general audiences for over 15 years and his most recent book, “Shape,” is a New York Times bestseller.
Ellenberg delivered the second annual College of Science Christmas lecture on Friday evening in Jordan Hall of Science. The Christmas lecture is modeled after the Royal Institution of Great Britain’s Christmas lectures, which scientist Michael Faraday began in 1825, nearly 200 years ago.
According to Allison Slabaugh, the academic advancement director for the College of Science, “the Christmas lecture was established with the goal of bringing science to the general public and inspiring the community to engage in science.” The Faraday-style lectures demonstrate the scientific concepts they present and are intended to be entertaining and deeply philosophical.
Ellenberg’s lecture focused on the theory of the random walk, a basic example of a Markov process. In mathematics, a random walk is a path formed by a succession of random steps within a given space.
According to Ellenberg, the random walk describes a sequence of possible events in which the probability of each event depends only on the state reached in the previous event, not on the full history of the process. Because the individual steps of this stochastic, or random, process are independent of one another, the law of large numbers applies: as the number of steps increases, the sample mean gets closer to the average of the entire population.
The phenomenon of the random walk can be observed in many different areas of study, like flipping a coin or fluctuations in stock prices. “The concept of the random walk expanded out into an array of applications,” Ellenberg said, naming finance, physics, biology and mathematics.
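The coin-flip example Ellenberg mentioned can be simulated directly. As a sketch (the function name and step count are illustrative, not from the lecture), each flip moves a walker one step up or down:

```python
import random

def random_walk(steps):
    """Simulate a 1-D random walk: each fair coin flip moves +1 or -1."""
    position = 0
    path = [position]
    for _ in range(steps):
        # Heads moves up, tails moves down, each with probability 1/2.
        position += 1 if random.random() < 0.5 else -1
        path.append(position)
    return path

random.seed(0)  # fixed seed so the run is reproducible
print(random_walk(10))
```

The printed list traces one possible path; rerunning with a different seed gives a different path, which is the point of the demonstration.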
Ellenberg invited members of the audience to participate in a game to demonstrate the random walk. Participants were instructed to look at a piece of text like a newspaper or a page from a book. Then, they were asked to locate a bigram, or pair of letters. After the first person called out a pair of letters, the next pair had to start with the second letter of the previous pair. This was repeated until someone ended the word with a space or a period.
This process produced words that resembled and sounded like English, even though they were not actual English words. Ellenberg explained that longer sequences of letters, for example, those that used five letters grouped together as opposed to two letters, would create words that capture more of the English language.
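The audience game is a bigram Markov chain, and a minimal version can be sketched in Python (the function names and sample sentence here are illustrative assumptions, not Ellenberg's own code):

```python
import random
from collections import defaultdict

def build_bigram_model(text):
    """Record, for each letter, which letters were observed to follow it."""
    model = defaultdict(list)
    for a, b in zip(text, text[1:]):
        model[a].append(b)
    return model

def generate(model, start, length):
    """Walk the chain: each next letter depends only on the current letter."""
    out = [start]
    for _ in range(length - 1):
        followers = model.get(out[-1])
        if not followers:  # dead end: no observed follower for this letter
            break
        out.append(random.choice(followers))
    return "".join(out)

sample = "the theory of the random walk describes a path of random steps"
model = build_bigram_model(sample)
random.seed(1)
print(generate(model, "t", 20))
```

Because every adjacent pair in the output was seen somewhere in the source text, the result looks vaguely English-like without being English, just as in the lecture game; grouping five letters at a time instead of two would capture more of the language's structure, at the cost of a much larger model.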
The random walk produces an imitation of English text, an idea that has been applied to artificial language models. Although these computer programs operate on a much larger scale, “it’s fundamentally doing the same kind of thing: taking in a large body of existing English text and trying to figure out what’s likely to come out of it, and then auto-generating just the way we all did together,” Ellenberg said.
“A language model [used by artificial intelligence] is not so different from a very simple Markov chain, it’s just much bigger because we have abilities we didn’t have years ago,” Ellenberg continued.
The Markov process has its limitations: there are certain things it can do and certain things it cannot. Ellenberg discussed how identifying that boundary is difficult because so much about math is still unknown.
Contact Caroline Collins at firstname.lastname@example.org.