The world we live in consists of evolving structures and patterns, and with the advent of big data, our ability to identify and understand such patterns is shifting the intellectual landscape across disciplines. This initiative addresses the need for expertise and infrastructure to aid scholars throughout the University as they tackle disparate problems that demand quantitative methods and computational tools for studying complex, evolving systems.
Joshua Plotkin is expanding horizons in evolutionary biology and ecology through the power of mathematics, computation, and statistical methods. His research group examines a range of questions that go to the heart of adaptation in populations, including the evolution of robustness and adaptability, the evolutionary ecology of viral populations, the nature of genetic drift, the dynamics of protein translation, and the evolution of social norms.
A professor in the Department of Biology as well as in the Department of Computer and Information Science in the School of Engineering and Applied Science, Plotkin has consistently been recognized with some of the most prestigious honors for innovative early-career scientists.
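Plotkin's models are far richer than anything shown here, but the "nature of genetic drift" mentioned above has a classical starting point: the Wright-Fisher model, in which each generation is formed by random sampling from the previous one, so an allele's frequency wanders and eventually fixes or disappears by chance alone. A minimal sketch of that standard model (not Plotkin's own software; all parameter values are illustrative):

```python
import random

def wright_fisher(pop_size, p0, generations, seed=0):
    """Simulate one allele's frequency under pure genetic drift.

    Each generation, 2N gametes are drawn at random from the current
    allele frequency (binomial sampling), so frequency changes are
    driven entirely by chance, not selection.
    Returns the frequency trajectory, stopping early at fixation or loss.
    """
    rng = random.Random(seed)
    p = p0
    trajectory = [p]
    for _ in range(generations):
        # Count allele copies among 2N sampled gametes
        copies = sum(rng.random() < p for _ in range(2 * pop_size))
        p = copies / (2 * pop_size)
        trajectory.append(p)
        if p in (0.0, 1.0):  # allele fixed (1.0) or lost (0.0)
            break
    return trajectory

# Illustrative run: diploid population of 100, allele starting at 50%
traj = wright_fisher(pop_size=100, p0=0.5, generations=200)
```

Running this repeatedly with different seeds shows the characteristic behavior drift researchers study: smaller populations fix or lose the allele much faster than larger ones, even with no fitness difference at all.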
How do you teach a machine a human language? This was the question DARPA (Defense Advanced Research Projects Agency) had in mind when it called for the creation of an organization to support human language technology research and development. The response was the Linguistic Data Consortium (LDC), a group of universities, libraries, corporations and government research laboratories with Penn Arts and Sciences as its hub.
Mark Liberman, the Christopher H. Browne Distinguished Professor of Linguistics, founded the Consortium in 1992 and continues to serve as its director. Speech and language data are contributed to the LDC repository by a large network of research groups, and this repository in turn gives researchers around the world access to billions of words of text and tens of thousands of hours of speech.
What would it take to stop a crime before it occurred? It’s a challenge Richard Berk, Professor of Criminology and Statistics, is tackling with software he’s designing to help increase public safety and squeeze more crime prevention from the constrained budgets of criminal justice agencies.
According to Berk, “A range of criminal justice officials routinely consider ‘future dangerousness’ when they make their decisions about charging, sentencing, prison security, release on parole or probation, and supervision during probation or parole…But thanks to very recent developments in statistics and computer science, our ability to accurately anticipate future crime has improved dramatically.”
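Berk's published forecasting work relies on machine-learning classifiers such as random forests trained on criminal-justice records. His actual software and data are not shown here; as a dependency-free stand-in for the general idea, the sketch below fits a simple logistic-regression risk model by gradient descent on invented toy data (the feature names and numbers are hypothetical, chosen only to make the example run):

```python
import math

def train_logistic(rows, labels, lr=0.1, epochs=500):
    """Fit a logistic-regression risk model by stochastic gradient descent.

    rows:   feature vectors, e.g. [prior_arrests, age_at_first_offense / 10]
    labels: 1 if the individual re-offended in the follow-up window, else 0
    Returns learned weights, with a bias term as the final entry.
    """
    n_feat = len(rows[0])
    w = [0.0] * (n_feat + 1)  # last entry is the bias
    for _ in range(epochs):
        for x, y in zip(rows, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + w[-1]
            p = 1.0 / (1.0 + math.exp(-z))  # predicted re-offense risk
            err = p - y
            for i in range(n_feat):
                w[i] -= lr * err * x[i]
            w[-1] -= lr * err
    return w

def risk(w, x):
    """Predicted probability of re-offense for feature vector x."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + w[-1]
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical toy data: [prior arrests, age at first offense / 10]
X = [[0, 3.0], [1, 2.5], [5, 1.8], [7, 1.6], [2, 2.2], [6, 1.7]]
y = [0, 0, 1, 1, 0, 1]
w = train_logistic(X, y)
```

The model then scores new cases on a 0-to-1 risk scale, which is the kind of output officials could weigh when considering "future dangerousness." Real forecasting systems like Berk's use far larger datasets, stronger tree-ensemble methods, and careful attention to error costs and fairness.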