COVID-19 Related Efforts: The Julia Lab at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) and the Julia community at large are hard at work building the best tools for scientists worldwide, from low-level compilers to parallel, GPU-accelerated computation of the alphabet soup of models.

Julia is a programming language created by Jeff Bezanson, Alan Edelman, Stefan Karpinski, and Viral B. Shah in 2009 and released publicly in 2012. Julia now has over ten million downloads.

News flash: Wilkinson Prize for Julia! (MIT News, December 26, 2018)

News flash from Google's Jeff Dean: Julia + TPUs = fast and easily expressible machine learning computations. (tweet, October 23, 2018)

The Julia Lab embraces openness and the solving of human problems. Today, the Julia Lab collaborates with a variety of researchers on real-world problems and applications, while simultaneously working on the core language and its ecosystem.

Publications

Bezanson, Jeff, Jake Bolewski, and Jiahao Chen. 2018. “Fast Flexible Function Dispatch in Julia.” ArXiv:1808.03370 [Cs], August. http://arxiv.org/pdf/1808.03370.pdf.
Bezanson, Jeff, Jiahao Chen, Stefan Karpinski, Viral Shah, and Alan Edelman. 2014. “Array Operators Using Multiple Dispatch: A Design Methodology for Array Implementations in Dynamic Languages.” In Proceedings of ACM SIGPLAN International Workshop on Libraries, Languages, and Compilers for Array Programming, 56:56–56:61. ARRAY’14. New York, NY, USA: ACM. https://doi.org/10.1145/2627373.2627383.
Bezanson, Jeff, Alan Edelman, Stefan Karpinski, and Viral Shah. 2017. “Julia: A Fresh Approach to Numerical Computing.” SIAM Review 59 (1): 65–98. https://doi.org/10.1137/141000671.
Bezanson, Jeff, Stefan Karpinski, Viral B. Shah, and Alan Edelman. 2012. “Julia: A Fast Dynamic Language for Technical Computing.” ArXiv:1209.5145 [Cs], September. http://arxiv.org/pdf/1209.5145.pdf.
Bezanson, Jeffrey Werner. 2015. “Abstraction in Technical Computing.” Thesis, Massachusetts Institute of Technology. http://dspace.mit.edu/handle/1721.1/99811.
Chen, Jiahao. 2018. “Linguistic Relativity and Programming Languages.” ArXiv:1808.03916 [Cs, Stat], August. http://arxiv.org/pdf/1808.03916.pdf.
Chen, Jiahao, and Alan Edelman. 2014. “Parallel Prefix Polymorphism Permits Parallelization, Presentation & Proof.” In Proceedings of the First Workshop for High Performance Technical Computing in Dynamic Languages, 47–56. HPTCDL ’14. Piscataway, NJ, USA: IEEE Press. https://doi.org/10.1109/HPTCDL.2014.9 http://jiahao.github.io/parallel-prefix/.
Chen, Jiahao, Andreas Noack, and Alan Edelman. 2018. “Fast Computation of the Principal Components of Genotype Matrices in Julia.” ArXiv:1808.03374 [Cs, Math, q-Bio, Stat], August. http://arxiv.org/pdf/1808.03374.pdf.
Chen, Jiahao, Jarrett Revels, and Alan Edelman. 2016. “Robust Benchmarking in Noisy Environments.” Proceedings of the 20th Annual IEEE High Performance Extreme Computing Conference, August. http://arxiv.org/pdf/1608.04295.pdf.
Chen, Jiahao, and Weijian Zhang. 2016. “The Right Way to Search Evolving Graphs.” Proceedings of GABB’2016 - Graph Algorithms Building Blocks Workshop, January. http://arxiv.org/pdf/1601.08189.pdf.
Innes, Mike, Stefan Karpinski, Viral Shah, David Barber, Pontus Stenetorp, Tim Besard, James Bradbury, et al. 2018. “On Machine Learning and Programming Languages.” SysML. https://www.sysml.cc/doc/37.pdf.
Leonard, Paul B., Edward B. Duffy, Robert F. Baldwin, Brad H. McRae, Viral B. Shah, and Tanmay K. Mohapatra. 2017. “GFlow: Software for Modelling Circuit Theory-Based Connectivity at Any Scale.” Methods in Ecology and Evolution 8 (4): 519–26. https://doi.org/10.1111/2041-210X.12689.
Palamadai Natarajan, Ekanathan. 2017. “Portable and Productive High-Performance Computing.” Thesis, Massachusetts Institute of Technology. http://dspace.mit.edu/handle/1721.1/108988.
Regier, Jeffrey, Kiran Pamnany, Keno Fischer, Andreas Noack, Maximilian Lam, Jarrett Revels, Steve Howard, et al. 2018. “Cataloging the Visible Universe Through Bayesian Inference at Petascale.” In 2018 IEEE International Parallel and Distributed Processing Symposium (IPDPS), 44–53. https://doi.org/10.1109/IPDPS.2018.00015.
Revels, Jarrett, Miles Lubin, and Theodore Papamarkou. 2016. “Forward-Mode Automatic Differentiation in Julia.” AD2016 - 7th International Conference on Algorithmic Differentiation, July. http://arxiv.org/pdf/1607.07892.pdf.
Shah, Viral, Alan Edelman, Stefan Karpinski, Jeff Bezanson, and Jeremy Kepner. 2013. “Novel Algebras for Advanced Analytics in Julia.” In 2013 IEEE High Performance Extreme Computing Conference (HPEC), 1–4. https://doi.org/10.1109/HPEC.2013.6670347.

Conferences

Past Research

The Julia Lab specializes in collaborating with other groups to solve messy real-world computational problems.

Statistical Genomics

Existing bioinformatics tools are not performant enough to handle the exabytes of data produced by modern genomics research each year, and general-purpose linear algebra libraries are not optimized to take advantage of this data's inherent structure. To address this problem, the Julia Lab is developing specialized algorithms for principal component analysis and statistical fitting that will enable genomics researchers to analyze data at the same rapid pace at which it is produced.
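
As a loose illustration of the kind of computation involved (not the lab's specialized, structure-aware algorithms), the following sketch computes principal components of a small synthetic genotype matrix using only Julia's standard LinearAlgebra and Statistics libraries; the matrix dimensions and the 0/1/2 SNP coding are hypothetical.

    # A loose sketch of PCA on a synthetic genotype matrix; the lab's
    # structure-aware algorithms are not reproduced here.
    using LinearAlgebra, Statistics, Random

    Random.seed!(1)
    G = rand(0:2, 500, 2_000)                  # individuals × SNP sites (hypothetical sizes)

    # Center and scale each SNP column, then take the thin SVD.
    X = (G .- mean(G, dims=1)) ./ (std(G, dims=1) .+ eps())
    F = svd(X)

    k = 10
    scores = F.U[:, 1:k] .* F.S[1:k]'          # per-individual scores on the top k PCs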

This project is an exciting interdisciplinary collaboration with Dr. Stavros Papadopoulos (Senior Research Scientist at Intel Labs) and Prof. Nikolaos Patsopoulos (Assistant Professor at Brigham and Women's Hospital, the Broad Institute and Harvard Medical School).

Financial Fraud Detection

A single stock exchange generates high-frequency trading (HFT) data at a rate of ~2.2 terabytes per month. Automatic identification of suspicious financial transactions in these high-throughput HFT data streams is an active area of research. The Julia Lab contributes to the battle against financial fraud by designing out-of-core analytics for anomaly detection.
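
As a rough sketch of the out-of-core idea (not the lab's actual pipeline), the example below streams a hypothetical binary file of Float64 trade values in fixed-size chunks, maintains running statistics with Welford's algorithm, and flags records whose deviation exceeds a z-score threshold; the file format, threshold, and anomaly rule are all illustrative assumptions.

    # Rough sketch of out-of-core anomaly detection: memory use stays bounded
    # regardless of file size because data is processed chunk by chunk.
    function flag_anomalies(path::AbstractString; chunk::Int=10^6, zcut::Float64=6.0)
        n, μ, M2 = 0, 0.0, 0.0       # Welford running mean and sum of squared deviations
        flagged = Int[]              # indices of suspicious records
        open(path, "r") do io
            while !eof(io)
                nbytes = min(8 * chunk, filesize(path) - position(io))
                vals = reinterpret(Float64, read(io, nbytes))
                for x in vals
                    n += 1
                    δ = x - μ
                    μ += δ / n
                    M2 += δ * (x - μ)
                    σ = n > 1 ? sqrt(M2 / (n - 1)) : Inf
                    abs(x - μ) > zcut * σ && push!(flagged, n)
                end
            end
        end
        return flagged
    end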

Medical Data Analytics

Hospitals, like many large organizations, collect far more data than human experts can usefully process and analyze with today's available software. The resulting small-scale analyses can overlook statistical clues that might have yielded substantial improvements to patient care.

In collaboration with Harvard Medical School, the Julia Lab has worked on tools for rapidly identifying potential indicators of irregularities in medical data, equipping doctors and healthcare providers with the analytics they need to make informed medical decisions.

Numerical Linear Algebra and Parallel Computing

The Julia Lab leads the JuliaParallel organization, which maintains a suite of packages for parallel and distributed computing in Julia.

The Julia Lab also collaborates with Prof. Steven G. Johnson and Jared Crean in the development of PETSc.jl, a wrapper for the Portable, Extensible Toolkit for Scientific Computation.
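
As a minimal illustration of the distributed-memory programming model these packages build on (not the packages themselves), the sketch below uses only Julia's built-in Distributed standard library to farm out independent linear algebra computations to local worker processes.

    # Minimal sketch of distributed-memory parallelism with the Distributed
    # standard library; JuliaParallel packages layer richer abstractions
    # (distributed arrays, MPI bindings, etc.) on this model.
    using Distributed
    addprocs(4)                                # four local worker processes

    @everywhere using LinearAlgebra            # load dependencies on every worker

    # Estimate the largest singular value of many random matrices in parallel;
    # each matrix is generated and factorized on whichever worker picks it up.
    vals = pmap(1:16) do _
        maximum(svdvals(randn(500, 500)))
    end

    println("mean largest singular value ≈ ", sum(vals) / length(vals))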

People

Current Members

Alumni

Collaborators

The Julia Lab is grateful for numerous collaborations at MIT and around the world.

Sponsorship

We thank NSF, Amazon, DARPA XDATA, the Intel Science and Technology Center for Big Data, Saudi Aramco, the MIT Institute for Soldier Nanotechnologies, and NIH BD2K for their generous financial support.

The Julia Lab is a member of the bigdata@CSAIL MIT Big Data Initiative and gratefully acknowledges sponsorship from the MIT EECS SuperUROP Program and the MIT UROP Office for our talented undergraduate researchers.