2002 B.A. Computer Science & Applied Mathematics, Harvard University
2002 M.S. Computer Science & Applied Mathematics, Harvard University
2008 Ph.D. Applied Mathematics, Harvard University
Probabilistic Theories of Deep Learning from first principles; Neurally Inspired Learning and Computation; Medical Imaging Diagnosis; Reverse-Engineering the Neocortex; Deep Learning for Particle Physics.
Ankit B. Patel is currently an Assistant Professor at Baylor College of Medicine in the Department of Neuroscience and at Rice University in the Department of Electrical and Computer Engineering. Ankit is broadly interested in the intersection of machine learning and computational neuroscience, two research areas essential for understanding and building truly intelligent systems, with a particular focus on how abstractions are learned.
Ankit recently returned to academia after six years in industry building real-time inference systems trained on large-scale data, first for ballistic missile defense at MIT Lincoln Laboratory and then for algorithmic trading at a leading high-frequency market-making firm. Before that, he received his Ph.D., M.S., and B.A. from Harvard in Computer Science and Applied Mathematics, supported by an NSF graduate fellowship. Working with Radhika Nagpal, he built accurate statistical and computational models of dividing cell sheets, showing for the first time that cell division in developing organisms is not random but strategic. Their work was published in Nature in 2005.
In his new role, he will continue to pursue the unification of traditional hierarchical machine learning with deep neural networks, with applications to fields including neuroscience, robotics, and particle physics. In this vein, he is designing and building the first generative convolutional net, which promises to enable (1) training sophisticated deep vision models on large quantities of unlabeled data, and (2) top-down inference for tasks in which fine-scale information is important (e.g., segmentation and pose estimation). He is also working with visual neuroscientists to build a bridge between machine learning models and real neural networks, using the former to make testable predictions about the latter. Finally, he is working with physicists at the Large Hadron Collider on efficient new algorithms for separating signal from noise, in search of New Physics beyond the Standard Model of Particle Physics, our best generative model of the Universe thus far.