My research interests lie primarily in machine learning and optimization. Below are some specific themes I have worked on with my collaborators in industry and academia.
Industrial research
- ML solutions for Office Lens [Microsoft]. We developed deep learning based solutions for the Office Lens mobile app on problems such as document-image classification and document-image scan cleanup. An important requirement was that the algorithms run efficiently on mobile devices. We have also filed a patent related to the mobile camera application.
- ML solutions for Kaizala [Microsoft]. Microsoft Kaizala is a mobile app and service designed for large-group communication and work management, making it easy to connect and coordinate work across an organization's entire value chain, including field employees, vendors, partners, and customers, wherever they are. Focusing on improving the user experience, we developed several machine learning based algorithms for Kaizala and filed a couple of patents related to chat-group recommendation and form conversion.
- Large-scale auto-tagging of documents [Microsoft]. We worked with the SharePoint team to develop ML algorithms for auto-tagging SharePoint documents.
- Product recommendation [Microsoft]. We proposed and developed a product recommendation algorithm for a Microsoft customer and contributed the algorithm to the Microsoft Recommenders GitHub repository.
- Style recommendation for fashion [Amazon]. We developed algorithms for cold-start product recommendation based on user attributes and product information. Our algorithms have been productionized for internal launch.
- Competitive pricing of products [Amazon]. Our goal is to estimate competitive reference prices for millions of products sold on Amazon. The system has been deployed internally, and its price recommendations are consumed by multiple internal teams.
- Identifying product safety concerns [Amazon]. We developed algorithms that identify quality defects in products sold on Amazon's platform, focusing in particular on defects that may pose a safety concern. The model has been productionized and is being consumed by internal teams.
- Identifying consumable products [Amazon]. The goal here is to identify which products sold on Amazon are consumables.
Research themes
- Statistical optimal transport. We offer a fresh perspective on the statistical optimal transport problem by posing it as that of learning the transport plan's kernel mean embedding (a rough sketch of this viewpoint appears after this list).
- Multilingual word embeddings. We focus on the problem of mapping monolingual word embeddings of different languages into a common latent multilingual space and propose a geometry-aware framework for doing so (an illustrative baseline sketch appears after this list). Our contributions have been published in TACL’19, ACL’20, EMNLP’20, and RepL4NLP’20.
- McTorch. We are currently working on McTorch, a manifold optimization library for deep learning. It is a Python package that adds manifold optimization functionality to PyTorch (a plain-PyTorch sketch of the underlying idea appears after this list).
- Low-rank tensor learning. We propose to learn a low-rank tensor in the regularized latent trace-norm framework (the standard regularizer is recalled after this list). Existing works aim at learning a sparse combination of tensors; we empirically show that such sparse combinations may not always perform well on real-world problems. We therefore propose a non-sparse latent trace-norm regularizer that learns a non-sparse combination of tensors, and we develop a dual framework for solving the resulting problem. We give a novel characterization of the solution space and propose two scalable optimization formulations, whose search spaces are Cartesian products of Riemannian spectrahedron manifolds. We exploit the versatile Riemannian optimization framework to develop computationally efficient trust-region algorithms. Experiments show the good performance of the proposed algorithms on several real-world data sets in different applications. This work was accepted at NIPS 2018.
- Low-rank optimization with structure. We proposed the first generic framework for learning low-rank matrices with structural constraints in large-scale applications. We exploit a variational characterization of the nuclear-norm regularizer (recalled after this list). Our key technical contribution is a saddle-point approach in which the low-rank and structural constraints are decoupled onto two separate factors, which makes the approach simple and scalable. The proposed algorithms are developed in the Riemannian optimization setting. Empirically, we show results in four different applications: a) matrix completion, b) robust matrix completion, c) Hankel matrix learning, and d) multi-task feature learning. This work was accepted at ICML 2018.
- Sparse positive semi-definite matrices. Learning a sparse positive semi-definite matrix is an important problem in multi-task learning, among other settings. We propose to learn this matrix efficiently without explicitly enforcing the positive semi-definite constraint, which makes our solution more amenable to big-data settings; our analysis guarantees that the learned output kernel is nevertheless positive semi-definite. The algorithm is developed in the stochastic dual coordinate ascent framework (a generic sketch of the setup appears after this list). This work was accepted at NIPS 2015.
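For the statistical optimal transport theme, a rough sketch of the viewpoint, in notation of my choosing and omitting the details of our actual formulation: optimal transport seeks a plan $\pi$, i.e., a joint distribution with prescribed marginals $\mu$ and $\nu$, minimizing an expected cost,

$$\min_{\pi \in \Pi(\mu,\nu)} \ \int c(x,y)\, \mathrm{d}\pi(x,y).$$

Given a kernel $k$ on the product space with RKHS $\mathcal{H}$, the plan can be represented by its kernel mean embedding $\mu_\pi = \int k\big((x,y),\cdot\big)\, \mathrm{d}\pi(x,y) \in \mathcal{H}$; the idea is to estimate this embedding directly from samples of the marginals, with the marginal constraints themselves expressed through embeddings of $\mu$ and $\nu$.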
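For the multilingual word embeddings theme, the sketch below is a minimal, illustrative baseline rather than our geometry-aware framework: the classical orthogonal Procrustes map that aligns one language's embeddings with another's given a seed dictionary. The function name and the toy data are hypothetical.

```python
import numpy as np

def procrustes_map(X_src, Y_tgt):
    """Orthogonal W minimizing ||X_src @ W - Y_tgt||_F.

    X_src, Y_tgt: (n, d) arrays whose i-th rows are the embeddings of a
    translation pair from a seed dictionary. Closed-form solution via the
    SVD of X^T Y (orthogonal Procrustes).
    """
    U, _, Vt = np.linalg.svd(X_src.T @ Y_tgt)
    return U @ Vt  # (d, d) orthogonal matrix

# Toy check with synthetic "embeddings": recover a planted rotation.
rng = np.random.default_rng(0)
n, d = 200, 50
X = rng.normal(size=(n, d))                    # source-language vectors
W_true, _ = np.linalg.qr(rng.normal(size=(d, d)))
Y = X @ W_true                                 # target-language vectors

W = procrustes_map(X, Y)
print(np.allclose(X @ W, Y, atol=1e-6))        # True: rotation recovered
```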
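For the McTorch theme, the sketch below is not McTorch's API; it is a plain-PyTorch illustration of the kind of functionality a manifold optimization library provides, namely Riemannian gradient descent on the Stiefel manifold (matrices with orthonormal columns) with a QR retraction, applied to a toy dominant-subspace problem. All names and the toy objective are of my choosing.

```python
import torch

torch.manual_seed(0)

# Toy problem: find orthonormal U (n x p) maximizing trace(U^T A U),
# i.e. the dominant p-dimensional eigenspace of a symmetric matrix A.
n, p = 20, 3
M = torch.randn(n, n)
A = (M @ M.T) / n

# Random starting point on the Stiefel manifold St(n, p).
Q0, _ = torch.linalg.qr(torch.randn(n, p))
U = Q0.requires_grad_(True)

lr = 1e-2
for _ in range(500):
    loss = -torch.trace(U.T @ A @ U)        # maximize the trace
    loss.backward()
    with torch.no_grad():
        G = U.grad
        # Riemannian gradient: project the Euclidean gradient onto the
        # tangent space of the Stiefel manifold at U.
        sym = (U.T @ G + G.T @ U) / 2
        rgrad = G - U @ sym
        # Retraction: take a step, then map back to the manifold via QR.
        U_new, _ = torch.linalg.qr(U - lr * rgrad)
        U.copy_(U_new)
        U.grad.zero_()

# The objective should approach the sum of the top-p eigenvalues of A.
evals = torch.linalg.eigvalsh(A)
print(torch.trace(U.T @ A @ U).item(), evals[-p:].sum().item())
```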
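For the low-rank tensor learning theme, a brief reminder of the standard latent trace-norm regularizer that this line of work builds on (the non-sparse variant and the dual framework in the paper differ in the details): for a $K$-mode tensor $\mathcal{W}$, the latent trace norm is

$$\min_{\mathcal{W}^{(1)} + \cdots + \mathcal{W}^{(K)} = \mathcal{W}} \ \sum_{k=1}^{K} \big\|\mathcal{W}^{(k)}_{(k)}\big\|_*,$$

where $\mathcal{W}^{(k)}_{(k)}$ is the mode-$k$ matricization of the $k$-th latent component and $\|\cdot\|_*$ is the matrix nuclear (trace) norm. The outer sum acts like an $\ell_1$ penalty over the latent components, so the minimizer tends to concentrate on a sparse combination of them; the non-sparse regularizer mentioned above instead keeps all components active.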
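For the structured low-rank matrix theme, the variational characterization of the nuclear norm referred to there is the standard factorization identity

$$\|X\|_* \;=\; \min_{U V^\top = X} \ \tfrac{1}{2}\left(\|U\|_F^2 + \|V\|_F^2\right),$$

valid whenever the factors $U \in \mathbb{R}^{m \times r}$ and $V \in \mathbb{R}^{n \times r}$ have $r \geq \mathrm{rank}(X)$. It allows a nuclear-norm regularized problem to be rewritten in terms of the two factors, which is, roughly, how the low-rank part and the structural constraints end up decoupled onto separate factors in the saddle-point formulation.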
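For the sparse positive semi-definite matrices theme, a generic sketch of the output kernel learning setup it fits into, with notation of my choosing and without the specific regularizers for which our guarantee holds: with $T$ tasks, one jointly learns the predictors and a task-similarity (output) kernel $\Omega \in \mathbb{R}^{T \times T}$,

$$\min_{\Omega \succeq 0,\; f} \ \sum_{t=1}^{T} \sum_{i=1}^{n_t} \ell\big(y_{ti}, f(x_{ti}, t)\big) \;+\; \frac{\lambda}{2}\,\|f\|^2_{\mathcal{H}} \;+\; \eta\, V(\Omega),$$

where $\mathcal{H}$ is the RKHS of the joint kernel $k\big((x,t),(x',t')\big) = k_X(x,x')\,\Omega_{tt'}$ on (input, task) pairs and $V$ is a regularizer on $\Omega$. The point of the NIPS 2015 work is that, for suitable choices of $V$, the explicit constraint $\Omega \succeq 0$ can be dropped because the optimal $\Omega$ is provably positive semi-definite, which is what makes a scalable stochastic dual coordinate ascent solver possible.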