About Me

I am a third-year PhD student in the Computer Science department at the University of California, San Diego, advised by Prof. Sanjoy Dasgupta. I also collaborate closely with Prof. Misha Belkin. Previously, I was a research fellow in the Machine Teaching Group at the Max Planck Institute for Software Systems, where I had the great fortune of being advised by Dr. Adish Singla.

My journey into computer science started at Chennai Mathematical Institute (CMI, India), where I completed a BSc in Mathematics and Computer Science (2013-2016) and an MSc in Computer Science (2016-2018) under the supervision of Prof. K Venkata Subrahmanyam.

I am broadly interested in the theoretical aspects of machine learning. More specifically, I work on statistical machine learning, algorithms, interactive learning, optimization, and the theory of deep learning. I enjoy exploring and applying ideas from probability theory, analysis, differential geometry, and statistics to understand the computational and statistical efficiency of learning methods, and the extent to which machines can learn from data.

Contact: (username id) akk002 at ucsd dot edu

I am looking for a research/engineering internship position for Summer 2024. Please request my CV via email.

Recent News

  1. [June-Sept, 2023] I was a research scientist intern at Adobe Research (San Jose, CA).
  2. [Aug, 2022] Attended the Deep Learning Theory Workshop at the Simons Institute, UC Berkeley.
  3. [July, 2022] Attended a summer school on Discrete Mathematics at Charles University, Prague, Czech Republic.

Publications and Preprints

  1. Mirror Descent on Reproducing Kernel Banach Space (RKBS)
    Akash Kumar, Parthe Pandit, Misha Belkin.
    In preparation.

  2. Convergence of Nearest Neighbor Selective Classification
    Akash Kumar, Sanjoy Dasgupta.
    In preparation.

  3. Robust Empirical Risk Minimization with Tolerance
    Robi Bhattacharjee, Kamalika Chaudhuri, Max Hopkins, Akash Kumar, Hantao Yu (alphabetical order)
    Accepted at the 34th International Conference on Algorithmic Learning Theory (ALT’23), 2023
    A preliminary version appeared in AdvML Frontiers @ ICML 2022
    [ArXiv 2023]

  4. Teaching via Best-Case Counterexamples in the Learning-with-Equivalence-Queries Paradigm
    Akash Kumar, Yuxin Chen, Adish Singla.
    Accepted at the 35th Conference on Neural Information Processing Systems (NeurIPS’21), 2021
    [Proc 2021], [Openreview]

  5. The Teaching Dimension of Kernel Perceptrons
    Akash Kumar, Hanqi Zhang, Adish Singla, Yuxin Chen.
    Accepted at the 24th International Conference on Artificial Intelligence and Statistics (AISTATS’21), 2021
    [ArXiv 2021], [Proc 2021]

  6. Average-case Complexity of Teaching Convex Polytopes via Halfspace Queries
    Akash Kumar, Adish Singla, Yisong Yue, Yuxin Chen.
    [ArXiv 2020]
    Rejected from ICML 2021 with 6 reviews
    Rejected from NeurIPS 2020 with 4 reviews

  7. Deletion to Induced Matching
    Akash Kumar, Mithilesh Kumar.
    [ArXiv 2020]

Talks

Feature Learning in Large Language Models (Adobe Research, San Jose)
Teaching via Best-case Counterexamples (UCSD AI Seminar)

Some Notes

Improved Certified Adversarial Lower Bound Using Adaptive Relaxations
Ongoing project on adversarial deep learning.

Escaping Saddle Points and Tensor Decomposition
Master’s Thesis under the guidance of Dr. K V Subrahmanyam. [Slides]

Natural Proofs vs. Derandomization
Project report completed as part of the Advanced Complexity course at Chennai Mathematical Institute.