About Me
I am currently a Ph.D. candidate in the Department of Electrical and Computer Engineering at the University of Wisconsin-Madison, advised by Prof. Kangwook Lee and Prof. Robert D. Nowak. Before coming to Madison, I received my B.S. in Electrical and Computer Engineering from Rutgers University, where I was advised by Prof. Waheed U. Bajwa.
For more about my experience, check out my CV.
My research interests span both the theoretical and practical aspects of signal processing and machine learning. In particular, I focus on bridging the gap between theory and practice in deep learning: developing novel theories for modern paradigms such as multi-task learning, and using these theoretical insights to guide the design of practical deep learning systems.
Publications
A New Neural Kernel Regime: The Inductive Bias of Multi-Task Learning
Julia Nakhleh, Joseph Shenouda, Robert D. Nowak.
Advances in Neural Information Processing Systems (NeurIPS) 2024
[paper] [arXiv]
Variation Spaces for Multi-Output Neural Networks: Insights on Multi-Task Learning and Network Compression
Joseph Shenouda, Rahul Parhi, Kangwook Lee, Robert D. Nowak.
Journal of Machine Learning Research (JMLR) 2024
[paper] [arXiv]
ReLUs Are Sufficient for Learning Implicit Neural Representations
Joseph Shenouda, Yamin Zhou, Robert D. Nowak.
International Conference on Machine Learning (ICML) 2024
[arXiv] [code]
A Continuous Transform for Localized Ridgelets
Joseph Shenouda, Rahul Parhi, Robert D. Nowak.
Sampling Theory and Applications Conference (SampTA) 2023 (contributed talk)
[paper]
A Guide to Computational Reproducibility in Signal Processing and Machine Learning
Joseph Shenouda and Waheed U. Bajwa.
IEEE Signal Processing Magazine 2023
[paper]
Workshop Papers
A Representer Theorem for Vector-Valued Neural Networks: Insights on Weight Decay Training and Widths of Deep Neural Networks
Joseph Shenouda, Rahul Parhi, Kangwook Lee, Robert D. Nowak.
International Conference on Machine Learning (ICML) Duality Principles for Modern ML Workshop (contributed talk)
[video]
A Better Way to Decay: Proximal Gradient Training Algorithms for Neural Nets
Liu Yang, Jifan Zhang, Joseph Shenouda, Dimitris Papailiopoulos, Kangwook Lee, Robert D. Nowak.
Neural Information Processing Systems (NeurIPS) OPT-ML Workshop 2022
[paper]
Preprints
PathProx: A Proximal Gradient Algorithm for Weight Decay Regularized Deep Neural Networks
Liu Yang, Jifan Zhang, Joseph Shenouda, Dimitris Papailiopoulos, Kangwook Lee, Robert D. Nowak.
In Review
[arXiv] [code]