About Me

I am currently a Ph.D. candidate in the Department of Electrical and Computer Engineering at the University of Wisconsin-Madison, advised by Prof. Kangwook Lee and Prof. Robert D. Nowak. Before coming to Madison, I received my B.S. in Electrical and Computer Engineering from Rutgers University, where I was advised by Prof. Waheed U. Bajwa.

For more about my experience, check out my CV.

My research interests span both the theoretical and practical aspects of signal processing and machine learning. Most recently, I have been interested in understanding deep neural networks and the effects of explicit regularization, such as weight decay.

Preprints

Vector-Valued Variation Spaces and Width Bounds for DNNs: Insights on Weight Decay Regularization
Joseph Shenouda, Rahul Parhi, Kangwook Lee, Robert D. Nowak.
In Review
[arXiv]

PathProx: A Proximal Gradient Algorithm for Weight Decay Regularized Deep Neural Networks
Liu Yang, Jifan Zhang, Joseph Shenouda, Dimitris Papailiopoulos, Kangwook Lee, Robert D. Nowak.
In Review
[arXiv] [code]

Publications

A Representer Theorem for Vector-Valued Neural Networks: Insights on Weight Decay Training and Widths of Deep Neural Networks
Joseph Shenouda, Rahul Parhi, Kangwook Lee, Robert D. Nowak.
International Conference on Machine Learning (ICML) Duality Principles for Modern ML Workshop (contributed talk)
[video]

A Continuous Transform for Localized Ridgelets
Joseph Shenouda, Rahul Parhi, Robert D. Nowak.
Sampling Theory and Applications Conference (SampTA) 2023 (contributed talk)
[paper]

A Guide to Computational Reproducibility in Signal Processing and Machine Learning
Joseph Shenouda and Waheed U. Bajwa.
IEEE Signal Processing Magazine 2023
[paper]

A Better Way to Decay: Proximal Gradient Training Algorithms for Neural Nets
Liu Yang, Jifan Zhang, Joseph Shenouda, Dimitris Papailiopoulos, Kangwook Lee, Robert D. Nowak.
Neural Information Processing Systems (NeurIPS) OPT-ML Workshop 2022
[paper]