Jakob Heiss
Other names: Jakob M Heiss
PhD Student, ETH Zürich
Verified email at math.ethz.ch - Homepage
Title · Cited by · Year
NOMU: Neural Optimization-based Model Uncertainty
J Heiss, J Weissteiner, H Wutte, S Seuken, J Teichmann
International Conference on Machine Learning (ICML'22), 8708-8758, 2022
Cited by 19 · 2022
Bayesian Optimization-based Combinatorial Assignment
J Weissteiner, J Heiss, J Siems, S Seuken
AAAI Conference on Artificial Intelligence (AAAI'23), 2022
Cited by 14 · 2022
How Implicit Regularization of ReLU Neural Networks Characterizes the Learned Function--Part I: the 1-D Case of Two Layers with Random First Layer
J Heiss, J Teichmann, H Wutte
arXiv preprint arXiv:1911.02903, 2019
Cited by 11 · 2019
Monotone-Value Neural Networks: Exploiting Preference Monotonicity in Combinatorial Assignment
J Weissteiner, J Heiss, J Siems, S Seuken
International Joint Conference on Artificial Intelligence (IJCAI'22), 541-548, 2022
Cited by 10 · 2022
How (Implicit) Regularization of ReLU Neural Networks Characterizes the Learned Function--Part II: the Multi-D Case of Two Layers with Random First Layer
J Heiss, J Teichmann, H Wutte
arXiv preprint arXiv:2303.11454, 2023
Cited by 3 · 2023
How Infinitely Wide Neural Networks Benefit from Multi-task Learning--an Exact Macroscopic Characterization
J Heiss, J Teichmann, H Wutte
ETH Zurich, 2022
Cited by 3* · 2022
Machine Learning-powered Combinatorial Clock Auction
EN Soumalias, J Weissteiner, J Heiss, S Seuken
Proceedings of the AAAI Conference on Artificial Intelligence 38 (9), 9891-9900, 2024
2024
Extending Path-Dependent NJ-ODEs to Noisy Observations and a Dependent Observation Framework
W Andersson, J Heiss, F Krach, J Teichmann
arXiv preprint arXiv:2307.13147, 2023
2023
Reducing the number of neurons of Deep ReLU Networks based on the current theory of Regularization
J Heiss, A Stockinger, J Teichmann
2020