Xinyu Tang
Verified email at princeton.edu - Homepage
Title
Cited by
Year
Understanding human gaze communication by spatio-temporal graph reasoning
L Fan, W Wang, S Huang, X Tang, SC Zhu
Proceedings of the IEEE/CVF International Conference on Computer Vision …, 2019
121 · 2019
Mitigating membership inference attacks by self-distillation through a novel ensemble architecture
X Tang, S Mahloujifar, L Song, V Shejwalkar, M Nasr, A Houmansadr, ...
31st USENIX Security Symposium (USENIX Security 22), 1433-1450, 2022
70 · 2022
Privacy-Preserving In-Context Learning with Differentially Private Few-Shot Generation
X Tang, R Shin, HA Inan, A Manoel, F Mireshghallah, Z Lin, S Gopi, ...
ICLR 2024, 2024
20 · 2024
Machine Learning with Differentially Private Labels: Mechanisms and Frameworks
X Tang, M Nasr, S Mahloujifar, V Shejwalkar, L Song, A Houmansadr, ...
Proceedings on Privacy Enhancing Technologies 4, 332-350, 2022
14 · 2022
A New Linear Scaling Rule for Private Adaptive Hyperparameter Optimization
A Panda, X Tang, S Mahloujifar, V Sehwag, P Mittal
ICML 2024, 2024
13* · 2024
Effectively Using Public Data in Privacy Preserving Machine Learning
M Nasr, S Mahloujifar, X Tang, P Mittal, A Houmansadr
International Conference on Machine Learning, 25718-25732, 2023
10 · 2023
Differentially Private Image Classification by Learning Priors from Random Processes
X Tang, A Panda, V Sehwag, P Mittal
NeurIPS 2023, 35855-35877, 2023
9 · 2023
Private Fine-tuning of Large Language Models with Zeroth-order Optimization
X Tang, A Panda, M Nasr, S Mahloujifar, P Mittal
arXiv preprint arXiv:2401.04343, 2024
5 · 2024
Privacy Auditing of Large Language Models
A Panda, X Tang, M Nasr, CA Choquette-Choo, P Mittal
ICML 2024 Workshop on Foundation Models in the Wild, 2024
2024
Differentially Private Generation of High Fidelity Samples From Diffusion Models
V Sehwag, A Panda, A Pokle, X Tang, S Mahloujifar, M Chiang, JZ Kolter, ...
ICML 2023 Workshop Challenges in Deployable Generative AI, 2023
2023