Thomas Brox
University of Freiburg and Amazon
Verified email at cs.uni-freiburg.de - Homepage
Title
Cited by
Year
U-net: Convolutional networks for biomedical image segmentation
O Ronneberger, P Fischer, T Brox
International Conference on Medical image computing and computer-assisted …, 2015
33007 · 2015
Striving for simplicity: The all convolutional net
JT Springenberg, A Dosovitskiy, T Brox, M Riedmiller
arXiv preprint arXiv:1412.6806, 2014
3413 · 2014
High accuracy optical flow estimation based on a theory for warping
T Brox, A Bruhn, N Papenberg, J Weickert
European conference on computer vision, 25-36, 2004
3225 · 2004
3D U-Net: learning dense volumetric segmentation from sparse annotation
Ö Çiçek, A Abdulkadir, SS Lienkamp, T Brox, O Ronneberger
International conference on medical image computing and computer-assisted …, 2016
3102 · 2016
Flownet: Learning optical flow with convolutional networks
A Dosovitskiy, P Fischer, E Ilg, C Hazirbas, V Golkov, P van der Smagt, ...
2015 IEEE International Conference on Computer Vision (ICCV), 2758-2766, 2015
2865* · 2015
Flownet 2.0: Evolution of optical flow estimation with deep networks
E Ilg, N Mayer, T Saikia, M Keuper, A Dosovitskiy, T Brox
Proceedings of the IEEE conference on computer vision and pattern …, 2017
2007 · 2017
Large displacement optical flow: descriptor matching in variational motion estimation
T Brox, J Malik
IEEE transactions on pattern analysis and machine intelligence 33 (3), 500-513, 2010
1551 · 2010
A large dataset to train convolutional networks for disparity, optical flow, and scene flow estimation
N Mayer, E Ilg, P Hausser, P Fischer, D Cremers, A Dosovitskiy, T Brox
Proceedings of the IEEE conference on computer vision and pattern …, 2016
1478 · 2016
Object segmentation by long term analysis of point trajectories
T Brox, J Malik
European conference on computer vision, 282-295, 2010
903 · 2010
Generating images with perceptual similarity metrics based on deep networks
A Dosovitskiy, T Brox
Advances in neural information processing systems 29, 658-666, 2016
866 · 2016
Discriminative unsupervised feature learning with convolutional neural networks
A Dosovitskiy, JT Springenberg, M Riedmiller, T Brox
Advances in neural information processing systems 27, 766-774, 2014
732 · 2014
Learning to generate chairs with convolutional neural networks
A Dosovitskiy, JT Springenberg, T Brox
Proceedings of the IEEE conference on computer vision and pattern …, 2015
709 · 2015
U-Net: deep learning for cell counting, detection, and morphometry
T Falk, D Mai, R Bensch, Ö Çiçek, A Abdulkadir, Y Marrakchi, A Böhm, ...
Nature methods 16 (1), 67-70, 2019
649 · 2019
Highly accurate optic flow computation with theoretically justified warping
N Papenberg, A Bruhn, T Brox, S Didas, J Weickert
International Journal of Computer Vision 67 (2), 141-158, 2006
588 · 2006
Inverting visual representations with convolutional networks
A Dosovitskiy, T Brox
Proceedings of the IEEE conference on computer vision and pattern …, 2016
583* · 2016
Octree generating networks: Efficient convolutional architectures for high-resolution 3d outputs
M Tatarchenko, A Dosovitskiy, T Brox
Proceedings of the IEEE International Conference on Computer Vision, 2088-2096, 2017
510 · 2017
Dense point trajectories by gpu-accelerated large displacement optical flow
N Sundaram, T Brox, K Keutzer
European conference on computer vision, 438-451, 2010
503 · 2010
Demon: Depth and motion network for learning monocular stereo
B Ummenhofer, H Zhou, J Uhrig, N Mayer, E Ilg, A Dosovitskiy, T Brox
Proceedings of the IEEE Conference on Computer Vision and Pattern …, 2017
501 · 2017
Segmentation of moving objects by long term video analysis
P Ochs, J Malik, T Brox
IEEE Transactions on Pattern Analysis and Machine Intelligence 33 (3), 500-513, 2013
494 · 2013
Synthesizing the preferred inputs for neurons in neural networks via deep generator networks
A Nguyen, A Dosovitskiy, J Yosinski, T Brox, J Clune
Advances in neural information processing systems 29, 3387-3395, 2016
489 · 2016
Articles 1–20