David Ifeoluwa Adelani
PhD Student, Saarland University, Saarland Informatics Campus
Cited by
Demographic inference and representative population estimates from multilingual social media data
Z Wang, S Hale, DI Adelani, P Grabowicz, T Hartman, F Flöck, D Jurgens
The world wide web conference, 2056-2067, 2019
Generating sentiment-preserving fake online reviews using neural language models and their human- and machine-based detection
DI Adelani, H Mai, F Fang, HH Nguyen, J Yamagishi, I Echizen
International Conference on Advanced Information Networking and Applications …, 2020
Transfer learning and distant supervision for multilingual transformer models: A study on African languages
MA Hedderich, D Adelani, D Zhu, J Alabi, U Markus, D Klakow
Proceedings of the 2020 Conference on Empirical Methods in Natural Language …, 2020
Massive vs. curated embeddings for low-resourced languages: the case of Yorùbá and Twi
J Alabi, K Amponsah-Kaakyire, D Adelani, C España-Bonet
Proceedings of the 12th Language Resources and Evaluation Conference, 2754-2762, 2020
MasakhaNER: Named entity recognition for African languages
DI Adelani, J Abbott, G Neubig, D D’souza, J Kreutzer, C Lignos, ...
Transactions of the Association for Computational Linguistics 9, 1116-1131, 2021
Unsupervised pidgin text generation by pivoting English data and self-training
E Chang, DI Adelani, X Shen, V Demberg
arXiv preprint arXiv:2003.08272, 2020
A secure e-voting architecture
AS Sodiya, SA Onashoga, DI Adelani
2011 Eighth International Conference on Information Technology: New …, 2011
The Effect of Domain and Diacritics in Yorùbá-English Neural Machine Translation
DI Adelani, D Ruiter, JO Alabi, D Adebonojo, A Ayeni, M Adeyemi, ...
Proceedings of Machine Translation Summit XVIII: Research Track, 61-75, 2021
Privacy guarantees for de-identifying text transformations
DI Adelani, A Davody, T Kleinbauer, D Klakow
Proc. Interspeech 2020, 4666-4670, 2020
Investigating the impact of pre-trained word embeddings on memorization in neural networks
A Thomas, DI Adelani, A Davody, A Mogadala, D Klakow
International Conference on Text, Speech, and Dialogue, 273-281, 2020
Improving Yorùbá Diacritic Restoration
I Orife, DI Adelani, T Fasubaa, V Williamson, WF Oyewusi, O Wahab, ...
arXiv preprint arXiv:2003.10564, 2020
Distant Supervision and Noisy Label Learning for Low Resource Named Entity Recognition: A Study on Hausa and Yorùbá
DI Adelani, MA Hedderich, D Zhu, E Berg, D Klakow
arXiv preprint arXiv:2003.08370, 2020
NaijaSenti: A Nigerian Twitter sentiment corpus for multilingual sentiment analysis
SH Muhammad, DI Adelani, IS Ahmad, I Abdulmumin, BS Bello, ...
arXiv preprint arXiv:2201.08277, 2022
On the effect of normalization layers on differentially private training of deep neural networks
A Davody, DI Adelani, T Kleinbauer, D Klakow
arXiv preprint arXiv:2006.10919, 2020
Estimating community feedback effect on topic choice in social media with predictive modeling
DI Adelani, R Kobayashi, I Weber, PA Grabowicz
EPJ Data Science 9 (1), 25, 2020
Enhancing the reusability and interoperability of artificial neural networks with DEVS modeling and simulation
DI Adelani, MK Traore
International Journal of Modeling, Simulation, and Scientific Computing 7 (3), 2015
AI4D – African Language Program
K Siminyu, G Kalipe, D Orlic, J Abbott, V Marivate, S Freshia, P Sibal, ...
AfricaNLP 2021, 2021
MCSE: Multimodal Contrastive Learning of Sentence Embeddings
M Zhang, M Mosbach, DI Adelani, MA Hedderich, D Klakow
arXiv preprint arXiv:2204.10931, 2022
Preventing author profiling through zero-shot multilingual back-translation
DI Adelani, M Zhang, X Shen, A Davody, T Kleinbauer, D Klakow
Proceedings of the 2021 Conference on Empirical Methods in Natural Language …, 2021
BibleTTS: a large, high-fidelity, multilingual, and uniquely African speech corpus
J Meyer, DI Adelani, E Casanova, A Öktem, DWJ Weber, S Kabongo, ...
arXiv preprint arXiv:2207.03546, 2022