Tyler A. Chang
Co-Scale Conv-Attentional Image Transformers
W Xu, Y Xu, TA Chang, Z Tu
International Conference on Computer Vision, 2021
Cited by: 401
Do Large Language Models Know What Humans Know?
S Trott, C Jones, T Chang, J Michaelov, B Bergen
Cognitive Science, 2023
Cited by: 95
Language Model Behavior: A Comprehensive Survey
TA Chang, BK Bergen
Computational Linguistics, 2024
Cited by: 74
Word Acquisition in Neural Language Models
TA Chang, BK Bergen
Transactions of the Association for Computational Linguistics 10, 1-16, 2022
Cited by: 47
The Geometry of Multilingual Language Model Representations
TA Chang, Z Tu, BK Bergen
Conference on Empirical Methods in Natural Language Processing, 2022
Cited by: 44
Distributional Semantics Still Can’t Account for Affordances
CR Jones, TA Chang, S Coulson, JA Michaelov, S Trott, BK Bergen
Annual Meeting of the Cognitive Science Society 44 (44), 2022
Cited by: 24
Convolutions and Self-Attention: Re-interpreting Relative Positions in Pre-trained Language Models
TA Chang, Y Xu, W Xu, Z Tu
Annual Meeting of the Association for Computational Linguistics and the …, 2021
Cited by: 15
When Is Multilinguality a Curse? Language Modeling for 250 High- and Low-Resource Languages
TA Chang, C Arnett, Z Tu, BK Bergen
Conference on Empirical Methods in Natural Language Processing, 2024
Cited by: 13
Structural Priming Demonstrates Abstract Grammatical Representations in Multilingual Language Models
JA Michaelov, C Arnett, TA Chang, BK Bergen
Conference on Empirical Methods in Natural Language Processing, 2023
Cited by: 10
Encodings of Source Syntax: Similarities in NMT Representations Across Target Languages
TA Chang, AN Rafferty
5th Workshop on Representation Learning for NLP at ACL, 2020
Cited by: 4
Different Tokenization Schemes Lead to Comparable Performance in Spanish Number Agreement
C Arnett, PD Rivière, TA Chang, S Trott
SIGMORPHON Workshop on Computational Research in Phonetics, Phonology, and …, 2024
Cited by: 3
Characterizing Learning Curves During Language Model Pre-Training: Learning, Forgetting, and Stability
TA Chang, Z Tu, BK Bergen
Transactions of the Association for Computational Linguistics, 2024
Cited by: 3
Crosslingual Structural Priming and the Pre-Training Dynamics of Bilingual Language Models
C Arnett, TA Chang, JA Michaelov, BK Bergen
3rd Multilingual Representation Learning Workshop at EMNLP, 2023
Cited by: 2
Characterizing and Measuring Linguistic Dataset Drift
TA Chang, K Halder, NA John, Y Vyas, Y Benajiba, M Ballesteros, D Roth
Annual Meeting of the Association for Computational Linguistics, 2023
22023
Does Contextual Diversity Hinder Early Word Acquisition?
TA Chang, BK Bergen
Annual Meeting of the Cognitive Science Society 44 (44), 2022
Cited by: 2
Goldfish: Monolingual Language Models for 350 Languages
TA Chang, C Arnett, Z Tu, BK Bergen
arXiv preprint arXiv:2408.10441, 2024
Cited by: 1
Detecting Hallucination and Coverage Errors in Retrieval Augmented Generation for Controversial Topics
TA Chang, K Tomanek, J Hoffmann, N Thain, E van Liemt, ...
Joint International Conference on Computational Linguistics, Language …, 2024
Cited by: 1
A Bit of a Problem: Measurement Disparities in Dataset Sizes Across Languages
C Arnett, TA Chang, BK Bergen
Annual Meeting of the Special Interest Group on Under-Resourced Languages at …, 2024
Cited by: 1
Scalable Influence and Fact Tracing for Large Language Model Pretraining
TA Chang, D Rajagopal, T Bolukbasi, L Dixon, I Tenney
arXiv preprint arXiv:2410.17413, 2024
Correlations between Multilingual Language Model Geometry and Crosslingual Transfer Performance
C Shah, Y Chandak, AM Mane, B Bergen, TA Chang
Joint International Conference on Computational Linguistics, Language …, 2024