Weiting Tan
Center for Language and Speech Processing, Johns Hopkins University
Hi👋, I’m Weiting Tan! I’m a second-year PhD student in Computer Science at Johns Hopkins University, advised by Prof. Philipp Koehn. Previously, I completed my undergraduate and Master’s degrees in Computer Science at JHU. Go Hop!
My research interests lie at the intersection of machine learning and natural language processing. In particular, I am interested in efficient and scalable representation learning methods for cross-modal applications.
If you have anything to share with me, please feel free to contact me by email: wtan12 at jhu.edu
news
| Date | News |
| --- | --- |
| Sep 25, 2024 | DiffNorm accepted to NeurIPS 2024! Please come to our poster; see you in Vancouver! |
| Aug 02, 2024 | I will be working on multi-modal LLMs as a part-time student researcher at Meta in Fall 2024 and Spring 2025. |
| Feb 20, 2024 | I will be interning at Meta AI (FAIR) in summer 2024, working on speech large language models. Looking forward to the new project! |
| Apr 07, 2023 | I will be staying at Johns Hopkins University for my PhD, working with Prof. Philipp Koehn! |
selected publications
- DiffNorm: Self-Supervised Normalization for Non-autoregressive Speech-to-speech Translation. In Advances in Neural Information Processing Systems (NeurIPS), 2024.
- Contrastive Preference Optimization: Pushing the Boundaries of LLM Performance in Machine Translation. In Proceedings of the 41st International Conference on Machine Learning (ICML), Jul 2024.
- Narrowing the Gap between Zero- and Few-shot Machine Translation by Matching Styles. In Findings of the Association for Computational Linguistics: NAACL 2024, Jun 2024.
- The Language Barrier: Dissecting Safety Challenges of LLMs in Multilingual Contexts. In Findings of the Association for Computational Linguistics: ACL 2024, Aug 2024.
- Multilingual Representation Distillation with Contrastive Learning. In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics (EACL), May 2023.
- Flatness-Aware Prompt Selection Improves Accuracy and Sample Efficiency. In Findings of the Association for Computational Linguistics: EMNLP 2023, Dec 2023.
- Condensing Multilingual Knowledge with Lightweight Language-Specific Modules. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP), Dec 2023.