Fall 2022 Schedule
Revision as of 10:49, 13 September 2021
| Date | Title |
|---|---|
| 9.1.21 | Planning, introductions, welcome! CompSem meetings will be hybrid this semester - in person at Fleming 279, and online here: https://cuboulder.zoom.us/j/97014876908 |
| 9.8.21 | 10am (NOTE: special start time). Yoshinari Fujinuma thesis defense: Analysis and Applications of Cross-Lingual Models in Natural Language Processing. Human languages vary in terms of both typology and data availability. A typical machine learning-based approach for natural language processing (NLP) requires training data from the language of interest. However, because machine learning-based approaches rely heavily on the amount of data available in each language, the quality of trained models for languages without a large amount of data is poor. One way to overcome the lack of data in each language is to conduct cross-lingual transfer learning from resource-rich languages to resource-scarce languages. Cross-lingual word embeddings and multilingual contextualized embeddings are commonly used to conduct cross-lingual transfer learning. However, the lack of resources still makes it challenging to either evaluate or improve such models. This dissertation first proposes a graph-based method to overcome the lack of evaluation data in low-resource languages by focusing on the structure of cross-lingual word embeddings, then discusses approaches to improve cross-lingual transfer learning by using retrofitting methods and by focusing on a specific task. Finally, it provides an analysis of the effect of adding different languages when pretraining multilingual models. |
| 9.15.21 | ACL best paper recaps |
| 9.22.21 | Introduction to AI Institute (short talks) |
| 9.29.21 | |
| 10.6.21 | |
| 10.13.21 | Invited guest: Arya McCarthy |
| 10.20.21 | |
| 10.27.21 | Invited talk: Lisa Miracchi |
| 11.3.21 | EMNLP practice talks |
| 11.10.21 | EMNLP - no meeting |
| 11.17.21 | Elizabeth Spaulding prelim |
| 11.24.21 | Fall break - no meeting |
| 12.1.21 | Invited talk: Abe Handler |
| 12.8.21 | Abhidip Bhattacharyya proposal defense |
Past Schedules
- Spring 2021 Schedule
- Fall 2020 Schedule
- Spring 2020 Schedule
- Fall 2019 Schedule
- Spring 2019 Schedule
- Fall 2018 Schedule
- Summer 2018 Schedule
- Spring 2018 Schedule
- Fall 2017 Schedule
- Summer 2017 Schedule
- Spring 2017 Schedule
- Fall 2016 Schedule
- Spring 2016 Schedule
- Fall 2015 Schedule
- Spring 2015 Schedule
- Fall 2014 Schedule
- Spring 2014 Schedule
- Fall 2013 Schedule
- Summer 2013 Schedule
- Spring 2013 Schedule
- Fall 2012 Schedule
- Spring 2012 Schedule
- Fall 2011 Schedule
- Summer 2011 Schedule
- Spring 2011 Schedule
- Fall 2010 Schedule
- Summer 2010 Schedule
- Spring 2010 Schedule
- Fall 2009 Schedule
- Summer 2009 Schedule
- Spring 2009 Schedule
- Fall 2008 Schedule
- Summer 2008 Schedule