Yu-Hsiang Lin

Email: hitr2997925@gmail.com
LinkedIn: https://www.linkedin.com/in/yhl2997925/
GitHub: https://github.com/yuhsianglin

I am an applied scientist at Amazon. I enjoy working on both scientific projects in machine learning and engineering projects in search and online experimentation systems. I have built neural generative language models to help customers find relevant new products, introduced graph neural networks and new model training methods to address low-resource learning problems in product search, and developed online ranking features to improve search performance on long-tail queries. I also built the data pipeline that generates online A/B testing metrics used by teams across Amazon, as well as a feature build that publishes new product features to the search index.

Previously, I was a software development intern at A9, where I developed a two-phase ranking model for Amazon Business.

I received my master's degree from the Language Technologies Institute at Carnegie Mellon University and my Ph.D. in physics from National Taiwan University. My master's training focused on machine learning, deep learning, natural language processing, and distributed systems. My Ph.D. research was on high-energy physics and the inflationary universe.

At CMU I worked on topics in natural language processing, including cross-lingual transfer for low-resource languages, advised by Graham Neubig (ACL 2019). I also worked on reinforcement learning, generative adversarial networks, and structured prediction.

At NTU I had the opportunity to work with Chih-Jen Lin on second-order optimization methods for deep neural networks (Neural Computation 2018).