Clemson University

"Learning Hyperparameters for Neural Machine Translation" - Dr. Kenton Murray (Johns Hopkins University)



Machine Translation, the subfield of Computer Science that focuses on translating between two human languages, has greatly benefited from neural networks. However, these neural machine translation systems have complicated architectures with many hyperparameters that must be chosen manually. Frequently, these are selected either through a grid search over values or by using values commonplace in the literature. However, these choices are not theoretically justified, and the same values are not optimal for all language pairs and datasets. Fortunately, the innate structure of the problem allows these hyperparameters to be optimized during training. Traditionally, the hyperparameters of a system are chosen first, and then a learning algorithm optimizes all of the parameters within the model. In this work, I propose three methods to learn the optimal hyperparameters during the training of the model, allowing for one step instead of two. First, I propose using group regularizers to learn the number and size of the hidden neural network layers. Second, I demonstrate how to use a perceptron-like tuning method to address the known problems of undertranslation and label bias. Finally, I propose an Expectation-Maximization-based method to learn the optimal vocabulary size and granularity. Using various techniques from machine learning and numerical optimization, this dissertation covers how to learn the hyperparameters of a Neural Machine Translation system while training the model itself.
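To give a flavor of the first method: group regularizers such as the group lasso penalize the joint norm of a whole block of weights (e.g. all weights attached to one hidden neuron), so entire groups can be driven exactly to zero during training, effectively shrinking the layer. The sketch below is a minimal, hedged illustration of that idea with NumPy, not the talk's actual implementation; the function names and the simple proximal-gradient setting are assumptions for the example.

```python
import numpy as np

def group_lasso_penalty(weights, groups, lam):
    # Group-lasso term: lam * sqrt(|g|) * ||w_g||_2 summed over groups.
    # Unlike plain L1, it pushes whole groups (e.g. a neuron's weights) to zero.
    return sum(lam * np.sqrt(len(g)) * np.linalg.norm(weights[g]) for g in groups)

def prox_group_lasso(weights, groups, step):
    # Proximal operator of the group-lasso penalty: shrink each group's norm
    # by a threshold; groups whose norm falls below it are zeroed ("pruned").
    w = weights.copy()
    for g in groups:
        norm = np.linalg.norm(w[g])
        thresh = step * np.sqrt(len(g))
        if norm <= thresh:
            w[g] = 0.0          # whole group pruned
        else:
            w[g] = w[g] * (1.0 - thresh / norm)  # group kept, shrunk
    return w

# Toy usage: two "neurons" of two weights each; the small-norm one is pruned.
w = np.array([0.1, 0.1, 5.0, 5.0])
groups = [[0, 1], [2, 3]]
w_new = prox_group_lasso(w, groups, step=1.0)
```

Applied after each gradient step, this kind of update lets the optimizer decide how many neurons (and hence how large a layer) the model actually needs, rather than fixing that hyperparameter in advance.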

Friday, January 29, 2021 at 2:30pm to 3:30pm

Virtual Event


Target Audience

All Audiences


College of Engineering, Computing and Applied Sciences, School of Computing, Research Seminars


Contact Name:

Paige Rodeghero
