On this page, you can define and start optimization experiments. An optimization experiment pairs a chosen dataset with a topic modeling algorithm whose hyperparameters (or a subset of them) are optimized with respect to a chosen evaluation metric.


Quick start:

  • Select the folder in which your experiments will be saved. For each experiment, specify a name for the experiment and a name for the set of experiments it belongs to (the batch name). Batches let you group related experiments together, which is useful later for visualization.
  • Select a Dataset among our preprocessed datasets. You can train a model only, or train and test it (the datasets are already split).
  • Select a Model among our predefined topic models, which include both classical and neural topic models.
  • Select the hyperparameters of the model that you want to optimize, and/or fix the values of the remaining hyperparameters. For each optimized hyperparameter, you must specify its value range (see the first sketch after this list).
  • Select the Metric that you want to optimize (just one for now; multi-objective hyperparameter optimization is planned) and its corresponding parameters.
  • Configure the parameters of the Bayesian Optimization (see the second sketch below).

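If you prefer to script the same workflow outside the web interface, the following is a minimal sketch of how the search space and objective from the steps above could be expressed, assuming scikit-optimize (skopt) as the Bayesian Optimization backend. It is only an illustration: train_topic_model and npmi_coherence are hypothetical placeholders for whichever topic model and evaluation metric you selected, not functions provided by this tool.

    # A minimal sketch, assuming scikit-optimize as the optimizer.
    # `train_topic_model` and `npmi_coherence` are hypothetical placeholders
    # for the model and metric chosen in the form above.
    from skopt.space import Real, Integer

    # Optimized hyperparameters get a value range; the rest stay fixed.
    search_space = [
        Real(0.001, 5.0, name="alpha"),      # optimized: Dirichlet prior
        Integer(10, 100, name="num_topics"), # optimized: number of topics
    ]
    fixed_params = {"iterations": 200}       # fixed by the user

    def objective(params):
        alpha, num_topics = params
        model = train_topic_model(alpha=alpha, num_topics=num_topics,
                                  **fixed_params)
        # skopt minimizes, so negate a metric that should be maximized
        return -npmi_coherence(model)
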
Hint:

Click on the names of the hyperparameters, models, or metrics to obtain additional information.
The rest of the page is the experiment form, with the following sections:

  • Select Path
  • Choose a Dataset (only training, or training and testing)
  • Choose a Model
  • Choose the Evaluation Metric to optimize
  • Choose one or more metrics to evaluate
  • Optimization parameters: number of iterations, model runs, initial random points
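
Continuing the sketch above, the three optimization parameters map naturally onto scikit-optimize's gp_minimize: n_calls and n_initial_points are skopt's own argument names, while the model-runs averaging is written out by hand because topic model training is stochastic and a single run gives a noisy metric value.

    from skopt import gp_minimize
    import numpy as np

    MODEL_RUNS = 5  # "Model runs": average the noisy metric over several trainings

    def averaged_objective(params):
        return float(np.mean([objective(params) for _ in range(MODEL_RUNS)]))

    result = gp_minimize(
        averaged_objective,
        search_space,
        n_calls=30,          # "Number of iterations": total BO evaluations
        n_initial_points=5,  # "Initial random points" before the surrogate takes over
        random_state=42,
    )
    print("best value:", -result.fun, "best hyperparameters:", result.x)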