A Glance at AutoML, Part 1: The Experience with NNI

7/19/2019

0. Preface

This is my first time learning automated machine learning (AutoML). I used to think that model selection and hyper-parameter configuration had to be done manually, until I took part in the NNI student project. It's always fun to learn something new.

1. What is NNI

(From GitHub) NNI (Neural Network Intelligence) is a toolkit to help users run automated machine learning (AutoML) experiments. The tool dispatches and runs trial jobs generated by tuning algorithms to search the best neural architecture and/or hyper-parameters in different environments like local machine, remote servers and cloud.

2. What is AutoML

(From Wikipedia) Automated machine learning (AutoML) is the process of automating the end-to-end process of applying machine learning to real-world problems. In a typical machine learning application, practitioners must apply the appropriate data pre-processing, feature engineering, feature extraction, and feature selection methods that make the dataset amenable for machine learning. Following those preprocessing steps, practitioners must then perform algorithm selection and hyperparameter optimization to maximize the predictive performance of their final machine learning model. As many of these steps are often beyond the abilities of non-experts, AutoML was proposed as an artificial intelligence-based solution to the ever-growing challenge of applying machine learning.[1][2] Automating the end-to-end process of applying machine learning offers the advantages of producing simpler solutions, faster creation of those solutions, and models that often outperform models that were designed by hand.

This is a general definition of AutoML:


3. Installation & usage

The installation of NNI is succinct; you just need to follow the docs on GitHub. By the way, I had already configured an Anaconda environment (I think Anaconda is recommended for machine learning work), so I created a new Python environment for NNI.

conda create -n nni python=3.6  # Python >= 3.5 is required
source activate nni

Then use pip to install NNI directly:

pip install nni

To run the test sample, TensorFlow is required; install it with pip:

pip install tensorflow

Clone the NNI repository and run the sample to verify the installation:

git clone -b v0.5.1 https://github.com/Microsoft/nni.git
nnictl create --config nni/examples/trials/mnist/config.yml
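Under the hood, that config.yml points the experiment at a search space file and a trial script. The search space uses NNI's `"_type"`/`"_value"` JSON schema; the keys below are illustrative placeholders, not necessarily the ones in the repo's mnist example:

```python
import json

# Illustrative NNI search space (the "_type"/"_value" schema is NNI's own);
# the parameter names here are examples, not the exact mnist ones.
search_space = {
    "dropout_rate": {"_type": "uniform", "_value": [0.5, 0.9]},
    "batch_size": {"_type": "choice", "_value": [16, 32, 64]},
    "learning_rate": {"_type": "choice", "_value": [0.0001, 0.001, 0.01]},
}

# NNI reads this as JSON, typically from a search_space.json file
# referenced by config.yml.
print(json.dumps(search_space, indent=2))
```

Each trial the tuner launches receives one concrete sample from this space.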

The output information:

INFO: Starting restful server...
INFO: Successfully started Restful server!
INFO: Setting local config...
INFO: Successfully set local config!
INFO: Starting experiment...
INFO: Successfully started experiment!
The experiment id is rD1uGvg9
The Web UI urls are:

You can use these commands to get more information about the experiment
         commands                       description
1. nnictl experiment show        show the information of experiments
2. nnictl trial ls               list all of trial jobs
3. nnictl top                    monitor the status of running experiments
4. nnictl log stderr             show stderr log content
5. nnictl log stdout             show stdout log content
6. nnictl stop                   stop an experiment
7. nnictl trial kill             kill a trial job by id
8. nnictl --help                 get help information about nnictl
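Every trial script the experiment runs follows the same pattern: ask NNI for a parameter set, train, report intermediate metrics, then report the final result. Here is a minimal sketch of that control flow; the `nni` module is stubbed so the sketch runs standalone, and the "training" is a placeholder:

```python
class _FakeNNI:
    """Stand-in mimicking the real `nni` trial API for this sketch."""

    @staticmethod
    def get_next_parameter():
        # The real dispatcher returns one sample from search_space.json.
        return {"learning_rate": 0.001, "batch_size": 32}

    @staticmethod
    def report_intermediate_result(metric):
        print("intermediate:", metric)

    @staticmethod
    def report_final_result(metric):
        print("final:", metric)

nni = _FakeNNI  # in a real trial script: `import nni`

def train_one_epoch(params, epoch):
    # Placeholder "training": pretend accuracy improves each epoch.
    return 0.5 + 0.1 * epoch

def main():
    params = nni.get_next_parameter()        # hyper-parameters for this trial
    best = 0.0
    for epoch in range(3):
        acc = train_one_epoch(params, epoch)
        nni.report_intermediate_result(acc)  # per-epoch metric for the tuner
        best = max(best, acc)
    nni.report_final_result(best)            # the trial's final objective

if __name__ == "__main__":
    main()
```

`get_next_parameter`, `report_intermediate_result`, and `report_final_result` are the real NNI trial API calls; everything else above is scaffolding for the sketch.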

A nice feature of NNI is the Web UI; access the URL to see detailed information about the experiment.



When I ran my own experiment, I got the output "Restful server start failed", like issue #432, but the port was free and Node.js was installed correctly. Restarting my PC solved the issue.

4. Summary

Some features impressed me a lot:

  • Succinct, integrated installation: just use pip to install NNI
  • Complete instructions and logging system
  • The Web UI makes viewing experiment information more intuitive


There are also some rough edges:

  • The "Restful server start failed" issue; I hope for a fix at the code level.
  • For some frameworks like Keras, intermediate metrics are not easy to get, so nni.report_intermediate_result can't be applied in real time. The workaround I found is to use the Keras model's training history, but that way the metrics cannot be obtained at each epoch as training runs.