There's more...
There is another approach, called Bayesian optimization, which can also be used to tune hyperparameters. In it, we define an acquisition function along with a Gaussian process. The Gaussian process uses the set of previously evaluated hyperparameters and the resulting accuracies to model the objective at unobserved points; using this model, the acquisition function suggests the next set of hyperparameters to evaluate. A wrapper is even available for gradient-based hyperparameter optimization: https://github.com/lucfra/RFHO.
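As a minimal sketch of how Bayesian optimization can be driven in practice, the following uses the scikit-optimize library's gp_minimize (a Gaussian-process-based optimizer, not part of this recipe) and a hypothetical train_and_evaluate helper that stands in for building and training the network and returning its validation accuracy:

```python
from skopt import gp_minimize
from skopt.space import Real, Integer

# Hyperparameter search space: learning rate (log-uniform) and hidden units.
space = [
    Real(1e-4, 1e-1, prior='log-uniform', name='learning_rate'),
    Integer(16, 256, name='n_hidden'),
]

def objective(params):
    learning_rate, n_hidden = params
    # train_and_evaluate is a hypothetical helper: it builds the model with
    # the given hyperparameters, trains it, and returns validation accuracy.
    accuracy = train_and_evaluate(learning_rate=learning_rate,
                                  n_hidden=int(n_hidden))
    # gp_minimize minimizes the objective, so return the negative accuracy.
    return -accuracy

# The Gaussian process is fitted to the evaluated points; the acquisition
# function picks each of the 30 hyperparameter settings to try next.
result = gp_minimize(objective, space, n_calls=30, random_state=0)
print('Best accuracy: %.4f' % -result.fun)
print('Best parameters: lr=%g, n_hidden=%d' % (result.x[0], result.x[1]))
```

Each call to the objective adds one observation to the Gaussian process, so later suggestions concentrate on the promising regions of the search space rather than sampling it blindly as grid or random search would.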