Advanced Keras Hyperparameter Tuning with the aisaratuners Library

AiSara
Dec 2, 2020

Finally, AI assisting AI!

Photo by Roberta Sorge on Unsplash

Hyperparameter tuning is known to be a time-consuming and computationally expensive process. Even after you have found your optimized parameters, doubt sinks in: shall I try a bit more? The main reason is that you are kept in the dark during the process; there was no way to see the big picture, until now. We at AiSara are happy to introduce the state-of-the-art aisaratuners library, which is up to 10x faster than conventional approaches such as random search.

AiSara's proprietary state-of-the-art algorithm is the secret sauce behind the aisaratuners library. aisaratuners uses Latin hypercube sampling to generate the initial trials, and the AiSara Hyperparameter Tuning API then applies pattern recognition to shrink the solution-space boundaries as the search goes along. Narrowing the search in stages makes it converge faster and with more accuracy, much like a satellite system triangulating an exact GPS location. The following graphs (patent pending, but feel free to republish with citation) illustrate the workflow behind aisaratuners, with 3 rounds consisting of 5 trials each.

aisaratuners is convenient, converges quickly, and can be used by everyone. For now it only works with Keras, but we plan to roll it out for other libraries.

The aisaratuners library can be used to tune numerical hyperparameters, which might include:

  1. learning rate
  2. number of epochs
  3. batch size
  4. dropout rate
  5. number of layers
  6. number of units and filters

To use aisaratuners, the user needs to follow the steps below:

1- Install aisaratuners and import the needed classes
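For example (installation is via pip; the module path and the Hp class name here are illustrative assumptions, while HpOptimization is the class used in step 4 below — check the package documentation for the exact import):

```python
# pip install aisaratuners

# Hp (assumed name) defines the hyperparameters; HpOptimization runs the search
from aisaratuners.aisara_keras_tuner import Hp, HpOptimization
```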

2- Define hyperparameters:
The user needs to define the hyperparameters and their minimum and maximum values. If the learning rate is among them, we recommend changing its hyperparameter type from the default to “log”.
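A minimal sketch of this step (Hp and its numrange method are assumed names, and the min/max/type arguments simply mirror the description above):

```python
hps = Hp()  # container holding all hyperparameters to be tuned (assumed API)

# numerical search ranges; 'log' sampling is the recommended type for learning rate
hp_lr = hps.numrange(name='lr', min=1e-4, max=1e-2, type='log')
hp_units = hps.numrange(name='units', min=32, max=256)
hp_epochs = hps.numrange(name='epochs', min=10, max=50)
hp_batch = hps.numrange(name='batch_size', min=32, max=256)
```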

3- Create a Keras model:
The user creates the model in the usual Keras way; it just has to be wrapped inside a function that returns both the model and its training history, as shown below.

Keras model for binary classification, wrapped in a function, with the hyperparameters defined above to be tuned
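Since the original code appears as an image, here is a comparable sketch (the way tuned values are read back inside the function, hps.hps[...], is an assumption; the rest is standard Keras):

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.optimizers import Adam

# toy binary-classification data, only to keep the sketch self-contained
x_train, y_train = np.random.rand(800, 20), np.random.randint(0, 2, 800)
x_val, y_val = np.random.rand(200, 20), np.random.randint(0, 2, 200)

def mymodel():
    # reading tuned values via hps.hps[...] is an assumed accessor
    model = Sequential([
        Dense(int(hps.hps['units']), activation='relu', input_shape=(20,)),
        Dense(1, activation='sigmoid'),
    ])
    model.compile(optimizer=Adam(learning_rate=float(hps.hps['lr'])),
                  loss='binary_crossentropy',
                  metrics=['accuracy'])
    history = model.fit(x_train, y_train,
                        epochs=int(hps.hps['epochs']),
                        batch_size=int(hps.hps['batch_size']),
                        validation_data=(x_val, y_val),
                        verbose=0)
    # the wrapper must return both the model and its training history
    return model, history
```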

4- Instantiate the HpOptimization class and run the optimizer:
The user needs to specify the optimization parameters: the number of rounds (each round reduces the solution space) and the number of trials per round. As a rule of thumb, 3 rounds of 5 trials each is usually sufficient. Yes, that’s right, only 15 trials in total!
The user can then run the optimization using run_opti().
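For example (the constructor's argument order — hyperparameters, model function, objective metric, direction, trials per round, rounds — is an assumption consistent with the description above; run_opti() is the entry point named here):

```python
# objective: maximize validation accuracy over 3 rounds of 5 trials each
optimizer = HpOptimization(hps, mymodel, 'val_accuracy', 'max', 5, 3)
optimizer.run_opti()  # 3 rounds x 5 trials = 15 trials in total
```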

Using the plot_opti_results() function, the user can visualize a summary of the results. In the plot below, the y-axis shows the accuracy achieved in each round (x-axis). You can see that the accuracy converges quickly, with little to no wasted trials. A random search would require significantly more trials to reach the same level of confidence in the accuracy.

Maximum Accuracy and Accuracy Distribution at Each Round
Parallel Coordinates Plot Visualizing the Performance of Multiple Runs Over a Set Number of Hyperparameters
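Both plots above come from a one-line call on the optimizer object created in step 4:

```python
optimizer.plot_opti_results()  # accuracy per round and parallel-coordinates plot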

Now for the kicker: for the first time ever, the user can see the whole hyperparameter solution space using the plot_search_space() function.

Hyperparameters Solution Space Boundaries at Each Round
The 3D Hyperparameters Solution Space
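The search-space view is likewise a single call on the same optimizer object:

```python
optimizer.plot_search_space()  # solution-space boundaries at each round
```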

We invite you to try the aisaratuners library for fast and convenient hyperparameter tuning. We would also love to hear about your experience with aisaratuners; please comment below or email us at support@aisara.ai.

Note: aisaratuners is free for private use (this includes hackathons, Kaggle competitions, and teaching materials on paid or free education platforms). For commercial use, please register via RapidAPI.


AiSara

We are an innovation-driven AI company; with our proprietary algorithm for pattern recognition, we provide state-of-the-art AI solutions to our clients.