Finally, AI assisting AI!
Hyperparameter tuning is known to be a time-consuming and computationally expensive process. Even after you have found your optimized parameters, doubt creeps in: should I try a bit more? The main reason is that you are kept in the dark during the process; there was no way to see the big picture, until now. Here at AiSara, we are happy to introduce the state-of-the-art aisaratuners library, which is up to 10x faster.
AiSara's proprietary state-of-the-art algorithm is the secret sauce behind the aisaratuners library. aisaratuners uses Latin hypercube sampling to generate the initial trials, and the AiSara Hyperparameter Tuning API then applies pattern recognition to shrink the solution-space boundaries as it goes along. Narrowing the search in stages makes it both faster and more accurate, much like a satellite system triangulating an exact GPS location. The following graphs (patent pending, but feel free to republish with citation) illustrate the innovative workflow behind aisaratuners over 3 rounds of 5 trials each.
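AiSara's algorithm itself is proprietary, but the two-stage idea described above, Latin hypercube sampling for the initial trials followed by shrinking the search boundaries around the best result each round, can be sketched in plain Python. Everything below (the function names, the halving factor, the toy objective) is purely illustrative and is not the aisaratuners implementation:

```python
import random

def latin_hypercube(bounds, n_samples, seed=0):
    """Draw n_samples points from the box `bounds` so that each
    dimension is split into n_samples equal strata with exactly
    one sample per stratum (Latin hypercube sampling)."""
    rng = random.Random(seed)
    dims = len(bounds)
    samples = [[0.0] * dims for _ in range(n_samples)]
    for d, (lo, hi) in enumerate(bounds):
        # one random point inside each of the n_samples strata
        points = [lo + (hi - lo) * (i + rng.random()) / n_samples
                  for i in range(n_samples)]
        rng.shuffle(points)  # decorrelate the dimensions
        for i in range(n_samples):
            samples[i][d] = points[i]
    return samples

def shrink_bounds(bounds, best, factor=0.5):
    """Halve each dimension's range around the best point so the
    next round samples a tighter region (clipped to the old box)."""
    new_bounds = []
    for (lo, hi), b in zip(bounds, best):
        half = (hi - lo) * factor / 2
        new_bounds.append((max(lo, b - half), min(hi, b + half)))
    return new_bounds

# Toy objective standing in for validation accuracy:
# maximized at (0.3, 0.7)
def objective(p):
    return -(p[0] - 0.3) ** 2 - (p[1] - 0.7) ** 2

bounds = [(0.0, 1.0), (0.0, 1.0)]
for rnd in range(3):                               # 3 rounds...
    trials = latin_hypercube(bounds, 5, seed=rnd)  # ...of 5 trials each
    best = max(trials, key=objective)
    bounds = shrink_bounds(bounds, best)
print(best)  # best point found after 3 rounds
```

Each round the box around the current best is halved, so after 3 rounds the search is confined to an eighth of the original range in every dimension, which is why so few trials are needed.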
aisaratuners is very convenient, fast to converge, and can be used by everyone. For now it only works with Keras, but we plan to roll it out for other libraries.
aisaratuners library can be used to tune numerical hyperparameters which might include:
- learning rate
- batch size
- dropout rate
- number of layers
- number of units and filters
To use aisaratuners the user needs to follow the steps below:
1- Install aisaratuners and import the needed classes
2- Define hyperparameters:
The user needs to define the hyperparameters along with their minimum and maximum values. For the learning rate, we recommend changing the hyperparameter type from the default to “log”.
3- Create keras model:
The user needs to create the model in the usual way of defining any Keras model, except that it should be wrapped inside a function which returns the model along with its training history, as shown below.
4- Instantiate HpOptimization class and run the optimizer:
The user needs to specify the optimization parameters: the number of rounds (solution-space reductions) and the number of trials per round. As a rule of thumb, 3 rounds of 5 trials each is usually sufficient. Yes, that’s right, only 15 trials in total!
The user can then run the optimization using run_opti().
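Putting the four steps together, a minimal end-to-end sketch might look like the following. HpOptimization and run_opti() are the names described above; the import path, the Hp helper, numrange(), and the exact argument order are assumptions based on the library's PyPI examples and may differ in your installed version (the training data x_train, y_train, x_val, y_val is assumed to be loaded already):

```python
# 1- Install and import (pip install aisaratuners)
from aisaratuners.aisara_keras_tuner import Hp, HpOptimization  # path assumed
from tensorflow import keras

# 2- Define hyperparameters with min/max bounds ("log" type for learning rate)
hps = Hp()
lr = hps.numrange(name='lr', min=1e-4, max=1e-1, type='log')
units = hps.numrange(name='units', min=32, max=256)

# 3- Wrap the Keras model in a function returning the model and its history
def my_model():
    model = keras.Sequential([
        keras.layers.Dense(units, activation='relu', input_shape=(784,)),
        keras.layers.Dense(10, activation='softmax'),
    ])
    model.compile(optimizer=keras.optimizers.Adam(learning_rate=lr),
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])
    history = model.fit(x_train, y_train,
                        validation_data=(x_val, y_val),
                        epochs=5, verbose=0)
    return model, history

# 4- Instantiate HpOptimization and run the optimizer
# (5 trials per round, 3 rounds -- argument order assumed)
opti = HpOptimization(hps, my_model, ['val_accuracy', 'val_loss'],
                      ['max', 'min'], 5, 3)
opti.run_opti()
```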
Using the plot_opti_results() function, the user can visualize a summary of the results. In the plot below, the y-axis shows the accuracy achieved in each round (x-axis). You can see that the accuracy converges quickly, with little to no wasted trials. With random search, significantly more trials would be required to reach the same confidence in the accuracy.
Now here is the kicker: for the first time ever, the user can see the whole hyperparameter solution space using the plot_search_space() function.
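Both visualizations are one-liners called on the HpOptimization instance (named opti here for illustration):

```python
opti.plot_opti_results()  # results summary: accuracy per round
opti.plot_search_space()  # the full hyperparameter solution space explored
```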
We invite you to try the aisaratuners library for fast and convenient hyperparameter tuning. We would also love to hear about your experience using aisaratuners; please comment below or email us at firstname.lastname@example.org
Note: aisaratuners is free for private use (this includes hackathons, Kaggle competitions, and teaching materials on paid or free education platforms). For commercial use, please register via RapidAPI.
aisaratuners on pypi: https://pypi.org/project/aisaratuners/
AiSara Hyperparameter Tuning API (for commercial use): https://rapidapi.com/aisara-technology-aisara-technology-default/api/aisara-hyperparameter-tuning
AiSara Solutions: https://www.aisara.ai/