Hyperparameter Optimization With Hyperopt — Intro & Implementation
1. Basics
2. Hyperopt Implementation
Conclusion
Thanks for Reading!

2.1. Support Vector Machines and Iris Data Set

In a previous post, I used Grid Search, Random Search, and Bayesian Optimization for hyperparameter optimization using the Iris data set provided by scikit-learn. The Iris data set includes petal and sepal measurements for three species of iris and is a commonly used data set for classification exercises. In this post, we’ll use the same data set, but our model will be a Support Vector Machine (SVM) with two hyperparameters that we can optimize (a short illustration of their effect follows the list):

  • C: Regularization parameter, which trades off misclassification of training examples against simplicity of the decision surface.
  • gamma: Kernel coefficient, which defines how much influence a single training example has. The larger gamma is, the closer other examples have to be to be affected.
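
To make the effect of these hyperparameters concrete, here is a minimal sketch (the specific C and gamma values are arbitrary choices for illustration, not recommendations) that compares cross-validation scores for two gamma settings on the Iris data:

# Minimal illustration: compare two arbitrary gamma values on Iris
from sklearn import datasets
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

X, y = datasets.load_iris(return_X_y=True)

for gamma in (0.01, 1.0):  # arbitrary values for illustration
    clf = SVC(C=1.0, gamma=gamma)
    score = cross_val_score(clf, X, y, cv=5).mean()
    print(f"gamma={gamma}: mean CV accuracy = {score:.3f}")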

Since the goal of this exercise is to walk through hyperparameter optimization, I won’t go deeper into how SVMs work, but if you’re interested, I find this scikit-learn post helpful.

We’ll generally follow the same steps that we used in the simple example earlier, but we’ll also visualize the process at the end:

1. Import mandatory libraries and packages
2. Define the objective function and the search space
3. Run the optimization process
4. Visualize the optimization

2.1.1. Step 1 — Import Libraries and Packages

Let’s import the libraries and packages and then load the data set.

# Import libraries and packages
from hyperopt import fmin, tpe, hp, Trials
from sklearn import datasets
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Load Iris dataset
iris = datasets.load_iris()
X = iris.data
y = iris.target

2.1.2. Step 2 — Define Objective Function and Search Space

Let’s first define the objective function, which will train an SVM and return the negative of the cross-validation score, which is what we want to minimize. Note that we minimize the negative of the cross-validation score to stay consistent with the general convention of “minimizing” the objective function (instead of “maximizing” the cross-validation score).

def objective_function(parameters):
    # Train an SVM with the given hyperparameters
    clf = SVC(**parameters)
    # Mean accuracy across 5 cross-validation folds
    score = cross_val_score(clf, X, y, cv=5).mean()
    # Negate so that fmin minimizes the objective
    return -score
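
As a quick sanity check, we can call the objective directly with hand-picked hyperparameters (arbitrary values, just for illustration); it should return the negative of the mean cross-validation accuracy, a value between -1 and 0:

# Arbitrary hyperparameters, just to verify the objective runs
print(objective_function({'C': 1.0, 'gamma': 0.1}))  # closer to -1 is better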

Next, we’ll define the search space, which consists of the values that our parameters C and gamma can take. Note that we’ll use Hyperopt’s hp.uniform(label, low, high), which returns a value sampled uniformly between “low” and “high” (source).

# Search space
search_space = {
    'C': hp.uniform('C', 0.1, 10),
    'gamma': hp.uniform('gamma', 0.01, 1)
}
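
Before running the optimization, we can sanity-check the search space by drawing a random configuration from it with Hyperopt’s hyperopt.pyll.stochastic.sample:

import hyperopt.pyll.stochastic

# Draw one random configuration from the search space
# (the values vary from draw to draw)
print(hyperopt.pyll.stochastic.sample(search_space))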

2.1.3. Step 3 — Run Optimization

As in the simple example earlier, we’ll use the TPE algorithm and store the results in a Trials object.

# Trials object to store the results
trials = Trials()

# Run optimization with the TPE algorithm
best = fmin(fn=objective_function, space=search_space,
            algo=tpe.suggest, trials=trials, max_evals=100)

Results:
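
The exact numbers vary from run to run, so rather than reproducing one output here, below is a short sketch of how to inspect it: fmin returns the best hyperparameters found, and the Trials object keeps per-trial details such as the losses.

# Best hyperparameters found by fmin, e.g. {'C': ..., 'gamma': ...}
print(best)

# Lowest loss found, i.e. the negative of the best mean CV accuracy
print(trials.best_trial['result']['loss'])

# Loss of every trial, handy for visualizing the optimization progress
print(trials.losses())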
