tfdf.tuner.RandomSearch
Tuner using random hyperparameter values.
Inherits From: Tuner
tfdf.tuner.RandomSearch(
num_trials: int = 100,
use_predefined_hps: bool = False,
trial_num_threads: int = 1,
trial_maximum_training_duration_seconds: Optional[float] = None
)
Used in the notebooks
Used in the tutorials: Automated hyper-parameter tuning (https://www.tensorflow.org/decision_forests/tutorials/automatic_tuning_colab)
The candidate hyper-parameters can be evaluated independently and in parallel.
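For orientation, here is a minimal sketch of attaching this tuner to a TF-DF model, following the pattern from the tutorial linked above. The file name "train.csv" and the "label" column are placeholders.

import pandas as pd
import tensorflow_decision_forests as tfdf

# Random search over 50 trials using the predefined hyper-parameter space,
# so no manual choice(...) calls are required.
tuner = tfdf.tuner.RandomSearch(num_trials=50, use_predefined_hps=True)

# Attach the tuner to the model; fitting the model runs the search.
model = tfdf.keras.GradientBoostedTreesModel(tuner=tuner)

# Placeholder dataset: any pandas DataFrame with a "label" column works here.
train_df = pd.read_csv("train.csv")
train_ds = tfdf.keras.pd_dataframe_to_tf_dataset(train_df, label="label")
model.fit(train_ds)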
Attributes

num_trials: Number of random hyperparameter values to evaluate.

use_predefined_hps: If true, automatically configures the space of hyper-parameters explored by the tuner. In this case, configuring the hyper-parameters manually (e.g. calling "choice(...)" on the tuner) is not necessary.

trial_num_threads: Number of threads used to train the models in each trial. This parameter is different from the num_threads parameter of the model constructor, which indicates how many threads to use for the overall training (and, possibly, the tuning). For example, with trial_num_threads=2 and num_threads=5, 5 models are trained in parallel during tuning, and each of those models is trained with 2 threads. Conversely, if you want to run at most 100 threads globally, make sure that trial_num_threads * num_threads does not exceed 100 (see the sketch after this list).

trial_maximum_training_duration_seconds: Maximum training duration of an individual trial, expressed in seconds. This parameter is different from the maximum_training_duration_seconds parameter of the model constructor, which defines the maximum training+tuning duration.
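As a sketch of the threading interaction described above (the thread counts are arbitrary; num_threads is the model-constructor parameter mentioned in the attribute description):

# At most 20 threads globally: 10 trials run in parallel (num_threads=10),
# and each trial is trained with 2 threads (trial_num_threads=2), 2 * 10 = 20.
tuner = tfdf.tuner.RandomSearch(
    num_trials=100,
    use_predefined_hps=True,
    trial_num_threads=2,
)
model = tfdf.keras.GradientBoostedTreesModel(tuner=tuner, num_threads=10)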
Methods
choice
View source: https://github.com/tensorflow/decision-forests/blob/main/tensorflow_decision_forests/component/tuner/tuner.py#L194-L211
choice(
key: str,
values: Union[List[int], List[float], List[str], List[bool]],
merge: bool = False
) -> SearchSpace
Adds a hyperparameter with a list of possible values.
Args

key: Name of the hyper-parameter.

values: List of possible values for the hyperparameter.

merge: If false (default), raises an error if the hyper-parameter already exists. If true, adds the values to the hyper-parameter if it already exists.

Returns

The conditional SearchSpace corresponding to the values in "values".
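A sketch of a manually configured search space using choice, loosely following the tuning tutorial. The hyper-parameter names and values are illustrative and depend on the base learner (a gradient boosted trees model is assumed here):

tuner = tfdf.tuner.RandomSearch(num_trials=20)
tuner.choice("min_examples", [2, 5, 7, 10])
tuner.choice("categorical_algorithm", ["CART", "RANDOM"])

# The returned SearchSpace is conditional: these "max_depth" values are only
# explored when growing_strategy is "LOCAL".
local_space = tuner.choice("growing_strategy", ["LOCAL"])
local_space.choice("max_depth", [3, 4, 5, 6, 8])

# merge=True appends values to an already-defined hyper-parameter instead of
# raising an error ("growing_strategy" was already defined above).
global_space = tuner.choice("growing_strategy", ["BEST_FIRST_GLOBAL"], merge=True)
global_space.choice("max_num_nodes", [16, 32, 64, 128])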
set_base_learner
View source: https://github.com/tensorflow/decision-forests/blob/main/tensorflow_decision_forests/component/tuner/tuner.py#L189-L192
set_base_learner(
learner: str
) -> None
Sets the base learner key.
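The base learner key is normally set automatically when the tuner is attached to a model. A direct call would look like the following; the learner key is shown only as an illustration:

# Hypothetical direct call; "GRADIENT_BOOSTED_TREES" is one of the YDF learner keys.
tuner.set_base_learner("GRADIENT_BOOSTED_TREES")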
train_config
View source: https://github.com/tensorflow/decision-forests/blob/main/tensorflow_decision_forests/component/tuner/tuner.py#L184-L187
train_config() -> TrainConfig
YDF training configuration for the Hyperparameter optimizer.