# Module: tf.keras.applications.nasnet
NASNet-A models for Keras.
NASNet refers to Neural Architecture Search Network, a family of models
that were designed automatically by learning the model architectures
directly on the dataset of interest.
Here we consider NASNet-A, the highest-performing model found for the
CIFAR-10 dataset, which was then extended to the ImageNet 2012 dataset,
obtaining state-of-the-art performance on both CIFAR-10 and ImageNet 2012.
Only the NASNet-A models suited for ImageNet 2012, and their respective
weights, are provided.
The table below describes the performance on ImageNet 2012:

| Architecture        | Top-1 Acc | Top-5 Acc | Multiply-Adds | Params (M) |
|---------------------|-----------|-----------|---------------|------------|
| NASNet-A (4 @ 1056) | 74.0 %    | 91.6 %    | 564 M         | 5.3        |
| NASNet-A (6 @ 4032) | 82.7 %    | 96.2 %    | 23.8 B        | 88.9       |
#### References:

- [Learning Transferable Architectures for Scalable Image Recognition](https://arxiv.org/abs/1707.07012) (CVPR 2018)
Functions
---------

[`NASNetLarge(...)`](../../../tf/keras/applications/NASNetLarge): Instantiates a NASNet model in ImageNet mode.

[`NASNetMobile(...)`](../../../tf/keras/applications/NASNetMobile): Instantiates a Mobile NASNet model in ImageNet mode.

[`decode_predictions(...)`](../../../tf/keras/applications/nasnet/decode_predictions): Decodes the prediction of an ImageNet model.

[`preprocess_input(...)`](../../../tf/keras/applications/nasnet/preprocess_input): Preprocesses a tensor or Numpy array encoding a batch of images.
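
For orientation, here is a minimal sketch of the usual classification pipeline combining these four functions. It assumes NASNetMobile's default 224x224 input size, and the image path `elephant.jpg` is a hypothetical placeholder:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.applications.nasnet import (
    NASNetMobile, preprocess_input, decode_predictions)

# NASNetMobile expects 224x224 RGB inputs; NASNetLarge expects 331x331.
model = NASNetMobile(weights="imagenet")

# "elephant.jpg" is a placeholder path; substitute any RGB image.
img = tf.keras.preprocessing.image.load_img(
    "elephant.jpg", target_size=(224, 224))
x = tf.keras.preprocessing.image.img_to_array(img)
x = np.expand_dims(x, axis=0)  # add the batch dimension
x = preprocess_input(x)        # scale pixel values for the network

preds = model.predict(x)
# decode_predictions returns (class_id, class_name, probability) tuples.
print(decode_predictions(preds, top=3)[0])
```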