Creates a MobileBert model spec for the question answering task. See also: tflite_model_maker.question_answer.BertQaSpec.

Args:
  uri: TF-Hub path/url to the Bert module.
  model_dir: The location of the model checkpoint files.
  seq_len: Length of the sequence to feed into the model.
  query_len: Length of the query to feed into the model.
  doc_stride: The stride used when applying a sliding window to split a long document into chunks.
  dropout_rate: The rate for dropout.
  initializer_range: The standard deviation of the truncated_normal_initializer for initializing all weight matrices.
  learning_rate: The initial learning rate for Adam.
  distribution_strategy: A string specifying which distribution strategy to use. Accepted values are 'off', 'one_device', 'mirrored', 'parameter_server', 'multi_worker_mirrored', and 'tpu' -- case insensitive. 'off' means not to use a distribution strategy; 'tpu' means to use TPUStrategy with the given tpu address.
  num_gpus: How many GPUs to use at each worker with the DistributionStrategies API. The default is -1, which means to utilize all available GPUs.
  tpu: TPU address to connect to.
  trainable: Boolean, whether the pretrained layers are trainable.
  predict_batch_size: Batch size for prediction.
  do_lower_case: Boolean, whether to lower-case the input text. Should be True for uncased models and False for cased models.
  is_tf2: Boolean, whether the hub module is in TensorFlow 2.x format.
  tflite_input_name: Dict, input names for the TFLite model.
  tflite_output_name: Dict, output names for the TFLite model.
  init_from_squad_model: Boolean, whether to initialize from a model already fine-tuned on SQuAD 1.1.
  default_batch_size: Default batch size for training.
  name: Name of the object.
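The interaction of seq_len, query_len, and doc_stride can be illustrated with a small sketch of the sliding-window chunking described above. The helper below is hypothetical (not part of the Model Maker API) and assumes the document and query are already tokenized; it only shows how doc_stride spaces the overlapping chunks while each chunk leaves room for the query and the special tokens.

```python
def chunk_document(doc_tokens, query_len, seq_len, doc_stride):
    """Split doc_tokens into overlapping chunks with a sliding window.

    Hypothetical helper illustrating doc_stride: each chunk holds at most
    seq_len - query_len - 3 document tokens (reserving room for the query
    plus the [CLS]/[SEP]/[SEP] special tokens), and consecutive chunk
    starts are doc_stride tokens apart.
    """
    max_doc_tokens = seq_len - query_len - 3  # [CLS] query [SEP] doc [SEP]
    chunks = []
    start = 0
    while start < len(doc_tokens):
        chunks.append(doc_tokens[start:start + max_doc_tokens])
        if start + max_doc_tokens >= len(doc_tokens):
            break  # this chunk already reaches the end of the document
        start += doc_stride
    return chunks

# A 20-token document with seq_len=15 and query_len=4 leaves 8 document
# tokens per chunk; with doc_stride=5 the chunk starts are 0, 5, 10, 15.
doc = list(range(20))
chunks = chunk_document(doc, query_len=4, seq_len=15, doc_stride=5)
```

Because consecutive chunks overlap by `max_doc_tokens - doc_stride` tokens, an answer span that would be cut at a chunk boundary still appears whole in a neighboring chunk.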