# smartwatch_gestures
- **Description**:

The **SmartWatch Gestures Dataset** was collected to evaluate several gesture
recognition algorithms for interacting with mobile applications using arm
gestures.

Eight users performed twenty repetitions of twenty different gestures, for a
total of 3,200 sequences. Each sequence contains acceleration data from the
3-axis accelerometer of a first-generation Sony SmartWatch™, together with
timestamps from the different clock sources available on an Android device. The
smartwatch was worn on the user's right wrist. Each user manually segmented
their own gestures by tapping the smartwatch screen at the beginning and at the
end of every repetition.
- **Homepage**:
  <https://tev.fbk.eu/resources/smartwatch>

- **Source code**:
  [`tfds.datasets.smartwatch_gestures.Builder`](https://github.com/tensorflow/datasets/tree/master/tensorflow_datasets/datasets/smartwatch_gestures/smartwatch_gestures_dataset_builder.py)

- **Versions**:

  - **`1.0.0`** (default): Initial release.

- **Download size**: `2.06 MiB`

- **Dataset size**: `2.64 MiB`

- **Auto-cached**
  ([documentation](https://www.tensorflow.org/datasets/performances#auto-caching)):
  Yes

- **Splits**:

| Split     | Examples |
|-----------|----------|
| `'train'` | 3,251    |
- **Feature structure**:

```python
FeaturesDict({
    'attempt': uint8,
    'features': Sequence({
        'accel_x': float64,
        'accel_y': float64,
        'accel_z': float64,
        'time_event': uint64,
        'time_millis': uint64,
        'time_nanos': uint64,
    }),
    'gesture': ClassLabel(shape=(), dtype=int64, num_classes=20),
    'participant': uint8,
})
```
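As a minimal sketch of working with this structure, the snippet below stacks the per-timestep accelerometer channels of one example into a `(T, 3)` array, the usual input shape for sequence models. The `example` dict here is synthetic stand-in data mirroring the feature structure above; in practice an example of the same shape would come from `tfds.load('smartwatch_gestures', split='train')`.

```python
import numpy as np

# Synthetic stand-in for one decoded example (same keys and dtypes as the
# FeaturesDict above); real examples come from tfds.load(...).
example = {
    "attempt": np.uint8(0),
    "gesture": np.int64(7),       # ClassLabel value in [0, 20)
    "participant": np.uint8(1),
    "features": {                 # Sequence -> one length-T array per key
        "accel_x": np.array([0.1, 0.2, 0.3]),
        "accel_y": np.array([9.7, 9.8, 9.6]),
        "accel_z": np.array([-0.1, 0.0, 0.1]),
        "time_event": np.array([100, 120, 140], dtype=np.uint64),
        "time_millis": np.array([5000, 5020, 5040], dtype=np.uint64),
        "time_nanos": np.array([0, 20_000_000, 40_000_000], dtype=np.uint64),
    },
}

# Stack the three accelerometer channels into a single (T, 3) array.
accel = np.stack(
    [example["features"][k] for k in ("accel_x", "accel_y", "accel_z")],
    axis=-1,
)
print(accel.shape)  # (3, 3): T timesteps x 3 axes
```

Because the sequences have variable length T, batching them for training typically requires padding or bucketing.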
- **Feature documentation**:

| Feature              | Class        | Shape | Dtype   | Description |
|----------------------|--------------|-------|---------|-------------|
|                      | FeaturesDict |       |         |             |
| attempt              | Tensor       |       | uint8   |             |
| features             | Sequence     |       |         |             |
| features/accel_x     | Tensor       |       | float64 |             |
| features/accel_y     | Tensor       |       | float64 |             |
| features/accel_z     | Tensor       |       | float64 |             |
| features/time_event  | Tensor       |       | uint64  |             |
| features/time_millis | Tensor       |       | uint64  |             |
| features/time_nanos  | Tensor       |       | uint64  |             |
| gesture              | ClassLabel   |       | int64   |             |
| participant          | Tensor       |       | uint8   |             |

- **Supervised keys** (See
  [`as_supervised` doc](https://www.tensorflow.org/datasets/api_docs/python/tfds/load#args)):
  `('features', 'gesture')`
- **Citation**:

```bibtex
@INPROCEEDINGS{6952946,
  author={Costante, Gabriele and Porzi, Lorenzo and Lanz, Oswald and Valigi, Paolo and Ricci, Elisa},
  booktitle={2014 22nd European Signal Processing Conference (EUSIPCO)},
  title={Personalizing a smartwatch-based gesture interface with transfer learning},
  year={2014},
  pages={2530-2534},
}
```
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2023-01-13 UTC.