smartwatch_gestures

  • Description:

The SmartWatch Gestures Dataset was collected to evaluate several gesture recognition algorithms for interacting with mobile applications through arm gestures.

Eight users performed twenty repetitions of twenty different gestures, for a total of 3,200 sequences. Each sequence contains acceleration data from the 3-axis accelerometer of a first-generation Sony SmartWatch™, as well as timestamps from the different clock sources available on an Android device. The smartwatch was worn on the user's right wrist. The gestures were manually segmented by the users themselves, who tapped the smartwatch screen at the beginning and at the end of every repetition.

  • Splits:

Split     Examples
'train'   3,251
  • Feature structure:
FeaturesDict({
    'attempt': uint8,
    'features': Sequence({
        'accel_x': float64,
        'accel_y': float64,
        'accel_z': float64,
        'time_event': uint64,
        'time_millis': uint64,
        'time_nanos': uint64,
    }),
    'gesture': ClassLabel(shape=(), dtype=int64, num_classes=20),
    'participant': uint8,
})
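
For reference, a minimal loading sketch (a hedged example, not part of the catalog entry): it assumes tensorflow_datasets is installed and accesses fields exactly as laid out in the structure above.

import tensorflow_datasets as tfds

# Load the single 'train' split as a tf.data.Dataset of feature dictionaries.
ds = tfds.load('smartwatch_gestures', split='train')

for example in ds.take(1):
    # 'features' is a Sequence, so each of its fields arrives as a
    # 1-D tensor with one entry per accelerometer sample.
    accel_x = example['features']['accel_x']   # float64, shape (num_samples,)
    gesture = example['gesture']               # int64 class id in [0, 20)
    participant = example['participant']       # uint8 user id
    print(int(gesture), int(participant), accel_x.shape)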
  • Feature documentation:
Feature               Class       Shape  Dtype    Description
FeaturesDict
attempt               Tensor             uint8
features              Sequence
features/accel_x      Tensor             float64
features/accel_y      Tensor             float64
features/accel_z      Tensor             float64
features/time_event   Tensor             uint64
features/time_millis  Tensor             uint64
features/time_nanos   Tensor             uint64
gesture               ClassLabel         int64
participant           Tensor             uint8
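
Since the three accelerometer axes are stored as separate per-sample fields, a common preprocessing step is to stack them into a single (num_samples, 3) matrix per gesture. The sketch below is illustrative (the function name, batch size, and mapping are assumptions, not part of the dataset):

import tensorflow as tf
import tensorflow_datasets as tfds

def to_accel_matrix(example):
    # Stack the per-sample axis readings into a (num_samples, 3) tensor
    # and pair it with the gesture label for supervised training.
    accel = tf.stack([example['features']['accel_x'],
                      example['features']['accel_y'],
                      example['features']['accel_z']], axis=-1)
    return accel, example['gesture']

ds = tfds.load('smartwatch_gestures', split='train').map(to_accel_matrix)
# Sequences vary in length, so batching requires padded_batch rather than batch.
ds = ds.padded_batch(32, padded_shapes=([None, 3], []))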
  • Citation:
@INPROCEEDINGS{6952946,
  author={Costante, Gabriele and Porzi, Lorenzo and Lanz, Oswald and Valigi, Paolo and Ricci, Elisa},
  booktitle={2014 22nd European Signal Processing Conference (EUSIPCO)},
  title={Personalizing a smartwatch-based gesture interface with transfer learning},
  year={2014},
  pages={2530-2534}
}