smartwatch_gestures

  • Description:

The SmartWatch Gestures Dataset was collected to evaluate several gesture recognition algorithms for interacting with mobile applications through arm gestures.

Eight users performed twenty repetitions of twenty different gestures, for a total of 3200 sequences. Each sequence contains acceleration data from the 3-axis accelerometer of a first-generation Sony SmartWatch™, as well as timestamps from the different clock sources available on an Android device. The smartwatch was worn on the user's right wrist. Each user manually segmented their own gestures by tapping the smartwatch screen at the beginning and end of every repetition.

Split     Examples
'train'   3,251
  • Feature structure:
FeaturesDict({
    'attempt': tf.uint8,
    'features': Sequence({
        'accel_x': tf.float64,
        'accel_y': tf.float64,
        'accel_z': tf.float64,
        'time_event': tf.uint64,
        'time_millis': tf.uint64,
        'time_nanos': tf.uint64,
    }),
    'gesture': ClassLabel(shape=(), dtype=tf.int64, num_classes=20),
    'participant': tf.uint8,
})
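
To make the record layout concrete, here is a minimal loading sketch using tensorflow_datasets; it assumes TFDS is installed and that the builder is registered under the name smartwatch_gestures shown above. The variable names are illustrative, not part of the dataset.

import numpy as np
import tensorflow_datasets as tfds

# Load the single 'train' split of variable-length gesture sequences.
ds = tfds.load('smartwatch_gestures', split='train')

for example in ds.take(1):
    gesture = int(example['gesture'])          # class id in [0, 20)
    participant = int(example['participant'])  # user id
    attempt = int(example['attempt'])          # repetition number

    # Every field under 'features' is a 1-D tensor with one entry per
    # accelerometer sample; stack the three axes into an (N, 3) array.
    feats = example['features']
    accel = np.stack([feats['accel_x'].numpy(),
                      feats['accel_y'].numpy(),
                      feats['accel_z'].numpy()], axis=-1)
    print(gesture, participant, attempt, accel.shape)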
  • Feature documentation:
Feature               Class         Shape   Dtype       Description
                      FeaturesDict
attempt               Tensor                tf.uint8
features              Sequence
features/accel_x      Tensor                tf.float64
features/accel_y      Tensor                tf.float64
features/accel_z      Tensor                tf.float64
features/time_event   Tensor                tf.uint64
features/time_millis  Tensor                tf.uint64
features/time_nanos   Tensor                tf.uint64
gesture               ClassLabel            tf.int64
participant           Tensor                tf.uint8
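
As an illustration of working with the timestamp fields, the helper below estimates a sequence's effective sampling rate; it assumes, as the field name suggests, that time_nanos holds monotonically increasing nanosecond timestamps, and the function name is ours rather than part of the dataset.

import numpy as np

def sampling_rate_hz(time_nanos):
    # Inter-sample intervals in seconds; the median is robust to jitter
    # in sensor event delivery. NOTE: assumes monotonic nanosecond
    # timestamps, per the field name.
    dt = np.diff(np.asarray(time_nanos, dtype=np.int64)) * 1e-9
    return 1.0 / float(np.median(dt))

Applied to the 'features/time_nanos' series of one example, this gives the accelerometer rate for that repetition.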
  • Citation:
@INPROCEEDINGS{6952946,
  author={Costante, Gabriele and Porzi, Lorenzo and Lanz, Oswald and Valigi, Paolo and Ricci, Elisa},
  booktitle={2014 22nd European Signal Processing Conference (EUSIPCO)},
  title={Personalizing a smartwatch-based gesture interface with transfer learning},
  year={2014},
  pages={2530-2534}}