# jaco_play
- **Description**:

Jaco 2 pick-and-place on a table top.

- **Homepage**: <https://github.com/clvrai/clvr_jaco_play_dataset>

- **Source code**: [`tfds.robotics.rtx.JacoPlay`](https://github.com/tensorflow/datasets/tree/master/tensorflow_datasets/robotics/rtx/rtx.py)

- **Versions**:
    - **`0.1.0`** (default): Initial release.

- **Download size**: `Unknown size`

- **Dataset size**: `9.24 GiB`

- **Auto-cached** ([documentation](https://www.tensorflow.org/datasets/performances#auto-caching)): No

- **Splits**:
| Split     | Examples |
|-----------|----------|
| `'test'`  | 109      |
| `'train'` | 976      |
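Loading a split follows the usual TFDS pattern. A minimal sketch, assuming `tensorflow_datasets` is installed (the first call downloads the full ~9.24 GiB dataset); the split names and counts come from the table above:

```python
def summarize_splits(counts):
    """Return (total, per-split fractions) for a {split: num_examples} map."""
    total = sum(counts.values())
    return total, {name: n / total for name, n in counts.items()}


def load_jaco_play(split='train'):
    """Load one split of jaco_play via TFDS (triggers download on first use)."""
    import tensorflow_datasets as tfds  # deferred: heavy optional dependency
    return tfds.load('jaco_play', split=split)


if __name__ == '__main__':
    total, fracs = summarize_splits({'train': 976, 'test': 109})
    print(total)  # 1085
```

Standard TFDS split slicing (e.g. `split='train[:10%]'`) also applies here.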
- **Feature structure**:

```python
FeaturesDict({
    'steps': Dataset({
        'action': FeaturesDict({
            'gripper_closedness_action': Tensor(shape=(1,), dtype=float32),
            'terminate_episode': Tensor(shape=(3,), dtype=int32),
            'world_vector': Tensor(shape=(3,), dtype=float32),
        }),
        'is_first': bool,
        'is_last': bool,
        'is_terminal': bool,
        'observation': FeaturesDict({
            'end_effector_cartesian_pos': Tensor(shape=(7,), dtype=float32),
            'end_effector_cartesian_velocity': Tensor(shape=(6,), dtype=float32),
            'image': Image(shape=(224, 224, 3), dtype=uint8),
            'image_wrist': Image(shape=(224, 224, 3), dtype=uint8),
            'joint_pos': Tensor(shape=(8,), dtype=float32),
            'natural_language_embedding': Tensor(shape=(512,), dtype=float32),
            'natural_language_instruction': string,
        }),
        'reward': Scalar(shape=(), dtype=float32),
    }),
})
```
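Each element of `steps` is a nested dict matching the structure above. The `steps/action/...` paths used in the feature-documentation table can be produced with a small generic flattener; a pure-Python sketch, where the mock dict stands in for a real TFDS step (real values are tensors):

```python
def flatten_features(nested, prefix=''):
    """Flatten a nested feature dict into 'steps/action/...'-style keys."""
    flat = {}
    for key, value in nested.items():
        path = f'{prefix}/{key}' if prefix else key
        if isinstance(value, dict):
            flat.update(flatten_features(value, path))
        else:
            flat[path] = value
    return flat


# Mock step mirroring one element of 'steps'; shapes follow the structure above.
step = {
    'action': {
        'gripper_closedness_action': [0.0],
        'terminate_episode': [0, 1, 0],
        'world_vector': [0.01, -0.02, 0.03],
    },
    'is_first': True,
    'observation': {'natural_language_instruction': 'pick up the cup'},
    'reward': 0.0,
}
print(sorted(flatten_features(step)))
```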
- **Feature documentation**:

| Feature | Class | Shape | Dtype | Description |
|---------------------------------------------------|--------------|---------------|---------|-------------|
| | FeaturesDict | | | |
| steps | Dataset | | | |
| steps/action | FeaturesDict | | | |
| steps/action/gripper_closedness_action | Tensor | (1,) | float32 | |
| steps/action/terminate_episode | Tensor | (3,) | int32 | |
| steps/action/world_vector | Tensor | (3,) | float32 | |
| steps/is_first | Tensor | | bool | |
| steps/is_last | Tensor | | bool | |
| steps/is_terminal | Tensor | | bool | |
| steps/observation | FeaturesDict | | | |
| steps/observation/end_effector_cartesian_pos | Tensor | (7,) | float32 | |
| steps/observation/end_effector_cartesian_velocity | Tensor | (6,) | float32 | |
| steps/observation/image | Image | (224, 224, 3) | uint8 | |
| steps/observation/image_wrist | Image | (224, 224, 3) | uint8 | |
| steps/observation/joint_pos | Tensor | (8,) | float32 | |
| steps/observation/natural_language_embedding | Tensor | (512,) | float32 | |
| steps/observation/natural_language_instruction | Tensor | | string | |
| steps/reward | Scalar | | float32 | |

- **Supervised keys** (See [`as_supervised` doc](https://www.tensorflow.org/datasets/api_docs/python/tfds/load#args)): `None`

- **Figure** ([tfds.show_examples](https://www.tensorflow.org/datasets/api_docs/python/tfds/visualization/show_examples)): Not supported.
- **Citation**:

```bibtex
@software{dass2023jacoplay,
  author = {Dass, Shivin and Yapeter, Jullian and Zhang, Jesse and Zhang, Jiahui
            and Pertsch, Karl and Nikolaidis, Stefanos and Lim, Joseph J.},
  title = {CLVR Jaco Play Dataset},
  url = {https://github.com/clvrai/clvr_jaco_play_dataset},
  version = {1.0.0},
  year = {2023}
}
```
Last updated 2023-12-19 UTC.