# berkeley_autolab_ur5
- **Description**:

UR5 performing cloth manipulation, pick-and-place, and other tasks.

- **Homepage**: <https://sites.google.com/view/berkeley-ur5/home>

- **Source code**: [`tfds.robotics.rtx.BerkeleyAutolabUr5`](https://github.com/tensorflow/datasets/tree/master/tensorflow_datasets/robotics/rtx/rtx.py)

- **Versions**:
  - **`0.1.0`** (default): Initial release.

- **Download size**: `Unknown size`

- **Dataset size**: `76.39 GiB`

- **Auto-cached** ([documentation](https://www.tensorflow.org/datasets/performances#auto-caching)): No
- **Splits**:

| Split     | Examples |
|-----------|----------|
| `'test'`  | 104      |
| `'train'` | 896      |
- **Feature structure**:

```python
FeaturesDict({
    'steps': Dataset({
        'action': FeaturesDict({
            'gripper_closedness_action': float32,
            'rotation_delta': Tensor(shape=(3,), dtype=float32, description=Delta change in roll, pitch, yaw.),
            'terminate_episode': float32,
            'world_vector': Tensor(shape=(3,), dtype=float32, description=Delta change in XYZ.),
        }),
        'is_first': bool,
        'is_last': bool,
        'is_terminal': bool,
        'observation': FeaturesDict({
            'hand_image': Image(shape=(480, 640, 3), dtype=uint8),
            'image': Image(shape=(480, 640, 3), dtype=uint8),
            'image_with_depth': Image(shape=(480, 640, 1), dtype=float32),
            'natural_language_embedding': Tensor(shape=(512,), dtype=float32),
            'natural_language_instruction': string,
            'robot_state': Tensor(shape=(15,), dtype=float32, description=Explanation of the robot state can be found at https://sites.google.com/corp/view/berkeley-ur5),
        }),
        'reward': Scalar(shape=(), dtype=float32),
    }),
})
```
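To make the nested structure concrete, the schema above can be mirrored with plain NumPy arrays. The sketch below builds one synthetic step with the published shapes and dtypes — all values (including the instruction string) are dummy data for illustration, not contents of the dataset:

```python
import numpy as np

# One synthetic step mirroring the feature structure above
# (dummy values, not real dataset contents).
step = {
    "action": {
        "gripper_closedness_action": np.float32(0.0),     # 1 close, -1 open, 0 no change
        "rotation_delta": np.zeros(3, dtype=np.float32),  # delta roll, pitch, yaw
        "terminate_episode": np.float32(0.0),
        "world_vector": np.zeros(3, dtype=np.float32),    # delta XYZ
    },
    "is_first": True,
    "is_last": False,
    "is_terminal": False,
    "observation": {
        "hand_image": np.zeros((480, 640, 3), dtype=np.uint8),
        "image": np.zeros((480, 640, 3), dtype=np.uint8),
        "image_with_depth": np.zeros((480, 640, 1), dtype=np.float32),
        "natural_language_embedding": np.zeros(512, dtype=np.float32),
        "natural_language_instruction": "pick up the cloth",  # hypothetical example
        "robot_state": np.zeros(15, dtype=np.float32),
    },
    "reward": np.float32(0.0),
}

# Sanity-check shapes against the published schema.
assert step["observation"]["image"].shape == (480, 640, 3)
assert step["action"]["world_vector"].shape == (3,)
```

Real steps come back from TFDS as tensors with the same nesting, so shape checks like these transfer directly.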
- **Feature documentation**:

| Feature                                        | Class        | Shape         | Dtype   | Description                                                                                      |
|------------------------------------------------|--------------|---------------|---------|--------------------------------------------------------------------------------------------------|
|                                                | FeaturesDict |               |         |                                                                                                  |
| steps                                          | Dataset      |               |         |                                                                                                  |
| steps/action                                   | FeaturesDict |               |         |                                                                                                  |
| steps/action/gripper_closedness_action         | Tensor       |               | float32 | 1 if close gripper, -1 if open gripper, 0 if no change.                                          |
| steps/action/rotation_delta                    | Tensor       | (3,)          | float32 | Delta change in roll, pitch, yaw.                                                                |
| steps/action/terminate_episode                 | Tensor       |               | float32 |                                                                                                  |
| steps/action/world_vector                      | Tensor       | (3,)          | float32 | Delta change in XYZ.                                                                             |
| steps/is_first                                 | Tensor       |               | bool    |                                                                                                  |
| steps/is_last                                  | Tensor       |               | bool    |                                                                                                  |
| steps/is_terminal                              | Tensor       |               | bool    |                                                                                                  |
| steps/observation                              | FeaturesDict |               |         |                                                                                                  |
| steps/observation/hand_image                   | Image        | (480, 640, 3) | uint8   |                                                                                                  |
| steps/observation/image                        | Image        | (480, 640, 3) | uint8   |                                                                                                  |
| steps/observation/image_with_depth             | Image        | (480, 640, 1) | float32 |                                                                                                  |
| steps/observation/natural_language_embedding   | Tensor       | (512,)        | float32 |                                                                                                  |
| steps/observation/natural_language_instruction | Tensor       |               | string  |                                                                                                  |
| steps/observation/robot_state                  | Tensor       | (15,)         | float32 | Explanation of the robot state can be found at <https://sites.google.com/corp/view/berkeley-ur5> |
| steps/reward                                   | Scalar       |               | float32 |                                                                                                  |

- **Supervised keys** (See [`as_supervised` doc](https://www.tensorflow.org/datasets/api_docs/python/tfds/load#args)): `None`

- **Figure** ([tfds.show_examples](https://www.tensorflow.org/datasets/api_docs/python/tfds/visualization/show_examples)): Not supported.

- **Examples** ([tfds.as_dataframe](https://www.tensorflow.org/datasets/api_docs/python/tfds/as_dataframe)): Display examples...
- **Citation**:

```
@misc{BerkeleyUR5Website,
  title = {Berkeley {UR5} Demonstration Dataset},
  author = {Lawrence Yunliang Chen and Simeon Adebola and Ken Goldberg},
  howpublished = {https://sites.google.com/view/berkeley-ur5/home},
}
```
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2024-09-03 UTC.