# Foldsformer

Code for the paper "Foldsformer: Learning Sequential Multi-Step Cloth Manipulation With Space-Time Attention" (RA-L).
Foldsformer: Learning Sequential Multi-Step Cloth Manipulation with Space-Time Attention
Kai Mo, Chongkun Xia, Xueqian Wang, Yuhong Deng, Xuehai Gao, Bin Liang
Tsinghua University
This repository is a PyTorch implementation of the paper "Foldsformer: Learning Sequential Multi-Step Cloth Manipulation with Space-Time Attention", published in IEEE RA-L.
Website | IEEE Manuscript | ArXiv
If you find this code useful in your research, please consider citing:
```
@ARTICLE{mo2022foldsformer,
  author={Mo, Kai and Xia, Chongkun and Wang, Xueqian and Deng, Yuhong and Gao, Xuehai and Liang, Bin},
  journal={IEEE Robotics and Automation Letters},
  title={Foldsformer: Learning Sequential Multi-Step Cloth Manipulation With Space-Time Attention},
  year={2023},
  volume={8},
  number={2},
  pages={760-767},
  doi={10.1109/LRA.2022.3229573}
}
```
## Table of Contents

- Installation
- Generate Data
- Train Foldsformer
- Evaluate Foldsformer
## Installation

This simulation environment is based on SoftGym. You can follow the instructions in SoftGym to set up the simulator.

1. Clone this repository.

2. Follow the SoftGym instructions to create a conda environment and install PyFlex. A nice blog written by Daniel Seita may help you get started with SoftGym.

3. Install the following packages in the created conda environment:

   - pytorch and torchvision: `pip install torchvision` or `conda install torchvision -c pytorch`
   - einops: `pip install einops`
   - tqdm: `pip install tqdm`
   - yaml: `pip install PyYaml`

4. Before you use the code, make sure the conda environment is activated (`conda activate softgym`) and the paths are set up appropriately:

   ```shell
   export PYFLEXROOT=${PWD}/PyFlex
   export PYTHONPATH=${PYFLEXROOT}/bindings/build:$PYTHONPATH
   export LD_LIBRARY_PATH=${PYFLEXROOT}/external/SDL2-2.0.4/lib/x64:$LD_LIBRARY_PATH
   ```

   The provided script `prepare_1.0.sh` includes the commands above.
## Generate Data
1. Generate initial configurations:

   ```shell
   python generate_configs.py --num_cached 1000 --cloth_type random
   python generate_configs.py --num_cached 100 --cloth_type square
   python generate_configs.py --num_cached 100 --cloth_type rectangle
   ```

   where `--num_cached` specifies the number of configurations to generate and `--cloth_type` specifies the cloth type (square | rectangle | random). The generated initial configurations are saved in `cached configs/`.
2. Generate trajectories by random actions:

   ```shell
   python generate_random.py --gui --corner_bias --img_size 224 --cached random1000 --horizon 8
   ```

   where `--img_size` specifies the size of the images captured by the camera in the simulator, `--cached` specifies the filename of the cached configurations, and `--horizon` specifies the number of actions in a trajectory. You can remove `--gui` to run headless and remove `--corner_bias` to pick the cloth uniformly instead of picking the corners. The generated trajectories are saved in `data/random/corner bias` and `data/random/random`.
3. Generate expert demonstrations:

   ```shell
   python generate_demonstrations.py --gui --task DoubleTriangle --img_size 224 --cached square100
   python generate_demonstrations.py --gui --task DoubleStraight --img_size 224 --cached rectangle100
   python generate_demonstrations.py --gui --task AllCornersInward --img_size 224 --cached square100
   python generate_demonstrations.py --gui --task CornersEdgesInward --img_size 224 --cached square100
   ```

   where `--task` specifies the task name, `--img_size` specifies the size of the images captured by the camera in the simulator, and `--cached` specifies the filename of the cached configurations. You can remove `--gui` to run headless. The generated demonstrations are saved in `data/demonstrations`. `Demonstrator/demonstrator.py` implements the scripted demonstrator, which accesses the ground-truth position of each particle.
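Since the scripted demonstrator works from ground-truth particle positions, one fold step amounts to picking one particle and placing it at another particle's location. Below is a minimal sketch of that idea; the function name, the y-up lift convention, and the release height are assumptions for illustration, not the repo's actual code:

```python
import numpy as np

def fold_step(particle_pos, pick_idx, place_idx, lift_height=0.05):
    """One pick-and-place fold: pick the particle at pick_idx and place it
    at the x/z location of the particle at place_idx, released from a small
    height. Assumes a y-up coordinate convention (hypothetical sketch)."""
    pick = particle_pos[pick_idx].copy()
    place = particle_pos[place_idx].copy()
    place[1] = lift_height  # release slightly above the target point
    return pick, place
```

For a DoubleTriangle-style fold, for example, `pick_idx` would be one cloth corner and `place_idx` the diagonally opposite corner.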
## Train Foldsformer
1. Preprocess the data (split each long trajectory into sub-trajectories):

   ```shell
   python utils/prepare_data_list.py
   ```

2. Set up the model, optimizer, and other details in `train/train configs/train.yaml`.

3. Train Foldsformer:

   ```shell
   python train.py --config_path train
   ```

   where `--config_path` specifies the `yaml` configuration filename in `train/train configs/`.
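The preprocessing step above splits each long trajectory into sub-trajectories. A rough sketch of one way to do that with a sliding window follows; the window length and the fully overlapping stride are assumptions, and `utils/prepare_data_list.py` may split differently:

```python
def split_into_subtrajectories(frames, sub_len=4):
    """Slide a fixed-length window over a trajectory, returning every
    contiguous sub-trajectory of sub_len frames (hypothetical sketch)."""
    return [frames[i:i + sub_len] for i in range(len(frames) - sub_len + 1)]
```

For a horizon-8 trajectory and `sub_len=4`, this yields five overlapping training sequences.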
## Evaluate Foldsformer
1. Download the evaluation set and model weights:

   - Download the evaluation initial configurations, and then put them in `cached configs/`.
   - Download the Foldsformer weights, and then put them in `train/trained model/Foldsformer/`.
   - Download the demonstration sub-goals, and then extract them in `data/demo/`.

2. Evaluate Foldsformer by running:

   ```shell
   python eval.py --gui --task DoubleTriangle --model_config train --model_file foldsformer_eval --cached square
   python eval.py --gui --task DoubleStraight --model_config train --model_file foldsformer_eval --cached rectangle
   python eval.py --gui --task AllCornersInward --model_config train --model_file foldsformer_eval --cached square
   python eval.py --gui --task CornersEdgesInward --model_config train --model_file foldsformer_eval --cached square
   ```

   The evaluation results are saved in `eval result/`.
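Cloth-folding rollouts are commonly scored by the mean particle-position error against the goal configuration. The sketch below illustrates that metric under this assumption; the repo's exact evaluation metric may differ:

```python
import numpy as np

def mean_particle_distance(achieved, goal):
    """Mean Euclidean distance between achieved and goal particle
    positions, each an (N, 3) array (hypothetical metric sketch)."""
    achieved = np.asarray(achieved, dtype=float)
    goal = np.asarray(goal, dtype=float)
    return float(np.linalg.norm(achieved - goal, axis=1).mean())
```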
If you have any questions, please feel free to contact me via mok21@tsinghua.org.cn. (~~mok21@mails.tsinghua.edu.cn~~)