DRL
[PR2026] Drone Referring Localization: An Efficient Heterogeneous Spatial Feature Interaction Method For UAV Self-Localization
This repository contains code and dataset for the paper titled Drone Referring Localization: An Efficient Heterogeneous Spatial Feature Interaction Method For UAV Self-Localization.




News
2024/8/28: Our dataset and code are released.
Table of contents
- News
- Table of contents
- About Dataset
- Prerequisites
- Installation
- Dataset & Preparation
- Train & Evaluation
- Supported Methods
- License
- Citation
- Related Work
About Dataset
The dataset split is as follows:

| Subset | UAV-view | Satellite-view | Universities |
| ------ | -------- | -------------- | ------------ |
| Train  | 6,768    | 6,768          | 10           |
| Test   | 2,331    | 27,972         | 4            |
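The test-set satellite count follows from the directory layout: each test UAV query is paired with 12 candidate satellite crops (0.jpg through 11.jpg). A quick sanity check (a sketch, not part of the released code):

```python
# Each test UAV query has 12 candidate satellite crops (0.jpg .. 11.jpg),
# so the satellite-view total is the query count times the candidates per query.
uav_queries = 2331
candidates_per_query = 12
satellite_views = uav_queries * candidates_per_query
print(satellite_views)  # 27972, matching the table above
```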
More detailed file structure:
├── UL14/
│   ├── train/
│   │   ├── PlaceName_Height_Index/
│   │   │   ├── UAV/
│   │   │   │   └── 0.JPG
│   │   │   ├── Satellite/
│   │   │   │   └── 0.tif
│   │   ├── ...
│   ├── val/
│   │   ├── PlaceName_Height_Index/
│   │   │   ├── UAV/
│   │   │   │   └── 0.JPG
│   │   │   ├── Satellite/
│   │   │   │   ├── 0.jpg
│   │   │   │   ├── 1.jpg
│   │   │   │   ├── 2.jpg
│   │   │   │   ├── ...
│   │   │   │   └── 11.jpg
│   │   │   ├── GPS_info.json   /* UAV position in the satellite images
│   │   │   └── label.json      /* Supplementary information such as latitude, longitude, and map size
│   ├── test/   /* Same structure as val
Prerequisites
- Python 3.7+
- GPU memory >= 8 GB
- Numpy 1.26.0
- Pytorch 2.0.0+cu118
- Torchvision 0.15.0+cu118
Installation
We recommend CUDA 11.8 and PyTorch 2.0.0. You can download the matching wheels from the PyTorch website and install them with pip. Then run the following command to install the remaining dependencies.
pip install -r requirments.txt
Create the directory for saving training logs and checkpoints.
mkdir checkpoints
Dataset & Preparation
UL14 has been released at this link.
Additionally, download the pretrained weights of CvT-13 from this link.
Important: update pretrain_path, train_dir, val_dir, and test_dir in the config file.
Train & Evaluation
Training and Testing
Run the following command to perform the full training and testing pipeline.
bash train_test_local.sh
For the parameter settings in train_test_local.sh, refer to Get Started.
Evaluation
Run the following commands to evaluate MA@K and RDS.
cd checkpoints/<name>
python test_meter.py --config <name>
Here <name> is the directory name from your training setting; you can find it under checkpoints/.
We also provide the baseline checkpoint at this link.
unzip <file.zip> -d checkpoints
cd checkpoints/baseline
python test.py --test_dir <dataset_root>/test
python test_meter.py --config <name>
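The MA@K idea (metre-level accuracy at threshold K) can be sketched as the fraction of queries whose predicted position falls within K metres of the ground truth. This is an illustrative implementation, not the repository's test_meter.py, which may differ in details such as coordinate conversion; the function name and argument layout are assumptions.

```python
import math

def ma_at_k(pred_xy, gt_xy, k):
    """Fraction of predictions whose Euclidean error is within k metres.

    pred_xy, gt_xy: sequences of (x, y) positions in metres.
    """
    hits = sum(
        1 for (px, py), (gx, gy) in zip(pred_xy, gt_xy)
        if math.hypot(px - gx, py - gy) <= k
    )
    return hits / len(pred_xy)
```

For example, with predictions [(0, 0), (3, 4)] against ground truth [(0, 0), (0, 0)], the errors are 0 m and 5 m, so MA@5 is 1.0 while MA@4 is 0.5.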
License
This project is licensed under the Apache 2.0 license.
Citation
The following paper introduces the baseline model and reports its results. Please cite it if you use this work.
@misc{drl,
title={Drone Referring Localization: An Efficient Heterogeneous Spatial Feature Interaction Method For UAV Self-Localization},
author={Ming Dai and Enhui Zheng and Zhenhua Feng and Jiahao Chen and Wankou Yang},
year={2024},
eprint={2208.06561},
archivePrefix={arXiv},
primaryClass={cs.CV},
url={https://arxiv.org/abs/2208.06561},
}
Related Work
