TorrentP
A wrapper Python library for downloading from torrents.
Download from a torrent with a .torrent file or a magnet link, in just three lines of Python code.
Installation
```shell
$ pip install torrentp
```

It can also be found on PyPI.
How can I use it?
- Install the package with the pip package manager.
- After installing, import the library and call it.
- Pass a magnet link or a .torrent file, plus a path for saving the file. Use `.` (dot) to save in the current directory.
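Since the first argument accepts either form, a caller may want to tell the two apart before constructing the downloader. A minimal sketch (the `is_magnet` helper is hypothetical, not part of torrentp):

```python
def is_magnet(link: str) -> bool:
    # Magnet URIs start with the "magnet:" scheme;
    # anything else is treated as a path to a .torrent file.
    return link.lower().startswith("magnet:")

print(is_magnet("magnet:?xt=urn:btih:abc"))  # True
print(is_magnet("test.torrent"))             # False
```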
Download with magnet link:
```python
import asyncio
from torrentp import TorrentDownloader

torrent_file = TorrentDownloader("magnet:...", '.')

# Start the download process
asyncio.run(torrent_file.start_download())  # start_download() is an asynchronous method

# Pausing the download
torrent_file.pause_download()

# Resuming the download
torrent_file.resume_download()

# Stopping the download
torrent_file.stop_download()
```
Or download with .torrent file:
```python
import asyncio
from torrentp import TorrentDownloader

torrent_file = TorrentDownloader("test.torrent", '.')

# Start the download process
asyncio.run(torrent_file.start_download())  # start_download() is an asynchronous method

# Pausing the download
torrent_file.pause_download()

# Resuming the download
torrent_file.resume_download()

# Stopping the download
torrent_file.stop_download()
```
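Because `start_download()` is a coroutine while the pause/resume/stop methods are plain synchronous calls, one way to exercise both is to run the download as an asyncio task and issue control calls while it is in flight. A sketch of that pattern with a stand-in coroutine (torrentp itself is not imported here):

```python
import asyncio

async def fake_download():
    # Stand-in for torrent_file.start_download(); sleeps instead of downloading.
    await asyncio.sleep(0.1)
    return "done"

async def main():
    task = asyncio.create_task(fake_download())
    # While the task runs, the synchronous control methods
    # (pause_download / resume_download / stop_download) could be called here.
    return await task

print(asyncio.run(main()))  # done
```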
How can I use a custom port?
```python
torrent_file = TorrentDownloader("magnet/torrent.file", '.', port=0000)
```
How can I limit the upload or download speed?
Download using 0 (the default value), which means unlimited speed:

```python
await torrent_file.start_download(download_speed=0, upload_speed=0)
```
Or download at a specific rate (kB/s):

```python
await torrent_file.start_download(download_speed=2, upload_speed=1)
```
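The limits above are given in kB/s, while BitTorrent backends such as libtorrent typically take rate limits in bytes per second, with 0 meaning unlimited. A hypothetical conversion helper (not part of the torrentp API) to illustrate that mapping:

```python
def kb_per_s_to_bytes(limit_kb: int) -> int:
    # 0 is a sentinel for "unlimited" and must pass through unchanged.
    if limit_kb == 0:
        return 0
    return limit_kb * 1024

print(kb_per_s_to_bytes(0))  # 0 (unlimited)
print(kb_per_s_to_bytes(2))  # 2048
```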
Using Command Line Interface (CLI)
Download with a magnet link:

```shell
$ torrentp --link 'magnet:...'
```

Or download with a .torrent file:

```shell
$ torrentp --link 'test.torrent'
```
You can also use the --help parameter to display all the parameters that you can use.
| args | help | type |
| ------ | ------ | ------ |
| --link | Torrent link. Example: [--link 'file.torrent'] or [--link 'magnet:...'] [required] | str |
| --download_speed | Download speed with a specific number (kB/s). Default: 0, means unlimited speed | int |
| --upload_speed | Upload speed with a specific number (kB/s). Default: 0, means unlimited speed | int |
| --save_path | Path to save the file, default: '.' | str |
| --stop_after_download | Stop the download immediately after completion without seeding | flag |
| --help | Show this message and exit | |
Example with all commands:
```shell
$ torrentp --link 'magnet:...' --download_speed 100 --upload_speed 50 --save_path '.' --stop_after_download
```
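The flags in the table map naturally onto Python's standard `argparse` module. A sketch of how such a parser could be declared (this mirrors the documented options; it is not torrentp's actual CLI code):

```python
import argparse

parser = argparse.ArgumentParser(prog="torrentp")
parser.add_argument("--link", required=True, help="Torrent link or .torrent file")
parser.add_argument("--download_speed", type=int, default=0, help="kB/s, 0 = unlimited")
parser.add_argument("--upload_speed", type=int, default=0, help="kB/s, 0 = unlimited")
parser.add_argument("--save_path", default=".", help="Path to save the file")
parser.add_argument("--stop_after_download", action="store_true",
                    help="Stop immediately after completion without seeding")

args = parser.parse_args(["--link", "test.torrent", "--download_speed", "100"])
print(args.download_speed)  # 100
print(args.save_path)       # .
```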
To do list
- [x] Limit upload and download speed
- [x] User can change the port
- [x] CLI
- [x] Pause / Resume / Stop
Star History
Issues
Feel free to submit issues and enhancement requests or contact me via vida.page/nima.
Contributing
Please refer to each project's style and contribution guidelines for submitting patches and additions. In general, we follow the "fork-and-pull" Git workflow.
- Fork the repo on GitHub
- Clone the project to your own machine
- Update the version inside `__init__.py`
- Commit changes to your own branch
- Push your work back up to your fork
- Submit a Pull request so that we can review your changes