Auto Scraping
Information retrieved by scraping is only cosmetic in rTorrent (since it follows a different philosophy than most other torrent clients, scraping won't affect its operation), but it is good to see these values changing if you use any UI (e.g. a webUI) or rTorrent-PS with it.
By default, rTorrent only sends scrape requests to trackers when a torrent is added for the first time or when the client is restarted. Moreover:
- the builtin t.scrape_time_last property is lost upon restart (not saved in session)
- multi-scraping isn't implemented
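If all you need is the bare mechanism, a single schedule that blindly re-scrapes every item on a fixed interval is already enough; here is a minimal sketch using only commands that appear in the config further down (the name simple_scrape and the one-hour interval are arbitrary choices, not part of this page's setup):
# Crude variant for illustration only: re-scrape every item once an hour, regardless of its state.
# This ignores torrent activity and sends all requests in one burst, which is exactly what the config below avoids.
schedule2 = simple_scrape, 3600, 3600, "d.multicall2=default,\"d.tracker.send_scrape=0\""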
The goal is to regularly update scrape information for all torrents, even stopped ones, while balancing the requests so they don't all fire at the same time:
- use a custom field tm_last_scrape to store the last scrape time per torrent, so that it can be saved into the session (the mechanism is sketched after this list)
- send scrape requests regularly: check for updates every 5 minutes and distinguish between the following groups:
  - transferring (uploading/downloading) torrents: update every 10 minutes
  - non-transferring torrents: update every 12 hours
  - stopped torrents: update every 24 hours
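The building block for the first point is rTorrent's custom-field store: d.custom.set writes a named value on an item, d.save_resume persists it into the session file, and d.custom reads it back (e.g. to compare against an interval with elapsed.greater). A minimal sketch of just that mechanism, using placeholder names (tm_demo, d.demo_stamp.*) that are not part of the config below:
# Illustration only (placeholder names): stamp the current time into a custom field and save it to the session...
method.insert = d.demo_stamp.set, simple, "d.custom.set=tm_demo,$cat=$system.time=; d.save_resume="
# ...and read it back later, e.g. for comparison against an interval.
method.insert = d.demo_stamp.get, simple, "d.custom=tm_demo"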
All you have to do is copy and paste the following config into your rTorrent configuration. That's it.
### Tracker Scraping ###
# Set scrape intervals for active (~10 mins), idle items (~12 hours) and stopped items (~24 hours)
method.insert = cfg.scrape_interval.active, value, 580 # 600-20
method.insert = cfg.scrape_interval.idle, value, 43100 # 43200 for 12 hours
method.insert = cfg.scrape_interval.stopped, value, 86300 # 86400 for 24 hours
# Regularly update scrape information for all torrents (even stopped ones); it won't affect the operation of rtorrent, but it is nice to have these values updated.
# By default, this info is only updated when rtorrent starts or when a torrent is added.
# Try to balance the calls so they don't all fire at the same time (since multi-scraping isn't implemented in libtorrent). Check for updates every 5 minutes and distinguish between the following groups:
#   - transferring (uploading and/or downloading) torrents: update every 10 minutes
#   - non-transferring torrents: update every 12 hours
#   - stopped torrents: update every 24 hours
# helper method: sets current time in a custom field (tm_last_scrape) and saves session
method.insert = d.last_scrape.set, simple|private, "d.custom.set=tm_last_scrape,$cat=$system.time=; d.save_resume="
# helper method: sends the scrape request and sets the tm_last_scrape timestamp and saves session
method.insert = d.last_scrape.send_set, simple, "d.tracker.send_scrape=0;d.last_scrape.set="
# helper method: decides whether the required time interval (with the help of an argument) has passed and if so calls the above method
method.insert = d.last_scrape.check_elapsed, simple|private, "branch={(elapsed.greater,$d.custom=tm_last_scrape,$argument.0=),d.last_scrape.send_set=}"
# helper method: checks for non-existing/empty custom field to be able to test its validity later
method.insert = d.last_scrape.check, simple|private, "branch={d.custom=tm_last_scrape,d.last_scrape.check_elapsed=$argument.0=,d.last_scrape.send_set=}"
# sets the custom field (tm_last_scrape) to the current time only for torrents that have just been added (it deliberately skips setting the time when rtorrent is started)
method.set_key = event.download.inserted_new, ~last_scrape_i, "d.last_scrape.set="
# helper method: set next scrape time based on uploading/downloading
method.insert = d.last_scrape.update_active, simple|private, "branch={\"or={d.up.rate=,d.down.rate=}\",d.last_scrape.check=$cfg.scrape_interval.active=,d.last_scrape.check=$cfg.scrape_interval.idle=}"
method.insert = d.last_scrape.update, simple|private, "branch={d.is_open=,d.last_scrape.update_active=,d.last_scrape.check=$cfg.scrape_interval.stopped=}"
# check for updates every 5 minutes (300 sec) and run the update helper on every item
schedule2 = last_scrape_t, 300, 300, "d.multicall2=default,\"d.last_scrape.update=\""
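To verify that the five-minute schedule actually fires, you can temporarily add a console log line alongside it; an optional sketch assuming the built-in print command (the name last_scrape_log and the message text are arbitrary, and the line can be removed once you have confirmed the setup works):
# Optional verification aid: print a message to the rtorrent console on every pass of the 5-minute schedule.
schedule2 = last_scrape_log, 300, 300, "print=\"auto-scrape: 5-minute update pass\""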
It's strongly advised to apply the following to your setup: