This repository has been archived by the owner on Jul 8, 2021. It is now read-only.
I think a lot of us have been thinking about how to do bots on Rotonde, so I figured I would share what I've come up with thus far. Most things here feel obvious, but even the obvious benefits from documentation.
Basic Bots
You know, the kind of bots you'll see posting every few hours on Twitter.
These are fairly straightforward to implement. You can post automatically whenever your computer is running Beaker (I think this is how poemexe got its start). You can also make use of dat's command-line interface to share over dat; I posted a guide on how to use this method for bots.
The caveat is that this requires a server of sorts, but I think that the majority of botmakers can accept this for the time being. (Otherwise I have an experimental project that could be used, but I digress.)
Interactive Bots & Services
What's really interesting is building interactive bots and services on top of Rotonde. The way to do that currently, as I see it, is to scrape the network. I'll outline my thoughts on these matters so far.
I made a bot of this type called greeter, which says hi to you when you greet it! See the code, follow at dat://1bd0f36ba074263a0dc3f0bcb504c59742b30f8c2d3e15b87cbe9cb04afdf575
Components
Crawler
The crawler grabs all of the known network's dat URLs and stuffs them into a list. The list is then shared as a dat, for everybody to use.
Currently The Rotonde List can be used for this, but the best solution would be to combine The Rotonde List (for finding people who are currently completely unconnected, such as new converts from Twitter) with an automated scraped list (for people who have joined and are connected, but aren't in the list).
(Rotolist by @aeonofdiscord is such a crawler. I based my own crawler on its code; many thanks to aeon for a good start!)
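The crawler's core bookkeeping could be sketched like this (a sketch under assumptions: each portal.json lists the portals it follows in a port array, and the function and parameter names here are my own, not a real API):

```javascript
// Given the dat URLs we already know about and the parsed portal.json of
// each portal we've fetched, collect every `port` entry into a deduplicated
// list that can then be shared as a dat for everybody to use.
function expandNetwork (knownUrls, portalJsons) {
  const seen = new Set(knownUrls)
  for (const portal of portalJsons) {
    for (const url of (portal.port || [])) {
      seen.add(url)
    }
  }
  return [...seen]
}
```

Running this repeatedly over newly discovered portals grows the list until no new URLs turn up.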
Scraper
The scraper gets all messages from the network using the crawler-generated list and puts them all into a file. The file has one network message per line, for easy parsing. E.g.
for (const line of scraped) {
  const message = JSON.parse(line)
  if (message.target === bot.dat) {
    // do things
  }
}
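The write side of the scraper could look like this (a sketch; the helper name is mine): each message becomes one JSON object per line, tagged with the portal it came from via an added source key, matching the scraped.txt format described further down.

```javascript
// Serialize a portal's messages as one JSON object per line, adding a
// `source` key that attributes each message to the portal it came from.
function toScrapedLines (portalUrl, messages) {
  return messages
    .map(m => JSON.stringify(Object.assign({}, m, { source: portalUrl })))
    .join('\n') + '\n'
}
```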
The Service
I think the two preceding components will be present in pretty much any kind of p2p service or interactive bot, but from here on it becomes more speculative.
The service's portal.json can be shared in the manner outlined in my guide.
My current thoughts for a basic structure are the following:
1. Scrape the network with the scraper
2. Find messages with target=<bot.dat>
3. Check if the message sender has bot.dat in portal.port
4. Execute the message command
5. Reply with a whisper
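The filtering steps above could be sketched as follows (a hypothetical sketch, not the actual bot code: scrapedText is the scraper's output with one JSON message per line, and portals maps a dat URL to its parsed portal.json, e.g. built from the metadata file):

```javascript
// Walk the scraped messages and keep only valid commands for the bot:
// messages aimed at the bot, sent by portals that follow it.
function findCommands (scrapedText, botDat, portals) {
  const commands = []
  for (const line of scrapedText.split('\n')) {
    if (!line.trim()) continue
    const message = JSON.parse(line)
    // step 2: only messages with target set to the bot's dat
    if (message.target !== botDat) continue
    // step 3: only senders that have bot.dat in their portal.port
    const sender = portals.get(message.source)
    if (!sender || !(sender.port || []).includes(botDat)) continue
    commands.push(message) // steps 4 and 5 (execute, whisper back) go here
  }
  return commands
}
```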
Even better is to put this functionality in a client-side module. Using the scraped files this should now be possible.
The first service I wrote was a mention service, as discussed in #46. Its current incarnation is as a bot, but I think this would be a good fit for the client-side idea. See the code, follow at dat://acf76768f02665825ccf7e205268833e096fc232a8cb933c0c12ee03bdd25ccd/
Scraped files
I maintain a list of the scraped network at dat://cb0fb0216c962b434e87fce98a5293e53595c136eac3f21b0c926a3a0d8a529c
Its content is as follows:
Messages: dat://cb0fb0216c962b434e87fce98a5293e53595c136eac3f21b0c926a3a0d8a529c/scraped.txt
Each line is a JSON object containing a rotonde message as per the Specs, with one addition: a key called source, attributing the message to the instance that created it.
Format: <rotonde message object><newline>
Network: dat://cb0fb0216c962b434e87fce98a5293e53595c136eac3f21b0c926a3a0d8a529c/network.txt
Each line is a dat portal as scraped from the network using my scraper. Note: some of these don't work.
Format: <portal.name><space><dat://..><newline>
Metadata: dat://cb0fb0216c962b434e87fce98a5293e53595c136eac3f21b0c926a3a0d8a529c/metadata.txt
Each line contains a dat's portal.json, albeit with the feed set to an empty array.
Format: <dat://...><space><stringified portal.json><newline>
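Parsing these files is mechanical; here is one way to do it per line (the helper names are mine, the formats are as documented above):

```javascript
// scraped.txt: <rotonde message object><newline>, plus the added `source` key
function parseMessageLine (line) {
  return JSON.parse(line)
}
// network.txt: <portal.name><space><dat://..><newline>
function parseNetworkLine (line) {
  const i = line.indexOf(' ')
  return { name: line.slice(0, i), url: line.slice(i + 1) }
}
// metadata.txt: <dat://...><space><stringified portal.json><newline>
function parseMetadataLine (line) {
  const i = line.indexOf(' ')
  return { url: line.slice(0, i), portal: JSON.parse(line.slice(i + 1)) }
}
```

Splitting on the first space only matters for metadata.txt, since the stringified portal.json can itself contain spaces.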
The scraper itself can be found here; bugfixes, patches, and similar PRs are very welcome! I would also love to see other people with bigger and better machines start scraping Rotonde!
Discussion
There are many concerns and ideas around this area, such as how to implement services in the future using Beaker's intents, how to let portals decline being scraped via a kind of robots.txt, and many more.
Known issues
dat-node occasionally breaks when closing utp connections using archive.close(). With dat-node, remember to call both archive.close() and archive.leave(); otherwise you will leak memory. (At least I did; I have an open issue where I'll look into it more.)