This repository records up to 4 HLS streams in parallel using GitHub Actions and uploads the finished chunks to Google Drive with rclone.
Use this only for streams you own or are allowed to archive. Be aware that a full run can produce 100+ GB of data.
- `.github/workflows/record.yml` — the GitHub Actions workflow
- `scripts/record_one.sh` — records one HLS stream and uploads chunks to Drive
- `.gitignore`
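The internals of `scripts/record_one.sh` are not reproduced here. As a rough sketch, a record-and-upload loop of this kind can be built from ffmpeg's `segment` muxer plus `rclone move`; the variable names, the `chunk_%03d.ts` pattern, and the 60-second upload interval below are all assumptions, not necessarily what the script actually uses:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of a record-and-upload loop; the actual
# scripts/record_one.sh may differ in details and naming.

# Where one stream's chunks land on Drive: <base>/<stream-name>/<run-id>
dest_path() { printf '%s/%s/%s' "$1" "$2" "$3"; }

record_one() {
  local dest
  dest="$(dest_path "$DRIVE_BASE" "$STREAM_NAME" "$GITHUB_RUN_ID")"
  mkdir -p chunks

  # Copy the HLS stream into fixed-length .ts segments, stopping after
  # RECORD_SECONDS of input; -c copy avoids re-encoding.
  ffmpeg -i "$STREAM_URL" -t "$RECORD_SECONDS" -c copy \
    -f segment -segment_time "$SEGMENT_SECONDS" \
    -reset_timestamps 1 'chunks/chunk_%03d.ts' &
  local ff=$!

  # While ffmpeg runs, ship finished segments to Drive so runner disk
  # stays small; --min-age skips the segment still being written.
  while kill -0 "$ff" 2>/dev/null; do
    rclone move chunks "$dest" --min-age 30s || true
    sleep 60
  done
  rclone move chunks "$dest"   # final sweep after ffmpeg exits
}
```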
Create a new public repository, then upload these files.
Go to Settings → Secrets and variables → Actions and add:
- `STREAM1_URL`
- `STREAM2_URL`
- `STREAM3_URL`
- `STREAM4_URL`
- `RCLONE_CONF_B64`
Leave any unused STREAMx_URL secret empty if you are only using 1–3 streams.
Also under Settings → Secrets and variables → Actions → Variables, you can add:
- `DRIVE_BASE` → default: `gdrive:GitHub-HLS-Recordings`
- `SEGMENT_SECONDS` → default: `900`
- `RECORD_SECONDS` → default: `21300`
If you do nothing, the defaults above are used.
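Assuming the workflow exposes those repository variables as environment variables, the fallback to defaults is plain shell parameter expansion — a sketch of the mechanism, not necessarily the workflow's exact code:

```shell
# Simulate a repository where none of the variables were configured.
unset DRIVE_BASE SEGMENT_SECONDS RECORD_SECONDS

# ":-" substitutes the default when the variable is unset OR empty,
# which matches "if you do nothing, the defaults are used".
DRIVE_BASE="${DRIVE_BASE:-gdrive:GitHub-HLS-Recordings}"
SEGMENT_SECONDS="${SEGMENT_SECONDS:-900}"
RECORD_SECONDS="${RECORD_SECONDS:-21300}"

echo "$DRIVE_BASE $SEGMENT_SECONDS $RECORD_SECONDS"
```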
On your own computer, install rclone and run:
```
rclone config
```
Create a remote named exactly `gdrive`.
Test it:
```
rclone lsd gdrive:
```
Now base64-encode your `rclone.conf` and put the result into the GitHub secret `RCLONE_CONF_B64`.
On macOS:
```
base64 -i ~/.config/rclone/rclone.conf | tr -d '\n'
```
If your Linux base64 does not support `-i`, use:
```
base64 -w 0 ~/.config/rclone/rclone.conf
```
On Windows (PowerShell):
```
[Convert]::ToBase64String([IO.File]::ReadAllBytes("$env:APPDATA\rclone\rclone.conf"))
```
Add each current HLS URL into:
- `STREAM1_URL`
- `STREAM2_URL`
- `STREAM3_URL`
- `STREAM4_URL`
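Before saving `RCLONE_CONF_B64`, it is worth checking that the single-line value decodes back to the original file byte for byte. A stand-in config is used below; substitute your real `rclone.conf` path:

```shell
# Stand-in config file; use your real rclone.conf path instead.
printf '[gdrive]\ntype = drive\n' > /tmp/rclone.conf.sample

# Encode to one line, as required for the RCLONE_CONF_B64 secret.
ENCODED="$(base64 < /tmp/rclone.conf.sample | tr -d '\n')"

# Decode and compare with the original; cmp is silent when they match.
printf '%s' "$ENCODED" | base64 -d > /tmp/rclone.conf.decoded
cmp /tmp/rclone.conf.sample /tmp/rclone.conf.decoded && echo "round-trip OK"
```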
If your provider uses temporary signed URLs, refresh them before every run.
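Many CDNs put the expiry timestamp in the query string. The parameter name varies by provider — an `expires` parameter holding a Unix timestamp is only an assumption here — but a pre-run staleness check can look like this:

```shell
# Returns success (0) when the assumed "expires=<unix-ts>" query
# parameter is already in the past; adjust the name for your provider.
url_expired() {
  local url="$1" now="$2" ts
  ts="${url##*expires=}"   # drop everything up to the timestamp
  ts="${ts%%&*}"           # drop any following query parameters
  [ "$ts" -le "$now" ]
}

url='https://cdn.example.com/live/master.m3u8?expires=1700000000&sig=abc'
if url_expired "$url" "$(date +%s)"; then
  echo "refresh the STREAMx_URL secrets before running"
fi
```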
Go to Actions → Record HLS streams → Run workflow.
Each stream runs in its own job.
Chunks are uploaded to:
gdrive:GitHub-HLS-Recordings/<stream-name>/<github-run-id>/
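To pull a finished recording down, run `rclone copy gdrive:GitHub-HLS-Recordings/<stream-name>/<run-id>/ ./recording/` locally, then join the chunks: MPEG-TS segments concatenate cleanly with plain `cat`. The `chunk_*.ts` naming below is an assumption about the script's output, and stand-in files are used since the real chunks live on your Drive:

```shell
# Stand-ins for downloaded chunks; real ones come from `rclone copy`.
mkdir -p recording
printf 'segment-one' > recording/chunk_000.ts
printf 'segment-two' > recording/chunk_001.ts

# Zero-padded names sort correctly, so the shell glob preserves order.
cat recording/chunk_*.ts > full_recording.ts
```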
- `SEGMENT_SECONDS=900` → 15-minute chunks
- `RECORD_SECONDS=21300` → 5 hours 55 minutes
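With these defaults a run yields 23 full 15-minute chunks plus a final 10-minute one, since 21300 = 23 × 900 + 600:

```shell
RECORD_SECONDS=21300
SEGMENT_SECONDS=900

# Ceiling division gives the chunk count; the modulus is the length
# of the final, shorter chunk.
CHUNKS=$(( (RECORD_SECONDS + SEGMENT_SECONDS - 1) / SEGMENT_SECONDS ))
LAST=$(( RECORD_SECONDS % SEGMENT_SECONDS ))
echo "$CHUNKS chunks, final chunk ${LAST}s"   # 24 chunks, final chunk 600s
```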
- This records one stream per GitHub runner.
- Completed chunks are moved to Drive during the run, so runner disk usage stays low.
- If an HLS URL expires mid-run, that job will stop when the source becomes invalid.