À la RSS, is it possible to download videos from a given YouTube channel as soon as they're released? Is there perhaps a script I can run to automatically download newly released videos, or a piece of software I can use to monitor a given YouTube channel's library?
Google youtube-dl.
Using feedstail and youtube-dl:
By user name:
youtube-dl $(feedstail -1 -n 1 -f {link} -u "https://www.youtube.com/feeds/videos.xml?user=[Username]")
By channel ID:
youtube-dl $(feedstail -1 -n 1 -f {link} -u "https://www.youtube.com/feeds/videos.xml?channel_id=[Channel ID]")
For example, you could make a script like this:
#!/bin/bash
# Grab the link of the channel's newest video from its RSS feed
get_url() {
    feedstail -1 -n 1 -f '{link}' -u "https://www.youtube.com/feeds/videos.xml?user=$1"
}
# Download the latest video for the user name passed as the first argument
youtube-dl "$(get_url "$1")"
And when you run it...
~$ ./dl-latest.sh razethew0rld
[youtube] bekGVGHQSNo: Downloading webpage
[youtube] bekGVGHQSNo: Downloading video info webpage
[youtube] bekGVGHQSNo: Extracting video information
[youtube] bekGVGHQSNo: Downloading DASH manifest
[youtube] bekGVGHQSNo: Downloading DASH manifest
WARNING: Requested formats are incompatible for merge and will be merged into mkv.
[download] Destination: The Tek 0205 - Comcast Hates You And Thinks You are Ugly-bekGVGHQSNo.f137.mp4
[download] 3.7% of 1.19GiB at 2.02MiB/s ETA 09:40
If you want a GUI, Miro can do this for you, along with pulling podcasts and lots of other media off the net.
I have thought that Miro and the associated Amara project could be something Tek Syndicate would take advantage of.
So I almost completely understand what's going on here; the only problem I'm having at this point is actually executing the command.
Unfortunately, I'm having trouble finding where feedstail installed itself through pip. I've searched the internet for some sort of indicator, but to no avail. I've looked in /usr/lib/pythonx.y/xxx as well as /usr/local/lib/pythonx.y/xxx and haven't had any luck with those either.
I really want this to work so I don't have to manually download the videos I want to watch on my tablet or laptop every time something new comes out, especially when I'm going to be on the road for a while.
Mainly, I want to be able to run the script and then go about other things: getting ready, etc.
Miro is DEFINITELY a simple solution to what I'm looking for, but I'm after something I can run every once in a while rather than on a set schedule. I realize there are ways to do this in Miro, but I'm also not looking for an actual player; rather, just a facilitator that will download videos at my whim.
That being said, I think I might use it for other cases where I'm just looking to sit back and catch up on some longform content from other places, like when I want an iTunes podcast, etc.
It would be nice if the videos were available as torrents. I also watch a lot of stuff from Jupiter Broadcasting; they make their content available in many formats.
One of the presenters (Allan Jude) runs a video streaming service, ScaleEngine, and I think JB gets a lot of support from that company.
I see a lot of flaws in dealing with changing feeds and torrents. There's an issue with trackers that would come up if any given torrent added or removed content on the fly rather than sticking with what's initially present.
I've been doing something like this for a while with a little youtube-dl script; it's a bit clunky maybe, but it works well for me.
The video list page for a YouTube channel, e.g. https://www.youtube.com/user/razethew0rld/videos, is a playlist of the whole channel's videos, numbered from the most recent video.
The -i option makes it ignore errors (e.g. if the particular format specified isn't available), --dateafter now-5days only downloads videos from the past 5 days, and --playlist-items 1-4 downloads the first four videos in the playlist, i.e. the 4 most recent videos. The -a option reads from a text file that is a list of channels (URLs like the one earlier) that I want it to download, so you can add as many channels as you want to that and it'll do them all. Of course, it will skip already-downloaded videos, provided they haven't been deleted. But if you're running it frequently, you can have the script only consider videos up to 2 days old and then also delete videos more than two days old (a sketch of that follows after the script below).
The -f 18 part means it downloads only 360p videos (my internet is slow), but you could use 22 if you wanted only 720p, or just omit the -f entirely, and I think it automatically downloads the best quality available. I also limit the download rate with --rate-limit, but you probably don't need that either.
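If you're not sure which format codes a particular video offers, youtube-dl can list them for you first, e.g. with the video ID from the log above:
# List every available format code (resolution, container, etc.) for a video
youtube-dl -F 'https://www.youtube.com/watch?v=bekGVGHQSNo'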
I have it set up so it saves the videos in a folder named for each channel, with video titles prefixed with the video date so they're listed nicely in the folder.
This is the part of the script that does the downloading:
#!/bin/bash
# -f 18: 360p only; -i: ignore errors; only the 4 newest playlist entries
# from the past 5 days; -a reads the channel list from a text file
youtube-dl -f 18 -i --playlist-items 1-4 --dateafter now-5days \
  --rate-limit 100K -a /PATH/TO/CHANNEL/LIST \
  -o '/PATH/TO/VIDEO/FOLDER/youtube/%(uploader)s/%(upload_date)s %(uploader)s %(title)s.%(ext)s'
I also have it producing a log of recently downloaded videos, but it's a mess so I won't post that. I either run this by itself to update the downloaded videos, or call it from a while loop and have it run every 90 minutes or so.
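For the "delete videos more than two days old" part mentioned earlier, a minimal sketch could be something like this (same placeholder path as in the script; adjust to taste):
#!/bin/bash
# Remove downloaded video files whose modification time is more than 2 days old
find /PATH/TO/VIDEO/FOLDER/youtube -type f -mtime +2 -delete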
For the list of channels, am I putting the link of the /videos page, or am I simply specifying the channel on each new line?
Yes, I have a list of the /videos pages, one line for each channel, e.g.
https://www.youtube.com/user/razethew0rld/videos
https://www.youtube.com/user/teksyndicate/videos
One way to get around the "deleting videos without them being re-downloaded" issue is this:
--download-archive FILE ...
I'll come back with results later on.
Yes, the --download-archive FILE option works exactly as expected. Simply specify a txt file wherever you like (I'm going to see if I can make it create the file), and the video IDs will be written into it. After an ID is written, it's safe to delete the corresponding video right after watching it.
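For reference, the archive is just a plain text file with one "extractor ID" pair per line, so after the download from the log above it would contain a line like:
youtube bekGVGHQSNo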
It should be in your PATH; I installed mine via the package manager and it went into /usr/bin/feedstail. Here is the code for the script.
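If it's not obvious where pip put it, a couple of quick checks (these are standard commands, nothing specific to feedstail):
# Show where the feedstail executable lives on your PATH
command -v feedstail
# List every file the pip package installed
pip show -f feedstail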
ebb's solution is probably better anyway.
Yeah, I finally decided I was going to use ebb's solution. Here's the final code:
#!/bin/bash
# Best video up to 1080p plus best audio; ignore errors; 4 newest per channel;
# skip anything already recorded in the download archive
youtube-dl -f "bestvideo[height<=?1080]+bestaudio/best" -i \
  --playlist-items 1-4 --download-archive \
  /mnt/mnemosyne/Scripts/YT-Subs/archive.txt \
  --dateafter now-2days --rate-limit 1M -a \
  /home/brandon/Documents/channellist.txt -o \
  '/home/brandon/Videos/YT-Subs/%(uploader)s/%(upload_date)s %(uploader)s %(title)s.%(ext)s'
Using cron, I set the script up to run every hour, on the hour.
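The crontab entry for that is just the following (the script path here is an example; point it at wherever you saved yours):
# m h dom mon dow command -- run at minute 0 of every hour
0 * * * * /mnt/mnemosyne/Scripts/YT-Subs/yt-subs.sh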
Now I need to find a way to delete the files that I've watched.
Here is the absolute final code for the original topic plus my added goal of deleting files that have been watched:
#!/bin/bash
# te.txt holds the previous run's duration in seconds (written by /usr/bin/time
# below); take its last line in case youtube-dl also wrote warnings to stderr.
# var = minutes of the hour NOT spent downloading, e.g. 120s -> 59 - 2 = 57.
float=$(tail -n 1 /mnt/mnemosyne/Scripts/YT-Subs/te.txt) && float=${float%.*} && var=$(expr 59 - $float / 60)
# Delete any video accessed (watched) since the previous run finished downloading;
# -type f keeps rm away from the channel directories.
find ~/Videos/YT-Subs -type f -amin -"$var" -exec rm {} \;
# Download as before, timing the run and saving the elapsed seconds for next time.
/usr/bin/time -f "%e" youtube-dl -f "bestvideo[height<=?1080]+bestaudio/best" -i \
  --download-archive /mnt/mnemosyne/Scripts/YT-Subs/archive.txt \
  --dateafter now-2days --rate-limit 1M -a /home/brandon/Documents/channellist.txt -o \
  '/home/brandon/Videos/YT-Subs/%(uploader)s/%(upload_date)s %(uploader)s %(title)s.%(ext)s' 2> \
  /mnt/mnemosyne/Scripts/YT-Subs/te.txt