Mirror of https://github.com/iv-org/invidious.git
Synced 2026-02-18 14:35:44 +00:00

Compare commits (38 commits)
| SHA1 |
|---|
| 57d88ffcc8 |
| e46e6183ae |
| b49623f90f |
| 95c6747a3e |
| 245d0b571f |
| 6e0df50a03 |
| f88697541c |
| 5eefab62fd |
| 13b0526c7a |
| 1568a35cfb |
| 93082c0a45 |
| 1a39faee75 |
| 81b447782a |
| c87aa8671c |
| 921c34aa65 |
| ccc423f682 |
| 02335f3390 |
| bcc8ba73bf |
| 35e63fa3f5 |
| 3fe4547f8e |
| 2dbe151ceb |
| e2c15468e0 |
| 022427e20e |
| 88430a6fc0 |
| c72b9bea64 |
| 80bc29f3cd |
| f7125c1204 |
| 6f9056fd84 |
| 3733fe8272 |
| 98bb20abcd |
| a4d44d3286 |
| dc358fc7e5 |
| e14f2f2750 |
| 650b44ade2 |
| 3830604e42 |
| f83e9e6eb9 |
| 236358d3ad |
| 43d6b65b4f |
CHANGELOG.md (38 changes)
@@ -1,10 +1,40 @@
+# 0.9.0 (2018-10-08)
+
+## Week 9: Playlists
+
+Not as much to announce this week, but I'm still quite happy to announce a couple things, namely:
+
+Playback support for playlists has finally been added with [`88430a6`](https://github.com/omarroth/invidious/88430a6). You can now view playlists with the `&list=` query param, as you would on YouTube. You can also view mixes with the mentioned `&list=`, although they require some extra handling that I would like to add in the coming week, as well as adding playlist looping and shuffle. I think playback support has been a roadblock for more exciting features such as [#114](https://github.com/omarroth/invidious/issues/114), and I look forward to improving the experience.
+
+Comments have had a bit of a cosmetic upgrade with [#132](https://github.com/omarroth/invidious/issues/132), which I think helps better distinguish between Reddit and YouTube comments, as it makes them appear similarly to their respective sites. You can also now switch between YouTube and Reddit comments with a push of a button, which I think is quite an improvement, especially for newer or less popular videos with fewer comments.
+
+I've had a small breakthrough in speeding up users' subscription feeds with PostgreSQL's [materialized views](https://www.postgresql.org/docs/current/static/rules-materializedviews.html). Without going into too much detail, materialized views essentially cache the result of a query, making it possible to run resource-intensive queries once, rather than every time a user visits their feed. In the coming week I hope to push this out to users, and hopefully close [#173](https://github.com/omarroth/invidious/issues/173).
+
+I haven't had as much time to work on the project this week, but I'm quite happy to have added some new features. Have a great week everyone.
+
+# 0.8.0 (2018-10-02)
+
+## Week 8: Mixes
+
+Hello again!
+
+Mixes have been added with [`20130db`](https://github.com/omarroth/invidious/20130db), which makes it easy to create a playlist of related content. See [#188](https://github.com/omarroth/invidious/issues/188) for more info on how they work. Currently, they return the first 50 videos rather than a continuous feed to avoid tracking by Google/YouTube, which I think is a good trade-off between usability and privacy, and I hope other folks agree. You can create mixes by adding `RD` to the beginning of a video ID; an example is provided [here](https://www.invidio.us/mix?list=RDYE7VzlLtp-4) based on Big Buck Bunny. I've been quite happy with the results returned for the mixes I've tried, and it is not limited to music, which I think is a big plus. To emulate the continuous feed many are used to, using the last video of each mix as a new 'seed' has worked well for me. In the coming week I'd like to add playback support in the player to listen to these easily.
+
+A very big thanks to [**@flourgaz**](https://github.com/flourgaz) for Docker support with [#186](https://github.com/omarroth/invidious/pull/186). This is an enormous improvement in portability for the project, and opens the door for Heroku support (see [#162](https://github.com/omarroth/invidious/issues/162)), and seamless support on Windows. For most users, it should be as easy as running `docker-compose up`.
+
+I've spent quite a bit of time this past week improving support for geo-bypass (see [#92](https://github.com/omarroth/invidious/issues/92)), and am happy to note that Invidious has been able to proxy ~50% of the geo-restricted videos I've tried. In addition, you can now watch geo-restricted videos if you have `dash` enabled as your `preferred quality`, for more details see [#34](https://github.com/omarroth/invidious/issues/34) and [#185](https://github.com/omarroth/invidious/issues/185), or last week's update. For folks interested in replicating these results for themselves, I'd take a look [here](https://gist.github.com/omarroth/3ce0f276c43e0c4b13e7d9cd35524688) for the script used, and [here](https://gist.github.com/omarroth/beffc4a76a7b82a422e1b36a571878ef) for a list of videos restricted in the US.
+
+1080p has seen a fairly smooth roll-out, although there have been a couple issues reported, mainly [#193](https://github.com/omarroth/invidious/issues/193), which is likely an issue in the player. I've also encountered a couple other issues myself that I would like to investigate. Although none are major, I'd like to keep 1080p opt-in for registered users another week to better address these issues.
+
+Have an excellent week everyone.
+
 # 0.7.0 (2018-09-25)
 
 ## Week 7: 1080p and Search Types
 
 Hello again everyone! I've got quite a couple announcements this week:
 
-Experimental 1080p support has been added with [`b3ca392`](https://github.com/omarroth/invidious/b3ca3922a9073b4abb0d7fde58a3e6098668f53e), and can be enabled by going to preferences and changing `preferred video quality` to `dash`. You can find more details [here](https://github.com/omarroth/invidious/issues/34#issuecomment-424171888). Currently quality and speed controls have not yet been integrated into the player, but I'd still appreciate feedback, mainly on any issues with buffering or DASH playback. I hope to integrate 1080p support into the player and push support site-wide in the coming weeks.
+Experimental 1080p support has been added with [`b3ca392`](https://github.com/omarroth/invidious/b3ca392), and can be enabled by going to preferences and changing `preferred video quality` to `dash`. You can find more details [here](https://github.com/omarroth/invidious/issues/34#issuecomment-424171888). Currently quality and speed controls have not yet been integrated into the player, but I'd still appreciate feedback, mainly on any issues with buffering or DASH playback. I hope to integrate 1080p support into the player and push support site-wide in the coming weeks.
 
 You can now filter content types in search with the `type:TYPE` filter. Supported content types are `playlist`, `channel`, and `video`. More info is available [here](https://github.com/omarroth/invidious/issues/126#issuecomment-423823148). I think this is quite an improvement in usability and I hope others find the same.
 
@@ -13,9 +43,9 @@ A [CHANGELOG](https://github.com/omarroth/invidious/blob/master/CHANGELOG.md) ha
 Recently, users have been reporting 504s when attempting to access their subscriptions, which is tracked in [#173](https://github.com/omarroth/invidious/issues/173). This is most likely caused by an uptick in usage, which I am absolutely grateful for, but unfortunately has resulted in an increase in costs for hosting the site, which is why I will be bumping my goal on Patreon from $60 to $80. I would appreciate any feedback on how subscriptions could be improved.
 
 Other minor improvements include:
-- Additional regions added to bypass geo-block with [`9a78523`](https://github.com/omarroth/invidious/9a7852341d9d67b6bddd8a9836c1b71c124c3614)
-- Fix for playlists containing less than 100 videos (previously shown as empty) with [`35ac887`](https://github.com/omarroth/invidious/35ac88713320a970e3a87a26249c2a18a709f020)
-- Fix for `published` date for Reddit comments (previously showing negative seconds) with [`6e09202`](https://github.com/omarroth/invidious/6e092026d29eccc3e3adf02be138fddec2354027)
+- Additional regions added to bypass geo-block with [`9a78523`](https://github.com/omarroth/invidious/9a78523)
+- Fix for playlists containing less than 100 videos (previously shown as empty) with [`35ac887`](https://github.com/omarroth/invidious/35ac887)
+- Fix for `published` date for Reddit comments (previously showing negative seconds) with [`6e09202`](https://github.com/omarroth/invidious/6e09202)
 
 Thank you everyone for your support!
 
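The materialized-view approach mentioned in the 0.9.0 entry above is what the `src/invidious.cr` and helper changes further down implement: each user gets a `subscriptions_<hash>` view that pre-computes their subscription feed, a background job refreshes it, and the feed routes read from the view instead of re-filtering `channel_videos` on every request. Below is a minimal sketch of that pattern, assuming the `channel_videos`/`users` schema from this repository; the connection string and email are placeholders, and the queries mirror the ones added in this diff rather than defining anything new.

```crystal
require "db"
require "pg"
require "openssl"

# Placeholder connection string; the real values come from config.yml.
PG_DB = DB.open("postgres://kemal:kemal@localhost:5432/invidious")

# Same helper that this diff adds to the utils file.
def sha256(text)
  digest = OpenSSL::Digest.new("SHA256")
  digest << text
  digest.hexdigest
end

email     = "user@example.com" # placeholder
view_name = "subscriptions_#{sha256(email)[0..7]}"

# Created once (at registration in this diff): caches the user's feed.
PG_DB.exec("CREATE MATERIALIZED VIEW #{view_name} AS \
  SELECT * FROM channel_videos WHERE \
  ucid = ANY ((SELECT subscriptions FROM users WHERE email = '#{email}')::text[]) \
  ORDER BY published DESC;")

# Refreshed periodically by a background job (refresh_feeds in this diff),
# so the expensive query runs once instead of on every page view.
PG_DB.exec("REFRESH MATERIALIZED VIEW #{view_name}")

# Feed requests then read the cached rows directly.
PG_DB.query("SELECT id, title, author FROM #{view_name} ORDER BY published DESC LIMIT 40") do |rs|
  rs.each { puts "#{rs.read(String)} | #{rs.read(String)} | #{rs.read(String)}" }
end
```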
@@ -17,6 +17,15 @@ div {
   animation: spin 2s linear infinite;
 }
 
+.playlist-restricted {
+  height: 20em;
+  padding-right: 10px;
+}
+
+.pure-button-primary {
+  background: rgba(0, 182, 240, 1);
+}
+
 /*
  * Navbar
  */
assets/js/watch.js (new file, 59 lines)
@@ -0,0 +1,59 @@
+function toggle_parent(target) {
+  body = target.parentNode.parentNode.children[1];
+  if (body.style.display === null || body.style.display === "") {
+    target.innerHTML = "[ + ]";
+    body.style.display = "none";
+  } else {
+    target.innerHTML = "[ - ]";
+    body.style.display = "";
+  }
+}
+
+function toggle_comments(target) {
+  body = target.parentNode.parentNode.parentNode.children[1];
+  if (body.style.display === null || body.style.display === "") {
+    target.innerHTML = "[ + ]";
+    body.style.display = "none";
+  } else {
+    target.innerHTML = "[ - ]";
+    body.style.display = "";
+  }
+}
+
+function swap_comments(source) {
+  if (source == "youtube") {
+    get_youtube_comments();
+  } else if (source == "reddit") {
+    get_reddit_comments();
+  }
+}
+
+function commaSeparateNumber(val) {
+  while (/(\d+)(\d{3})/.test(val.toString())) {
+    val = val.toString().replace(/(\d+)(\d{3})/, "$1" + "," + "$2");
+  }
+  return val;
+}
+
+String.prototype.supplant = function(o) {
+  return this.replace(/{([^{}]*)}/g, function(a, b) {
+    var r = o[b];
+    return typeof r === "string" || typeof r === "number" ? r : a;
+  });
+};
+
+function show_youtube_replies(target) {
+  body = target.parentNode.parentNode.children[1];
+  body.style.display = "";
+
+  target.innerHTML = "Hide replies";
+  target.setAttribute("onclick", "hide_youtube_replies(this)");
+}
+
+function hide_youtube_replies(target) {
+  body = target.parentNode.parentNode.children[1];
+  body.style.display = "none";
+
+  target.innerHTML = "Show replies";
+  target.setAttribute("onclick", "show_youtube_replies(this)");
+}
@@ -1,5 +1,6 @@
 crawl_threads: 1
 channel_threads: 1
+feed_threads: 1
 video_threads: 1
 db:
   user: kemal
@@ -9,3 +10,4 @@ db:
   dbname: invidious
 full_refresh: false
 https_only: false
+geo_bypass: true
@@ -22,6 +22,8 @@ CREATE TABLE public.videos
     genre text COLLATE pg_catalog."default",
     genre_url text COLLATE pg_catalog."default",
     license text COLLATE pg_catalog."default",
+    sub_count_text text COLLATE pg_catalog."default",
+    author_thumbnail text COLLATE pg_catalog."default",
     CONSTRAINT videos_pkey PRIMARY KEY (id)
 )
 WITH (
@@ -1,5 +1,5 @@
 name: invidious
-version: 0.7.0
+version: 0.9.0
 
 authors:
   - Omar Roth <omarroth@hotmail.com>
src/invidious.cr (330 changes)

@@ -31,6 +31,7 @@ HMAC_KEY = CONFIG.hmac_key || Random::Secure.random_bytes(32)
 
 crawl_threads = CONFIG.crawl_threads
 channel_threads = CONFIG.channel_threads
+feed_threads = CONFIG.feed_threads
 video_threads = CONFIG.video_threads
 
 Kemal.config.extra_options do |parser|
@@ -51,6 +52,14 @@ Kemal.config.extra_options do |parser|
       exit
     end
   end
+  parser.on("-f THREADS", "--feed-threads=THREADS", "Number of threads for refreshing feeds (default: #{feed_threads})") do |number|
+    begin
+      feed_threads = number.to_i
+    rescue ex
+      puts "THREADS must be integer"
+      exit
+    end
+  end
   parser.on("-v THREADS", "--video-threads=THREADS", "Number of threads for refreshing videos (default: #{video_threads})") do |number|
     begin
       video_threads = number.to_i
@@ -85,6 +94,8 @@ end
 
 refresh_channels(PG_DB, channel_threads, CONFIG.full_refresh)
 
+refresh_feeds(PG_DB, feed_threads)
+
 video_threads.times do |i|
   spawn do
     refresh_videos(PG_DB)
@@ -106,10 +117,12 @@ spawn do
 end
 
 proxies = {} of String => Array({ip: String, port: Int32})
-spawn do
-  find_working_proxies(BYPASS_REGIONS) do |region, list|
-    if !list.empty?
-      proxies[region] = list
+if CONFIG.geo_bypass
+  spawn do
+    find_working_proxies(BYPASS_REGIONS) do |region, list|
+      if !list.empty?
+        proxies[region] = list
+      end
     end
   end
 end
@@ -215,6 +228,8 @@ get "/watch" do |env|
     next env.redirect "/"
   end
 
+  plid = env.params.query["list"]?
+
   user = env.get? "user"
   if user
     user = user.as(User)
@@ -235,6 +250,8 @@ get "/watch" do |env|
 
   begin
     video = get_video(id, PG_DB, proxies)
+  rescue ex : VideoRedirect
+    next env.redirect "/watch?v=#{ex.message}"
   rescue ex
     error_message = ex.message
     STDOUT << id << " : " << ex.message << "\n"
@@ -335,6 +352,8 @@ get "/embed/:id" do |env|
 
   begin
     video = get_video(id, PG_DB, proxies)
+  rescue ex : VideoRedirect
+    next env.redirect "/embed/#{ex.message}"
   rescue ex
     error_message = ex.message
     next templated "error"
@@ -400,6 +419,10 @@ get "/playlist" do |env|
   page = env.params.query["page"]?.try &.to_i?
   page ||= 1
 
+  if plid.starts_with? "RD"
+    next env.redirect "/mix?list=#{plid}"
+  end
+
   begin
     playlist = fetch_playlist(plid)
   rescue ex
@@ -463,9 +486,8 @@ get "/search" do |env|
   user = env.get? "user"
   if user
     user = user.as(User)
-    ucids = user.subscriptions
+    view_name = "subscriptions_#{sha256(user.email)[0..7]}"
   end
-  ucids ||= [] of String
 
   channel = nil
   content_type = "all"
@@ -502,14 +524,19 @@ get "/search" do |env|
   if channel
     count, videos = channel_search(search_query, page, channel)
   elsif subscriptions
-    videos = PG_DB.query_all("SELECT id,title,published,updated,ucid,author FROM (
-    SELECT *,
-    to_tsvector(channel_videos.title) ||
-    to_tsvector(channel_videos.author)
-    as document
-    FROM channel_videos WHERE ucid IN (#{arg_array(ucids, 3)})
-    ) v_search WHERE v_search.document @@ plainto_tsquery($1) LIMIT 20 OFFSET $2;", [search_query, (page - 1) * 20] + ucids, as: ChannelVideo)
-    count = videos.size
+    if view_name
+      videos = PG_DB.query_all("SELECT id,title,published,updated,ucid,author FROM (
+      SELECT *,
+      to_tsvector(#{view_name}.title) ||
+      to_tsvector(#{view_name}.author)
+      as document
+      FROM #{view_name}
+      ) v_search WHERE v_search.document @@ plainto_tsquery($1) LIMIT 20 OFFSET $2;", search_query, (page - 1) * 20, as: ChannelVideo)
+      count = videos.size
+    else
+      videos = [] of ChannelVideo
+      count = 0
+    end
   else
     begin
       search_params = produce_search_params(sort: sort, date: date, content_type: content_type,
@@ -743,7 +770,7 @@ post "/login" do |env|
   end
 
   if action == "signin"
-    user = PG_DB.query_one?("SELECT * FROM users WHERE email = $1 AND password IS NOT NULL", email, as: User)
+    user = PG_DB.query_one?("SELECT * FROM users WHERE LOWER(email) = LOWER($1) AND password IS NOT NULL", email, as: User)
 
     if !user
       error_message = "Invalid username or password"
@@ -757,7 +784,7 @@ post "/login" do |env|
 
     if Crypto::Bcrypt::Password.new(user.password.not_nil!) == password
       sid = Base64.urlsafe_encode(Random::Secure.random_bytes(32))
-      PG_DB.exec("UPDATE users SET id = id || $1 WHERE email = $2", [sid], email)
+      PG_DB.exec("UPDATE users SET id = id || $1 WHERE LOWER(email) = LOWER($2)", [sid], email)
 
       if Kemal.config.ssl || CONFIG.https_only
         secure = true
@@ -772,7 +799,7 @@ post "/login" do |env|
       next templated "error"
     end
   elsif action == "register"
-    user = PG_DB.query_one?("SELECT * FROM users WHERE email = $1 AND password IS NOT NULL", email, as: User)
+    user = PG_DB.query_one?("SELECT * FROM users WHERE LOWER(email) = LOWER($1) AND password IS NOT NULL", email, as: User)
     if user
       error_message = "Please sign in"
       next templated "error"
@@ -787,6 +814,12 @@ post "/login" do |env|
 
     PG_DB.exec("INSERT INTO users VALUES (#{args})", user_array)
 
+    view_name = "subscriptions_#{sha256(user.email)[0..7]}"
+    PG_DB.exec("CREATE MATERIALIZED VIEW #{view_name} AS \
+    SELECT * FROM channel_videos WHERE \
+    ucid = ANY ((SELECT subscriptions FROM users WHERE email = '#{user.email}')::text[]) \
+    ORDER BY published DESC;")
+
     if Kemal.config.ssl || CONFIG.https_only
       secure = true
     else
@@ -1113,12 +1146,14 @@ post "/data_control" do |env|
       body = JSON.parse(body)
       body["subscriptions"].as_a.each do |ucid|
         ucid = ucid.as_s
-        if !user.subscriptions.includes? ucid
-          PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE id = $2", ucid, user.id)
+
+        if !user.subscriptions.includes? ucid
           begin
             client = make_client(YT_URL)
             get_channel(ucid, client, PG_DB, false, false)
+
+            PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE email = $2", ucid, user.email)
+            user.subscriptions << ucid
           rescue ex
             next
           end
@@ -1127,8 +1162,10 @@ post "/data_control" do |env|
 
       body["watch_history"].as_a.each do |id|
         id = id.as_s
+
        if !user.watched.includes? id
           PG_DB.exec("UPDATE users SET watched = array_append(watched,$1) WHERE email = $2", id, user.email)
+          user.watched << id
         end
       end
 
@@ -1139,11 +1176,12 @@ post "/data_control" do |env|
         ucid = channel["xmlUrl"].match(/UC[a-zA-Z0-9_-]{22}/).not_nil![0]
 
         if !user.subscriptions.includes? ucid
-          PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE email = $2", ucid, user.email)
+
           begin
             client = make_client(YT_URL)
             get_channel(ucid, client, PG_DB, false, false)
+
+            PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE email = $2", ucid, user.email)
+            user.subscriptions << ucid
           rescue ex
             next
           end
@@ -1154,11 +1192,12 @@ post "/data_control" do |env|
         ucid = md["channel_id"]
 
         if !user.subscriptions.includes? ucid
-          PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE email = $2", ucid, user.email)
+
           begin
             client = make_client(YT_URL)
             get_channel(ucid, client, PG_DB, false, false)
+
+            PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE email = $2", ucid, user.email)
+            user.subscriptions << ucid
           rescue ex
             next
           end
@@ -1170,11 +1209,12 @@ post "/data_control" do |env|
         ucid = channel["url"].as_s.match(/UC[a-zA-Z0-9_-]{22}/).not_nil![0]
 
         if !user.subscriptions.includes? ucid
-          PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE email = $2", ucid, user.email)
+
           begin
             client = make_client(YT_URL)
             get_channel(ucid, client, PG_DB, false, false)
+
+            PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE email = $2", ucid, user.email)
+            user.subscriptions << ucid
           rescue ex
             next
           end
@@ -1190,19 +1230,24 @@ post "/data_control" do |env|
 
       db = entry.io.gets_to_end
       db.scan(/youtube\.com\/watch\?v\=(?<id>[a-zA-Z0-9_-]{11})/) do |md|
-        if !user.watched.includes? md["id"]
-          PG_DB.exec("UPDATE users SET watched = array_append(watched,$1) WHERE email = $2", md["id"], user.email)
+        id = md["id"]
+
+        if !user.watched.includes? id
+          PG_DB.exec("UPDATE users SET watched = array_append(watched,$1) WHERE email = $2", id, user.email)
+          user.watched << id
         end
       end
 
      db.scan(/youtube\.com\/channel\/(?<ucid>[a-zA-Z0-9_-]{22})/) do |md|
         ucid = md["ucid"]
-        if !user.subscriptions.includes? ucid
-          PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE email = $2", ucid, user.email)
+
+        if !user.subscriptions.includes? ucid
           begin
             client = make_client(YT_URL)
             get_channel(ucid, client, PG_DB, false, false)
+
+            PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE email = $2", ucid, user.email)
+            user.subscriptions << ucid
           rescue ex
             next
           end
@@ -1340,6 +1385,8 @@ get "/feed/subscriptions" do |env|
 
   notifications = PG_DB.query_one("SELECT notifications FROM users WHERE email = $1", user.email,
     as: Array(String))
+  view_name = "subscriptions_#{sha256(user.email)[0..7]}"
+
   if preferences.notifications_only && !notifications.empty?
     args = arg_array(notifications)
 
@@ -1362,39 +1409,35 @@ get "/feed/subscriptions" do |env|
   else
     if preferences.latest_only
       if preferences.unseen_only
-        ucids = arg_array(user.subscriptions)
         if user.watched.empty?
           watched = "'{}'"
         else
-          watched = arg_array(user.watched, user.subscriptions.size + 1)
+          watched = arg_array(user.watched)
         end
 
-        videos = PG_DB.query_all("SELECT DISTINCT ON (ucid) * FROM channel_videos WHERE \
-        ucid IN (#{ucids}) AND id NOT IN (#{watched}) ORDER BY ucid, published DESC",
-        user.subscriptions + user.watched, as: ChannelVideo)
+        videos = PG_DB.query_all("SELECT DISTINCT ON (ucid) * FROM #{view_name} WHERE \
+        id NOT IN (#{watched}) ORDER BY ucid, published DESC",
+        user.watched, as: ChannelVideo)
       else
-        args = arg_array(user.subscriptions)
-        videos = PG_DB.query_all("SELECT DISTINCT ON (ucid) * FROM channel_videos WHERE \
-        ucid IN (#{args}) ORDER BY ucid, published DESC", user.subscriptions, as: ChannelVideo)
+        videos = PG_DB.query_all("SELECT DISTINCT ON (ucid) * FROM #{view_name} \
+        ORDER BY ucid, published DESC", as: ChannelVideo)
       end
 
       videos.sort_by! { |video| video.published }.reverse!
     else
       if preferences.unseen_only
-        ucids = arg_array(user.subscriptions, 3)
         if user.watched.empty?
           watched = "'{}'"
         else
-          watched = arg_array(user.watched, user.subscriptions.size + 3)
+          watched = arg_array(user.watched, 3)
         end
 
-        videos = PG_DB.query_all("SELECT * FROM channel_videos WHERE ucid IN (#{ucids}) \
-        AND id NOT IN (#{watched}) ORDER BY published DESC LIMIT $1 OFFSET $2",
-        [limit, offset] + user.subscriptions + user.watched, as: ChannelVideo)
+        videos = PG_DB.query_all("SELECT * FROM #{view_name} WHERE \
+        id NOT IN (#{watched}) LIMIT $1 OFFSET $2",
+        [limit, offset] + user.watched, as: ChannelVideo)
       else
-        args = arg_array(user.subscriptions, 3)
-        videos = PG_DB.query_all("SELECT * FROM channel_videos WHERE ucid IN (#{args}) \
-        ORDER BY published DESC LIMIT $1 OFFSET $2", [limit, offset] + user.subscriptions, as: ChannelVideo)
+        videos = PG_DB.query_all("SELECT * FROM #{view_name} \
+        ORDER BY published DESC LIMIT $1 OFFSET $2", limit, offset, as: ChannelVideo)
       end
     end
 
@@ -1443,29 +1486,8 @@ get "/feed/channel/:ucid" do |env|
     halt env, status_code: 404, response: error_message
   end
 
-  client = make_client(YT_URL)
-
   page = 1
-  videos = [] of SearchVideo
-  2.times do |i|
-    url = produce_channel_videos_url(ucid, page * 2 + (i - 1), auto_generated: auto_generated)
-    response = client.get(url)
-    json = JSON.parse(response.body)
-
-    if json["content_html"]? && !json["content_html"].as_s.empty?
-      document = XML.parse_html(json["content_html"].as_s)
-      nodeset = document.xpath_nodes(%q(//li[contains(@class, "feed-item-container")]))
-
-      if auto_generated
-        videos += extract_videos(nodeset)
-      else
-        videos += extract_videos(nodeset, ucid)
-      end
-    else
-      break
-    end
-  end
+  videos, count = get_60_videos(ucid, page, auto_generated)
 
   host_url = make_host_url(Kemal.config.ssl || CONFIG.https_only, env.request.headers["Host"]?)
   path = env.request.path
@@ -1552,15 +1574,14 @@ get "/feed/private" do |env|
   latest_only ||= 0
   latest_only = latest_only == 1
 
+  view_name = "subscriptions_#{sha256(user.email)[0..7]}"
+
   if latest_only
-    args = arg_array(user.subscriptions)
-    videos = PG_DB.query_all("SELECT DISTINCT ON (ucid) * FROM channel_videos WHERE \
-    ucid IN (#{args}) ORDER BY ucid, published DESC", user.subscriptions, as: ChannelVideo)
+    videos = PG_DB.query_all("SELECT DISTINCT ON (ucid) * FROM #{view_name} ORDER BY ucid, published DESC", as: ChannelVideo)
     videos.sort_by! { |video| video.published }.reverse!
   else
-    args = arg_array(user.subscriptions, 3)
-    videos = PG_DB.query_all("SELECT * FROM channel_videos WHERE ucid IN (#{args}) \
-    ORDER BY published DESC LIMIT $1 OFFSET $2", [limit, offset] + user.subscriptions, as: ChannelVideo)
+    videos = PG_DB.query_all("SELECT * FROM #{view_name} \
+    ORDER BY published DESC LIMIT $1 OFFSET $2", limit, offset, as: ChannelVideo)
   end
 
   sort = env.params.query["sort"]?
@@ -1697,7 +1718,7 @@ get "/channel/:ucid" do |env|
   page ||= 1
 
   begin
-    author, ucid, auto_generated = get_about_info(ucid)
+    author, ucid, auto_generated, sub_count = get_about_info(ucid)
   rescue ex
     error_message = "User does not exist"
     next templated "error"
@@ -1711,27 +1732,7 @@ get "/channel/:ucid" do |env|
     end
   end
 
-  client = make_client(YT_URL)
-
-  videos = [] of SearchVideo
-  2.times do |i|
-    url = produce_channel_videos_url(ucid, page * 2 + (i - 1), auto_generated: auto_generated)
-    response = client.get(url)
-    json = JSON.parse(response.body)
-
-    if json["content_html"]? && !json["content_html"].as_s.empty?
-      document = XML.parse_html(json["content_html"].as_s)
-      nodeset = document.xpath_nodes(%q(//li[contains(@class, "feed-item-container")]))
-
-      if auto_generated
-        videos += extract_videos(nodeset)
-      else
-        videos += extract_videos(nodeset, ucid)
-      end
-    else
-      break
-    end
-  end
+  videos, count = get_60_videos(ucid, page, auto_generated)
 
   templated "channel"
 end
@@ -1759,6 +1760,8 @@ get "/api/v1/captions/:id" do |env|
   client = make_client(YT_URL)
   begin
     video = get_video(id, PG_DB, proxies)
+  rescue ex : VideoRedirect
+    next env.redirect "/api/v1/captions/#{ex.message}"
   rescue ex
     halt env, status_code: 403
   end
@@ -1874,31 +1877,34 @@ get "/api/v1/comments/:id" do |env|
 
     proxies.each do |region, list|
       spawn do
+        proxy_html = %(<meta itemprop="regionsAllowed" content="">)
+
         list.each do |proxy|
           begin
             proxy_client = HTTPClient.new(YT_URL)
             proxy_client.read_timeout = 10.seconds
             proxy_client.connect_timeout = 10.seconds
 
-            proxy = list.sample(1)[0]
             proxy = HTTPProxy.new(proxy_host: proxy[:ip], proxy_port: proxy[:port])
             proxy_client.set_proxy(proxy)
 
-            proxy_html = proxy_client.get("/watch?v=#{id}&bpctr=#{Time.new.epoch + 2000}&gl=US&hl=en&disable_polymer=1")
+            response = proxy_client.get("/watch?v=#{id}&bpctr=#{Time.new.epoch + 2000}&gl=US&hl=en&disable_polymer=1")
             proxy_headers = HTTP::Headers.new
-            proxy_headers["cookie"] = proxy_html.cookies.add_request_headers(headers)["cookie"]
-            proxy_html = proxy_html.body
+            proxy_headers["cookie"] = response.cookies.add_request_headers(headers)["cookie"]
+            proxy_html = response.body
 
-            if proxy_html.match(/<meta itemprop="regionsAllowed" content="">/)
-              bypass_channel.send(nil)
-            else
+            if !proxy_html.match(/<meta itemprop="regionsAllowed" content="">/)
               bypass_channel.send({proxy_html, proxy_client, proxy_headers})
+              break
             end
-
-            break
           rescue ex
           end
         end
+
+        # If none of the proxies we tried returned a valid response
+        if proxy_html.match(/<meta itemprop="regionsAllowed" content="">/)
+          bypass_channel.send(nil)
+        end
       end
     end
 
@@ -2203,6 +2209,8 @@ get "/api/v1/videos/:id" do |env|
 
   begin
     video = get_video(id, PG_DB, proxies)
+  rescue ex : VideoRedirect
+    next env.redirect "/api/v1/videos/#{ex.message}"
   rescue ex
     error_message = {"error" => ex.message}.to_json
     halt env, status_code: 500, response: error_message
@@ -2246,6 +2254,22 @@ get "/api/v1/videos/:id" do |env|
       json.field "authorId", video.ucid
       json.field "authorUrl", "/channel/#{video.ucid}"
 
+      json.field "authorThumbnails" do
+        json.array do
+          qualities = [32, 48, 76, 100, 176, 512]
+
+          qualities.each do |quality|
+            json.object do
+              json.field "url", video.author_thumbnail.gsub("=s48-", "=s#{quality}-")
+              json.field "width", quality
+              json.field "height", quality
+            end
+          end
+        end
+      end
+
+      json.field "subCountText", video.sub_count_text
+
       json.field "lengthSeconds", video.info["length_seconds"].to_i
       if video.info["allow_ratings"]?
         json.field "allowRatings", video.info["allow_ratings"] == "1"
@@ -2464,30 +2488,10 @@ get "/api/v1/channels/:ucid" do |env|
     halt env, status_code: 404, response: error_message
   end
 
-  client = make_client(YT_URL)
-
   page = 1
+  videos, count = get_60_videos(ucid, page, auto_generated)
 
-  videos = [] of SearchVideo
-  2.times do |i|
-    url = produce_channel_videos_url(ucid, page * 2 + (i - 1), auto_generated: auto_generated)
-    response = client.get(url)
-    json = JSON.parse(response.body)
-
-    if json["content_html"]? && !json["content_html"].as_s.empty?
-      document = XML.parse_html(json["content_html"].as_s)
-      nodeset = document.xpath_nodes(%q(//li[contains(@class, "feed-item-container")]))
-
-      if auto_generated
-        videos += extract_videos(nodeset)
-      else
-        videos += extract_videos(nodeset, ucid)
-      end
-    else
-      break
-    end
-  end
-
+  client = make_client(YT_URL)
   channel_html = client.get("/channel/#{ucid}/about?disable_polymer=1").body
   channel_html = XML.parse_html(channel_html)
   banner = channel_html.xpath_node(%q(//div[@id="gh-banner"]/style)).not_nil!.content
@@ -2623,27 +2627,7 @@ end
     halt env, status_code: 404, response: error_message
   end
 
-  client = make_client(YT_URL)
-
-  videos = [] of SearchVideo
-  2.times do |i|
-    url = produce_channel_videos_url(ucid, page * 2 + (i - 1), auto_generated: auto_generated)
-    response = client.get(url)
-    json = JSON.parse(response.body)
-
-    if json["content_html"]? && !json["content_html"].as_s.empty?
-      document = XML.parse_html(json["content_html"].as_s)
-      nodeset = document.xpath_nodes(%q(//li[contains(@class, "feed-item-container")]))
-
-      if auto_generated
-        videos += extract_videos(nodeset)
-      else
-        videos += extract_videos(nodeset, ucid)
-      end
-    else
-      break
-    end
-  end
-
+  videos, count = get_60_videos(ucid, page, auto_generated)
 
   result = JSON.build do |json|
     json.array do
@@ -2906,6 +2890,15 @@ get "/api/v1/playlists/:plid" do |env|
   page = env.params.query["page"]?.try &.to_i?
   page ||= 1
 
+  format = env.params.query["format"]?
+  format ||= "json"
+
+  continuation = env.params.query["continuation"]?
+
+  if plid.starts_with? "RD"
+    next env.redirect "/api/v1/mixes/#{plid}"
+  end
+
   begin
     playlist = fetch_playlist(plid)
   rescue ex
@@ -2914,7 +2907,7 @@ get "/api/v1/playlists/:plid" do |env|
   end
 
   begin
-    videos = fetch_playlist_videos(plid, page, playlist.video_count)
+    videos = fetch_playlist_videos(plid, page, playlist.video_count, continuation)
   rescue ex
     videos = [] of PlaylistVideo
   end
@@ -2973,6 +2966,17 @@ get "/api/v1/playlists/:plid" do |env|
     end
   end
 
+  if format == "html"
+    response = JSON.parse(response)
+    playlist_html = template_playlist(response)
+    next_video = response["videos"].as_a[1]?.try &.["videoId"]
+
+    response = {
+      "playlistHtml" => playlist_html,
+      "nextVideo"    => next_video,
+    }.to_json
+  end
+
   response
 end
 
@@ -2984,6 +2988,9 @@ get "/api/v1/mixes/:rdid" do |env|
   continuation = env.params.query["continuation"]?
   continuation ||= rdid.lchop("RD")
 
+  format = env.params.query["format"]?
+  format ||= "json"
+
   begin
     mix = fetch_mix(rdid, continuation)
   rescue ex
@@ -3022,6 +3029,17 @@ get "/api/v1/mixes/:rdid" do |env|
     end
   end
 
+  if format == "html"
+    response = JSON.parse(response)
+    playlist_html = template_mix(response)
+    next_video = response["videos"].as_a[1]?.try &.["videoId"]
+
+    response = {
+      "playlistHtml" => playlist_html,
+      "nextVideo"    => next_video,
+    }.to_json
+  end
+
   response
 end
 
@@ -3045,6 +3063,8 @@ get "/api/manifest/dash/id/:id" do |env|
   client = make_client(YT_URL)
   begin
     video = get_video(id, PG_DB, proxies)
+  rescue ex : VideoRedirect
+    next env.redirect "/api/manifest/dash/id/#{ex.message}"
  rescue ex
     halt env, status_code: 403
   end
@@ -3408,6 +3428,24 @@ get "/vi/:id/:name" do |env|
 end
 
 error 404 do |env|
+  if md = env.request.path.match(/^\/(?<id>[a-zA-Z0-9_-]{11})/)
+    id = md["id"]
+
+    params = [] of String
+    env.params.query.each do |k, v|
+      params << "#{k}=#{v}"
+    end
+    params = params.join("&")
+
+    url = "/watch?v=#{id}"
+    if !params.empty?
+      url += "&#{params}"
+    end
+
+    env.response.headers["Location"] = url
+    halt env, status_code: 302
+  end
+
   error_message = "404 Page not found"
   templated "error"
 end
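One behavioural change in the block above that is easy to miss: the new `error 404` handler turns a path that is just an eleven-character video ID into a redirect to the watch page, carrying the query string along. A standalone sketch of that rule, outside of Kemal; the video ID and the `t` parameter here are placeholders chosen for illustration, not values from the diff:

```crystal
# Sketch of the redirect logic in the 404 handler above.
path  = "/dQw4w9WgXcQ"   # placeholder 11-character video ID
query = {"t" => "10"}    # placeholder query parameter

if md = path.match(/^\/(?<id>[a-zA-Z0-9_-]{11})/)
  url = "/watch?v=#{md["id"]}"

  params = query.map { |k, v| "#{k}=#{v}" }.join("&")
  url += "&#{params}" if !params.empty?

  puts url # => /watch?v=dQw4w9WgXcQ&t=10
end
```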
@@ -176,7 +176,7 @@ def produce_channel_videos_url(ucid, page = 1, auto_generated = nil)
   continuation = Base64.urlsafe_encode(continuation)
   continuation = URI.escape(continuation)
 
-  url = "/browse_ajax?continuation=#{continuation}"
+  url = "/browse_ajax?continuation=#{continuation}&gl=US&hl=en"
 
   return url
 end
@@ -196,6 +196,12 @@ def get_about_info(ucid)
     raise "User does not exist."
   end
 
+  sub_count = about.xpath_node(%q(//span[contains(text(), "subscribers")]))
+  if sub_count
+    sub_count = sub_count.content.delete(", subscribers").to_i?
+  end
+  sub_count ||= 0
+
   author = about.xpath_node(%q(//span[@class="qualified-channel-title-text"]/a)).not_nil!.content
   ucid = about.xpath_node(%q(//link[@rel="canonical"])).not_nil!["href"].split("/")[-1]
 
@@ -207,5 +213,37 @@ def get_about_info(ucid)
     auto_generated = true
   end
 
-  return {author, ucid, auto_generated}
+  return {author, ucid, auto_generated, sub_count}
+end
+
+def get_60_videos(ucid, page, auto_generated)
+  count = 0
+  videos = [] of SearchVideo
+
+  client = make_client(YT_URL)
+
+  2.times do |i|
+    url = produce_channel_videos_url(ucid, page * 2 + (i - 1), auto_generated: auto_generated)
+    response = client.get(url)
+    json = JSON.parse(response.body)
+
+    if json["content_html"]? && !json["content_html"].as_s.empty?
+      document = XML.parse_html(json["content_html"].as_s)
+      nodeset = document.xpath_nodes(%q(//li[contains(@class, "feed-item-container")]))
+
+      if !json["load_more_widget_html"]?.try &.as_s.empty?
+        count += 30
+      end
+
+      if auto_generated
+        videos += extract_videos(nodeset)
+      else
+        videos += extract_videos(nodeset, ucid)
+      end
+    else
+      break
+    end
+  end
+
+  return videos, count
 end
@@ -104,21 +104,21 @@ def template_youtube_comments(comments)
 
     html += <<-END_HTML
     <div class="pure-g">
-      <div class="pure-u-2-24">
+      <div class="pure-u-4-24 pure-u-md-2-24">
         <img style="width:90%; padding-right:1em; padding-top:1em;" src="#{author_thumbnail}">
       </div>
-      <div class="pure-u-22-24">
+      <div class="pure-u-20-24 pure-u-md-22-24">
         <p>
-          <a href="javascript:void(0)" onclick="toggle(this)">[ - ]</a>
-          <i class="icon ion-ios-thumbs-up"></i> #{child["likeCount"]}
-          <b><a href="#{child["authorUrl"]}">#{child["author"]}</a></b>
-          - #{recode_date(Time.epoch(child["published"].as_i64))} ago
-        </p>
-        <div>
+          <b>
+            <a href="#{child["authorUrl"]}">#{child["author"]}</a>
+          </b>
           <p style="white-space:pre-wrap">#{child["contentHtml"]}</p>
-          #{replies_html}
-        </div>
-      </div>
+          #{recode_date(Time.epoch(child["published"].as_i64))} ago
+          |
+          <i class="icon ion-ios-thumbs-up"></i> #{child["likeCount"]}
+        </p>
+        #{replies_html}
+      </div>
     </div>
 END_HTML
 end
@@ -156,10 +156,10 @@ def template_reddit_comments(root)
 
     content = <<-END_HTML
     <p>
-      <a href="javascript:void(0)" onclick="toggle(this)">[ - ]</a>
-      <i class="icon ion-ios-thumbs-up"></i> #{score}
+      <a href="javascript:void(0)" onclick="toggle_parent(this)">[ - ]</a>
       <b><a href="https://www.reddit.com/user/#{author}">#{author}</a></b>
-      - #{recode_date(child.created_utc)} ago
+      #{score} points
+      #{recode_date(child.created_utc)} ago
     </p>
     <div>
     #{body_html}
@@ -2,6 +2,7 @@ class Config
   YAML.mapping({
     crawl_threads: Int32,
     channel_threads: Int32,
+    feed_threads: Int32,
     video_threads: Int32,
     db: NamedTuple(
       user: String,
@@ -14,6 +15,7 @@ class Config
     https_only: Bool?,
     hmac_key: String?,
     full_refresh: Bool,
+    geo_bypass: Bool,
   })
 end
 
@@ -93,6 +93,25 @@ def get_proxies(country_code = "US")
   return get_nova_proxies(country_code)
 end
 
+def filter_proxies(proxies)
+  proxies.select! do |proxy|
+    begin
+      client = HTTPClient.new(YT_URL)
+      client.read_timeout = 10.seconds
+      client.connect_timeout = 10.seconds
+
+      proxy = HTTPProxy.new(proxy_host: proxy[:ip], proxy_port: proxy[:port])
+      client.set_proxy(proxy)
+
+      client.head("/").status_code == 200
+    rescue ex
+      false
+    end
+  end
+
+  return proxies
+end
+
 def get_nova_proxies(country_code = "US")
   country_code = country_code.downcase
   client = HTTP::Client.new(URI.parse("https://www.proxynova.com"))
@@ -127,7 +146,7 @@ def get_nova_proxies(country_code = "US")
     proxies << {ip: ip, port: port, score: score}
   end
 
-  proxies = proxies.sort_by { |proxy| proxy[:score] }.reverse
+  # proxies = proxies.sort_by { |proxy| proxy[:score] }.reverse
   return proxies
 end
 
@@ -238,3 +238,9 @@ def write_var_int(value : Int)
 
   return bytes
 end
+
+def sha256(text)
+  digest = OpenSSL::Digest.new("SHA256")
+  digest << text
+  return digest.hexdigest
+end
@@ -104,6 +104,44 @@ def refresh_videos(db)
   end
 end
 
+def refresh_feeds(db, max_threads = 1)
+  max_channel = Channel(Int32).new
+
+  spawn do
+    max_threads = max_channel.receive
+    active_threads = 0
+    active_channel = Channel(Bool).new
+
+    loop do
+      db.query("SELECT email FROM users") do |rs|
+        rs.each do
+          email = rs.read(String)
+          view_name = "subscriptions_#{sha256(email)[0..7]}"
+
+          if active_threads >= max_threads
+            if active_channel.receive
+              active_threads -= 1
+            end
+          end
+
+          active_threads += 1
+          spawn do
+            begin
+              db.exec("REFRESH MATERIALIZED VIEW #{view_name}")
+            rescue ex
+              STDOUT << "REFRESH " << email << " : " << ex.message << "\n"
+            end
+
+            active_channel.send(true)
+          end
+        end
+      end
+    end
+  end
+
+  max_channel.send(max_threads)
+end
+
 def pull_top_videos(config, db)
   if config.dl_api_key
     DetectLanguage.configure do |dl_config|
@@ -156,39 +194,14 @@ def update_decrypt_function
 end
 
 def find_working_proxies(regions)
-  proxy_channel = Channel({String, Array({ip: String, port: Int32})}).new
-
-  regions.each do |region|
-    spawn do
-      loop do
-        begin
-          proxies = get_proxies(region).first(20)
-        rescue ex
-          next proxy_channel.send({region, Array({ip: String, port: Int32}).new})
-        end
-
-        proxies.select! do |proxy|
-          begin
-            client = HTTPClient.new(YT_URL)
-            client.read_timeout = 10.seconds
-            client.connect_timeout = 10.seconds
-
-            proxy = HTTPProxy.new(proxy_host: proxy[:ip], proxy_port: proxy[:port])
-            client.set_proxy(proxy)
-
-            client.get("/").status_code == 200
-          rescue ex
-            false
-          end
-        end
-        proxies = proxies.map { |proxy| {ip: proxy[:ip], port: proxy[:port]} }
-
-        proxy_channel.send({region, proxies})
-      end
+  loop do
+    regions.each do |region|
+      proxies = get_proxies(region).first(20)
+      proxies = proxies.map { |proxy| {ip: proxy[:ip], port: proxy[:port]} }
+      # proxies = filter_proxies(proxies)
+
+      yield region, proxies
+      Fiber.yield
     end
   end
-
-  loop do
-    yield proxy_channel.receive
-  end
 end
@@ -6,6 +6,7 @@ class MixVideo
     ucid: String,
     length_seconds: Int32,
     index: Int32,
+    mixes: Array(String),
   })
 end
 
@@ -34,6 +35,10 @@ def fetch_mix(rdid, video_id, cookies = nil)
     raise "Could not create mix."
   end
 
+  if !yt_data["contents"]["twoColumnWatchNextResults"]["playlist"]?
+    raise "Could not create mix."
+  end
+
   playlist = yt_data["contents"]["twoColumnWatchNextResults"]["playlist"]["playlist"]
   mix_title = playlist["title"].as_s
 
@@ -59,7 +64,8 @@ def fetch_mix(rdid, video_id, cookies = nil)
       author,
       ucid,
       length_seconds,
-      index
+      index,
+      [rdid]
     )
   end
 
@@ -72,3 +78,37 @@ def fetch_mix(rdid, video_id, cookies = nil)
   videos = videos.first(50)
   return Mix.new(mix_title, rdid, videos)
 end
+
+def template_mix(mix)
+  html = <<-END_HTML
+  <h3>
+    <a href="/mix?list=#{mix["mixId"]}">
+      #{mix["title"]}
+    </a>
+  </h3>
+  <div class="pure-menu pure-menu-scrollable playlist-restricted">
+    <ol class="pure-menu-list">
+  END_HTML
+
+  mix["videos"].as_a.each do |video|
+    html += <<-END_HTML
+      <li class="pure-menu-item">
+        <a href="/watch?v=#{video["videoId"]}&list=#{mix["mixId"]}">
+          <img style="width:100%;" src="/vi/#{video["videoId"]}/mqdefault.jpg">
+          <p style="width:100%">#{video["title"]}</p>
+          <p>
+            <b style="width: 100%">#{video["author"]}</b>
+          </p>
+        </a>
+      </li>
+    END_HTML
+  end
+
+  html += <<-END_HTML
+    </ol>
+  </div>
+  <hr>
+  END_HTML
+
+  html
+end
@@ -26,11 +26,23 @@ class Playlist
   })
 end
 
-def fetch_playlist_videos(plid, page, video_count)
+def fetch_playlist_videos(plid, page, video_count, continuation = nil)
   client = make_client(YT_URL)
 
-  if video_count > 100
+  if continuation
+    html = client.get("/watch?v=#{continuation}&list=#{plid}&bpctr=#{Time.new.epoch + 2000}&gl=US&hl=en&disable_polymer=1")
+    html = XML.parse_html(html.body)
+
+    index = html.xpath_node(%q(//span[@id="playlist-current-index"])).try &.content.to_i?
+    if index
+      index -= 1
+    end
+    index ||= 0
+  else
     index = (page - 1) * 100
+  end
+
+  if video_count > 100
     url = produce_playlist_url(plid, index)
 
     response = client.get(url)

@@ -53,6 +65,11 @@ def fetch_playlist_videos(plid, page, video_count)
     nodeset = document.xpath_nodes(%q(.//tr[contains(@class, "pl-video")]))
 
     videos = extract_playlist(plid, nodeset, 0)
+    if continuation
+      until videos[0].id == continuation
+        videos.shift
+      end
+    end
   end
 end
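A self-contained sketch (not part of the diff) of the continuation handling added above: leading entries are dropped until the requested video sits at the front of the list. The ids below are hypothetical:

```crystal
videos = ["aaa", "bbb", "ccc", "ddd"]
continuation = "ccc"

if continuation
  # Drop videos until the continuation id is first, mirroring the diff above.
  until videos[0] == continuation
    videos.shift
  end
end

p videos # => ["ccc", "ddd"]
```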
@@ -199,3 +216,37 @@ def fetch_playlist(plid)
 
   return playlist
 end
+
+def template_playlist(playlist)
+  html = <<-END_HTML
+  <h3>
+    <a href="/playlist?list=#{playlist["playlistId"]}">
+      #{playlist["title"]}
+    </a>
+  </h3>
+  <div class="pure-menu pure-menu-scrollable playlist-restricted">
+    <ol class="pure-menu-list">
+  END_HTML
+
+  playlist["videos"].as_a.each do |video|
+    html += <<-END_HTML
+      <li class="pure-menu-item">
+        <a href="/watch?v=#{video["videoId"]}&list=#{playlist["playlistId"]}">
+          <img style="width:100%;" src="/vi/#{video["videoId"]}/mqdefault.jpg">
+          <p style="width:100%">#{video["title"]}</p>
+          <p>
+            <b style="width: 100%">#{video["author"]}</b>
+          </p>
+        </a>
+      </li>
+    END_HTML
+  end
+
+  html += <<-END_HTML
+    </ol>
+  </div>
+  <hr>
+  END_HTML
+
+  html
+end
@@ -119,6 +119,15 @@ def get_user(sid, client, headers, db, refresh = true)
 
       db.exec("INSERT INTO users VALUES (#{args}) \
       ON CONFLICT (email) DO UPDATE SET id = users.id || $1, updated = $2, subscriptions = $4", user_array)
+
+      begin
+        view_name = "subscriptions_#{sha256(user.email)[0..7]}"
+        PG_DB.exec("CREATE MATERIALIZED VIEW #{view_name} AS \
+        SELECT * FROM channel_videos WHERE \
+        ucid = ANY ((SELECT subscriptions FROM users WHERE email = '#{user.email}')::text[]) \
+        ORDER BY published DESC;")
+      rescue ex
+      end
     end
   else
     user = fetch_user(sid, client, headers, db)

@@ -129,6 +138,15 @@ def get_user(sid, client, headers, db, refresh = true)
 
     db.exec("INSERT INTO users VALUES (#{args}) \
     ON CONFLICT (email) DO UPDATE SET id = users.id || $1, updated = $2, subscriptions = $4", user_array)
+
+    begin
+      view_name = "subscriptions_#{sha256(user.email)[0..7]}"
+      PG_DB.exec("CREATE MATERIALIZED VIEW #{view_name} AS \
+      SELECT * FROM channel_videos WHERE \
+      ucid = ANY ((SELECT subscriptions FROM users WHERE email = '#{user.email}')::text[]) \
+      ORDER BY published DESC;")
+    rescue ex
+    end
   end
 
   return user
@@ -456,7 +456,9 @@ class Video
     is_family_friendly: Bool,
     genre: String,
     genre_url: String,
-    license: {
+    license: String,
+    sub_count_text: String,
+    author_thumbnail: {
       type: String,
       default: "",
     },

@@ -477,6 +479,9 @@ class CaptionName
   )
 end
 
+class VideoRedirect < Exception
+end
+
 def get_video(id, db, proxies = {} of String => Array({ip: String, port: Int32}), refresh = true)
   if db.query_one?("SELECT EXISTS (SELECT true FROM videos WHERE id = $1)", id, as: Bool)
     video = db.query_one("SELECT * FROM videos WHERE id = $1", id, as: Video)

@@ -490,8 +495,8 @@ def get_video(id, db, proxies = {} of String => Array({ip: String, port: Int32})
       args = arg_array(video_array[1..-1], 2)
 
       db.exec("UPDATE videos SET (info,updated,title,views,likes,dislikes,wilson_score,\
-      published,description,language,author,ucid, allowed_regions, is_family_friendly,\
-      genre, genre_url, license)\
+      published,description,language,author,ucid,allowed_regions,is_family_friendly,\
+      genre,genre_url,license,sub_count_text,author_thumbnail)\
       = (#{args}) WHERE id = $1", video_array)
     rescue ex
       db.exec("DELETE FROM videos * WHERE id = $1", id)

@@ -511,14 +516,18 @@ def get_video(id, db, proxies = {} of String => Array({ip: String, port: Int32})
 end
 
 def fetch_video(id, proxies)
-  html_channel = Channel(XML::Node).new
+  html_channel = Channel(XML::Node | String).new
   info_channel = Channel(HTTP::Params).new
 
   spawn do
     client = make_client(YT_URL)
     html = client.get("/watch?v=#{id}&bpctr=#{Time.new.epoch + 2000}&gl=US&hl=en&disable_polymer=1")
-    html = XML.parse_html(html.body)
+
+    if md = html.headers["location"]?.try &.match(/v=(?<id>[a-zA-Z0-9_-]{11})/)
+      next html_channel.send(md["id"])
+    end
+
+    html = XML.parse_html(html.body)
     html_channel.send(html)
   end

@@ -536,6 +545,11 @@ def fetch_video(id, proxies)
   end
 
   html = html_channel.receive
+  if html.as?(String)
+    raise VideoRedirect.new("#{html.as(String)}")
+  end
+  html = html.as(XML::Node)
+
   info = info_channel.receive
 
   if info["reason"]? && info["reason"].includes? "your country"

@@ -543,6 +557,10 @@ def fetch_video(id, proxies)
 
     proxies.each do |region, list|
       spawn do
+        info = HTTP::Params.new({
+          "reason" => [info["reason"]],
+        })
+
         list.each do |proxy|
           begin
             client = HTTPClient.new(YT_URL)

@@ -555,14 +573,16 @@ def fetch_video(id, proxies)
             info = HTTP::Params.parse(client.get("/get_video_info?video_id=#{id}&ps=default&eurl=&gl=US&hl=en&disable_polymer=1").body)
             if !info["reason"]?
               bypass_channel.send(proxy)
-            else
-              bypass_channel.send(nil)
+              break
             end
-
-            break
           rescue ex
           end
         end
 
+        # If none of the proxies we tried returned a valid response
+        if info["reason"]?
+          bypass_channel.send(nil)
+        end
       end
     end

@@ -641,11 +661,25 @@ def fetch_video(id, proxies)
   if license
     license = license.content
   else
-    license ||= ""
+    license = ""
+  end
+
+  sub_count_text = html.xpath_node(%q(//span[contains(@class, "yt-subscriber-count")]))
+  if sub_count_text
+    sub_count_text = sub_count_text["title"]
+  else
+    sub_count_text = "0"
+  end
+
+  author_thumbnail = html.xpath_node(%(//img[@alt="#{author}"]))
+  if author_thumbnail
+    author_thumbnail = author_thumbnail["data-thumb"]
+  else
+    author_thumbnail = ""
   end
 
   video = Video.new(id, info, Time.now, title, views, likes, dislikes, wilson_score, published, description,
-    nil, author, ucid, allowed_regions, is_family_friendly, genre, genre_url, license)
+    nil, author, ucid, allowed_regions, is_family_friendly, genre, genre_url, license, sub_count_text, author_thumbnail)
 
   return video
 end
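A self-contained sketch (not part of the diff) of the `xpath_node` + fallback pattern used above for `sub_count_text` and `author_thumbnail`; the HTML snippet is made up purely for illustration:

```crystal
require "xml"

# Hypothetical fragment standing in for a scraped watch page.
html = XML.parse_html(%(<span class="yt-subscriber-count" title="1,234">1.2K subscribers</span>))

sub_count_text = html.xpath_node(%q(//span[contains(@class, "yt-subscriber-count")]))
if sub_count_text
  sub_count_text = sub_count_text["title"]
else
  sub_count_text = "0"
end

puts sub_count_text # => 1,234
```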
@@ -13,23 +13,32 @@
     </div>
   </div>
 
-  <p class="h-box">
+  <div class="h-box">
     <% if user %>
       <% if subscriptions.includes? ucid %>
-        <a href="/subscription_ajax?action_remove_subscriptions=1&c=<%= ucid %>&referer=<%= env.get("current_page") %>">
-          <b>Unsubscribe from <%= author %></b>
-        </a>
+        <p>
+          <a id="subscribe" onclick="unsubscribe()" class="pure-button pure-button-primary"
+            href="/subscription_ajax?action_remove_subscriptions=1&c=<%= ucid %>&referer=<%= env.get("current_page") %>">
+            <b>Unsubscribe from <%= author %> <%= number_with_separator(sub_count) %></b>
+          </a>
+        </p>
      <% else %>
-        <a href="/subscription_ajax?action_create_subscription_to_channel=1&c=<%= ucid %>&referer=<%= env.get("current_page") %>">
-          <b>Subscribe to <%= author %></b>
-        </a>
+        <p>
+          <a id="subscribe" onclick="subscribe()" class="pure-button pure-button-primary"
+            href="/subscription_ajax?action_create_subscription_to_channel=1&c=<%= ucid %>&referer=<%= env.get("current_page") %>">
+            <b>Subscribe to <%= author %> <%= number_with_separator(sub_count) %></b>
+          </a>
+        </p>
      <% end %>
    <% else %>
-      <a href="/login?referer=<%= env.get("current_page") %>">
-        <b>Login to subscribe to <%= author %></b>
-      </a>
+      <p>
+        <a id="subscribe" class="pure-button pure-button-primary"
+          href="/login?referer=<%= env.get("current_page") %>">
+          <b>Login to subscribe to <%= author %></b>
+        </a>
+      </p>
    <% end %>
-  </p>
+  </div>
 
   <p class="h-box">
     <a href="https://www.youtube.com/channel/<%= ucid %>">View channel on YouTube</a>

@@ -51,8 +60,50 @@
     </div>
     <div class="pure-u-1 pure-u-md-3-5"></div>
     <div style="text-align:right;" class="pure-u-1 pure-u-md-1-5">
-      <% if videos.size == 60 %>
+      <% if count == 60 %>
         <a href="/channel/<%= ucid %>?page=<%= page + 1 %>">Next page</a>
      <% end %>
    </div>
 </div>
+
+<script>
+document.getElementById("subscribe")["href"] = "javascript:void(0);"
+
+function subscribe() {
+  var url = "/subscription_ajax?action_create_subscription_to_channel=1&c=<%= ucid %>&referer=<%= env.get("current_page") %>";
+  var xhr = new XMLHttpRequest();
+  xhr.responseType = "json";
+  xhr.timeout = 20000;
+  xhr.open("GET", url, true);
+  xhr.send();
+
+  xhr.onreadystatechange = function() {
+    if (xhr.readyState == 4) {
+      if (xhr.status == 200) {
+        subscribe_button = document.getElementById("subscribe");
+        subscribe_button.onclick = unsubscribe;
+        subscribe_button.innerHTML = '<b>Unsubscribe from <%= author %> <%= number_with_separator(sub_count + 1) %></b>'
+      }
+    }
+  }
+}
+
+function unsubscribe() {
+  var url = "/subscription_ajax?action_remove_subscriptions=1&c=<%= ucid %>&referer=<%= env.get("current_page") %>";
+  var xhr = new XMLHttpRequest();
+  xhr.responseType = "json";
+  xhr.timeout = 20000;
+  xhr.open("GET", url, true);
+  xhr.send();
+
+  xhr.onreadystatechange = function() {
+    if (xhr.readyState == 4) {
+      if (xhr.status == 200) {
+        subscribe_button = document.getElementById("subscribe");
+        subscribe_button.onclick = subscribe;
+        subscribe_button.innerHTML = '<b>Subscribe to <%= author %> <%= number_with_separator(sub_count) %></b>'
+      }
+    }
+  }
+}
+</script>
@@ -32,7 +32,7 @@
       <p><%= number_with_separator(item.video_count) %> videos</p>
       <p>PLAYLIST</p>
     <% when MixVideo %>
-      <a style="width:100%;" href="/watch?v=<%= item.id %>">
+      <a style="width:100%;" href="/watch?v=<%= item.id %>&list=<%= item.mixes[0] %>">
         <% if env.get?("user") && env.get("user").as(User).preferences.thin_mode %>
         <% else %>
           <img style="width:100%;" src="/vi/<%= item.id %>/mqdefault.jpg"/>
@@ -13,13 +13,13 @@
 <div class="pure-g h-box">
   <div class="pure-u-1 pure-u-md-1-5">
     <% if page >= 2 %>
-      <a href="/search?q=<%= query %>&page=<%= page - 1 %>">Previous page</a>
+      <a href="/search?q=<%= HTML.escape(query.not_nil!) %>&page=<%= page - 1 %>">Previous page</a>
    <% end %>
   </div>
   <div class="pure-u-1 pure-u-md-3-5"></div>
   <div style="text-align:right;" class="pure-u-1 pure-u-md-1-5">
     <% if count >= 20 %>
-      <a href="/search?q=<%= query %>&page=<%= page + 1 %>">Next page</a>
+      <a href="/search?q=<%= HTML.escape(query.not_nil!) %>&page=<%= page + 1 %>">Next page</a>
    <% end %>
   </div>
 </div>
@@ -22,6 +22,7 @@
   <meta name="twitter:player" content="<%= host_url %>/embed/<%= video.id %>">
   <meta name="twitter:player:width" content="1280">
   <meta name="twitter:player:height" content="720">
+  <script src="/js/watch.js"></script>
   <%= rendered "components/player_sources" %>
   <title><%= HTML.escape(video.title) %> - Invidious</title>
 <% end %>

@@ -91,20 +92,23 @@
 <% if user %>
   <% if subscriptions.includes? video.ucid %>
     <p>
-      <a href="/subscription_ajax?action_remove_subscriptions=1&c=<%= video.ucid %>&referer=<%= env.get("current_page") %>">
-        <b>Unsubscribe from <%= video.author %></b>
+      <a id="subscribe" onclick="unsubscribe()" class="pure-button pure-button-primary"
+        href="/subscription_ajax?action_remove_subscriptions=1&c=<%= video.ucid %>&referer=<%= env.get("current_page") %>">
+        <b>Unsubscribe from <%= video.author %> <%= video.sub_count_text %></b>
      </a>
    </p>
  <% else %>
    <p>
-      <a href="/subscription_ajax?action_create_subscription_to_channel=1&c=<%= video.ucid %>&referer=<%= env.get("current_page") %>">
-        <b>Subscribe to <%= video.author %></b>
+      <a id="subscribe" onclick="subscribe()" class="pure-button pure-button-primary"
+        href="/subscription_ajax?action_create_subscription_to_channel=1&c=<%= video.ucid %>&referer=<%= env.get("current_page") %>">
+        <b>Subscribe to <%= video.author %> <%= video.sub_count_text %></b>
      </a>
    </p>
  <% end %>
 <% else %>
   <p>
-    <a href="/login?referer=<%= env.get("current_page") %>">
+    <a id="subscribe" class="pure-button pure-button-primary"
+      href="/login?referer=<%= env.get("current_page") %>">
       <b>Login to subscribe to <%= video.author %></b>
     </a>
   </p>

@@ -117,11 +121,15 @@
             </div>
             <hr>
             <div id="comments">
-              <h3><center class="loading"><i class="icon ion-ios-refresh"></i></center></h3>
             </div>
         </div>
     </div>
     <div class="pure-u-1 pure-u-md-1-5">
+        <% if plid %>
+            <div id="playlist" class="h-box">
+            </div>
+        <% end %>
+
         <% if !preferences || preferences && preferences.related_videos %>
         <div class="h-box">
             <% rvs.each do |rv| %>

@@ -144,38 +152,118 @@
 </div>
 
 <script>
-function toggle(target) {
-  body = target.parentNode.parentNode.children[1];
-  if (body.style.display === null || body.style.display === "") {
-    target.innerHTML = "[ + ]";
-    body.style.display = "none";
-  } else {
-    target.innerHTML = "[ - ]";
-    body.style.display = "";
-  }
+subscribe_button = document.getElementById("subscribe");
+if (subscribe_button.getAttribute('onclick')) {
+  subscribe_button["href"] = "javascript:void(0);";
 }
 
-function toggle_comments(target) {
-  body = target.parentNode.parentNode.parentNode.children[1];
-  if (body.style.display === null || body.style.display === "") {
-    target.innerHTML = "[ + ]";
-    body.style.display = "none";
-  } else {
-    target.innerHTML = "[ - ]";
-    body.style.display = "";
-  }
+function subscribe() {
+  var url = "/subscription_ajax?action_create_subscription_to_channel=1&c=<%= video.ucid %>&referer=<%= env.get("current_page") %>";
+  var xhr = new XMLHttpRequest();
+  xhr.responseType = "json";
+  xhr.timeout = 20000;
+  xhr.open("GET", url, true);
+  xhr.send();
+
+  xhr.onreadystatechange = function() {
+    if (xhr.readyState == 4) {
+      if (xhr.status == 200) {
+        subscribe_button = document.getElementById("subscribe");
+        subscribe_button.onclick = unsubscribe;
+        subscribe_button.innerHTML = '<b>Unsubscribe from <%= video.author %> <%= video.sub_count_text %></b>'
+      }
+    }
+  }
 }
 
-function get_youtube_replies(target) {
-  var continuation = target.getAttribute("data-continuation");
-
-  var body = target.parentNode.parentNode;
-  var fallback = body.innerHTML;
-  body.innerHTML =
+function unsubscribe() {
+  var url = "/subscription_ajax?action_remove_subscriptions=1&c=<%= video.ucid %>&referer=<%= env.get("current_page") %>";
+  var xhr = new XMLHttpRequest();
+  xhr.responseType = "json";
+  xhr.timeout = 20000;
+  xhr.open("GET", url, true);
+  xhr.send();
+
+  xhr.onreadystatechange = function() {
+    if (xhr.readyState == 4) {
+      if (xhr.status == 200) {
+        subscribe_button = document.getElementById("subscribe");
+        subscribe_button.onclick = subscribe;
+        subscribe_button.innerHTML = '<b>Subscribe to <%= video.author %> <%= video.sub_count_text %></b>'
+      }
+    }
+  }
+}
+
+<% if plid %>
+function get_playlist() {
+  playlist = document.getElementById("playlist");
+  playlist.innerHTML = ' \
+    <h3><center class="loading"><i class="icon ion-ios-refresh"></i></center></h3> \
+    <hr>'
+
+  var plid = "<%= plid %>"
+
+  if (plid.startsWith("RD")) {
+    var plid_url = "/api/v1/mixes/<%= plid %>?continuation=<%= video.id %>&format=html";
+  } else {
+    var plid_url = "/api/v1/playlists/<%= plid %>?continuation=<%= video.id %>&format=html";
+  }
+
+  var xhr = new XMLHttpRequest();
+  xhr.responseType = "json";
+  xhr.timeout = 20000;
+  xhr.open("GET", plid_url, true);
+  xhr.send();
+
+  xhr.onreadystatechange = function() {
+    if (xhr.readyState == 4) {
+      if (xhr.status == 200) {
+        playlist.innerHTML = xhr.response.playlistHtml;
+
+        if (xhr.response.nextVideo) {
+          player.on('ended', function() {
+            window.location.replace("/watch?v="
+              + xhr.response.nextVideo
+              + "&list=<%= plid %>"
+              <% if params[:listen] %>
+              + "&listen=1"
+              <% end %>
+              <% if params[:autoplay] %>
+              + "&autoplay=1"
+              <% end %>
+              <% if params[:speed] %>
+              + "&speed=<%= params[:speed] %>"
+              <% end %>
+            );
+          });
+        }
+      } else {
+        playlist.innerHTML = "";
+      }
+    }
+  };
+
+  xhr.ontimeout = function() {
+    console.log("Pulling playlist timed out.");
+
+    comments = document.getElementById("playlist");
+    comments.innerHTML =
+      '<h3><center class="loading"><i class="icon ion-ios-refresh"></i></center></h3><hr>';
+    get_playlist();
+  };
+}
+
+get_playlist();
+<% end %>
+
+function get_reddit_comments() {
+  comments = document.getElementById("comments");
+  var fallback = comments.innerHTML;
+  comments.innerHTML =
     '<h3><center class="loading"><i class="icon ion-ios-refresh"></i></center></h3>';
 
-  var url =
-    "/api/v1/comments/<%= video.id %>?format=html&continuation=" + continuation;
+  var url = "/api/v1/comments/<%= video.id %>?source=reddit&format=html";
   var xhr = new XMLHttpRequest();
   xhr.responseType = "json";
   xhr.timeout = 20000;

@@ -185,38 +273,19 @@ function get_youtube_replies(target) {
   xhr.onreadystatechange = function() {
     if (xhr.readyState == 4) {
       if (xhr.status == 200) {
-        body.innerHTML = xhr.response.contentHtml;
-      } else {
-        body.innerHTML = fallback;
-      }
-    }
-  };
-
-  xhr.ontimeout = function() {
-    console.log("Pulling comments timed out.");
-
-    body.innerHTML = fallback;
-  };
-}
-
-function get_reddit_comments() {
-  var url = "/api/v1/comments/<%= video.id %>?source=reddit&format=html";
-  var xhr = new XMLHttpRequest();
-  xhr.responseType = "json";
-  xhr.timeout = 20000;
-  xhr.open("GET", url, true);
-  xhr.send();
-
-  xhr.onreadystatechange = function() {
-    if (xhr.readyState == 4)
-      if (xhr.status == 200) {
-        comments = document.getElementById("comments");
         comments.innerHTML = ' \
         <div> \
           <h3> \
             <a href="javascript:void(0)" onclick="toggle_comments(this)">[ - ]</a> \
             {title} \
          </h3> \
+          <p> \
+            <b> \
+              <a href="javascript:void(0)" onclick="swap_comments(\'youtube\')"> \
+                View YouTube comments \
+              </a> \
+            </b> \
+          </p> \
          <b> \
            <a rel="noopener" target="_blank" href="https://reddit.com{permalink}">View more comments on Reddit</a> \
          </b> \

@@ -231,10 +300,10 @@ function get_reddit_comments() {
       <% if preferences && preferences.comments[1] == "youtube" %>
         get_youtube_comments();
      <% else %>
-        comments = document.getElementById("comments");
-        comments.innerHTML = "";
+        comments.innerHTML = fallback;
      <% end %>
      }
+    }
   };
 
   xhr.ontimeout = function() {

@@ -245,6 +314,11 @@ function get_reddit_comments() {
 }
 
 function get_youtube_comments() {
+  comments = document.getElementById("comments");
+  var fallback = comments.innerHTML;
+  comments.innerHTML =
+    '<h3><center class="loading"><i class="icon ion-ios-refresh"></i></center></h3>';
+
   var url = "/api/v1/comments/<%= video.id %>?format=html";
   var xhr = new XMLHttpRequest();
   xhr.responseType = "json";

@@ -253,9 +327,8 @@ function get_youtube_comments() {
   xhr.send();
 
   xhr.onreadystatechange = function() {
-    if (xhr.readyState == 4)
+    if (xhr.readyState == 4) {
       if (xhr.status == 200) {
-        comments = document.getElementById("comments");
         if (xhr.response.commentCount > 0) {
           comments.innerHTML = ' \
           <div> \

@@ -263,6 +336,11 @@ function get_youtube_comments() {
             <a href="javascript:void(0)" onclick="toggle_comments(this)">[ - ]</a> \
             View {commentCount} comments \
          </h3> \
+          <b> \
+            <a href="javascript:void(0)" onclick="swap_comments(\'reddit\')"> \
+              View Reddit comments \
+            </a> \
+          </b> \
         </div> \
         <div>{contentHtml}</div> \
         <hr>'.supplant({

@@ -276,35 +354,59 @@ function get_youtube_comments() {
       <% if preferences && preferences.comments[1] == "youtube" %>
         get_youtube_comments();
      <% else %>
-        comments = document.getElementById("comments");
         comments.innerHTML = "";
      <% end %>
      }
+    }
   };
 
   xhr.ontimeout = function() {
     console.log("Pulling comments timed out.");
 
-    comments = document.getElementById("comments");
     comments.innerHTML =
       '<h3><center class="loading"><i class="icon ion-ios-refresh"></i></center></h3>';
     get_youtube_comments();
   };
 }
 
-function commaSeparateNumber(val){
-  while (/(\d+)(\d{3})/.test(val.toString())){
-    val = val.toString().replace(/(\d+)(\d{3})/, '$1'+','+'$2');
-  }
-  return val;
-}
-
-String.prototype.supplant = function(o) {
-  return this.replace(/{([^{}]*)}/g, function(a, b) {
-    var r = o[b];
-    return typeof r === "string" || typeof r === "number" ? r : a;
-  });
-};
+function get_youtube_replies(target) {
+  var continuation = target.getAttribute('data-continuation');
+
+  var body = target.parentNode.parentNode;
+  var fallback = body.innerHTML;
+  body.innerHTML =
+    '<h3><center class="loading"><i class="icon ion-ios-refresh"></i></center></h3>';
+
+  var url = '/api/v1/comments/<%= video.id %>?format=html&continuation=' +
+    continuation;
+  var xhr = new XMLHttpRequest();
+  xhr.responseType = 'json';
+  xhr.timeout = 20000;
+  xhr.open('GET', url, true);
+  xhr.send();
+
+  xhr.onreadystatechange = function() {
+    if (xhr.readyState == 4) {
+      if (xhr.status == 200) {
+        body.innerHTML = ' \
+        <p><a href="javascript:void(0)" \
+          onclick="hide_youtube_replies(this)">Hide replies \
+        </a></p> \
+        <div>{contentHtml}</div>'.supplant({
+          contentHtml: xhr.response.contentHtml,
+        });
+      } else {
+        body.innerHTML = fallback;
+      }
+    }
+  };
+
+  xhr.ontimeout = function() {
+    console.log('Pulling comments timed out.');
+
+    body.innerHTML = fallback;
+  };
+}
 
 <% if preferences %>
   <% if preferences.comments[0] == "youtube" %>