Merge branch 'current' into feature/snapchat

jj 2024-07-07 15:27:45 +02:00 committed by GitHub
commit c759c4539d
25 changed files with 462 additions and 203 deletions


@@ -0,0 +1,36 @@
---
name: main instance bug report
about: report an issue with cobalt.tools or api.cobalt.tools
title: '[short description of the bug]'
labels: main instance issue
assignees: ''
---
### bug description
clear and concise description of what the issue is.
### reproduction steps
steps to reproduce the described behavior.
here's an example of what it could look like:
1. go to '...'
2. click on '....'
3. download [media type] from [service]
4. see error
### screenshots
if applicable, add screenshots or screen recordings to support your explanation.
if not, remove this section.
### links
if applicable, add links that cause the issue. more = better.
if not, remove this section.
### platform information
- OS [e.g. iOS, windows]
- browser [e.g. chrome, safari, firefox]
- version [e.g. 115]
### additional context
add any other context about the problem here if applicable.
if not, remove this section.


@@ -1,32 +1,36 @@
 ---
 name: bug report
-about: report an issue with downloads or something else
-title: ''
+about: report a global issue with the cobalt codebase
+title: '[short description of the bug]'
 labels: bug
 assignees: ''
 ---
-**bug description**
-a clear and concise description of what the bug is.
+### bug description
+clear and concise description of what the issue is.
-**reproduction steps**
-steps to reproduce the behavior:
+### reproduction steps
+steps to reproduce the described behavior.
+here's an example of what it could look like:
 1. go to '...'
 2. click on '....'
-3. download this video: **[link here]**
+3. download [media type] from [service]
 4. see error
-**screenshots**
-if applicable, add screenshots or screen recordings to help explain your problem.
+### screenshots
+if applicable, add screenshots or screen recordings to support your explanation.
+if not, remove this section.
-**links**
+### links
 if applicable, add links that cause the issue. more = better.
+if not, remove this section.
-**platform**
+### platform information
 - OS [e.g. iOS, windows]
 - browser [e.g. chrome, safari, firefox]
 - version [e.g. 115]
-**additional context**
-add any other context about the problem here.
+### additional context
+add any other context about the problem here if applicable.
+if not, remove this section.


@@ -1,17 +1,15 @@
 ---
 name: feature request
 about: suggest a feature for cobalt
-title: ''
+title: '[short feature request description]'
 labels: feature request
 assignees: ''
 ---
-**describe the feature you'd like to see**
-a clear and concise description of what you want to happen.
+### describe the feature you'd like to see
+clear and concise description of the feature you want to see in cobalt.
-**describe alternatives you've considered**
-a clear and concise description of any alternative solutions or features you've considered.
-**additional context**
-add any other context or screenshots about the feature request here.
+### additional context
+if applicable, add any other context or screenshots related to the feature request here.
+if not, remove this section.

.github/ISSUE_TEMPLATE/hosting-help.md (new file)

@@ -0,0 +1,12 @@
---
name: instance hosting help
about: ask any question regarding cobalt instance hosting
title: '[short description of the problem]'
labels: instance hosting help
assignees: ''
---
### problem description
describe what issue you're having, clearly and concisely.
support your description with screenshots/links/etc when needed.


@@ -0,0 +1,18 @@
---
name: service request
about: request service support in cobalt
title: 'add support for [service name]'
labels: service request
assignees: ''
---
### service name & description
provide the service name and a brief description of what it is.
### link samples for the service you'd like cobalt to support
list of links that cobalt should recognize.
could be regular video link, shared video link, mobile video link, shortened link, etc.
### additional context
any additional context or screenshots should go here. if there aren't any, just remove this part.

.github/workflows/fast-forward.yml (new file)

@@ -0,0 +1,22 @@
name: fast-forward
on:
  issue_comment:
    types: [created, edited]

jobs:
  fast-forward:
    # Only run if the comment contains the /fast-forward command.
    if: ${{ contains(github.event.comment.body, '/fast-forward')
        && github.event.issue.pull_request }}
    runs-on: ubuntu-latest
    permissions:
      contents: write
      pull-requests: write
      issues: write

    steps:
      - name: Fast forwarding
        uses: sequoia-pgp/fast-forward@v1
        with:
          merge: true
          comment: 'on-error'

CONTRIBUTING.md (new file)

@@ -0,0 +1,39 @@
# contributing to cobalt
if you're reading this, you are probably interested in contributing to cobalt, which we are very thankful for :3
this document serves as a guide to help you make contributions that we can merge into the cobalt codebase.
## translations
currently, we are **not accepting** translations of cobalt. this is because we are making significant changes to the frontend, and the currently used localization structure is being completely reworked. if this changes, this document will be updated.
## adding features or support for services
before putting in the effort to implement a feature, it's worth considering whether it would be appropriate to add it to cobalt. the cobalt api is built to assist people **only with downloading freely accessible content**. other functionality, such as:
- downloading paid / not publicly accessible content
- downloading content protected by DRM
- scraping unrelated information & exposing it outside of file metadata
will not be reviewed or merged.
if you plan on adding a feature or support for a service, but are unsure whether it would be appropriate, it's best to open an issue and discuss it beforehand.
## git
when contributing code to cobalt, there are a few guidelines in place to ensure that the code history is readable and comprehensible.
### clean commit messages
internally, we use a format similar to [conventional commits](https://www.conventionalcommits.org/en/v1.0.0/) - the first part signifies which part of the code you are changing (the *scope*), and the second part explains the change. for inspiration on how to write appropriate commit titles, you can take a look at the [commit history](https://github.com/imputnet/cobalt/commits/).
the scope is not strictly defined; you can write whatever you find most fitting for the particular change. if you are changing a small piece of a larger area of the codebase, you can specify both the larger and smaller scopes in the commit message for clarity (e.g., if you were changing something in internal streams, the commit could be something like `stream/internal: fix object not being handled properly`).
if you think a change deserves further explanation, we encourage you to write a short explanation in the commit message ([example](https://github.com/imputnet/cobalt/commit/d2e5b6542f71f3809ba94d56c26f382b5cb62762)), which saves us the time of asking about the change and saves you the time of explaining it after the fact.
if your contribution has uninformative commit messages, you may be asked to interactively rebase your branch and amend each commit to include a meaningful message.
### clean commit history
if your branch is out of date and/or has some merge conflicts with the `current` branch, you should **rebase** it instead of merging. this prevents meaningless merge commits from being included in your branch, which would then end up in the cobalt git history.
if you find a mistake or bug in your code before it's merged or reviewed, instead of making a brand new commit to fix it, it would be preferable to amend that specific commit where the mistake was first introduced. this also helps us easily revert a commit if we discover that it introduced a bug or some unwanted behavior.
- if the commit you are fixing is the latest one, you can add your files to staging and then use `git commit --amend` to apply the change.
- if the commit is somewhere deeper in your branch, you can use `git commit --fixup=HASH`, where *`HASH`* is the commit you are fixing.
- afterward, you must interactively rebase your branch with `git rebase -i current --autosquash`.
this will open up an editor, but you don't need to do anything else except save the file and exit.
- once you do either of these things, you will need to do a **force push** to your branch with `git push --force-with-lease`.

package-lock.json

@@ -1,12 +1,12 @@
 {
   "name": "cobalt",
-  "version": "7.14.4",
+  "version": "7.14.5",
   "lockfileVersion": 3,
   "requires": true,
   "packages": {
     "": {
       "name": "cobalt",
-      "version": "7.14.4",
+      "version": "7.14.5",
       "license": "AGPL-3.0",
       "dependencies": {
         "content-disposition-header": "0.6.0",


@@ -1,7 +1,7 @@
 {
    "name": "cobalt",
    "description": "save what you love",
-   "version": "7.14.4",
+   "version": "7.14.5",
    "author": "imput",
    "exports": "./src/cobalt.js",
    "type": "module",


@@ -196,10 +196,10 @@ export function runAPI(express, app, gitCommit, gitBranch, __dirname) {
            return res.sendStatus(404);
        }

-       streamInfo.headers = {
-           ...streamInfo.headers,
-           ...req.headers
-       };
+       streamInfo.headers = new Map([
+           ...(streamInfo.headers || []),
+           ...Object.entries(req.headers)
+       ]);

        return stream(res, { type: 'internal', ...streamInfo });
    })
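The headers for an internal stream are now carried as a `Map` rather than a plain object, which keeps the per-stream state uniform regardless of where it came from; the generic proxy later turns it back into an object (see the stream/internal.js hunk below). A minimal sketch of that round trip, with made-up header values:

```js
// hypothetical values; spreading a Map yields [key, value] pairs,
// so it can be merged with Object.entries() of a plain object
const existing = new Map([["referer", "https://example.com/"]]);
const incoming = { "user-agent": "Mozilla/5.0", "range": "bytes=0-" };

const headers = new Map([
    ...(existing || []),
    ...Object.entries(incoming)
]);

// converted back to a plain object where the http client expects one
console.log(Object.fromEntries(headers));
// { referer: 'https://example.com/', 'user-agent': 'Mozilla/5.0', range: 'bytes=0-' }
```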


@@ -26,45 +26,46 @@ export async function runWeb(express, app, gitCommit, gitBranch, __dirname) {
    app.get('/onDemand', (req, res) => {
        try {
-           if (req.query.blockId) {
-               let blockId = req.query.blockId.slice(0, 3);
-               let blockData;
-               switch(blockId) {
-                   // changelog history
-                   case "0":
-                       let history = changelogHistory();
-                       if (history) {
-                           blockData = createResponse("success", { t: history })
-                       } else {
-                           blockData = createResponse("error", {
-                               t: "couldn't render this block, please try again!"
-                           })
-                       }
-                       break;
-                   // celebrations emoji
-                   case "1":
-                       let celebration = celebrationsEmoji();
-                       if (celebration) {
-                           blockData = createResponse("success", { t: celebration })
-                       }
-                       break;
-                   default:
-                       blockData = createResponse("error", {
-                           t: "couldn't find a block with this id"
-                       })
-                       break;
-               }
-               if (blockData?.body) {
-                   return res.status(blockData.status).json(blockData.body);
-               } else {
-                   return res.status(204).end();
-               }
-           } else {
+           if (typeof req.query.blockId !== 'string') {
                return res.status(400).json({
                    status: "error",
                    text: "couldn't render this block, please try again!"
                });
            }
+
+           let blockId = req.query.blockId.slice(0, 3);
+           let blockData;
+           switch(blockId) {
+               // changelog history
+               case "0":
+                   let history = changelogHistory();
+                   if (history) {
+                       blockData = createResponse("success", { t: history })
+                   } else {
+                       blockData = createResponse("error", {
+                           t: "couldn't render this block, please try again!"
+                       })
+                   }
+                   break;
+               // celebrations emoji
+               case "1":
+                   let celebration = celebrationsEmoji();
+                   if (celebration) {
+                       blockData = createResponse("success", { t: celebration })
+                   }
+                   break;
+               default:
+                   blockData = createResponse("error", {
+                       t: "couldn't find a block with this id"
+                   })
+                   break;
+           }
+
+           if (blockData?.body) {
+               return res.status(blockData.status).json(blockData.body);
+           } else {
+               return res.status(204).end();
+           }
        } catch {
            return res.status(400).json({
                status: "error",


@@ -24,7 +24,7 @@ export default function(r, host, userFormat, isAudioOnly, lang, isAudioMuted, di
    else if (r.isGif && toGif) action = "gif";
    else if (isAudioMuted) action = "muteVideo";
    else if (isAudioOnly) action = "audio";
-   else if (r.isM3U8) action = "singleM3U8";
+   else if (r.isM3U8) action = "m3u8";
    else action = "video";

    if (action === "picker" || action === "audio") {

@@ -48,13 +48,19 @@ export default function(r, host, userFormat, isAudioOnly, lang, isAudioMuted, di
            params = { type: "gif" }
            break;

-       case "singleM3U8":
-           params = { type: "remux" }
+       case "m3u8":
+           params = {
+               type: Array.isArray(r.urls) ? "render" : "remux"
+           }
            break;

        case "muteVideo":
+           let muteType = "mute";
+           if (Array.isArray(r.urls) && !r.isM3U8) {
+               muteType = "bridge";
+           }
            params = {
-               type: Array.isArray(r.urls) ? "bridge" : "mute",
+               type: muteType,
                u: Array.isArray(r.urls) ? r.urls[0] : r.urls,
                mute: true
            }
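Both new branches key off whether `r.urls` is an array, i.e. whether the extractor returned separate video and audio URLs. A minimal sketch of the resulting choices, under the assumption that a single URL can be remuxed or muted directly while a pair needs a full render (or, for muting, just the video half proxied through):

```js
// illustrative only: reproduces the branch logic above with sample inputs
const m3u8Params = (r) => ({
    type: Array.isArray(r.urls) ? "render" : "remux"
});

const muteParams = (r) => {
    let muteType = "mute";
    if (Array.isArray(r.urls) && !r.isM3U8) {
        muteType = "bridge";
    }
    return {
        type: muteType,
        u: Array.isArray(r.urls) ? r.urls[0] : r.urls,
        mute: true
    };
};

console.log(m3u8Params({ urls: "https://example.com/master.m3u8" }).type);           // "remux"
console.log(m3u8Params({ urls: ["video.m3u8", "audio.m3u8"] }).type);                 // "render"
console.log(muteParams({ urls: ["video.mp4", "audio.m4a"], isM3U8: false }).type);    // "bridge"
console.log(muteParams({ urls: "https://example.com/video.mp4", isM3U8: false }).type); // "mute"
```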


@@ -256,11 +256,11 @@ export default function(obj) {
    if (!media_id && cookie) media_id = await getMediaId(id, { cookie });

    // mobile api (bearer)
-   if (media_id && token) data = await requestMobileApi(id, { token });
+   if (media_id && token) data = await requestMobileApi(media_id, { token });

    // mobile api (no cookie, cookie)
-   if (!data && media_id) data = await requestMobileApi(id);
-   if (!data && media_id && cookie) data = await requestMobileApi(id, { cookie });
+   if (media_id && !data) data = await requestMobileApi(media_id);
+   if (media_id && cookie && !data) data = await requestMobileApi(media_id, { cookie });

    // html embed (no cookie, cookie)
    if (!data) data = await requestHTML(id);


@@ -20,14 +20,15 @@ export default async function(o) {
    }).then(r => r.text()).catch(() => {});

    if (!html) return { error: 'ErrorCouldntFetch' };

-   if (!html.includes(`<div data-module="OKVideo" data-options="{`)) {
+   let videoData = html.match(/<div data-module="OKVideo" .*? data-options="({.*?})"( .*?)>/)
+                       ?.[1]
+                       ?.replaceAll("&quot;", '"');
+
+   if (!videoData) {
        return { error: 'ErrorEmptyDownload' };
    }

-   let videoData = html.split(`<div data-module="OKVideo" data-options="`)[1]
-                       .split('" data-')[0]
-                       .replaceAll("&quot;", '"');
-
    videoData = JSON.parse(JSON.parse(videoData).flashvars.metadata);

    if (videoData.provider !== "UPLOADED_ODKL")
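The new version pulls the `data-options` payload out with a single regex instead of chained `split()` calls, so pages where the attribute order differs no longer break the parse. A minimal sketch against a hypothetical, heavily trimmed chunk of ok.ru markup (real pages contain far more attributes):

```js
// hypothetical sample markup; only the shape matters here
const html = `<div data-module="OKVideo" data-id="1" data-options="{&quot;flashvars&quot;:{&quot;metadata&quot;:&quot;{}&quot;}}" class="player"></div>`;

const videoData = html
    .match(/<div data-module="OKVideo" .*? data-options="({.*?})"( .*?)>/)
    ?.[1]
    ?.replaceAll("&quot;", '"');

// data-options is JSON whose flashvars.metadata field is itself a JSON string
const metadata = JSON.parse(JSON.parse(videoData).flashvars.metadata);
console.log(metadata); // {}
```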


@@ -1,6 +1,6 @@
 import { genericUserAgent } from "../../config.js";

-const videoRegex = /"url":"(https:\/\/v1.pinimg.com\/videos\/.*?)"/g;
+const videoRegex = /"url":"(https:\/\/v1\.pinimg\.com\/videos\/.*?)"/g;
 const imageRegex = /src="(https:\/\/i\.pinimg\.com\/.*\.(jpg|gif))"/g;

 export default async function(o) {


@@ -12,17 +12,19 @@ async function findClientID() {
        let scVersion = String(sc.match(/<script>window\.__sc_version="[0-9]{10}"<\/script>/)[0].match(/[0-9]{10}/));
        if (cachedID.version === scVersion) return cachedID.id;

        let scripts = sc.matchAll(/<script.+src="(.+)">/g);
        let clientid;
        for (let script of scripts) {
            let url = script[1];
-           if (url && !url.startsWith('https://a-v2.sndcdn.com')) return;
+
+           if (!url?.startsWith('https://a-v2.sndcdn.com/')) {
+               return;
+           }

            let scrf = await fetch(url).then(r => r.text()).catch(() => {});
            let id = scrf.match(/\("client_id=[A-Za-z0-9]{32}"\)/);

            if (id && typeof id[0] === 'string') {
                clientid = id[0].match(/[A-Za-z0-9]{32}/)[0];
                break;


@@ -1,109 +1,169 @@
 import { env } from "../../config.js";
-import { cleanString } from '../../sub/utils.js';
+import { cleanString, merge } from '../../sub/utils.js';
+import HLS from "hls-parser";

 const resolutionMatch = {
-    "3840": "2160",
-    "2732": "1440",
-    "2560": "1440",
-    "2048": "1080",
-    "1920": "1080",
-    "1366": "720",
-    "1280": "720",
-    "960": "480",
-    "640": "360",
-    "426": "240"
+    "3840": 2160,
+    "2732": 1440,
+    "2560": 1440,
+    "2048": 1080,
+    "1920": 1080,
+    "1366": 720,
+    "1280": 720,
+    "960": 480,
+    "640": 360,
+    "426": 240
 }

-const qualityMatch = {
-    "2160": "4K",
-    "1440": "2K",
-    "480": "540",
-
-    "4K": "2160",
-    "2K": "1440",
-    "540": "480"
-}
-
-export default async function(obj) {
-    let quality = obj.quality === "max" ? "9000" : obj.quality;
-    if (!quality || obj.isAudioOnly) quality = "9000";
-
-    const url = new URL(`https://player.vimeo.com/video/${obj.id}/config`);
-    if (obj.password) {
-        url.searchParams.set('h', obj.password);
-    }
-
-    let api = await fetch(url)
-        .then(r => r.json())
-        .catch(() => {});
-    if (!api) return { error: 'ErrorCouldntFetch' };
-
-    let downloadType = "dash";
-
-    if (!obj.isAudioOnly && JSON.stringify(api).includes('"progressive":[{'))
-        downloadType = "progressive";
-
-    let fileMetadata = {
-        title: cleanString(api.video.title.trim()),
-        artist: cleanString(api.video.owner.name.trim()),
-    }
-
-    if (downloadType !== "dash") {
-        if (qualityMatch[quality]) quality = qualityMatch[quality];
-        let all = api.request.files.progressive.sort((a, b) => Number(b.width) - Number(a.width));
-        let best = all[0];
-
-        let bestQuality = all[0].quality.split('p')[0];
-        if (qualityMatch[bestQuality]) {
-            bestQuality = qualityMatch[bestQuality]
-        }
-
-        if (Number(quality) < Number(bestQuality)) {
-            best = all.find(i => i.quality.split('p')[0] === quality);
-        }
-
-        if (!best) return { error: 'ErrorEmptyDownload' };
-
-        return {
-            urls: best.url,
-            audioFilename: `vimeo_${obj.id}_audio`,
-            filename: `vimeo_${obj.id}_${best.width}x${best.height}.mp4`
-        }
-    }
-
-    if (api.video.duration > env.durationLimit)
-        return { error: ['ErrorLengthLimit', env.durationLimit / 60] };
-
-    let masterJSONURL = api.request.files.dash.cdns.akfire_interconnect_quic.url;
-    let masterJSON = await fetch(masterJSONURL).then(r => r.json()).catch(() => {});
-
-    if (!masterJSON) return { error: 'ErrorCouldntFetch' };
-    if (!masterJSON.video) return { error: 'ErrorEmptyDownload' };
-
-    let masterJSON_Video = masterJSON.video
-        .sort((a, b) => Number(b.width) - Number(a.width))
-        .filter(a => ["dash", "mp42"].includes(a.format));
-
-    let bestVideo = masterJSON_Video[0];
-    if (Number(quality) < Number(resolutionMatch[bestVideo.width])) {
-        bestVideo = masterJSON_Video.find(i => resolutionMatch[i.width] === quality)
-    }
-
-    let masterM3U8 = `${masterJSONURL.split("/sep/")[0]}/sep/video/${bestVideo.id}/master.m3u8`;
-
-    const fallbackResolution = bestVideo.height > bestVideo.width ? bestVideo.width : bestVideo.height;
-
-    return {
-        urls: masterM3U8,
-        isM3U8: true,
-        fileMetadata: fileMetadata,
-        filenameAttributes: {
-            service: "vimeo",
-            id: obj.id,
-            title: fileMetadata.title,
-            author: fileMetadata.artist,
-            resolution: `${bestVideo.width}x${bestVideo.height}`,
-            qualityLabel: `${resolutionMatch[bestVideo.width] || fallbackResolution}p`,
-            extension: "mp4"
-        }
-    }
-}
+const requestApiInfo = (videoId, password) => {
+    if (password) {
+        videoId += `:${password}`
+    }
+
+    return fetch(
+        `https://api.vimeo.com/videos/${videoId}`,
+        {
+            headers: {
+                Accept: 'application/vnd.vimeo.*+json; version=3.4.2',
+                'User-Agent': 'Vimeo/10.19.0 (com.vimeo; build:101900.57.0; iOS 17.5.1) Alamofire/5.9.0 VimeoNetworking/5.0.0',
+                Authorization: 'Basic MTMxNzViY2Y0NDE0YTQ5YzhjZTc0YmU0NjVjNDQxYzNkYWVjOWRlOTpHKzRvMmgzVUh4UkxjdU5FRW80cDNDbDhDWGR5dVJLNUJZZ055dHBHTTB4V1VzaG41bEx1a2hiN0NWYWNUcldSSW53dzRUdFRYZlJEZmFoTTArOTBUZkJHS3R4V2llYU04Qnl1bERSWWxUdXRidjNqR2J4SHFpVmtFSUcyRktuQw==',
+                'Accept-Language': 'en'
+            }
+        }
+    )
+    .then(a => a.json())
+    .catch(() => {});
+}
+
+const compareQuality = (rendition, requestedQuality) => {
+    const quality = parseInt(rendition);
+    return Math.abs(quality - requestedQuality);
+}
+
+const getDirectLink = (data, quality) => {
+    if (!data.files) return;
+
+    const match = data.files
+        .filter(f => f.rendition?.endsWith('p'))
+        .reduce((prev, next) => {
+            const delta = {
+                prev: compareQuality(prev.rendition, quality),
+                next: compareQuality(next.rendition, quality)
+            };
+
+            return delta.prev < delta.next ? prev : next;
+        });
+
+    if (!match) return;
+
+    return {
+        urls: match.link,
+        filenameAttributes: {
+            resolution: `${match.width}x${match.height}`,
+            qualityLabel: match.rendition,
+            extension: "mp4"
+        }
+    }
+}
+
+const getHLS = async (configURL, obj) => {
+    if (!configURL) return;
+
+    const api = await fetch(configURL)
+        .then(r => r.json())
+        .catch(() => {});
+    if (!api) return { error: 'ErrorCouldntFetch' };
+
+    if (api.video?.duration > env.durationLimit) {
+        return { error: ['ErrorLengthLimit', env.durationLimit / 60] };
+    }
+
+    const urlMasterHLS = api.request?.files?.hls?.cdns?.akfire_interconnect_quic?.url;
+    if (!urlMasterHLS) return { error: 'ErrorCouldntFetch' }
+
+    const masterHLS = await fetch(urlMasterHLS)
+        .then(r => r.text())
+        .catch(() => {});
+
+    if (!masterHLS) return { error: 'ErrorCouldntFetch' };
+
+    const variants = HLS.parse(masterHLS)?.variants?.sort(
+        (a, b) => Number(b.bandwidth) - Number(a.bandwidth)
+    );
+    if (!variants || variants.length === 0) return { error: 'ErrorEmptyDownload' };
+
+    let bestQuality;
+
+    if (obj.quality < resolutionMatch[variants[0]?.resolution?.width]) {
+        bestQuality = variants.find(v =>
+            (obj.quality === resolutionMatch[v.resolution.width])
+        );
+    }
+
+    if (!bestQuality) bestQuality = variants[0];
+
+    const expandLink = (path) => {
+        return new URL(path, urlMasterHLS).toString();
+    };
+
+    let urls = expandLink(bestQuality.uri);
+
+    const audioPath = bestQuality?.audio[0]?.uri;
+    if (audioPath) {
+        urls = [
+            urls,
+            expandLink(audioPath)
+        ]
+    } else if (obj.isAudioOnly) {
+        return { error: 'ErrorEmptyDownload' };
+    }
+
+    return {
+        urls,
+        isM3U8: true,
+        filenameAttributes: {
+            resolution: `${bestQuality.resolution.width}x${bestQuality.resolution.height}`,
+            qualityLabel: `${resolutionMatch[bestQuality.resolution.width]}p`,
+            extension: "mp4"
+        }
+    }
+}
+
+export default async function(obj) {
+    let quality = obj.quality === "max" ? 9000 : Number(obj.quality);
+    if (quality < 240) quality = 240;
+    if (!quality || obj.isAudioOnly) quality = 9000;
+
+    const info = await requestApiInfo(obj.id, obj.password);
+    let response;
+
+    if (obj.isAudioOnly) {
+        response = await getHLS(info.config_url, { ...obj, quality });
+    }
+
+    if (!response) response = getDirectLink(info, quality);
+    if (!response) response = { error: 'ErrorEmptyDownload' };
+
+    if (response.error) {
+        return response;
+    }
+
+    const fileMetadata = {
+        title: cleanString(info.name),
+        artist: cleanString(info.user.name),
+    };
+
+    return merge(
+        {
+            fileMetadata,
+            filenameAttributes: {
+                service: "vimeo",
+                id: obj.id,
+                title: fileMetadata.title,
+                author: fileMetadata.artist,
+            }
+        },
+        response
+    );
+}
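The direct-file path now simply picks whichever progressive rendition is numerically closest to the requested quality, instead of translating through the old `qualityMatch` table. A minimal sketch of that selection with a hypothetical `files` array (real Vimeo API objects carry many more fields):

```js
// closest-rendition pick, mirroring compareQuality/getDirectLink above
const compareQuality = (rendition, requestedQuality) =>
    Math.abs(parseInt(rendition) - requestedQuality);

const pickRendition = (files, requestedQuality) =>
    files
        .filter(f => f.rendition?.endsWith('p'))   // keep fixed renditions like "720p"
        .reduce((prev, next) =>
            compareQuality(prev.rendition, requestedQuality) <
            compareQuality(next.rendition, requestedQuality) ? prev : next
        );

// hypothetical data
const files = [
    { rendition: "360p", link: "https://example.com/360.mp4" },
    { rendition: "720p", link: "https://example.com/720.mp4" },
    { rendition: "1080p", link: "https://example.com/1080.mp4" }
];

console.log(pickRendition(files, 9000).rendition); // "1080p" (the "max" case)
console.log(pickRendition(files, 480).rendition);  // "360p" (120 off, vs 240 for 720p)
```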


@@ -33,6 +33,7 @@
    "vk": {
        "alias": "vk video & clips",
        "patterns": ["video:userId_:videoId", "clip:userId_:videoId", "clips:duplicate?z=clip:userId_:videoId"],
+       "subdomains": ["m"],
        "enabled": true
    },
    "ok": {


@@ -23,6 +23,10 @@ function transformObject(streamInfo, hlsObject) {
        hlsObject.uri = createInternalStream(fullUrl.toString(), streamInfo);

+       if (hlsObject.map) {
+           hlsObject.map = transformObject(streamInfo, hlsObject.map);
+       }
+
        return hlsObject;
    }


@@ -1,7 +1,6 @@
 import { request } from 'undici';
 import { Readable } from 'node:stream';
-import { assert } from 'console';
-import { getHeaders, pipe } from './shared.js';
+import { closeRequest, getHeaders, pipe } from './shared.js';
 import { handleHlsPlaylist, isHlsRequest } from './internal-hls.js';

 const CHUNK_SIZE = BigInt(8e6); // 8 MB

@@ -27,7 +26,7 @@ async function* readChunks(streamInfo, size) {
        const received = BigInt(chunk.headers['content-length']);

        if (received < expected / 2n) {
-           streamInfo.controller.abort();
+           closeRequest(streamInfo.controller);
        }

        for await (const data of chunk.body) {

@@ -36,73 +35,88 @@ async function* readChunks(streamInfo, size) {
        read += received;
    }
 }

-function chunkedStream(streamInfo, size) {
-    assert(streamInfo.controller instanceof AbortController);
-    const stream = Readable.from(readChunks(streamInfo, size));
-    return stream;
-}
-
 async function handleYoutubeStream(streamInfo, res) {
+    const { signal } = streamInfo.controller;
+    const cleanup = () => (res.end(), closeRequest(streamInfo.controller));
+
     try {
        const req = await fetch(streamInfo.url, {
            headers: getHeaders('youtube'),
            method: 'HEAD',
            dispatcher: streamInfo.dispatcher,
-           signal: streamInfo.controller.signal
+           signal
        });

        streamInfo.url = req.url;
        const size = BigInt(req.headers.get('content-length'));

        if (req.status !== 200 || !size) {
-           return res.end();
+           return cleanup();
        }

-       const stream = chunkedStream(streamInfo, size);
+       const generator = readChunks(streamInfo, size);
+       const abortGenerator = () => {
+           generator.return();
+           signal.removeEventListener('abort', abortGenerator);
+       }
+       signal.addEventListener('abort', abortGenerator);
+
+       const stream = Readable.from(generator);

        for (const headerName of ['content-type', 'content-length']) {
            const headerValue = req.headers.get(headerName);
            if (headerValue) res.setHeader(headerName, headerValue);
        }

-       pipe(stream, res, () => res.end());
+       pipe(stream, res, cleanup);
    } catch {
-       res.end();
+       cleanup();
    }
 }

-export async function internalStream(streamInfo, res) {
-    if (streamInfo.service === 'youtube') {
-        return handleYoutubeStream(streamInfo, res);
-    }
-
+async function handleGenericStream(streamInfo, res) {
+    const { signal } = streamInfo.controller;
+    const cleanup = () => res.end();
+
     try {
        const req = await request(streamInfo.url, {
            headers: {
-               ...streamInfo.headers,
+               ...Object.fromEntries(streamInfo.headers),
                host: undefined
            },
            dispatcher: streamInfo.dispatcher,
-           signal: streamInfo.controller.signal,
+           signal,
            maxRedirections: 16
        });

        res.status(req.statusCode);
+       req.body.on('error', () => {});

        for (const [ name, value ] of Object.entries(req.headers))
            res.setHeader(name, value)

        if (req.statusCode < 200 || req.statusCode > 299)
-           return res.end();
+           return cleanup();

        if (isHlsRequest(req)) {
            await handleHlsPlaylist(streamInfo, req, res);
        } else {
-           pipe(req.body, res, () => res.end());
+           pipe(req.body, res, cleanup);
        }
    } catch {
-       streamInfo.controller.abort();
+       closeRequest(streamInfo.controller);
+       cleanup();
    }
 }
+
+export function internalStream(streamInfo, res) {
+    if (streamInfo.service === 'youtube') {
+        return handleYoutubeStream(streamInfo, res);
+    }
+
+    return handleGenericStream(streamInfo, res);
+}
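The youtube path no longer relies on a wrapper function plus an `assert`; instead the chunk generator's lifetime is tied directly to the abort signal, so an aborted request stops issuing further range requests. A toy sketch of that wiring (the generator here stands in for `readChunks`; all names are illustrative):

```js
import { Readable } from "node:stream";

// stand-in for readChunks(): produces data until it is finalized
async function* chunks() {
    try {
        for (let i = 0; ; i++) {
            yield Buffer.from(`chunk ${i}\n`);
            await new Promise(r => setTimeout(r, 100));
        }
    } finally {
        console.log("generator finalized"); // runs once return() is called
    }
}

const controller = new AbortController();
const generator = chunks();

const abortGenerator = () => {
    generator.return(); // stop the generator instead of leaving it dangling
    controller.signal.removeEventListener("abort", abortGenerator);
};
controller.signal.addEventListener("abort", abortGenerator);

const stream = Readable.from(generator);
stream.on("data", d => process.stdout.write(d));

// simulate the client disconnecting
setTimeout(() => controller.abort(), 350);
```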


@@ -1,10 +1,12 @@
 import NodeCache from "node-cache";
 import { randomBytes } from "crypto";
 import { nanoid } from "nanoid";
+import { setMaxListeners } from "node:events";

 import { decryptStream, encryptStream, generateHmac } from "../sub/crypto.js";
 import { env } from "../config.js";
 import { strict as assert } from "assert";
+import { closeRequest } from "./shared.js";

 // optional dependency
 const freebind = env.freebindCIDR && await import('freebind').catch(() => {});

@@ -78,16 +80,36 @@ export function createInternalStream(url, obj = {}) {
    }

    const streamID = nanoid();

+   let controller = obj.controller;
+   if (!controller) {
+       controller = new AbortController();
+       setMaxListeners(Infinity, controller.signal);
+   }
+
+   let headers;
+   if (obj.headers) {
+       headers = new Map(Object.entries(obj.headers));
+   }
+
    internalStreamCache[streamID] = {
        url,
        service: obj.service,
-       headers: obj.headers,
-       controller: new AbortController(),
+       headers,
+       controller,
        dispatcher
    };

    let streamLink = new URL('/api/istream', `http://127.0.0.1:${env.apiPort}`);
    streamLink.searchParams.set('id', streamID);

+   const cleanup = () => {
+       destroyInternalStream(streamLink);
+       controller.signal.removeEventListener('abort', cleanup);
+   }
+   controller.signal.addEventListener('abort', cleanup);
+
    return streamLink.toString();
 }

@@ -100,7 +122,7 @@ export function destroyInternalStream(url) {
    const id = url.searchParams.get('id');

    if (internalStreamCache[id]) {
-       internalStreamCache[id].controller.abort();
+       closeRequest(internalStreamCache[id].controller);
        delete internalStreamCache[id];
    }
 }
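A stream can now be registered against a caller-supplied `AbortController` instead of always getting a fresh one, and every registered stream hooks a cleanup listener onto that controller's signal. Presumably this lets a single abort tear down every derived internal stream (e.g. all entries of a rewritten HLS playlist) at once, with `setMaxListeners(Infinity, signal)` silencing the listener-count warning that many listeners on one signal would otherwise trigger. A minimal sketch of the pattern with a toy in-memory cache (not the real cache or stream bookkeeping):

```js
import { randomUUID } from "node:crypto";
import { setMaxListeners } from "node:events";

const cache = new Map();

function register(url, existingController) {
    let controller = existingController;
    if (!controller) {
        controller = new AbortController();
        // many streams may attach to one signal, so lift the listener cap
        setMaxListeners(Infinity, controller.signal);
    }

    const id = randomUUID();
    cache.set(id, { url, controller });

    const cleanup = () => {
        cache.delete(id);
        controller.signal.removeEventListener("abort", cleanup);
    };
    controller.signal.addEventListener("abort", cleanup);

    return controller;
}

const shared = new AbortController();
setMaxListeners(Infinity, shared.signal);

register("https://example.com/seg1.ts", shared);
register("https://example.com/seg2.ts", shared);

shared.abort();          // one abort...
console.log(cache.size); // ...removes both entries: 0
```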


@@ -16,6 +16,10 @@ const serviceHeaders = {
    }
 }

+export function closeRequest(controller) {
+    try { controller.abort() } catch {}
+}
+
 export function closeResponse(res) {
    if (!res.headersSent) {
        res.sendStatus(500);


@@ -6,7 +6,7 @@ import { create as contentDisposition } from "content-disposition-header";
 import { metadataManager } from "../sub/utils.js";
 import { destroyInternalStream } from "./manage.js";
 import { env, ffmpegArgs, hlsExceptions } from "../config.js";
-import { getHeaders, closeResponse, pipe } from "./shared.js";
+import { getHeaders, closeRequest, closeResponse, pipe } from "./shared.js";

 function toRawHeaders(headers) {
    return Object.entries(headers)

@@ -14,10 +14,6 @@ function toRawHeaders(headers) {
        .join('');
 }

-function closeRequest(controller) {
-    try { controller.abort() } catch {}
-}
-
 function killProcess(p) {
    // ask the process to terminate itself gracefully
    p?.kill('SIGTERM');

@@ -96,6 +92,10 @@ export function streamLiveRender(streamInfo, res) {
    args = args.concat(ffmpegArgs[format]);

+   if (hlsExceptions.includes(streamInfo.service)) {
+       args.push('-bsf:a', 'aac_adtstoasc')
+   }
+
    if (streamInfo.metadata) {
        args = args.concat(metadataManager(streamInfo.metadata))
    }


@@ -44,9 +44,24 @@ export function cleanHTML(html) {
    clean = clean.replace(/\n/g, '');
    return clean
 }

 export function getRedirectingURL(url) {
    return fetch(url, { redirect: 'manual' }).then((r) => {
        if ([301, 302, 303].includes(r.status) && r.headers.has('location'))
            return r.headers.get('location');
    }).catch(() => null);
 }
+
+export function merge(a, b) {
+    for (const k of Object.keys(b)) {
+        if (Array.isArray(b[k])) {
+            a[k] = [...(a[k] ?? []), ...b[k]];
+        } else if (typeof b[k] === 'object') {
+            a[k] = merge(a[k], b[k]);
+        } else {
+            a[k] = b[k];
+        }
+    }
+
+    return a;
+}
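`merge()` deep-merges plain objects and concatenates arrays, mutating and returning its first argument; the rewritten vimeo module uses it to layer the common filename attributes under whatever `getHLS`/`getDirectLink` produced. A small usage sketch (the import path mirrors the one the vimeo module uses; adjust it to wherever `utils.js` lives relative to your file):

```js
import { merge } from "../../sub/utils.js"; // hypothetical caller location

const base = {
    fileMetadata: { title: "some title", artist: "some artist" },
    filenameAttributes: { service: "vimeo", id: "123456789" }
};

const response = {
    urls: ["https://example.com/video.m3u8", "https://example.com/audio.m3u8"],
    isM3U8: true,
    filenameAttributes: { resolution: "1920x1080", qualityLabel: "1080p", extension: "mp4" }
};

const merged = merge(base, response); // note: `base` itself is mutated

console.log(merged.filenameAttributes);
// { service: 'vimeo', id: '123456789', resolution: '1920x1080',
//   qualityLabel: '1080p', extension: 'mp4' }
console.log(merged.urls.length, merged.isM3U8); // 2 true
```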


@@ -674,7 +674,7 @@
        "params": {},
        "expected": {
            "code": 200,
-           "status": "stream"
+           "status": "redirect"
        }
    }],
    "reddit": [{