Discussions » Creation requests

request: show youtube ratings thumbs up and down in search results

D N
§
Posted: 08/02/2021

I can't seem to find it. Can't it be done?

§
Posted: 08/02/2021

It can probably be done easily. It will obviously take some time to load and will make a lot of network requests to YouTube, but I don't think that's a problem, and you/your IP won't be banned just for making multiple requests.

§
Posted: 09/02/2021

Looks easy, but it's really not

§
Posted: 09/02/2021

@Konf

Really?
Wouldn't it just take a JS script that runs on the search page, with a for loop that gets all the search result links, fetches each one, then extracts and displays the thumbs-up count for every result on the search page?
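A minimal sketch of that loop, assuming an `a#video-title` selector for result links and a `"likeCount"` field in the embedded JSON of a video page; both are guesses about YouTube's current markup and would need checking against the live page:

```javascript
// Sketch only: selector and "likeCount" field are assumptions about
// YouTube's markup, not confirmed API details.
function extractLikeCount(html) {
  // look for a "likeCount" entry in the page's embedded JSON
  const match = html.match(/"likeCount"\s*:\s*"?(\d+)/);
  return match ? Number(match[1]) : null;
}

async function annotateResults() {
  // hypothetical selector for search-result video links
  const links = document.querySelectorAll('a#video-title');
  for (const link of links) {
    const res = await fetch(link.href); // same-origin on youtube.com
    const html = await res.text();
    const likes = extractLikeCount(html);
    if (likes !== null) {
      // show the count next to the result title
      link.insertAdjacentText('afterend', ` 👍 ${likes}`);
    }
  }
}
```

Fetching the pages one at a time like this is slow but gentle on the server; the requests could also be issued in parallel batches.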

§
Posted: 09/02/2021
Edited: 09/02/2021

@hacker09

Yeah, I made a mistake; that method is OK. When I answered the first time I checked the size of a random YouTube video page and it was ~800 KB, so I disliked that idea (and another one too...). But I forgot about the compression that websites now actively use when transferring data, such as gzip. ~200 KB is way better. Don't forget to use proper request headers.

§
Posted: 09/02/2021

@Konf

Thanks for the reply. If I were to do this I probably wouldn't add any headers; I would just call fetch() and parse the response as HTML.

I didn't know it was possible to fetch a website using compression...

In the country I live in there's no metered internet; internet is unlimited.

§
Posted: 10/02/2021

@hacker09

It's not just about traffic limits, it's also about load speed. Imagine you are scrolling down YouTube with the script running and it has taken up all the bandwidth. Now you must wait while everything you already scrolled past finishes loading before it can begin loading what you actually need.

To get around that you can cancel the old loads, or always preload only the closest stuff above and below the viewport.

§
Posted: 10/02/2021

@Konf

Oh, that's true, but what do you mean by "cancel old loads"? Do you mean that I should remove the thumbs-up counts that the script added previously?
I don't think that fewer than 50 fetch requests and added thumbs would make the page too slow. What the script could do is exclude from the next batch of fetch requests the result links it has already fetched; that would stop the program from "looping" over the same requests.
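The "exclude already-fetched links" idea could be as simple as a Set of seen URLs; a sketch (names are illustrative):

```javascript
// Remember which video URLs were already handled, so repeated scans of
// the (growing) result list don't refetch them.
const seen = new Set();

function linksToFetch(allLinks) {
  // keep only URLs we haven't processed yet
  const fresh = allLinks.filter(url => !seen.has(url));
  fresh.forEach(url => seen.add(url));
  return fresh;
}
```

Each time new results are appended by infinite scrolling, passing the full link list through `linksToFetch` returns only the new ones.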

§
Posted: 10/02/2021

I mean cancelling downloads that haven't finished and have been scrolled far enough out of view.

§
Posted: 10/02/2021
Edited: 10/02/2021

@Konf

Nice, I didn't know that I could cancel fetch requests.

I think what you said can be accomplished if I use a setTimeout of 2 seconds to get all the result links in the user's view, and then, if the first visible result link isn't the same as it was 2 seconds ago, use AbortController() to cancel that fetch request. If that's what you meant.
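A sketch of the cancellation side, with one AbortController per in-flight request (the `inflight` map and function names are illustrative, not from any existing script):

```javascript
// Track one AbortController per in-flight request, so requests for
// results scrolled far out of view can be cancelled.
const inflight = new Map(); // url -> AbortController

function startLoad(url) {
  const controller = new AbortController();
  inflight.set(url, controller);
  return fetch(url, { signal: controller.signal })
    .then(res => res.text())
    .finally(() => inflight.delete(url)); // clean up whether it finished or aborted
}

function cancelLoad(url) {
  const controller = inflight.get(url);
  if (controller) controller.abort(); // rejects the pending fetch with an AbortError
}
```

An aborted fetch rejects with an `AbortError`, so the caller should catch that case and simply drop the result.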

§
Posted: 10/02/2021

That is a bad way... What I think would be way better (pseudocode):

const videoPreviews = {
  waiting: { /* 1234: node, ... */ }, // keyed by coordinate, waiting for rating processing
  loading: {},
  done: {}
};

// simple but bad example: the list grows with scrolling, so getting should be dynamic...
const videoPreviewsArr = document.querySelectorAll('.video-preview');

videoPreviewsArr.forEach(el => {
  const coordinates = el.getBoundingClientRect(); // real method

  videoPreviews.waiting[coordinates.top] = el;
});

document.addEventListener('scroll', () => {
  const userViewportMiddle = window.scrollY + (window.innerHeight / 2);

  const y = getClosestCoordinate({ to: userViewportMiddle, from: videoPreviews.waiting });

  const nodeToLoad = videoPreviews.waiting[y];

  move(nodeToLoad, { from: videoPreviews.waiting, to: videoPreviews.loading });

  beginLoading(nodeToLoad)
    .then(data => {
      move(data.node, { from: videoPreviews.loading, to: videoPreviews.done });
      showLikes(data.node, data.payload);
    });

  // and so on...
});

§
Posted: 10/02/2021
Edited: 10/02/2021

Nice!

That's very interesting...
I think what I was describing was getBoundingClientRect(); I just didn't know it existed yet...

I once had that problem with my "endless mal" script.
In that script a search page has 50 results, and I wrote a for loop over those elements to add an onclick listener to every single one of them. But MAL is like YT: it has infinite scrolling, so every time the user scrolled down, the number of results just grew, and the for loop kept re-running over everything on each new load, making the whole browser slow. I fixed the bug like this.

(pseudocode)
async function attachClickListeners() {
  // fetch the next search results page...

  // count of results that were already on the page (and already handled)
  var TotalSearchResults = AlreadyProcessedResults.length;

  var Actual_TotalSearchResults_OnThePage = document.querySelectorAll('.results_element');

  // loop only over the newly appended results
  for (var i = TotalSearchResults; i < Actual_TotalSearchResults_OnThePage.length; i++) {
    Actual_TotalSearchResults_OnThePage[i].setAttribute("target", "fancybox-frame");
    Actual_TotalSearchResults_OnThePage[i].onclick = function () {
      // do stuff
    };
  }
}

At least it worked...
