Topic: order:random

Posted under General

Just a suggestion, I think it would be nice if there was an order:random search that shuffled everything differently every time you used it. That way, for example, you could view your favorites in a totally random order (rather than by id or whatnot.)

Updated by luvdaporn

i personally wouldn't have any use for it because i only have like 5 favorites, but this would be very nice to have for those people with 1000-plus favorites. great idea

Updated by anonymous

It is too computationally expensive to order results randomly; databases don't work like that.
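For reference, most SQL engines do support this directly (`ORDER BY RANDOM()` in SQLite and PostgreSQL, `ORDER BY RAND()` in MySQL), but the cost concern is real: the database evaluates a random value for every row and sorts the entire result set before the LIMIT applies. A minimal SQLite sketch:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO posts (id) VALUES (?)",
                 [(i,) for i in range(1, 101)])

# ORDER BY RANDOM() works, but it computes RANDOM() for every row and
# sorts the whole result set before LIMIT is applied.
rows = conn.execute(
    "SELECT id FROM posts ORDER BY RANDOM() LIMIT 10").fetchall()
```

On a table with millions of rows that is a full scan and sort on every query, which is exactly the scaling problem being described.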

Updated by anonymous

It would be a nice function though, not just for favorites. I sometimes just type in a random page number when I view all posts, just so I can see stuff I haven't before. An option to make it automatically do that would be nice, but it's nothing I'll lose sleep over if it doesn't happen.

Updated by anonymous

Metal_Fox said:
It would be a nice function though, not just for favorites. I sometimes just type in a random page number when I view all posts, just so I can see stuff I haven't before. An option to make it automatically do that would be nice, but it's nothing I'll lose sleep over if it doesn't happen.

Well there is a "random" button for single posts.

Thanks for the response Kitsu.

Updated by anonymous

ktkr

Former Staff

How about a random page button like paheal's.

Updated by anonymous

Manually go through all your favorites, note down their post IDs in a txt file, and use random.org to select one.

Yes, I realize how long that may take. Better start now.

Updated by anonymous

RlctntFr said:
Well there is a "random" button for single posts.

That is only one post though. In order for the results to be randomized, every post in the result needs to be fetched from the database and then sorted. And since there is no limit on how many posts can be favorited, it simply does not scale.

Updated by anonymous

Kitsu~ said:
That is only one post though. In order for the results to be randomized, every post in the result needs to be fetched from the database and then sorted. And since there is no limit on how many posts can be favorited, it simply does not scale.

for ($i = 0; $i < 10; $i++)
    $picks[] = rand(0, $totalfav - 1);

pull those number from the order they are found in the database of favorites. Doesn't need to sort.
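A sketch of that idea in Python; `get_favorite_at()` is hypothetical, standing in for a one-row positional fetch:

```python
import random

def pick_random_offsets(total_favs, count=10):
    # Draw `count` random offsets into the user's favorites list.
    # Offsets are drawn with replacement, so duplicates are possible.
    return [random.randrange(total_favs) for _ in range(count)]

offsets = pick_random_offsets(1000)
# for offset in offsets:
#     show_image(get_favorite_at(offset))  # hypothetical per-row fetch
```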

Updated by anonymous

Aurali said:
for ($i = 0; $i < 10; $i++)
    $picks[] = rand(0, $totalfav - 1);

I don't know how PHP's RNG works here, but I believe it will occasionally give duplicates :)

Anyway, I can't really imagine much use for this random fav feature

Updated by anonymous

Jazz said:
I don't know how PHP's RNG works here, but I believe it will occasionally give duplicates :)

That's expected when you sample with replacement. Easy to fix, though:

$total_favorites = count_favorites();
$all_favorites = get_favorites();
for ($i = 0; $i < 10 && $i < $total_favorites; $i++) {
    $swap_index = rand($i, $total_favorites - 1);
    swap($all_favorites[$i], $all_favorites[$swap_index]);
}

for ($i = 0; $i < 10 && $i < $total_favorites; $i++) {
    show_image($all_favorites[$i]);
}

That could still be too memory-intensive, depending on what's in the "all_favorites" array. And that's assuming booru's code makes it that easy. I've learned not to have such faith.
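The same duplicate-free trick in Python, assuming the favorites have already been fetched as a list; this is the classic partial Fisher-Yates shuffle (Python's `random.sample` does essentially this internally):

```python
import random

def partial_shuffle(items, k):
    """Fisher-Yates stopped after k swaps: the first k elements are a
    uniform sample of `items` with no duplicates."""
    items = list(items)              # don't mutate the caller's sequence
    n = len(items)
    for i in range(min(k, n)):
        j = random.randrange(i, n)   # pick from the not-yet-fixed tail
        items[i], items[j] = items[j], items[i]
    return items[:min(k, n)]

page = partial_shuffle(range(100), 10)
```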

Updated by anonymous

import random
favorites = get_favorites()
random.shuffle(favorites)
for favorite in favorites: show_image(favorite)

Python FTW ;)

The problem is that every single favorite would need to be fetched from the database; this does not scale, since there is no limit on how many posts a user can favorite.

Currently e621 has 60k posts. If all of them were favorited, 60k rows would need to be fetched, mapped into objects, and then a list of 60k elements would have to be shuffled.

Very expensive, not to mention that the number of posts is ever-increasing.

Updated by anonymous

Kitsu~ said:
The problem is that every single favorite would need to be fetched from the database; this does not scale, since there is no limit on how many posts a user can favorite.

I think the problem is that if you want to shuffle all favorites and navigate the result somehow, the server has to keep the shuffled result. I don't know exactly, but I suppose all favorites are fetched anyway when you use a fav search, since you can still sort them, can't you?

Probably, if the e621 API is the same as or similar to the danbooru API, RlctntFr could use a shitload of clients to do that dirty job on the client side
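For what it's worth, one common workaround for "the server should keep the shuffle result" (an assumption on my part, not a description of how e621 works) is to seed the shuffle per session: each page request re-derives the same ordering, so only the seed needs to be stored.

```python
import random

def shuffled_page(ids, seed, page, per_page=10):
    # Re-derive the same shuffled order on every request from a
    # per-session seed; no server-side state beyond the seed itself.
    ids = list(ids)
    random.Random(seed).shuffle(ids)
    start = page * per_page
    return ids[start:start + per_page]

# The same seed yields stable, non-overlapping pages:
p0 = shuffled_page(range(100), seed=42, page=0)
p1 = shuffled_page(range(100), seed=42, page=1)
```

The full ID list still has to be loaded per request, but the shuffle itself becomes stateless and repeatable.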

Updated by anonymous

Is it too much to expect that a person just SAVES pictures they like instead?

Updated by anonymous

Kitsu~ said:
import random
favorites = get_favorites()
random.shuffle(favorites)
for favorite in favorites: show_image(favorite)

Python FTW ;)

The problem is that every single favorite would need to be fetched from the database; this does not scale, since there is no limit on how many posts a user can favorite.

Currently e621 has 60k posts. If all of them were favorited, 60k rows would need to be fetched, mapped into objects, and then a list of 60k elements would have to be shuffled.

Very expensive, not to mention that the number of posts is ever-increasing.

I love Python ~<3

Why map them? I don't understand the point of shuffling if you're gonna look through all the images anyway. Just shuffle a new set every time they query it.

Updated by anonymous

Aurali said:
I love Python ~<3

Why map them? I don't understand the point of shuffling if you're gonna look through all the images anyway. Just shuffle a new set every time they query it.

At the very least, you'd need to retrieve the potentially 60k ID numbers from the posts that the person has favorited, even if you pick a more optimal shuffling routine that only grabs 1 page worth of random IDs from the person's favorites list. Reason being that you need the list of IDs to choose from since not all will be present, and it's not easy to query a database for the Nth, Pth, and Qth record from a given query without either making the database duplicate work or grabbing everything up front anyways.

Come to think of it, that means retrieving up to 60k more-or-less randomly positioned rows from within a table of potentially hundreds of thousands, if not millions, of records, depending on the size of the user base and the average size of any user's favorites list. This would be a very bad thing for server performance, just from an I/O perspective. Never mind the actual randomization of the list; that seems like it would be trivial by comparison.
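A sketch of that "grab the IDs up front, then sample" flow, using an in-memory SQLite table purely for illustration (the schema here is made up):

```python
import random
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE favorites (user_id INTEGER, post_id INTEGER)")
conn.executemany("INSERT INTO favorites VALUES (1, ?)",
                 [(i,) for i in range(1, 60001)])

# Step 1: fetch only the narrow ID column for this user.
ids = [row[0] for row in
       conn.execute("SELECT post_id FROM favorites WHERE user_id = ?", (1,))]

# Step 2: sample one page's worth in memory, duplicate-free.
page_ids = random.sample(ids, 10)

# Step 3: fetch full rows only for the sampled IDs.
placeholders = ",".join("?" * len(page_ids))
rows = conn.execute(
    f"SELECT post_id FROM favorites WHERE post_id IN ({placeholders})",
    page_ids).fetchall()
```

The I/O cost of step 1 is the part that grows with the size of the favorites list; steps 2 and 3 stay cheap.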

Just out of curiosity, does e621 at least cache its database tables in memory, or does it just go ahead and thrash against the disk? Can it even cache the tables in memory? I have no realistic idea just how "big" the database would be for the booru, especially versus the potential memory needs for caching frequently-accessed images, like the most recent 100 thumbnails or such.

Updated by anonymous

ikdind said:
and it's not easy to query a database for the Nth, Pth, and Qth record from a given query without either making the database duplicate work or grabbing everything up front anyways.

true. things having to do with searching a database do get quite intensive, even with the database's built-in rand function.

Updated by anonymous

i think a good random function would be set up like this:

rating:(s/q/e)
mustcontain:(enter tags)
scorebetween:(minimum)(maximum)
(optional)
maycontain:(enter tags)

then you click a button and posts would show up ordered by how many 'maycontain' tags were matched, score (increasing or decreasing), favs (increasing or decreasing), or any other way people would care to think of.
edit: if you want a truly random effect, don't add any maycontain tags.
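The ranking step of that proposal could be sketched like this (all names hypothetical):

```python
def rank_posts(posts, may_contain):
    # posts: list of dicts with a "tags" set; rank by how many optional
    # 'maycontain' tags each post matches, most matches first.
    def matches(post):
        return len(set(post["tags"]) & set(may_contain))
    return sorted(posts, key=matches, reverse=True)

posts = [
    {"id": 1, "tags": {"cat", "cute"}},
    {"id": 2, "tags": {"dog"}},
    {"id": 3, "tags": {"cat", "cute", "fluffy"}},
]
ranked = rank_posts(posts, {"cat", "cute", "fluffy"})
# → ids in order 3, 1, 2
```

With an empty `may_contain` set every post ties at zero matches, which is where a random tiebreak would give the "truly random" effect mentioned above.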

Updated by anonymous
