In such cases, when I have big queries, is it better to just paginate them, or can I just allow querying everything?
One more input regarding pagination: keep in mind that if the total set of data changes (frequently) over time, you need to account for that in your design.
For example, imagine an API that returns a list of books, where books can be added and removed. If you went for a simple page-based design, the next page might return a book you already received on the previous page, because entries shifted in the meantime.
In ICRC-7 we avoided this issue by implementing a prev/take approach: you optionally pass in the previous token you've received and get back the tokens that come after it. There are technical details you have to keep in mind here too, like the sorting order and handling the case where prev is a token that no longer exists.
So if your data mutates (frequently), the type of data and whether entries are only added, only removed, or both should all be kept in mind while designing pagination.
In particular, the same item being returned on more than one page is a common bug that breaks React frontends, which suddenly end up with a duplicate key in their lazily loaded list.
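Here is a minimal sketch of that prev/take idea (not the actual ICRC-7 code; the u64 token IDs and the pre-sorted slice are assumptions for illustration):

// Hypothetical cursor-based pagination over a sorted list of token IDs.
// prev: the last token the caller already has; take: max items to return.
fn tokens_page(sorted_ids: &[u64], prev: Option<u64>, take: usize) -> Vec<u64> {
    let start = match prev {
        // partition_point finds the first ID greater than prev, so this also
        // covers the case where prev was deleted in the meantime: we simply
        // continue after where it would have been in the sort order.
        Some(p) => sorted_ids.partition_point(|&id| id <= p),
        None => 0,
    };
    sorted_ids[start..].iter().take(take).copied().collect()
}

Because the cursor is the token itself rather than an offset, insertions and removals before the cursor don't shift the window, so the same token can't show up on two pages.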
Maybe this helps:
use candid::{CandidType, Deserialize};
use serde::Serialize;

use super::result::CanisterResult;

#[derive(CandidType, Debug, Serialize, Deserialize)]
pub struct PagedResponse<T> {
    pub page: usize,
    pub limit: usize,
    pub total: usize,
    pub number_of_pages: usize,
    pub data: Vec<T>,
}

impl<T: Clone> PagedResponse<T> {
    pub fn new(mut page: usize, mut limit: usize, data: Vec<T>) -> Self {
        let total = data.len();

        // Page numbering is 1-based; page 0, a zero limit, or an empty data
        // set yields an empty response (and avoids a division by zero below).
        if page == 0 || limit == 0 || total == 0 {
            return Self {
                page,
                limit,
                total,
                number_of_pages: 0,
                data: vec![],
            };
        }

        // A limit larger than the data set is clamped to the full set.
        if limit > total {
            limit = total;
        }

        let number_of_pages = (total as f32 / limit as f32).ceil() as usize;

        let mut start_limit = (page - 1) * limit;
        let mut end_limit = start_limit + limit;

        // Requests at or past the last page are clamped to the last page,
        // which may be only partially filled.
        if page >= number_of_pages {
            page = number_of_pages;
            start_limit = number_of_pages * limit - limit;
            end_limit = total;
        }

        Self {
            page,
            limit,
            total,
            number_of_pages,
            data: data[start_limit..end_limit].to_vec(),
        }
    }

    pub fn map<R: Clone>(&self, f: impl Fn(&T) -> R) -> PagedResponse<R> {
        PagedResponse {
            page: self.page,
            limit: self.limit,
            total: self.total,
            number_of_pages: self.number_of_pages,
            data: self.data.iter().map(f).collect(),
        }
    }

    pub fn into_result(self) -> CanisterResult<Self> {
        Ok(self)
    }
}
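Called from a query method it behaves like this (the book list is just placeholder data):

// Page 2 with 10 items per page out of 25 items total.
let books: Vec<String> = (1..=25).map(|i| format!("book-{i}")).collect();
let response = PagedResponse::new(2, 10, books);
assert_eq!(response.page, 2);
assert_eq!(response.number_of_pages, 3);
assert_eq!(response.data.len(), 10); // "book-11" through "book-20"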
I'm just doing something like this:
const { posts } = reduxSelect(state => state.socialFeed)
let load_more_posts = await backendActor.get_posts(posts.length, posts.length + 20)
pub fn get_pagination(start: usize, count: usize) -> Vec<PostUser> {
    POSTS.with(|posts| {
        let posts = posts.borrow();
        let total_posts = posts.len();

        // If start is beyond the total number of posts, return an empty vector.
        if start >= total_posts {
            return Vec::new();
        }

        // Clamp the count to the number of posts actually available.
        let actual_count = usize::min(count, total_posts - start);

        // Here you could enrich each post with its author, e.g. via
        // User::get_user_from_text_principal(user_principal.clone());
        // this sketch assumes POSTS already stores PostUser values.
        posts
            .iter()
            .skip(start)
            .take(actual_count)
            .cloned()
            .collect()
    })
}
flexible and lazy
Haha yeah that works as well, the only thing is that if you want nice pagination or infinite scroll you need a bit more data than Vec<PostUser>
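As a sketch of that, you could wrap the same lookup in the PagedResponse from above, so the frontend also gets total and number_of_pages for its infinite scroll (assuming the POSTS store and PostUser type from the previous snippet):

pub fn get_posts_paged(page: usize, limit: usize) -> PagedResponse<PostUser> {
    // Clones the whole store just to slice one page; fine for small sets,
    // for large ones you would slice first and fill in the metadata yourself.
    POSTS.with(|posts| PagedResponse::new(page, limit, posts.borrow().clone()))
}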