When you visit a website like Facebook or Instagram, you are scrolling through what feels like an infinite feed of information. Have you ever wondered how it can load so quickly when there is so much data behind it?
This is done with pagination. On the initial load, the website only downloads the first page of data, usually the first N items. As you scroll down, it loads another N items for the second page, then the third page, and so on. By paginating, websites avoid loading unnecessary data and keep the initial load fast!
In this task, you are asked to implement a cursor-based pagination API over the given data. Cursor-based pagination handles cases like deletion and insertion elegantly.
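To see why cursors cope with deletions where offsets do not, here is a minimal sketch in Python. It is illustrative only: it assumes items are dicts sorted by a unique, increasing `id`, and the names `Page` and `fetch_page` are made up for this example, not part of the task's API.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Page:
    items: list                  # the items on this page
    next_cursor: Optional[int]   # id to pass back for the next page; None when exhausted

def fetch_page(data, cursor=None, limit=3):
    """Return up to `limit` items whose id is greater than `cursor`.

    Because the cursor is an item id rather than a positional offset,
    inserting or deleting items elsewhere never shifts page boundaries:
    the next page always starts strictly after the last id the client saw.
    """
    remaining = [x for x in data if cursor is None or x["id"] > cursor]
    page = remaining[:limit]
    # Only hand back a cursor if there are more items beyond this page.
    next_cursor = page[-1]["id"] if len(remaining) > limit else None
    return Page(page, next_cursor)
```

For example, after fetching the first page of `[1..7]` (items `1, 2, 3`, cursor `3`), deleting item `4` and fetching again with cursor `3` still cleanly yields `5, 6, 7`; an offset-based API would have skipped an item instead.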