I could not come up with any idea of how to seek directly to that 2000th row. I thought of computing it from the 'page no' and 'page-size' variables, e.g. 5 rows starting from id 300040, then 5 rows from 300070, and so on, but that won't work because there are many rows from other members in between.
Then I googled and found out that the LIMIT clause's OFFSET parameter is a big performance issue. My structure lets the user jump to the Nth page, which I want to keep, since only those pages have inbound links to articles.
So the only thing that comes to my mind is this: users usually don't go beyond a certain number of pages, and queries for earlier pages are not that slow. On top of that I have a caching system in place, which will cache each page on its first query run.
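One way to soften the OFFSET cost while still allowing "jump to page N" links is a deferred join: run the OFFSET over a narrow covering index first, then fetch the full rows only for the matching ids. This is a sketch only; the `posts` table and the `member_id`/`id` columns are assumed names, and it presumes an index on `(member_id, id)`:

```sql
-- Deferred join: scan only the (member_id, id) index for the OFFSET,
-- then join back to fetch the wide row data for just 5 rows.
SELECT p.*
FROM posts AS p
JOIN (
    SELECT id
    FROM posts
    WHERE member_id = 123
    ORDER BY id
    LIMIT 5 OFFSET 2000
) AS page USING (id)
ORDER BY p.id;
```

MySQL still walks 2005 index entries, but it avoids reading and discarding 2000 full rows, which is usually the expensive part.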
Mr. Ronan, what do you think could be done for better results?
In order to support this paginated design, the considerations I have are:
- Cache the results (obviously)
- Limit the number of posts a user can create per month (a couple of hundred)
- Avoid long navigation by increasing page size so that queries are run less often
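For sequential next/previous navigation (as opposed to jumping to an arbitrary page), keyset pagination avoids OFFSET entirely by remembering the last id seen on the previous page. A minimal sketch, again assuming hypothetical `posts`, `member_id`, and `id` names:

```sql
-- Keyset (seek) pagination: no OFFSET, cost is independent of page depth.
-- 300040 stands for the last id shown on the previous page.
SELECT id, title, created_at
FROM posts
WHERE member_id = 123
  AND id > 300040
ORDER BY id
LIMIT 5;
```

This works well alongside cached "page N" links: the deep-linked pages stay as they are, while forward/backward browsing uses the cheap seek query.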
Yup, I did this and the results are very reasonable. I also merged the year and month columns into one, e.g. '201211'.
Forgot to mention that I had to create this trigger for the merged 'yearmonth' column, since a DEFAULT value can only be a constant.
CREATE TRIGGER yearmonth
BEFORE INSERT ON posts  -- 'posts' stands in for the actual table name
FOR EACH ROW
SET NEW.yearmonth = DATE_FORMAT( CURRENT_TIMESTAMP , '%Y%m' );
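With the trigger filling in `yearmonth`, queries can constrain each page to a narrow slice of the data. A sketch of how that column might be used, assuming a composite index on `(member_id, yearmonth, id)` (index and column names are illustrative):

```sql
-- The yearmonth column (e.g. 201211) lets the optimizer use a tight
-- range on the composite index instead of scanning all of a member's rows.
SELECT id, title
FROM posts
WHERE member_id = 123
  AND yearmonth = 201211
ORDER BY id
LIMIT 5;
```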