Dealing with very large datasets in MySQL (100,000+ records)

There is a MySQL limitation in how it returns paginated records. To return a page deep in the result set with LIMIT ... OFFSET, MySQL must scan and discard every row before the offset. On large tables this overhead adds up and can cause the datagrid to render slowly.
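A minimal sketch of the difference, using SQLite in place of MySQL (table and column names are illustrative): a deep OFFSET walks past every skipped row, while a keyset-style WHERE filter on an indexed column seeks directly to the page.

```python
import sqlite3

# In-memory stand-in for a large MySQL table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE my_table (seq INTEGER PRIMARY KEY, val TEXT)")
conn.executemany("INSERT INTO my_table VALUES (?, ?)",
                 [(i, f"row{i}") for i in range(1, 1001)])

# OFFSET pagination: the engine scans and discards the first 500 rows.
offset_page = conn.execute(
    "SELECT seq FROM my_table ORDER BY seq LIMIT 10 OFFSET 500").fetchall()

# Keyset pagination: the primary-key index seeks straight to seq > 500.
keyset_page = conn.execute(
    "SELECT seq FROM my_table WHERE seq > 500 ORDER BY seq LIMIT 10").fetchall()

# Both return the same page; the keyset form avoids the scan-and-discard work.
assert offset_page == keyset_page
```

The rows returned are identical; only the amount of work the server does differs, and that gap grows with the offset.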

phpGrid is optimized to handle large datasets of up to one million records. Beyond one million records, we highly recommend filtering in your SQL query. This can be done by applying a WHERE clause to the original SQL query, so that the data is filtered BEFORE pagination and only the paginated rows are returned to phpGrid for display.

For large tables with 1,000,000+ records, we recommend adding a sequence number column so that the beginning and ending row numbers of each page can be addressed directly in the WHERE clause.

e.g.
SELECT * FROM my_table WHERE seq > 100 AND seq <= 200
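A small sketch of how the sequence bounds for a given page might be computed (the helper name and the inclusive-upper-bound convention are illustrative, not part of phpGrid's API):

```python
# Hypothetical helper: translate a 1-based page number and page size into
# bounds for a clause of the form  WHERE seq > low AND seq <= high.
def seq_bounds(page, page_size):
    low = (page - 1) * page_size
    return low, low + page_size

# Page 2 with 100 rows per page covers seq 101..200.
low, high = seq_bounds(2, 100)
query = f"SELECT * FROM my_table WHERE seq > {low} AND seq <= {high}"
```

Because the bounds come from arithmetic rather than an OFFSET, the query cost stays flat no matter how deep the page is.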

Some excellent posts/articles on this issue:

Using sequence:
http://stackoverflow.com/questions/1243952/how-can-i-speed-up-a-mysql-query-with-a-large-offset-in-the-limit-clause

Using InnoDB engine:
http://stackoverflow.com/questions/5483489/how-to-increase-performance-of-limit-1-when-the-is-a-huge-number

Using “late row lookup”:
http://explainextended.com/2009/10/23/mysql-order-by-limit-performance-late-row-lookups/

We don’t support these techniques out of the box. They require an Enterprise (or higher) license because they involve source code modification, and each user may need a different implementation. We can provide consulting on the subject, as we have for a number of clients.