Cannot display more than ~60000 records

Issue #209 resolved
Former user created an issue

Originally reported on Google Code with ID 209

What steps will reproduce the problem?
1. Go to any table with more than about 60000 records,
example: phpliteadmin.php?table=BIGdata&action=table_sql

2. Execute a query whose result set exceeds about 60000 records,
example: SELECT * FROM "BIGdata" LIMIT 70000

What is the expected output? 
A full result set,
or an error saying there are too many records to display,
or some kind of auto-limit showing the number of records that can be displayed.

What do you see instead?
An empty result set (screenshot attached)

What version of the product are you using? On what operating system? 
phpLiteAdmin v1.9.4.1, running on Ubuntu 12.04.2 LTS (GNU/Linux 3.5.0-26-generic)

Database Extension (PDO/SQLiteDatabase/SQLiteDatabase3 - see Database
structure-tab in phpLiteAdmin)?
 SQLite version: 3.7.9
 SQLite extension [?]: PDO
 PHP version: 5.3.10-1ubuntu3.6

Please provide any additional information below.

Reported by maciej.naumienko on 2013-04-05 14:30:25

Attachment: Snap 2013-04-05 at 16.25.25.png

Comments (9)

  1. Christopher Kramer
    Thanks for your report.
    Does the same problem occur when you use the "browse" tab?
    | What is the expected output? 
    | A full results set 
    Well, your browser would probably get into trouble if we gave it a 70,000-row table.
    I think we either need to introduce pagination here (which means adjusting / adding
    the LIMIT clause), or we should simply check how many rows are returned and say "no,
    use a smaller limit".
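    phpLiteAdmin itself is PHP, but the pagination idea can be sketched in a few lines
    of Python (the `paginate` helper is purely illustrative, not part of phpLiteAdmin;
    it also naively assumes the query has no LIMIT clause of its own):

    ```python
    # Hypothetical pagination helper: append a LIMIT/OFFSET clause so at
    # most page_size rows are fetched per page. Illustrative only; a real
    # implementation would have to handle queries that already contain LIMIT.
    def paginate(query: str, page: int, page_size: int = 100) -> str:
        offset = page * page_size
        return f"{query} LIMIT {page_size} OFFSET {offset}"

    print(paginate('SELECT * FROM "BIGdata"', 2))
    # → SELECT * FROM "BIGdata" LIMIT 100 OFFSET 200
    ```

    With page links in the UI, the browser only ever renders one page of rows at a time.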
    But besides the browser problem, we can still improve this a lot: we currently fetch
    the whole result set into memory ($result = $db->selectArray($query[$i], "assoc");).
    This is what makes your page appear blank: PHP runs out of memory and ends execution.
    If we iterated through the result set instead, we could give you the full table. But as
    I said, your browser will likely get into trouble with a table that huge.
    IE6 would not display the table until it is fully loaded, which means you would probably
    wait for ages until Windows says "Internet Explorer is not responding".
    I have not tried it, though.
    It should help to start a new <table> every few hundred rows to work around problems
    like this.
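    The memory difference described above can be sketched with Python's stdlib sqlite3
    module (again a sketch of the idea, not phpLiteAdmin's actual PHP code): fetching
    everything at once materializes the full result set in RAM, whereas iterating the
    cursor pulls one row at a time.

    ```python
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE BIGdata (id INTEGER PRIMARY KEY, val TEXT)")
    db.executemany("INSERT INTO BIGdata (val) VALUES (?)",
                   [(str(i),) for i in range(70000)])

    # Memory-hungry approach (what selectArray() effectively does):
    # rows = db.execute("SELECT * FROM BIGdata").fetchall()  # 70k rows in RAM at once

    # Iterator approach: the cursor yields one row per step, so memory
    # use stays roughly constant no matter how large the table is.
    count = 0
    for row in db.execute("SELECT * FROM BIGdata"):
        count += 1  # in phpLiteAdmin this is where a row would be rendered

    print(count)  # → 70000
    ```

    Rendering each row as it arrives (and splitting output into a new <table> every few
    hundred rows) would keep both the server and the browser responsive.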

    Reported by crazy4chrissi on 2013-04-05 16:05:28

  2. Former user Account Deleted
    You are welcome. Great product, by the way.
    The same issue occurs when using the "browse" tab; here it emerges at somewhere
    around 50,000 records.

    Reported by maciej.naumienko on 2013-04-08 11:20:24

  3. Former user Account Deleted
    I got the same problem here: inserting 88K rows, I got no result, the page stopped
    responding, and the DB was left with only half the data inserted. I finally got the
    data into the DB using server-side code locally. Maybe it's a PHP timeout problem?

    Reported by nanozero87 on 2013-08-30 07:47:02

  4. Christopher Kramer
    How did you insert the 88K rows? Using an import (CSV/SQL)? Using the SQL tab? Using
    the insert feature (probably not ;) )?
    It could be a PHP timeout or an out-of-memory error.

    Reported by crazy4chrissi on 2013-08-30 09:05:22

  5. Christopher Kramer
    I guess this issue relates to the same root cause as issue #78.
    We often load full result sets into memory, which won't work with big result sets.
    We should switch to an iterator-based approach to avoid these problems.

    Reported by crazy4chrissi on 2014-01-15 10:09:00 - Status changed: Accepted - Labels added: Target-1.9.6

  6. Christopher Kramer

    With revision dd5a9a9, I changed the way the "Browse" feature fetches the result rows. It now fetches them one by one and thus needs far less memory and won't run into an out-of-memory error any more. I tried displaying a multi-million-row table, and Firefox repeatedly crashed :D So if you really want to display millions of rows in the browser, go to Mozilla's bug tracker next ;-)

    Currently, this is only solved for the "Browse" tab, I will do the "SQL" tab soon.

    You can test the newest development version here:

  7. phpLiteAdmin repo owner

    The SQL tab (in both the table and database views) now also fetches rows one by one instead of loading all rows into memory. This allows displaying far more rows than before. The number of rows and the time consumed are now shown below the result, as they are not known when the header is printed. This, together with revision dd5a9a9, fixes issue #209.

    → <<cset cee0a27b7da6>>
