OpenCart – Massive SQL Insert

There will be times when you use the OpenCart framework for your own web applications where the goal is not eCommerce or running an online store. In doing so you may find yourself needing to perform a massive INSERT query against your database, where importing via CSV or Excel is not an option because other things need to happen during the INSERT that a CSV import can't do. I wanted to write an article on how to properly INSERT large amounts of data into a MySQL table without bogging your system down, because done incorrectly you could be looking at hours upon hours of script time, which inevitably leads to browser/session issues and/or PHP/MySQL configuration problems. To be fair, it's never a great idea to perform massive data entry purely with a PHP function, but sometimes there is no way around it.

Step One – Script Execution Time

You are going to want to make sure that your execution time is set longer than the default. This can be done in the php.ini that ships with OpenCart. If you open php.ini you will see these lines:

magic_quotes_gpc = Off;
register_globals = Off;
default_charset	= UTF-8;
memory_limit = 64M;
max_execution_time = 36000;
upload_max_filesize = 999M;
safe_mode = Off;
mysql.connect_timeout = 20;
session.use_cookies = On;
session.use_trans_sid = Off;
session.gc_maxlifetime = 172800;
allow_url_fopen = on;
;display_errors = 1;
;error_reporting = E_ALL;

The three lines that you want to pay attention to are:

  • memory_limit = 64M;
  • max_execution_time = 36000;
  • session.gc_maxlifetime = 172800;

Once you have these set to values that suit the size of the script you will run, it's time to consider your PHP and how you will write it. For this particular demonstration we are going to INSERT 100,000 new rows of data into one table. The first time I tried to do this, I basically just created a loop that gathered all of the appropriate rows of data and attempted 100,000 INSERTs. This was a bad idea and never worked. It bogged the system down and took far too much time; my script was breaking about an hour in. I was also saving images to the server, which made it even more laborious. Let's get to the PHP. MySQL offers a couple of ways to perform INSERTs, and the correct way to INSERT this much data is to use ONE INSERT query only.


The type of INSERT we are going to be using will look like this:

INSERT INTO tbl_name (a,b,c) VALUES(1,2,3),(4,5,6),(7,8,9);

The only thing you have to do now is write the PHP code that loops all of the values into the INSERT. Believe it or not, doing an INSERT this way will cut the script execution time down by more than half.
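One caveat with a single giant statement: MySQL limits query size via `max_allowed_packet`, so with 100,000 rows you may want to split the VALUES list into a handful of batches rather than literally one query. Here is a sketch of that idea; the function name `build_batched_inserts`, the table `tbl_name`, and the batch size of 1000 are all assumptions for illustration:

```php
<?php
// Sketch: build multi-row INSERT statements in batches so that no single
// query grows beyond MySQL's max_allowed_packet. A batch size of 1000 is
// an assumption -- tune it to your row width.
function build_batched_inserts( array $rows, $batchSize = 1000 ) {
    $queries = array();
    foreach ( array_chunk( $rows, $batchSize ) as $chunk ) {
        $values = array();
        foreach ( $chunk as $row ) {
            // each $row is a numeric tuple like array(1,2,3)
            $values[] = "(" . implode( ",", $row ) . ")";
        }
        $queries[] = "INSERT INTO tbl_name (a,b,c) VALUES " . implode( ",", $values ) . ";";
    }
    return $queries;
}
```

Each string in the returned array is one multi-row INSERT, so 100,000 rows become 100 queries instead of 100,000.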


The wrong way:

    // an array of items to insert
    $array = array(
        'dingo'    => 'A native dog',
        'wombat'   => 'A native marsupial',
        'platypus' => 'A native monotreme',
        'koala'    => 'A native Phascolarctidae'
    );

    $conn = mysql_connect( 'localhost', 'db_user', 'db_password' );
    mysql_select_db( 'test_db', $conn );

    // one query per row -- this is what kills performance
    foreach ( $array as $key => $value ) {
        $sql = "INSERT INTO test_table (name, description) VALUES ('" . $key . "', '" . $value . "')";
        mysql_query( $sql, $conn );
    }

This is the way I tried it the first time, which was a disaster. I had major performance issues and honestly it just crashed every time.

The right way:

By building the query within the foreach loop and running a single SQL query, the whole process is greatly accelerated: the speedup comes from issuing one INSERT instead of thousands. This code also uses PHP's SPL iterators to walk the array. Wrapping it in a CachingIterator gives us access to hasNext(), which tells us whether another element follows the current one, so we know when to append a comma between value tuples.


    // an array of items to insert
    $array = array(
        'dingo'    => 'A native dog',
        'wombat'   => 'A native marsupial',
        'platypus' => 'A native monotreme',
        'koala'    => 'A native Phascolarctidae'
    );

    // begin the sql statement
    $sql = "INSERT INTO test_table (name, description) VALUES ";

    // this is where the magic happens
    $it = new ArrayIterator( $array );

    // a caching iterator gives us access to hasNext()
    $cit = new CachingIterator( $it );

    // loop over the array, appending one value tuple per element
    foreach ( $cit as $value ) {
        $sql .= "('" . $cit->key() . "','" . $cit->current() . "')";
        // if there is another array member, add a comma
        if ( $cit->hasNext() ) {
            $sql .= ",";
        }
    }

    // now we can use a single database connection and query
    $conn = mysql_connect( 'localhost', 'db_user', 'db_password' );
    mysql_select_db( 'test_db', $conn );
    mysql_query( $sql, $conn );

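As a side note, if the CachingIterator feels heavyweight, plain `implode()` achieves the same result with no hasNext() bookkeeping: collect each value tuple into an array, then join the tuples with commas in one call. A minimal sketch, using the same sample data as above:

```php
<?php
// Alternative sketch: implode() removes the need for hasNext() entirely.
$array = array(
    'dingo'  => 'A native dog',
    'wombat' => 'A native marsupial'
);

$parts = array();
foreach ( $array as $key => $value ) {
    // build one value tuple per element
    $parts[] = "('" . $key . "','" . $value . "')";
}

// join all tuples with commas in a single call
$sql = "INSERT INTO test_table (name, description) VALUES " . implode( ",", $parts );
```

Whichever approach you choose, remember to run your real values through `mysql_real_escape_string()` (or better, prepared statements via mysqli/PDO) before concatenating them into the query.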

Looking for quality OpenCart Hosting? Look no further than Arvixe Web Hosting!


Author Spotlight

Joe Stenhouse

I am a web application developer who specializes in PHP, JavaScript, MySQL, HTML, and CSS. We manifest exciting potentials within the world wide web through means of innovation.
