MySQL Management — Importing large MySQL data dumps with Bigdump

Robin Metcalfe, 2nd December 2017

Working with large databases can be a pain, especially on servers with limited resources or over a slow connection. Tools such as phpMyAdmin sometimes run into irritating issues, but other options are available for importing large or complex data dumps.

P.I.T.A.

I often find myself hitting a brick wall with large MySQL imports, running into issues getting them loaded onto my local (or remote) development servers. It's rarely a massive obstacle, but on any web project of significant scale, database import issues are likely to rear their head at some point.

If you're struggling to get a particular MySQL export file imported into your database, I recommend you check out BigDump, a staggered dump importer that breaks any given import into chunks and processes them one at a time.

Bigdump

Usage is pretty straightforward: download the zip archive from the link above and extract it to a suitable location on your web server.

Open up bigdump.php with the code editor of your choice, and edit the following lines to match your database config.

$db_server   = 'localhost';
$db_name     = 'mydatabase';
$db_username = 'username';
$db_password = 'secr3t!'; // don't use this as your password, obviously
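
Depending on the version, there should also be a $filename setting near the top of the script. Pointing it at your dump file skips Bigdump's file-picker; leave it empty and Bigdump will list the dump files it finds alongside it. The filename below is just a placeholder:

$filename = 'dump.sql'; // placeholder name; leave as '' to pick from Bigdump's file list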

Bigdump contains a whole heap of other handy configuration options that can come in useful if you run into further issues.

The ones I've encountered most often, and which seem to help diagnose most of the common issues, are:

$linespersession    = 3000;

Adjust the $linespersession variable downwards if you still experience timeout issues.

$delaypersession    = 0;

If you find that the import is running into problems with MySQL's request limits, add some more milliseconds to the $delaypersession variable.

$max_query_lines = 300;

Handy if you have some particularly large queries to process. I've found that notching $max_query_lines up to 3000 or even 30000 has no major impact, and it seems to solve most of the issues I've encountered.
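
Putting those together, a starting point on a slow or shared host might look something like the lines below. The exact numbers are only illustrative; tweak them to suit your server:

$linespersession = 500;   // fewer lines per run, to stay under PHP/webserver timeouts
$delaypersession = 500;   // pause in milliseconds between runs, to ease off the MySQL server
$max_query_lines = 3000;  // allow longer single queries, e.g. big extended INSERTs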

If you're still running into problems, check out Bigdump's FAQ page.

All in all, it's one of the most useful database tools I've found, and it outperforms even some of the more professional offerings out there. Definitely worth a try if you're experiencing MySQL headaches.

Vital note: delete bigdump.php from your server once you're done. It's a massive no-no to leave something like this sitting around on a production (or even a development) server.

