A couple of days ago, when I finally got around to installing Linux and configuring my old PC as a 24×7 server, I had the notion that I’d like to make daily (or at least weekly) backups of the databases behind my websites (primarily my forum and my blog). The MySQL databases are provided by my hosting company (GoDaddy.com). However, when I emailed them asking why I couldn’t seem to connect a local mysqldump to their database servers, they responded that it was not allowed, since it was a security risk and put unknown load on their servers. Being the type of guy I am, I decided to do something about that.

The idea was that I wanted to set up a cron job to periodically run mysqldump and grab the online databases to archive them. However, GoDaddy did not want me to do this, since my Linux server was not part of their <sarcasm>elite clique</sarcasm>.

Obviously, since my forums and blog actually work, GoDaddy allows the codedread server to get into the databases, so I chose to write up a PHP script and drop it on my codedread server. Then I can set up a cron job to periodically use wget to pass the PHP file some arguments that it uses to grab a full copy of the database and return it. If you want the database copy but don’t have a version of Linux with which you can wget the PHP file, you could write some short JScript or VBScript in Windows to retrieve the document in a similar fashion (let me know if you actually do this!). But I should make it clear that if your databases are of any real size, you should NOT use a web browser to retrieve them. The script that I’ve written locks up my Internet Explorer for about 15 minutes while retrieving the whole database and then refuses to save the resultant file (which is several megabytes). If I fetch it through wget, it takes under 1 minute. This script is definitely for non-interactive use only.
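To give a concrete idea of the cron side, here’s a sketch of a crontab entry that would fetch a dump every night at 4 a.m. Every value in it is a placeholder (host, script path, credentials, and backup directory are not from this post; substitute your own):

```shell
# Hypothetical crontab entry: every day at 04:00, fetch the dump with wget.
# Note that % is special in crontab command fields and must be escaped as \%.
# The URL is double-quoted so the shell doesn't treat each & as a job separator.
0 4 * * * wget -q -O /home/jeff/backups/db-$(date +\%Y\%m\%d).xml "http://www.example.com/dbimp.php?imp_dbhost=HOST&imp_dbname=DB&imp_dbuser=USER&imp_dbpasswd=PASS"
```

Add the line with `crontab -e`; switch the first five fields to `0 4 * * 0` if you only want a weekly run.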

Anyway, I’m releasing the little PHP program (dbimp.php) under the GNU General Public License. It leans somewhat heavily on a source file from the phpBB forum software, which is also free and under the GNU General Public License. The phpBB file I use (mysql.php) wraps up the database functionality for MySQL into a class which allows me to instantiate a database instance, connect to it, perform queries, etc. Nice little bit of code, that.

The arguments that you need to pass into this PHP program are:

– imp_dbhost: The database server
– imp_dbname: The database name
– imp_dbuser: The user name for the database
– imp_dbpasswd: The password for the database
– imp_max: (Optional) The max number of rows to retrieve from each table. If not specified then all rows are returned. This option was useful for testing in a web browser.

An example URL would be, all on one line (with placeholder values standing in for the real host, script path, and credentials):

http://www.yoursite.com/dbimp.php?imp_dbhost=yourdbhost&imp_dbname=yourdb&imp_dbuser=youruser&imp_dbpasswd=yourpass&imp_max=10
Note that because the URL contains ampersands (&) you will need to escape the entire URL string within double-quotes when sending to wget on the command-line.
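To make the quoting issue concrete, here’s a small shell sketch that builds the URL from variables. All of the connection details and the script location are placeholders, not real values:

```shell
# Placeholder connection details -- substitute your own.
DBHOST="mysql.example.com"
DBNAME="myforum"
DBUSER="backupuser"
DBPASS="secret"

# Build the query string from the arguments dbimp.php expects.
URL="http://www.example.com/dbimp.php?imp_dbhost=${DBHOST}&imp_dbname=${DBNAME}&imp_dbuser=${DBUSER}&imp_dbpasswd=${DBPASS}"
echo "$URL"
```

You would then fetch the dump with `wget -q -O backup.xml "$URL"` — the double quotes around `$URL` are what keep the shell from splitting the command at each & and running everything before it as a background job.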

The results look something like this:

<?xml version="1.0"?>
   <TIMESTAMP id="start-of-import">2005-02-02,04:05:00</TIMESTAMP>
   <TABLE name="Employees" numrows="2" numfields="3">
         <FIELD name="EmployeeName" type="TEXT">Jeff</FIELD>
         <FIELD name="id" type="INT">1</FIELD>
         <FIELD name="Salary" type="TEXT">$2,000,000.00</FIELD>
         <FIELD name="EmployeeName" type="TEXT">Rob</FIELD>
         <FIELD name="id" type="INT">2</FIELD>
         <FIELD name="Salary" type="TEXT">$5.00</FIELD>
   </TABLE>
   <TIMESTAMP id="end-of-import">2005-02-02,04:05:03</TIMESTAMP>

I put timestamps into the XML output at the start and end so you know how long the database extraction takes. I intend to update this script to support output in .sql format (I’ll post that in a subsequent blog entry). This format will probably be smaller than my custom XML file because of my poor choice of where to put the semantic information (i.e. field names and types). Also, outputting in .sql format will allow you to rebuild the whole database from scratch by inputting the .sql file into MySQL.

The dbimp.php script is here as raw text. I’ve also included the mysql.php file as raw text too. Right-click and save these files locally, then rename them with the .php extension.

Cautionary Note the First: Your web host likely tracks bandwidth usage, and to the host this script is no different from a regular web page, except that its responses are very large. Running the script will eat up some bandwidth, and you may want to monitor that: if your database files add up to 10 MB and you run this backup script daily, it will cost you 300 MB of bandwidth a month. As a consequence you might want to run it only weekly.

Cautionary Note the Second: I whipped this thing together late last night in about an hour. I am not liable for any damage this may cause to your databases or any breakdown in negotiations between you and your web host provider. Use at your own risk.

§31 · February 2, 2005 · PHP, Software, Technology, Tips, Web · · [Print]

5 Comments to “Let My Data Go…or else…”

  1. Rob says:

    That was quick. I think I’ll try it out. Another tip for anyone who uses this – rename the script. Just in case you care about your database becoming a free download to anyone who browses to the link.

  2. Jeff says:

    You should probably read the whole article. It would only be possible for someone to get your database by browsing to that link if the person knows your database server, database name, username and password. Otherwise the script fails to connect to the DB and prints a harmless error message.

    Also, I can show you how to set up a cronjob on Hydrogen to do a daily/weekly auto-backup of your sites if you want.

  3. Jeff says:

    But now that I think about this, you do need to be careful in case you browse this link with the proper arguments (including your password) and then immediately go to another site. That site would see the full URL you came from (including your password) in the Referer header, potentially giving someone else all the info they need to browse your database.

    Again, this should never be an issue if you only get this file via wget or a similar mechanism (i.e. not a browser).

  4. […] So last week I got my Linux server up and running 24×7 on my LAN. I’ve got it grabbing daily snapshots of my website databases. It’s running Apache 2.0, SSH, FTP and Samba successfully. From my main Windows PC, my home directory on the Linux box is my H: drive. So far very cool, and not that difficult for someone who doesn’t regularly use Linux. […]

  5. A. Heinen says:

    I have made some changes to your code so that it writes a file more like mysqldump’s output. I’ve posted it to my website at http://www.heinenshome.net/public/.