Working with Hosted WordPress Sites Locally
This project on GitHub lets you debug a WordPress site by combining a database backup and a local copy of the site files with Docker. It handles modifying the database and configuration files to make the site accessible via http://localhost. To use it:
- Make sure you have Docker and your favorite PHP editor installed
- Clone the GitHub project
- Create a backup of your site’s database using phpMyAdmin (or equivalent), and copy it to `setup/backup.sql` in the cloned project
- Copy all site files to the `site` directory in the cloned project
- From the cloned project directory, run `docker-compose up`
- Once the Docker Compose application initializes, browse to http://localhost
- To manage the database via phpMyAdmin, browse to http://localhost:8080 and log in with the credentials specified in your site’s `wp-config.php`, or with root/insecure-12345
I use Microsoft Visual Studio Code for PHP editing, but any other editor that supports Xdebug should work as well. The project includes a `.vscode/launch.json` file with the `pathMappings` setting needed for breakpoints to resolve correctly.
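For reference, a VS Code launch configuration for this kind of setup looks roughly like the following. The port and server-side path here are assumptions rather than the project’s actual file (Xdebug 3 listens on 9003 by default; older Xdebug 2 setups use 9000):

```json
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Listen for Xdebug",
      "type": "php",
      "request": "launch",
      "port": 9003,
      "pathMappings": {
        "/var/www/html": "${workspaceFolder}/site"
      }
    }
  ]
}
```

The `pathMappings` entry is the piece that matters: it tells the editor that a file at `/var/www/html` inside the container is the same file as the one in your local `site` directory, so breakpoints set locally are hit by code running in the container.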
Recently, I volunteered to work on a WordPress site. Having not worked with WordPress for a while, I quickly ran into puzzling WordPress behavior that made me wish I had debugging capability. Having some familiarity with Docker, I figured I would just copy the site files and database locally.
Everything I needed to know to get this working was available, but scattered across many different places. As I went through the motions, I kept thinking, “what happens the next time I need to debug this site, or another site?” It seemed like it should be possible to script and automate the work so that it was repeatable. So I began pulling on that thread, and the result is the solution documented here.
What I ended up building was a Docker Compose application that included three containers:
- The WordPress site (http://localhost)
- The MySQL database
- phpMyAdmin for working with the database (http://localhost:8080)
I used Docker Compose instead of putting everything in a single container. I’m not a huge fan of “everything and the kitchen sink” monolithic Dockerfiles; they feel like running a VM, but less efficiently. Segregating things keeps the Dockerfiles themselves relatively simple, and it makes it easier to experiment with different PHP versions, database engines, etc.
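The resulting layout can be sketched as a minimal `docker-compose.yml`. The image tags, paths, and service names below are assumptions for illustration, not the project’s actual file:

```yaml
version: "3.8"
services:
  wordpress:
    build: ./site-image          # custom Dockerfile: PHP extensions, Apache, Xdebug
    ports:
      - "80:80"
    volumes:
      - ./site:/var/www/html     # local copy of the site files
    depends_on:
      - db
  db:
    image: mysql:8.0
    environment:
      MYSQL_ROOT_PASSWORD: insecure-12345
    volumes:
      - ./setup:/backup          # database backup mounted for the restore step
  phpmyadmin:
    image: phpmyadmin/phpmyadmin
    ports:
      - "8080:80"
    environment:
      PMA_HOST: db
```

Each service stays small and single-purpose, which is exactly what makes swapping the MySQL tag for a different version (or a different engine entirely) a one-line change.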
I use gawk and sed to parse and update the WordPress configuration file and the database backup file (regular expressions that include quotes and slashes in a Bash script are always fun…).
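To give a flavor of that parsing, here is a sketch (not the project’s actual script) of pulling a `define()`’d value out of `wp-config.php` with sed. The sample file and function name are made up for illustration:

```shell
# Build a tiny stand-in for wp-config.php so the example is self-contained.
cat > /tmp/wp-config-sample.php <<'EOF'
<?php
define( 'DB_NAME', 'mysite' );
define( 'DB_USER', 'wpuser' );
define( 'DB_PASSWORD', 'p@ss/w0rd' );
EOF

# Extract the value of one define(): capture whatever sits between the
# second pair of single quotes on the matching line.
wp_config_get() {
  sed -n "s/^define( *'$1', *'\([^']*\)' *);.*/\1/p" /tmp/wp-config-sample.php
}

wp_config_get DB_NAME      # -> mysite
wp_config_get DB_PASSWORD  # -> p@ss/w0rd
```

The nested quoting is why this gets fiddly: the sed program is double-quoted so `$1` expands, while the values being matched are single-quoted PHP strings.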
The site container does the following:
- Installs libraries and PHP extensions needed to run WordPress and debug
- Enables the Apache rewrite module
- Sets up the necessary write permissions
- Creates a `host.docker.internal` entry for Linux Docker users
- Updates `wp-config.php` to access the database container and to enable debug messages
The database container does the following:
- Parses database settings from `wp-config.php` (database name, user, and password)
- Identifies the site domain by parsing the database backup file
- Creates a copy of the database backup, replacing all instances of the site domain with http://localhost, and restores that backup
- Creates database permissions matching the settings in `wp-config.php`
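The domain-replacement step can be sketched as follows. The domain and file paths are placeholders (the real script parses the domain out of the dump); using `|` as the sed delimiter avoids having to escape the slashes in the URLs:

```shell
# Placeholder domain; the actual script discovers this from the backup file.
site_domain='https://example.com'

# Build a one-line stand-in for the database dump so the example runs anywhere.
printf "INSERT INTO wp_options VALUES ('siteurl','%s');\n" "$site_domain" > /tmp/backup.sql

# Rewrite every occurrence of the hosted domain to http://localhost
# in a copy of the dump, leaving the original backup untouched.
sed "s|$site_domain|http://localhost|g" /tmp/backup.sql > /tmp/backup-local.sql

cat /tmp/backup-local.sql
# -> INSERT INTO wp_options VALUES ('siteurl','http://localhost');
```

The rewritten copy is what gets restored into the MySQL container, so the `siteurl`/`home` options and any hard-coded links in posts all point at the local instance.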
Dealing with Weirdness
WordPress does not make it easy to transplant a site.
- Some settings, including database connectivity, are stored in a code file, `wp-config.php`. Putting configuration in code, as opposed to standalone data files, seems to be common practice for PHP developers. This makes it more difficult to review and update configuration with automation.
- The site and home URLs are stored in the database
- Images and links in pages and posts are stored using fully qualified URLs
My first approach was to create a HOSTS file entry to redirect the site’s domain locally. That worked, but it made comparing my local copy with the hosted site a pain. Ultimately, I was able to script the updates to `wp-config.php` and the database backup so that http://localhost reaches my copy, with links and images still working.
The Dockerfile for the site loads the basic libraries and dependencies needed by WordPress, including things like ImageMagick for image manipulation and, of course, Xdebug.
Docker for Windows and Mac has a very handy feature that defines `host.docker.internal`, which can be used to facilitate communication between PHP running in a container and an editor running locally (when using a bridge network). For whatever reason, Docker has decided not to include that functionality in the Linux version, which is a pain. I was able to integrate a solution, based largely on this article by Mitz, to have the Dockerfile generate this entry.
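My reading of that approach, sketched with assumed names: on a bridge network the container’s default gateway is the host, so you can parse it out of `ip route` at startup and append a `host.docker.internal` entry to `/etc/hosts`:

```shell
# Given the output of `ip route`, print the default gateway address.
default_gateway() {
  # e.g. "default via 172.17.0.1 dev eth0" -> prints "172.17.0.1"
  awk '/^default/ { print $3; exit }'
}

# In the container's entrypoint (requires root inside the container):
#   gw=$(ip route | default_gateway)
#   echo "$gw host.docker.internal" >> /etc/hosts
```

With that entry in place, Xdebug can connect back to the editor on the host the same way it does on Windows and Mac.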
I am using Docker volume mounts to provide access to the site files and database backups. As a result, that content is not available during the container build. I’ve set up entrypoints for the website and database containers to handle initialization of the database, configuration, etc. After doing their own initialization, they fall through to the images’ built-in entrypoints.
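One way to sketch that wrapper pattern in a Dockerfile (the file names here are hypothetical, not the project’s actual files): the custom entrypoint script does its setup and then ends with `exec docker-entrypoint.sh "$@"`, so the official image’s built-in initialization still runs afterward.

```dockerfile
# Hypothetical wrapper entrypoint: init-and-run.sh performs our one-time
# setup, then execs the official image's docker-entrypoint.sh with the CMD.
COPY init-and-run.sh /usr/local/bin/
ENTRYPOINT ["/usr/local/bin/init-and-run.sh"]
CMD ["apache2-foreground"]
```

Because the wrapper `exec`s the stock entrypoint rather than replacing it, things like the MySQL image’s first-run database creation continue to work untouched.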
In the name of security, MySQL’s default behavior under Docker is to generate a random root password and then force you to search the log files to find out what it was. Since this is just for local debugging, I’ve hard-coded an environment variable in the Docker Compose file to force the root password to insecure-12345.
Closing Thoughts, What’s Next
Now I can work with this beast locally. I’ve been evaluating plug-ins. Most that have anywhere near the functionality I need are bloated beasts. I get why they are that way, but tweaking them to do what I need is probably going to be more work than rolling my own functionality.
It looks like I’m going to need to write some plug-ins for the site. WordPress still natively bundles jQuery 1.x (nope), so I’m thinking of picking up Vue. And then there’s the Gutenberg learning curve…