Can I Create That Plugin in 24 Hours?

When I arrived at the event, the goal was clear: By the next afternoon, I’d have built a new privacy plugin to make any WordPress site more GDPR-compliant.

And in the spirit of the local hackathon, I wanted to have a working release version of that plugin ready before the event ended. However, I came quite unprepared and did not know about the real problems ahead of me. In my mind, it all looked nice and easy; I would only…

TL;DR: Don’t want the full story? Get the free privacy plugin here:


The Problem

The concept: Create a plugin that embeds any external style or script on the local WP site. Primarily, I wanted to make sure that Google Fonts were served from our own website’s domain – because according to a court decision, we cannot use Google Fonts without consent 😬

Even worse, some lawyers have seized the opportunity to send a wave of warnings with payment orders to thousands of website owners in Germany and Austria.

My Plan

This called for a quick and easy solution to embed Google Fonts on a website and close that annoying GDPR gap. When I arrived at the event on Friday at lunchtime, I had the following concept:

  1. Create a plugin that scans all scripts and styles that were enqueued using the WordPress Dependency API [term invented by me] via wp_enqueue_script or wp_enqueue_style.
  2. Detect external assets by comparing the script or style URL with the WordPress site URL.
  3. When an external asset is found: Download it to the local site’s uploads folder and swap out the external URL with the URL of the file in the uploads folder.
  4. If that does not work: Improvise!

Sounds like something that’s possible within 24 hours, right?

Create the Plugin

[Time 14:49]
As I did not have anything prepared, the first step was setting up the development environment for this project. Simple:

  1. As promised in my announcement on Facebook, first set up the GitHub repo and share the link with you there.
  2. Then spin up a new WP site on my local machine.
  3. Clone the empty GitHub repo to the local dev folder.
  4. Start preparing a plugin using our plugin setup.

Step 1. Setting up the GitHub repo was easy enough 😉 You’ll find the link to the repo below, in case you want to review the commit history or submit a pull request.

Step 2. You might know that I love Kinsta, so naturally, I develop new projects using their development environment DevKinsta. Just a few clicks and a coffee later, my local server was running at https://gdpr-tool-dev.local:63006

Step 3. Suddenly, I was stuck. I could not clone the remote repo to my local machine. Something with my git integration was messed up… I already struggled with the project before opening my code editor.

I tried restarting the MacBook. It did not solve the problem. So, I deleted my credentials from my GitHub account and authorized my machine again. Still no success. Really, don’t ask me what happened; I still have no clue. After 45 minutes, I figured out a way to initialize a local repo and push it to the remote origin.

Because of the 24-hour time limit, I did not bother to investigate further. Instead, I opened PhpStorm and started setting up the new WordPress plugin locally.

Step 4. Finally, I could create a default plugin. This is the structure of all our plugins:

  • plugin.php is the entry point. It contains the plugin header, stores the file’s path in a constant, and then loads the next file
  • start.php contains a list of plugin files or classes to include; in larger plugins, it also has an autoloader. This file bootstraps all WP hooks and loads the plugin’s core files, starting with constants.
  • constants.php is a simple file that holds all constants and config details of the plugin. This helps to keep plugin.php short and collects all define() calls in one place.
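As a rough sketch, the plugin.php entry file in that structure might look like this. The constant name and file layout are my assumptions for illustration, not the plugin’s actual code:

```php
<?php
/**
 * Plugin Name: GDPR Cache Scripts & Styles (sketch)
 *
 * Illustrative sketch of the plugin.php entry file described
 * above; constant and file names are assumptions.
 */

// Store the entry file's path in a constant for later use.
if ( ! defined( 'GDPR_CACHE_PLUGIN_FILE' ) ) {
	define( 'GDPR_CACHE_PLUGIN_FILE', __FILE__ );
}

// Hand over to start.php, which includes constants.php first and
// then bootstraps the remaining plugin files and WP hooks.
// (Guarded here so the sketch is safe to run standalone.)
if ( file_exists( __DIR__ . '/start.php' ) ) {
	require_once __DIR__ . '/start.php';
}
```

Keeping plugin.php this thin means the header stays readable and all real logic lives in the bootstrapped files.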

Here’s the commit that contains the initial plugin code on GitHub.

Proof of Concept

[Time 16:08]
So, the boring stuff was handled, and it was time to get our hands dirty! First, I spent twenty minutes spinning up a quick options page that we’ll use later. But for a start, I wanted to get some proof-of-concept logic working ASAP, to see if I was on the right track:

Replacing external URLs

The first challenge was detecting external assets by inspecting script and style URLs. I usually go about this by reviewing WordPress core code to find actions and filters to hook into.

Unfortunately, the wp_register_script() function does not provide a hook to override the asset URL during registration. After digging through core code and echoing debug output for a while, I found the key:

Right before generating the script or link tags, WordPress filters the full URL for that asset via script_loader_src and style_loader_src!

Using those filters, I could print out a list of all scripts and styles used on the current page; you can see the concept code on GitHub.

But which of those assets are external? The check is easy; we only need to filter out relative URLs and URLs that start with the website’s home_url. Here’s the code I came up with:

function is_external_url( string $url ) : bool {
	// Relative URLs are always local to the current domain.
	if ( 0 === strpos( $url, '/' ) ) {
		return false;
	}

	// Strip the protocol to get a protocol-relative home URL.
	$home_url = preg_replace( '/^\w+:/', '', home_url() );

	return false === strpos( $url, $home_url );
}

Downloading styles and scripts

Detecting external URLs was only half of the solution. The other (and more complex) part would be downloading those assets to the local site and serving them from here.

Downloading the external asset via a PHP script is the actual step that adds privacy to your website: Instead of your visitors’ browsers downloading the files from Google’s servers, WordPress requests those files. Google does not get any information about your visitors.

Browsers always download dependencies via a GET request; so it’s safe to use wp_remote_get() to fetch those files and store them locally.

Tip: The fastest way to download a remote file straight to a local file is the filename argument in the second parameter of the wp_remote_get() function.

function cache_asset( $url ) {
	$cache_dir  = WP_CONTENT_DIR . '/uploads/gdpr-cache/';
	$cache_path = $cache_dir . md5( $url );

	// Make sure the cache folder exists before streaming into it.
	wp_mkdir_p( $cache_dir );

	$resp = wp_safe_remote_get(
		$url,
		array(
			'timeout'  => 300,
			'stream'   => true,
			'filename' => $cache_path, // ← this does the trick!
		)
	);

	if ( is_wp_error( $resp ) ) {
		@unlink( $cache_path );
		return false;
	}

	return true;
}

I could combine that function with the two filters I discovered earlier to get a first working plugin:

add_filter( 'script_loader_src', function( $url ) {
	if ( is_external_url( $url ) ) {
		if ( cache_asset( $url ) ) {
			$url = WP_CONTENT_URL .
				'/uploads/gdpr-cache/' .
				md5( $url );
		}
	}

	return $url;
} );

Though it’s dirty and inefficient code, it does work! With those parts in place, the plugin was almost complete – at least that’s what I thought:

  • It reliably detected all external assets via the script_loader_src and style_loader_src filters
  • It downloaded the external files to the uploads folder
  • And it swapped out the external URL with a link to the uploads folder

Here’s the full proof of concept snippet:

The Google Fonts Challenge

[Time 22:37]
But wait! When testing the plugin, I discovered that the website still made a dozen requests to Google servers to fetch fonts! Why!?

It turns out that the cached CSS file with the Google Fonts code contained a lot of url() values that would load woff or ttf files from external servers. This means we need to parse all CSS files and then download any external files we find inside url() values.

Sample of the Google Fonts CSS file:

@font-face {
  font-family: 'Open Sans';
  font-style: italic;
  font-weight: 300;
  font-stretch: normal;
  font-display: swap;
  src: url(…) format('truetype');
}

So, I went on to write a CSS parser that extracts all URLs inside a stylesheet. The plugin then downloads those external files and replaces the URLs inside the stylesheet with local ones.

The commit to scan CSS contents and recursively download external assets is rather small. Still, it took a few hours, as I spent a lot of time researching the possible (valid) url() syntax variations and creating a RegExp to recognize them.

As the logic for downloading files and swapping out URLs already worked, the only real challenge was that RegExp. After the above commit, I found no more connections to Google servers on my test pages. Success!
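For illustration, here is a simplified take on such a url() matcher. This is only a sketch of the idea, not the plugin’s actual (more thorough) RegExp; it handles optional single/double quotes and whitespace, but skips edge cases like escaped characters:

```php
<?php
// Simplified url() matcher - an illustrative sketch, not the
// plugin's actual RegExp. Captures the value inside url(...),
// with or without surrounding quotes.
$pattern = '/url\(\s*([\'"]?)(?<url>[^\'")]+)\1\s*\)/i';

$css = "@font-face { src: url('https://fonts.gstatic.com/s/opensans/x.woff2') format('woff2'); }";

preg_match_all( $pattern, $css, $matches );
// $matches['url'] now holds every URL found inside url() values.
```

Each captured URL can then be run through the same is_external_url() / cache_asset() pipeline as the top-level assets.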


[Time 01:12]
Technically, the plugin solved my GDPR goals, but I was not finished yet: Refreshing a page took about 30 seconds, since every request would download all external files again before serving the page. It was time to optimize the working plugin.

Cache Control

Of course, the first step was simple: Only download files that are not cached yet:

Step 1. I added a new value to the wp_options table that holds an asset list: an array with all external URLs and the filename of the local copy of each remote file. Now it’s easy to check whether a file is already cached or still needs to be downloaded.
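As a sketch, the lookup against that asset list might look like this. The function name and array shape are my assumptions, not necessarily the plugin’s exact format:

```php
<?php
// Sketch of the asset-list lookup (names are assumptions).
// The array maps external URLs to local file names, e.g.:
// [ 'https://fonts.googleapis.com/css?family=Open+Sans' => 'abc123.css' ]
function gdpr_cache_get_local_file( string $url, array $assets ): ?string {
	return $assets[ $url ] ?? null; // null = not cached yet, needs download
}
```

In the real plugin, the array would come from get_option() and be written back via update_option() after each download.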

Step 2. I also wanted a way for the admin to purge the entire cache and force WordPress to refresh all external assets. Since we already had an options page, I added a “Flush Cache” button. Essentially, that button deleted the wp_options value which I added in the previous step.

Flushing the cache worked well, but I still had the initial problem: After flushing the cache, the next request again took 30 seconds to download all assets. During those 30 seconds, every request would be slow.

So, I needed a solution to download the assets without delaying the current request.

Staying in the Background

[Time 10:51]
I decided to use WP Cron tasks to download assets in a background request. This would be quite a change to the existing plugin:

Step 3. As the cron task runs asynchronously, I first added a queue which holds a list of assets that need to be downloaded. So, instead of downloading a remote file instantly, it was added to that queue.

When a new asset was added to the queue, the plugin also spawned a new cron request to instantly start processing the queue.

That approach also requires basic process control, to prevent multiple cron tasks from spawning at the same time. A simple timestamp inside the wp_options table does the trick.
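The timestamp check could look roughly like this. Function name, option handling, and the timeout value are made up for illustration; the stored timestamp would live in wp_options, but here it is passed in directly to keep the sketch self-contained:

```php
<?php
// Sketch of the cron process lock: a new worker may only spawn
// when no other worker started within the last $timeout seconds.
function gdpr_cache_can_spawn_worker( int $lock_timestamp, int $now, int $timeout = 300 ): bool {
	return ( $now - $lock_timestamp ) >= $timeout;
}
```

Before spawning a worker, the plugin would read the timestamp, run this check, and write the current time back as the new lock.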

Processing the queue is an easy task since we can recycle the existing download logic. After downloading an external file, the WP-Cron task adds details about the file, its expiry and cache location to the assets-list in wp_options and continues with the next file.

Step 4. Finally, I was ready to add a more sophisticated cache control logic to the plugin: Instead of bluntly deleting all items from the cache, I introduced a new “expiry” flag in the asset list. This expiration time defines if an asset is current or outdated.

Now, I could instantly expire all files instead of deleting them. When a page requests an outdated asset, we serve the outdated file and, at the same time, spawn a new WP Cron task to load the latest version of that file and update the cache in the background (a pattern known as “stale-while-revalidate”).
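The expiry check itself is trivial; here is a sketch with assumed field names:

```php
<?php
// Sketch of the expiry flag (field name is an assumption): an
// asset is stale once its "expires" timestamp lies in the past.
// Stale assets are still served but trigger a background refresh.
function gdpr_cache_is_stale( array $asset, int $now ): bool {
	return ( $asset['expires'] ?? 0 ) <= $now;
}
```

Flushing the cache then just means setting every asset’s expiry timestamp to a point in the past.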

And voilà! Finally, the plugin can cache external assets without any impact on the website’s performance.

Step 5. To finish everything, I updated the readme files, added some WP filters and actions, and reviewed code comments.

[Time 15:54]
After everything looked fine, I created the final plugin zip archive and released it to the WordPress plugin repository!

→ From putting my MacBook on the desk, to releasing the working plugin took 25 hours and 6 minutes.


Creating a privacy plugin to embed and serve all external files from the local WordPress website took me roughly 25 hours. This includes a few hours of sleep, a shower, and some cocktails at the bar.

As I did not prepare or research anything for this project, I was quite nervous at the beginning. However, my approach turned out to work reliably quite quickly, and I could spend a good amount of time testing and performance-optimizing the plugin.

The plugin is (and stays) free – this also includes “ad-free” – and will instantly make any WordPress website a lot more private.

If you’re using Google Fonts, I recommend using this plugin instead of manually uploading the font files to your website: you/your customer/any admin can then use new Google Fonts without any technical overhead.

Disclaimer: I’ve created the plugin within a single day, so the amount of testing I could do is quite limited. If you have questions or discover a bug, please let me know, and I’ll address it asap!