Asset Versioning using Grunt

What is Asset Versioning? This is sometimes referred to as cache busting. Assets for your website are CSS files, JavaScript files, images and other things that are used by all the pages in a website. These are files that should not change often, so for better performance, you want the browser to just download the file the first time it accesses your site and then use cached copies going forward. After the initial page load, the rest of the pages on your website should load quicker because of this. This is accomplished by having your web server set expiration dates on those files far in the future. You can do this in your Apache configuration like this:

<FilesMatch "\.(gif|jpg|png|js|css)$">
  ExpiresActive On
  ExpiresDefault "access plus 10 years"
</FilesMatch>

However, what happens when you make a change to a CSS or JS file? Since the expiration date is far in the future, the browser will continue to use the original version. Fixing this problem is what asset versioning, or cache busting, is meant to handle.

I know of three methods of doing asset versioning:

  1. Query string parameter: /js/main.js?v=1.2.0
  2. Folder: /1.2.0/js/main.js
  3. File name: /js/main.1.2.0.js

I had always used the query string method for asset versioning, but I have learned that it has problems. I found an article by Steve Souders explaining that some proxy servers will not cache items with query strings, so this technique can cause your files to not be cached at all, resulting in slower performance, which is the opposite of what we want. (FYI: Steve Souders wrote the book High Performance Web Sites.)

I don’t like the folder method because not all of your assets change with every release. Every release would invalidate the cache for every asset, even the files that did not change.

This leads me to the file name method. There are several ways to version file names. You can use a version number (like 1.2.0), but that generally means maintaining the file names by hand. Another option is putting a date stamp in the file name. One problem I have run into with date stamps was related to running multiple servers in a cluster. We run five web servers behind our load balancer. We would deploy the code to each server and run the versioning code, which added a date stamp, including the hour, to each file name. The boxes would not all run the versioning code at the same time, so some servers would produce a stamp like “2016031114” while others, running a few minutes later, would produce one ending in 15. The result was occasional 404 pages when a request for an asset was sent to a different server in the load balancer. The final approach I know of computes a hash (such as an MD5 hash) of the file contents and adds that to the file name. I have found this method works best, and there is a grunt plugin for doing it.

The plugin that I chose is grunt-assets-versioning. There are several npm modules available, but this is the one I like best. As well as versioning your assets, it will output a mapping file of the assets versioned that can be used in your backend assets controller. To install it, do npm install grunt-assets-versioning --save-dev. This will automatically add it to your package.json file during the install. After installing, you activate it by updating your Gruntfile with grunt.loadNpmTasks('grunt-assets-versioning'); and defining a grunt task to do the versioning.

This is the grunt task I setup:

assets_versioning: {
    options: {
      tag: 'hash',
      post: true,
      versionsMapFile: 'app/helpers/AssetHelper.php',
      versionsMapTemplate: 'app/helpers/AssetHelper.php.tpl',
      versionsMapTrimPath: 'public'
    },
    files: {
      options: {
        tasks: ['uglify:global', 'uglify:advert', 'cssmin:target']
      }
    }
}

Here is a quick overview of the options I use. The “tag” option is set to “hash”, which means an MD5 hash of the file contents is used as the version string. There is also a date stamp option, but that can cause problems when deploying across a web cluster, so I recommend “hash”. The “post” option is set to true so the files are processed after the listed tasks run, which we want since we also minify our files.

The remaining options deal with creating a version map so we can identify the files created during the asset versioning task. The “versionsMapFile” option is the file name to create. If you do not specify a template, it will create a JSON file with the name specified. Here is an example of the JSON file it would generate:

  [
    { "originalPath": "public/js/global.min.js", "versionedPath": "public/js/global.min.xxxxxxxxx.js", "version": "xxxxxxxxx" },
    { "originalPath": "public/js/advert.min.js", "versionedPath": "public/js/advert.min.xxxxxxxxx.js", "version": "xxxxxxxxx" },
    { "originalPath": "public/styles/global.min.css", "versionedPath": "public/styles/global.min.yyyyyyyy.css", "version": "yyyyyyyy" }
  ]

The next option I want to review is “versionsMapTrimPath”. This removes unwanted portions from the versionedPath. In the case above, the docRoot for the website is the public directory, so for the path to be valid when referenced in a script tag’s src attribute, we need to remove the leading public. You can see that is what I do in the options used above. This gives us a path of “/js/global.min.xxxxxxxx.js”.
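The trimming itself is simple. Here is a sketch, not the plugin's code, of what the option does to each versionedPath in the map:

```javascript
// Illustration of what versionsMapTrimPath does: strip a leading
// path segment so the result is a web-root-relative URL.
function trimPath(versionedPath, trim) {
  return versionedPath.indexOf(trim) === 0
    ? versionedPath.slice(trim.length)
    : versionedPath;
}

trimPath('public/js/global.min.xxxxxxxxx.js', 'public');
// -> "/js/global.min.xxxxxxxxx.js"
```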

However, the key to making this useful is a template that generates something more useful to us than the JSON format above. This plugin uses underscore.js as its templating engine. Here is the template I use for creating a PHP file:

<?php
namespace app\helpers;

use \app\lib\Helper;

class AssetHelper extends Helper
{
    public static $dict = array(
<% _.forEach(files, function(file) { %>
        "<%= file.originalPath %>" => "<%= file.versionedPath %>",
<% }); %>
    );
}

This template would result in this file being created:

<?php
namespace app\helpers;

use \app\lib\Helper;

class AssetHelper extends Helper
{
    public static $dict = array(
        "/js/global.min.js" => "/js/global.min.5e0677e7.js",
        "/js/advert.min.js" => "/js/advert.min.49317176.js",
        "/styles/global.min.css" => "/styles/global.min.3d71d72b.css",
    );
}

In our PHP application, we use Twig for templating. Here is how I created Twig functions to use the data in the PHP file generated from the template.

$twig->addFunction(new Twig_SimpleFunction(
    'assetScript',
    function ($asset) use ($app) {
        if (isset(AssetHelper::$dict[$asset])) {
            $asset = AssetHelper::$dict[$asset];
        }
        return '<script type="text/javascript" src="' . $asset . '"></script>';
    },
    array('is_safe' => array('html'))
));

$twig->addFunction(new Twig_SimpleFunction(
    'assetStylesheet',
    function ($asset) use ($app) {
        if (isset(AssetHelper::$dict[$asset])) {
            $asset = AssetHelper::$dict[$asset];
        }
        return '<link href="' . $asset . '" rel="stylesheet">';
    },
    array('is_safe' => array('html'))
));

These functions work on a simple basis: if the string is found in the map, replace it with the versioned string; otherwise, use the original string. I then use these functions like this:

{{ assetScript('/js/global.min.js') }}
{{ assetScript('/js/nonversioned.js') }}
{{ assetStylesheet("/styles/global.min.css") }}

This would output the following in what gets served to the client:

<script type="text/javascript" src="/js/global.min.5e0677e7.js"></script>
<script type="text/javascript" src="/js/nonversioned.js"></script>
<link href="/styles/global.min.3d71d72b.css" rel="stylesheet">

The last part of the grunt task is defining which files to process. In my case, I told it to process the output of other tasks defined in the Gruntfile: the tasks that uglify our JavaScript and minify our CSS. You can also list specific file names, but for performance I find it more effective to also uglify and minify the assets.
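For reference, the companion tasks named in the versioning target might look something like this. The source paths here are placeholders I made up, not paths from the original Gruntfile:

```javascript
// Hypothetical uglify/cssmin targets matching the task names used above.
// Source paths are assumptions; adjust them to your own project layout.
uglify: {
  global: { files: { 'public/js/global.min.js': ['src/js/global/*.js'] } },
  advert: { files: { 'public/js/advert.min.js': ['src/js/advert/*.js'] } }
},
cssmin: {
  target: { files: { 'public/styles/global.min.css': ['src/styles/*.css'] } }
}
```

With `post: true`, the versioning task runs these first and then renames their minified output files.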

This has been effective for me for versioning assets. If you have other ideas, please leave a comment.

Sharing templates between PHP and JavaScript

At work, we had a need to share templates between PHP and JavaScript. We had two paths to search results data. The first was a dynamic search using AJAX, with the results displayed by JavaScript. The second was browsing through the website, where search results based on the browse selections were generated server-side by PHP.

The search results would be displayed in cards, so we wanted to share the card template between both server side and browser. We saved a copy of the card template in the /public/views directory. This is a sample of what the card template looked like:

<div class="card">
  <div class="card_searchCard">
    <h1 class="card_name clearfix"><a href="{{ url }}">{{ display_name }}</a></h1>
    <div class="card_specialty">
      <span>{{ specialty }}</span>
    </div>
  </div>
  <div class="card__body clearfix">
    <div class="card__address row">
      {{ address1 }}
      {{ address2 }}<br>
      {{ city }}, {{ state }} {{ zip }}
    </div>
  </div>
</div>

To use this template in PHP, we settled on using Mustache PHP. A rough example of how we used it to render our results is:

$mustache = new Mustache_Engine(array(
    'loader' => new Mustache_Loader_FilesystemLoader(dirname(__FILE__) . '/../../public/views'),
));

$cards = array();
$template = 'professionalCard';
foreach ($results as $result) {
    $cards[] = $mustache->render($template, $result);
}

This basically takes each result and renders it through the Mustache PHP templating engine. We saved the output of each result as a card and then would use those cards later.
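The substitution Mustache performs on each result can be illustrated with a stripped-down stand-in. This toy function handles only simple `{{ name }}` placeholders, none of Mustache's sections, partials, or escaping:

```javascript
// Toy stand-in for Mustache's variable interpolation, for illustration only.
function renderCard(template, data) {
  return template.replace(/\{\{\s*(\w+)\s*\}\}/g, function (match, key) {
    // Missing values render as empty strings, as Mustache does.
    return data[key] != null ? data[key] : '';
  });
}

renderCard('<span>{{ specialty }}</span>', { specialty: 'Cardiology' });
// -> "<span>Cardiology</span>"
```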

It was trickier to use the template from JavaScript, because we had to load it from the web server before it was available for use. We decided on MustacheJS for the JavaScript rendering. Normally, I would include a template inline in a script tag, but a file containing that script-tag wrapper would no longer work with the PHP rendering, so the template files contain only the template itself with no extra code.

Here are the functions we setup for retrieving the template and rendering it with data:

function render(tmpl_name, tmpl_data) {
  var template = getTemplate(tmpl_name);
  return Mustache.render(template, tmpl_data);
}

function getTemplate(tmpl_name) {
  if ( !getTemplate.tmpl_cache ) {
    getTemplate.tmpl_cache = {};
  }

  if ( !getTemplate.tmpl_cache[tmpl_name] ) {
    var tmpl_dir = '/views';
    var tmpl_url = tmpl_dir + '/' + tmpl_name + '.mustache';
    $.ajax({
      url: tmpl_url,
      method: 'GET',
      async: true,
      success: function(data) {
        // Cache the raw template string; Mustache.render takes a string.
        getTemplate.tmpl_cache[tmpl_name] = data;
      }
    });
  }
  return getTemplate.tmpl_cache[tmpl_name];
}

Here is how we used those functions:

for (var i = 0; i < cardHits.length; ++i) {
  var hit = cardHits[i];
  var rendered_html = render(templ_name, hit);

  // append the rendered hit
  res += rendered_html;
}

We chose to make sure the templates were preloaded before they were needed, so we added this document ready function:

$( document ).ready(function() {
  getTemplate('professionalCard');
});

When we didn't preload the templates, we would get errors unless we made the AJAX call that loads the template synchronous. I didn't like having a blocking call, so I opted to load the templates at document ready.
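The cache-plus-preload pattern can also be sketched without jQuery. Here `loadTemplate` is a stand-in for the real AJAX call; caching the promise rather than the string means a preload at startup and a later lookup share the same in-flight request:

```javascript
// Promise-based template cache: preload once, reuse everywhere.
// loadTemplate is a placeholder for the real AJAX/fetch call.
var tmplCache = {};

function getTemplateAsync(name, loadTemplate) {
  if (!tmplCache[name]) {
    // Store the promise immediately so concurrent callers share one request.
    tmplCache[name] = loadTemplate(name);
  }
  return tmplCache[name];
}
```

Preloading becomes `getTemplateAsync('professionalCard', loadTemplate)` at startup, and render code can `.then()` on the same cached promise later without any blocking call.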

If you have any questions or suggestions on alternate solutions, please leave a comment.

Review of “Go in Action”

I just finished the book “Go in Action”. I had been considering using Go for a side project and jumped at the chance to read this book. Before reading it, my exposure to Go had been a couple of blog posts and a Pluralsight video.

Reading this book really leveled up my knowledge of Go. I now have a foundation to build on and know where to begin on my project. The book is clearly written, and I appreciated its examples: they were non-trivial and avoided yet another “Hello World”. It also introduced the built-in Go tooling and how to use it effectively, along with the concepts needed to develop programs and modules that integrate with those tools. This helped me understand what the community will expect of any modules I create and release.

I really enjoyed this book. I feel it helped me move from just being curious about the language to being ready to start exploring and developing with it.

Belated Happy New Year

I want to wish everyone a belated Happy New Year!

I have been trying to decide what I was going to do with my blog. I got out of the habit of regular updates when I started my new job at Vitals. What I have decided to do is post at least once a month. I will target twice a month but may not always get there. I don’t intend to post just for the sake of posting, so I will only be posting meaningful content.

Let me know if there are any specific topics you would like to see more posts on.

Making sure /usr/local/bin is in your path

I was working on a Linux system that was not automatically adding /usr/local/bin to the PATH, which was causing problems when trying to invoke local scripts. I found that creating a file in /etc/profile.d/ with the following content:

if [[ ! ":${PATH}:" == *":/usr/local/bin:"* ]]
then PATH="${PATH}:/usr/local/bin"
fi

made sure that it got added to the PATH of all users on login.
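If you want to verify the result after login, the same membership test can be expressed in a few lines of Node. This is just an illustration of the logic, not part of the profile script:

```javascript
// Check whether a directory is already present in a PATH string,
// mirroring the shell test above (POSIX ':' separator).
function pathContains(pathVar, dir) {
  return pathVar.split(':').indexOf(dir) !== -1;
}

pathContains('/usr/bin:/bin', '/usr/local/bin');            // -> false
pathContains('/usr/bin:/usr/local/bin:/bin', '/usr/local/bin'); // -> true
```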

Remote Team Communication

I wanted to share some thoughts on managing remote teams. I have worked with several remote teams and have learned what works and what fails. The most important thing for an effective remote team is communication, and you need regular communication beyond exchanging emails or posts on project boards.

Building a team ethos or spirit takes work. Teams without one are not going to be very productive: members will do the minimum needed to get by and will not go out of their way to help each other. This can easily happen when the only regular communication is by email. People need verbal communication to connect, and without that connection, people will not be receptive to criticism or correction.

I have found the regular use of Hangouts effective in building connections; seeing people’s expressions is an important part of communication. That should happen on at least a weekly basis. Daily standups (short 15-minute status meetings) over Hangouts have also proven effective, and they help teams organize and plan for achieving their goals.

It is also important to have non-work related communication among team members. This will help build bonds among team members. It will help people see the other team members as more than just a resource. To build a true team, team members need to invest in each other. An effective team will be a team of friends.

Another thing that will help build team ethos is getting the team together periodically. One thing I heard of another team doing: each year, the team would take a one-week vacation together. They would spend some time working together each day (2 to 4 hours) to help knit the team together, but would also have fun together. That time together in non-work activity helps build the connections that make a happy, lasting team.

Scott Dunn at TAP

Scott Dunn shared at our TAP (Tulsa Agile Practitioners) meeting this week. I really enjoyed the meeting. Scott conducted the meeting as a group discussion and not a presentation. He took topic suggestions from the group as well as covering items that he was passionate about.

The first topic covered was on group dynamics and handling conflict. One of the suggestions was using personality tests to understand the strengths and weaknesses of team members. It is important to balance teams. If everyone has the same strengths, you can develop biases and not work well because certain areas will get ignored. If you want to have healthy conflict in teams, you first need to spend time developing trust and respect in the team.

Another topic discussed was how to scale scrum. Since I am still new to scrum, I didn’t totally understand all the ideas presented. You have a scrum master that helps lead the team, but when there are several teams involved, you need to coordinate teams. Part of what is needed with multiple teams is maintaining consistency. That would be consistency in user interfaces and overall design. Signs of problems in scaling are seen when cross team problems start popping up.

During the discussion, Jason Knight shared a quote from a book (I missed the book name and author) – “Any fool can make money by ignoring maintenance.” I have seen this in person before. An environment like that is not going to keep good talented programmers around long. You need to keep up with technical debt and not let it accumulate and eventually cause major problems.

The final topic that Scott discussed was “What is the story of your life?” If someone made a movie of your life, what would the movie poster look like? For most people, that movie isn’t going to happen on its own. You need to decide to make your life one that matters, which means planning your life rather than just reacting to things that come up. One acronym he shared was “G.R.O.W.”:

G – Goal – decide what you want for goals so you don’t go through life just reacting. Set a goal to make an impact.
R – Reality – look at the reality of your life to see where you are now.
O – Options – develop options to get from where you are to where you want to be.
W – Walk it out – choose an option and execute.

This is what needs to be done to manage the story line of your life. The key is planning and not just reacting. I know I need to spend some time on “GROW”ing my life. This was a very thought provoking TAP meeting.

Changes Part 2

I am coming up on my last day at my current job and looking forward to starting my new one. I have been busy wrapping things up, so I don’t have much interesting to share. In addition to starting a new job, I am also looking at stopping consulting work on the side. I need to spend more time with family and working a full-time job and part-time consulting doesn’t leave much time for other activities. I am sorry to say, if you are not a current client, I am now unavailable. If you need a recommendation, I do know some other freelancers I could recommend, but that will be the extent of new client work going forward.