Notes from Daily Encounters with Technology
 
# Saturday, August 15, 2015

One of the main DocPad features is support for layouts, which are used to ensure a common design for multiple pages. This is an example of a simple layout:

<!DOCTYPE html>
<html lang="en">
<head>
    <title><%= @document.title %></title>
</head>
<body>
    <h1>Site Name</h1>
    <h2><%= @document.title %></h2>
    <%- @content %>
</body>
</html>

It uses the Eco (Embedded CoffeeScript) templating engine and shouldn't be difficult to understand even if you're seeing it for the first time.
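
A quick note on the two tag types used above: `<%= expression %>` writes the HTML-escaped result of a CoffeeScript expression, while `<%- expression %>` writes it as-is. That's why @content, which is HTML itself, is injected with the unescaped variant:

```
<%= "<b>text</b>" %>   <!-- renders as &lt;b&gt;text&lt;/b&gt; -->
<%- "<b>text</b>" %>   <!-- renders as <b>text</b> -->
```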

This could be a page using this layout:

---
layout: "base"
title: "Page Title"
---
<p>Page content</p>

The header part specifies the name of the layout to use and sets document properties, which are accessed in the layout using the @document.propertyName syntax. The rest of the page replaces @content in the layout file. Here's the resulting generated page:

<!DOCTYPE html>
<html lang="en">
<head>
    <title>Page Title</title>
</head>
<body>
    <h1>Site Name</h1>
    <h2>Page Title</h2>
    <p>Page content</p> 
</body>
</html>

Layouts can even be nested, which opens the door to a whole new set of options. A typical scenario is a single site with several different types of pages sharing a common basic design; e.g. articles could always display their author:

---
layout: "base"
---
<p>Author: <%= @document.author %></p>
<p>Published on: <%= @document.date %></p>
<%- @content %>

The sub-layout gets injected into the base layout. The page then needs to define the additional properties expected by the sub-layout:

---
layout: "article"
title: "Article Title"
author: "Damir Arh"
---
<p>Page content</p>

The page will be generated as one would intuitively expect:

<!DOCTYPE html>
<html lang="en">
<head>
    <title>Page Title</title>
</head>
<body>
    <h1>Site Name</h1>
    <h2>Page Title</h2>
    <p>Author: Damir Arh</p>
    <p>Page content</p>
</body>
</html>

Once I started using nested layouts, I soon wanted to set certain document properties in the sub-layout instead of directly on the page, because the value was common to all pages based on that layout and I didn't want to repeat myself.

Section name is a good example of such a property. The base layout would take care of displaying it:

<!DOCTYPE html>
<html lang="en">
<head>
    <title><%= @document.title %></title>
</head>
<body>
    <h1>Site Name - <%= @document.section %></h1>
    <h2><%= @document.title %></h2>
    <%- @content %>
</body>
</html>

To set the section name directly in the page, I could just add it to the header along with all the other properties:

---
layout: "article"
title: "Article Title"
author: "Damir Arh"
section: "Articles"
---
<p>Page content</p>

Somehow I expected this to work even if I would put it in the sub-layout header instead. To my surprise, the property value remained uninitialized. Giving it some more thought, I realized it couldn't have worked: instead of setting a property on the page, I was setting it on the sub-layout. There's a different syntax to access page properties from a layout, and I've already been using it to display them. There's no reason I couldn't set the values the same way:

---
layout: "base"
---
<% @document.section = "Articles" %>
<p>Author: <%= @document.author %></p>
<%- @content %>

The generated page will still have the property value properly initialized:

<!DOCTYPE html>
<html lang="en">
<head>
    <title>Page Title</title>
</head>
<body>
    <h1>Site Name - Articles</h1>
    <h2>Page Title</h2>
    <p>Author: Damir Arh</p>
    <p>Page content</p>
</body>
</html>

I didn't manage to find any such example online and spent too much time figuring it out. In my opinion, that's reason enough for writing this blog post.

Saturday, August 15, 2015 9:04:56 PM (Central European Daylight Time, UTC+02:00)
Software | DocPad
# Sunday, August 9, 2015

After successfully migrating the content from the old DasBlog site to the new DocPad based one, it was time to generate permanent redirects from the old URLs to the new ones, to keep inbound links working and avoid losing search rankings once the new site goes live. Since the site is going to be hosted in Azure, I decided to use the URL Rewrite module - rewrite maps, to be exact - because I need to map a large number of individual URLs, which can't be covered by a generic rule.

To avoid cluttering the web.config file with all the mappings, I moved the rewrite maps into a separate file, keeping in web.config only the rules and a reference to the external file:

<configuration>
  <system.webServer>
    <rewrite>
      <rewriteMaps configSource="rewriteMaps.config" />
      <rules>
        <rule name="commentLinks">
          <match url="^CommentView,guid,.+\.aspx$" />
          <conditions>
            <add input="{commentLinks:{REQUEST_URI}}" pattern="(.+)" />
          </conditions>
          <action type="Redirect" url="{C:1}" appendQueryString="false" 
                  redirectType="Permanent" />
        </rule>
        <rule name="permalinks">
          <match url=".+\.aspx$" />
          <conditions>
            <add input="{permalinks:{REQUEST_URI}}" pattern="(.+)" />
          </conditions>
          <action type="Redirect" url="{C:1}" appendQueryString="false" 
                  redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>

In the snippet above I have two rewrite rules for two rewrite maps. As their names suggest, the first one takes care of GUID based comment pages, while the second one covers the article permalinks. Here's a matching snippet of the rewriteMaps.config file:

<rewriteMaps>
  <rewriteMap name="permalinks">
    <add key="/BookReviewSignalRRealtimeApplicationCookbook.aspx" 
         value="/blog/posts/20150626-BookReviewSignalRRealTimeApplicationCookbook.html"/>
    <add key="/BookReviewMasteringTypeScript.aspx" 
         value="/blog/posts/20150615-BookReviewMasteringTypeScript.html"/>
  </rewriteMap>
  <rewriteMap name="commentLinks">
    <add key="/CommentView,guid,50cee933-b7e1-4a15-99a4-a70b69d07dbd.aspx" 
         value="/blog/posts/20150626-BookReviewSignalRRealTimeApplicationCookbook.html"/>
    <add key="/CommentView,guid,3fe215bf-3019-48fc-a765-bb4e3892fdd0.aspx" 
         value="/blog/posts/20150615-BookReviewMasteringTypeScript.html"/>
  </rewriteMap>
</rewriteMaps>

As long as rewriteMaps is the root element and its children would be valid directly in web.config, everything should work fine. Also notice that each rewrite map name must match the map name used in the input attribute of the corresponding rule condition.

Considering the large number of posts on my site, I had no intention of writing the rewrite maps by hand. Instead, I took advantage of the BlogML export of DasBlog content which I already used to convert the posts to the new format. Of course I also used the same tooling: Grunt and CoffeeScript. I just added the rewrite map generation to the conversion process.

First, I created two associative arrays with the URL mappings:

exportRewriteMaps = (posts) ->
  fs = require 'fs'
  moment = require 'moment'
  slug = require 'slug'
  titleCase = require 'title-case'

  permalinkMappings = {}
  commentLinkMappings = {}

  for post in posts
    newUrl = moment(post.$['date-created']).format('YYYYMMDD') + '-' + 
      slug(titleCase(post.title[0]._), '') + '.html'
    permalinkMappings[post.$['post-url'].replace('http://www.damirscorner.com/', '')] = newUrl
    commentLinkMappings['CommentView,guid,' + post.$['id'] + '.aspx'] = newUrl

The posts argument matches the post collection I used for the conversion. Of course, I generate newUrl the same way I generated the filenames for the converted blog posts, except for the .md extension, which DocPad removes when generating the site. Post permalinks are stored in the post-url attribute; I just make them relative by removing the hostname part. The comment links are recreated in code from the id attribute, which contains the post GUID.
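
As a rough sketch of what that transformation does (plain JavaScript instead of CoffeeScript, with the moment, slug and title-case packages replaced by simplified hand-rolled logic, so the exact slug rules are an approximation):

```javascript
// Approximation of the newUrl logic; the real code uses the moment,
// slug and title-case npm packages, whose exact rules may differ.
function buildNewUrl(dateCreated, title) {
  // moment(...).format('YYYYMMDD'): for an ISO date string this is
  // just the first ten characters with the dashes removed
  const datePart = dateCreated.slice(0, 10).replace(/-/g, '');

  // titleCase + slug with an empty separator: capitalize each word,
  // join them, then drop everything that's not a letter or a digit
  const titlePart = title
    .split(/\s+/)
    .map((w) => w.charAt(0).toUpperCase() + w.slice(1))
    .join('')
    .replace(/[^A-Za-z0-9]/g, '');

  return datePart + '-' + titlePart + '.html';
}

console.log(buildNewUrl('2015-06-15', 'Book Review: Mastering TypeScript'));
// 20150615-BookReviewMasteringTypeScript.html
```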

Now I had everything I needed to export the rewrite maps to a file:

xmlBuilder = require 'xmlbuilder'

root = xmlBuilder.create('rewriteMaps')

createRewriteMap 'permalinks', permalinkMappings, root
createRewriteMap 'commentLinks', commentLinkMappings, root

fs.writeFileSync 'rewriteMaps.config', root.end { pretty: true }

I use the xmlbuilder-js package. I create the required rewriteMaps root element and pass it to the createRewriteMap function, which generates a rewriteMap element based on the other function arguments. At the end, I dump everything into the destination file. The missing function is pretty straightforward:

createRewriteMap = (name, mappings, root) ->
  map = root.ele('rewriteMap',
    name: name)

  map.ele('add',
    key: '/' + key
    value: '/blog/posts/' + value) for key, value of mappings

The only part worth mentioning is the folder prefix that's added to the generated post file names. The resulting file can be used by IIS directly. Just before switching to the new site, I only need to regenerate it to include the new blog posts written in the interim.

Sunday, August 9, 2015 5:14:01 PM (Central European Daylight Time, UTC+02:00)
Development | CoffeeScript | Personal | Website | Software | IIS
# Saturday, August 8, 2015

I recently deployed my new web site to an Azure web app for the first time. The site seemed to load correctly, but a closer inspection with Fiddler revealed a couple of 404 errors.

404 errors for Font Awesome files

Font Awesome web font files appeared to be missing, although they were present on the web server. The reason was that by default files with .woff2 and .woff extensions are not served. When web server logging and detailed error messages are enabled for the web app, this becomes obvious from the error log in LogFiles/DetailedErrors:

The page you are requesting cannot be served because of the extension configuration. If the page is a script, add a handler. If the file should be downloaded, add a MIME map.

Since you don't have access to IIS Manager for an Azure web app, this needs to be done by adding the following lines to web.config:

<system.webServer>
    <staticContent>
        <remove fileExtension=".svg" />
        <mimeMap fileExtension=".svg" mimeType="image/svg+xml" />
        <remove fileExtension=".eot" />
        <mimeMap fileExtension=".eot" mimeType="application/vnd.ms-fontobject" />
        <remove fileExtension=".woff" />
        <mimeMap fileExtension=".woff" mimeType="application/font-woff" />
        <remove fileExtension=".woff2" />
        <mimeMap fileExtension=".woff2" mimeType="application/font-woff2" />
    </staticContent>
</system.webServer>

Even though I only encountered errors with .woff and .woff2 files, I decided to include .svg and .eot files in the configuration as well. Different browsers and browser versions retrieve web fonts in a different order; e.g. when I tried it with Chrome 4, it attempted to download the .svg font first.

As soon as I deployed the new web.config file to the server, the error was gone.

Successfully downloaded .woff2 file

Saturday, August 8, 2015 3:57:34 PM (Central European Daylight Time, UTC+02:00)
Development | Azure | HTML5 | Software | IIS
# Sunday, August 2, 2015

As part of changing my blogging platform, I also decided to switch from self-hosting the blog to hosting it in a Microsoft Azure web app. One of the available features I want to take advantage of is continuous deployment from a Git repository at one of the supported repository sites. Of course, the repository only contains the sources for the site, so it needs to be built every time the latest version is retrieved from the repository.

The configuration of the build that happens on every deployment needs to be committed to the repository along with the rest of the site. It turns out azure-cli can generate most of it. Once this npm package is installed globally, it can be instructed to create a generic deployment script for a Node.js project:

npm install azure-cli -g
azure site deploymentscript --node

The generated deploy.cmd script ensures a working Node.js environment but doesn't know anything about DocPad or how to build a site with it. Fortunately, the script is well commented and easy to extend. It is organized into several sections; building the site belongs at the end of the Deployment section, which by default consists of three parts. The DocPad build will be the fourth one, added immediately after the installation of npm packages:

:: 4. Build DocPad Site
echo Building the DocPad site
pushd %DEPLOYMENT_TARGET%
call  %DEPLOYMENT_TARGET%\node_modules\.bin\docpad.cmd generate --env static
IF !ERRORLEVEL! NEQ 0 goto error

There's one more change required to ensure reliable builds in Azure. On first use, DocPad prompts you to agree to its terms of use. This needs to be disabled, or the Azure build will fail when it happens. The configuration setting can be added at the beginning of the docpad.coffee file:

docpadConfig =
  prompts: false
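
If there's nothing else to configure, the complete docpad.coffee file could be as short as this (DocPad reads the configuration from the object the file exports via module.exports):

```coffee
# Minimal docpad.coffee: disable the interactive prompts
# and hand the configuration object over to DocPad
docpadConfig =
  prompts: false

module.exports = docpadConfig
```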

Once you commit all three files to the repository (.deployment, deploy.cmd and docpad.coffee), everything is ready for Azure deployment.
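
For completeness, the generated .deployment file is just a pointer telling the deployment engine which script to run; its typical content is:

```
[config]
command = deploy.cmd
```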

All of the Azure configuration will be done using the Azure preview portal. You first need to provision a new web app to host your site.

Provisioning a new Azure web app

If you're new to Microsoft Azure, pay attention to the service plan you're going to choose. While you're just testing everything during development, you'll want to take advantage of the free shared infrastructure tier, which allows you to host up to 10 sites for free.

Free shared infrastructure pricing tier in Azure

Once the web app is provisioned for you, you can configure continuous deployment from its dashboard by clicking the corresponding tile.

Set up continuous deployment

The wizard will guide you through the following steps:

  • Choose Source (e.g. Bitbucket or GitHub)
  • Authorization (using OAuth to avoid entering your password into the Azure portal)
  • Choose your organization (containing the repository you want to use)
  • Choose project (i.e. repository to use)
  • Choose branch (master by default, I have a dedicated deploy branch for deployment)

Once you configure everything, Azure will scan the repository and start the deployment of the latest commit in the selected branch. If everything goes well, the build will succeed and the site will be deployed.

Active deployment was successful

If you try to navigate to your site, it still won't work, because DocPad generated the site in the out subfolder of the working directory, while by default the root directory is being served. To change that, navigate to Settings > Application settings and scroll to the bottom of the pane. Virtual applications and directories are configured there. Delete the existing root entry pointing at site\wwwroot and create a new one pointing at site\wwwroot\out. After you save the changes, the site should start working as expected.

Virtual applications and directories configuration

If it still doesn't work for you, you'll want to access the files on the server to diagnose the issue. There are two ways to do that.

You can use Server Explorer in Visual Studio. If you're connecting to Azure for the first time, you'll need to right-click the Azure node and select Connect to Microsoft Azure Subscription... from the context menu. After you enter your Azure credentials, you'll be able to navigate to your web app and directly access any of its files.

Server Explorer in Visual Studio

Alternatively you can connect to the web app host using FTP. I don't think the password is displayed anywhere in the portal, but you can get all the connection details by downloading the publish profile from the web app dashboard.

Download publish profile from dashboard

The downloaded XML file contains two publish profiles; you need the one with FTP as its publishMethod. The attributes of interest are publishUrl, userName and userPWD. Enter this information into your FTP client of choice to access the files.
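
For orientation, the relevant profile has roughly this shape (the element and attribute names match the real format, but all the values below are made-up placeholders):

```xml
<publishData>
  <publishProfile profileName="mysite - FTP"
                  publishMethod="FTP"
                  publishUrl="ftp://waws-prod-xx-001.ftp.azurewebsites.windows.net/site/wwwroot"
                  userName="mysite\$mysite"
                  userPWD="placeholder-password" />
  <!-- a second profile with publishMethod="MSDeploy" is omitted here -->
</publishData>
```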

Sunday, August 2, 2015 4:19:30 PM (Central European Daylight Time, UTC+02:00)
Development | Azure | Software | DocPad | Git
# Sunday, July 26, 2015

Having a site automatically deployed from a Git branch can be convenient, but I don't feel all that comfortable deploying a commit before successfully running all the tests. Of course, this can easily be achieved by first committing to a different branch, running all the tests, and only then merging the commit to the branch the deployment is done from. Being lazy, I don't want to do the merging manually - that's what I have my continuous integration server for. If it's already running the tests, it should do the merging as well.

TeamCity has built-in support for the gated commit build pattern in the form of pre-tested commits. Unfortunately, to make them work, you need to use a supported IDE or a command line tool. That's why I decided in favor of an alternative approach: the automatic merge feature. It took some experimentation to configure it correctly, but I like how it turned out in the end. I'm writing down the steps I had to take, in case I ever want to use it in another project.

For the automatic merge feature to work at all, TeamCity must monitor both the source and the destination branch. I decided to deploy from the deploy branch and keep committing my work to the master branch. The relevant settings are part of the VCS Root configuration (you need to enable Show advanced features for the Branch specification field to show up):

VCS Root Branch Specification
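
Assuming the usual refs/heads naming, a branch specification along these lines makes TeamCity watch both branches (the exact entries depend on your repository layout):

```
+:refs/heads/master
+:refs/heads/deploy
```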

If your destination branch is not watched, TeamCity remains unaware of it, which results in the following error:

Automatic merge failed: Cannot find destination branch to merge into: no VCS branch maps to the 'deploy' logical branch name according to the VCS root branch specification.

Since TeamCity will now start committing to your source control, you might also want to change the username it commits with. For Git, it uses the following default value: username <username@hostname>. This can be configured with another advanced VCS root setting: Username for tags/merge.

With all that configured, it's time to add the Automatic merge build feature:

Add build feature: Automatic merge

Most of the configuration is pretty straightforward:

  • Watch builds in branches must contain source branch(es) filter: +:master in my case
  • Merge into branch must contain the destination branch: deploy in my case
  • You only want to Perform merge if: build is successful
  • Merge policy is up to you; I decided to Always create merge commit

I had problems with the default Merge commit message: parameter references failed to resolve properly (e.g. %teamcity.build.branch% always resolved to <default> instead of the source branch, as one would expect, and %build.number% always resolved to 1). I can live with a fixed message, though. Git's merge tracking works well enough for my needs.

Sunday, July 26, 2015 11:41:24 AM (Central European Daylight Time, UTC+02:00)
Software | Git | TeamCity
My Book

NuGet 2 Essentials

About Me
The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.

All Content © 2015, Damir Arh, M. Sc.
Based on DasBlog theme 'Business' created by Christoph De Baene (delarou)
Social Network Icon Pack by Komodo Media, Rogie King is licensed under a Creative Commons Attribution-Share Alike 3.0 Unported License.