July 2012

Volume 27 Number 07

Forecast: Cloudy - Mixing Node.js into Your Microsoft Azure Solution

By Joseph Fultz | July 2012

Node.js has been getting a lot of press as of late, and it’s highly touted for its asynchronous I/O model, which releases the main thread to do other work while waiting on I/O responses. The overarching rule of Node.js is that I/O is expensive, and it attempts to mitigate the expense by forcing an asynchronous I/O model. I’ve been thinking about how it might be incorporated into an already existing framework. If you’re starting from scratch, it’s relatively easy to lay out the technology choices and make a decision, even if the decision is pretty much based on cool factor alone. However, if the goal is to perform a technology refresh on one part of a solution, the trick is picking something that’s current, has a future, doesn’t come with a lot of additional cost and will fit nicely with the existing solution landscape.

That’s exactly what I’m going to demonstrate in this column. I’ll take an existing solution that allows viewing of documents in storage but requires a shared access signature to download them. To that solution I’ll add a simple UI using Node.js. To help with that implementation, I’ll take advantage of some commonly used frameworks for Node.js. The solution, therefore, will include:

  • Node.js—the core engine
  • Express—a Model-View-Controller (MVC)-style framework
  • Jade—a rendering and templating engine

Together these three tools will provide a rich framework from which to build the UI, much like using ASP.NET MVC 3 and Razor.

Getting Started

If you’re new to Node.js, it’s probably best to start with the walk-throughs available from Microsoft at windowsazure.com/develop/nodejs. You’ll also need to install the Node.js SDK for Azure. Additionally, you’ll probably want to spend a little time poking around Express (expressjs.com) and Jade (jade-lang.com). If you’re new to these tools, you’ll find some familiar concepts and a mix of familiar and unfamiliar syntax.

For this scenario, my existing services will do the work on the Azure side, and the Azure-hosted Node.js-based site will call those services to render a list of documents for access. Generally, it’s a useful practice to put a layer of indirection between the client and the back-end services. This isolates the services from any interface changes, but the real value is often in the extra functional flexibility and the way you can include and exclude back-end service providers.

In the existing solution, as represented in Figure 1, the goal was to grant access only to authenticated users; successful authentication produces a Shared Access Signature (SAS). The idea was to grant read access to anyone who authenticated and then subsequently grant full Create, Read, Update, Delete (CRUD) access to a particular document based on role and membership level. Here I’ll focus solely on the read permissions.

Figure 1 The Request Sequence

Creating the Services

I’ll mock up an authentication service to return an ID. The subsequent service call will retrieve a list of files. The Azure Storage container I’m using (“documents”) has the public permissions restricted. I want to provide a list of documents even if the user isn’t authenticated, but I don’t want an unauthenticated user to be able to open the files. My two calling signatures for the API I’ve created are:

https://[host]/admin/GetAccess?user=[user]&pwd=[password]
https://[host]/admin/files?accessId=[authId]

Of course, you’ll want a more realistic auth service that doesn’t use the querystring and does use SSL; I won’t cover that piece of the solution here.

First, I need a method for creating a SAS (see Figure 2). I’ll need this method shortly when I create the method that builds the document list.

Figure 2 Method for Getting the Shared Access Signature

public string GetSharedAccessSignature()
{
  string sas = (string) System.Web.HttpContext.Current.Cache.Get("sas");
  // If there's no cached SAS, generate one
  if (sas == null)
  {
    // TODO: hardcoded container, move to config
    CloudBlobContainer container = 
      blobClient.GetContainerReference("documents");
    // Ask the container for a SAS, passing in a newly initialized policy
    sas = container.GetSharedAccessSignature(new SharedAccessPolicy()
    {
      SharedAccessStartTime = DateTime.Now,
      SharedAccessExpiryTime = DateTime.Now.AddMinutes(MaxMinutes),
      Permissions = 
        SharedAccessPermissions.Read | SharedAccessPermissions.List
    });
    // Add to cache for reuse because this isn't a per-user SAS; an absolute
    // expiration requires NoSlidingExpiration or Cache.Add throws
    System.Web.HttpContext.Current.Cache.Add("sas", sas, null,
      DateTime.Now.AddMinutes(MaxMinutes),
      System.Web.Caching.Cache.NoSlidingExpiration,
      CacheItemPriority.High, null);
  }
  return sas;
}

For the most part, this is fairly typical Azure Storage code until the call to GetSharedAccessSignature. Because there’s no stored, container-level access policy here, I pass in the window during which access is allowed and the type of permissions. All I want to provide via the SAS is the ability to read and list files. Also, because the SAS will theoretically be used by anyone who is authenticated, I add it to the cache for reuse, which avoids the churn of generating a new access key for every request.

The service interface is exposed as a RESTful WCF operation:

[OperationContract]
[WebGet(UriTemplate = "Files?accessId={accessId}")]
List<BlobInfo> GetFiles(string accessId);

Note the use of the custom class BlobInfo—again, I’m using indirection. I have specific fields that I want to return, and IListBlobItem doesn’t necessarily represent them. So I’ll marshal the information returned for each IListBlobItem into a list of my own type, as shown in Figure 3.

Figure 3 GetFiles Implementation

public List<BlobInfo> GetFiles(string accessId)
{
  List<BlobInfo> blobs = new List<BlobInfo>();
  CloudStorageAccount storageAccount =
    CloudStorageAccount.FromConfigurationSetting(
    "StorageAccountConnectionString");
  CloudBlobClient sasBlobClient = null;
  string sas = null;
  // For the mock, VerifyId just checks for a cached credential
  if (!string.IsNullOrEmpty(accessId) && VerifyId(accessId))
  {
    sas = GetSharedAccessSignature();
    // Create the blob client directly, using the SAS
    sasBlobClient = new CloudBlobClient(storageAccount.BlobEndpoint,
      new StorageCredentialsSharedAccessSignature(sas));
  }
  else
  {
    // Unauthenticated (or unverified) callers get the public view
    sasBlobClient = storageAccount.CreateCloudBlobClient();
  }
  CloudBlobContainer blobContainer =
    sasBlobClient.GetContainerReference("documents");
  foreach (IListBlobItem blob in blobContainer.ListBlobs())
  {
    BlobInfo info = new BlobInfo();
    info.Name = blob.Uri.LocalPath;
    info.Uri = blob.Uri.AbsoluteUri;
    info.Sas = sas;
    // Without a SAS this is just the URI; with one it's a working link
    info.CombinedUri = blob.Uri.AbsoluteUri + sas;
    blobs.Add(info);
  }
  return blobs;
}

It’s important to note in Figure 3 that I’m using the SAS if the user is authenticated in order to return a list that respects the access policy on the container.

With the Representational State Transfer (REST) service in place, I can run a quick test via a browser window. Setting up the service interface this way makes it easy to mock authentication with a well-known value until the loop that generates the list and the SAS is working properly. Once that’s done, VerifyId(string) simply checks whether a credential is cached under a key equal to the accessId. Figure 4 shows a list returned without authentication. Because the request wasn’t authenticated, the SAS element is nil; I can use the data to render the list, but I can’t give the user a working link, because there’s no SAS.

Figure 4 An Unauthenticated List

Figure 5 shows the authenticated list, which does include the SAS.

Figure 5 An Authenticated List with the SAS

It will be the job of the Node.js client to sort through what the service returns from an authenticated call and render hyperlinks with the SAS appended to the URI. To help with that, I’ve provided a CombinedUri element, so the client needs to access only that one element. Finally, while the XML is fine, because I’m working in Node.js it makes sense to change the attributes on the interface to return JSON, so the service response can be consumed directly as an object:

[WebGet(UriTemplate = "Files?accessId={accessId}",
  ResponseFormat=WebMessageFormat.Json)]

Here’s what the JSON output looks like:

[{"CombinedUri":"https:\/\/footlocker.blob.core.windows.net\/documents\/AzureKitchen-onesheet.docx?st=2012-03-05T05%3A22%3A22Z&se=2012-03-05T05%3A27%3A22Z&sr=c&sp=rl&sig=Fh41ZuV2y2z5ZPHi9OIyGMfFK%2F4zudLU0x5bg25iJas%3D","Name":"\/documents\/AzureKitchen-onesheet.docx","Sas":"?st=2012-03-05T05%3A22%3A22Z&se=2012-03-05T05%3A27%3A22Z&sr=c&sp=rl&sig=Fh41ZuV2y2z5ZPHi9OIyGMfFK%2F4zudLU0x5bg25iJas%3D","Uri":"https:\/\/footlocker.blob.core.windows.net\/documents\/AzureKitchen-onesheet.docx"}]

As noted, JSON is what we ultimately want here, as it’s directly consumable within Express and Jade.
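To see what “directly consumable” means, here’s a minimal sketch that turns the service response straight into JavaScript objects, using only the built-in https module; the host and access ID are placeholders:

var https = require('https');

var accessId = 'well-known-test-id';  // placeholder credential for the mock
// '[host]' stands in for the service host
https.get({ host: '[host]', path: '/admin/files?accessId=' + accessId },
  function(res) {
    var body = '';
    res.on('data', function(chunk) { body += chunk; });
    res.on('end', function() {
      var docs = JSON.parse(body);       // an array of objects, no mapping layer
      console.log(docs[0].CombinedUri);  // ready to drop into a hyperlink
    });
  });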

Node.js UI

I’ve already set up Node.js, Express and Jade, so I’m ready to create the UI. I’ve gone through the process of getting deployment of Node.js roles up and running in Visual Studio, but that’s a fairly detailed and completely manual process. So, because there’s no tools integration for the Node.js part of this, I’ll use Sublime Text 2 to do my editing and Chrome to debug (as described on Tomasz Janczuk’s blog at bit.ly/uvufEM).

I should mention a few housekeeping items. For the uninitiated, the frameworks I’m employing provide some easy-to-use wrappers around certain functionality, MVC and a template-rendering engine:

  • Restler for easy REST calls (think simplified WebClient)
  • Express for the general MVC-style application framework
  • Jade for template-rendering, similar to the Razor engine used in ASP.NET MVC

These are all considered modules in Node.js (analogous to DLLs in .NET) and are generally installed via the Node Package Manager (npm). For example, to install Restler, run “npm install restler” from within the project folder. This takes the place of manually downloading a module and adding a reference to it in the project.
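For instance, once all three modules are installed, pulling them into a script takes one require call each (the variable names are just common conventions):

// Installed beforehand with: npm install express jade restler
// require loads each module from the local node_modules folder,
// roughly the Node.js equivalent of adding an assembly reference
var express = require('express');
var jade = require('jade');
var rest = require('restler');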

One last bit of information for the unfamiliar: you’ll notice a lot of anonymous functions nested inside other functions. My best advice is to reformat the code enough to see the nesting as you work with it, until you can parse it naturally without reformatting. I’ll nest my samples for readability as well as use screenshots from Sublime, which are nicely colored and easier to read.

I used the commands New-AzureService and New-AzureWebRole to create an app named AzureNodeExpress. I’ve also made a couple of other modifications. In server.js I added routes to get to the index page; the ASP.NET analog is the MapRoute method used in MVC projects.

Server.js Modifications

Much like using statements in C#, I need to tell Node.js which libraries I’ll be using. In Node.js, I set those references by assigning the return value of the require('[lib name]') function to a variable. Once the references are set, I do a bit of configuration to set some of the engine variables (for example, setting “view engine” to “jade”). Of particular interest are the “view engine,” router, bodyParser, cookieParser and session.
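Pulled together, that part of server.js might look something like the following sketch; I’m assuming the Express 2.x-era APIs the Azure scaffolding generated at the time, and the session secret is a placeholder:

var express = require('express');
var app = express.createServer();

app.configure(function() {
  app.set('views', __dirname + '/views');
  app.set('view engine', 'jade');    // render .jade templates by default
  app.use(express.bodyParser());     // populates req.body from form posts
  app.use(express.cookieParser());
  app.use(express.session({ secret: 'placeholder-secret' }));
  app.use(app.router);               // the routes registered below
});

// On Azure, iisnode supplies the port through the environment
app.listen(process.env.PORT || 1337);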

 I’ll skip some of the more mundane elements, but I do want to set up my routing. For the Get verb on my Index page I’ll simply render the view directly:

app.get('/index',
  function(req, res){
    // Pass empty defaults so the template's username and docList
    // references resolve on the initial GET
    res.render('index.jade',
      {title: 'index', layout: false, docList: [], username: ''});
  }
);

However, for the Post verb I want to pass the handling over to the index model. To accomplish that I have to “bind” a defined method of the model:

app.post('/index', index.GetAccessKey.bind(index));

With the routing in place I’ll need to set up both the view and the model.

The View—Index.jade

In a sense, I’m skipping from the beginning to the end by going from the controller to the view, but when working in an MVC style I like to create a simplified view to work against first. Jade syntax is basically HTML without the decoration of brackets. My entire Jade template is shown in Figure 6.

Figure 6 Jade Template

html
  head
    title Default
  body
    h1 File List Home Page
    br
    label Welcome #{username}
    form(method='post', action='/index')
      label Username:
        input(name='username', type='text')
      br
      label Password:
        input(name='password', type='password')
      br
      button(type='submit') Login
    h2 Files
    form
      table(border="1")
        tr
          td Name
          td Uri
        each doc in docList
          tr
            td #{doc.Name}
            td
              a(href=doc.CombinedUri) #{doc.Name}

Of note here are the use of #{[var]} to reference variables passed to the template, and the loop inside the table, which is a kind of abbreviated foreach. I’ve arbitrarily named the list of items I want to iterate over docList. This is important, because in index.js, where I ask Jade to render this view, I’ll need to pass in the value for docList. Things are pretty basic here, because I’m just creating a developer UI—simple and plain.

The Model—Index.js

Having set up the runtime infrastructure in server.js and the final view template in index.jade, I’m left with the meat of the execution, which happens in index.js. Remember, I set up a binding for app.post to the index page. That binding loads and runs the prototype I’ve created in index.js. To do this, I add functions to the index prototype, as shown in Figure 7. In essence, I’m creating a named function (for example, GetAccessKey) and defining an anonymous function as its execution body. In each of these functions I use the Restler module to simplify the REST calls I need to make.

Figure 7 Index.js Functions
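Because the original figure is a screenshot, here’s a reconstruction of the two functions based on the description that follows. Treat it as a sketch: the service URI is a placeholder, and the exact shape of the prototype and its export are assumptions.

var rest = require('restler');

var serviceUri = 'https://[host]/admin';  // placeholder for the WCF service

function Index() {}

// Handles the form post: exchanges the posted credentials for an access key
Index.prototype.GetAccessKey = function(req, res) {
  var self = this;  // capture 'this'; the callback below rebinds it
  var uri = serviceUri + '/GetAccess?user=' + req.body.username +
            '&pwd=' + req.body.password;
  rest.get(uri).on('complete', function(result) {
    self.accesskey = result;
    self.ShowDocs(req, res);
  });
};

// Fetches the document list using the key and asks Jade to render the view
Index.prototype.ShowDocs = function(req, res) {
  var uri = serviceUri + '/files?accessId=' + this.accesskey;
  rest.get(uri).on('complete', function(result) {
    res.render('index', {
      title: 'Doc List',
      layout: false,
      docList: result,  // Restler has already parsed the JSON
      username: req.body.username
    });
  });
};

module.exports = new Index();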

The first call once the post binding fires is GetAccessKey, which simply takes the username and password submitted via the form post, appends them to the URI as part of the querystring and uses Restler to do a GET. Remember, in Node.js all I/O happens asynchronously, which is one of the reasons you’ll see the proliferation of highly nested anonymous functions. Staying true to that pattern in the call to rest.get, I define an anonymous function that executes once the request is complete. Without the error-handling code, the call simplifies to:

var self = this;  // capture 'this'; the anonymous callback rebinds it
rest.get(uri).on('complete',
  function(result){
    self.accesskey = result;
    self.ShowDocs(req, res);
  }
);

Hopefully, that reformatting helps give a sense of what’s going on. Once I’ve gotten the key from my service, I append it to the URI in the method that fetches the document list. Now things get a little different from the usual. In the anonymous function handling the return of the REST call for the document list, I ask Jade to render the results for me:

res.render('index', {title: "Doc List",
  layout: false,
  docList: result,
  username: req.body.username});

I noted earlier that the template iterates over a variable named docList, so I need to use that exact name here. The call to res.render tells the Express framework to render the “index” view, passing the parameters in as an object literal of name:value pairs.

Runtime

If I attempt to browse directly to one of the files to download it, I’m presented with nothing; the page isn’t found. You might expect an unauthorized error from Azure Storage, but when you try to access something that’s marked private, the response is a 404: the resource doesn’t exist. This is by design, and it’s desirable, because something that’s “private” shouldn’t exist to the public, even in concept. If a 401 error were returned instead, it would reveal that something is actually there, violating the security policy that “private” represents.
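A quick way to see this behavior from Node.js (a sketch; the storage account and blob name are placeholders):

var rest = require('restler');

// Anonymous GET against a blob in the private 'documents' container
rest.get('https://[account].blob.core.windows.net/documents/somefile.docx')
  .on('fail', function(data, response) {
    // Azure Storage answers 404 (ResourceNotFound) rather than 401, so
    // private content simply doesn't exist to unauthenticated callers
    console.log(response.statusCode);  // 404
  });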

Because I’ve secured the storage location, no direct access is allowed. However, once I run the sample code, the story is a little different. I publish the application using the Windows PowerShell Publish-AzureService command, browse to the page and enter my credentials; I’m then presented with a list of links to the files (see Figure 8).

Figure 8 Links to the Files

Because my service brokers the calls to storage, I’m able to list the files even though I can’t list them directly. Also, because each link has the SAS appended, clicking one prompts me to open or save the target document.

Wrapping Up

If you’re interested in new or trending technologies as a way to evolve your Azure application, and you’re a believer in what’s going on in the Node.js domain, Azure has you covered—not only for hosting the solution, but also on the development side, with options such as the client library for Node.js, hitting the REST API directly, or working through a layer of indirection as I demonstrated here. Development certainly would be a lot better and easier if Node.js had proper tooling support, and I’m sure we’ll eventually see some sort of Visual Studio integration if the popularity of Node.js continues to grow.


Joseph Fultz is a software architect at Hewlett-Packard Co., working as part of the HP.com Global IT group. Previously he was a software architect for Microsoft, working with its top-tier enterprise and ISV customers to define architecture and design solutions.

Thanks to the following technical experts for reviewing this article: Bruno Terkaly