Government 2013

Volume 28 Number 10A

Geo-Data Visualization - Geo-Visualization of Government Data Sources

By Malcolm Hyson

When you work with multiple government clients, you’ll likely encounter common issues spread across the different organizations. I routinely see multiple efforts revolving around search, content targeting and data visualization. In this article, I’ll cover a way to handle geographic data queries using a cloud-hosted API. Then I’ll show how you can visualize the results using the Bing Maps SDK for Windows Store apps.

Solution Components

Last year I got a suggestion from a colleague to produce a mobile app that would provide publicly available data based on a user’s current location. We went back and forth on ways our end user would interact with the content and decided that a geo-referenced query capability would be the best way to go. With a little research, I found that similar features existed in a few apps, but I was surprised at the lack of apps focused on delivering publicly accessible, geo-coded government information.

Data Sources I researched what types of data existed out there in the wild and quickly found some promising sources, with some surprises regarding the volume and quality of the data. Even after I found what I considered to be a glut of content, I continually got new suggestions from my counterparts working on other government agency engagements. As you’d expect, when pulling content from multiple diverse locations, the one thing you won’t get is a standard format. I also found that the sites varied in the breadth and fidelity of their content, but most of it came in a few well-known document types such as CSV, GeoRSS, Keyhole Markup Language (KML) and shapefiles.

Once I had a few data sources in mind, I had to determine what components I needed to connect my end users with a reliable hosted service. Here are the tools I selected:

  • Bing Maps SDK for Windows Store apps: For client-side rendering of maps and geo-referencing search results.
  • Team Foundation Service: Continuous integration with Windows Azure Web Sites and source control management (SCM).
  • Windows Azure: Windows Azure SQL Database (formerly SQL Azure) for the data storage and geo-query capabilities; Windows Azure Web Sites to host the API.
  • Windows Store SDK: Application deployment and framework.
  • Visual Studio 2012 (Update 3): Development environment for building, testing and publishing the solution.

Choosing a Data Store and API Although I could write my own app to query and return results in the various aforementioned formats from different sources, there’s no reason to reinvent the wheel. The SQL Server platform supports spatial queries, so I decided to use it to take care of any data normalization and content query issues. Leveraging this structure allows me to quickly convert any result set into an XML or JSON data feed, or any other format that’s easily rendered by the geo-display engine of my choice.

I decided to use an ASP.NET MVC 4 Web app with the Web API template to deliver the results to the end client. ASP.NET MVC 4 includes Entity Framework 5 with spatial query support that provides a reliable way to query content stored in SQL Server based on a user-selected, geographically referenced point.

Next, I needed to choose where to host my app, and I decided to go with the Windows Azure Web tier. Visual Studio contains built-in hooks to publish ASP.NET MVC projects to a Windows Azure Web Site. This allows me to take advantage of the free Web application tier along with the ability to easily scale the app if there’s sufficient demand.

The Client App After I defined my API, the next step was to devise a way for end users to consume it. Because this is a geo-referenced app, the Bing Maps service was a natural choice for rendering the results on a map. I decided to go with a Windows Store app to take advantage of the client deployment and update models. The app created here could be published to the Windows Store with some additional polish, or placed in an enterprise app store if you want to manage the deployment of the code within your organization.

Source Control I won’t go into much detail related to the application lifecycle management (ALM) of the service; however, in keeping with the cloud-hosted theme, I chose to manage the solution with Team Foundation Service for a few reasons. First is the obvious need for source control and task management. Then there’s the cost of basic infrastructure maintenance. Although Visual Studio Team Foundation Server 2012 is a pretty easy server application to set up and maintain within your own infrastructure, like any service it comes with a certain amount of management overhead: OS cost, network infrastructure, security and scalability all need attention. Running my own infrastructure could address these issues, but it’s hard to justify when Team Foundation Service offers comparable capabilities with a negligible amount of setup effort. All you have to do is sign up using your Live ID and create a new project, and you’re off and running. With that said, I’ll use Team Foundation Service unless I find a well-defined argument against it.

Debrief: Government-Hosted Geo-Tagged Content

Different government organizations have published geo-tagged data in a variety of formats and levels of detail. This article describes how to find and then subsequently normalize that content. It also covers how to publish the solution using the Microsoft Windows Azure hosted service with the visualization handled on a Windows Store app integrated with Bing Maps to provide a highly scalable app at a minimal cost.

 

IT Brief:

By leveraging Software as a Service (SaaS) offerings when possible, IT managers can drastically reduce or potentially eliminate the infrastructure required to deliver solutions to their customers. Services such as Windows Azure alleviate the need to focus on keeping servers up and online. By leveraging these tools, organizations can deploy and scale applications in a matter of minutes, using the following:

  • Windows Azure Web Site publishing
  • Team Foundation Service application lifecycle management (ALM)
  • Geo-visualization with Bing Maps
  • Leveraging a SaaS model to increase the speed of delivery and reduce infrastructure cost

Dev Brief:

Using Windows Azure SQL Database with geographic fields to normalize content from multiple, publicly available data sources can provide many interesting opportunities for data analysis and visualization. By leveraging Team Foundation Service in conjunction with the continuous integration option when connected to Windows Azure, solution developers can quickly build, test and deploy their apps without ever leaving the Visual Studio IDE. This involves:

  • Team Foundation Service ALM
  • Bing Maps SDK for Windows Store apps
  • Windows Store apps
  • Windows Azure SQL Database geographic query capabilities

More Information:

  • Team Foundation Service: bit.ly/SlfGJ8
  • Bing Maps SDK for Windows Store apps: bit.ly/13svBhM
  • Rick Strahl’s blog post, “Basic Spatial Data with SQL Server and Entity Framework 5.0”: bit.ly/MuaLUS
  • “Walkthrough: Publishing an F#/C# MVC Application to Windows Azure”: bit.ly/18hQWO2

 

Building the App

Provisioning the Windows Azure Web Site As stated earlier, I decided to use a Windows Azure Web Site to host the API tier, leveraging an ASP.NET MVC 4 Web API app in conjunction with the SCM capabilities found in Team Foundation Service. Because the back-end service will be a simple Web-hosted API communicating with a SQL Server database, I can quickly get a Web project up and running using the free Web Site option available from the Windows Azure management console. I like to use this option when doing rapid application development and prototyping because of the low barrier to entry coupled with the ability to leverage a highly scalable architecture. While I’m using a nonsecured site for this application, you could easily add an SSL certificate to encrypt the client communication with the Web service. To learn more about securing API endpoints, see the Windows Azure documentation, “Configure an SSL Certificate on an HTTPS Endpoint,” at bit.ly/ueHwrT. To sum it all up, you can scale the app up as it grows and starts to consume more resources. More important, it’s free to start, and I’ve yet to find a price better than free from a hosted-application perspective.

When you create the Windows Azure Web Site, you’ll be presented with a few options. I chose to use the custom Web Site option with a new database. Another detail worthy of note is the “Publish from source control” option in the site-provisioning wizard. The combination of the Windows Azure Web Site in conjunction with the continuous integration capabilities of Team Foundation Service provides the ability to quickly code, test and publish the app with minimal effort.

You have a variety of source control options to pick from when you publish from source control, and I selected the default Team Foundation Service option. The integration of Visual Studio and Team Foundation Service allows me to publish my application using continuous integration, so as I check in code, my app will be automatically deployed to a Windows Azure Web Site as part of the build process. To get more details, see the Windows Azure documentation, “Continuous delivery to Windows Azure by using Team Foundation Service,” at bit.ly/N0hWAp. I highly recommend taking a little time to read and understand this tutorial if you plan to take advantage of these features.

Defining the Data Store In an attempt to keep my structure as simple as possible for the first iteration, I decided to keep my schema definition to a minimum. My Source table holds the details related to my different content sources, and my Item table holds the details related to my entity of interest. Figure 1 shows the table structure in the Visual Studio entity model view.

Figure 1 My Schema Definition Table Structure in the Visual Studio Entity Model View

The key to the app is the Location field of the Item table, which uses the SQL Server geography type. I’ll use this field to hold the coordinates for the items in the Item table, and to determine each item’s distance from the query coordinates at run time.
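To give a sense of how that schema surfaces in code, here’s roughly how the two entities look as classes. This is only a sketch: the real types are generated later by the ADO.NET Entity Data Model wizard, and the Id, Name and SourceId member names are assumptions based on Figure 1.

using System.Collections.Generic;
using System.Data.Entity.Spatial;

public class Source
{
  public int Id { get; set; }
  public string Name { get; set; }
  public virtual ICollection<Item> Items { get; set; }
}

public class Item
{
  public int Id { get; set; }
  public string Title { get; set; }
  public string Description { get; set; }
  public DbGeography Location { get; set; } // Maps to the geography column
  public int SourceId { get; set; }
  public virtual Source Source { get; set; }
}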

Deploying the Database With the first revision of my app, I plan to keep my database changes to a minimum, so I decided to publish my schema from my database project. Visual Studio 2012 provides all of the tooling needed to publish to Windows Azure SQL Database and manage the changes for the duration of the application lifecycle. I like the SQL Server Database Project in this case because it allows me to manage the database schema just like the rest of my source code.

One nuance to using the Database Project with Windows Azure SQL Database is that you need to let Visual Studio know this is a Windows Azure SQL Database Project—as opposed to a regular SQL Server instance—so it can properly deploy. You do this by opening the properties for the Database Project and selecting Windows Azure SQL Database as the target platform.

At this point, I published my schema to the Windows Azure SQL Database instance via the Visual Studio IDE by right-clicking my Database Project and selecting the Publish menu option. Once I enter the information required to point the publishing wizard to my Windows Azure-hosted database, I like to save my publishing profile for future use. This creates a .publish.xml file with the relevant details for publishing to a Windows Azure SQL Database instance in the future. Typically, I like to set the default publishing profile of my Database Projects during development of the data tier to ensure quick and reliable schema synchronization with the current projects. This is accomplished by right-clicking the .publish.xml file and choosing the Set as Default Profile option. Now when I want to publish, my Windows Azure SQL Database values will be prepopulated.

With the settings in place, pushing the schema over to my Windows Azure SQL Database instance couldn’t have been easier. Within a few seconds of selecting publish, I was presented with a message stating the request completed successfully.

Adding New Content For most data-driven applications, securing accurate content can be a daunting effort. Fortunately, there are many government geo-referenced data sources available for public consumption. The volume of information can almost seem overwhelming when you start to look into what’s out there, but trust me, it’s well worth your time to check out these sites and familiarize yourself with their content for future reference. Figure 2 lists some sites that provide interesting and diverse content.

Figure 2 Some Sites That Provide Good Geographically Tagged Content

Site | URL | Description
Data.gov | data.gov | Data from the executive branch. Part of the current administration’s open data policy. Focused on delivering federally approved content with the inclusion of other datasets from state governments and universities.
National Atlas | 1.usa.gov/ro1cu | Good for geo-referenced bulk stats such as crime and environmental data over time.
National Atlas (1 Million-Scale Data) | 1.usa.gov/YpGUyZ_ | High-resolution data from the National Atlas, subjected to a higher level of scrutiny than the other datasets. Also available as a publicly accessible Web service at 1.usa.gov/vpUfAQ.
U.S. Geological Survey (USGS) | usgs.gov | An excellent resource for naturally occurring events within the United States, covering topics such as climate change, influenza outbreaks, earthquakes and rainfall.
Recovery.gov | recovery.gov | A source for content related to government funding initiatives, detailing government spending by region.
U.S. Census Bureau | 1.usa.gov/1a0a43G | No list of U.S. government data sources would be complete without content from the Census Bureau.
Naval Oceanography Portal | 1.usa.gov/bkII5 | Lots of content related to aquatic conditions. The United States Naval Observatory also provides high-quality astronomical data.
Office of Coast Survey (OCS) | 1.usa.gov/17iNiSU | I was extremely surprised by how much content OCS provides. For anything involving maritime vessels, this would be my first data source.

When I started working with geo-referenced data sources back in the late ’90s, the dominant platform in the market was from Esri. That company is still a fairly large player in the geo-visualization arena, and you’ll more than likely run into plenty of content delivered in its shapefile format (.shp), which most Esri applications can quickly consume. Because SQL Server can’t natively import shapefiles, I had to find a way to extract their content into an easier-to-consume format. As with most interesting problems, there’s an interesting solution. Morten Nielsen has provided a free utility called SQL Server 2008 Spatial Tools (bit.ly/3X9lUJ), with which I’ve had pretty good success. If you don’t want to use his code in your environment, Nielsen has provided links to other support tools.

For content, I decided to use the “North American Atlas - Populated Places” (1.usa.gov/13CgufS) download for starters. Using the SQL Server 2008 Spatial Tools application, I uploaded the shapefile to its own temporary table and then used a select statement to insert the rows into my defined format, which quickly yielded 1,858 records to work with. With my database deployed and full of good content, I was ready to create the API.

Building the API

Now that I had my data definition and some content in place, it was time to create the API layer with which my client apps would interact. I decided to go with an ASP.NET MVC 4 Web API project because of its built-in ability to deliver responses in JSON or XML formats. Going this route allowed me to skip all the serialization plumbing I used to have to construct myself to ship my results to the client.

For the Web API, I started by using the ASP.NET MVC 4 Web Application template and selecting the Razor view engine. I chose to leave the default option for “create unit test” selected. While I won’t go into detail on unit testing here, it’s important to note that as you take your app through a full lifecycle, having the ability to functionally validate your app via the unit-test framework tools provided in Visual Studio shouldn’t be overlooked.
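As a rough illustration, a functional test against the GetItems action (shown later in Figure 3) might look like the sketch below; the ItemsController class name and the coordinates are assumptions, and a test like this exercises the live data store rather than a mock:

using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class ItemsControllerTests
{
  [TestMethod]
  public void GetItems_WithinFiftyMilesOfDc_ReturnsResults()
  {
    // Hypothetical controller class hosting the GetItems action.
    var controller = new ItemsController();

    // Query a 50-mile radius around Washington, D.C.
    var results = controller.GetItems(38.9, -77.0, 50.0);

    Assert.IsNotNull(results);
  }
}

Once I had the project in place, I needed to add the connectivity to the data store prior to setting up the API commands.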

Entity Framework 5 and Web API Entity Framework 5 introduced support for spatial queries, which—in addition to all of the other benefits you gain by leveraging the framework—made it an easy choice to use for integration to my Windows Azure SQL Database to drive the spatial API query support. To do this, I created the model by right-clicking the Models folder, adding a new ADO.NET Entity Data Model (EDM), and following the rest of the framework wizard’s steps in order to connect my Web API project with my Windows Azure SQL Database instance.

Thinking through the security of the app, I decided to start off by providing only the ability to query, because this API would be publicly accessible to anonymous users. If I were going to include any PUT or DELETE operations with this release, I’d need to secure those operations to ensure my data could be modified only by authorized users (a sketch of what that might look like follows). To provide my query capabilities, I used the standard controller template and overloaded the GetItems function to allow it to handle my geo-referenced query commands.
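Here’s a rough sketch of such a secured write operation, assumed to live inside the same controller as GetItems; the PutItem action and its body are hypothetical and not part of this release:

[Authorize] // Web API's System.Web.Http.AuthorizeAttribute
public void PutItem(int id, Item item)
{
  // Hypothetical update logic: mark the entity as modified and save.
  db.Entry(item).State = System.Data.EntityState.Modified;
  db.SaveChanges();
}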

My API is centered on one operation: to retrieve the results from the database within the specified radius of the selected latitude and longitude, with an optional text parameter that will match the title or description of the items that meet my geographic range restrictions. Figure 3 shows my GetItems function implemented within my Items class.

Figure 3 The GetItems Function Returns Items Within a Specified Radius from the Latitude/Longitude Variables

public IEnumerable<SearchResult> GetItems(
  double latitude, double longitude, double radius, string text = "")
{
  // Build a geography point from the query coordinates; note that
  // well-known text puts longitude first, then latitude.
  DbGeography point = DbGeography.PointFromText(
    String.Format("POINT({0} {1})", longitude, latitude),
    DbGeography.DefaultCoordinateSystemId);

  // Distance comparisons work in meters, so convert the
  // user-supplied radius from miles.
  double distance = MilesToMeters(radius);

  List<SearchResult> results = new List<SearchResult>();

  // Apply the optional text filter to Title and Description, and
  // keep only items within the requested radius of the point.
  db.Items.Where(
    i => (i.Title.Contains(text) || i.Description.Contains(text) ||
      String.IsNullOrEmpty(text))
      && i.Location.Distance(point) < distance)
    .ToList()
    .ForEach(r => results.Add(new SearchResult(r)));

  return results.AsEnumerable();
}

The implementation is pretty straightforward, but there are a few areas worth noting:

Distance Unit of Measurement The Entity Framework Distance function returns its results in meters. Because I’m writing this app for consumption within the United States, I decided to let users enter the radius in miles instead, so I had to account for the conversion on the server side.
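The MilesToMeters helper referenced in Figure 3 isn’t shown in the listing; a minimal version, assuming the standard conversion factor of 1,609.344 meters per mile, looks like this:

// Convert the user-entered radius in miles to the meters
// expected by DbGeography.Distance.
private static double MilesToMeters(double miles)
{
  return miles * 1609.344;
}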

Serializing the Result It’s tempting to simply send back an enumerable list of entities from the generated model. Although this is possible, I’d also need to deal with the serialization of the Source-to-Item entity relationship. Both the JSON and XML serializers will attempt to encode the Item and any referenced objects such as Source, which in turn contains a child collection of Items, one of which is the Item you’re currently serializing. You can see how things would start to get unwieldy. In short, with the default settings you’re throwing the serializer into an infinite reference loop when it attempts to turn your class into a JSON result. One way to overcome this is to tell the serializer to ignore the reference loops altogether. Fortunately, Hongye Sun has provided a great blog post on why this issue occurs, along with several methods for fixing it (see “Loop Reference handling in Web API” at bit.ly/1ammOy2).
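For reference, one of the fixes covered in that post is a single setting on the Json.NET formatter. A minimal sketch, applied in the stock WebApiConfig class generated by the MVC 4 template, might look like this:

using System.Web.Http;
using Newtonsoft.Json;

public static class WebApiConfig
{
  public static void Register(HttpConfiguration config)
  {
    // Skip properties that would loop back to an object already
    // being serialized (Item -> Source -> Items -> ...).
    config.Formatters.JsonFormatter.SerializerSettings
      .ReferenceLoopHandling = ReferenceLoopHandling.Ignore;
  }
}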

Next, I needed to make a few other adjustments to get the XML serialization working. I’ve seen a few ways to resolve this, and I tend to go with changing the setter on the entity reference model properties to something other than “public.” I normally choose “internal,” because it allows all generated code within my solution to access the setter, which eliminates the need to change any of the preset code.
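As a concrete example of that change, here’s how the navigation property on the Source entity might look after the edit:

// Before (generated): public virtual ICollection<Item> Items { get; set; }
// After: without a public setter, the XML serializer skips this
// navigation property, avoiding the reference loop, while code
// inside the assembly can still populate the collection.
public virtual ICollection<Item> Items { get; internal set; }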

With all of that in mind, I chose to create a class for returning the results to the client, because there’s no reason to return any relational data to the consumer of the API, and it simplifies the serialization of the results. I also decided to pass my Item model to the constructor of my SearchResult class to quickly populate the result with the properties from the Item and Source classes that I wanted sent back to the consumer.
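A sketch of that flattened class follows. The property names mirror the fields the client reads in Figure 4, and SourceName is an assumed example of a value pulled from the Source entity:

public class SearchResult
{
  public SearchResult() { }

  // Copy only the scalar values the client needs; carrying no entity
  // references keeps the serializers out of reference loops.
  public SearchResult(Item item)
  {
    this.Title = item.Title;
    this.Description = item.Description;
    this.Latitude = item.Location.Latitude ?? 0;
    this.Longitude = item.Location.Longitude ?? 0;
    this.SourceName = item.Source.Name;
  }

  public string Title { get; set; }
  public string Description { get; set; }
  public double Latitude { get; set; }
  public double Longitude { get; set; }
  public string SourceName { get; set; }
}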

Using Spatial Content with Windows Azure SQL Database Anyone who’s been around developers long enough has heard the phrase, “It works on my box.” For your sake, I’m hoping you hear it used in jest and not as justification to avoid troubleshooting a potential issue. The reason I bring this up is that like any good developer, you’ll more than likely create and test locally before throwing your code over the fence to the Windows Azure Web instance, only to see your perfectly developed solution fail miserably when you place it in the cloud and attempt to run a spatial query against your API.

If you configure your Windows Azure instance for tracing, you’ll see an error thrown when attempting to load the Microsoft.SqlServer.Types assembly. It turns out that although Windows Azure SQL Database is running SQL Server 2012, the Windows Azure Web Sites environment doesn’t include all of the out-of-the-box binaries, so your app will fail when it attempts to load the aforementioned assembly. Quite a few posts show how to get the required libraries deployed to your Windows Azure instance, but, fortunately, a developer who goes by the handle of “faivre” has put together a nifty NuGet package (bit.ly/141YkKt) that adds the required binaries to your deployment for you.

Deploying the Web API Project It goes without saying that I’d run a few simple tests locally prior to deploying my API project over to Windows Azure. Once I tested my code and basked in the glory of my single-function API, it was time to release it to the world. I personally like to configure the deployment of the local app using a publishing profile associated with the project prior to testing the continuous integration build; however, there are other options. See the MSDN Library article, “Walkthrough: Publishing an F#/C# MVC Application to Windows Azure” (bit.ly/18hQWO2) for more in-depth information. I typically like to download my publishing profile using the Download Publishing Profile menu item on the Web portal and store it with the solution. Once it’s downloaded, I select Publish from the project context menu to bring up the publishing wizard. When the publishing wizard opens, you can import your downloaded profile by selecting the Import button.

Getting published to Windows Azure once is great, but because I want this to be a maintainable app across a team of developers, having the developers publish to Windows Azure from their local workstations at will probably isn’t the best choice for reliability. I mentioned earlier that I checked the continuous integration option when I created the Windows Azure Web Site. This feature will automatically deploy my app to my Windows Azure Web Site after a successful build. This isn’t something to gloss over from the perspective of managing your app in relation to its full lifecycle. I’m really only scratching the surface of what this type of capability allows you to do from a management and deployment perspective (see the aforementioned Windows Azure documentation for more details).

With the tremendous amount of tooling provided by Windows Azure and Visual Studio, it was easy to overlook the capabilities I already had in place prior to writing the client app. I didn’t have to worry about details such as acquiring servers, licensing software, getting public IP addresses, installing apps and managing source control. With a minimal amount of code, I could run geo-referenced queries combined with the ability to quickly scale my app up or down as my resource demands changed—not to mention I could easily publish app updates without ever having to leave the comfort of Visual Studio. With the API in place, I needed to visualize these results on a map to see it in action.

Building the Client

For obvious reasons, I wanted to display my geo-referenced query results on a map, and because my API is cloud-hosted, I could choose from a variety of clients to render the results. I decided to go with a Windows Store app, which allowed me to leverage the Bing Maps SDK for Windows Store apps while giving me all of the screen real estate available to a Windows Store app. After I registered and received my Bing Maps key, I was able to create a map with just a few lines of XAML centered on the Washington, D.C., area (latitude 38, longitude -77). I could also center on the end user’s current location via the geolocation services, using a method provided in the Geolocation sample in the Windows Dev Center (see bit.ly/NnV3Ou), as sketched below.
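Here’s a minimal sketch of that approach, assuming the WinRT Geolocator and the Map.SetView overload that takes a Location and a zoom level; error handling for denied location permission is omitted:

using Bing.Maps;
using Windows.Devices.Geolocation;

// Center the map on the user's current position, along the lines
// of the Geolocation sample.
private async void CenterOnCurrentLocation()
{
  Geolocator geolocator = new Geolocator();
  Geoposition position = await geolocator.GetGeopositionAsync();

  // The Windows 8-era Geocoordinate exposes Latitude/Longitude directly.
  myMap.SetView(new Location(position.Coordinate.Latitude,
    position.Coordinate.Longitude), 10.0);
}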

Next, I added my search selection pop-up dialog XAML template that would be displayed when a user clicks the map. I created a user control that will show my end users the selected coordinates and then allow them to enter search terms and set the radius to the desired value. So far, so good—but I still needed to connect the map to the API and display the results.

I could quickly jam the code to query the API and lay out the items on the map into the tapped event on the search selection user control, but this could start to become unwieldy as my app grows and I add new capabilities. With this in mind, I opted to go with a Model-View-ViewModel (MVVM)-like design pattern to connect my app to the Web API. By doing this, I could abstract the Web API connection to its own library to potentially reuse it on other clients such as Windows Phone or a desktop client. This also gave me the ability to data bind my XAML definition to the view model, which allows for greater flexibility with the design of my app using the data-bound templates. Even though the MVVM Light Toolkit has become a well-known standard for this type of implementation, I chose not to use it due to the relatively low complexity in my app. If you’re going to leverage the MVVM design model in your app, I recommend taking a little time to familiarize yourself with the MVVM Light Toolkit (bit.ly/TkVcM).

The MapSearchViewModel will give my app the ability to query and receive the results from the API. I decided to create a MapQueryResult property to take my user input and transform it into my API query. My SearchResults property is an ObservableCollection of MapSearchResult items; each item contains the Description, Latitude, Longitude and Title of a search result match. The LoadGeoSearchApiResults async method, shown in Figure 4, handles the call to my Windows Azure-hosted Web API and populates the SearchResults collection. Because this call may take some time, I used the await operator, which allows my UI to stay responsive while waiting for a query to complete and return results.

Figure 4 Populating the SearchResults Collection

private async void LoadGeoSearchApiResults(object p)
{
  try
  {
    // Reset the collection and flag the UI that a query is running.
    this.SearchResults.Clear();
    this.IsWorking = true;
    this.QueryComplete = false;

    HttpClient client = new HttpClient();
    client.BaseAddress = new Uri(this.BaseAddressUrl);
    // Default: https://govdata.azurewebsites.net/

    // Ask the Web API for XML rather than the default JSON.
    client.DefaultRequestHeaders.Accept.Add(
      new MediaTypeWithQualityHeaderValue("text/xml"));

    // Escape the user's search text so it's safe in a query string.
    HttpResponseMessage response = await client.GetAsync(
      String.Format(
        "/api/Items?text={0}&latitude={1}&longitude={2}&radius={3}",
        new object[] { Uri.EscapeDataString(this.Query.Text),
          this.Query.Latitude, this.Query.Longitude,
          this.Query.Radius }));

    // Parse the XML payload into MapSearchResult items.
    using (Stream stream = await response.Content.ReadAsStreamAsync())
    {
      XDocument doc = XDocument.Load(stream);
      XNamespace defaultNs = doc.Root.GetDefaultNamespace();
      var results = doc.Descendants(defaultNs + "SearchResult");
      foreach (var result in results)
      {
        this.SearchResults.Add(new MapSearchResult()
        {
          Title = result.Descendants(defaultNs + "Title").First().Value,
          Description = result.Descendants(
            defaultNs + "Description").First().Value,
          Latitude = Convert.ToDouble(result.Descendants(
            defaultNs + "Latitude").First().Value),
          Longitude = Convert.ToDouble(result.Descendants(
            defaultNs + "Longitude").First().Value)
        });
      }
    }
  }
  catch (Exception)
  {
    // Rethrow without resetting the stack trace ("throw ex" would).
    throw;
  }
  finally
  {
    this.IsWorking = false;
    this.QueryComplete = true;
  }
}

After creating the MapSearchViewModel, I bound it to the app using a singleton pattern in the codebehind for the MainView. After connecting the view model to the app, I was able to bind it to the query UI and map display with just a few XAML binding statements.
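The codebehind wiring amounts to a couple of lines; here’s a sketch, where Current is a hypothetical name for the singleton accessor:

public MainView()
{
  this.InitializeComponent();

  // Attach the shared view model so the XAML bindings that follow
  // resolve against it.
  this.DataContext = MapSearchViewModel.Current;
}

Following are two of my XAML snippets, for the map control and the pushpin item template, as a reference.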

Here’s the Bing Maps XAML:

<bm:Map Credentials="{StaticResource ResourceKey=BingMapsKey}"
  x:Name="myMap" MapType="Aerial"
  ZoomLevel="5" Tapped="myMap_Tapped">
  <bm:MapItemsControl x:Name="SearchResultItems"
    ItemsSource="{Binding SearchResults}"
    ItemTemplate="{StaticResource SearchResultItemTemplate}"/>
  <bm:Map.Center>
    <bm:Location Latitude="38" Longitude="-77" />
  </bm:Map.Center>
</bm:Map>

Here’s the search result pushpin template:

<DataTemplate x:Key="SearchResultItemTemplate">
  <bm:Pushpin PointerPressed="pushpin_Clicked">
    <bm:MapLayer.Position>
      <bm:Location Latitude="{Binding Latitude}"
        Longitude="{Binding Longitude}" />
    </bm:MapLayer.Position>
  </bm:Pushpin>
</DataTemplate>

With everything in place, it was time for the moment of truth, and I quickly saw that with a simple tap (see Figure 5) of the map my client was correctly rendering my search results from my deployed API (see Figure 6).

Figure 5 Displaying the Query Pop-up When the Map Is Tapped

Figure 6 Results Matching the Query Parameters

What I like about these types of geo-referenced map scenarios is the broad applicability across different data types and problem domains. With very little modification, my API serving up geo-­referenced queries could be used to augment other apps in a variety of ways, such as notifications based on the user’s current location, enhanced imagery displays based on specified coordinates, and object tracking using geo-visualization. I hope this article has shown you a few tips and tricks to aid you along the way to developing and managing your own geo-based solution.  


Malcolm Hyson is chief technology officer of Discover Technologies LLC. He’s also a newly appointed member of the Visual Studio ALM Rangers team. Reach him at mhyson@discovertechnologies.com.

Thanks to the following technical experts for reviewing this article: Soung Bae (Soung.Bae@Baellc.net) and Westin Kriebel (Westin.Kriebel@microsoft.com)