
How We Used Data in GiveAQuiz.com

Shawn Wildermuth
AgiliTrain

Published: July 2010

Download the Code

Visit the Site

Several months ago I was working for a client who wanted me to put together a quick quiz for their developers. They were trying to determine which developers on their team were ready to move into a new tiger team, and they wanted the best of the bunch. I figured it would be the Internet to the rescue. I searched for a site that would let me not only create a quiz but also give it to a number of people. While there were a number of sites that would let me give a survey and get cumulative results, what I really needed was a quiz, not a survey. Surveys gather answers from participants, but, in contrast, quizzes have 'correct' answers, where a survey does not. I needed to give a quiz and get the results of how many correct answers the participants chose. As I started thinking about it, I decided that this might even be a good resource for non-technical teachers (like school teachers). That's when I decided to get serious about building the site.

Beyond the usefulness of the site itself, its other purpose was to serve as an exemplar of certain development techniques. The focus was to leverage as much of the Microsoft data stack as possible in order to provide a roadmap for using the tools and technologies. We built GiveAQuiz.com using the following data technologies:

  • SQL Server 2008
  • Entity Framework 4
  • WCF Data Services 4
  • Visual Studio 2010’s SQL Server Database Project [1]
  • SQL Server Reporting Services

For more information on these technologies, you can visit the MSDN Data DevCenter.

In addition, we used the newest tools and web technologies to build the functionality of the site. These technologies are not specifically mentioned in this whitepaper and would be similar to the discussions in the building of NerdDinner.com [2]. The web technologies used were:

  • ASP.NET MVC 2
  • Silverlight 4

 

Starting Out

We started by writing a specification with user stories and user interface prototypes. For the user stories, we kept it simple and decided that there were three users:

 

Glenda

Glenda teaches Algebra in a public high school. She wants to give quizzes to her students, but the current routine of printing out quizzes and spending all night grading them is a pain. On top of that, using paper tests does not seem very environmentally friendly to Glenda. She decides to turn to the Internet for a solution. After searching the Internet for quizzing engines with Bing, she decides to try out "GiveAQuiz.com". She registers with the new site, ready to create her first quiz. She walks through the quiz designer and creates a ten-question quiz to test out on her students. She is asked to supply a passphrase she can hand out to her students so they can take the quiz without having to register with the site. She sets a time limit of fifteen minutes for the test. A couple of days pass and she returns to the "GiveAQuiz.com" site to see the results. She sees that most of her students have taken the test and realizes that two of the questions were missed by the majority of the students. She uses this information to update her lesson plan to provide more information on those parts of the curriculum.

 

Chloe

Chloe is a high school student. In Glenda's class, she was instructed to visit "GiveAQuiz.com" to take a quick quiz. When she returns home, she visits the site and heads to the "Take a Quiz" section. The site requests that she enter her name and the passphrase supplied by her teacher; she does not need to register with the site to take the test. She proceeds to answer the questions, one at a time, and submits the test (after being given a chance to review her answers).

 

Jake

Jake is a software consultant. He has been spending time at a client helping mentor their team on different aspects of development. He is pulled aside by one of the managers, who confides in him that they want to add a couple of members to a new team. To figure out which members of the team to add, they want him to come up with a simple programming test for the potential team members. To simplify giving the test, Jake turns to "GiveAQuiz.com". He registers in order to create a quiz and puts together a quick twenty-question test for the candidates. He returns after the tests are given, looks at the quiz summary report to find the top three candidates, and tells the manager which ones he would move to the new team.

These three users helped us determine that we needed both an anonymous-user story and a registered-user story, as well as the types of entities in our system.

For the prototyping of our user interface, we decided to use Blend's SketchFlow [3] facility to build our screens. At first, we struggled with whether we should use SketchFlow, as it was really designed as a Silverlight prototyping tool, but after mocking up the first couple of screens we found it was a perfect match for mocking up our web screens as well as our Silverlight screens. For example, Figure 1 is the mock-up of the "Take a Quiz" page.

Figure 1: Mockup of the Take a Quiz Page

 

By mixing the user stories with the mock-ups of the user interface, we quickly had a good idea of the data model we were going to require. The different parts of the site would need only about half a dozen different entities, so it would be easy to quickly create a model.

 

Modeling the Data

Since we were using the Entity Framework, we decided to start modeling the data directly in the entity designer. At the time we originally created the model, we knew we would not get it perfect, but it was an obvious place to start. We created a new Entity Framework model (called GiveAQuizModel.edmx) and chose to start with an empty model as shown in Figure 2.

Figure 2: Creating an Empty Entity Framework Model

 

Starting from the empty model, we used the toolbox to create our first entity (Quiz) as shown in Figure 3. You will notice that the new entity already has a key property called "Id".

Figure 3: Creating a new Entity

 

Creating a new property is as simple as right-clicking and selecting "Add -> Scalar Property" as shown in Figure 4. Once the property is added, it can be named by double-clicking the name on the design surface.

Figure 4: Adding a Scalar Property

 

Once we had the properties set up, we used the Properties window to change the data type and other constraints (e.g. nullability, maximum length for strings, etc.) for our entity properties. Figure 5 shows the Properties window and how we set up Name as a string no longer than 100 characters.

Figure 5: Setting the DataType and Constraints

 

We repeated this until we had a couple of entities created (Quiz and QuizResult). We wanted these two entities related, so we simply used the toolbox again to drag an Association between the two entities as shown in Figure 6. This created a new relationship between them. Typically you start the drag operation from the parent entity and drop onto the child entity. You can also double-click the relationship to change its properties if necessary.

Figure 6: Creating an Association

 

Once we created the basic entities and relationships we had our initial version of the model containing the entities we needed for GiveAQuiz.com as shown in Figure 7.

Figure 7: Our Data Model

 

This version of the model was a first step that allowed us to start developing the functionality. But before we could do that, we needed a database to play with.

 

Building the Database

A great feature of the Entity Framework Visual Studio integration is that we can create a database directly from the working model we’ve created. This is exactly what we did to create our database. To do this, we right-clicked on the Entity Framework model designer and picked the “Generate Database from Model” option to start the database generation. Choosing this option opened the dialog shown in Figure 8 and let us generate the database directly. We created a new connection at this point as well.

Figure 8: Generate Database Wizard

 

This wizard took the entities we created and inferred a schema for the new database. The database generation is not automatic: the wizard gave us a .sql file that we had to run against a particular database. By using the "Connect" toolbar button (as seen in Figure 9), we could connect to an instance of SQL Server, create a blank database, and execute the database creation script.

Figure 9: Connecting to the Database

 

The generation of the database was strictly table per type (TPT), which means there is a one-to-one matching of entities to database tables and of associations to foreign keys. At the same time it generated the database script, the wizard also created the mappings to the new database objects for us.

To see the database that was created, we opened the Server Explorer to look at our new database. By looking at the new QuizDB, we could see the new schema that was created, as shown in Figure 10.

Figure 10: Server Explorer

 

We noticed that the new database definition took the EntitySet names from the Entity Framework model and used them for the table definitions (typically the plural of the individual entities). As the Server Explorer showed us, the new database tables were all created in the 'dbo' schema, as most default tables are. We wanted to segment our data into a separate schema (which we called GiveAQuiz for obvious reasons), so we needed to change the Entity Framework model to use that schema. We did this using the design surface for the entities again. Instead of having a particular item on the design surface selected, we made sure that nothing was selected and looked at the property panel again. This gave us a look at the model's properties as shown in Figure 11.

Figure 11: Changing the Database Schema Name

 

Once we could see the properties, we changed the "Database Schema Name" to GiveAQuiz and regenerated the database. By doing this, our new tables were created in the new GiveAQuiz schema and all our mappings were updated accordingly.

One advantage of using this Entity Framework database deployment strategy is that as the Entity Framework model changes during development, we can push the changes back down to the database by regenerating it from the model. Now that we had the first draft of our database schema, we needed some sample data to work with. To do this, we opened the Server Explorer in Visual Studio to manually enter some data. The Server Explorer let us look at the schema that was created from the Entity Framework model and open tables to add some sample data. You can open the Server Explorer via the View menu (or Ctrl-W then L). Once we opened the Server Explorer, we opened the Data Connections item and expanded the QuizDB database as shown in Figure 12:

Figure 12: Server Explorer

 

We opened the Tables node to see the new tables, then right-clicked a table and picked "Show Table Data". Once a table was shown, we manually added some sample data (as shown in Figure 13).

Figure 13: Adding Sample Data

 

With sample data in hand, we had to figure out whether the data access code was the way we wanted it for the project.

 

Generating the Data Access Code

The Entity Framework model we created automatically generates code based on the model. This generated code was fine in early versions of the Entity Framework, but it was notoriously difficult to change the way it was created. So before we started writing code against that code, we decided to use an entity generator to produce our data access code instead of relying on the default Entity Framework generation. To create the entity generator, we right-clicked on the Entity Framework designer and picked "Add Code Generation Item…". This launched the "Add New Item…" dialog with the two types of entity generators. We chose the "ADO.NET EntityObject Generator" to be able to control the generation of the entity code (see Figure 14). This template uses T4 [4] template generation to allow customization of the code created.

Figure 14: EntityObject Generator Template

 

While the EntityObject generator builds the same code as the default Entity Framework generation, we wanted to have access to the generator templates to make changes as necessary. In this first revision of the project we ended up not needing to customize the template, but that may come up in subsequent versions of GiveAQuiz.com.

Now that we had the code generation working the way we wanted, it was time to start writing code.

 

Working with the Data

Now that we had our working data access code, we could start using it. Since we were using ASP.NET MVC 2 for our website framework, we decided to use the repository approach, popularized by Domain-Driven Design [5], where all of our data access is isolated in a single class that can be tested separately. The first question for the repository was how we wanted to handle the context object.

In the Entity Framework, a generated class that derives from ObjectContext is the starting point for access to the database. Typically this context object is used to track change state and to provide endpoints for LINQ queries against the model. For our project, this class was called GiveAQuizEntities and contains IQueryable endpoints for every table in our entity model.

Since this context object is the starting point for issuing LINQ queries, we wrapped access to it in the repository through a simple private read-only property so we could use it throughout the repository with deferred construction:

public class QuizRepository : IQuizRepository
{
  // Deferred construction of the Context
  private GiveAQuizEntities _ctx = null;
  private GiveAQuizEntities Context
  {
    get { return _ctx ?? (_ctx = new GiveAQuizEntities()); }
  }
  ...
}

That way, every method in the repository that needs data access can just use the Context property. For example, to get a list of the QuizDefinitions that the user owns, we do a simple LINQ query using the context object as a starting point:

public class QuizRepository : IQuizRepository
{
  ...
  public IEnumerable<QuizDefinition> GetUserQuizDefinitions(Guid userKey)
  {
    return Context.QuizDefinitions
                  .Include("Quizzes")
                  .Where(q => q.Owner == userKey)
                  .OrderBy(q => q.QuizName)
                  .ToList();
  }
  ...
}

We could have used the integrated query syntax instead of the method-based fluent interface [6] shown above, but at the end of the day, in our usage, the functionality was identical.
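
For comparison, here is the same repository method written in the integrated query syntax; the compiler translates this form into the same method calls:

public class QuizRepository : IQuizRepository
{
  ...
  // The same query as above, expressed in the integrated query syntax.
  public IEnumerable<QuizDefinition> GetUserQuizDefinitions(Guid userKey)
  {
    return (from q in Context.QuizDefinitions.Include("Quizzes")
            where q.Owner == userKey
            orderby q.QuizName
            select q).ToList();
  }
  ...
}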

The queries in the repository generally use an eager loading pattern instead of letting lazy loading of relationships happen. This is an optimization that prevents multiple calls to the database where one will do. Eager loading in the Entity Framework is done using the Include method (or the Include clause in a query). The Include method tells the Entity Framework that we want the named relationship to be loaded when we return data. In the earlier example, we used Include("Quizzes") to indicate that we wanted the QuizDefinitions and all Quizzes associated with those definitions in one database round-trip. If we did not eager load the related entities, a query would have been issued to the database every time we accessed a Quiz in the Quizzes relationship.
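
To make the cost concrete, here is a sketch of the "N+1 query" pattern that the Include call avoids (assuming lazy loading is enabled on the context):

// Without Include("Quizzes"), each access to the Quizzes relationship
// below can trigger its own database round-trip.
foreach (var definition in Context.QuizDefinitions
                                  .Where(q => q.Owner == userKey)
                                  .ToList())
{
  // May lazily load definition.Quizzes with a separate query per iteration.
  var quizCount = definition.Quizzes.Count;
}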

Accessing data in the model through this simple use of the context object and LINQ queries works well, but what about pushing changes back to the database? For new objects, this is as simple as adding the new object to the context (using the AddToXXX methods) before saving:

public class QuizRepository : IQuizRepository
{
  ...
  public void SaveQuizResult(QuizResult quizResult)
  {
    Context.AddToQuizResults(quizResult);
    Context.SaveChanges();
  }
  ...
}

But doing updates or deletes is different. Because this is a web project and not a rich client, we have to assume that the lifecycle of the pages is short, so the context object is not likely to be holding on to changed instances for us. To get around this issue, we have to reconnect the changed objects to the context before saving them to the database. From the MVC controller, we call the repository like so:

public class QuizController : BaseController
{
  ...
  [Authorize]
  [AcceptVerbs(HttpVerbs.Post)]
  public ActionResult Start(StartQuizViewModel model)
  {
    ...
    var quiz = Repository.GetQuiz(model.QuizId);
    quiz.Name = model.QuizName;
    quiz.Password = model.Passphrase;
    quiz.StartDate = model.StartDate;
    quiz.EndDate = model.EndDate;
    quiz.HasEnded = false;
    Repository.UpdateQuiz(quiz);
    ...
    return RedirectToAction("Index");
  }
  ...
}

In the repository, we use the ApplyCurrentValues method to reattach the Quiz object to the context before saving, like so:

public class QuizRepository : IQuizRepository
{
  ...
  public void UpdateQuiz(Quiz quiz)
  {
    Context.Quizzes.ApplyCurrentValues(quiz);
    Context.SaveChanges();
  }
  ...
}

By using the ApplyCurrentValues method, we tell the context that this is a changed entity, so when SaveChanges is called it will attempt to update the database instead of adding a new entity.

For deletions, we use a mix of the two approaches. We first load the object to be deleted by issuing a query, then call DeleteObject to mark the entity for deletion. The call to SaveChanges completes the deletion of the object in the database.

public class QuizRepository : IQuizRepository
{
  ...
  public void DeleteQuiz(int id)
  {
    var quiz = Context.QuizDefinitions.FirstOrDefault(q => q.Id == id);
 
    if ( quiz != null )
    {
      Context.DeleteObject(quiz);
      Context.SaveChanges();
    }
  }
  ...
}

While the repository represents the most basic data access, for our application it is more than enough of a data access API. The only difference between this project and a large enterprise application is scope (e.g. having multiple repositories for different types of data); the general practice of how the data is accessed is the same.
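
Pulling the section together, the repository contract implied by the methods shown above looks roughly like this (a sketch; the actual IQuizRepository may expose additional members):

public interface IQuizRepository
{
  // Query members used by the controllers.
  IEnumerable<QuizDefinition> GetUserQuizDefinitions(Guid userKey);
  Quiz GetQuiz(int id);

  // Change members; each implementation wraps a SaveChanges call.
  void SaveQuizResult(QuizResult quizResult);
  void UpdateQuiz(Quiz quiz);
  void DeleteQuiz(int id);
}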

 

Changing the Database Schema

During the use of the entity model in developing the pages, we found that we needed to make changes to the design of our database. The problem was that as we made schema changes, we wanted to keep existing data. We had two choices: we could either try to make the schema changes on a live database to preserve the data, or just rebuild the database on every change with some sample data included in the build. We chose the latter and aimed to set up a database build that included sample data.

In addition to the sample data, we also wanted the database scripts to generate the schema and data for Forms Authentication. In ASP.NET 2.0, Microsoft included something called "ASP.NET Application Services", a default implementation of a membership, roles and profile database that simplifies creating authenticated websites. ASP.NET Application Services is installed into a database using the aspnet_regsql.exe tool that ships with the .NET Framework (version 2.0 and above). For our site, we needed to run this tool during the database builds. To achieve this, we needed more control over the creation of the database, which proved difficult with the initial plan of having the Entity Framework build the database for us. The schema generation from the Entity Framework was solid enough; there just wasn't an easy way to add our own sample data during database creation. Our sample data included sample users in the ASP.NET Application Services schema, so we needed to build that schema before we applied our sample data. So we moved away from letting the Entity Framework create the database in order to gain more control over its generation.

Instead, we decided to add a SQL Server 2008 Database project to our solution. For SQL Server 2005 and 2008, there are several project types supported in Visual Studio (though only in some editions [7] of Visual Studio). We decided on the SQL Server 2008 Database project since we wanted to manage the database project definition ourselves. The SQL Server 2008 Wizard project type would have walked us through the process and created a SQL Server 2008 Database project for us, but we decided to do it by hand since that was clearer to us. To get this added to our solution, we looked in the Database/SQL Server folder in the "Add New Project" dialog to create our database project as shown in Figure 15.

Figure 15: SQL Server 2008 Database Project

 

We could have started with this project type instead of using the Entity Framework, but refactoring to it ended up being a better approach, as it let us get to code quicker and then move to this project type when we actually needed the additional functionality.

Once we added the project, we had to reverse engineer our database to get the scripts built for us. To do this, we right-clicked the database project and selected the "Import Database Objects and Settings…" choice, which brought us to the "Import Database Schema" dialog (as shown in Figure 16). By pressing Start, it read the database and created the scripts to re-create the database.

Figure 16: Import Database Wizard

 

At first, the sheer number of .sql files created seemed a little overwhelming, and we were concerned about the maintainability of the files (as seen in Figure 17).

Figure 17: Database Project in the Solution Explorer

 

But that was before I understood that the real meat of the tool is in the "Schema View" window (see Figure 18). The Schema View lets you view the schema, edit the individual table objects, and even perform some refactoring (like renaming fields). This makes the management of the database a lot easier.

Figure 18: Schema View

 

Double-clicking a table (or other object) opens a SQL file containing the script that will be used to create the table. You can then manage that table by typing in the SQL. The tool does assume that for standard changes (such as adding fields) you will edit the SQL script itself, but this isn't a big hurdle for most developers. It worked well for our needs.

CREATE TABLE GiveAQuiz.[QuestionDefinitions] (
    [Id]                   INT            IDENTITY (1, 1) NOT NULL,
    [SortOrder]            INT            NOT NULL,
    [QuizDefinition]       INT            NOT NULL,
    [QuestionTypeName]     NVARCHAR(1024) NOT NULL,
    [QuestionData]         NVARCHAR (MAX) NOT NULL
);

Once changes were made to the scripts, we simply right-clicked on the project in the Solution Explorer and picked "Deploy" to generate the database. This can even drop and re-create the database for you (which is what we do on GiveAQuiz). The only caveat to this approach is that, by default, the database project does a full deploy when built. That tended to slow down debugging for us, so we removed the database project from the build and deploy steps of the build process using the solution's property page (as shown in Figure 19).

Figure 19: Stopping the Database Project from Deploying during the Build

 

Once the changes to the database were deployed, we would go to the Entity Framework model and update it from the database. This is the opposite of how we started: instead of using the Entity Framework model to push changes to the database, we now updated the model based on any database changes we made with the SQL Server 2008 Database project.

To do this, we opened the Entity Framework design surface and right-clicked on it to get the context menu. On the context menu, we chose "Update Model from Database" as shown in Figure 20.

Figure 20: Updating the Entity Model from the Database

 

This updates the model with any new changes. If you have renamed tables or fields, you will get duplicates that you will need to remove from the model before it will build, but for the most part this works extraordinarily well. This process was repeated dozens of times during the development of GiveAQuiz.com as we needed to tweak our data design.

 

Improving the Database Scripts

Being able to refactor our database design is one of the reasons we used the SQL Server 2008 Database project for our database generation. The other two reasons were to be able to create sample data and to include the ASP.NET Application Services database in our database creation.

First, let's show how we added our sample data. We decided to include scripts for adding sample data because we wanted to be able to rebuild the database any time we needed, instead of doing database comparisons and pushing the changes (which the SQL Server project also supports). We chose rebuilding over updating because the team is distributed, and knowing what version level a particular database was at would have meant more complex comparisons and merges. Rebuilding from scratch during development was just simpler and took less development time when the database changed. Because the database project allows us to do schema comparisons, we knew that during release cycles we could create change scripts that coincided with our releases without having to do so for every checked-in change.

In the database project, there is a folder for scripts, which we used to add demo users and demo quiz definitions. Normally, you would use the "Scripts\Post-Deployment" folder for scripts to be executed after the deployment. In that folder there is a script called "Script.PostDeployment.sql" that is executed after the deployment of the schema is complete. You can add your sample data there (or, more reasonably, point to separate scripts to be executed) like so:

/*
 Post-Deployment Script Template
--------------------------------------------------------------------------------------
 This file contains SQL statements that will be appended to the build script.
 Use SQLCMD syntax to include a file in the post-deployment script.
 Example:      :r .\myfile.sql
 Use SQLCMD syntax to reference a variable in the post-deployment script.
 Example:      :setvar TableName MyTable
               SELECT * FROM [$(TableName)]
--------------------------------------------------------------------------------------
*/
:r .\AddUsers.sql
:r .\AddQuizDefinitions.sql

This is the easiest way to add sample data as part of the deployment of a new database: by including any scripts we needed in this post-deployment script, we could add sample data once the database was built. This did not quite work for our needs, though. In GiveAQuiz.com, we wanted to have a couple of sample users set up in ASP.NET Application Services, as well as objects related to those users. Since we could not run the ASP.NET Application Services installer from these post-deployment scripts, we had to do it another way.

To create the ASP.NET Application Services database objects, we could have tried to reverse engineer those objects into the scripts, but there were some issues with how that worked, so we decided the easiest way was to actually run the installer during the deployment of the database. To do this, we needed a way to run the Application Services installer *after* the deployment. Unfortunately, this is not as simple as it should be. A post-build script doesn't help, as we needed to create the Application Services objects after the deployment of the scripts, not before. So we added a PostDeploy.cmd file that contains the commands to add the database objects. One of our schema objects also depends on the Application Services objects, so we moved our sample data into that .cmd file as well. To get the .cmd file executed post-deployment, you have to do some project file surgery. Inside the .dbproj file, you need to add a target for post-deployment and an execution of the command file like so:

<Target Name="AfterDeploy">
  <Exec Command="call &quot;$(ProjectDir)PostDeploy.cmd&quot;" ContinueOnError="false" />
</Target>

This tells the Database project to execute our .cmd file once the deployment is over so we can run the installer then run the scripts to add the sample data we defined. Our PostDeploy.cmd file looks like this:

call "%VS100COMNTOOLS%\..\..\VC\vcvarsall.bat" x86
call "%FrameworkDIR32%\%FrameworkVersion32%\aspnet_regsql.exe" -S . -E -A all -d QuizDB -Q
call osql -S . -E -d master -i "Scripts\Post-Deployment\AddUsers.sql"
call osql -S . -E -d master -e -i "Scripts\Post-Deployment\AddQuizDefinitions.sql"

We first set up the Visual Studio 2010 paths by calling vcvarsall.bat, which is installed with Visual Studio 2010. That batch file defines the %FrameworkDIR32% and %FrameworkVersion32% variables we use to find aspnet_regsql.exe. The aspnet_regsql.exe tool creates the Application Services objects in an existing database. Once the Application Services objects are in the database, we can use OSQL to call our post-deployment scripts (which include creating users in the Application Services database).

With both of these in place, any developer (or eventually a build server) can build the code and deploy a brand-new database at any time. The build is self-describing, so we no longer need to worry about sharing .mdf/.ldf files and keeping them in sync with the rest of the project. It's just code that we're versioning, including the database code.

 

Using Data across the Internet

In the GiveAQuiz.com project, we found that we had two separate pieces of functionality that Silverlight would be a good fit for, since they were fairly rich in their requirements: quiz creation and taking a quiz. Because we were using Silverlight for these parts of the project, we had to have a way to share our data model (e.g. the Entity Framework model) across the wire. To expose our data model, we decided to use WCF Data Services to provide a REST-based API for accessing the data from our Silverlight applications.

WCF Data Services is a technology that allows you to take any LINQ-enabled data access technology (e.g. Entity Framework, LINQ to SQL, NHibernate.LINQ) and expose it across the wire via OData [8]. OData allows data to be exposed across HTTP connections via a URI-based syntax. This syntax supports the full set of create, read, update and delete (i.e. CRUD) operations, mapping them directly to HTTP verbs (POST, GET, PUT, DELETE). The data that is retrieved or updated is formatted as either XML or JSON (JavaScript Object Notation [9]). The XML format is based on the Atom [10] syndication format, whereas the JSON format is standard object notation. By creating a WCF Data Service, we can expose our data to our Silverlight client (and to AJAX-based HTML, though we are not using it for that purpose) in a standards-based way.

While OData sources are typically accessed via the URI syntax, we are using the Silverlight client library to access the data. The Silverlight client (or the .NET client, if you're building desktop applications) is crucial to simplifying the data access: we can issue LINQ queries in Silverlight and have them converted to URI-based calls automatically. That way, the data access code in Silverlight looks very similar to the ASP.NET code that accesses data.
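
As a rough illustration, a client-side LINQ query like the one shown later in this section might be translated into an OData request along these lines (the exact URI the client library generates can vary):

GET /Services/QuizDataService.svc/QuizDefinitions?$filter=Id eq 5&$expand=QuestionDefinitions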

Before we could get started on the Silverlight client, we had to create the WCF Data Service by adding it to the website project. WCF Data Services is implemented using WCF, so the service is typically added directly to the web site project. The WCF Data Service itself is just a wrapper around the Entity Framework model. After the WCF Data Service was added to our web project, we could see that it is just a simple class that derives from the DataService<T> class. The generic parameter of DataService<T> is any class that exposes one or more IQueryable endpoints. For us this was the context class of the Entity Framework project (called GiveAQuizEntities) that we saw earlier in this whitepaper:

public class QuizDataService : DataService<GiveAQuizEntities>
{
  ...
}

By including the GiveAQuizEntities class as the type parameter of the DataService class, we are telling the WCF Data Service that we may want to expose any or all of the queryable endpoints (e.g. Quizzes, QuizDefinitions, etc.) from the service. By default, none of these endpoints are exposed. We had to specifically tell the service which (if any) of these endpoints to include.

To configure the service, we used the InitializeService method inside the class. This method allowed us to tell the data service which entities to grant access to (and what types of rights to grant on those entities). At first, during our implementation of the site, we took the brute-force approach of giving unfettered access to everything in the model that we needed in our Silverlight clients by granting all EntitySetRights, as shown below:

public class QuizDataService : DataService<GiveAQuizEntities>
{
  // This method is called only once to initialize service-wide policies.
  public static void InitializeService(DataServiceConfiguration config)
  {
    // Configure the necessary access for Creating Quizzes
    // TODO Revisit the security as granting all rights is
    // probably too coarse grained
    config.SetEntitySetAccessRule("QuizDefinitions", EntitySetRights.All);
    config.SetEntitySetAccessRule("QuestionDefinitions", EntitySetRights.All);
   
    ...
    config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
  }
  ...
}

Once we provided access, we could navigate to the WCF Data Service directly in a web browser to see what it exposed [11], as shown in Figure 21.

Figure 21: Viewing WCF Data Service in the Browser

 

Figure 21 shows the Atom service document of the WCF Data Service, with collections for each of the entity sets we are exposing. Note that each collection has a title as well as an 'href', so navigating to a collection is as simple as using the relative path to our service (e.g. http://(site)/Services/QuizDataService.svc/QuizDefinitions), as seen in Figure 22.

Figure 22: QuizDefinitions via our WCF Data Service

 

But using this URI-based API directly is not what we wanted. What we really wanted was the ability to run the same sort of queries in our Silverlight client that we ran on the server. That is where the WCF Data Services Silverlight client came in. It allowed us to formulate our queries as LINQ queries and have the client code turn them into URI-based calls. For example, when we wanted to retrieve a specific QuizDefinition in the Silverlight code, we could create a LINQ query like so:

// Silverlight code!
var qry = from q in Context.QuizDefinitions
                           .Expand("QuestionDefinitions")
          where q.Id == id
          select q;

This call gets turned into the URI syntax under the covers for us. That was the primary reason we used WCF Data Services: it made our client-side code look very similar to the server code. To get started, we first needed a proxy to the WCF Data Service, which we obtained using the same "Add Service Reference" dialog that is used for WCF services and ASP.NET Web Services, as shown in Figure 23.

Figure 23: Add Service Reference adding a WCF Data Service

Once the service reference is added, we can access the data through the newly generated code. Much like in the ASP.NET MVC project, all the data access is isolated in a single location (called the QuizModel in our Silverlight project). This class represents the gateway to the database through the WCF Data Service. When querying through the WCF Data Service, we used LINQ to perform the queries, just as we would on the server side:

public class QuizModel : IQuizModel
{
  public GiveAQuizEntities Context
  {
    get
    {
      if (_ctx == null)
      {
        _ctx = new GiveAQuizEntities(
                 new Uri("/Services/QuizDataService.svc", UriKind.Relative));
      }
      return _ctx;
    }
  }
  public void GetQuizDefinitionAsync(int id, Action<QuizDefinition,
                                                    Exception> callback)
  {
    var qry = from q in Context.QuizDefinitions
                               .Expand("QuestionDefinitions")
              where q.Id == id
              select q;
    ...
  }
  ...
}

You should note that the style of accessing the data through a context object is the same as on the server. The context object handles not only querying but also management of object state in our Silverlight code, so that we can send changes back across the wire as needed.

This query is similar to the server-side query; the Expand method is the equivalent of the Entity Framework's Include method (i.e. it eager loads the related entities). The big change in the Silverlight queries is that Silverlight itself does not allow synchronous network requests of any kind, and this includes access to the WCF Data Service. Where in the ASP.NET MVC code we were issuing queries and updates and blocking until the calls returned, this is not possible in Silverlight [12].

To get around this, the WCF Data Services client provides the DataServiceCollection<T> class, which exposes a LoadCompleted event and a LoadAsync method to perform the asynchronous call. For example, after we defined the query, we created a DataServiceCollection<T> using the type of the item returned from the server (i.e. QuizDefinition). Then we handled the LoadCompleted event on the collection itself so that we could check for errors or cancellation once the call returned:

public class QuizModel : IQuizModel
{
  DataServiceCollection<QuizDefinition> _trackedDefinitions = null;
  ...
  public void GetQuizDefinitionAsync(int id,
                                     Action<QuizDefinition, Exception> callback)
  {
    var qry = from q in Context.QuizDefinitions
                               .Expand("QuestionDefinitions")
              where q.Id == id
              select q;
    AppMessages.IsBusyMessage.Send(true);
    _trackedDefinitions = new DataServiceCollection<QuizDefinition>();
    _trackedDefinitions.LoadCompleted += (s, e) =>
      {
        try
        {
          if (e.Cancelled)
          {
            callback(null, new Exception("Load of Quiz was cancelled"));
          }
          else if (e.Error != null)
          {
            callback(null, StripDataServiceException(e.Error));
          }
          else
          {
            callback(_trackedDefinitions.FirstOrDefault(), null);
          }
        }
        finally
        {
          AppMessages.IsBusyMessage.Send(false);
        }
      };
    // Perform the query
    _trackedDefinitions.LoadAsync(qry);
  }
  ...
}

While there is more wiring here than on the server, the basic approach of using a LINQ query to define the operation and then executing it to retrieve the data is the same as on the server.
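
For completeness, the DataServiceQuery<T> class also offers a lower-level BeginExecute/EndExecute pair; here is a minimal sketch of the same retrieval under that model, assuming the qry variable defined above:

// Cast the LINQ query to the underlying DataServiceQuery<T> to execute it.
var dsq = (DataServiceQuery<QuizDefinition>)qry;
dsq.BeginExecute(ar =>
  {
    // EndExecute must be called on the same query instance.
    var definition = dsq.EndExecute(ar).FirstOrDefault();
  }, null);

We preferred DataServiceCollection<T> because it also tracks changes to the loaded entities, which we rely on below.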

Because Silverlight is a rich client experience, the application is longer-lived, so the context object in Silverlight can be used to monitor changes that occur in the client. For new entities, this requires the same kind of code that the Entity Framework requires on the server: when a new entity is created, we need to let the context know there is a new entity (a QuestionDefinition in this case). We isolate this by having the creation of the new entity happen inside the QuizModel:

public class QuizModel : IQuizModel
{
  ...
  public QuestionDefinition CreateQuestionDefinition(IQuestionType type)
  {
    var def = new QuestionDefinition()
              {
                Id = 0,
                QuestionTypeName = type.GetType().AssemblyQualifiedName,
                SortOrder = 0
              };
    Context.AddToQuestionDefinitions(def);
    return def;
  }
  ...
}

Note that we are not only creating the new QuestionDefinition object here but also adding it to the Context, so that when we tell the Context to save changes (below), this new object is saved too. Deletes are done in the same way, as sketched below.
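
For example, a client-side delete might look like this sketch (the DeleteQuestionDefinition member is a hypothetical counterpart to CreateQuestionDefinition above):

public class QuizModel : IQuizModel
{
  ...
  public void DeleteQuestionDefinition(QuestionDefinition def)
  {
    // Mark the entity for deletion; the delete is sent to the
    // service on the next SaveAsync call.
    Context.DeleteObject(def);
  }
  ...
}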

Changes are simpler in that the context can monitor the entities in the Silverlight application and know that an entity needs to be updated without having to interact with the QuizModel directly. All changes to the entities that occur anywhere in the Silverlight application before the SaveAsync call are marked as changes, so that when the save happens, they are all batched to the server automatically:

public class QuizModel : IQuizModel
{
  ...
  public void SaveAsync(Action<Exception> callback)
  {
    Context.BeginSaveChanges(SaveChangesOptions.Batch, r =>
      {
        try
        {
          var response = Context.EndSaveChanges(r);
          if (response != null && response.IsBatchResponse)
          {
            // Return the first error if there is one.
            var batchError = response.FirstOrDefault();
            callback(batchError == null ? null : batchError.Error);
          }
        }
        catch (Exception ex)
        {
          callback(ex);
        }
      }, null);
  }
  ...
}

By using WCF Data Services, we created similar programming models on both sides of the wire, so that developers are familiar with the way data access is done no matter which code they are working in.

 

Securing the WCF Data Service

Now that we had our WCF Data Service working in our Silverlight client, we had to be concerned with how to protect access to it. While we want to prevent all unauthorized access to the WCF Data Service, we still need to be aware of the use cases for authorized access. In some cases, we could simply protect a WCF Data Service using ASP.NET Forms Authentication [13] or Windows Authentication by restricting access to the .svc file with a web.config file (as shown in Figure 24). For example, we could have limited access to the whole folder to authenticated users with a web.config like this:

<?xml version="1.0"?>
<configuration>
    <system.web>
      <authorization>
        <deny users="?" />
        <allow users="*"/>
      </authorization>
    </system.web>
</configuration>

Using the web.config file in this way, we could have limited access to only authenticated users. This would have worked well if we had been using the WCF Data Service within a fully secured website, but since we needed to grant access to a mix of authenticated and anonymous users, we had to find a different approach.

Figure 24: Using ASP.NET Security to Protect the WCF Data Service

 

Instead, we secured the WCF Data Service in several steps. The first was to reduce the surface area to only the operations and entities we had to have access to. Returning to the InitializeService call in the DataService<T> we defined on the server, we could use the EntitySetRights enumeration to limit access to the service. For example, since we were only accessing a single QuizDefinition entity at a time, we did not need to allow enumeration of all the QuizDefinitions in the WCF Data Service. Additionally, we realized that we were only inserting and updating QuizDefinitions (e.g. not deleting them), so we could change the access rule to only return a single QuizDefinition and to only allow new ones to be created and updated through the WCF Data Service layer. This limitation extends only to the WCF Data Service, so the ASP.NET MVC code is completely unaffected by the change. It simply reduces the surface area of the calls to only the operations on those entities that we need from Silverlight (or another client API):

public class QuizDataService : DataService<GiveAQuizEntities>
{
  ...
  // This method is called only once to initialize service-wide policies.
  public static void InitializeService(DataServiceConfiguration config)
  {
    // Configure the necessary access for Creating Quizzes
    config.SetEntitySetAccessRule("QuizDefinitions", EntitySetRights.ReadSingle |
                                                     EntitySetRights.WriteAppend |
                                                     EntitySetRights.WriteReplace);
    config.SetEntitySetAccessRule("QuestionDefinitions", EntitySetRights.All);
    // Grant Access for Taking Tests (Requires Anonymous access)
    config.SetEntitySetAccessRule("Quizzes", EntitySetRights.ReadSingle);
    config.SetEntitySetAccessRule("QuizTakers", EntitySetRights.WriteAppend);
    config.SetEntitySetAccessRule("QuizResults", EntitySetRights.WriteAppend);
    config.SetEntitySetAccessRule("QuizAnswers", EntitySetRights.WriteAppend);
    config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
  }
  ...
}

Unfortunately, WCF Data Services does not have the ability to require authentication for some endpoints but not others, so we had to make all operations accessible anonymously, since some of our entities must be accessible to anonymous users. Because the configuration allows QuizDefinitions and QuestionDefinitions to be changed through an anonymous call, we had to restrict change requests (INSERT, UPDATE and DELETE) to authenticated clients. We did this by using WCF Data Services' interception support.

Interception allowed us to define methods that are called during either query or change requests. In the case of protecting the QuizDefinitions and QuestionDefinitions, we wrote methods on the WCF Data Service annotated with the ChangeInterceptor attribute (specifying the endpoint the interceptor applies to). For example, our change interceptor for QuestionDefinitions checks whether the call is authenticated and, if not, throws a DataServiceException to prevent any changes:

public class QuizDataService : DataService<GiveAQuizEntities>
{
  ...
  [ChangeInterceptor("QuestionDefinitions")]
  public void OnQuestionDefinitionSave(QuestionDefinition question,
                                       UpdateOperations operation)
  {
    if (!HttpContext.Current.Request.IsAuthenticated)
    {
      throw new DataServiceException(
                  "Cannot update/change Questions unless logged in.");
    }
  }
  ...
}

In addition to checking that the user is authenticated, we ensure that the question's owner is the same user who is logged in. We do this by first making sure that the operation is either a change (e.g. update) or a delete. Then we check that the QuizDefinition for the question has the same owner as the logged-in user. This is done by calling our GetOwner method, which returns a Guid that represents the user in the ASP.NET Application Services database:

public class QuizDataService : DataService<GiveAQuizEntities>
{
  ...
  [ChangeInterceptor("QuestionDefinitions")]
  public void OnQuestionDefinitionSave(QuestionDefinition question,
                                       UpdateOperations operation)
  {
    if (!HttpContext.Current.Request.IsAuthenticated)
    {
      throw new DataServiceException(
                  "Cannot update/change Questions unless logged in.");
    }
    if (operation == UpdateOperations.Change ||
        operation == UpdateOperations.Delete)
    {
      if (question.QuizDefinition.Owner != GetOwner())
      {
        throw new DataServiceException(
          "Cannot update or delete QuestionDefinitions that you do not own.");
      }
    }
  }
  ...
}
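
For reference, GetOwner can be built on the ASP.NET Membership API; a minimal sketch (the actual implementation in GiveAQuiz.com may differ):

private static Guid GetOwner()
{
  // With the SqlMembershipProvider, ProviderUserKey holds the user's
  // Guid in the ASP.NET Application Services database.
  var user = System.Web.Security.Membership.GetUser();
  return user == null ? Guid.Empty : (Guid)user.ProviderUserKey;
}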

We could have done the same thing for query operations by using query interceptors. Query interceptors are quite different from change interceptors: they allow you to return an expression that is added to the query. For example, we could limit queries against the QuizDefinitions by adding an expression (a lambda confirming that the owner matches the logged-in user):

public class QuizDataService : DataService<GiveAQuizEntities>
{
  ...
  [QueryInterceptor("QuizDefinitions")]
  public Expression<Func<QuizDefinition, bool>> OnQuizDefinitionQuery()
  {
    var owner = GetOwner();
    // Does not work for us, as we need anonymous access
    return q => q.Owner == owner;
  }

  ...
}

For our needs this does not work, though, as anonymous users must be able to access quiz definitions (so they can take a quiz). Ultimately, what we really want is to specify that queries must come from our Silverlight application. There is no foolproof way to do this, as most strategies are prone to spoofing because of how HTTP works. For example, we could check the Host header to make sure calls were coming from our website, but it takes a minimum of effort to spoof the Host header, which makes the check nearly useless:

public class QuizDataService : DataService<GiveAQuizEntities>
{
  ...
  [QueryInterceptor("QuizDefinitions")]
  public Expression<Func<QuizDefinition, bool>> OnQuizDefinitionQuery()
  {
    // Only allow if the request of the call came from our website
    // If we call the Data Service from our own website this won’t work
    // but since we use Entity Framework directly we don’t need to
    // access the Data Service directly except from the Silverlight client
    if (!HttpContext.Current.Request.Headers["Host"].Contains(
          HttpContext.Current.Request.Url.GetComponents(
            UriComponents.Host, UriFormat.Unescaped)))
    {
      throw new DataServiceException("Cannot call from outside our domain");
    }
    // returning a true means we aren't limiting the query
    return b => true;
  }
  ...
}

At this point we had protected our WCF Data Service by severely limiting the surface area of an attack, but we still had not protected it completely. We still wanted a way of hiding it from the average user, since we use the WCF Data Service exclusively as the data access layer for our Silverlight client. One approach is to use a custom header (or the Authorization header) to try to ensure the Silverlight client is the only app that uses the service. The problem with that approach is that Silverlight cannot be trusted to keep secrets; it is like AJAX in that way. If you don't trust JavaScript to keep encryption keys (or anything else as dangerous), you shouldn't trust Silverlight with them either.
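
To make that weakness concrete, a custom-header check might look like the sketch below (the header name and value are assumptions for illustration). Any caller that watches our HTTP traffic can simply replay the header, which is why we did not rely on this:

[ChangeInterceptor("QuizResults")]
public void OnQuizResultSave(QuizResult result, UpdateOperations operation)
{
  // Hypothetical "secret" header the Silverlight client would attach.
  // This is obfuscation, not security: the header is visible on the wire.
  if (HttpContext.Current.Request.Headers["X-GiveAQuiz-Client"] != "Silverlight")
  {
    throw new DataServiceException("Unrecognized client.");
  }
}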

Since there was no foolproof way to verify that all calls coming into the service came from our Silverlight application, we needed to make sure that none of the operations the service could perform were dangerous. This still leaves us with a less-than-fully-secure WCF Data Service on our hands. Because the nature of the site is such that the data is not mission critical to an organization, the roadblocks we have put in place will get in the way of people doing nasty things with our data, but they won't stop the most resourceful from getting through.

 

Reporting Against the Data

We needed reports to give users the results for an entire quiz or for an individual student. We mocked up the reports in our original prototype; for example, Figure 25 shows a simple quiz report mock-up.

Figure 25: Quiz Report Mock-up

 

To accomplish this, we wanted to use SQL Server Reporting Services 2008 R2 to report directly against the database, but since we were not going to have a report server available in our initial beta rollout, we opted to use SQL Server Reporting Services-compatible reports that can run in standalone ASP.NET projects.

To build the reports, we used the Visual Studio Report template. This template is in the Reports folder of the "Add New Item…" dialog, as shown in Figure 26.

Figure 26: Adding a Report

 

We picked the Report Wizard so that it could help us get started with a standard report. The first step of the wizard (as shown in Figure 27) was to help us create a data source for our report. The Report Wizard in Visual Studio uses the name "Dataset" for the different data sources that can be created for the report; "Dataset" is SQL Server Reporting Services nomenclature for a data source. For these local reports, though, the wizard creates a client-side "DataSet" that is used to feed data into the report. This is very confusing, as the two terms are nearly identical. When we migrate the reports to a SQL Server Reporting Services server, the "Dataset" will just be a data source pointing at the SQL Server database directly.

Figure 27: Choosing a dataset for a Report

 

Next, we clicked the "New" button to open the "Dataset Properties" dialog. This started a "Data Source Configuration Wizard" (as seen in Figure 28) that walked us through choosing a connection string and the data to show in our report. The wizard resulted in a DataSet object being added to our ASP.NET MVC project. The DataSet is a container for the data we want to report on. But before we defined the shape of our data, we needed to define the report properties.

Figure 28: Picking the data for our report

 

Back in the Report Wizard, a list of "datasets" was shown at the bottom of the current wizard page. The "datasets" name misled us at first, because it led us to believe this was a list of actual DataSets, when in fact it was a list of the DataTables in our new DataSet. We opened the drop-down and picked QuizAnswers (as shown in Figure 29), as this report was to show the individual results of a quiz. When we changed the dataset, we noticed that the available fields changed as well.

Figure 29: Picking a dataset (or actually a DataTable that points at a database table)

 

Once we had the shape of the data we wanted, we moved to the next page of the wizard. This next page let us define the basic report we wanted. We used the wizard page to put the initial fields we wanted in our report by dragging the fields into the “Values” box as shown in Figure 30 below.

Figure 30: Dragging the Fields into the Values section

 

The next two pages showed us a preview of our report section and a way to format the report to our liking. We just clicked through the Next and Finish buttons to complete the report at this point. This left us with a simple report with our detail section. The whole report from the wizard is shown in Figure 31.

Figure 31: Our initial report

 

Once the wizard completed, we ended up with a report file (.rdlc) in our web project (specifically /Reports/QuizDetails.rdlc). The DataSet the wizard gave us was not quite what we wanted: it had the correct shape, but we wanted to fill it with the data for one specific quiz that was given. For this, we needed a query that returned the same shape as the DataSet that was created. To do that, we had to go to the DataSet itself and do some refactoring. When we opened the DetailsDataSet that the wizard created, it showed all the tables (and their relationships), but that wasn't useful for our needs. Since we needed a flat set of data to report on, we decided to change it to return only a single DataTable. We deleted all the DataTables except the QuizAnswers DataTable (the one we selected as our dataset in the wizard). Once we were left with just QuizAnswers, we needed to change the TableAdapter that is used to fill the table. We right-clicked on the TableAdapter and picked Configure (as shown in Figure 32).

Figure 32: Configuring the TableAdapter

 

In the TableAdapter Configuration Wizard, we simply entered a SQL query that represented the data we wanted to retrieve. Since this was for a report, the query did some calculations to determine whether each answer was correct. We also wanted our DataTable to be filled with one and only one QuizResult's data, so we put a query parameter in the where clause to limit the results to a single QuizResultId, as shown in Figure 33.

Figure 33: Adding a SQL Query to our DataSet

 

The query parameter is important, as we needed to be able to limit the data returned. Since we were working with a local SQL Server Reporting Services report (e.g. the report is kept on the web server, not on a remote SSRS report server), we did not need to use a report parameter to specify the id of the QuizResult to use in the report. We did need to specify it when we specified the data to use for the report, which we did directly in the web page that shows the report to the user.
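
To make the parameter concrete, the TableAdapter query was roughly of this form (a sketch; the QuizAnswers column name is an assumption, and the @QuizResultsId parameter name mirrors the one used in the page shown later):

SELECT *
FROM   GiveAQuiz.QuizAnswers
WHERE  QuizResult = @QuizResultsId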

In addition, changes to the data in the DataSet proved difficult to keep in sync with the report, as the report designer in Visual Studio 2010 crashed quite a bit and in other cases simply did not work. We had to restart Visual Studio 2010 a number of times during development, but we did get it working after some frustration.

 

Displaying the Report

When trying to display our report, we found that ASP.NET ships with a ReportViewer control to show reports, but it is an ASP.NET Web Forms control. The control requires ViewState to work, which is not supported in ASP.NET MVC. This was not a problem, as ASP.NET Web Forms and MVC can co-exist in the same project quite simply. We created a simple Web Forms page to host the report. Because we could not use the control on an MVC page (our master pages were MVC pages, not Web Forms), we decided to show the report in an MVC View via a Web Forms page loaded in an IFRAME. It is not as clean as we would have liked, but pointing an IFRAME at the report allowed us to avoid duplicating the master pages in both Web Forms and MVC:

<%@ Page Title=""
    Language="C#"
    MasterPageFile="~/Views/Shared/InternalPage.master"
    Inherits="System.Web.Mvc.ViewPage<dynamic>" %>
<asp:Content ID="Content1" ContentPlaceHolderID="TitleContent" runat="server">
  Quiz Details Report
</asp:Content>
<asp:Content ID="Content2" ContentPlaceHolderID="MainContent" runat="server">
  <% var src = ResolveUrl(
                 String.Format("~/Reports/QuizDetails.aspx?quizResultsId={0}",
                               Request.QueryString["quizResultsId"])); %>
  <iframe src="<%= src %>" frameborder="0" width="850px" height="700px"></iframe>
</asp:Content>
<asp:Content ID="Content3" ContentPlaceHolderID="PageHeader" runat="server">
  Quiz Details Report</asp:Content>

By using the iframe, we could create our Web Forms page with the ReportViewer and simply host it in an MVC View that contained the master pages.

Our Web Forms page (QuizDetails.aspx) is a simple page with just the pieces required to display the report. Hosting the report requires three elements: a ScriptManager, the ReportViewer, and an ObjectDataSource:

<body>
  <form id="form1" runat="server">
  <div>
    <asp:ScriptManager ID="scriptMgr" runat="server" />
    <rsweb:ReportViewer ID="ReportViewer1"
                        runat="server"
                        Font-Names="Verdana"
                        Font-Size="8pt"
                        InteractiveDeviceInfos="(Collection)"
                        WaitMessageFont-Names="Verdana"
                        WaitMessageFont-Size="14pt"
                        Width="800px"
                        Height="675px">
      <LocalReport ReportPath="Reports\QuizDetails.rdlc">
        <DataSources>
          <rsweb:ReportDataSource DataSourceId="reportDataSource"
                                  Name="QuizDetailsDataSet" />
        </DataSources>
      </LocalReport>
    </rsweb:ReportViewer>
    <asp:ObjectDataSource ID="reportDataSource"
                          runat="server"
                          OldValuesParameterFormatString="original_{0}"
                          SelectMethod="GetData"
                          TypeName="GiveAQuizSite.Reports.QuizDetailsDataSetTableAdapters.QuizAnswersTableAdapter">
      <SelectParameters>
        <asp:QueryStringParameter DefaultValue=""
                                  Name="QuizResultsId"
                                  QueryStringField="quizResultsId"
                                  Type="Int32" />
      </SelectParameters>
    </asp:ObjectDataSource>
  </div>
  </form>
</body>

Because the ReportViewer is an ASP.NET AJAX control, we needed to add a ScriptManager to the page to support injecting the scripts that ASP.NET AJAX and the ReportViewer require. The ReportViewer points at the correct report file (.rdlc) and specifies which data source to use. Even though we had already specified the data in our design of the report, we had to respecify a data source here; the original definition was for design-time support only. We used the same source (the generated DataSet), with the QuizResultsId passed in to specify which quiz result’s data to show. The ReportDataSource is a pointer to a data source on the page (note that we are using an ObjectDataSource to reach the DataSet that was created for us). The ReportDataSource also specifies the name of the data source (i.e., the dataset) defined inside the report: QuizDetailsDataSet.

Lastly, the ObjectDataSource specifies the TypeName of a TableAdapter that can generate the DataTable we need (i.e., QuizAnswersTableAdapter). For its select parameter, it uses a QueryStringParameter to fill QuizResultsId from the quizResultsId value on the query string.
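For completeness, the same wiring can be done in code rather than markup. The following is a minimal sketch of what the code-behind might look like if we had skipped the ObjectDataSource and bound the data ourselves; the declarative version above is what the site actually uses:

using System;
using System.Web.UI;
using Microsoft.Reporting.WebForms;

public partial class QuizDetails : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        if (IsPostBack) return;

        // The same generated TableAdapter the ObjectDataSource points at.
        var adapter = new GiveAQuizSite.Reports
                          .QuizDetailsDataSetTableAdapters.QuizAnswersTableAdapter();
        int quizResultsId = Int32.Parse(Request.QueryString["quizResultsId"]);

        // The data source name must match the dataset defined inside the .rdlc.
        ReportViewer1.LocalReport.DataSources.Clear();
        ReportViewer1.LocalReport.DataSources.Add(
            new ReportDataSource("QuizDetailsDataSet", adapter.GetData(quizResultsId)));
        ReportViewer1.LocalReport.Refresh();
    }
}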

Once the Web Forms page is defined, we can simply run the MVC View; the iframe loads the Web Forms page and shows the report, as shown in Figure 34.

Figure 34: The Report displayed on a web page

 

SQL Server Reporting Services

In this initial version of GiveAQuiz.com, we planned to host the site at an ISP on a shared machine. Because of this, we did not have a dedicated SQL Server Reporting Services instance, so we used a local report located on the web server. The plan moving forward is to take advantage of a fuller reporting experience by hosting our reports directly in a SQL Server 2008 R2 Reporting Services instance. Local reports provide a migration path from a report located on the web site to a server-based SQL Server Reporting Services report in the future. But why would we want to move to a SQL Server Reporting Services solution instead of just keeping the reports on the web server? The full SQL Server Reporting Services offers additional benefits for both functionality and scalability.

The benefits of moving to SQL Server Reporting Services include:

  • Parameter prompt UI: users can specify query filters directly in the viewer control. As noted earlier, this is not possible with local reports.
  • Performance: more complex reports (complex expressions and the like) become practical, and report processing is offloaded to a dedicated remote server.
  • Report scheduling.
  • Report delivery (e-mailing reports, not just making them available via a web site).
  • Report caching.
  • Report data syndication.

SQL Server Reporting Services allows you to deliver reports on specific schedules. This would let us send daily reports for a quiz to the quiz’s owner without having to write our own scheduling system. We plan to add this in the second revision of the site, but until we move to an instance of SQL Server Reporting Services we could not include it in this first version. For scheduled reports, we also want to deliver the reports directly in a usable format such as PDF, Excel, or Word documents. When a scheduled report is sent via e-mail, you can decide which format(s) to use. This would allow us to deliver the reports in a form that is immediately usable no matter the form factor (smart phone, laptop, desktop, etc.).
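In the meantime, a local report can already produce those formats on demand. As a rough sketch (using the same ReportViewer shown earlier, and leaving out the delivery code such as e-mailing the bytes), the LocalReport object renders directly to PDF:

// Render the local report to PDF; the out parameters describe the output.
string mimeType, encoding, extension;
string[] streamIds;
Microsoft.Reporting.WebForms.Warning[] warnings;

byte[] pdfBytes = ReportViewer1.LocalReport.Render(
    "PDF", null, out mimeType, out encoding, out extension,
    out streamIds, out warnings);

// pdfBytes can now be written to the response or attached to an e-mail.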

One concern we have with reporting in general is that reporting queries are often not tuned to perform particularly well, so as the site grows we would like to cache the reports so they do not impact SQL Server performance directly. Caching works for reports run ad hoc as well as for scheduled reports: a scheduled report runs on its schedule, and subsequent calls to the report simply return the cached version.

Lastly, we would like to allow users to manipulate the results of the reports in any way they see fit. In SQL Server 2008 R2 Reporting Services, we could expose our report results directly as OData feeds (the same kind of feed that WCF Data Services uses, though in this case the access would be read-only). Exposing the results as an OData feed would give users access to the same summary data we report against, so they can use it in whatever way they need.
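Because an OData feed is just Atom over HTTP, consuming the syndicated report data would be straightforward. The following sketch is purely illustrative: the feed URL is hypothetical, and the real address would come from the report server’s Atom rendering extension in SQL Server 2008 R2:

using System.Net;

// Hypothetical report server address; rs:Format=ATOM asks SSRS 2008 R2
// for an Atom service document describing the report's data feeds.
var client = new WebClient();
string atom = client.DownloadString(
    "http://reports.giveaquiz.com/ReportServer?/QuizDetails&rs:Format=ATOM");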

 

Where Are We?

The GiveAQuiz.com site is an open source example of how to use the Microsoft data stack to meet real user needs. The hope is that it serves both as a useful example and as a service to the Internet community. By using the Microsoft data stack, we spent most of our development time working through actual business problems (e.g., how to handle timed tests, or how to make question types extensible so that new question types would simply appear in the application). We were powered by the data in our application, but did not waste time fighting the database or the data access layer. The data stack simply became a tool for us to manipulate our data and report against it.

 

About the Author

Shawn Wildermuth is a Microsoft MVP (C#), a member of the INETA Speaker's Bureau, and the author of six books about .NET. Shawn is involved with Microsoft as a Silverlight Insider, Data Insider and Connected Technology Advisor (WCF/Oslo/WF). He has spoken at a variety of international conferences, including SDC, VSLive, WinDev, MIX, DevTeach and DevReach. Shawn has written dozens of articles for a variety of magazines and websites, including MSDN, DevSource, InformIT, CoDe Magazine, ServerSide.NET and MSDN Online. He has over twenty years of software development experience and regularly blogs about a range of topics, including Silverlight, Oslo, databases, XML and web services, on his blog (http://wildermuth.com). He is currently teaching workshops around the country through his training company, AgiliTrain.

Notes

[1] This project is often referred to as “DataDude” as that was its codename before being integrated into Visual Studio.

[11] Note: To view the raw XML in IE7 or IE8, you’ll need to disable the “Feed Reading View”: go to Internet Options > Content > Feeds Settings and uncheck “Turn on feed reading view”.

[12] Silverlight blocks all synchronous access by throwing NotImplementedException in the networking stack.
