
What Microsoft Visual Studio 2005 Team Edition for Database Professionals Can Do for You

Visual Studio 2005
 

Tony Loton
LOTONtech Limited

August 2006

Applies to:
   Visual Studio 2005 Team Edition for Database Professionals - CTP4

Summary: This article provides an overview of the latest addition to the Microsoft Visual Studio Team System: Visual Studio 2005 Team Edition for Database Professionals. (17 printed pages)

Note   This document was developed prior to the product's release to manufacturing, and as such, you may find inconsistencies with the details included here and those found in the shipping product. The information is based on the product at the time this document was created and should be used for planning purposes only. Information is subject to change at any time without prior notice. Microsoft may have patents, patent applications, trademarks, copyrights, or other intellectual property rights covering subject matter in this document. Except as expressly provided in any written license agreement from Microsoft, the furnishing of this document does not give you any license to these patents, trademarks, copyrights, or other intellectual property.

Contents

Introduction
Managing Database Objects and Schemas as Version Controlled Items
Unit Testing Stored Procedures
Data Generation
Rename Refactoring of Database Objects
Conclusion

Introduction

In the first half of 2006, I was commissioned to design a brand new system using Microsoft Visual Studio 2005 Team Edition for Software Developers as the integrated development environment. The system was based on ASP.NET 2.0 and Microsoft SQL Server 2005 technologies, and the tool and the technologies performed well. But on the database side, we had to adopt some practices that would not have been necessary had the most recent edition of Visual Studio—namely Visual Studio Team Edition for Database Professionals—been available.

In this article, I'll introduce some of the new features of this Visual Studio edition, paying particular attention to those features that would have been most beneficial to our project team. To avoid treading on anyone's intellectual property toes, I've reworked and simplified some of the examples, but do keep in mind that my coverage really is driven by my hindsight view of our project. In that sense, the title of this article could just as well be "What Microsoft Visual Studio Team Edition for Database Professionals Could Have Done for Me."

The set of problems that I'll cover includes:

  • How to manage database objects as version controlled items within a solution.
  • How to unit test when your only executable "units" are stored procedures.
  • How to set up a database schema and test data for unit testing.
  • How to easily rename database table and column names that have been generated by a schema creation tool.

Managing Database Objects and Schemas as Version Controlled Items

As with all of our other software artifacts—source code files, XML files, and so forth—we wanted our database artifacts—table definitions, stored procedures, and so forth—to be part of our solution. We wanted not only to share those artifacts between developers but also to put them under version control. The problem was that the database objects are stored within the database itself, and the editing of those objects is driven from Visual Studio 2005 Server Explorer instead of Solution Explorer—therefore outside the solution scope and outside version control.

Sharing database objects and schemas by simply pointing all development computers at the same database server instance is not ideal. It breaks down entirely if you want each developer to work locally, making schema changes that are not visible to other developers until those changes have been thoroughly tested.

One option would be for developers to use a file-based SQL Server 2005 Express Edition database and to store the underlying database .mdf file within the solution. In this scheme, developers could make changes to the database schema locally as long as they had checked out the latest version of the .mdf file, providing that we did not allow multiple checkouts. Putting the whole database schema under version control is hardly a recipe for concurrent working, so we quickly shelved that idea.

Probably the most widely used approach—which is the one we adopted—is that of generating .sql script files from the database in Server Explorer into a database project in Solution Explorer, as shown in Figure 1.


Figure 1. Generate Create Script to Project

That approach gives you something in your solution to put under version control, namely the generated .sql script files. You also get finer granularity than you would by putting the whole database schema under version control in an .mdf file. But do you see the mismatch between the hierarchy shown in Server Explorer and that shown in Solution Explorer? In Server Explorer, the database objects are neatly organized into tables, stored procedures, and so forth. In Solution Explorer, we see no such arrangement, merely a set of .sql files all at the same level within the Create Scripts folder. Yes, I know that we could have created folders and moved the scripts around in the project, but would the Generate Create Script to Project option keep track of where we'd put them?

In our chosen approach, developers working on a particular table or stored procedure definition could make changes directly in the script file. Alternatively, they could open the item in Server Explorer, make the change there, regenerate the script, and check in the script when finished. Other developers could get the latest version from source control, as they would any other artifact, and run the script or scripts against their own development databases, as shown in Figure 2.


Figure 2. Running scripts

That's fine, apart from the fact that it's difficult for the developer to see the effect of applying the script or scripts to the database. With all those separate scripts, it's also difficult for the developer to decide which scripts to apply and which to skip. In the end, we created an aggregate script that executed all of the other scripts in turn, which required each developer to recreate the local database entirely after each "get latest" operation.
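Our aggregate script amounted to something like the following sketch. The file names here are hypothetical, and this version assumes the script is run in SQLCMD mode (available with SQL Server 2005), where the :r directive includes and executes each referenced script in turn:

```sql
-- RecreateDatabase.sql : a hypothetical aggregate script, run in SQLCMD mode.
-- One "get latest" followed by one run of this script rebuilds the local
-- database from the individually version-controlled object scripts.
:r .\CreateScripts\dbo.Accounts.sql
:r .\CreateScripts\dbo.spSelectAccounts.sql
```

The drawback, as noted, is that the whole database is recreated even when only one object has changed.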

Now that I have set out the problem and our best solution to date, let's take a look at what Team Edition for Database Professionals offers in this area.

For starters, you can now create a database project in which schema objects are arranged in your project exactly as they are arranged in Server Explorer. You can display Schema View, which is to database objects what Class View is to .NET code objects. Take a look at Figure 3, and remember that you're looking at the structure of a database project in Schema View, not the contents of a database in Server Explorer.


Figure 3. Schema View

Although you can't see it in Figure 3, the database objects are still stored individually as script files, so you can put each one under version control just as before. The only real difference, which you can see in Solution Explorer in Figure 4, is that the file extensions are now .proc.sql for stored procedures (and .table.sql for table scripts) instead of the one-size-fits-all .sql extension. The version control system that you use to manage those files is your choice, as long as it's compatible with Visual Studio. The most likely choices are Microsoft Visual Studio 2005 Team Foundation Server (TFS, the preferred choice) or Visual SourceSafe 2005.


Figure 4. Solution Explorer

When you create or modify stored procedures and table definitions in Solution Explorer, you change nothing in your database at all. So the next question that springs to mind is how to see the effect of those changes on your database. You can do that by opening the Data menu, pointing to Schema Compare, and clicking New Schema Comparison.

After you select that menu option, the New Schema Comparison dialog box appears and prompts you to select a source schema and a target schema. In Figure 5, I have selected the project that contains the database scripts as the source schema and a local database as the target schema.


Figure 5. New Schema Comparison

When you accept the settings in that dialog box, Visual Studio displays several panes that illustrate the differences between the two schemas, in a style very similar to most source-code merge tools. The upper pane (see Figure 6) shows the presence or absence of each object in the source and target schemas. You also see an Update Action, which helps you to decide on the appropriate action to take for each object. In my example, the target database already contains a definition for the Accounts table, so the default action for that object is Skip. The target database does not contain a definition for the stored procedure, so the default action for that object is Create.


Figure 6. Schema Compare

The additional panes show the definitions of the objects in both the source schema and the target schema, along with the schema update script that could be executed to make the change. In my example, because the stored procedure is new, we see its definition in the Source Object pane but not in the Target Object pane.

Applying the changes to the target schema is simply a matter of clicking the Write Updates toolbar button. Later you'll learn about another way to update a target schema, as part of the initialization of unit tests.
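For my example, the update script would amount to little more than the procedure definition itself, along these lines. Treat this as an illustrative sketch: the script that the tool actually emits also includes housekeeping statements such as SET options and transaction handling.

```sql
-- Illustrative schema update script: the Accounts table already exists in
-- the target (Update Action = Skip), so only the new procedure is created.
CREATE PROCEDURE [dbo].[spSelectAccounts]
AS
   SELECT * FROM Accounts
RETURN
GO
```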

Unit Testing Stored Procedures

We all know the value of unit testing in modern software development, right? But what if you have no units to test?

In our real-life project, we were using ASP.NET 2.0 controls, such as GridView, connected through SqlDataSources to stored procedures. You can argue amongst yourselves about whether we should have gone through ObjectDataSources instead, which would have given us some testable units. But it's lucky that we didn't, because it illustrates the next point perfectly.

We had Web pages, which were better tested using Web tests, and we had business logic. Because the business logic was encapsulated entirely within stored procedures, we could not generate unit tests for it in Team Edition for Software Developers. Our solution was to create a wrapper class in Visual Basic .NET having a method that corresponded to each and every stored procedure that we had created. Those methods had exactly the same parameters and return values as the underlying stored procedures, and the methods simply delegated to the stored procedures. For example, the following code wraps an update stored procedure named spEmployeesUpdateUserName.

Imports System.Data
Imports System.Data.SqlClient

Public Class SprocWrappers
  Inherits WrapperBase

  ' Wraps the spEmployeesUpdateUserName stored procedure so that it can be
  ' called, and therefore unit tested, as an ordinary .NET method.
  Public Shared Function spEmployeesUpdateUserName(ByVal EmployeeNumber As String, ByVal UserName As String) As Integer

    Using sqlConnection As New SqlConnection(GetConnectionString())
      sqlConnection.Open()

      Dim sqlCommand As New SqlCommand("spEmployeesUpdateUserName", sqlConnection)
      sqlCommand.CommandType = CommandType.StoredProcedure
      sqlCommand.Parameters.AddWithValue("@EmployeeNumber", EmployeeNumber)
      sqlCommand.Parameters.AddWithValue("@UserName", UserName)

      ' ExecuteNonQuery returns the number of rows affected by the update.
      Return sqlCommand.ExecuteNonQuery()
    End Using

  End Function

End Class

In Team Edition for Software Developers (not considering the database edition just yet), you could create a unit test by right-clicking the spEmployeesUpdateUserName method and then clicking Create Unit Test. Your unit test method would typically contain code like the following example:

' Update the employee user name
resultInt = SprocWrappers.spEmployeesUpdateUserName("EMP001", "Tony")
Assert.IsTrue(resultInt > 0, "EmployeesTest: Failed to update Employee username")

That approach works well enough, but you can imagine how much effort it takes to reliably create a complete set of wrapper methods solely for the purpose of unit testing your stored procedures. You'll be pleased to hear that Team Edition for Database Professionals takes away all of that hard work.

If you look back at my earlier figures, you'll see that my sample schema for this article contains a single stored procedure, which is named spSelectAccounts. Its function is very simple: to select all of the rows from my Accounts table. The code for that stored procedure, in the file spSelectAccounts.proc.sql, is shown in the following example:

CREATE PROCEDURE [dbo].[spSelectAccounts]
AS
   SELECT * FROM Accounts
RETURN 

With Team Edition for Database Professionals, it's very easy to create a unit test for that procedure. Just right-click spSelectAccounts in Schema View, and then click Create Unit Tests to display the Create Unit Tests dialog box, as shown in Figure 7. I've selected the spSelectAccounts stored procedure, created a Visual Basic test project named SpUnitTest to hold the test class, and named the test class spSelectAccountsTest.vb.


Figure 7. Create Unit Tests

You can achieve the same result by opening the Test menu, clicking New Test, and then selecting the Database Unit Test template.

In Figure 8, I am prompted for a database connection to use when running the unit test. I chose the same local database that I used previously, and at this stage, I completely ignored the Pretest Database Setup options. The fact is that I had already set up my test database with the required schema (Figure 6) and had manually populated the Accounts table with three sample rows. So I'll run with that setup for now and revisit data generation later in the article.


Figure 8. Database Connection Configuration

After I've selected the database connection, the new unit test class appears in design view, as Figure 9 shows. There is an alternative code view, of course, which I'll not delve into for the purposes of this article.


Figure 9. Unit Test Class in Design View

If you created the unit test by opening the Test menu and clicking New Test, the Transact-SQL (T-SQL) code to exercise the stored procedure will not have been generated automatically. You would then have the option of entering some T-SQL statements to exercise more than one stored procedure in a single test, or you could enter the code shown in Figure 9 to exercise your one chosen stored procedure. Those statements—in case you can't read them in the figure—would be:

DECLARE @RC Int
SELECT @RC = 0
EXEC @RC = [dbo].[spSelectAccounts]
SELECT RC=@RC 
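If you did want a single test to exercise more than one stored procedure, the T-SQL might look something like the following sketch. The second procedure, spSelectAccountsByName, is hypothetical; it does not exist in my sample schema:

```sql
DECLARE @RC Int
SELECT @RC = 0
-- Exercise each procedure in turn; the test conditions below the T-SQL
-- pane are evaluated against the result sets that these calls return.
EXEC @RC = [dbo].[spSelectAccounts]
EXEC @RC = [dbo].[spSelectAccountsByName] @AccountName = 'Loton'
SELECT RC=@RC
```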

What I'd really like to draw your attention to is the Test Conditions pane, which was populated by default with an Inconclusive test condition, as you would expect with any newly generated unit test. I removed that default condition, replaced it with a Row Count condition, and specified (in Properties) that I expected two rows to be returned. If you remember, I populated the Accounts table with three sample rows, so I actually expect this condition to fail. And that's exactly what we see in the lower pane of Figure 9: Result = Failed, with the error message:

Row Count Condition (RowCountCondition1) Failed: ResultSet 1 : The 3 row(s) did not match the 2 expected rows(s)

To get that result, I ran the test by clicking Run on the Test Results toolbar. However, it's perfectly possible to also run the test from Visual Studio's Test View, just as you would with any other unit test.

Data Generation

The success or failure of your unit tests will depend on the state of the database—both the table definitions and the data within those tables—when the tests are run. In our project, we had not only a set of SQL scripts to recreate the database schema for each developer, but we also had a set of SQL scripts to pump a known set of data into the database. Thus each developer would have the same schema and the same data with which to perform unit tests. Our solution was only semi-automatic in that we had the scripts but the developers had to run them before unit testing (or execute them as part of the nightly build). However, the whole process is fully automated by Team Edition for Database Professionals.

Do you remember the test configuration dialog box, shown again in Figure 10, which had additional fields for specifying the Pretest Database Setup? In that dialog box, you can specify a database schema project and a data generation file that together define the structure and content of your database as it should be immediately before unit testing.


Figure 10. Database Connection Configuration with Pretest Database

You can open that dialog box again by opening the Test menu and clicking Database Test Configuration, which I did to create Figure 10. You can see that I have specified the database project that I've been working with as the Database project for schema deployment. I've also specified a file named UnitTestingDataGeneration.dgen as my data generation file. The choice of database project is obvious, but where did the data generation file come from?

I created that file by right-clicking the Data Generation Plans subfolder of my database project and then clicking Data Generation Plan. Figure 11 shows my new file, UnitTestingDataGeneration.dgen, in Solution Explorer. In the main pane, you can see that I have configured this data generation plan to insert two rows into the Accounts table. You will remember that three rows of test data in that table caused the unit test for the spSelectAccounts stored procedure to fail and that two rows should cause the test to pass.


Figure 11. Unit Test Data Generation

When I run the unit test again with that pre-test database configuration, it does indeed pass.

At this point, it is interesting to open up the Accounts table to see what the two rows of generated data look like, which is what I've done in Figure 12. You can see that the generated data consists of randomly generated strings, which in my case is sufficient because I only want to test the number of rows, not their contents. It's beyond the scope of this article, but I can tell you that it is possible to generate much more meaningful data by using the Regular Expression data generator. For example, if you want your sample account numbers to always consist of eight numeric digits, rest assured that it can be done.


Figure 12. Generated Data
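As a taste of the Regular Expression generator, here are the kinds of patterns you might attach to the columns of my sample Accounts table. These patterns are my own illustrations, not values taken from the tool:

```
AccountNumber : [0-9]{8}                      -- always eight numeric digits
SortCode      : [0-9]{2}-[0-9]{2}-[0-9]{2}    -- typical UK sort-code format
```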

Rename Refactoring of Database Objects

The final problem that we encountered in our project, for which the new Visual Studio edition provides a solution, is that of renaming database objects. That problem arose for us as follows:

One member of our team, who had a good background in database design, took on the task of designing our database schema. He had an Oracle background and was therefore not familiar with the Microsoft toolset. Therefore, he used one of those one-size-fits-all, entity-relationship (ER) tools to create a database-independent schema complete with field domains, constraints, and inheritance of entities. From that generic schema, we generated SQL Server Data Definition Language (DDL).

The problem was that his spelling was a bit dodgy, so we would have column names like ASESSMENT (look closely) that were not spotted until well into the project. I'm not joking! And because of how the tool for modeling schemas worked, we would see some column names with auto-generated prefixes, such as a column named TITL_TITLE where plain old TITLE would have done just fine. Undoing those problems is surprisingly difficult because of the key relationships between those columns, the stored procedures that rely on their names, and so on. Of course, this is where the Rename Refactoring feature of Team Edition for Database Professionals comes in.

To demonstrate the refactoring, I'll rename one of the columns in my Accounts table from SortCode (which is a number that identifies banks in the United Kingdom) to RoutingTransitNumber (which I understand to be the equivalent in the United States). So that we can see the consequential effects of that refactoring, I've already changed my spSelectAccounts stored procedure from...

CREATE PROCEDURE [dbo].[spSelectAccounts]
AS
   SELECT * FROM Accounts
RETURN 
to...
CREATE PROCEDURE [dbo].[spSelectAccounts]
AS
   SELECT AccountNumber, SortCode, AccountName FROM Accounts
RETURN 

To proceed with the refactoring, I clicked the SortCode column of the Accounts table in Solution Explorer, opened the Data menu, pointed to Refactor, and then clicked Rename. That sequence displayed the Rename dialog box, shown in Figure 13, which I used to specify the new name of the column.


Figure 13. Rename

Notice the Preview changes check box, which, when selected, displays the Preview Changes dialog box shown in Figure 14. You can see that the SortCode column of the Accounts table will be renamed (of course). More interestingly, the SELECT statement of the spSelectAccounts stored procedure will also change.


Figure 14. Preview Changes - Rename

Indeed, the end result is that the stored procedure code is automatically updated, as the following example shows:

CREATE PROCEDURE [dbo].[spSelectAccounts]
AS
   SELECT AccountNumber, RoutingTransitNumber, AccountName FROM Accounts
RETURN 

Now a word of warning: this rename facility only updates references to renamed objects within the database itself. You'll have to rename any references to those objects in your .NET code yourself, either manually, by using search and replace, or by using Visual Studio's (code) rename refactoring. That was not such a problem in our project because, as I said earlier, we had encapsulated most of the business and data logic within stored procedures. As luck would have it, our renaming problem was well encapsulated within the database.

Conclusion

In this article, I've introduced many of the new features provided by Visual Studio Team Edition for Database Professionals; in particular, those features that would have been genuinely useful in my most recent project. That dose of realism was warts-and-all, in the sense that where I think we might have taken a wrong turn architecturally—for example, in coupling our page logic directly to stored procedures—I've not shied away from describing our approach if it served to illustrate an important point.

I've included enough detail to illustrate the appropriateness of each new feature in a real-life situation, while keeping it simple enough for you to follow without too much effort. I've not covered some new features—such as the enhancements to the T-SQL editor—because they were not directly relevant to my set of problems. However, I hope to have given you enough food for thought to dig a bit deeper yourself.

Related Information

Visual Studio Team System Developer Center

 

About the author

Tony Loton works through his company LOTONtech Limited (http://www.lotontech.com) as an independent consultant, trainer, and author. He has been involved with Visual Studio Team System since the very early beta releases and has co-authored the Wrox book Professional Visual Studio 2005 Team System (http://www.wrox.com/WileyCDA/WroxTitle/productCd-0764584367.html).
