
Reality Check with the Modeling Panel


August 2006

Click ARCast: Reality Check with the Modeling Panel to listen to this ARCast.

Wait a minute! Stop all the hype. It is time for a reality check. Listen as our panelists Steven Kelly, Martin Danner, and Jack Greenfield enter the cone of silence to share their thoughts...

Ron Jacobs: Welcome, welcome, friends, to ARCast. Hey, I am so glad you joined me today, because today we have the classic ARCast format. This is ARCast no. 7, and in this series, we are talking about Model-Driven Development. We are asking, "Is this likely to gain widespread adoption? Will it make building software faster, better, or cheaper?"

Well, listen up, because we have some experts on the line who are going to give you their opinions. First up: Steven Kelly. Steven is the CTO of MetaCase and cofounder of the DSM Forum. Steven is an organizer of the OOPSLA workshops on domain-specific modeling. Steven has authored articles in journals such as Dr. Dobb's Journal and Object Spectrum, and he is a member of the editorial board of the Journal of Database Management. Let's welcome Steven Kelly.

Steven Kelly: Hi, this is Steven Kelly from MetaCase. For those of us selling Model-Driven Development, the idea that the model is dominant over the code is hardly worth stating; it should be quite self-evident by now. So it was strange to hear that Jack Greenfield disagrees. Jack said that in their DSL tools, he doesn't want the model to be dominant over the code. This might be related to what Steve Cook said in the ArcTalk podcast: that Microsoft only expects its DSL tools to be able to generate 40 to 80 percent of the code. Of course, the tools are only at a pre-beta stage so far.

So, I hope that, as the tools mature over a few versions, Microsoft will come to believe more in the approach and in the tools' ability to generate a much higher percentage of the product from models. That was certainly how it went for us back in 1995; it took our customers to show us that 100-percent code generation really could be done. Since there is a possibility for misunderstanding here, let's make clear what we are talking about in terms of generating a full product.

Almost any product shipped today contains a large amount of third-party or in-house code that is not produced specifically for that project, but is reused over many projects and often by many companies. That's not the code we would expect to see modeled or generated as part of the project.

No, what we are interested in is the new code needed for that product or project. Before, that code would have been written by hand. Now, we are aiming at 100-percent generation of that code, and often we can meet that target. It may be hard to believe until you see it, but I am sure it was hard for people to believe that coding languages like C could effectively replace assembler. Of course, there is always the chance of some new feature in our next product requiring something not yet in our modeling language or generators.

That's no problem; we can either add it to the language or the generators, if we think it's going to be useful in the future, or, if it's a one-off thing, we can simply code it by hand like we used to. This is similar to how we use components now. Normally, we work through a high-level API, though sometimes we need to drop down to a lower level to do something special. Provided that's the exception and not the rule, it's not a problem. After all, developers of games and embedded software are still used to having to drop right down to the level of assembler for a few critical tweaks.

And, whatever we do, we will always be building on top of a stack of lower levels: code, assembler, binary signals, analog signals, right down to the joys of quantum physics. The great thing about raising the level of abstraction is that these become somebody else's problem.

They build the compilers, assemblers, and chips that automatically transform our input progressively down to the very lowest level, where things actually happen. As Bran Selic rightly pointed out, that raising of the level of abstraction and automation is what makes our lives easier and more productive. In other words, domain-specific modeling does to code what compilers did to assembler.

Ron Jacobs: Our next panelist is Martin Danner. Martin Danner founded Arrowrock Corporation in 1998. He is a PMI Project Management Professional, a Microsoft Certified Solution Developer for the .NET Framework, and a Microsoft Architect MVP. Let's welcome Martin Danner!

Martin Danner: Hi, this is Martin Danner with Arrowrock Corporation in Boise, Idaho. I want to start out today by talking about the modeling wars. Much is made of the feud between Microsoft's software factories and the OMG's model-driven architecture framework, or MDA. A fair amount of polarizing rhetoric is being bandied about, and there are some sharp opinions on the subject. One gets the impression that it is an "either-or" proposition: that you are either in the MDA camp or the software-factories camp.

Well, I've had a close look at both, and I can tell you that it's remarkable how many of the same basic concepts they share. For instance, both describe a method for designing systems at some predefined level of abstraction, and then converting that design into an implementation of one sort or another through a predefined transformation process.

Also, both employ the notion of viewpoints and views. Simply put, a viewpoint is a type of model (say, a class diagram), whereas a view is an instance of that model for a specific system (for example, a class diagram of an accounts-receivable module). So, what is all the fuss about? As far as I can tell, this is the main difference: in MDA, all the modeling is done using UML and its companion, the Object Constraint Language, or OCL.
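Martin's viewpoint/view distinction can be sketched in a few lines of code. This is a hypothetical illustration only; the class names and the conformance check are invented for the example and are not part of any panelist's actual tooling:

```python
from dataclasses import dataclass

# A viewpoint is a *type* of model: it names the concepts a model may contain.
@dataclass
class Viewpoint:
    name: str
    concepts: frozenset

# A view is an *instance* of a viewpoint, built for one specific system.
@dataclass
class View:
    viewpoint: Viewpoint
    system: str
    elements: dict  # element name -> concept drawn from the viewpoint

    def conforms(self) -> bool:
        # Every element must use a concept that its viewpoint defines.
        return all(c in self.viewpoint.concepts for c in self.elements.values())

class_diagram = Viewpoint("Class Diagram", frozenset({"Class", "Association"}))
ar_view = View(class_diagram, "Accounts Receivable",
               {"Invoice": "Class", "Customer": "Class", "owes": "Association"})
print(ar_view.conforms())  # True: the view uses only concepts the viewpoint defines
```

The same `Viewpoint` can have many `View` instances, one per system modeled, which is exactly the type/instance relationship Martin describes.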

Software factories, on the other hand, do not constrain the modeling to any particular language. Rather, they employ the notion of a Domain-Specific Language, or DSL. A DSL is custom-tailored to express the concepts of the domain being modeled. Unlike UML, which defines a single modeling language, there can be many different DSLs, just as there are many different domains to be modeled.

What's more, you can invent your own DSL to create a modeling environment that perfectly fits the domain you are working in. Believe me, when that little nugget of truth sinks in, you get this "aha!" moment that's really quite liberating. Now, I am not saying UML doesn't work. Actually, it's a very useful tool. In fact, I use it all the time. The point I am trying to make is that UML may not always be the right tool for the job.

UML is designed to model a specific domain, namely object-oriented software systems. In other words, UML is a domain-specific language designed to model object-oriented software systems. Think about it. This is really a key point, so I'll say it again: UML is a domain-specific language designed to model object-oriented software systems.

All well and good, you might say, because in the end we are all building object-oriented systems anyway. But you need to ask yourself why you should limit yourself to this one level of abstraction. Sure, the software systems we build are object-oriented, but is it possible, even desirable, to design your software systems at an even higher level of abstraction?

Let me explain what I mean. Imagine, if you will, that you can take your application domain and organize it into reusable elements, where each element represents some well-defined bit of functionality. Now, take those elements and put them in a toolbox to create your very own modeling environment.

Next, drag and drop those elements onto a design surface, configure each element as needed, and wire them together in a way that represents the application you want to build. Finally, with the push of a button, transform your design into clean, tight code, using templates you know are good because you created them. Touch up your application, if needed, with your own custom code that comfortably coexists with the generated code.

Compile and serve with a garnish of system documentation, also generated automatically, and there you have it: a fully baked software application built in a fraction of the time from reusable assets. What I just described is the essence of domain-specific modeling. Sounds like a dream of some distant future, doesn't it?
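The workflow Martin describes, wiring domain elements together and generating code from templates you wrote yourself, can be sketched roughly as follows. This is a toy illustration: the model shape, element names, and template are all assumptions invented for the example, not the output of any real DSM product:

```python
from string import Template

# A toy "model": domain elements wired together by name.
model = {
    "elements": [
        {"name": "OrderIntake", "kind": "Queue"},
        {"name": "Billing", "kind": "Service"},
    ],
    "wires": [("OrderIntake", "Billing")],
}

# A generation template you trust because you wrote it yourself.
class_tmpl = Template(
    "class $name:  # generated from a $kind element\n"
    "    downstream = $targets\n"
)

def generate(model):
    # Walk the model and emit one class per element, recording its wiring.
    code = []
    for el in model["elements"]:
        targets = [dst for src, dst in model["wires"] if src == el["name"]]
        code.append(class_tmpl.substitute(
            name=el["name"], kind=el["kind"], targets=targets))
    return "\n".join(code)

print(generate(model))
```

A real DSM tool works at a far higher fidelity than this, but the principle is the same: the model carries the domain knowledge, and the templates carry the implementation knowledge.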

Actually, we have been able to do it for several years now with a product called MetaEdit+ from MetaCase, which was cofounded by one of our panelists, Steven Kelly; and now you can do it in Visual Studio 2005 using the DSL Tools extensibility. Both products come with tutorials and samples to get you started. I have gone through them and found them to be a very useful learning experience. Software factories are very similar to domain-specific modeling, but extend it in important new ways.

For instance, a software factory might utilize multiple domain-specific languages to model the system from different viewpoints, and then combine and transform those models in various ways to produce a variety of development artifacts, including source-code files, configuration files, SQL scripts, test cases, deployment manifests, system documentation, and so on. Also, software factories extend generative programming capabilities with Guidance Automation.

Guidance Automation provides context-sensitive, just-in-time guidance in the form of templates, recipes, and wizards for the creation of an entire solution, an individual project, or even a single project item. The Guidance Automation Toolkit (GAT) has kept a low profile so far, but I understand that Microsoft's patterns & practices team will begin releasing its application blocks as guidance packages in 2006. So, you will be hearing a lot more about it soon.

Of course, there is more to software factories than what I have just described. In fact, if you pick up and read the book on software factories by Jack Greenfield and company, you will soon find that it goes really deep. But don't let that discourage you. I recommend that, while you are reading the software-factories book, or perhaps before reading it, you work through the tutorials for Microsoft's DSL Tools and MetaCase's MetaEdit+ product. You can start with something simple and expand your horizons from there.

I'll wrap up by saying that I think Model-Driven Development has the potential to become a disruptive technology that fundamentally changes the way we all build software. Domain-specific modeling and its more elaborate offspring, software factories, hold a great deal of promise. In the next and final installment of this panel discussion, I will talk about what it will take for widespread adoption of Model-Driven Development to occur, and I will also share with you my humble opinion regarding the question originally posed to this panel: "Is Model-Driven Development ready for prime time?"

One final comment, a request actually: I would really like to know your thoughts on the subject, as well as any questions this discussion may have sparked. So, please take a minute and add your comments to the Channel 9 Web page for this podcast.

Ron Jacobs: Our third and final panelist for this episode is Jack Greenfield. Jack is an Architect for the Enterprise Frameworks and Tools team at Microsoft. He was previously Chief Architect, Practitioner Desktop Group, at Rational Software Corporation, and Founder and CTO of InLine Software Corporation. Let's welcome Jack Greenfield.

Jack Greenfield: Hi, this is Jack Greenfield with my comments for week 3. I would like to respond to the comments that Martin Danner made in this ARCast session. I agree with Martin that MDD has the potential to become a disruptive technology that significantly improves the way we build software, and I am looking forward to hearing his comments about what it will take to make modeling mainstream. I have to disagree, however, with some of the other things he said, especially his claim that software factories and MDA are similar; in my view, they are not.

To support his claim, he says that both use models to describe systems at a higher level of abstraction, and then use transformations to produce the implementation. He later says that the result of this process, after some manual touch-up, is a fully baked application. Well, that doesn't sound like what we are doing in software factories today.

He also suggests that software factories and MDA take in essence the same approach to views and viewpoints. If you heard my comments in ARCast sessions 5 and 6, you'll have heard me say several things that make it hard for me to agree with Martin on this topic. I'll repeat them here briefly, because it is quite important to understand why software factories are not MDA. First, software factories can use models, but they don't focus on models. Instead, they focus on defining the development process for a family of related systems, and then on supporting that process with reusable assets.

Yes, it's true that some of those assets may be models and modeling tools, but software factories explicitly integrate modeling with more traditional development methods and practices. And some of these other assets may be templates or patterns or best-practice guidelines or class libraries.

As I explained in session 5, we don't always know enough about every aspect of a problem domain to model it. We can, however, identify those aspects and capture information about how each aspect should be addressed in the development process.

This focus on more traditional development methods and practices is not merely window dressing; in fact, I believe it's critical to avoiding the over-promising and under-delivering that doomed CASE to failure. MDA does not integrate modeling with more traditional development methods and practices, and there is legitimate reason for concern, in my opinion, that it will suffer a similar fate.

Second, software factories can use transformations, but they don't focus on transformations. Instead, they focus on defining viewpoints from which different stakeholders participate in the development process, and on managing the artifacts appropriate to each viewpoint. Artifact management may at times involve transformation, such as transforming a logical data model into a physical one.
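The logical-to-physical transformation Jack mentions can be illustrated with a minimal sketch. The type mapping and naming conventions below are assumptions invented for the example, not any product's actual rules:

```python
# Assumed mapping from logical attribute types to physical SQL column types.
LOGICAL_TO_SQL = {"string": "VARCHAR(255)", "int": "INTEGER", "date": "DATE"}

def to_physical(entity, attributes):
    """Transform one logical entity into a physical CREATE TABLE statement.

    `attributes` is a list of (name, logical_type) pairs; a surrogate key
    column is added by convention.
    """
    cols = [f"  {entity.lower()}_id INTEGER PRIMARY KEY"]
    cols += [f"  {name} {LOGICAL_TO_SQL[typ]}" for name, typ in attributes]
    return f"CREATE TABLE {entity} (\n" + ",\n".join(cols) + "\n);"

ddl = to_physical("Invoice", [("number", "string"), ("issued", "date")])
print(ddl)
```

Even this toy version shows why such transformations are only automatable when the rules between the two viewpoints are fully known, which is exactly the caveat Jack raises next.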

But, more often than not, it involves other types of operations across the relationships between viewpoints, such as tracing, validation, or analysis. As I explained in session 5, the reason is that we don't always know enough about a given pair of viewpoints, or about how they relate to each other, to transform artifacts between them automatically. In my opinion, recognizing the limits of our knowledge of arbitrary problem domains, and the limitations of the technologies we are offering to the marketplace, is critical to the success of Model-Driven Development.

The general claim that applications of any type can be generated in their entirety from models, fully baked, is just not credible. Worse, it undermines the credibility of legitimate efforts to improve productivity through the pragmatic and judicious application of Model-Driven technologies in the context of more traditional development methods and practices. MDA's focus on transformation gives legitimate fodder to the critics of Model-Driven Development, in my opinion.

Third, I find it hard to agree with Martin's suggestion that software factories and MDA take essentially the same approach to views and viewpoints. Yes, the PIMs and CIMs defined by MDA can be seen as viewpoints.

However, MDA does not explicitly incorporate the notion of a viewpoint, nor does it provide a way to define or reason about viewpoints. In order to see the PIM and the CIM as viewpoints, we first have to learn to think about software development in terms of viewpoints through some other methodology, and then interpret MDA in those terms. More importantly, the number of viewpoints we can ascribe to MDA is fixed at two: there is one PIM and one CIM.

MDA doesn't give us a way to reason about requirements, logical architecture, implementation, deployment, technical architecture, testing, operational management, maintenance, or life-cycle processes for a family of systems in terms of a set of viewpoints and the relationships among them. Software factories, by contrast, are explicitly based on viewpoints and the relationships between viewpoints; in fact, a software factory is a methodology for defining and reasoning about viewpoints.

To say that software factories and MDA both use viewpoints is like saying that monkeys and humans both use computers. Sure, we see monkeys playing with keyboards, and they might appear to be using the computers, but monkeys are presumably unaware of the significance of the computers.

And they certainly can't be expected to design new ones. In short, Martin's summary of software factories and MDA is essentially a summary of MDA. I suppose that if we described software factories only in terms of MDA, ignoring the very real differences between them that I have just described, then, yes, we could superficially claim that the two technologies are similar.

But I hope you can see, from what I have just said about software factories, that this is a bit like claiming that unicycles and Ferraris are similar because they both have wheels. The problem with such a claim is that it casts two things as similar despite the fact that the differences between them are far more significant than the similarities.

Martin also claims that UML is a domain-specific language for modeling OO software systems. The problem I have with this claim is that, for a language to function effectively as a DSL in a Model-Driven Development paradigm, the domain has to be narrow enough to allow the language to be fit for purpose. If the domain is too broad, then the information captured by models expressed in the DSL will be too general to support the kind of high-fidelity code generation we are looking for.

I believe the track record of code generation from UML proves this point. Of course, high-fidelity code generation is indeed possible when the UML is heavily decorated with stereotypes and tags. But, at that point, the UML has merely become a DSL for some other domain. In other words, a UML profile is just a kinky way of defining a DSL. In the book, we suggest the term "general-purpose language" for languages like UML that are designed to describe almost anything.

So, to sum up, calling a decorated UML a DSL is a bit like calling an axe a scalpel. Sure, they both cut things, but the axe is a general-purpose cutting tool, whereas the scalpel is a cutting tool that has been carefully designed for a very specific purpose. And the attention paid to the requirements of the problem domain, in the case of the scalpel, makes all the difference.

Ron Jacobs: And the debate rages on. What do you think? Will these models really change anything? I remember using modeling-type stuff ages ago. It always seemed to me that these tools were OK for certain things, but, I don't know, I find it hard to believe that 100 percent of my application will ever be generated by a model.

And there is always the problem that models are somewhat static over time. They don't really adapt to new ways of looking at things, like, you know, the move from object-oriented systems to component-based systems to service-based systems. I don't think the models have kept up. But, who knows, maybe things will be different this time. What do you think? Post your comments, and we will see you next time on ARCast.
