Connected Systems for Manufacturing
Connecting Business Processes Across Enterprises
Building a Composite Application for Trading-Partner Collaboration
Logical View of the Architecture
Implementing This Solution as a Composite Application
Connecting Business Processes Within the Enterprise
Role-Specific Guidance for Building and Deploying Composite Applications
This white paper is the fourth in a series on composite applications. The first paper introduced the topic, the second introduced the platform technologies and capabilities used to deliver these kinds of composite applications, and the third provided a set of generic best practices for deploying OBAs in the enterprise. This paper provides more specific guidance for building enterprise applications, with a focus on enterprises that manufacture goods and get them to retail outlets through distribution networks.
Manufacturers today face the pressures of competition, government regulations, evolving technology, and high customer expectations. Increasingly, however, new pressures and changes are intensifying and affecting how companies conduct business. With globalization, competitors are more numerous, are more specialized, and respond faster to customers than ever before. This leads to shorter product life cycles, rapid commoditization, and shrinking margins. In addition, the need for companies to play on the global scale often results in mismatches between the supply and demand for products and services. Globally networked supply chains also lead to longer order-through-to-settlement cycles, resulting in poor cash flows.
An increased focus on regulation is another destabilizing factor, as it requires that companies have fine-grained control and visibility over organizational practices. These regulations could be governmental (Sarbanes-Oxley), exercised by trading partners (retailers requiring RFID tagging), or be industry-specific (environmental regulations). Because technological advances continue to reshape entire industries, companies must remain at the forefront or face obsolescence.
Businesses tell us that these rapid changes and pressures present them with both their biggest challenge and their biggest opportunity. They know they must find a new, better way to navigate the business world. They must have a clear view of their own resources, exchange information quickly and easily, and combine technologies in new ways.
These new imperatives place unprecedented pressure on business decision makers at manufacturing enterprises. Businesses are expected to have greater focus on what they are doing and on how they are going to grow. They must focus on how to deliver more value and differentiate themselves from their competitors. They must have greater agility and greater responsiveness inside the organization. They must have more visibility.
On one hand, companies are focusing more on the core of their business. They are focusing on what is happening inside the organization. At the same time, companies have greater and greater interdependence with the outside world. Every company exists within a value chain. How do you establish trust with the other participating organizations? How do you establish visibility not just within your organization but across that entire value chain? And how do you offer a seamless experience to the customer, even when multiple participants in the value chain are involved in delivering it?
In short, remaining competitive in this environment requires companies to have a greater inward focus on their core business and processes, while simultaneously being more externally focused on competitive differentiation, partner integration, and delivery of an agile value chain. AMR Research calls this the need for manufacturers to move to a Demand-Driven Supply Network (DDSN). In their report titled "DDSN: 21st Century Supply on Demand," published in AMR Research report #17484 (August 12, 2004), Kevin O'Marah and Joe Souza define DDSN as:
A system of technologies and processes that senses and reacts to real-time demand signals across a supply network of customers, suppliers, and employees.
To do this, manufacturing companies must synchronize demand (sales and marketing), supply (production), and product development (engineering). This requires linkages between the design-, buy-, make-, and sell-side processes. Becoming demand-driven is a fundamental shift in how to do business; it is based on aligning processes—customer processes on one side, supplier-partner processes on the other side, and production processes on the bottom. This allows companies to sense and shape demand, and respond profitably to demand fluctuations or disruptions in the supply chain. It is this shift that is creating the demand-driven supply network.
This need to synchronize has been recognized for years. For example, in his book Men and Machines, published in May 1929, Stuart Chase writes:
Before the advent of mechanical power, handicraft met demand as it arose…. Then came James Watt and his steam engine. The machine proceeded to develop in accord with its own laws, regardless of the needs and conveniences of man. It continually increased its efficiency of rapid production with corresponding decrease of efficiency in elasticity and adaptability…. Still, production continues to increase, not in response to any demand, but under the compulsions of mechanical evolution….
Figure 1. The need to synchronize was recognized by Stuart Chase in his book, published in 1929.
But how does a manufacturer become demand-driven? As Prof. Hau L. Lee points out in an article called "The Triple-A Supply Chain" in the October 2004 edition of the Harvard Business Review, "The best supply chains aren't just fast and cost-effective. They are also agile and adaptable, and they ensure that all their companies' interests stay aligned."
These three traits—agility, adaptability, and alignment (the Triple-A)—require manufacturers to put in business processes that help to manage change. Examples of such business processes are demand shaping and management, pull-based replenishment, and integrated product design.
These kinds of business processes require IT systems that detect change as early as possible, and then respond to change at the right time and in the right manner. This requires IT assets that can be rapidly connected and configured, with the connections between assets also reconfigurable. This requires composition. It should be possible to connect systems not only internally, within the enterprise, but also externally, beyond organizational boundaries. Being connected internally implies connectivity from the edge of the network all the way to the enterprise systems—from plant instrumentation to enterprise LOB systems. Being connected externally requires connectivity with partners, suppliers, and customers.
What are the benefits of connecting systems and aligning processes in this manner? Manufacturing organizations will see improvements in all of the metrics that are important to them, and this can bring significant financial benefits. For example, in their report titled "The Handbook for Becoming Demand Driven," published in AMR Research report #18410 (July 19, 2005), Lora Cecere, Debra Hofman, Roddy Martin, and Laura Preslan have shown that achieving higher levels in the DDSN maturity model will lead to improvements in demand forecast accuracy and perfect-order percentages. An AMR study with roughly 300 manufacturing companies showed that, on average, a 5 percent forecast-accuracy improvement led to a 10 percent perfect-order improvement. That translates to 50 cents of earnings-per-share (EPS) improvement, a 5 percent return on asset improvement, and 3.3 percent margin improvement. This is huge for manufacturers.
There is, however, a major obstacle for manufacturers to implement the software and systems that can drive these kinds of cross-functional business transformations. Organizations will already have existing systems in place to handle their current business processes, and, while these systems might automate certain kinds of transactions and processes, in most cases they will not work well outside the boundaries of what they were designed to do. So, it will not always be clear how to migrate systems from their "as-is" states to the desirable "to-be" state. In many cases, it might not even be clear what the desired to-be state should be.
As in any other journey, this transition requires a road map. In their report titled "Prioritizing Technology Investments to Support DDSN," published in AMR Research report #17333 (July 16, 2004), Erik Keller and Eric Austvold suggest one such road map, which is shown in Figure 2. While this road map was devised to help organizations reengineer their supply chains to be demand-driven, it is laid out in terms of generic capabilities that should enable agility, adaptability, and alignment for many kinds of business processes.
Figure 2. High-level road map proposed by AMR Research to align supply-, demand-, and product-innovation-related processes into a demand-driven supply network
At a very high level, the benefits of such an approach are laid out in Table 1, also from AMR Research (2004).
Table 1. DDSN investment path
Summarizing this guidance from AMR Research, organizations must add capabilities for integration, reporting, portals, analytics, and business-process management to sense and respond to real-time demand signals across a supply network of customers, suppliers, and employees. However, this leaves IT departments grappling with the question of how actually to architect their internal systems to achieve these benefits. In a set of sample scenarios, the remainder of this chapter will show how to leverage the platform capabilities from Chapter 2 and the architectural guidelines from Chapter 3.
The points made in this article will now be illustrated in the context of a solution template for supply-chain collaboration. This solution was chosen because close collaboration with customers, suppliers, and partners has become essential for companies to be successful in the marketplace. The importance of this was highlighted in a comment made by Robert Handfield in the article "Reducing Costs Across the Supply Chain," from Issue 14 of Optimize magazine (December 2002): "…the future won't be about companies competing against each other, but rather about supply chains competing against other supply chains." Typically, companies collaborate with their suppliers, so that procurement transactions can be automated, and so that timely information can be shared without too much manual intervention.
A functional overview of supplier collaboration is shown in Figure 3. Note, in this figure, that workflows have been segmented into strategic, tactical, and operational.
Figure 3. Functional overview of supplier collaboration (Click on the picture for a larger image)
The goal is to make the supplier collaboration Triple-A, thus reducing costs and inefficiencies, while ensuring a high level of service from suppliers. If this process is not set up to be Triple-A, the organization might experience the pain points listed in the table below.
Table 2. Pain points in non-Triple-A supplier collaboration
|Role||Pain point||Impact|
|Customer mfg||Poor understanding of supplier capacity to meet demand||Over-ordering of inventory|
|Logistics||Poor communication of capacities||Increased transportation and overtime labor costs|
|Sales||Poor communication of urgency||Increased quoted price and eroded margins|
|Manufacturing||Poor visibility of true needed date||Increased manufacturing cost due to expediting|
To avoid these potential problems, we must build composite solutions that enable cross-functional processes across the different stakeholders in the table.
Figure 4 illustrates how a purchase-order request is routed within a supplier organization. The pattern is similar for all of the other form-based operational processes in Figure 3. In the section on best practices for the workflow capability of the connected systems model in Chapter 2, we mentioned that workflow is everywhere, and that it does not just live in the data center on integration servers. This is illustrated in Figure 4, in which workflow serves several functions throughout the system.
These workflow instances could be running on central servers in the corporate data centers, or closer to information workers on departmental servers, or on client applications to manage document routing and life cycles.
Figure 4. Building a composite application for trading-partner collaboration (Click on the picture for a larger image)
Figure 5 shows a logical view of the architecture simplified from Figure 4. We will then break this down into smaller subsystems for discussion.
Figure 5. Logical view of the architecture
An inbound message is a document received from a trading partner; it must be acknowledged and routed to an appropriate person for processing, as shown in Figure 6. This set of activities requires the following capabilities, each of which is explained in the subsections that follow.
Figure 6. Architecture for inbound-message processing
Inbound messages from different partners might be received in multiple message formats and delivered over multiple channels, such as Web services, EDI, e-mail, and RosettaNet. Messages can also be exchanged in a variety of patterns: one-way, asynchronous two-way, or synchronous two-way messaging. Therefore, the architecture must provide multiple messaging channels: entry points that handle each combination of message-interchange pattern and message format. After a message has been received over one of these channels, it must be transformed into the canonical format required by downstream services. This canonical format need not match the format of the original request.
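The channel-adapter pattern described above can be modeled in a few lines. The reference implementation used BizTalk Server receive pipelines for this; the following Python sketch is only a language-neutral illustration, and all channel names, payload keys, and the canonical shape are invented for the example.

```python
# Illustrative sketch: each inbound channel gets its own adapter that parses
# the partner-specific payload and emits the same canonical dictionary used
# by all downstream services. Field names here are hypothetical.

def parse_rosettanet(payload: dict) -> dict:
    # A RosettaNet-style PO request might nest the order number under a
    # partner-specific key; normalize it to the canonical field names.
    return {"doc_type": "PORequest",
            "order_id": payload["PurchaseOrder"]["id"],
            "source_format": "rosettanet"}

def parse_edi(payload: dict) -> dict:
    return {"doc_type": "PORequest",
            "order_id": payload["PO_NUMBER"],
            "source_format": "edi"}

# One entry point per channel/format combination, as described above.
CHANNEL_ADAPTERS = {
    "rosettanet": parse_rosettanet,
    "edi": parse_edi,
}

def to_canonical(channel: str, payload: dict) -> dict:
    """Route an inbound message through the adapter for its channel."""
    return CHANNEL_ADAPTERS[channel](payload)
```

Adding a new partner format then means adding one adapter entry, without touching downstream services.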
After the message has been received by the message broker, it is processed, and a transformed message is persisted to a message queue. This enables loose coupling between the message sender and the message receiver, and promotes adaptability in the architecture. The message persisted in the message queue will be in the canonical format described earlier. The goal here is reliability (no message should be lost), and it is only after the message has been persisted that an acknowledgement should be sent back to the requesting party. Another benefit is that it is easier to build fault tolerance into the overall architecture in this fashion.
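The persist-then-acknowledge rule can be made concrete with a small sketch. This is not the actual broker implementation (the reference implementation used BizTalk with SQL Server Service Broker); the queue and message shapes below are stand-ins.

```python
# Sketch of persist-then-acknowledge: the broker writes the transformed
# message durably to a queue first, and only a successful write produces
# the acknowledgment returned to the sender. No message is lost, because
# nothing is acknowledged before it is persisted.
from collections import deque

class MessageBroker:
    def __init__(self):
        self.queue = deque()   # stands in for a durable message queue

    def receive(self, canonical_msg: dict) -> dict:
        self.queue.append(canonical_msg)   # persist before acknowledging
        # The ack confirms receipt only; the business response comes later,
        # asynchronously, once a person or system processes the message.
        return {"ack": True, "order_id": canonical_msg["order_id"]}
```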
After the incoming message has been transformed and persisted, an acknowledgment message can be returned to the sender. This acknowledgment is synchronous and is a response to the original request, confirming only that the request was received. It is unlikely to be the actual response, unless the message processing can be completely automated. The actual response will come later, after the message has been processed by an appropriate person or system, and it will be asynchronous to the original request. This requires a capability to call back into the systems of the requesting party, which in turn requires a shared understanding of the callback interfaces. Both the sending of the acknowledgment and the final response will be governed by service-level agreements (SLAs) that have been established during the setup of a specific program between the trading partners.
Asynchronous message processing implies that both trading partners will have the ability to refer to the original request (for example, by referring to an order by its ID) at later times. This requires a shared understanding of identifiers for individual business entities. This can get complicated when the data schemas from the request are transformed into the canonical data models used within the organization. However, as shown in Figure 7, the complexity does not end there, because different departments internally also refer to documents with different identifiers.
Figure 7. Master-data management system needed to cross-reference documents and manage master-data elements (Click on the picture for a larger image)
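The cross-referencing problem from Figure 7 boils down to a mapping from each business entity to the identifier each system uses for it. The following minimal sketch models that mapping; all identifiers and system names are invented for illustration, and a real master-data management system would of course do far more.

```python
# Minimal sketch of identifier cross-referencing: the same business entity
# carries a different identifier in the partner's request, in the canonical
# model, and in each internal department's system. A master-data service
# keeps the mapping so any party can refer back to the original request.
class MasterDataService:
    def __init__(self):
        self._xref = {}   # canonical id -> {system name: local id}

    def register(self, canonical_id, system, local_id):
        self._xref.setdefault(canonical_id, {})[system] = local_id

    def lookup(self, canonical_id, system):
        return self._xref[canonical_id][system]

mds = MasterDataService()
mds.register("ORD-1001", "partner", "PO-42")    # id used by the trading partner
mds.register("ORD-1001", "erp", "450000123")    # id used by an internal ERP system
```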
As part of the inbound-message transformation process, it is very likely that application services exposed by existing LOB applications will have to be accessed. Typically, the inbound message will have to be extended with information from these applications, to create the canonical data models used in subsequent processing. This extra information might be for internal document tracking, aggregate level information that is missing in the request, or routing purposes. For example, one of the application services likely to be invoked during the message-transformation process is the master-data management system described earlier.
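The enrichment step can be sketched as a function that extends the canonical message with data pulled from LOB application services. The function and field names below are assumptions for illustration, not actual product APIs.

```python
# Hedged sketch of message enrichment during transformation: the canonical
# document is extended with information the request itself does not carry,
# for example an internal tracking id and a routing decision obtained from
# an LOB application service (such as the master-data system).
def enrich(canonical_msg: dict, lob_lookup) -> dict:
    enriched = dict(canonical_msg)                # never mutate the original
    enriched["tracking_id"] = "TRK-" + canonical_msg["order_id"]  # internal tracking
    enriched["route_to"] = lob_lookup(canonical_msg["order_id"])  # routing purposes
    return enriched
```

In practice `lob_lookup` would be a Web service call into the LOB application; here it is passed in so the transformation stays testable in isolation.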
There are multiple ways to instantiate the architecture for processing inbound messages. One way is to use Microsoft BizTalk Server as the message broker, with accelerators (such as the RosettaNet accelerator) making it easier to synchronize with trading partners. BizTalk can also be used to transform messages and then push them into Microsoft SQL Server Service Broker, which queues them up prior to subsequent processing.
Documents must be routed from their message queues to the appropriate person who will be handling that document. Before the document reaches its intended recipient, it might have to be preprocessed into a form that the end user can use for a quick decision. For example, a purchase-order request from the customer might have to be run through an order-promising or fulfillment engine to check the request against inventory that is available to promise (ATP). Figure 8 demonstrates these capabilities.
Figure 8. Workflows to set up manual processes
These workflows could be triggered as documents arrive, or they could be run in batch mode on a scheduled basis. Again, there are multiple ways to instantiate this architecture. For example, one way is to use the BizTalk Server. This is discussed in more detail in the section on implementation that follows.
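The order-promising preprocessing mentioned above can be illustrated with a small sketch. The inventory figures, field names, and routing rules below are invented; a real ATP check would call a fulfillment engine.

```python
# Sketch of preprocessing before manual handling: a PO request is checked
# against available-to-promise (ATP) inventory so that the recipient can
# make a quick decision, and the document is routed accordingly.
ATP_INVENTORY = {"WIDGET-A": 500, "WIDGET-B": 20}   # hypothetical ATP figures

def preprocess_po(po: dict) -> dict:
    available = ATP_INVENTORY.get(po["sku"], 0)
    po = dict(po)
    po["atp_ok"] = po["quantity"] <= available
    # Route cleanly fulfillable orders to a fast path; exceptions to a planner.
    po["route"] = "auto_confirm" if po["atp_ok"] else "planner_review"
    return po
```

The same function could run per document as requests arrive, or over a batch of queued documents on a schedule, matching the two triggering modes described above.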
In the earlier document-routing step, the data and content for smart documents are generated by a business service that has preprocessed the incoming message. Typically, this content is XML that complies with the specified schemas for canonical document formats. Documents that reach the end users for manual processing can be of different types, but these are expected to be Office InfoPath forms for operational workflows and spreadsheets for tactical and strategic workflows.
It is straightforward for information workers to use Office InfoPath and Office Excel to create forms and spreadsheets that can generate XML conformant to a specified schema. Coding is not required, and XML schematization of these documents does not detract from flexibility to customize look-and-feel. Both Office Excel and Office InfoPath provide users with the ability to import XML schemas pertaining to business processes, and then to use graphical tools to associate schema elements with graphical controls (Office InfoPath) or spreadsheet cells (Office Excel). For many business processes, there are industry standards available that provide XML schemas that you can use to create the forms and spreadsheets, as depicted in Figure 9.
Figure 9. PO-request and PO-confirmation forms created from industry-standards-based XML schemas (Click on the picture for a larger image)
After the smart document has been generated from within the setup workflow, it is submitted to the Microsoft Office services responsible for managing its life cycle. Typically, this means that the document is submitted to a Microsoft Office SharePoint document library or form library. After this is done, an entry can be made to a user's task list and an e-mail sent to that user, including a notification of the task and a link to the smart document for download.
Smart documents are submitted to Microsoft Office services for manual processing—for example, to a document library or forms library in Office SharePoint. The Office SharePoint Server can be configured to start workflows whenever a document is added to a library or modified. In addition to out-of-the-box workflows for things such as document approval, the server can be extended with custom workflows and activities. Users can then configure a document library to associate a set of workflows with a set of actions (such as document added or document modified).
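Office SharePoint associates workflows with library events declaratively, through configuration rather than code. As a language-neutral model of that pattern only, the sketch below registers workflow callables against event names; everything in it is invented for illustration.

```python
# Illustrative model of associating workflows with document-library events:
# a library keeps a registry of workflows per event (document added,
# document modified, ...) and starts each registered workflow when the
# corresponding event fires.
class DocumentLibrary:
    def __init__(self):
        self._workflows = {}   # event name -> list of workflow callables
        self.log = []          # records what each triggered workflow did

    def associate(self, event, workflow):
        self._workflows.setdefault(event, []).append(workflow)

    def fire(self, event, document):
        for wf in self._workflows.get(event, []):
            self.log.append(wf(document))

lib = DocumentLibrary()
# An out-of-the-box approval workflow, configured for the "document added" event:
lib.associate("document_added", lambda d: f"approval started for {d}")
lib.fire("document_added", "po_confirmation.xml")
```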
In addition to this support for workflow, Microsoft Office 12 Server provides great out-of-the-box support for a number of Microsoft Office services, such as collaboration, enterprise content management, portal, search, workflow, business intelligence, and project management. In this way, Office SharePoint forms a great platform for hosting composite applications, as shown in Figure 10.
Figure 10. Microsoft Office 12 Server provides platform support for a number of different types of Microsoft Office services. (Click on the picture for a larger image)
For example, consider document life-cycle management (see Figure 11). Documents that reach users for manual processing go from authoring and collaboration, through management and publication, to archiving or destruction. This applies to all kinds of documents, whether they are Office InfoPath forms or Office Excel spreadsheets.
Whenever a document reaches one of these stages, a new workflow could be triggered. As an example, when a document expires, an expiration workflow would manage the archiving process.
Figure 11. Managing the life cycle of documents during manual processing steps
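The stage-triggered behavior described above can be modeled as a simple state machine in which entering a stage fires a stage-specific workflow. The stage names follow the paragraph above; the transition table itself is an assumption for illustration.

```python
# Minimal state-machine sketch of the document life cycle: each stage has a
# successor, and entering a stage can trigger a stage-specific workflow,
# such as an expiration workflow that manages archiving.
LIFECYCLE = {
    "authoring": "collaboration",
    "collaboration": "management",
    "management": "publication",
    "publication": "expired",     # expiry triggers the archiving workflow
}

def advance(stage, on_enter):
    """Move the document to its next stage and fire the stage's workflow."""
    next_stage = LIFECYCLE[stage]
    on_enter(next_stage)
    return next_stage
```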
Users can submit smart documents by editing the document and saving it back to the document library. This would then trigger a completion workflow to process the document. The activities in this completion workflow would then invoke endpoints on enterprise services to update the appropriate LOB systems. After this is done, the completion workflow could post any reply back to the trading partner through message queues that have been set up for outgoing messages.
Figure 12. Workflows to complete manual activities, for example, by updating LOB systems
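The completion workflow from Figure 12 can be sketched as a function that invokes the enterprise-service endpoints and then posts the reply to the outbound queue. The endpoint and queue objects below are stand-ins; in the reference implementation these would be WWF activities invoking Web services.

```python
# Sketch of a completion workflow: when the edited smart document is saved
# back to the library, the workflow updates the LOB systems through their
# service endpoints, and then posts the reply toward the trading partner
# via the outbound message queue.
def completion_workflow(document, lob_endpoints, outbound_queue):
    # Invoke each enterprise-service endpoint to update the LOB systems.
    results = [endpoint(document) for endpoint in lob_endpoints]
    # Post the reply; the outbound broker delivers it asynchronously.
    reply = {"order_id": document["order_id"], "status": "confirmed"}
    outbound_queue.append(reply)
    return results
```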
Note from Figure 12 that business-process workflows can be located centrally or closer to the user. For example, these workflows could be running within central servers running BizTalk for enterprise-wide process automation or within departmental servers running applications like Office Server that have embedded Windows Workflow Foundation (WWF).
After messages have been posted to queues for outbound messages, they must be delivered as responses to the original request (see Figure 13). This task must be handled by a message broker, as outbound messages must be converted from canonical formats into the formats used by each of the different trading partners. As discussed earlier, outbound messages to trading partners can potentially be sent in multiple message formats and delivered over multiple channels. Therefore, the architecture must provide for a set of exit points from the system that can handle each combination of message formats and distribution channels. This is the job that the message broker must perform.
Figure 13. Workflows to process outbound messages
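The exit points described above are the mirror image of the inbound channel adapters: one converter per partner-format combination turns the canonical reply into the partner's own format. The formats and partner names in this sketch are hypothetical.

```python
# Outbound counterpart of the inbound adapters: each (partner, channel)
# combination gets a converter from the canonical reply format to the
# partner-specific wire format.
def to_rosettanet(msg):
    return {"PipConfirmation": {"id": msg["order_id"]}}

def to_edi(msg):
    return {"PO_ACK": msg["order_id"]}

EXIT_POINTS = {
    ("partner-a", "rosettanet"): to_rosettanet,
    ("partner-b", "edi"): to_edi,
}

def dispatch(partner, channel, canonical_reply):
    """Select the exit point for this partner and channel, and convert."""
    return EXIT_POINTS[(partner, channel)](canonical_reply)
```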
Note that this processing of outbound messages is clearly asynchronous from the initial request, and that it could be achieved in multiple ways. For example, it could happen in outbound pipeline processing on a BizTalk server, or in a server hosting WWF.
To demonstrate the use of composition to build this solution, a reference implementation was built around two scenarios that involved document exchange between a retailer and a manufacturer: operational purchase-order processing and strategic forecasting.
To demonstrate composition, the implementation leveraged data and process models from the RosettaNet industry standard for B2B collaboration. This allowed us to build a collection of software assets that could be deployed into the 2007 Microsoft Office System and other Microsoft platform technologies. The approach was first to generate XML schemas (XSDs) from the publicly available DTDs, and then to use the schema definitions to design various artifacts, such as Web service interface definitions, Office InfoPath forms, and Office Excel spreadsheets. This is shown in Figure 14, and the implementation approach is described immediately afterward, followed by a list of the software assets that made up the composite solution.
Figure 14. Data and process models from the RosettaNet standard were leveraged to build a collection of software assets, which could be assembled into composite applications.
In the past, data issues have been the bane of many an enterprise solution, and one of the major technical hurdles is the choice of data models, as these form the foundation of the solution. It is hard to come up with good models without a lot of iteration, and usually the best approach is to leverage past industry learning and experience for the particular composite application being built. The best way to do this is to seek out an industry standard with prepackaged data models. For the reference implementation, the RosettaNet specification was used; it comes with various process and data models for trading-partner collaboration. As the data models are in the form of DTDs, the first step was to convert these into a form that is more amenable to tooling, and so the DTDs were converted into XML Schema (XSD) documents using Microsoft Visual Studio. An example of the PO-confirmation schema is shown in Figure 15.
The next step was to use the xsd.exe tool to generate C# classes that corresponded to the data models.
xsd 3A4_MS_V02_03_PurchaseOrderConfirmation1.xsd /l:CS /classes /outputdir:classes /namespace:RosettaNet_3A4_MS_V02_03_Confirmation
Figure 15. An abbreviated view of the XML schema document for the PO-confirmation model in the RosettaNet schema (Click on the picture for a larger image)
User interfaces were designed using Microsoft Office client applications. Purchase-order (PO) processing forms (request and confirmation) were built using the Office InfoPath graphical forms designer. Strategic forecasts were built into Office Excel. Visual Studio Tools for Office (VSTO) was used to extend Office Outlook with an add-in, for users to search for and edit POs in the event of e-mail notifications.
For example, Figure 16 shows how the RosettaNet models for PO request and confirmation were imported into the Office InfoPath forms design tool, to create forms that could be hosted by the 2007 Microsoft Office System on the server and rendered thin in client browsers. The form is just an XML presentation view layered on top of the RosettaNet XML data model, and all of the server-side application logic is in terms of these same data models. An advantage of this approach is that it leads to loose coupling between the view (the form) and the model (the server-side application logic). The layout of the form could change, and different sets of schema elements could be bound to form UI controls, but this would not change the data models themselves or the syntax of server-side interfaces. Naturally, the view and the model are not completely decoupled, because the semantics of the application logic might have to change if additional schema elements were added to the view.
Figure 16. PO request and confirmation forms were built using the graphical Office InfoPath forms designer, after importing XML schemas from RosettaNet. (Click on the picture for a larger image)
A similar approach was taken to create documents for the strategic forecasts exchanged between the trading partners. XML schemas from the RosettaNet specification were used for this purpose and bound to Office Excel spreadsheets using XML maps. This is a great way to associate spreadsheet cells with XML schema elements, so that XML data can be written to and read from spreadsheets. This binding between XML data and spreadsheet cells is loosely coupled, as users can rearrange layout of the cells. Nonetheless, Office Excel will still track the association to schema elements.
However, there are a couple of restrictions to keep in mind here. First, XML maps cannot handle certain kinds of complex schemas, such as where there are collections of collections. Second, spreadsheets registered with Office Excel services in 2007 Microsoft Office System cannot contain XML maps. There are a few ways to work around these issues.
When the schema for the data model is too complex for XML maps, a simplified (flattened) schema can be used, and server-side processing can transform from one format to another. As 2007 Microsoft Office System documents are saved as XML, using the Open XML schema, this opens the door for server-side XML processing that was not possible with earlier versions of Microsoft Office. These kinds of server-side logic can be built into packaged WWF activities that are assembled into workflows in Office SharePoint and can transform documents as they arrive. Also, these packaged activities might be code or even an XSL style sheet.
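Schema flattening can be illustrated with a small transform. The order structure below is invented; a real implementation would be a WWF activity or an XSL style sheet operating on the Open XML documents, as described above.

```python
# Hedged sketch of schema flattening for XML maps: a nested canonical
# structure ("collections of collections", which XML maps cannot bind)
# is flattened into one row per line item, a shape a spreadsheet can
# handle. A server-side workflow activity would apply the reverse
# transform when the document comes back.
def flatten(order: dict) -> list:
    return [
        {"order_id": order["order_id"], "sku": line["sku"], "qty": line["qty"]}
        for line in order["lines"]   # each nested line item becomes a flat row
    ]
```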
The restriction on using XML maps with spreadsheets registered with Office Excel services was not a major issue for the scenarios in the reference implementation. These spreadsheets represent working documents that are in process, used by information workers in the strategic-forecasting business processes. These documents are generated by workflows, edited by people, and then deconstructed by other workflows, which turn them from documents back into messages and data. This usage pattern does not fit the three key scenarios mainly targeted by Office Excel services.
Therefore, those Office Excel spreadsheets with XML maps were stored in a document library that was not registered with Office Excel services, and server-side processing was built into WWF activity libraries to strip out the XML maps before those spreadsheets were consumed through Office Excel services.
As shown in Figure 17, an add-in was created for Office Outlook using VSTO, adding PO-management capabilities through a custom ribbon, tab, and task pane. One implication of this is that the add-in is a component that must be deployed on the client machine, instead of on the server.
Figure 17. Support for managing POs was added to Office Outlook, through an add-in. (Click on the picture for a larger image)
Office SharePoint sites were set up for various departments in the reference implementation: sales, marketing, and production. In real life, these could be either departmental sites or team-collaboration sites. Setting up sites implies assembly and configuration using the Office SharePoint administration tools, instead of development.
After the sites had been set up, server-side storage of in-process documents was provisioned on these sites by creating document libraries to hold strategic-forecast spreadsheets and PO forms.
For the reference implementation, business-process representations were created in two places:
Before these sets of workflows could be assembled, a domain-specific workflow library was created with activities for processing PO and forecasts. These workflow activities were then packaged into an assembly, and then workflows were assembled from these building blocks.
In the earlier section on generating industry models, we described how XML schemas and C# classes were generated from the RosettaNet schemas. These were then used to build Web service interfaces to represent LOB systems. These interfaces form part of the assets that make up the composite application, because it should be possible to deploy the packaged process against any kind of IT landscape. These Web service interfaces represent the touch points that back-end IT systems must implement in order to integrate through Web services.
These Web services are invoked from the business activities that make up the aforementioned workflows.
The reference implementation also showed how to build composite applications for cross-functional processes directly within Office SharePoint, by defining data entities in the Business Data Catalog (BDC) and consuming these entities from within Office SharePoint lists. The BDC is a shared service that is built into the 2007 Microsoft Office System and can be used to define the entities themselves, lightweight relationships between entities, and actions that can be taken upon them.
For example, Figure 18 shows the user experience for one of these cross-functional processes. There are two lists on this Office SharePoint page: one for supplier-order header information and another for line-item details. Selecting an order from the upper list causes all of the line items for that order to be displayed in the lower list. A drop-down menu on the lower list brings up selected information about a supplier-order line item and displays it in a form for editing.
Figure 18. Cross-functional application for editing supplier orders built into Office SharePoint
To build this, BDC Web Parts are set up for the two lists. The first is mapped to a BDC entity for the supplier order, and the second to a BDC entity for the supplier-order line item. The parent-child relationship between the two lists is modeled from within Office SharePoint. In addition, an action called "Change Promised PO" is defined on the BDC entity for the supplier-order line item; it brings up the Office InfoPath form for editing some details of the line item. An action is defined by a name (which shows up in the drop-down menu), a URL (to which the selected row gets posted back), and a set of attributes from the selected entity that get posted back to the URL (see Figure 19). This allows two possibilities: the URL might lead to a Web service that can process the data being sent back, or it could point to an Office InfoPath form stored on the server, which is the case here. The Office InfoPath form has a code-behind that takes the parameters passed back to it and uses them to prepopulate the form.
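The mechanics of an action, posting selected entity attributes back to a target URL, can be sketched as follows. This is an illustrative Python sketch of the idea, not BDC itself; the form URL and attribute names are hypothetical.

```python
from urllib.parse import urlencode

def build_action_url(base_url: str, attributes: list, row: dict) -> str:
    """Append the configured entity attributes of the selected row
    to the action's target URL as query parameters; only the attributes
    named in the action definition are posted back."""
    params = {name: row[name] for name in attributes}
    return f"{base_url}?{urlencode(params)}"

# Hypothetical example: a "Change Promised PO" action configured to post
# back OrderId and LineNumber from the selected supplier-order line item.
url = build_action_url(
    "https://portal/forms/ChangePromisedPO.xsn",
    ["OrderId", "LineNumber"],
    {"OrderId": 1042, "LineNumber": 3, "Qty": 7},
)
```

The target (here, a form URL) then uses those parameters to prepopulate itself, just as the Office InfoPath code-behind does in the reference implementation.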
Figure 19. Setting up the cross-functional process from Figure 18
The composite application also had assets for BI. SQL Server Analysis Services cubes for order fulfillment, the production schedule, and the inventory plan were populated from the transactional data using SQL Server Integration Services. This information was plugged into BI dashboards in Office SharePoint. Dashboards were also set up from spreadsheets stored in Office Excel services and from Office SharePoint lists. Finally, the Business Scorecard Manager tool was used to create reports that plugged into SQL Server Reporting Services on top of the transactional data.
A typical systems landscape for a manufacturer is shown in Figure 20. Some of the important terms are explained here:
Figure 20. Typical high-level systems landscape for a manufacturer
Technology advances are enabling a new generation of field devices; soon, there will be a lot more computing power distributed close to the edge. These changes will be driven by the rapid adoption of radio-frequency identification (RFID), the emergence of new and powerful mobile devices, and (in the future) by sensor networks. As more applications and devices are deployed at the edge, there will be a greater need to manage these devices and the data streams they generate. Thus, the proliferation of edge devices can be expected to push the deployment of edge servers to accomplish these tasks. In general, the architecture for these edge devices and edge servers will resemble Figure 21.
Figure 21. Proliferation of new types of edge devices will push deployment of edge servers into the IT systems landscape.
Take, for example, the case of RFID technology, which is making it possible to gather greater quantities of data at the edge. Microsoft is extending its platform to provide the services in Figure 21 for RFID. Here, the devices corresponding to the lowest layer in the stack will primarily be RFID readers. These readers will generate events as (and when) they detect RFID tags; these events will then be passed along to the edge server for processing. Within the edge server, an RFID service will deliver the device-abstraction, device-management, and event-processing capabilities.
The device-abstraction layer in the RFID service will ensure that devices from different vendors look the same to the device-management and event-processing layers, as well as to the business applications being fed by the edge server. The device-management layer will provide a single, consistent way to deploy, configure, and monitor RFID devices. The event-processing layer will make it possible to filter, transform, and aggregate events raised by RFID devices, to ensure that business applications see events in the proper context. A further complication is that, as devices get smarter, some of the processing (such as filtering) will be offloaded from the edge server to the devices; device capabilities will vary across vendors, yet the device-management layer of the edge server must expose these varying capabilities through a consistent interface, so that devices can be managed collectively as a group.
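The kind of filtering and aggregation the event-processing layer performs can be sketched as follows. This is an illustrative Python sketch under simplifying assumptions (events as tag-ID/timestamp pairs; a fixed dwell window), not the actual RFID service.

```python
from collections import Counter

def deduplicate(events, window_seconds=5):
    """Drop repeated reads of the same tag within the window.
    An RFID reader typically reports the same tag many times per second,
    so the edge keeps at most one read per tag per window."""
    last_kept = {}
    kept = []
    for tag_id, timestamp in events:
        prev = last_kept.get(tag_id)
        if prev is None or timestamp - prev >= window_seconds:
            kept.append((tag_id, timestamp))
            last_kept[tag_id] = timestamp
    return kept

def aggregate(events):
    """Aggregate the filtered reads into counts per tag, a form that a
    business application can consume directly."""
    return Counter(tag_id for tag_id, _ in events)
```

Business applications downstream then see one meaningful event per physical movement of a tag, rather than a raw stream of duplicate reads.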
Another disruptive change (expected soon) is the broad industry support emerging for Web services. This will start a trend to introduce Web services at each of the different layers of manufacturing systems in Figure 21. For example, "smart" field devices (sensors, actuators, controllers, and so forth) might expose Web services interfaces. These could be functional interfaces for setting and retrieving information, or management interfaces to configure and deploy the device. Client applications, such as a Human Machine Interface (HMI) system or a management console, could then connect directly to the device. Standards bodies such as the OPC Foundation are releasing Web services specifications to ensure interoperability among vendors.
Enterprises want to integrate their production systems with their business systems to achieve significant increases in productivity and drive decision making into the factory floor. Plant personnel should be able to monitor the health of their systems and be alerted to exceptions and potential problems. They should also be able to gain insight into the business metrics that will be affected by problems in production. However, this requires sophisticated yet easy-to-use tools that can scale across complex systems landscapes.
Also, as sensors and other devices become more sophisticated and pervasive, the potential to monitor hard-to-reach manufacturing equipment increases, which gives manufacturers the potential to improve their industrial automation and drive down costs. For example, the Air Force Research Laboratory was awarded a patent for monitoring conduits, such as electrical wires and hydraulic lines. This allows stresses to the conduit to be detected, and potential leaks to be flagged, while avoiding time-consuming manual inspections. Better still, these "smart" devices at the edge could be connected to enterprise systems through edge servers, with unified management consoles and the ability to distribute reports. Figure 22 shows a potential high-level architecture to do this.
Figure 22. Integrating business processes across the enterprise, from plant-floor systems to enterprise applications
What are some of the cross-functional processes for which composite applications might be built and deployed? Some examples of potential areas within the manufacturing vertical are the following:
An increase in product complexity, coupled with greater customer expectations, has put pressure on research and development. From an article titled "Expect the Unexpected," in the September 4, 2003, edition of The Economist, comes the following quote: "With the pace of innovation hotting up, any enterprise that fails to replace 10 percent of its revenue stream annually is likely to be out of business within five years."
There is a need for solutions that integrate product design and engineering with other business processes for production and demand generation. This requires greater collaboration among teams, often distributed globally when design teams are not co-located with manufacturing teams. Platform capabilities required are real-time communications, document collaboration, and data synchronization between Product Data Management and Product Information Management systems and other enterprise applications.
Mounting pressures for collaboration, coupled with increased market demands, have created new challenges for sales and marketing. In the research article titled "Predicts 2004: Customer Service, from Function to Process," published December 5, 2003, Gartner predicted the following: "Customer service will continue to be the 'litmus test' of an enterprise's commitment to the customer, and as such will be a key indicator of the integrity of the enterprise."
For better customer-service levels, enterprises must synchronize demand (sales and marketing) with both supply (production) and product development (engineering). This requires solutions beyond those offered by traditional CRM systems. Some of these industry solutions are:
Increasing pressures on operations from networked supply-chain, R&D, product-development, and customer-service initiatives have changed the playing field for business decision makers in manufacturing and operations. This is highlighted by the CRM Today research report titled "Product Lifecycle Management Services Market Heats Up, New IDC Study Reveals," published December 28, 2001, which states the following: "With increased product complexity, increased customer demands around product performance, and correspondingly decreased product life cycles, manufacturers are being pressured to deliver better product cheaper and faster."
Improving operations usually requires active monitoring of events and information to enable more proactive decision making. This calls for real-time or near real-time flows from enterprise resources (sources of data) to business decision makers (consumers of information). This is complicated, because data exists in multiple locations; is generated at multiple frequencies and in multiple formats; and often must be cleansed, aggregated, transformed, and correlated before it can become meaningful information. Also, the role of the business decision maker should determine access rights and the presentation of information. Furthermore, when exceptions or unplanned events occur (and they will), resolving them typically affects people and activities in multiple business processes. Such cross-functional applications usually do not exist in the enterprise; however, with the right IT assets in place, a platform for composition will enable rapid assembly and configuration of composite applications to meet these needs.
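The cleanse-transform-correlate step described above can be sketched as follows. This is an illustrative Python sketch under assumed record shapes: a hypothetical ERP feed keyed by "po_number" and a hypothetical plant-floor feed keyed by "order_ref", neither of which comes from the reference implementation.

```python
def normalize_erp(record):
    """Map a hypothetical ERP feed record into a common shape,
    coercing the quantity field from string to integer (cleansing)."""
    return {"order_id": record["po_number"], "quantity": int(record["qty"])}

def normalize_plant(record):
    """Map a hypothetical plant-floor feed record into the same common shape."""
    return {"order_id": record["order_ref"], "produced": int(record["units_done"])}

def correlate(erp_records, plant_records):
    """Join the two normalized feeds on order_id, so a decision maker sees
    ordered vs. produced quantities side by side in one view."""
    produced = {r["order_id"]: r["produced"] for r in map(normalize_plant, plant_records)}
    return [
        {**r, "produced": produced.get(r["order_id"], 0)}
        for r in map(normalize_erp, erp_records)
    ]
```

Role-based access and presentation would then be applied on top of this unified view, so that each decision maker sees only the slice relevant to that role.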
What will these composite applications look like? They will provide role-based views of operational data, with real-time (or near real-time) visibility into distribution, procurement, inventory, and operations. For example, some of this information might include demand forecasts, supply commitments, inventory positions, purchase orders, delivery schedules, advance shipment notifications, and goods receipts.