Enhancing the BDC Model File for SharePoint Server Search
This topic provides information about the approaches you can use to enhance the BDC model file for SharePoint Server search.
Last modified: October 13, 2010
Applies to: SharePoint Server 2010
In this article
Use Specialized Methods When Retrieving Large-Scale Data
Enumeration Optimization When Crawling External Systems
Improving Crawl Speed with the UseClientCachingForSearch Property
Security in BDC Model Files
The BDC metadata model includes properties that are designed to support Microsoft SharePoint Server 2010 search specifically. For a list of these properties and their descriptions, see the Search-Specific Properties in BDC Model Files table in SharePoint Server Search Connector Framework. When you want to create a BDC model file for an external system that you want to enable for search, you can enhance the model file to optimize performance when crawling external systems.
Use Specialized Methods When Retrieving Large-Scale Data
In general, if some of the data returned for an item is large, you should use one of the following specialized methods to retrieve that data instead of returning it from the SpecificFinder method:
Use the BinarySecurityDescriptorAccessor method when passing a security access control list (ACL) instead of the WindowsSecurityDescriptor property.
Use the StreamAccessor method when passing streams.
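As an illustration, a StreamAccessor method in a BDC model might be sketched as follows. The entity, method, and parameter names are hypothetical, and the exact TypeDescriptor shape depends on your connector type; treat this as a sketch rather than a complete model.

```xml
<!-- Hypothetical method that streams an attachment on demand instead of
     returning it inline from the SpecificFinder. -->
<Method Name="GetAttachmentStream">
  <Parameters>
    <Parameter Direction="In" Name="id">
      <TypeDescriptor TypeName="System.String" IdentifierName="ID" Name="id" />
    </Parameter>
    <Parameter Direction="Return" Name="attachment">
      <TypeDescriptor TypeName="System.Byte[]" Name="attachment" />
    </Parameter>
  </Parameters>
  <MethodInstances>
    <!-- The StreamAccessor method instance type tells the crawler to fetch
         the item body through this method rather than the SpecificFinder. -->
    <MethodInstance Name="AttachmentStreamAccessorInstance"
                    Type="StreamAccessor"
                    ReturnParameterName="attachment" />
  </MethodInstances>
</Method>
```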
Unless network latency is high, the performance gain from the smaller SpecificFinder payload usually outweighs the cost of the extra round trip to the external system.
Enumeration Optimization When Crawling External Systems
Do not enumerate more than 100,000 items per call to the external system. Long-running enumerations can cause intermittent interruptions and prevent a crawl from completing. We recommend that your BDC model structure the data into logical folders that can be enumerated individually, as shown in the following example.
This example demonstrates enumeration against a database table with one million rows but only a fixed set of distinct values in ColumnA. In this scenario, you can model ColumnA as the external content type and write an enumerator for this set of values by using the following SQL statement.
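A minimal sketch of such an enumerator, assuming a hypothetical table named LargeTable:

```sql
-- Enumerator: return one item per distinct value of ColumnA.
-- Each value acts as a "folder" that the crawler can process separately,
-- keeping each enumeration call well under the 100,000-item guideline.
SELECT DISTINCT ColumnA FROM LargeTable
```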
Next, define the specific finder using the following SQL statement.
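Continuing the sketch with the same hypothetical LargeTable, the specific finder returns the single folder item identified by its ColumnA value:

```sql
-- SpecificFinder: return the folder item whose identifier is @ColumnA.
SELECT DISTINCT ColumnA FROM LargeTable WHERE ColumnA = @ColumnA
```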
Finally, you must define the association navigation operation, as follows.
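Under the same assumptions, the association navigation operation returns the child rows contained in a given folder:

```sql
-- AssociationNavigator: return the rows that belong to the folder
-- identified by @ColumnA. The crawler calls this once per folder,
-- so no single call has to enumerate the full million-row table.
SELECT * FROM LargeTable WHERE ColumnA = @ColumnA
```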
Any method should begin returning results within two minutes, or the crawler will cancel the call. For example, a complex SQL statement that uses a LIKE clause may take longer than two minutes to complete, and would cause the crawler to cancel the call.
Improving Crawl Speed with the UseClientCachingForSearch Property
The UseClientCachingForSearch property improves the speed of full crawls by caching the item during enumeration. Using this property is also recommended when implementing incremental crawls that are based on change logs, because it improves incremental crawl speed.
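For example, the property can be declared in the model as an empty-valued property, as sketched below. The entity name and attributes are illustrative, and the element level at which the property belongs should be verified against the Search-Specific Properties table referenced earlier.

```xml
<Entity Namespace="Contoso.Crm" Name="Customer" Version="1.0.0.0">
  <Properties>
    <!-- Cache items during enumeration to speed up full crawls and
         change-log-based incremental crawls. -->
    <Property Name="UseClientCachingForSearch" Type="System.String"></Property>
  </Properties>
  <!-- Identifiers, Methods, and so on -->
</Entity>
```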
If items are larger than 30 kilobytes on average, do not set this property, as it will lead to a significant number of cache misses and negate performance gains.
Security in BDC Model Files
If the repository uses NTLM authentication, we recommend that you specify PassThrough authentication for crawling.
Profile pages may require that you use the Secure Store Service because of the multi-hop delegation problem from the front-end web server. If you encounter this problem, you can optimize the crawl while still allowing profile pages by creating two similar LobSystemInstance instances. The first instance should use credentials from the Secure Store Service authentication. This instance should not contain the ShowInSearchUI property. The second instance should use PassThrough authentication, and should contain the ShowInSearchUI property. Profile pages use the first LobSystemInstance instance, and the crawler uses the second instance.
This requires that you set the ShowInSearchUI property at the LobSystemInstance level instead of at the LobSystem level.
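Put together, the two LobSystemInstance definitions might be sketched as follows. The instance names and the Secure Store application ID are hypothetical, and the authentication property names and values should be verified against the documentation for your connector type.

```xml
<LobSystemInstances>
  <!-- Instance 1: used by profile pages. Credentials come from the
       Secure Store Service; note there is no ShowInSearchUI property. -->
  <LobSystemInstance Name="CrmForProfilePages">
    <Properties>
      <Property Name="AuthenticationMode" Type="System.String">WindowsCredentials</Property>
      <Property Name="SsoApplicationId" Type="System.String">CrmSecureStoreAppId</Property>
      <Property Name="SsoProviderImplementation" Type="System.String">Microsoft.Office.SecureStoreService.Server.SecureStoreProvider, Microsoft.Office.SecureStoreService, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c</Property>
    </Properties>
  </LobSystemInstance>
  <!-- Instance 2: used by the crawler. PassThrough authentication, and the
       ShowInSearchUI property marks this instance for search. -->
  <LobSystemInstance Name="CrmForCrawl">
    <Properties>
      <Property Name="AuthenticationMode" Type="System.String">PassThrough</Property>
      <Property Name="ShowInSearchUI" Type="System.String"></Property>
    </Properties>
  </LobSystemInstance>
</LobSystemInstances>
```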