
How to: Programmatically Configure a Crawl Schedule for a Content Source

SharePoint 2007

In Enterprise Search in Microsoft Office SharePoint Server 2007, you indicate what content is to be crawled by the search indexing service through the content sources configured for the search service's Shared Services Provider (SSP). You can add new content sources to an SSP's content source collection by using the Enterprise Search Administration object model. For more information, see How to: Add a Content Source.

Adding a content source for your content is only part of the task. For content to be included in the content index, the search index component must actually crawl the content as well.

You can manually initiate a full or incremental crawl of a content source—as well as pause, resume, or stop the crawl—by calling the appropriate methods of the ContentSource class. For more information, see How to: Programmatically Manage the Crawl of a Content Source.
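For reference, a minimal sketch of manually controlling a crawl might look like the following. This assumes a valid SearchContext in a variable named context, and a content source named "My Content Source" (a hypothetical name; substitute the name of a content source configured for your SSP):

    //Retrieve the content source by name from the SSP's collection.
    Content content = new Content(context);
    ContentSource cs = content.ContentSources["My Content Source"];

    //Start a full crawl (or an incremental crawl) of the content source.
    cs.StartFullCrawl();
    //cs.StartIncrementalCrawl();

    //A crawl in progress can be paused, resumed, or stopped.
    cs.PauseCrawl();
    cs.ResumeCrawl();
    cs.StopCrawl();

See How to: Programmatically Manage the Crawl of a Content Source for the full procedure.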

However, if you want the content for a content source crawled on a regular, ongoing basis, we recommend that you set up a crawl schedule. You can also do this by using the Enterprise Search Administration object model.

The following procedure describes how to:

  • Set up a console application to use the Enterprise Search Administration object model.

  • Configure a full crawl schedule for a content source, using the WeeklySchedule class.

  • Configure an incremental crawl schedule for a content source, using the DailySchedule class.

To set up your application to use the Enterprise Search Administration object model

  1. Set references in your application to the following DLLs:

    • Microsoft.SharePoint.dll

    • Microsoft.Office.Server.dll

    • Microsoft.Office.Server.Search.dll

  2. In your console application's class file, add the following using statements near the top of the code with the other namespace directives:

    using Microsoft.SharePoint;
    using Microsoft.Office.Server.Search.Administration;
    
  3. Create a function to write out usage information to the console window.

    private static void Usage()
    {
        Console.WriteLine("Configure Crawl Schedule");
        Console.WriteLine("Usage: ConfigureCrawlSchedule.exe <ContentSourceName>");
        Console.WriteLine("<ContentSourceName> - Specify the content source name.");
    }
    
  4. In the Main() function of the console application, add code to check the number of items in the args[] parameter. If it is less than 1, meaning that no value was specified to identify the content source, call the Usage() function defined in step 3, and then return.

    if (args.Length < 1 )
    {
       Usage();
       return;
    }
    
  5. Following the code from step 4, add the following to retrieve the search context for the SSP.

    /*
    Replace <SiteName> with the name of a site using the SSP
    */
    string strURL = "http://<SiteName>";
    SearchContext context;
    using (SPSite site = new SPSite(strURL))
    {
        context = SearchContext.GetContext(site);
    }
    

To create a crawl schedule using the DailySchedule class

  1. Create an instance of the DailySchedule class.

    DailySchedule daily = new DailySchedule(context);
    
  2. To indicate when to start crawling the content source, and how frequently to crawl, configure the DailySchedule properties. For example:

    //Indicates the schedule starts on the 15th day of the month.
    daily.BeginDay = 15;
    //Indicates the schedule starts in January.
    daily.BeginMonth = 1;
    //Indicates that the schedule starts in 2007.
    daily.BeginYear = 2007;
    //The next two lines of code indicate that the schedule starts at 2:30 in the morning.
    daily.StartHour = 2;
    daily.StartMinute = 30;
    //Indicates that the content should be crawled every day.
    daily.DaysInterval = 1;
    
  3. Retrieve the collection of content sources configured for the SSP's search service.

    Content sspContent = new Content(context);
    ContentSourceCollection sspContentSources = sspContent.ContentSources;
    
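If you are unsure of the exact name of a content source, you can enumerate the collection. The following sketch (assuming the sspContentSources variable from the previous step) writes the name of each content source to the console:

    //List the names of all content sources configured for the SSP.
    foreach (ContentSource cs in sspContentSources)
    {
        Console.WriteLine(cs.Name);
    }
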

To create a crawl schedule using the WeeklySchedule class

  1. Create an instance of the WeeklySchedule class.

    WeeklySchedule weekly = new WeeklySchedule(context);
    
  2. To indicate when to start crawling the content source, and how frequently to crawl, configure the WeeklySchedule properties. For example:

    //Indicates the schedule starts on the 1st day of the month.
    weekly.BeginDay = 1;
    //Indicates the schedule starts in January.
    weekly.BeginMonth = 1;
    //Indicates that the schedule starts in 2007.
    weekly.BeginYear = 2007;
    //The next two lines of code indicate that the schedule starts at 11:15 at night.
    weekly.StartHour = 23;
    weekly.StartMinute = 15;
    //Indicates that the content should be crawled every week.
    weekly.WeeksInterval = 1;
    

To configure the content source to use the new schedules

  1. Retrieve the value specified in the args[0] parameter, and verify that the SSP's content source collection contains a content source with that name.

    string strContentSourceName = args[0];
    if (sspContentSources.Exists(strContentSourceName))
    {
       <…>
    }
    else 
    {
       Console.WriteLine("Content source does not exist.");
    }
    
  2. Retrieve the content source with the specified name, and set the IncrementalCrawlSchedule and FullCrawlSchedule properties to the new schedules.

    ContentSource cs = sspContentSources[strContentSourceName];
    cs.IncrementalCrawlSchedule = daily;
    cs.FullCrawlSchedule = weekly;
    cs.Update();
    

Example

Following is the complete code for the sample console application described in this topic.

Prerequisites

  • Ensure a Shared Services Provider is already created.

Project References

Add the following Project References in your console application code project before running this sample:

  • Microsoft.SharePoint

  • Microsoft.Office.Server

  • Microsoft.Office.Server.Search

using System;
using System.Collections.Generic;
using System.Text;
using Microsoft.SharePoint;
using Microsoft.Office.Server.Search.Administration;

namespace ManageCrawlStatus
{
    class Program
    {
        static void Main(string[] args)
        {
            try
            {
                if (args.Length < 1)
                {
                    Usage();
                    return;
                }
                /*
                    Replace <SiteName> with the name of a site using the Shared Services Provider
                */
                string strURL = "http://<SiteName>";
                SearchContext context;
                using (SPSite site = new SPSite(strURL))
                {
                    context = SearchContext.GetContext(site);
                }

                DailySchedule daily = new DailySchedule(context);
                //Indicates the schedule starts on the 15th day of the month.
                daily.BeginDay = 15;
                //Indicates the schedule starts in January.
                daily.BeginMonth = 1;
                //Indicates that the schedule starts in 2007.
                daily.BeginYear = 2007;
                //The next two lines of code indicate that the schedule starts at 2:30 in the morning.
                daily.StartHour = 2;
                daily.StartMinute = 30;
                //Indicates that the content should be crawled every day.
                daily.DaysInterval = 1;
                WeeklySchedule weekly = new WeeklySchedule(context);
                //Indicates the schedule starts on the 1st day of the month.
                weekly.BeginDay = 1;
                //Indicates the schedule starts in January.
                weekly.BeginMonth = 1;
                //Indicates that the schedule starts in 2007.
                weekly.BeginYear = 2007;
                //The next two lines of code indicate that the schedule starts at 11:15 at night.
                weekly.StartHour = 23;
                weekly.StartMinute = 15;
                //Indicates that the content should be crawled every week.
                weekly.WeeksInterval = 1;

                string strContentSourceName = args[0];
                Content sspContent = new Content(context);
                ContentSourceCollection sspContentSources = sspContent.ContentSources;

                if (sspContentSources.Exists(strContentSourceName))
                {
                    ContentSource cs = sspContentSources[strContentSourceName];
                    cs.IncrementalCrawlSchedule = daily;
                    cs.FullCrawlSchedule = weekly;
                    cs.Update();
                }
                else
                {
                    Console.WriteLine("Content source does not exist.");
                }
            }
            catch (Exception e)
            {
                Console.WriteLine(e.ToString());
            }
        }

        private static void Usage()
        {
            Console.WriteLine("Configure Crawl Schedule");
            Console.WriteLine("Usage: ConfigureCrawlSchedule.exe <ContentSourceName>");
            Console.WriteLine("<ContentSourceName> - Specify the content source name.");
        }
    }
}

See Also

How to: Add a Content Source

How to: Programmatically Manage the Crawl of a Content Source