Once a search service application is configured, it needs data for indexing. In this recipe, we will add a new content source for our search service application.
For this recipe, we need a search service application, such as the one created in the Provisioning a search service application recipe.
Follow these steps to add a new content source to our search service application:
1. Provide a name, Local SharePoint Sites for instance, for the content source in the Name field.
2. Enter the URL of the site to crawl, http://sharepoint/ for instance, in the Start Addresses field. Multiple SharePoint sites may be indexed as a single content source. To add more SharePoint sites, add them on a new line in the Start Addresses field.
3. In the Crawl Settings section, choose whether to index only the site collection that matches the URL provided or to index everything under that URL. For instance, when crawling everything under the URL is enabled, http://sharepoint/site will be indexed when http://sharepoint/ is added to the Start Addresses field.
Search crawls in SharePoint are conducted on a per-content-source basis. Content sources define what is crawled and how often. They can include SharePoint sites, websites, file shares, Microsoft Exchange public folders, line-of-business data from Business Data Connectivity service connections, and custom repositories. Each content source can include multiple start addresses of the same content type. For instance, a content source could include multiple, different websites. A content source, however, could not include both a website and a line-of-business data connection.
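As a quick way to see what a farm is already crawling, the existing content sources and their types can be listed with the Get-SPEnterpriseSearchCrawlContentSource cmdlet. This is a sketch; the service application name below is an example and should match your farm:

```powershell
# Load the SharePoint snap-in (only needed outside the SharePoint Management Shell)
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Get the search service application; the name here is an example
$ssa = Get-SPEnterpriseSearchServiceApplication "Search Service Application"

# List every content source with its type and start addresses
Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa |
    Select-Object Name, Type, StartAddresses
```

The Type column distinguishes, for example, SharePoint content sources from Web content sources, matching the content types described above.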
Content sources can also be created and configured with PowerShell.
Follow these steps to configure a content source using PowerShell:
Get the search service application with the Get-SPEnterpriseSearchServiceApplication cmdlet and assign it to a variable:
$ssa = Get-SPEnterpriseSearchServiceApplication "Search Service Application"
Create the content source with the New-SPEnterpriseSearchCrawlContentSource cmdlet and assign it to a variable:
$cs = New-SPEnterpriseSearchCrawlContentSource -Name "SharePoint Sites" -SearchApplication $ssa -Type SharePoint -SharePointCrawlBehavior CrawlVirtualServers -StartAddresses "http://sharepoint/"
The SharePointCrawlBehavior parameter is the equivalent of the Crawl Settings section in the web interface: CrawlVirtualServers instructs the indexer to index all content under the URL provided, while CrawlSites instructs it to index only the site collection at the URL provided.
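As a sketch of the alternative behavior, a content source restricted to a single site collection could be created with CrawlSites; the name and URL below are examples:

```powershell
# Index only the site collection at the start address, not everything under it
$single = New-SPEnterpriseSearchCrawlContentSource -Name "Intranet Root Only" `
    -SearchApplication $ssa -Type SharePoint `
    -SharePointCrawlBehavior CrawlSites -StartAddresses "http://sharepoint/"
```

With this setting, a separate site collection such as http://sharepoint/site would not be picked up by this content source.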
Optionally, enable continuous crawls on the content source and save the change:
$cs.EnableContinuousCrawls = $true
$cs.Update()
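Once the content source is saved, a crawl can also be started from PowerShell. This sketch assumes $cs still holds the content source object created above:

```powershell
# Kick off a full crawl of the content source
$cs.StartFullCrawl()

# Inspect the crawl status of the content source (for example, idle or crawling)
$cs.CrawlStatus
```

A full crawl indexes everything in the content source from scratch; subsequent incremental or continuous crawls then pick up changes.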