When configuring a crawl rule to store authentication cookies, the login page was returning multiple cookies with the same name but different domains.
This caused issues at a later stage (during the crawl) because all cookies would be sent along with each request, and the target system had trouble identifying us correctly due to these “duplicate” cookies.
You can easily inspect the request information sent during a crawl by starting Fiddler on the crawl server and configuring the proxy settings to http://localhost:8888 (the default Fiddler endpoint).
In the end we opted for an alternative method of configuring the cookies, namely through PowerShell. This gave us full flexibility to configure exactly the cookies we wanted to pass along with the crawl requests.
asnp Microsoft.SharePoint.PowerShell -ea 0 | Out-Null

$ssa = Get-SPEnterpriseSearchServiceApplication
$crPath = 'http://authenticatedwebsite*'

# Get or create the crawl rule
$cr = Get-SPEnterpriseSearchCrawlRule -SearchApplication $ssa | ? { $_.Path -eq $crPath }
if ($cr -eq $null)
{
    $cr = New-SPEnterpriseSearchCrawlRule -Path $crPath -SearchApplication $ssa -Type InclusionRule -AuthenticationType CookieRuleAccess -FollowComplexUrls $true
}

# Set the cookie credentials
$cr.SetCredentials('CookieRuleAccess', 'myUser=crawlUser; myPwd=crawlPassword', 'http://cookie-set-via-powershell')
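To verify that the rule took effect, you can read it back from the Search Service Application; a quick check along these lines (reusing the $ssa and $crPath variables from the script above):

# Confirm the crawl rule exists and uses cookie-based authentication
Get-SPEnterpriseSearchCrawlRule -SearchApplication $ssa |
    ? { $_.Path -eq $crPath } |
    Select Path, Type, AuthenticationType, FollowComplexUrls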
SharePoint 2013: Some observations on Enterprise Search

I’m doing some testing with Enterprise Search in SharePoint 2013 for a customer scenario; here are some observations…
Content Source as Crawled Property

The “Content Source” name is available out of the box as a Managed Property on all content in the search index. This makes it possible to create Result Sources that aggregate content from different Content Sources, similar to Search Scopes back in the old days.
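As a sketch of how that could be scripted (the result source name and the content source name “Intranet” are hypothetical examples), something along these lines should work on an SP2013 farm:

asnp Microsoft.SharePoint.PowerShell -ea 0 | Out-Null

$ssa = Get-SPEnterpriseSearchServiceApplication
$owner = Get-SPEnterpriseSearchOwner -Level Ssa

# Look up the ID of the built-in "Local SharePoint Provider"
$fedManager = New-Object Microsoft.Office.Server.Search.Administration.Query.FederationManager($ssa)
$providerId = $fedManager.ListProviders()['Local SharePoint Provider'].Id

# Result Source restricting results to the hypothetical "Intranet" content source
New-SPEnterpriseSearchResultSource -SearchApplication $ssa -Owner $owner -ProviderId $providerId `
    -Name 'Intranet Results' -QueryTemplate '{searchTerms} ContentSource="Intranet"'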
Meta elements (HTML <meta> tags) as Crawled Properties

Information from meta elements in web pages is extracted into crawled properties. Consider the following example of a page containing a meta element such as:

<meta name="author" content="John Doe" />

After crawling this website with SharePoint 2013 Search, it will create (if new) or reuse (if existing) a Crawled Property and store the content from the meta element in it. The Crawled Property can then be mapped to Managed Properties to return, filter or sort query results.
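The mapping can be done through the UI or scripted. A rough PowerShell sketch, assuming the hypothetical 'author' meta element from the example above and a new (equally hypothetical) managed property named 'PageAuthor':

asnp Microsoft.SharePoint.PowerShell -ea 0 | Out-Null

$ssa = Get-SPEnterpriseSearchServiceApplication

# Find the crawled property created from the meta element
# ('author' is the hypothetical meta name used in the example above)
$cp = Get-SPEnterpriseSearchMetadataCrawledProperty -SearchApplication $ssa -Name 'author' | Select -First 1

# Create a text managed property (Type 1 = Text) and map the crawled property to it
$mp = New-SPEnterpriseSearchMetadataManagedProperty -SearchApplication $ssa -Name 'PageAuthor' -Type 1
New-SPEnterpriseSearchMetadataMapping -SearchApplication $ssa -ManagedProperty $mp -CrawledProperty $cp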
Query string parameters as Crawled Properties

Query string parameters are not extracted into Crawled Properties. This was actually a request from the customer, in order to get additional information about documents (on their website) into the search index. As I suspected, it isn’t possible out of the box, but you can definitely do it using Content Enrichment. The “OriginalPath” input property is available to Content Enrichment and contains the exact URL used for indexing the document. With Content Enrichment it is pretty straightforward to look for predefined query string parameters and map them to output properties.

$ssa = Get-SPEnterpriseSearchServiceApplication
$config = New-SPEnterpriseSearchContentEnrichmentConfiguration
$config.Endpoint = 'http://cews:818/ContentEnrichmentService2.svc'
$config.InputProperties = 'OriginalPath', 'ContentSource'
$config.OutputProperties = 'MyParam1', 'MyParam2'
$config.DebugMode = $false
$config.SendRawData = $false
Set-SPEnterpriseSearchContentEnrichmentConfiguration -SearchApplication $ssa -ContentEnrichmentConfiguration $config
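After registering the configuration you can read it back from the SSA, or remove it again while iterating on the service:

# Show the content enrichment configuration currently registered
Get-SPEnterpriseSearchContentEnrichmentConfiguration -SearchApplication $ssa

# Remove it, e.g. before redeploying a changed service
# Remove-SPEnterpriseSearchContentEnrichmentConfiguration -SearchApplication $ssa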
SharePoint: Portal navigation limited to 50 dynamic items

I was looking into an issue where the Navigation Settings page wouldn’t show all subsites in the treeview. When reproducing the issue, the treeview was limited to 50 dynamic items.

The treeview component is a Microsoft.SharePoint.Publishing.Internal.WebControls.HierarchicalListBox, which connects to the active Microsoft.SharePoint.Publishing.Navigation.PortalSiteMapProvider. The provider has a property “DynamicChildLimit” that can be explicitly configured in the web.config:

// Microsoft.SharePoint.Publishing.Navigation.PortalSiteMapProvider
public int DynamicChildLimit
{
    get
    {
        int? num = this.dynamicChildLimit;
        if (num.HasValue)
        {
            return num.GetValueOrDefault();
        }
        if (this.Version < 14)
        {
            return 50;
        }
        return 0;
    }
    set
    {
        this.dynamicChildLimit = new int?(value);
    }
}

The active provider used is “GlobalNavSiteMapProvider”, defined in web.config as:

<add name="GlobalNavSiteMapProvider" description="CMS provider for Global navigation" type="Microsoft.SharePoint.Publishing.Navigation.PortalSiteMapProvider, Microsoft.SharePoint.Publishing, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" NavigationType="Global" EncodeOutput="true" />

I tried specifying Version="14", but then it defaulted to 20 items, a path I didn’t investigate further. So I just explicitly set DynamicChildLimit="100" and that fixed the issue:

<add name="GlobalNavSiteMapProvider" description="CMS provider for Global navigation" type="Microsoft.SharePoint.Publishing.Navigation.PortalSiteMapProvider, Microsoft.SharePoint.Publishing, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" NavigationType="Global" EncodeOutput="true" DynamicChildLimit="100" />
Issue

First day after my vacation and I was presented with a nice situation at one of our clients. Most users trying to create new subsites would get “Sorry, this site hasn’t been shared with you” and the site would NOT get created. Troubleshooting showed that during site provisioning the “SiteFeed” Feature would throw an exception, which rolled back the site creation. The relevant lines in the ULS logs pointed towards the “Following” (Social) of the newly created site:

FollowedContent.FollowItem:Exception:Microsoft.Office.Server.UserProfiles.FollowedContentException: ItemDoesNotExist : Item does not exist.
   at Microsoft.Office.Server.UserProfiles.SPS2SAppUtility.GetPersonalUrl(UserProfile& profile)
   at Microsoft.Office.Server.UserProfiles.SPS2SAppExecutionContext.InitializeForProfile()
   at Microsoft.Office.Server.UserProfiles.SPS2SAppExecutionContext.EnsureInitialized()
   at Microsoft.Office.Server.UserProfiles.FollowedContent.FollowItem(FollowedItem item, Boolean isInternal)
Could not follow the url https://sharepoint/newsub
Leaving Monitored Scope (Event Receiver (Microsoft.Office.Server.UserProfiles, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c, Microsoft.Office.Server.UserProfiles.ContentFollowingWebEventReceiver)).
The newly created site could not be added to the user’s Social list on their MySite due to an Access Denied error when adding the item to the list.
Cause

The MySites were recently migrated from SharePoint 2010 (classic mode) to SharePoint 2013 (claims mode). MySites work by setting the ‘owner’ as Primary Site Collection Administrator, but the conversion to claims had erased all the classic-mode Site Collection Administrators from the migrated site collections. Querying for the Site Collection Administrators would simply return nothing, which is of course not good.
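A quick way to confirm this on a single site collection (the MySite URL below is a hypothetical example):

# On an affected MySite this prints nothing: the admin collection is empty
$site = Get-SPSite 'https://mysites/personal/someuser'
$site.RootWeb.SiteAdministrators | Select LoginName, IsSiteAdmin
$site.Dispose()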
Solution

I whipped up a PowerShell script that loops over all MySites and reports any sites with a missing or non-matching Site Collection Administrator. Toggling the ‘report only’ flag in the script will correct the situation.

asnp Microsoft.SharePoint.PowerShell -ea 0 | Out-Null
cls

$reportOnly = $false
Write-Host "ReportOnly:" $reportOnly

$mySites = Get-SPSite -Limit ALL | ? { $_.RootWeb.WebTemplate -eq "SPSPERS" }
Write-Host "Found" ($mySites.Count) "MySites"

$mySites | % {
    $s = $_
    $w = $s.RootWeb
    $owner = $w.EnsureUser($w.Title)
    $primaryAdmin = $w.SiteAdministrators | Select -First 1
    if ($primaryAdmin -eq $null)
    {
        Write-Host -ForegroundColor Red ($s.ServerRelativeUrl) ": no primary SC admin. Owner should be" $owner
        if ($reportOnly -eq $false)
        {
            $owner.IsSiteAdmin = $true
            $owner.Update()
        }
    }
    elseif ($owner.IsSiteAdmin -eq $false)
    {
        Write-Host -ForegroundColor Yellow ($s.ServerRelativeUrl) ": primary SC admin (" $primaryAdmin ") does NOT match owner (" $owner ")"
        if ($reportOnly -eq $false)
        {
            $owner.IsSiteAdmin = $true
            $owner.Update()
        }
    }
    else
    {
        Write-Host -ForegroundColor Green ($s.ServerRelativeUrl) ": primary SC admin (" $primaryAdmin ") matches owner (" $owner ")"
    }
    $s.Close()
}

After correcting, all sites should turn up green.

SharePoint 2013: Open PDF files in client application

SharePoint 2013 has had this quirk for quite some time where clicking a PDF file opens it in the browser, no matter what you configure in the list settings or client application. One of our customers migrated to SharePoint 2013 and suddenly lost this functionality. Together with my colleague Elio Struyf we proposed a small customization that adds the “Open in client” option to the fly-out menu for PDF documents.

The customization itself is devised as a SharePoint Sandboxed Solution that can be installed and activated per Site Collection, on either an on-premises environment or Office 365. The functionality can be enabled or disabled via the corresponding Site Collection Feature.
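For on-premises deployments the Feature toggle can be scripted too; a minimal sketch ('OpenPdfInClient' is a hypothetical placeholder for the solution's actual Feature name, and the site URL is an example):

# 'OpenPdfInClient' is a hypothetical Feature name; substitute the real one from the solution
Enable-SPFeature -Identity 'OpenPdfInClient' -Url 'https://sharepoint/sites/team'

# Deactivate again via:
# Disable-SPFeature -Identity 'OpenPdfInClient' -Url 'https://sharepoint/sites/team'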