This blog post will show you how to write your own custom security post-trimmer for SharePoint 2013. Not only that, we will take you through the steps of deploying and registering the trimmer using a crawl rule before putting the trimmer to work.
Please visit the official MSDN documentation for the overview and definitive reference for this feature:
There are two kinds of custom security trimming. Pre-trimming refers to pre-query evaluation, where the backend rewrites the query, adding security information before the lookup in the search index. Post-trimming refers to post-query evaluation, where the search results are pruned before they are returned to the user.
Post-trimming analyzes each search result after the query has been evaluated. Setting aside the performance cost and the potential for incorrect refiner data and hit counts, sometimes it is necessary to perform last-minute security evaluation "when the results are served".
One scenario could be to hide documents in the search results from machines that do not have proper anti-virus software. Another could be to restrict certain documents from being visible in the search results outside a given time of day.
Let's create a simple security post-trimmer. It should remove documents from the search results when a certain hint is present on a result. We will use the string field docaclmeta for this purpose. The rules are simple: a document will be removed if this field contains the text 'deny'. If this field is empty or contains anything else (e.g. 'allow'), the document will remain visible in the search results.
This MSDN article offers useful starting tips on creating the security post-trimmer project in Visual Studio, adding the references to both the Microsoft.Office.Server.Search.dll and the Microsoft.IdentityModel.dll.
The following is added to the using directives at the beginning of the class file, SearchPostTrimmer.cs:
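The original listing is not reproduced here; a typical set of directives for a class implementing this interface would look like the following (assuming project references to Microsoft.Office.Server.Search.dll and Microsoft.IdentityModel.dll are in place):

```csharp
using System;
using System.Collections;                             // BitArray
using System.Collections.Generic;                     // IList<T>, IDictionary<TKey, TValue>
using System.Collections.Specialized;                 // NameValueCollection
using System.Security.Principal;                      // IIdentity
using Microsoft.Office.Server.Search.Administration;  // SearchServiceApplication
using Microsoft.Office.Server.Search.Query;           // ISecurityTrimmerPost
```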
We then define the class as implementing the ISecurityTrimmerPost interface in the class declaration:
public class SearchPostTrimmer : ISecurityTrimmerPost
We do not need any additional settings for this trimmer to work. Thus, the Initialize method remains empty:
/// <summary>
/// Initialize the post-trimmer.
/// </summary>
/// <param name="staticProperties">Static properties configured for the trimmer.</param>
/// <param name="searchApplication">Search Service Application object.</param>
public void Initialize(NameValueCollection staticProperties, SearchServiceApplication searchApplication)
{
    // Intentionally blank
}
The actual trimmer logic resides in the CheckAccess method. This method returns a BitArray in which the value true grants access to the corresponding result and false denies it.
We will check each document's docaclmeta field for the term 'deny'. If the term is present, we will remove that document from the search results.
/// <summary>
/// Check access for the results returned from the search engine.
/// </summary>
/// <param name="documentCrawlUrls">List of the URLs for each document whose access is to be determined by the security trimmer implementation.</param>
/// <param name="documentAcls">List of the document ACLs for each document whose access is to be determined by the security trimmer implementation. This list may be null or may contain String.Empty strings.</param>
/// <param name="sessionProperties">Transient property bag valid within the scope of a single query processing component execution.</param>
/// <param name="passedUserIdentity">Identity of the user.</param>
/// <returns>A BitArray with the value true if the respective document from documentCrawlUrls has been granted access, or false if it has not.</returns>
public BitArray CheckAccess(
    IList<string> documentCrawlUrls,
    IList<string> documentAcls,
    IDictionary<string, object> sessionProperties,
    IIdentity passedUserIdentity)
{
    if ((null == documentCrawlUrls) || (documentCrawlUrls.Count < 1))
    {
        throw new NullReferenceException("Error: CheckAccess method is called with invalid URL list");
    }

    if ((null == documentAcls) || (documentAcls.Count < 1))
    {
        throw new NullReferenceException("Error: CheckAccess method is called with invalid ACL list");
    }

    if (null == passedUserIdentity)
    {
        throw new NullReferenceException("Error: CheckAccess method is called with invalid user identity parameter");
    }

    // Initialize the bit array with the value false, which would trim all results out.
    var urlStatusArray = new BitArray(documentCrawlUrls.Count, false);

    for (var x = 0; x < documentAcls.Count; x++)
    {
        // Grant access unless the docaclmeta content for this result is 'deny'.
        urlStatusArray[x] =
            string.Compare(documentAcls[x], "deny", StringComparison.InvariantCultureIgnoreCase) != 0;
    }

    return urlStatusArray;
}
As a starting point, one might consider the following tips to improve the overall performance of this trimmer:
After you have built the custom security trimmer project, you must deploy the assembly to the global assembly cache on every server that hosts a query processing component.
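With the assembly in the GAC, the trimmer is registered against the Search service application and scoped with a crawl rule. The following is a hedged PowerShell sketch: the crawl rule path, namespace, assembly name, and PublicKeyToken are placeholders for illustration, so substitute your own values.

```powershell
# Get the Search service application.
$ssa = Get-SPEnterpriseSearchServiceApplication

# Create a crawl rule covering the content the trimmer should apply to.
# "xmldoc://*" is a placeholder path for the custom connector's content.
New-SPEnterpriseSearchCrawlRule -SearchApplication $ssa -Path "xmldoc://*" -Type InclusionRule

# Register the post-trimmer (Id 1) and tie it to the crawl rule path.
# The four-part type name below is hypothetical; use your assembly's actual
# namespace, name, version, and PublicKeyToken.
New-SPEnterpriseSearchSecurityTrimmer -SearchApplication $ssa `
    -TypeName "SearchPostTrimmer.SearchPostTrimmer, SearchPostTrimmer, Version=1.0.0.0, Culture=neutral, PublicKeyToken=<your token>" `
    -Id 1 -RulePath "xmldoc://*"
```

A full crawl of the affected content is typically required after registering the trimmer before it takes effect on query results.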
Now you can issue queries, and the beauty of the post-trimmer logic should reveal itself on each query evaluated. Try modifying the docaclmeta field contents in the Product.xml file for the custom connector, run a full crawl, and repeat the query.
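One convenient way to issue such test queries is the SharePoint 2013 Search REST service. A hedged sketch (the server name, credentials, and query text below are placeholders):

```shell
# Query via the Search REST API; after the trimmer is registered, documents
# whose docaclmeta field contains 'deny' should no longer appear in the results.
curl --ntlm --user 'CONTOSO\testuser' \
  "http://yourserver/_api/search/query?querytext='product'&selectproperties='Title,Path'"
```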
Author: Sveinar Rasmussen (sveinar).