Category Archives: dev

SharePoint: Fast Search Pipeline Extensibility for Specific Content Source



Editor’s note: Contributor Alex Choroshin is a SharePoint Team Leader at Bank Leumi. Follow him @choroshin

Because Pipeline Extensibility is not restricted to a particular content source, and because there is no proper API for it, working with a specific content source is genuinely hard.

Luckily for us, there is a crawled property which is mapped to the managed property "ContentSource":


<CrawledProperty propertySet="012357BD-1113-171D-1F25-292BB0B0B0B0" varType="31" propertyName="315" />

You can include this crawled property in your extensibility configuration, read which content source each item came from, and apply the appropriate logic.
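For reference, here is a minimal sketch of how that crawled property can be registered in the pipelineextensibility.xml configuration. The command path and the output property are placeholders, not from the original post; check the FS4SP documentation for the exact schema on your farm:

<PipelineExtensibility>
  <Run command="C:\FASTSearch\bin\MyProcessor.exe %(input)s %(output)s">
    <Input>
      <!-- The ContentSource crawled property discussed above -->
      <CrawledProperty propertySet="012357BD-1113-171D-1F25-292BB0B0B0B0" varType="31" propertyName="315" />
    </Input>
    <Output>
      <!-- Placeholder: whatever crawled property your stage emits -->
      <CrawledProperty propertySet="00000000-0000-0000-0000-000000000000" varType="31" propertyName="MyOutputProperty" />
    </Output>
  </Run>
</PipelineExtensibility>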

Example:


using System;
using System.Linq;
using System.Xml.Linq;

class ContentSourceStage
{
    static void Main(string[] args)
    {
        XDocument inputDoc = XDocument.Load(args[0]);
        XElement outputElement = new XElement("Document");
        //Get the content source name from the input file
        string contentSourceName = GetContentSource(inputDoc);
        if (contentSourceName == "My Content Source")
        {
            //your logic
        }
        outputElement.Save(args[1]);
    }

    private static string GetContentSource(XDocument inputDoc)
    {
        var res = from cp in inputDoc.Descendants("CrawledProperty")
                  where new Guid(cp.Attribute("PropertySet").Value).Equals(new Guid("012357BD-1113-171D-1F25-292BB0B0B0B0")) &&
                        cp.Attribute("PropertyID").Value == "315"
                  select cp.Value;
        return res.First();
    }
}

This way we manage to apply our own logic for a specific Content Source.

Reference:

Great answer by Mikael Svenson:
http://social.technet.microsoft.com/Forums/en-US/fastsharepoint/thread/4a510295-35d9-403b-9d32-3650fb89dcb8/
http://blogs.msdn.com/b/thomsven/archive/2010/09/23/debugging-and-tracing-fast-search-pipeline-extensibility-stages.aspx?Redirected=true

I would like to thank Jorge Einbund, a talented .NET developer, for helping me with this post.

Hope you’ll find this post helpful.

Build a Search Driven Solution with SharePoint 2013 - Part II



Editor’s note: Contributor Nicki Borell is a SharePoint Evangelist & Consultant for Experts Inside. Follow him @NickiBorell.

Part I is about Search Driven in on-premise environments.

Part II will show the options and differences with O365 SharePoint Online.

In Part I, I wrote about the options and scenarios of Search and Search Driven Apps in SharePoint 2013 on-premise. Now, let’s see what options we have using the online version of the new SharePoint 2013.

There are two big differences in SharePoint Online:

1. We do not have the new Webpart family “Search Driven Content”. So, in fact, we can only use the common Search Result Webpart and its options within the Query Builder to create dynamic search solutions.

2. In SharePoint Online we are not able to configure our own content sources or manipulate settings like crawler scheduling.

Missing content sources option in SharePoint Online and its impact

Because the “content sources” option is not available in SharePoint Online, we have to look for alternatives to bring our own content into the SharePoint Online environment. Another point is index freshness, which we cannot influence by setting up crawl schedules or configuring continuous crawl options.

Index freshness

Also, the official TechNet documentation makes no clear statement about whether continuous crawling is set up by default for SharePoint Online or not. My own experience says: “Yes, it is”. I see new content in the index within 2–4 minutes. This is really fast and, for most scenarios, perfectly fine.

Own content

To bring your own content into your SharePoint Online search you have to use “Result Sources”. To set up those 3rd-party Result Sources you can use Site Settings to configure them in the context of a site collection, or the SharePoint Admin Center to do it globally.

2013-05-20-SearchDriven-Part02-01.jpg

The dialog for setting up a Result Source shows the option for bringing in your own content:

  • Local SharePoint
  • Remote SharePoint
  • OpenSearch 1.0/1.1
  • Exchange

For all options you have to configure a security context to access the result source. Using Remote SharePoint you can use SSO or pass-through authentication. Using OpenSearch we have the option to use Anonymous, Basic Authentication, Digest Authentication, NTLM, Forms Authentication or Cookie Authentication. Using Remote SharePoint to call an on-premise SharePoint search you have to set up a search federation based on an identity federation.

In all cases we have to configure a Remote Address and we can configure a Query Transformation (as described in Part I) to filter or manipulate the query which is sent to the remote system.

In my demo tenant I simply used some open search based systems.

  • Twitter: http://search.twitter.com/search.atom?q={searchTerms}
  • Facebook: http://search.live.com/results.aspx?q={searchTerms}+site%3afacebook.com&format=rss
  • YouTube: https://gdata.youtube.com/feeds/api/videos?q={searchterms}

(These can of course also be remote endpoints for your LOB systems or other on-premise sources)

Setting up those systems as a “Result Source” gave me the option to use them in my SharePoint Online system to build Search Driven experiences.

2013-05-20-SearchDriven-Part02-02.jpg

Search Driven experiences and solutions in SharePoint Online

The easiest way to use the new result sources, and with them the remote content, is to use them in the Result Webpart. In this example you can see an overview site in my SharePoint Online tenant that aggregates news from the configured social media sources using Result Webparts:

2013-05-20-SearchDriven-Part02-03.jpg

For example, to fill the atwork area with Twitter news, I configured the Result Webpart with these settings:

  • Set “Select a query” to the “Twitter” Result Source
  • Fill in a query that focuses on the desired results

2013-05-20-SearchDriven-Part02-04.jpg

Using the configured result sources within Query Rules (as described in Part I) we can build a focused search experience like this social media search page:

2013-05-20-SearchDriven-Part02-05.jpg

Search Driven Solutions can of course also be built based on SharePoint Online content. Here you see an example based on content placed in SharePoint Online lists and libraries:

2013-05-20-SearchDriven-Part02-06.jpg

And of course you can mix results coming from remote sources and results coming from your SharePoint Online source.

This Webcast shows the described samples in action: Building Search Driven Solution with SharePoint 2013 Part II - On YouTube

2013-05-20-SearchDriven-Part02-07.png

Gooey SharePoint Scripting



Editor’s note: Contributor Alex Brassington is a SharePoint consultant working for Trinity Expert Systems. Follow him @brassingtonalex

Today I’m going to show you how to script SharePoint through the GUI. Whilst in this example we’ll be running the code on the server, the same concepts and approach can be used to script it from any machine that can hit the relevant website…

Our example might seem a little forced but it’s based on a real-world experience. We had a client with a fairly complicated Content Type scenario: over 150 Content Types spread over 8 levels of inheritance with untold columns. Then we discovered an issue and needed to publish every single one of those content types. This is the classic example of where PowerShell should be used but, awkwardly, they’d been burnt by PowerShell publishing before.

As such we had a flat edict, no PowerShell publishing of content types. It must go through the GUI.

A post I’d seen recently by Dr James McCaffrey popped into my head, about using PowerShell to automate testing of web applications.
Why not use the same process to automate the publishing of the content types?

The first thing to do is to get ourselves an IE window:


$ie = New-Object -com "InternetExplorer.Application" 
#This starts hidden by default, so let's show it 
$ie.Visible = $true

This isn’t much use on its own, so let’s send it to a page. In our case we want to go to the page that publishes one of our content types. We know that the publish page itself is an application page that is referenced from a site collection root web with the following URL syntax:

siteCollectionRoot/_layouts/managectpublishing.aspx?ctype=ContentTypeID

Glossing over how to get the ContentTypeID for now we have this:


$pageUrl = "http://sharepoint/sites/cthub/_layouts/managectpublishing.aspx?ctype=0x0100A4CF347707AC054EA9C3735EBDAC1A7C"
$ie.Navigate($pageUrl)

Now, PowerShell moves fast, so we’ll need to wait for IE and its JavaScript to catch up.


While ($ie.ReadyState -ne 4) 
{ 
    Sleep -Milliseconds 100 
}

Now we’re there, let’s get the publish button. Thankfully this button has a consistent ID that we can get using the trusty F12 button in IE.


Identifying an element’s ID using F12

The catchily titled “ctl00_PlaceHolderMain_ctl00_RptControls_okButton” button? Depressingly, I think I’m starting to see the naming convention behind these IDs…


$buttonID = "ctl00_PlaceHolderMain_ctl00_RptControls_okButton"
#You have to assign the Document to its own variable, otherwise the call will fail 
$document = $ie.Document 
$button = $document.getElementByID($buttonID)

And now all we need to do is to click that button:


$button.Click()

Now you might think that we’ve done all we need to do here: slap it into a foreach loop and be done with it. Of course you can’t do that, as you need to give IE time to send that request using our good old friend JavaScript.

So we wait for the page to re-direct us:


   
While ($ie.LocationURL -eq $pageUrl) 
{ 
    Start-Sleep -Milliseconds 100 
}

Now we can slap it into a foreach loop and with a little bit of work we can come up with something like the code below:


Add-PSSnapin Microsoft.SharePoint.PowerShell -ea SilentlyContinue 
#URL for the content type hub to use 
$CTHubURL= "https://sharepoint/sites/cthub"
#Get the Content Type hub 
$site = Get-SPSite $CTHubURL
#Content types to publish 
$ContentTypes = @( 
    "Document Type 1", 
    "Document Type 2", 
    "Document Type 3" 
) 
#Open a new IE window 
$ie = New-Object -com "InternetExplorer.Application"
#Make the window visible 
$ie.visible = $true
#Loop through the content types and publish them 
foreach ($contentTypeName in $ContentTypes) 
{ 
    Write-Verbose  "Processing $ContentTypeName"
    #Content types live at the root web 
    $web = $site.rootWeb 
    #Get the content type using its name 
    $ct =   $web.ContentTypes[$ContentTypeName] 
    #Get the GUID for the CT 
    $GUID = $ct.ID.ToString() 
    #Get the URL for the page based on the content type hub url, the application page that does publishing and the GUID 
    $url = $CTHubURL+ "/_layouts/managectpublishing.aspx?ctype=" + $GUID  
    #Go to the page 
    $ie.navigate($url) 
    #Wait for the page to finish loading 
    while ($ie.ReadyState -ne 4) 
     { 
        start-sleep -Milliseconds 100 
     } 
     #The ID of the button to press 
    $buttonID = "ctl00_PlaceHolderMain_ctl00_RptControls_okButton"
    $document = $ie.Document 
    $btn = $document.getElementByID($buttonID) 
    #Push the button 
    $btn.click() 
    #Wait for the page to be re-directed 
     while ($ie.locationurl -eq $url) 
     { 
        start-sleep -Milliseconds 100 
     } 
     Write-Verbose "Content Type $contentTypeName published"
}

I don’t know about you, but there is something deeply neat about sitting at your desk watching IE do the dull task that you were convinced was going to bring your RSI back with a vengeance, and in half the time you could do it.

This example might not be useful for that many people, but the concept is intriguing. There’s no reason most of this can’t be done without any code on the server at all; the only time we touch the server is to get the GUIDs, and those can be pre-fetched if need be. Nor does it need any significant rights: as long as the account you use has permission to get into that site collection and publish content types, that’s all it needs.
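That pre-fetching is a one-off job on the server; a minimal sketch, assuming the same cthub URL as in the script above, which exports the content type IDs to a CSV you can carry to whichever machine drives the GUI:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ea SilentlyContinue 
#Export content type names and IDs so the GUI-driving machine needs no SharePoint access 
$site = Get-SPSite "https://sharepoint/sites/cthub" 
$site.RootWeb.ContentTypes | Select-Object Name, Id | Export-Csv ContentTypeIds.csv -NoTypeInformation 
$site.Dispose()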

The logical destination of this is Office 365. The scripts and rules for running them on there are limited and limiting; they have to be. But the beauty of scripting is that we don’t have to be limited by the detail of code; we can use higher-level components and tools to worry about that for us. In this case, that tool is the GUI that Microsoft was kind enough to provide for when it’s too awkward to find the PowerShell console.

SharePoint 2013 Upgrade



Editor’s note: Contributor Riccardo Emanuele is founder and chairman of ImageFast Ltd. Follow him @rearcardoor

You can only upgrade to SharePoint 2013 from SharePoint 2010 so those of you looking to upgrade directly from 2007 will have to take the 2010 route first.

There is no in-place upgrade option for 2013 (not that we ever used it), so every upgrade needs to be onto new kit, either via the DB-attach method or by creating a new farm and using third-party tools to migrate the content.

2013 needs:

  • 64 bit Windows Server 2008 R2 SP1 or 64 bit Windows Server 2012.
  • 64 bit SQL Server 2008 R2 SP1 or 2012.
    • 2012 SP1 is needed if you are planning to use BI

When you move 2010 site collections across to 2013, they remain in 2010 mode and SharePoint 2013 maintains both a 14 hive and a 15 hive to support both 2010 and 2013 mode site collections. You can still view and access 2010 mode site collections but they will only provide the 2010 level of functionality; site collection administrators will get visual warnings on the page when a site is in 2010 mode.

If you have 2007 site collections in your 2010 farm, you should upgrade them to 2010 mode before moving them to 2013. There is no visual upgrade tool in 2013 like there was in 2010, so to upgrade 2010 site collections to 2013 mode you will have to run a PowerShell command or use the Upgrade this Site Collection option in Site Settings (site collection administrators only).
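The PowerShell route is a one-liner per site collection; a minimal sketch, with an assumed site collection URL:

#Upgrade a 2010-mode site collection to 2013 mode 
Upgrade-SPSite "http://sharepoint/sites/teamsite" -VersionUpgrade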

Microsoft recommend that site collection administrators are left to upgrade their own site collections post-upgrade, rather than as part of the upgrade. The exceptions, of course, are those site collections which may need particular attention, such as high-volume, highly customised, or critical sites.

Once a site collection has been marked for upgrade to 2013 mode, an item is added to a new upgrade queue which is processed by an Upgrade Site Collection timer job. This timer job runs every minute and can run parallel upgrades; there is a throttle applied to prevent the server being over utilised by this activity.

It is possible to view what a 2010 mode site will look like when it is upgraded to 2013 mode by using the Create an Evaluation Site Collection function. A daily timer job processes this request and copies the 2010 site collection to a new 2013 mode site collection within the same content database, gives it a URL with the same name as the source site collection and appends -eval on the end. By default this is retained for 31 days after which point it will be deleted. The idea is to let a site collection administrator look at the site in 2013 mode and determine what changes will be needed before upgrading the site for real.
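An evaluation copy can also be requested from PowerShell instead of the Site Settings page; a minimal sketch, again with an assumed URL:

#Queue an evaluation (-eval) copy of the site collection for the daily timer job 
Request-SPUpgradeEvaluationSite "http://sharepoint/sites/teamsite"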

There is also a site collection health checker that can be run against both 2010 mode and 2013 mode site collections.
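From PowerShell the health checks run via Test-SPSite; a minimal sketch, with an assumed URL (repairable issues can then be fixed with Repair-SPSite):

#Run the site collection health checks without upgrading anything 
Test-SPSite "http://sharepoint/sites/teamsite"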

How to Get Login Name and Display Name using SharePoint 2013 REST API



Editor’s note: Contributor Alex Choroshin is a SharePoint Team Leader at Bank Leumi. Follow him @choroshin

Working with the REST API is quite simple and straightforward. For example, when you need to fetch data from a list, you can use the following jQuery Ajax code snippet:


jQuery.ajax({
               url: "http://YourSite/_api/web/lists/getbytitle('ListName')/items",
               type: "GET",
               headers: { "Accept": "application/json;odata=verbose" },
               success: function(data, textStatus, xhr) {
                var dataResults = data.d.results;
                alert(dataResults[0].Title);     
                },
               error: function(xhr, textStatus, errorThrown) {
               alert("error:"+JSON.stringify(xhr));
               }
            });

Another good example: when you need to get specific fields like “Title”, “ID” or “Modified”, you can use the $select keyword.

For example:


url: "http://YourSite/_api/web/lists/getbytitle('ListName')/items$select= Title ,ID, Modified",
            type: "GET",
            headers: { "Accept": "application/json;odata=verbose" },
            success: function(data, textStatus, xhr) {
             var dataResults = data.d.results;
             alert(dataResults[0].Modified);     
                },
            error: function(xhr, textStatus, errorThrown) {
            alert("error:"+JSON.stringify(xhr));
            }
         });   

But what happens when you need to get a user/group field like “Author”? Well, things are not as obvious as they seem.

Unfortunately you can’t use /getbytitle(‘ListName’)/items or /getbytitle(‘ListName’)/items?$filter=Author to get the user field, since this field does not exist in the response data. Luckily for us, there is an “AuthorId” field that (as you already guessed) will get us the user’s id.

So, after getting the user id from your list, you need to make another Ajax call to get the user’s login name/display name, using the /_api/Web/GetUserById method.

Example:


function getUser(id){
var returnValue;
  jQuery.ajax({
   url: "http://YourSite/_api/Web/GetUserById(" + id + ")",
   type: "GET",
   headers: { "Accept": "application/json;odata=verbose" },
   success: function(data) {
           var dataResults = data.d;
      //get login name  
      var loginName  = dataResults.LoginName.split('|')[1];
      alert(loginName);     
      //get display name
      alert(dataResults.Title);
   }
 });
}

Full Example:


jQuery.ajax({
    url: "/SiteName/_api/web/lists/getbytitle('ListName')/items",
    type: "GET",
    headers: { "Accept": "application/json;odata=verbose" },
    success: function(data, textStatus, xhr) {
        var dataResults = data.d.results;
        var resultId = dataResults[0].AuthorId; //AuthorId holds the user id for the single-value Author field
        getUser(resultId);
    },
    error: function(xhr, textStatus, errorThrown) {
        alert("error:"+JSON.stringify(xhr));
    }
});                                                                           
function getUser(id){
var returnValue;
  jQuery.ajax({
   url: "http://YourSite/_api/Web/GetUserById(" + id + ")",
   type: "GET",
   headers: { "Accept": "application/json;odata=verbose" },
   success: function(data) {
           var dataResults = data.d;
      //get login name  
      var loginName  = dataResults.LoginName.split('|')[1];
      alert(loginName);     
      //get display name
      alert(dataResults.Title);
   }
 });
}

I would like to thank Ofir Elmishali for helping me with this post.

Hope you’ll find this post helpful.

SharePoint: Getting “This collection already contains an address with scheme http” Error When Creating a Custom WCF Service



Editor’s note: Contributor Alex Choroshin is a SharePoint Team Leader at Bank Leumi. Follow him @choroshin

Problem:

The problem is caused by the fact that IIS supports specifying multiple IIS bindings per site (which results in multiple base addresses per scheme, in our case HTTP), but a WCF service hosted under a site allows binding to only one base address per scheme.

Multiple addresses example (in our case two):

2013-04-23-SharePointErrorWCFService-01.jpg

Solution:

Create a custom service factory to intercept and remove the additional unwanted base addresses that IIS was providing.

A) Add the custom service factory to your Custom.svc file


<%@ ServiceHost Language="C#" Debug="true" Service="MySolution.Services.CustomService, $SharePoint.Project.AssemblyFullName$"
    Factory="MySolution.Core.CustomServiceHostFactory, $SharePoint.Project.AssemblyFullName$" %>

* Don’t forget to add the assembly full name, $SharePoint.Project.AssemblyFullName$, or you’ll get a “The CLR Type ‘typeName’ could not be loaded during service compilation” error.

B) Create a custom factory by inheriting from ServiceHostFactory and overriding the CreateServiceHost method.

By using the current request’s host name you can check which base address to use; if no matching host name is found, fall back to the first one.


public class CustomServiceHostFactory : ServiceHostFactory
{
    protected override ServiceHost CreateServiceHost(Type serviceType, Uri[] baseAddresses)
    {
        string hostName = HttpContext.Current.Request.Url.Host;
        foreach (Uri uri in baseAddresses)
        {
            if (uri.Host == hostName)
                return new ServiceHost(serviceType, uri);
        }
        //No matching host name found: fall back to the first base address
        return new ServiceHost(serviceType, baseAddresses[0]);
    }
}

public class CustomHost : ServiceHost
{
    public CustomHost(Type serviceType, params Uri[] baseAddresses)
        : base(serviceType, baseAddresses)
    { }

    protected override void ApplyConfiguration()
    {
        base.ApplyConfiguration();
    }
}

Hope you’ll find this post helpful.

SharePoint: Search in XsltListViewWebPart



Editor’s note: Contributor Dmitry Kozlov is the leader of the SharePoint Forms Designer Team at SharePoint Forms Designer and Co-founder of PlumSail. Follow him @spform

I had the following problem in our project: my customer has a long list with many text fields, and I needed to give his users a tool for quick navigation in this list, as well as for searching and editing elements. The best solution was a text filter: when a user enters text into it, the list is automatically filtered by all columns, as follows:

First, I added an XsltListViewWebPart (XLVWP) with a default view; then I added an input text box with a ‘Search’ button:


<input type="text" name="searchText" />
<button type="submit">Search</button>

I configured a new ParameterBinding element in the XLVWP to bind it to my text box:


<ParameterBinding Name="SearchText" Location="Form(searchText)" DefaultValue="" />

In the View parameter, I added the following query:


<Query>
  <Where>
    <Or>
      <Or>
        <Contains>
          <FieldRef Name="Title"/>
          <Value Type="Text">{SearchText}</Value>
        </Contains>
        <Contains>
          <FieldRef Name="Author"/>
          <Value Type="Text">{SearchText}</Value>
        </Contains>
      </Or>
      <Contains>
        <FieldRef Name="PostCategory"/>
        <Value Type="Text">{SearchText}</Value>
      </Contains>
    </Or>
  </Where>
  <OrderBy>
    <FieldRef Name="PublishedDate" Ascending="FALSE"/>
  </OrderBy>
</Query>

Now when I enter text into my filter text box and press the ‘Search’ button, my list is filtered by the Title, Author and Category columns. But I can see 3 important problems:

1. The user has to press the ‘Search’ button to start filtering instead of simply entering the text.

2. The user has to wait for the page to reload.

3. When the user first opens this page, the list is empty because the filter is empty.

I started fixing these problems one by one. First, I added asynchronous updating to my list view: check ‘Enable Asynchronous Update’ and ‘Show Manual Refresh Button’ in the properties of the XLVWP:

Users have a manual refresh button in the right-hand upper corner of the list:

When they enter text into the filter text box and press this button, the XLVWP is filtered without a page reload. I found the postback call behind this button in the IE developer tools:


javascript: __doPostBack('ctl00$m$g_09891d16_ead7_4eb6_9588_3c2eb636c6ea$ctl02','cancel');return false;

I added it to the onkeyup event handler of my filter text box and then removed the ‘Search’ button:

Search:


<input onkeyup="javascript: __doPostBack('ctl00$m$g_09891d16_ead7_4eb6_9588_3c2eb636c6ea$ctl02','cancel');" />

Great: now the list is filtered, without a page update, while the user inputs the text. But the last problem remains: an empty list when the user first comes to the page. To solve it, I used a calculated field in my list, _TitleToFilter, with the formula ="###"&Title. Then I added a default value to the binding parameter: ###


<ParameterBinding Name="SearchText" Location="Form(searchText)" DefaultValue="###" />

In the query I replaced Title column with _TitleToFilter:


<Contains>
  <FieldRef Name="_TitleToFilter"/>
  <Value Type="Text">{SearchText}</Value>
</Contains>

Now, when the filter is empty, the sequence of three hash characters (###) is used as the filter pattern, and all items contain this substring in their _TitleToFilter column.

Ok, but a new problem occurred: when the user clears the filter text box, the list becomes empty. The default value does not apply, because the filter still sends the postback parameter, just with an empty value. So I added a new hidden field to send the filter value to my XLVWP, and I fill this field with JavaScript while the user enters text into the filter:


<input type="hidden" name="searchText" id="searchText" />
Search: <input onkeyup="document.getElementById('searchText').value = this.value == '' ? '###' : this.value; javascript: __doPostBack('ctl00$m$g_09891d16_ead7_4eb6_9588_3c2eb636c6ea$ctl02','cancel');" />

Now it works perfectly. There is no need for the manual refresh button any more; to remove it from the XLVWP you can just uncheck ‘Show Manual Refresh Button’ in its properties.

Create an Association Between Related Entities with FAST for SharePoint 2010 or SharePoint 2013 Search



Editor’s note: Contributor Johnny Tordgeman is CTO of E4D Solutions ltd. Follow him @jtordgeman

When setting an association between entities we use one of the following approaches to get back results:

  • DirectoryLink – Using the DirectoryLink approach, if I search for a customer, I will get his entity as a result, but clicking on the result will take me to the profile page of the entity, where I can see all the products he owns.
  • AttachmentAccessor – Using the AttachmentAccessor approach, if I search for a customer, I will get its parent entities back, meaning the product(s) he/she is related to.

So how do we create this magnificent beast called associated entities? Well, read ahead and find out!

The data source

For the purpose of the tutorial I created a simple database with two tables: Products and Customers. The tables are connected by the ProductId foreign key. Each product has a one-to-many relation to customers. To download the script that generates the tables and demo data click here.

The BCS model

In order to skip the boring parts of creating the BCS model, its entities and all the type descriptors I’ve created a starter solution that you can download right here. Open the solution with Visual Studio 2012 and double click the AssociatedBcsModel.bdcm file. You should get the following BCS model:

The model already includes the minimum required methods (ReadList and ReadItem) for both entities and a DAL that is used to connect to the database. To simplify things, the connection string is defined within the DAL file using the _connection variable:


private static string _connection = "Server=.;Database=Hippodevs;Trusted_Connection=True";

In a real world application we will probably use the Secure Store to hold this kind of information.

Creating the association

To get things rolling let’s create an association. Right click anywhere on the BCS model area and select Add and then Association:

The Association Editor popup appears. This popup is the key for setting an association:

First, we set the association name. Since we want to keep things simple for the sake of this tutorial, let’s leave the default name. Next, we set the source and destination entities. I mentioned in the beginning of the tutorial that our goal is to have a one-to-many relationship between a product and its customers, so in this case product is our source entity and customer is our destination entity.

Next up is the foreign key association check box. Our relationship is based on a foreign key (ProductId), so just leave this check box checked.

Now comes the interesting part: mapping the identifier. This is the step where we tell FAST (or SharePoint 2013 search) which property on our destination entity maps to the identifier of the source entity. In our destination entity we have a product id property (ProductId) which should be matched with the identifier of the source entity, named PrdId. Scroll down the fields list until you spot the following field:


ReadList.returnParameter.CustomerEntityList.CustomerEntity.ProductId

As you can see from the field name, this is the ProductId property of the customer entity. Open the Source ID dropdown next to the field we just spotted and select the one possible option, PrdId. You have now mapped the two entities together! Exciting, isn’t it?

To keep things simple, leave the names of the methods as they are and click the OK button. Your entities should now look as follows:

Notice the nice line in the middle connecting the two entities? That means they are connected. In addition, each of the entities sports a new method, which tells the search engine how to get the ids of the child (or destination) entities and is called during the crawl process.

Coding the ProductEntityToCustomerEntity method

One of the new methods added during the last step is ProductEntityToCustomerEntity. This method tells the search engine the ids of the items we wish to have as the child entities of a specific product entity. The method takes an id as an argument and returns an IEnumerable of the child entities:


public static IEnumerable<CustomerEntity> ProductEntityToCustomerEntity(string prdId)
{
    throw new System.NotImplementedException();
}

In order to get the list of customers who bought a specific product, we set our SQL query as:


Select ID from Customers where ProductID='id'

Add a new method to the DAL file to handle the call to the SQL server as:


public static IEnumerable<CustomerEntity> GetListOfCustomersForProduct(string prdId)
{
    List<CustomerEntity> Entities = new List<CustomerEntity>();
    SqlConnection EntityConnection = null;
    SqlDataReader SqlReader = null;
    try
    {
        //Connect to the database
        EntityConnection = new SqlConnection(_connection);
        //Open the connection (it will be closed in the finally block)
        EntityConnection.Open();
        //Declare the SQL command
        SqlCommand cmd = new SqlCommand();
        cmd.Connection = EntityConnection;
        cmd.CommandText = string.Format("Select ID from Customers where ProductID='{0}'", prdId);
        SqlReader = cmd.ExecuteReader();
        if (SqlReader.HasRows)
        {
            DataTable dt = new DataTable();
            dt.Load(SqlReader);
            foreach (DataRow row in dt.Rows)
            {
                Entities.Add(new CustomerEntity() { CustomerId = row["ID"].ToString(), ProductId = prdId });
            }
        }
        return Entities;
    }
    catch (Exception ex)
    {
        //Write to log and return what we have
        return Entities;
    }
    finally
    {
        //Close the reader
        if (SqlReader != null)
        {
            SqlReader.Close();
        }
        //Close the connection
        if (EntityConnection != null)
        {
            EntityConnection.Close();
        }
    }
}

Go back to ProductEntityToCustomerEntity and change it as follows, so it calls the new method we just created:


public static IEnumerable<CustomerEntity> ProductEntityToCustomerEntity(string prdId)
{
    return DAL.DAL.GetListOfCustomersForProduct(prdId);
}

Setting the AttachmentAccessor property

When we created the association in our BCS model, two new methods were generated for it. How does the search engine know which of them it should call during the crawl process? The answer is simple: by setting the AttachmentAccessor property on the desired association.

Open your BCS model in XML view (right click on it, select Open With… and then XML (Text) Editor). Look for an association named ProductEntityToCustomerEntityAssociationNavigator, which will look as follows:


<Association Name="ProductEntityToCustomerEntityAssociationNavigator" Type="AssociationNavigator" ReturnParameterName="customerEntityList" ReturnTypeDescriptorPath="CustomerEntityList">
    <SourceEntity Name="ProductEntity" Namespace="BcsAssociationDemo.AssociatedBcsModel" />
    <DestinationEntity Name="CustomerEntity" Namespace="BcsAssociationDemo.AssociatedBcsModel" />
</Association>

This is the declaration of the association we created earlier. Notice that its type is AssociationNavigator. If we want this to be the association the search engine takes into consideration, all we have to do is add the AttachmentAccessor property to it:


<Association Name="ProductEntityToCustomerEntityAssociationNavigator" Type="AssociationNavigator" ReturnParameterName="customerEntityList" ReturnTypeDescriptorPath="CustomerEntityList">
  <Properties>
    <Property Name="AttachmentAccessor" Type="System.String">x</Property>
  </Properties>
  <SourceEntity Name="ProductEntity" Namespace="BcsAssociationDemo.AssociatedBcsModel" />
  <DestinationEntity Name="CustomerEntity" Namespace="BcsAssociationDemo.AssociatedBcsModel" />
</Association>

The “Secret Sauce”

We are almost ready for the first crawl, but before that, consider the following: if we leave the BCS model as it is now, the ReadList method of both entities will fire and the search engine will crawl all of the items in both entities. This is exactly what we want for the product entity, but nowhere near what we want for the customers. For customers, we only want to crawl those that are related to a product.

The ReadList method is mandatory for any BCS entity, so if we want to make sure no customer entity gets crawled without being associated to a product, all we have to do is change the ReadList method of the customer entity as follows:


public static IEnumerable<CustomerEntity> ReadList()
{
    //return DAL.DAL.GetListOfCustomers();
    List<CustomerEntity> customers = new List<CustomerEntity>();
    return customers;
}

To the Crawl-Mobile!

Time for the fun part! Deploy the solution to your local SharePoint environment and make sure you give the BCS model permissions, as in any BCS solution. Navigate to your FAST Content SSA and create a new content source based on the Line of Business Data (BCS) solution we just deployed. Go ahead and run a full crawl. You should be all set and ready to go in about 3 minutes.

Head back to your FAST Query SSA and map the newly created productentity.productname crawled property to the Title managed property. Then perform another full crawl.
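If you prefer to script that mapping rather than click through the Query SSA UI, the FS4SP shell can do it; a minimal sketch, assuming the crawled and managed property names used above:

#Run in the FS4SP shell: map the BCS crawled property onto the Title managed property 
$cp = Get-FASTSearchMetadataCrawledProperty -Name "productentity.productname" 
$mp = Get-FASTSearchMetadataManagedProperty -Name "Title" 
New-FASTSearchMetadataCrawledPropertyMapping -ManagedProperty $mp -CrawledProperty $cp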

Once everything is set, head to your FAST search center (or SharePoint 2013’s search center) and search for Playstation 3. You should get the following result:

Nothing unexpected here, right? We search for a product; we get the product details from the product entity.

Now, go ahead and search for a customer name, for example Sandra J. Wynn:

We searched for a customer, but instead of getting the customer details from its entity, we get the details of its parent entity: a product!

If you map additional crawled properties to managed properties and request them back in the core search results web part settings page, they too will be returned!

Epilogue

The BCS model we created is simple, but it definitely shows the strength of working with related entities. Just imagine how much easier it becomes to crawl a complex entity which is connected to many other entities in a one-to-many relationship.

Try mapping a crawled property of the customer entity to a managed property and adding it to the refinement panel. You can now refine the results (product entities) based on customer properties.

To download the finished solution click here.

SharePoint: Can’t Activate Site Collection Feature When Creating New Site From a Custom Web Template



Editor’s note: Contributor Alex Choroshin is a SharePoint Team Leader at Bank Leumi. Follow him @choroshin

The onet.xml file is basically divided into two parts: a "SiteFeatures" element and a "WebFeatures" element. The "SiteFeatures" section holds the site collection scoped features, which are activated only when a site collection is created. The "WebFeatures" section holds the web scoped features, which are activated only when a site is created.
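As a reminder of that structure, here is a trimmed sketch of the relevant part of an onet.xml (the GUIDs are placeholders):

<SiteFeatures>
  <!-- Site collection scoped features: activated only when a site collection is created -->
  <Feature ID="00000000-0000-0000-0000-000000000000" />
</SiteFeatures>
<WebFeatures>
  <!-- Web scoped features: activated whenever a site is created -->
  <Feature ID="00000000-0000-0000-0000-000000000000" />
</WebFeatures>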

Scenario: you created a custom web template and deployed the solution, but when trying to create a site from your custom web template you get the following error: “The site template requires that the feature {GUID} be activated in the site collection”. Of course you can always activate the site collection scoped feature manually but, let’s be serious, you need all the necessary features to be activated automatically.

Solution: when creating a site, trigger the site collection scoped feature from a web scoped feature.

The steps are:

A) Create an empty web scoped feature and in the "FeatureActivated" Event Receiver add the following code:


public override void FeatureActivated(SPFeatureReceiverProperties properties)
        {
            try
            {
                //Ensure that scope is correctly set
                if (properties.Feature.Parent is SPWeb)
                {
                    SPWeb web = (SPWeb)properties.Feature.Parent;
                    foreach (SPFeatureProperty property in properties.Feature.Properties)
                    {
                        Guid featureGuid = new Guid(property.Value);
                        //Verify feature status
                        SPFeature feature = web.Site.Features[featureGuid];
                        if (feature == null)
                        {
                            //Activate site collection scoped feature, if requested and not currently activated
                            web.Site.Features.Add(featureGuid);
                        }
                    }
                }
            }
            catch (Exception ex)
            {
                //Log the exception; swallowing it here keeps site creation from failing
            }
        }

B) In the onet.xml file, inside the “WebFeatures” element, add the following XML:


<WebFeatures>
      <!-- Custom Site collection scoped feature activation -->
       <Feature ID="YourEmptyFeatureGuid">
         <Properties xmlns="http://schemas.microsoft.com/sharepoint/">
           <Property Key="SiteScopedGUID" Value="YourSiteCollectionFeatureID"/>
         </Properties>
       </Feature>
</WebFeatures>

  • In the Feature ID attribute, add your empty web scoped feature’s ID.
  • In the Property Key="SiteScopedGUID" element, add the ID of the site collection feature that you want to activate.

Hope you’ll find this post helpful.

Build a Search Driven Solution with SharePoint 2013 - Part I



Editor’s note: Contributor Nicki Borell is a SharePoint Evangelist & Consultant for Experts Inside. Follow him @NickiBorell.

Part I is about Search Driven in on-premise environments.

Part II will show the options and differences with O365 SharePoint Online.

Search Driven Solutions are not new in SharePoint 2013, but with SP2013 they have reached a new dimension: there are many more out-of-the-box web parts and options to work with content that is in your search index.

Admin Stuff

Why would you use Search Driven? Good question. Let’s ask “why not”? The answer is index latency: Search Driven Solutions are based on the search index, which means that data freshness depends on index freshness. With the new Continuous Crawling feature, and other approaches like event-driven crawling, we can keep the index really up to date; depending on your environment you can obtain index freshness in the scope of 2 minutes or so (a sketch for enabling continuous crawls follows the list below). Some other points in the context of Search Driven Solutions are:

  • Separate presentation from storage
  • Flexible and dynamic
  • Breaking down site collection boundaries
  • Eliminate large list thresholds
  • Allows flexible & dynamic publishing
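As mentioned above, continuous crawls are what keep that latency low on-premise. A minimal sketch for enabling them, assuming a Search Service Application and a content source with the default names:

#SharePoint 2013 Management Shell: enable continuous crawls on a content source 
$ssa = Get-SPEnterpriseSearchServiceApplication "Search Service Application" 
$cs = Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa -Identity "Local SharePoint sites" 
$cs.EnableContinuousCrawls = $true 
$cs.Update()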

Special Data means Special Search and special Search Results

In some cases we don’t have to choose between the normal SharePoint search with a Search Result web part and a full Search Driven Solution; there is a very useful and powerful option in between. SharePoint 2013 brings some new features here. In the context of “Special Data means Special Search and special Search Results” we will now have a closer look at Result Sources and Query Rules.

Result Source

Working with search based solutions, we generally start with a “Result Source”. Result Sources are placed under Site Settings or, if you configure them for the complete farm, in the Search Service Application. A Result Source has some basic parameters:

Protocol: defines where the results come from.

Type: switches between content search and people search.

Query Transformation: gives us the option to narrow down which data is shown by this Result Source, using search syntax. We also have the option of using the Query Builder to define the Query Transformation.

2013-03-19-2013SearchDrivenSolution-01.jpg

To use a Result Source we have to configure the search result web part to use this source. This is simply done in the settings of the result web part:

2013-03-19-2013SearchDrivenSolution-02.jpg

Query Rules

Query Rules are used to manipulate the search query. A Query Rule is always based on a Result Source; that’s why we have to start with one. Query Rules also live under Site Settings or in the Search Service Application. Working with Query Rules, we have two main parameters:

  1. Query Condition: this parameter defines under which condition the Query Rule takes effect. There are several options. The easiest is “Query matches Keyword exactly”, but we can also use the Term Store via the option “Query matches Dictionary exactly”. This brings many powerful options: for example, if you extend the referenced Term Set, you do not need to reconfigure your Query Condition. It can also be useful in a multilanguage environment.

  2. Actions: this section configures what should happen if a query matches. We can configure a Promoted Result, which is similar to the Best Bets we know from SharePoint 2010, and we can place a Result Block.

Result Blocks are a new feature that allows us to place a separate block, containing the data we configured based on our Result Source, at the top of the search result web part. Every Result Block can be configured using the Query Builder. For example, a Result Block showing only pictures would be configured like this:

  • Query: {subjectTerms} contenttype:image
  • Settings: Item Display Template -> Picture Item

Display Templates are also a new feature in SharePoint 2013. They allow us to use different visualizations based on, for example, content type. Click here for more details.

Here is an example:

2013-03-19-2013SearchDrivenSolution-03.jpg

Bringing all this together, we can deliver special search for special data.

2013-03-19-2013SearchDrivenSolution-04.jpg

To get a result like this we had to configure a Result Source, base Query Rules on it, and then use the Result Source in a search result web part. A detailed step-by-step walkthrough is shown in the webcast at the end of the post.

Search Driven Publishing Model

The above solutions are all based on a search query which has to be typed into a search box by a user, or configured as a “fixed keyword query” in the settings of the search result web part. Now let’s see how we can create dynamic pages showing content based on search queries, using the new web part family “Search Driven Content”.

2013-03-19-2013SearchDrivenSolution-05.jpg

As you can see, there are preconfigured web parts for different scopes. The context driven web parts like “Popular Items” and “Recommended Items” are based on search analytics, user context and user activity. Others, like “Pictures” or “Pages”, contain a special visualization based on the content type. The “Search Driven Content” web parts can also be used to visualize search results based on a search query which is typed into a search box. All search web parts can be combined with each other, which is used in configuring the following example:

2013-03-19-2013SearchDrivenSolution-06.jpg

Here you can see the “Search Driven Content” web part for Pictures. In the settings dialog, the Display Template is configured to show “Picture on top, 3 lines on bottom”. Under Property Mappings you can choose which Managed Properties are used to fill the lines. The Refiner web part shown has its “Refinement Target” configured to the “Search Driven Content” Pictures web part; the binding is based on the title of the web part.

Solutions based on the Search API

The Search API allows building apps or other solutions based on the content coming from the search index. For more information about the SharePoint 2013 Search API, look here: LINK

Here is an example based on my demo environment:

http://win-ful28kv4389/_api/search/query?querytext=’contenttype%3Aorbeabike’

Using this query the result looks like this:

2013-03-19-2013SearchDrivenSolution-07.jpg

Using this XML, we built a demo app showing the same data as the search result web part with Result Blocks shown above:

2013-03-19-2013SearchDrivenSolution-08.jpg

Webcast with hands-on system demos:

2013-03-19-2013SearchDrivenSolution-09.png