X Blogs

who you gonna call?

The People picker and domain trusts

clock January 31, 2014 06:58 by author Robi

In the last couple of SharePoint projects I worked in environments where the company was associated with several domains. Since SharePoint has to be configured to work in such environments, I decided to present the issue with two typical scenarios and explain how a SharePoint site must be set up so that the people picker can resolve users from other domains.

Scenario 1 – "Two way trust"

The characteristic of a "Two Way Trust" is that users from domain 1 have rights to access resources in domain 2 and vice versa. This means that users from domain 1 can read data in the Active Directory of domain 2, and users from domain 2 can read data from AD in domain 1. This is very important information, since the configuration of SharePoint Server depends on these characteristics.

Figure 1 shows the scenario where a two way trust is set up between domain 1 and domain 2. SharePoint Server is placed in domain 1 and can be used by users from both domains. The problem with the people picker in this case is that, by default, it searches only the Active Directory domain in which the SharePoint Server is installed. So, if we want to add users from other domains, we need to adjust some settings on the SharePoint Server.

The settings that must be set in this case apply to the entire web application. They cannot be set in the user interface, so it is necessary to open the SharePoint 2013 Management Shell and enter a couple of lines of code.

$wa = Get-SPWebApplication http://portal.domena1.loc

# List the current values for the web application
$wa.PeoplePickerSettings.SearchActiveDirectoryDomains

<# ====================================
   Portal web app
==================================== #>

### Domain 1 ###
$newdomain1 = New-Object Microsoft.SharePoint.Administration.SPPeoplePickerSearchActiveDirectoryDomain

# Register FQDN
$newdomain1.DomainName = 'domena1.loc'

# Register NetBIOS name
$newdomain1.ShortDomainName = 'domena1'

$wa.PeoplePickerSettings.SearchActiveDirectoryDomains.Add($newdomain1)

### Domain 2 ###
$newdomain2 = New-Object Microsoft.SharePoint.Administration.SPPeoplePickerSearchActiveDirectoryDomain

# Register FQDN
$newdomain2.DomainName = 'domena2.loc'

# Register NetBIOS name
$newdomain2.ShortDomainName = 'domena2'

$wa.PeoplePickerSettings.SearchActiveDirectoryDomains.Add($newdomain2)

$wa.Update()

To configure the "People Picker" control, we first save the web application object in a variable. Its current values can be listed with the command $wa.PeoplePickerSettings.SearchActiveDirectoryDomains.

To add new domains to search, we create a new object of type Microsoft.SharePoint.Administration.SPPeoplePickerSearchActiveDirectoryDomain and then set its values. The first value is the FQDN of the domain and the second is the NetBIOS name, the short domain name. The same procedure must be repeated for every domain you want to allow in the people picker.

At the end we have to, of course, save the changes stored in the web application object. The procedure must be repeated on every web application on which we want to change the behavior of the people picker, including Central Administration.
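Since the same settings must be applied everywhere, the per-web-application steps can be wrapped in a loop. This is a sketch only, reusing the domain values from the script above:

```powershell
# Apply the additional people picker domain to every web application,
# including Central Administration
Get-SPWebApplication -IncludeCentralAdministration | ForEach-Object {
    $newdomain = New-Object Microsoft.SharePoint.Administration.SPPeoplePickerSearchActiveDirectoryDomain
    $newdomain.DomainName = 'domena2.loc'      # FQDN
    $newdomain.ShortDomainName = 'domena2'     # NetBIOS name
    $_.PeoplePickerSettings.SearchActiveDirectoryDomains.Add($newdomain)
    $_.Update()
}
```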

Scenario 2 – "One way trust"

"One way trust" is a way of linking domains where one domain trusts another. For example, I used a one way trust between domain 1 and domain 2, where domain 1 trusts domain 2, while domain 2 does not trust domain 1. To put it simply, users from domain 2 can access data in domain 1, while users from domain 1 cannot access data in domain 2. Such a scenario is widely used for intranet and extranet environments, where users from the intranet domain can access the extranet domain, while the reverse is not possible.


In a scenario with a one way trust, we can use the same script as for bidirectional trusts, with the difference that we need to provide credentials for access: the username and password of an account that is able to access resources in domain 2.




Creating an app password

Stsadm -o setapppassword -password 'Pa$$w0rd'

# ========================================================

$wa = Get-SPWebApplication http://portal.domena1.loc

# List the current values for the web application
$wa.PeoplePickerSettings.SearchActiveDirectoryDomains

<# ====================================
   Portal web app
==================================== #>

### Domain 1 ###
$newdomain1 = New-Object Microsoft.SharePoint.Administration.SPPeoplePickerSearchActiveDirectoryDomain

# Register FQDN
$newdomain1.DomainName = 'domena1.loc'

# Register NetBIOS name
$newdomain1.ShortDomainName = 'domena1'

$wa.PeoplePickerSettings.SearchActiveDirectoryDomains.Add($newdomain1)

### Domain 2 ###
$newdomain2 = New-Object Microsoft.SharePoint.Administration.SPPeoplePickerSearchActiveDirectoryDomain

### Account rights settings ###
# Account with rights to read from domain 2 (account name is illustrative)
$user = 'domena2\sp_adaccess'
$pass = ConvertTo-SecureString 'Pa$$w0rd' -AsPlainText -Force
$credentials = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $user, $pass

# Register FQDN
$newdomain2.DomainName = 'domena2.loc'

# Register NetBIOS name
$newdomain2.ShortDomainName = 'domena2'

# Set the credentials used to query domain 2
$newdomain2.LoginName = $credentials.UserName
$newdomain2.SetPassword($credentials.Password)

$wa.PeoplePickerSettings.SearchActiveDirectoryDomains.Add($newdomain2)

$wa.Update()


Because in this scenario the application pool account does not have rights to access resources in domain 2, we need to set a username and password for access to Active Directory objects in domain 2.

The first step is to create the app password for storing credentials on all servers in our farm. The command Stsadm -o setapppassword -password 'Pa$$w0rd' creates a registry key from which the app pool account can read the credential key.

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Shared Tools\Web Server Extensions\15.0\Secure

The next step is to create a connection for each domain specifically and to set up a user account for connecting to the specified domain.

For this purpose you can use the account that synchronizes data between SharePoint Server and Active Directory, provided that you run the User Profile Synchronization Service. Otherwise, create a normal service account that will be used exclusively for reading data from the specified domain.

The last step is to give the application pool account permission to read the AppCredentialKey. The AppCredentialKey registry key is limited to certain local server groups, and if the permissions are not set up correctly, the event logs fill up with "Registry access is not allowed" errors and users are not resolved in the people picker. The users and groups that have access to the registry key by default are:

  • System
  • Network Service
  • WSS_Restricted_WPG_V4
  • Administrators

In order to grant the application pool account access to the AppCredentialKey, you can add the service account to the WSS_Restricted_WPG_V4 group. Accounts that must be in this group are:

  • Application pool account for the web application
  • The Sandbox service account
  • The service account of the Central Administration – provided that it is not in the Administrators group.
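The group membership can be added from an elevated prompt on each SharePoint server; a sketch, with an illustrative account name:

```powershell
# Add the web application's app pool account to the local group
# (run on every SharePoint server; the account name is illustrative)
net localgroup WSS_Restricted_WPG_V4 domena1\sp_AppPool /add
```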

SharePoint configuration can be quite challenging in some scenarios so I hope that this post will help SharePoint admins ease their work and enhance their understanding of SharePoint. This configuration also applies for SharePoint 2010.


Rob Vončina

SharePoint Server MVP

Export – import alerts

clock November 10, 2013 03:23 by author Robi

In my previous blog post I explained how you can troubleshoot alerts. In this one, I'm just going to post the scripts I used for exporting and importing all alerts in a site collection.

Here is the script for exporting all alerts in a site collection to a csv file:

$site = Get-SPSite "http://2013portal"

$alertResultsCollection = @()

foreach ($web in $site.AllWebs) {

    foreach ($alert in $web.Alerts) {

        $alertURL = $web.Url + "/" + $alert.ListUrl

        $alertResult = New-Object PSObject
        $alertResult | Add-Member -Type NoteProperty -Name "WebUrl" -Value $web.Url
        $alertResult | Add-Member -Type NoteProperty -Name "ListURL" -Value $alertURL
        $alertResult | Add-Member -Type NoteProperty -Name "AlertTitle" -Value $alert.Title
        $alertResult | Add-Member -Type NoteProperty -Name "ListUrl" -Value $alert.ListUrl
        $alertResult | Add-Member -Type NoteProperty -Name "List" -Value $alert.List
        $alertResult | Add-Member -Type NoteProperty -Name "DeliveryChannel" -Value $alert.DeliveryChannels
        $alertResult | Add-Member -Type NoteProperty -Name "AlertType" -Value $alert.AlertType
        $alertResult | Add-Member -Type NoteProperty -Name "EventType" -Value $alert.EventType
        $alertResult | Add-Member -Type NoteProperty -Name "Frequency" -Value $alert.AlertFrequency
        $alertResult | Add-Member -Type NoteProperty -Name "AlertTime" -Value $alert.AlertTime
        $alertResult | Add-Member -Type NoteProperty -Name "SubscribedUser" -Value $alert.User

        $alertResultsCollection += $alertResult
    }
}

# Export to CSV
$alertResultsCollection | Export-Csv C:\Users\sp2013_farm_admin\Desktop\Alerts.csv


And here is the script you can use to import all alerts in one site collection from csv file:


Import-Csv C:\Users\sp2013_farm_admin\Desktop\Alerts.csv | ForEach-Object {

    # Read the values exported to the CSV file
    $webUrl = $_.WebUrl
    $subscribedUser = $_.SubscribedUser
    $alertTitle = $_.AlertTitle
    $listTitle = $_.List
    $deliveryChannel = $_.DeliveryChannel
    $eventType = $_.EventType
    $frequency = $_.Frequency
    $alertTime = $_.AlertTime

    $web = Get-SPWeb $webUrl

    $user = $web.EnsureUser($subscribedUser)
    $newAlert = $user.Alerts.Add()
    $newAlert.Title = $alertTitle

    # The List property expects an SPList object, so look it up by title
    $newAlert.List = $web.Lists[$listTitle]
    $newAlert.DeliveryChannels = [Microsoft.SharePoint.SPAlertDeliveryChannels]::$deliveryChannel
    $newAlert.EventType = [Microsoft.SharePoint.SPEventType]::$eventType
    $newAlert.AlertFrequency = [Microsoft.SharePoint.SPAlertFrequency]::$frequency

    if ($frequency -ne "Immediate") {
        # Daily and weekly alerts also need the alert time
        $newAlert.AlertTime = $alertTime
    }

    $newAlert.Update()
    $web.Dispose()
}

Hope it helps.

Robi Vončina

"Alert Me" feature and SharePoint 2013

clock October 31, 2013 22:58 by author Robi


In the previous few projects we had some issues involving alerts after upgrading or migrating to SharePoint 2013, so I wanted to explain how you can help yourself out when troubleshooting alerts in SP2013.

I described here a few most common issues we were faced with.

1. The "Alert Me" Feature is not available

If you happen to be in a document library and you do not find the "Alert Me" command, the most likely cause is that "Outgoing E-Mail Settings" has not been set up in Central Administration. Outgoing email can be found under System Settings, Configure Outgoing E-Mail Settings, or at the URL http://[CA_URL]/_admin/globalemailconfig.aspx.

Another option to configure outgoing email is, of course, PowerShell. The script with which you can set it up is as follows:


# ===== Set variables =====

$SMTPServer = "devmail"
$emailAddress = "SP2013@dev.local"
$replyToEmail = "robi@dev.local"

# ===== Outgoing Email =====

$loadasm = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
$spGlobalAdmin = New-Object Microsoft.SharePoint.Administration.SPGlobalAdmin
$spGlobalAdmin.UpdateMailSettings($SMTPServer, $emailAddress, $replyToEmail, 65001)


The result of our setting is then visible in Central Administration and on the ribbon in lists and libraries, where the Alert Me button appears:

2. You are not able to create alerts for domain Email enabled security groups or distribution groups

In SharePoint 2013 we came across a very interesting case where you are not able to subscribe a domain security group to alerts through the user interface. In a document library or list you click the Alert Me button and a dialog box opens where you can enter the user you would like to subscribe to alerts. In previous versions of SharePoint, names of distribution groups or email enabled security groups were resolved, while in the new version domain groups do not work; you get "No matching results".

If you want to create alerts for your domain groups, you must use PowerShell:



<# ================================================

Options for setting the EventType:

SPEventType [Add, Modify, Delete, Discussion, All]

Options for setting the alert frequency;
in the case of Daily or Weekly it is necessary
to also set the alert time:

SPAlertFrequency [Immediate, Daily, Weekly]

================================================ #>

$web = Get-SPWeb "http://2013portal"

# List to subscribe the group to (list name is illustrative)
$list = $web.Lists["Documents"]

$user = $web.EnsureUser('DEV\skupinaemail')
$newAlert = $user.Alerts.Add()
$newAlert.Title = $list.Title
$newAlert.List = $list
$newAlert.DeliveryChannels = [Microsoft.SharePoint.SPAlertDeliveryChannels]::Email
$newAlert.EventType = [Microsoft.SharePoint.SPEventType]::All
$newAlert.AlertFrequency = [Microsoft.SharePoint.SPAlertFrequency]::Daily

# Daily alerts also need the alert time
$newAlert.AlertTime = [DateTime]::Today.AddDays(1).AddHours(8)

$newAlert.Update()

To verify whether the alerts have been set up successfully, open Site Settings / User alerts.

For the purposes of testing and displaying alerts, I created another alert for the same group; the only difference is that in this case I used the Immediate alert frequency.

3. The user has set an alert on item adding in a document library, but an immediate alert is not sent

For this example it is necessary to understand how alerts work. SharePoint content databases have a table called EventCache. This is the table where SharePoint writes events that are important to the search service application, alerts and so on; in short, the table serves as a temporary repository of current events, which is then used for subsequent operations. In the case of immediate alerts, all subsequent operations are taken care of by a timer job, which can be found in Central Administration and is called Immediate Alerts, or you can list the alert timer jobs with a simple PowerShell query:

Get-SPTimerJob | ? {$_.Name -like "*alert*"}


As we can see from the results of the command, the timer job triggers every 5 minutes. The timer job queries the EventCache table in the content databases. If the table contains records for sending alerts, the alert email is sent and the record is deleted from the EventCache table.

Warning! Running a query directly to the SharePoint database is not supported.

To see what EventCache table contains, you can run this SQL Query against your Content DB:

select * from [SP13_Content_2013Portal].[dbo].[EventCache] (nolock) order by [TimeLastModified] desc


So, if I make a change to a document in the document library and run the query, I get the following record as a result:

The query result can also be the first verification that alerts work properly. For subsequent verifications of alerts, we have to use ULS Log Viewer and set up Verbose logging for alerts in Central Administration. Alerts logging can be found in the SharePoint Foundation category.

In the ULS Log Viewer, you have the option of filtering ULS logs. The easiest way to get all records related to the alerts, is to set the filter, as shown in the image:

If you want to see the records in the ULS logs and do not want to wait for 5 minutes, you must manually trigger Immediate Alerts Timer Job.

In my case, I'm exploring what's happening with the alerts that user Uroš is subscribed to. I uploaded a new document to the document library and started the Immediate Alerts timer job with PowerShell:

Get-SPTimerJob | ? {$_.name -like "*alert*"} |Start-SPTimerJob


In ULS logs you can now see records of Alerts operations:

In this case, I see that alerts work as expected. But what happens if we have a library with the feature "Require documents to be checked out before they can be edited"?

Repeat the process: upload a document to the document library and watch in the ULS logs what's going on with alerts. The following entries appear:

Voila, I can see that the alerts are being processed correctly. However, because loading a document into the document library is a 2-step process, upload and entering metadata, it can be concluded that while a document is being uploaded it is treated as checked out. A record is written to the EventCache table where the state of the document is registered as checked out.

With that being said, we can conclude that when the feature "Require documents to be checked out before they can be edited" is enabled on a document library and a user is subscribed to alerts on document added, alerts are not going to be sent out, as SharePoint treats these documents as checked out, or as the first minor version.


In this article I tried to describe some typical errors which may occur in SharePoint 2013 in conjunction with the Alert Me feature and, what's more important, I tried to demonstrate how to debug and solve them.

After the troubleshooting, don't forget to set the logging level for the Alerts back to the default settings.
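If the Verbose level was enabled with Set-SPLogLevel, the default can be restored with Clear-SPLogLevel; a sketch of both steps:

```powershell
# Raise the Alerts category to Verbose while troubleshooting
Set-SPLogLevel -Identity "SharePoint Foundation:Alerts" -TraceSeverity Verbose

# ...and restore the default logging level when done
Clear-SPLogLevel -Identity "SharePoint Foundation:Alerts"
```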


Robi Vončina
SharePoint Server MVP


Upgrade Video

clock August 27, 2013 19:34 by author Robi

For the Ukrainian SharePoint Conference I recorded this video to present it at my session. I decided to make it public now so please, sit back, relax and enjoy it.


Upgrading to SharePoint 2013 Video

Upgrading to SharePoint 2013 – part 2

clock August 27, 2013 00:19 by author Robi

In one of the previous blog posts I talked about basic upgrade scenarios. We looked at how to perform the database attach method and then how to upgrade a site collection.

In this article, I'll show how a variety of tools can give you better insight into the farm you are upgrading. You'll also see how to monitor the upgrade, which site collections are in the process of upgrading and how the upgrade "queue" can be edited. In the end, I'll explain how to install custom solutions and what's new when installing custom solutions on a SharePoint farm.

First of all, I would like to introduce two essential tools for SharePoint administrators, which can be found on CodePlex:

SharePoint Feature Administration and Clean Up

SharePoint Feature Administration and Clean Up is a tool that you can use before you make a backup of the database you want to upgrade. The tool shows all features at the web application, site collection or individual site level, and it does so without any installation on the server. All you have to do is start the application as a user that has sufficient rights on the entire farm.

The application is particularly useful because it finds features that are either corrupt or whose solution is not installed, and this way you can save a lot of debugging and browsing after the upgrade to SharePoint 2013.

As shown in the following illustration, the application shows which feature is malfunctioning and, if you select a broken feature, offers a variety of options, among which are:

  • Uninstall
  • Activate
  • Find where this feature is activated in the farm


Example: a broken feature that we would like to remove from the farm. Select the feature, click Uninstall and confirm the dialogs.




The application log shows us what has been done and on which object.

The application also offers a very useful function called "Find faulty features in the farm". This function enumerates the entire farm, locates broken features and removes them. With one click of a button you can clean your farm, and the database upgrade will report a lot fewer errors than if you don't clean it.

SharePoint Manager

SharePoint Manager is also an application that does not need to be installed on the server; it is enough to run it on a server with elevated privileges. The application allows you to look at the whole structure of objects in your SharePoint 2010/2013 environment.

For each object in the farm you can view the properties of the selected object in the details view in the right window, and if you select a field object you can also see the XML schema of the object. You can also change the values of some properties and save them back to the SharePoint configuration database.

Because the application offers a very detailed look at the structure of the SharePoint environment, it is very useful for developers, and I highly recommend it for the maintenance and administration of SharePoint 2010/2013 as well.

Manage site collection upgrade

In the previous article I introduced the upgrade process, including site collection upgrades. A site collection upgrade is run with the following PowerShell command:

Upgrade-SPSite -Identity http://upgrade -VersionUpgrade


The command runs the site collection upgrade. You can add another parameter, -QueueOnly, whereby the site collection is added to the upgrade queue. Queue maintenance is done by a timer job, which runs every 2 minutes and starts the upgrade of the next site collection in the queue. Regardless of whether we add the -QueueOnly parameter or not, each site collection we upgrade appears in the list obtained with the command:

Get-SPSiteUpgradeSessionInfo -ContentDatabase [database] -ShowInProgress -ShowCompleted -ShowFailed | select SiteId, Status


If we are upgrading multiple site collections located in more than one database, you can get the list of upgrading site collections as follows:


# $imenaBaz holds the names of the content databases
foreach ($baza in $imenaBaz) {
    Get-SPSiteUpgradeSessionInfo -ContentDatabase $baza -ShowInProgress -ShowCompleted -ShowFailed | select SiteId, Status
}

If you would like to remove a site collection from the list, you can use the command:

Remove-SPSiteUpgradeSessionInfo -Identity http://upgrade


Server load throttling during upgrade

During the process of upgrading site collections, we can also control how many site collections are being upgraded simultaneously. For server load management, two settings are available: the first at the web application level, the other on the content database.

$wa = Get-SPWebApplication http://upgrade
$wa.SiteUpgradeThrottleSettings


The command shows how many site collections the web application's application pool can upgrade simultaneously. If the number is exceeded, a site collection is automatically placed in the queue, regardless of whether we add the -QueueOnly parameter or not. It is also automatically placed in the queue if it exceeds the SubWebCountLimit value or the UsageStorageLimit. By default, these two values are set to 10. You can easily increase that number, depending on your server hardware, as the limits are set really low.
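As a sketch (using the property names exposed by the SiteUpgradeThrottleSettings object), the web application limit can be raised and saved like this:

```powershell
$wa = Get-SPWebApplication http://upgrade

# Raise the number of simultaneous site collection upgrades per app pool
# (the value 20 is illustrative; tune it to your hardware)
$wa.SiteUpgradeThrottleSettings.AppPoolConcurrentUpgradeSessionLimit = 20
$wa.Update()
```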

The second option for controlling concurrent upgrades is at the level of the individual database in which the site collections are located. The current setting can be listed with the following commands:

$db = Get-SPContentDatabase SP2013_Content_Upgrade
$db.ConcurrentSiteUpgradeSessionLimit



By default, the number of concurrent site collection upgrades is set to 10. Depending on the specifications of your database servers, this setting can easily be increased.
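Raising the per-database limit follows the same pattern; a sketch, with an illustrative value:

```powershell
# Allow more simultaneous site collection upgrades in this content database
$db = Get-SPContentDatabase SP2013_Content_Upgrade
$db.ConcurrentSiteUpgradeSessionLimit = 20
$db.Update()
```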

Custom solutions upgrade management

Because SharePoint is a platform that can be used for the development of business applications, environments with a lot of installed custom solutions are very common. For a successful upgrade to SharePoint 2013, it is very important that all the solutions installed on SharePoint 2010 are successfully transferred to the new version. The best scenario for custom solutions is to get the source code of the installed solutions, which you can then upgrade for the new version, or an already upgraded WSP file. However, since SharePoint 2013 also contains the SharePoint 2010 binaries, 2010 solutions might already work on 2013.

It is necessary to be aware of some changes to the security settings of custom solutions. In the past, solutions installed into the BIN folder were treated as "Partially Trusted Code", while in the latest version of SharePoint these solutions are considered "Full Trust".

To install a custom solution with PowerShell, additional parameters are available. To add a SharePoint solution to the solution store, use the command:

Add-SPSolution -LiteralPath "C:\Users\sp_FarmAdmin\Desktop\WSPs\test.wsp"


To install the solution, use the command:

Install-SPSolution -Identity test.wsp -GACDeployment -Force -Verbose -CompatibilityLevel All


You can use the new CompatibilityLevel parameter to control where the solution is deployed:




' AllVersions '


Installs the solution to both 14 and 15 directories (in the 2013 product, subject to change and future versions)



The "Old"

Installs the solution to 14 directories only (in the 2013 product, subject to change and future versions)




Installs the solution to 15 directories only (in the 2013 product, subject to change and future versions)



1. 14 Hive path-_/layouts/Custompage .aspx

2. the path – the Hive _ 15/layouts/15/Custompage.aspx

The Global Assembly Cache

One more thing to be aware of is the GAC. Since the release of .NET Framework 4.0, the GAC is divided into two parts, one for each CLR: C:\Windows\assembly is the location for .NET versions 1.0 to 3.5, and C:\Windows\Microsoft.NET\assembly is the location for all DLLs built with .NET Framework 4.0.

I hope I've written some useful guidelines for a successful upgrade. In the event of any additional questions, or if you need help with your upgrade, you can contact me at: robi@kompas-xnet.si.

SharePoint Designer 2013 Crashing when using select people/groups from SharePoint site

clock August 24, 2013 01:56 by author Robi

Yesterday I was working on an issue where a client wanted to use an action in SharePoint Designer 2013, like Send Email to users, and when Select People/Groups from SharePoint Site was clicked, SharePoint Designer just crashed.

The same thing happened in InfoPath Designer 2013.


After investigating error logs on SharePoint Web Servers, I found this error:

System.ServiceModel EventID: 3

WebHost failed to process a request.

Sender Information: System.ServiceModel.ServiceHostingEnvironment+HostingManager/40939712

Exception: System.ServiceModel.ServiceActivationException: The service '/_vti_bin/spclaimproviderwebservice.svc' cannot be activated due to an exception during compilation. The exception message is: This collection already contains an address with scheme http. There can be at most one address per scheme in this collection. If your service is being hosted in IIS you can fix the problem by setting 'system.serviceModel/serviceHostingEnvironment/multipleSiteBindingsEnabled' to true or specifying 'system.serviceModel/serviceHostingEnvironment/baseAddressPrefixFilters'.

Parameter name: item. ---> System.ArgumentException: This collection already contains an address with scheme http. There can be at most one address per scheme in this collection. If your service is being hosted in IIS you can fix the problem by setting 'system.serviceModel/serviceHostingEnvironment/multipleSiteBindingsEnabled' to true or specifying 'system.serviceModel/serviceHostingEnvironment/baseAddressPrefixFilters'.

Parameter name: item

at System.ServiceModel.UriSchemeKeyedCollection.InsertItem(Int32 index, Uri item)

at System.Collections.Generic.SynchronizedCollection`1.Add(T item)

at System.ServiceModel.UriSchemeKeyedCollection..ctor(Uri[] addresses)

at System.ServiceModel.ServiceHost..ctor(Type serviceType, Uri[] baseAddresses)

at System.ServiceModel.Activation.ServiceHostFactory.CreateServiceHost(Type serviceType, Uri[] baseAddresses)

at System.ServiceModel.Activation.ServiceHostFactory.CreateServiceHost(String constructorString, Uri[] baseAddresses)

at System.ServiceModel.ServiceHostingEnvironment.HostingManager.CreateService(String normalizedVirtualPath, EventTraceActivity eventTraceActivity)

at System.ServiceModel.ServiceHostingEnvironment.HostingManager.ActivateService(ServiceActivationInfo serviceActivationInfo, EventTraceActivity eventTraceActivity)

at System.ServiceModel.ServiceHostingEnvironment.HostingManager.EnsureServiceAvailable(String normalizedVirtualPath, EventTraceActivity eventTraceActivity)

--- End of inner exception stack trace ---

at System.ServiceModel.ServiceHostingEnvironment.HostingManager.EnsureServiceAvailable(String normalizedVirtualPath, EventTraceActivity eventTraceActivity)

at System.ServiceModel.ServiceHostingEnvironment.EnsureServiceAvailableFast(String relativeVirtualPath, EventTraceActivity eventTraceActivity)

Process Name: w3wp

Process ID: 7720


This error appears because I had set up two web site bindings in IIS for the web application: one was http://intranet and the other one was http://intranet.kompas-xnet.si.

As it turns out, SharePoint 2013 is not particularly happy about that.


As the error description suggests, I tried to set up multipleSiteBindings in the web.config of the web application. You can find more details here:

Supporting Multiple IIS Site Bindings

I modified the web.config as follows:


<serviceHostingEnvironment aspNetCompatibilityEnabled="true" multipleSiteBindingsEnabled="true"/>



With multipleSiteBindingsEnabled set to true, SharePoint Designer worked without any issues. But what I found out was that the event logs on the web servers were starting to fill up with another error:

System.ServiceModel EventID: 3

WebHost failed to process a request.

Sender Information: System.ServiceModel.ServiceHostingEnvironment+HostingManager/62297830

Exception: System.ServiceModel.ServiceActivationException: The service '/_vti_bin/client.svc' cannot be activated due to an exception during compilation. The exception message is: When 'system.serviceModel/serviceHostingEnvironment/multipleSiteBindingsEnabled' is set to true in configuration, the endpoints are required to specify a relative address. If you are specifying a relative listen URI on the endpoint, then the address can be absolute. To fix this problem, specify a relative uri for endpoint 'http://intranet.kompas-xnet.si/_vti_bin/client.svc'.. ---> System.InvalidOperationException: When 'system.serviceModel/serviceHostingEnvironment/multipleSiteBindingsEnabled' is set to true in configuration, the endpoints are required to specify a relative address. If you are specifying a relative listen URI on the endpoint, then the address can be absolute. To fix this problem, specify a relative uri for endpoint 'http://intranet.kompas-xnet.si/_vti_bin/client.svc'.

at System.ServiceModel.Activation.ApplyHostConfigurationBehavior.ThrowIfAbsolute(Uri uri)

at System.ServiceModel.Activation.ApplyHostConfigurationBehavior.FailActivationIfEndpointsHaveAbsoluteAddress(ServiceHostBase service)

at System.ServiceModel.Description.DispatcherBuilder.ValidateDescription(ServiceDescription description, ServiceHostBase serviceHost)

at System.ServiceModel.Description.DispatcherBuilder.InitializeServiceHost(ServiceDescription description, ServiceHostBase serviceHost)

at System.ServiceModel.ServiceHostBase.InitializeRuntime()

at System.ServiceModel.ServiceHostBase.OnOpen(TimeSpan timeout)

at Microsoft.SharePoint.Client.Services.MultipleBaseAddressWebServiceHost.OnOpen(TimeSpan timeout)

at System.ServiceModel.Channels.CommunicationObject.Open(TimeSpan timeout)

at System.ServiceModel.ServiceHostingEnvironment.HostingManager.ActivateService(ServiceActivationInfo serviceActivationInfo, EventTraceActivity eventTraceActivity)

at System.ServiceModel.ServiceHostingEnvironment.HostingManager.EnsureServiceAvailable(String normalizedVirtualPath, EventTraceActivity eventTraceActivity)

--- End of inner exception stack trace ---

at System.ServiceModel.ServiceHostingEnvironment.HostingManager.EnsureServiceAvailable(String normalizedVirtualPath, EventTraceActivity eventTraceActivity)

at System.ServiceModel.ServiceHostingEnvironment.EnsureServiceAvailableFast(String relativeVirtualPath, EventTraceActivity eventTraceActivity)

Process Name: w3wp

Process ID: 8160


So, in the end I just removed the multipleSiteBindingsEnabled setting from web.config and removed the second web site binding from IIS. Now everything works as expected.

Hope it helps someone when investigating SharePoint Designer issues.



Robi Vončina

Upgrading to SharePoint 2013

clock August 8, 2013 18:44 by author Robi

Upgrading from SharePoint 2007 to SharePoint 2010 was significantly improved, and there were no major issues like those seen when upgrading from SharePoint 2003 to SharePoint 2007. The upgrade to SharePoint 2013 has also been vastly improved, but it brings some new things that I would like to point out.

Figure 1: 5 major steps in upgrade process

Every SharePoint upgrade project should consist of these 5 major steps, but I'm not going to discuss what every step of every upgrade project should be. I'm going to concentrate on what is new and how you can actually perform an upgrade.

The only upgrade method available in SharePoint 2013, or should I say the only supported method for version-to-version upgrades, is the database attach method. This means you need to set up a new farm side by side with your existing SharePoint 2010 environment; earlier versions of SharePoint are not supported for direct upgrade to SharePoint 2013. If you have an earlier version of SharePoint and do not want to take the intermediate upgrade step to SharePoint 2010, you can use 3rd party migration tools such as Metalogix, Idera, AvePoint,…

The database attach method means backing up your SharePoint 2010 databases, restoring them to the SQL Server used by SharePoint 2013 and mounting them to your new SharePoint environment. But not every database supports a database attach upgrade from the previous version of SharePoint.

Supports database attach upgrade:

  • Content databases
  • Project databases
  • Search admin database
  • Social database
  • Profile database
  • Managed Metadata database
  • Secure store database

Does not support database attach upgrade:

  • Configuration database
  • Search Index database
  • Sync database


Firstly, to perform a database attach upgrade of a content database, we need to create a new web application in SharePoint 2013. As a best practice I would recommend removing the newly created content database, as we are going to attach our own. If you have already restored the content database to your SQL Server, you can then open SharePoint 2013 Management Shell and test the content database against the newly created web application:

Test-SPContentDatabase -Name SP02_WSS_Content_Upgrade -WebApplication http://2013upgrade/

In your PowerShell window you will get a result from your test back.

Figure 2: Test-SPContentDatabase

You should pay particular attention to the upgrade blocking category: if any issue there is set to True, your upgrade process won't be able to complete. If you test your content database against a web application that has a different authentication provider than your content db, you will get an error like the following:

Figure 3: Test-SPContentDatabase against Claims Web Application

This may happen if your SharePoint 2010 environment was configured with Classic mode authentication and you created a Claims web application in SharePoint 2013, which is the default authentication method.

To create a classic mode web application in SharePoint 2013 you need to use PowerShell, with a command similar to this:

New-SPWebApplication -Name SP2013_Upgrade -ApplicationPool SP2013Upgrade_AppPool -ApplicationPoolAccount dev\sp2013_app_pool -HostHeader 2013Upgrade -Port 80 -Url http://2013upgrade -DatabaseName SP2013_Content_2013Upgrade -Verbose

Once all issues from your test are resolved, you can start the upgrade process with the following command:

Mount-SPContentDatabase -Name SP02_WSS_Content_Upgrade -WebApplication http://2013upgrade/ -Verbose

When the command completes, you are notified about the success of your upgrade process. In my case some errors were thrown because of missing features for Report Server:

Figure 4: Mount-SPContenDatabase

You can always check the progress of your upgrade process in Central Administration: go to Central Administration, Upgrade and Migration, Check upgrade status, and you will see the status of all your previous upgrade sessions with some basic data.

Figure 5: Central Admin Check Upgrade Status

One of the most important pieces of information you can get there is where your upgrade log is located. If you go to your logs folder, you will find not one but 2 log files: one is the whole upgrade log and the other contains only the errors.

One of the new things I must mention here is that the log files are now in ULS format, which means they are easier to read and review. A sample of the ULS log is shown here:

Figure 6: Upgrade Error Log

With the database upgrade completed, the SharePoint upgrade is not finished yet. In SharePoint 2010, after an upgrade we had to do a so-called Visual Upgrade, which upgraded the visual experience. In SharePoint 2013 there is no visual upgrade; instead we need to do a Deferred Site Collection upgrade. This means that site collection upgrades are separated from database upgrades, and every site collection admin can now control when he or she would like their site collection to be upgraded. Before the site collection upgrade you are actually still using the SharePoint 2010 binaries and consequently cannot use any of the new features SharePoint 2013 has to offer.

Figure 7: Site collection after database upgrade


After the upgrade, when you connect to your upgraded sites, you can see that you are still in SharePoint 2010 mode. The only difference you will notice is a red notification bar at the top of the portal.

To be able to use all SharePoint 2013 features you need to do a deferred site collection upgrade, which, as mentioned before, is a new concept. With that said, there are also two more new concepts introduced for site collection upgrades, called Upgrade evaluation site collections and the Site Collection Health Checker.

Figure 9: Links for Health checks and site collection upgrade

Site collection health checks should be run before you try to upgrade a site collection. This tool will give you information about any potential issues, which you can then address before you start upgrading your site collection.

Figure 10: Health check results


Figure 11: Upgrade site collection or create evaluation site collection

Before you actually run the site collection upgrade, you have one more option to test whether everything works as expected: the evaluation site collection. This is essentially a read-only copy of your existing site collection that you can use to review the new interface and new functionality. Evaluation sites automatically expire and are deleted after a certain amount of time, which by default is 30 days.
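An upgrade evaluation site collection can also be requested from PowerShell; a minimal sketch, assuming the same site URL used elsewhere in this article:

# Request a read-only SharePoint 2013 evaluation copy of the site collection
Request-SPUpgradeEvaluationSite -Identity http://2013upgrade

Note that the request is queued and the copy is created by a timer job, so the evaluation site does not appear immediately.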

Finally, we are ready to start the site collection upgrade. As mentioned before, you can do this by clicking the Upgrade this site collection button seen in figure 9. The second option is, of course, PowerShell.

Upgrade-SPSite -Identity http://2013upgrade -VersionUpgrade


Figure 12: Maintenance logs

After the upgrade process is completed you can also check the logs to see if there are any issues that need to be resolved. These log files are located at http://2013upgrade/_catalogs/MaintenanceLogs.

If everything went smoothly, you are now able to enjoy all the new features SharePoint 2013 has to offer. There are of course some other details that should be discussed, especially in large environments with plenty of site collections, but those are out of scope for this article.


Crawling MySites on https

clock August 7, 2013 00:10 by author Robi

Small trick…

If you are going to crawl My Sites which are configured on https, you need to set your content source to crawl sps3s://mysites instead of sps3://mysites.

If you try to crawl sps3:// when your My Sites web application is configured with SSL, you get the »Object not found« error in your crawl logs.
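If you prefer to fix this in PowerShell rather than in the Search UI, something along these lines should work (a sketch; the content source name "Local SharePoint Sites" and the start addresses are assumptions, adjust them to your farm):

$ssa = Get-SPEnterpriseSearchServiceApplication
# switch the My Sites start address to the SSL people profile protocol
Set-SPEnterpriseSearchCrawlContentSource -Identity "Local SharePoint Sites" -SearchApplication $ssa -StartAddresses "https://portal,sps3s://mysites"

Keep in mind that -StartAddresses replaces the whole list of start addresses for the content source, so include all of them, not just the My Sites entry.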


Hope it helps!!!

UserProfileApplicationProxy.ApplicationProperties ProfilePropertyCache does not have a1097ef1-f31e-4145-88c1-ca2fd5c8c836

clock March 8, 2013 00:59 by author Robi


I'm playing around with SharePoint Server 2013, configuring service applications, trying out different installs and, well, everything I can think of.


A few days ago I tried configuring synchronization of profile pictures from AD, which of course worked without any issues. But when I tried to run the command:

Update-SPProfilePhotoStore -CreateThumbnailsForImportedPhotos 1 -MySiteHostLocation http://myportal/

I got the following error:

Update-SPProfilePhotoStore : UserProfileApplicationNotAvailableException_Logging :: UserProfileApplicationProxy.ApplicationProperties ProfilePropertyCache does not have a1097ef1-f31e-4145-88c1-ca2fd5c8c836 At line:1 char:1

I was trying to find a solution on the web but could not find one describing the exact same issue. Then one article pointed me in the right direction.


When I was configuring SharePoint 2013 in my DEV environment, I was preparing install scripts and wanted to make a least-privileged install of my farm. So I also changed the Distributed Cache account to a dedicated AD account with this script:

$farm = Get-SPFarm

# get the managed account to run the service as
$windowsServiceAccount = Read-Host "Enter Windows Service account"
$accnt = Get-SPManagedAccount -Identity $windowsServiceAccount

# change the distributed cache service account
$cacheService = $farm.Services | where {$_.Name -eq "AppFabricCachingService"}
$cacheService.ProcessIdentity.CurrentIdentityType = "SpecificUser"
$cacheService.ProcessIdentity.ManagedAccount = $accnt
# persist and apply the identity change
$cacheService.ProcessIdentity.Update()
$cacheService.ProcessIdentity.Deploy()

# start the service instance and wait until it is online
$distrCacheService = Get-SPServiceInstance | where {$_.TypeName -eq "Distributed Cache"}
if ($distrCacheService.Status -eq "Disabled")
{
    $distrCacheService | Start-SPServiceInstance
}
while (-not ($distrCacheService.Status -eq "Online"))
{
    Write-Host -ForegroundColor Yellow $distrCacheService.Status; Start-Sleep 5
    $distrCacheService = Get-SPServiceInstance | where {$_.TypeName -eq "Distributed Cache"}
}


So, because the distributed cache now caches user profile data, it became obvious that the distributed cache account did not have enough permissions on the User Profile Service Application to perform the PowerShell command.

After I granted Full Control permission on the UPS Service App to my distributed cache account, everything started working without any further issues.

One thing to remember, though, is that the PowerShell command to create thumbnail photos should be run after every user profile import. You can use Windows Task Scheduler to automate this.
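A sketch of a daily task via the Windows Task Scheduler command line (the task name, time and script path are placeholders; the .ps1 file is assumed to contain the Update-SPProfilePhotoStore command shown above, preceded by Add-PSSnapin Microsoft.SharePoint.PowerShell):

schtasks /Create /TN "UpdateProfilePhotoStore" /SC DAILY /ST 03:00 /TR "powershell.exe -File C:\Scripts\Update-ProfilePhotoStore.ps1"

Make sure the task runs under an account (see the /RU switch) that has permissions on the User Profile Service Application.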


Hope it helps :)


Robi Vončina

SharePoint Server MVP

The Windows Azure application model, part 3

clock January 20, 2013 19:03 by author Rok Bermež

6. The service definition file

The service definition file (with the CSDEF extension) is an XML file describing the different roles that make up your application. The complete schema for the XML file can be found at http://msdn.microsoft.com/en-us/library/windowsazure/ee758711.aspx. For every role you want in your application, the CSDEF file includes a WebRole or WorkerRole element. If you define a role as a web role (using the WebRole element), its code will run on a virtual machine that includes Windows Server 2008 and Internet Information Server (IIS). If you define a role as a worker role (using the WorkerRole element), its code will run on a virtual machine that includes only Windows Server 2008 (IIS is not installed).

Of course, you can still create and deploy a worker role that uses another mechanism to listen for and process web requests (for example, your application can use the .NET HttpListener class). Because every virtual machine includes Windows Server 2008, your code can do anything that is normally available to an application installed on that operating system.

For each role you also specify the desired virtual machine size. The table below shows the different virtual machine sizes and their properties:

VM size      | CPU         | Memory  | Disk   | Peak network bandwidth
Extra Small  | 1 x 1.0 GHz | 768 MB  | 20 GB  | ~5 Mbps
Small        | 1 x 1.6 GHz | 1.75 GB | 225 GB | ~100 Mbps
Medium       | 2 x 1.6 GHz | 3.5 GB  | 490 GB | ~200 Mbps
Large        | 4 x 1.6 GHz | 7 GB    | 1 TB   | ~400 Mbps
Extra Large  | 8 x 1.6 GHz | 14 GB   | 2 TB   | ~800 Mbps

Table 2: Virtual machine sizes and their properties

The service is billed per hour for each virtual machine used by a role. You are also billed for all data sent out of the data center; data sent into the data center is not billed. More information can be found on the Windows Azure pricing page. In general, we recommend that customers use a larger number of smaller virtual machines rather than fewer large ones, as this makes applications more resilient to failures: the fewer virtual machines you use, the bigger the impact when one of them fails. In addition, each role must run on at least two virtual machines to qualify for the service level agreement, which guarantees 99.95% availability.

In the service definition file (CSDEF) you also specify a number of properties for each role in your application. Here are some of the more useful options available:

  • Certificates. Use certificates to encrypt data if your web service supports SSL connections. All certificates must be uploaded to Windows Azure; more information can be found on the page Managing certificates in Windows Azure. This XML setting installs the uploaded certificates into the certificate store of the virtual machine, so your application can use them.
  • Configuration setting names. Here you declare the settings whose values your application will read while running on a virtual machine. The actual value of each configuration setting is specified in the service configuration file (CSCFG), which you can change at any time without redeploying your application code. You can write your applications to pick up new configuration values without any downtime.
  • Input endpoints. Here you define the HTTP, HTTPS or TCP endpoints (with port numbers) that you want to expose to the outside world at the address prefix.cloudapp.net. When Windows Azure starts your role, it automatically configures the firewall on the virtual machine.
  • Internal endpoints. Here you define the HTTP or TCP endpoints you want to expose to other virtual machines deployed as part of your application. Internal endpoints allow all the virtual machines in your application's different roles to communicate with each other, while remaining inaccessible to virtual machines outside your application.
  • Module imports. These options add useful components to your virtual machines. Components are available for diagnostics and monitoring, Remote Desktop, and Windows Azure Connect, which allows your virtual machine to access on-premises resources over a secure connection.
  • Local storage. Here you reserve a subdirectory on the virtual machine for your application's use. This option is described in detail in the article Data storage offerings in Windows Azure.
  • Startup tasks. Startup tasks let you install required components when the virtual machine starts. If needed, tasks can run with administrator privileges.
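As a rough illustration of these options, a minimal CSDEF file for a single web role could look like this (a sketch based on the published schema; the names MyService, MyWebRole, StorageConnectionString and HttpIn are placeholders, not part of this article):

<?xml version="1.0" encoding="utf-8"?>
<ServiceDefinition name="MyService"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="MyWebRole" vmsize="Small">
    <ConfigurationSettings>
      <!-- value is supplied in the CSCFG file, not here -->
      <Setting name="StorageConnectionString" />
    </ConfigurationSettings>
    <Endpoints>
      <!-- input endpoint exposed to the outside world on port 80 -->
      <InputEndpoint name="HttpIn" protocol="http" port="80" />
    </Endpoints>
  </WebRole>
</ServiceDefinition>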

7. The service configuration file

The service configuration file (CSCFG) is an XML file describing the settings you can change without redeploying your application. The complete schema for the XML file can be found at http://msdn.microsoft.com/en-us/library/windowsazure/ee758710.aspx. The CSCFG file contains a Role element for each role in your application. Here is a description of some of the elements you can specify in a CSCFG file:

  • Operating system version. With this attribute you select the operating system version to use for all the virtual machines running your application. This operating system is called the guest OS, and each new release includes the latest security updates available at the time it was released. If you set the osVersion attribute to "*", Windows Azure automatically updates the guest OS on all your virtual machines whenever a new guest OS release becomes available. Alternatively, you can pick a specific guest OS version and opt out of upgrades. For example, if you set osVersion to "WA-GUEST-OS-2.8_201109-01", all your virtual machines will run the release described at http://msdn.microsoft.com/en-us/library/hh560567.aspx. More information about guest OS releases can be found on the page Managing guest OS upgrades in Windows Azure.
  • Virtual machines. The value of this element sets the number of virtual machines that should run a given role. Because you can upload a new CSCFG file to Windows Azure without redeploying your application, it is very easy to change this value and dynamically scale the number of virtual machines running a role up or down. This makes it simple to match your application's capacity to the actual workload while also controlling costs.
  • Configuration setting values. This element specifies the values of the settings declared in the CSDEF file; your role can read these values at run time. They are typically used for connection strings to SQL Azure or Windows Azure Storage, but you can use them for any purpose.
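For illustration, a minimal CSCFG file for a single web role could look like this (a sketch; the service name, role name, setting name and the elided account values are placeholders):

<?xml version="1.0" encoding="utf-8"?>
<ServiceConfiguration serviceName="MyService" osVersion="*"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="MyWebRole">
    <!-- run this role on two virtual machines -->
    <Instances count="2" />
    <ConfigurationSettings>
      <Setting name="StorageConnectionString"
               value="DefaultEndpointsProtocol=https;AccountName=...;AccountKey=..." />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>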

8. Creating and deploying a hosted service

To create a hosted service, you first go to the Windows Azure Management Portal and provision the hosted service by choosing a DNS prefix and the data center where you want to deploy your application. Then, in your development environment, you create the service definition file (CSDEF), write your application code and zip all the files into a service package (CSPKG). You also prepare the service configuration file (CSCFG). To deploy the roles, you upload the CSPKG and CSCFG files using the Windows Azure Service Management API. After deployment, Windows Azure provisions the virtual machines in the data center (based on the configuration data), extracts the application code from the package, copies it to the virtual machines and starts them. Your code is now deployed and running.

The figure below shows the CSPKG and CSCFG files created on your local development machine. The CSPKG file contains the CSDEF file and the code for two roles. After you upload the CSPKG and CSCFG files with the Windows Azure Service Management API, Windows Azure creates the role virtual machines in the data center. In this example, the CSCFG file tells Windows Azure to create three virtual machines for role 1 and two virtual machines for role 2.


Figure 5: The CSPKG and CSCFG files created on your local development machine



About the author

Rok Bermež is a Slovenian Windows Azure MVP. He works as a Software Engineer and Microsoft Certified Trainer at Kompas Xnet. His primary interests include cloud computing and web development. With extensive experience in development and architecture he has participated in many projects, and since the CTP release of Windows Azure many of those projects have been based on the Windows Azure platform. Rok has been delivering courses, writing code and speaking at conferences and community events on Microsoft technologies for the last couple of years. You can also find him on Twitter (@Rok_B).
