X Blogs

who you gonna call?

Sorting months chronologically and not alphabetically in a Pivot Table report based on Power Pivot data

May 13, 2014 18:41 by author gašper

Here is our problem. When you create a Pivot Table in Excel, you can group a date field by month and the sort will be chronological (January, February, …). But when you create a Pivot Table based on a Power Pivot table, grouping does not work! So you have to get to the month names by a different road: the Format function in Power Pivot. The problem is that when you put this field in a Pivot Table, it gets sorted alphabetically. This is logical, since the values are text and have nothing to do with dates as far as the Pivot Table is concerned, but it means the months are not sorted chronologically. This article will tell you how to fix that.

Let's start with a simple table in Excel that has only two columns: one with date values and the other with the number of visitors on that date. Now we would like to create a Pivot Table report to see how the number of visitors is spread through the months.

Case 1: Pivot Table report based on an Excel Table

First we create a Pivot Table based on the Excel Table.

The Pivot Table should show the number of visitors by month. But since we only have dates, we first have to group the Dates field by months.

And right away we get the desired result.

Case 2: Pivot Table report based on Power Pivot data

First we add our Table data to Power Pivot the easiest way – by using the Add to Data Model command on the PowerPivot tab.

Now that we have the data in Power Pivot, we can create a Pivot Table report from the Power Pivot window. But when we create a Pivot Table and want to see the analysis by months, we see we just can't select the Group command. It is grayed out…

So to get to the months we use a different trick: we go back to the Power Pivot window and create a calculated column using the Format function. This is the Power Pivot rendition of the Text function from Excel. The syntax is the same; just the names differ.

The Excel version and the Power Pivot version look like this:
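A sketch of the two equivalents (assuming the dates sit in column A of the worksheet and the Power Pivot column is named Date):

Excel worksheet formula:

=TEXT(A2, "mmmm")

Power Pivot (DAX) calculated column:

=FORMAT([Date], "mmmm")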

Using the Format function we now get the month names and a new field to build the Pivot Table report by. But when we create it, the result looks quite disappointing.

So the numbers are OK, but the sorting is alphabetical and not the kind we want. To get the sorting right, we have to go back to the Power Pivot window and create a new calculated column using the Month function. This way we get a month number alongside each date and month name.
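The new column can be as simple as this sketch (Month No. is the name the column goes by later in this post):

=MONTH([Date])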

Now just adding that to the Pivot Table report would get rid of our problem, but let's not forget that we want the month names as they were; only the sorting is wrong. And now we have all we need.

In the Power Pivot window, we select a value in the month name column and then select the Sort by Column command on the Home tab and, hey, look at that: you can now say that the Month name column will be sorted by the Month No. column.

Doing that changes our Pivot Table report instantly.

And we are one step closer to eternal happiness.



Windows Azure Mobile Services news

April 1, 2014 15:56 by author Rok Bermež

 

Windows Azure Mobile Services recently gained support for .NET and ASP.NET Web API. This combination makes building cloud mobile backends that much easier.

The easiest way to start is to go to the Azure management portal and create a new Mobile Service, choosing .NET as the language.


Once the service has been created, we arrive at a handy start page,


where a click on the 'download' button gets us a ready-made project based on the Web API template, with a few additional NuGet packages.


If we open the default ToDoItemController, we can see how the built-in TableController<T> .NET class is used, which makes it easy to serve data to mobile applications.


Local debugging and development are supported as well; once we are happy with our service, we simply publish it to its cloud home using its 'publish' profile.




The People picker and domain trusts

January 31, 2014 06:58 by author Robi

In the last couple of SharePoint projects I was working with environments where the company was associated with a couple of domains. Since SharePoint has to be configured to work in such environments, I decided to present the issue through two typical scenarios and explain how a SharePoint site must be set up so that the people picker can resolve users from other domains.

Scenario 1 – "Two way trust"

The characteristic of a "Two Way Trust" is that users from domain 1 have rights to access resources in domain 2 and vice versa. This means that users from domain 1 can read data in the Active Directory of domain 2 and users from domain 2 can read data from AD in domain 1. This is very important information, since the configuration of SharePoint Server depends on these characteristics.

Figure 1 shows the scenario where we have set up a two way trust between domain 1 and domain 2. SharePoint Server sits in domain 1 and can be used by users from both domain 1 and domain 2. The problem, which in this case concerns the people picker, is that the people picker by default searches only the Active Directory domain in which the SharePoint Server is installed. So, if we want to add users from other domains, we need to fix some of the settings on the SharePoint Server.

The settings that must be changed in this case apply to the entire web application. Of course, these settings cannot be changed in the user interface, so it is necessary to open the SharePoint 2013 Management Shell and enter a couple of lines of code.

$wa = Get-SPWebApplication http://portal.domena1.loc

# List the current values for the web application
$wa.PeoplePickerSettings.SearchActiveDirectoryDomains

# ==================== Domain 1 ====================
$newdomain1 = New-Object Microsoft.SharePoint.Administration.SPPeoplePickerSearchActiveDirectoryDomain

# Register FQDN
$newdomain1.DomainName = 'domena1.loc'

# Register NetBIOS name
$newdomain1.ShortDomainName = 'domena1'

$wa.PeoplePickerSettings.SearchActiveDirectoryDomains.Add($newdomain1)

# ==================== Domain 2 ====================
$newdomain2 = New-Object Microsoft.SharePoint.Administration.SPPeoplePickerSearchActiveDirectoryDomain

# Register FQDN
$newdomain2.DomainName = 'domena2.loc'

# Register NetBIOS name
$newdomain2.ShortDomainName = 'domena2'

$wa.PeoplePickerSettings.SearchActiveDirectoryDomains.Add($newdomain2)

$wa.Update()

To configure the "People Picker" control, we first save the web application object in a variable. The current values can be listed with the command $wa.PeoplePickerSettings.SearchActiveDirectoryDomains.

In order to add other domains to the search, we have to create a new object of type Microsoft.SharePoint.Administration.SPPeoplePickerSearchActiveDirectoryDomain and then set its values. The first value is the FQDN of the domain and the second value is the NetBIOS name, i.e. the short domain name. The same procedure must be repeated for all the domains that you want to allow in the people picker.

At the end we have to, of course, save the changes stored in the web application object. We have to repeat the procedure on all web applications on which we want to change the behavior of the people picker, including Central Administration; a sketch of that follows.
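A minimal sketch, assuming $newdomain1 and $newdomain2 from the script above are still in scope:

# Apply the same people picker domains to every web application,
# including Central Administration
Get-SPWebApplication -IncludeCentralAdministration | ForEach-Object {
    $_.PeoplePickerSettings.SearchActiveDirectoryDomains.Add($newdomain1)
    $_.PeoplePickerSettings.SearchActiveDirectoryDomains.Add($newdomain2)
    $_.Update()
}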

Scenario 2 – "One way trust"

"One way trust" is a way of linking domains where one domain trusts another. For example, I used a one way trust between domain 1 and domain 2, where domain 1 trusts domain 2, while domain 2 does not trust domain 1. To make it simple, we could say that users from domain 2 can access data in domain 1, while domain 1 users cannot access data in domain 2. Such a scenario is widely used for intranet and extranet environments, where users from the intranet domain can access the extranet domain, while the reverse is not possible.

   

In a one way trust scenario we can use the same script that we used for the bidirectional trust, with the difference that we need to provide credentials for the access: the username and password of a user who is able to access resources in domain 2.

 

# ==================== Create the app password ====================
stsadm -o setapppassword -password 'Pa$$w0rd'

# ==================================================================

$wa = Get-SPWebApplication http://portal.domena1.loc

# List the current values for the web application
$wa.PeoplePickerSettings.SearchActiveDirectoryDomains

# ==================== Domain 1 ====================
$newdomain1 = New-Object Microsoft.SharePoint.Administration.SPPeoplePickerSearchActiveDirectoryDomain

# Register FQDN
$newdomain1.DomainName = 'domena1.loc'

# Register NetBIOS name
$newdomain1.ShortDomainName = 'domena1'

$wa.PeoplePickerSettings.SearchActiveDirectoryDomains.Add($newdomain1)

# ==================== Domain 2 ====================
$newdomain2 = New-Object Microsoft.SharePoint.Administration.SPPeoplePickerSearchActiveDirectoryDomain

# Account rights settings: the credentials go on the domain 2 entry,
# since that is the domain the application pool account cannot read
$user = "domena2\accountPravice"
$pass = ConvertTo-SecureString 'Pa$$w0rd' -AsPlainText -Force
$credentials = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $user, $pass
$newdomain2.LoginName = $credentials.UserName
$newdomain2.SetPassword($credentials.Password)

# Register FQDN
$newdomain2.DomainName = 'domena2.loc'

# Register NetBIOS name
$newdomain2.ShortDomainName = 'domena2'

$wa.PeoplePickerSettings.SearchActiveDirectoryDomains.Add($newdomain2)

$wa.Update()

 

Because in this scenario the application pool account does not have rights to access resources in domain 2, we need to set the username and password used to access Active Directory objects in domain 2.

The first step is to create the app password for storing credentials on all servers in our farm. The command Stsadm –o setapppassword –password 'Pa$$w0rd' creates a registry key from which the app pool account can read the credential key:

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Shared Tools\Web Server Extensions\15.0\Secure
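A quick sketch to confirm the key exists on a server after running the command:

Get-Item 'HKLM:\SOFTWARE\Microsoft\Shared Tools\Web Server Extensions\15.0\Secure'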

The next step is to create a connection for each domain specifically and also set up a user account for connecting to the specified domain.

For this purpose, you can use the account that synchronizes data between SharePoint Server and Active Directory, provided that you have SharePoint Server and the User Profile Synchronization Service. Otherwise, you can create a normal service account which will be used exclusively for reading data from the specified domain.

The last step is to grant the application pool account permission to read the AppCredentialKey. The AppCredentialKey registry key is limited to certain local server groups, and if the permissions are not set up correctly, the event logs fill up with "Registry access is not allowed" errors and users are not resolved in the people picker. The users and groups that have access to the registry key by default are:

  • System
  • Network Service
  • WSS_Restricted_WPG_V4
  • Administrators

In order to grant the application pool access to the AppCredentialKey, you can add the service account to the WSS_Restricted_WPG_V4 group (see the sketch after this list). The accounts which must be in this group are:

  • Application pool account for the web application
  • The Sandbox service account
  • The service account of the Central Administration – provided that it is not in the Administrators group.
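A sketch of adding an account to that local group on a SharePoint server (run from an elevated prompt; the account name is an example):

net localgroup WSS_Restricted_WPG_V4 domena1\sp_apppool /add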

SharePoint configuration can be quite challenging in some scenarios, so I hope that this post will help SharePoint admins ease their work and enhance their understanding of SharePoint. This configuration also applies to SharePoint 2010.

 

Rob Vončina

SharePoint Server MVP



Windows Azure Active Directory

January 23, 2014 00:45 by author Rok Bermež

Windows Azure Active Directory (WAAD) is a complete cloud-hosted identity and access management solution. It combines core directory services, advanced identity management, security, and application access management, and offers developers an approachable platform for delivering access control to their applications based on centralized policy and rules.

We can use it to:

·         Manage user accounts through the Windows Azure management portal, in the same place where you manage user access to Windows Azure and other Microsoft online services, such as Microsoft Office 365 and a large number of non-Microsoft SaaS applications that your organization may already be using.

·         Extend your on-premises Active Directory into the cloud, so that users can authenticate to applications running in the cloud in the same way they do inside the company. User data can be synchronized to WAAD automatically using the free DirSync tool. Authentication is performed either through federation or password synchronization.

·         Get a complete single sign-on experience across all Microsoft online services and hundreds of popular non-Microsoft applications. End users can launch their applications quickly and efficiently from a personalized web-based Access Panel (http://technet.microsoft.com/en-us/library/dn308586.aspx).

·         Enable Multi-Factor Authentication for Windows Azure AD users, which significantly improves protection for hundreds of cloud services and applications. Convenient verification options include mobile apps, phone calls, and SMS messages.

·         Give developers an efficient way to integrate identity management into their applications, with a centralized mechanism for authentication and authorization using identities hosted in WAAD or social logins such as a Microsoft, Facebook, Yahoo!, or Google account. The Graph API also makes it possible to query the data stored in the directory.

 

Windows Azure Active Directory Premium (Preview)

For companies with more demanding needs there is Windows Azure Active Directory Premium, currently in preview, which further extends the feature set.

In its first phase, Windows Azure Active Directory Premium offers:

·         Self-service password reset for users

·         Group-based provisioning and management for SaaS applications

·         Improved branding

·         Security reports

The Windows Azure Active Directory Premium service will continue to grow, addressing new identity and access management requirements in the era of the cloud.



Export – import alerts

November 10, 2013 03:23 by author Robi

In my previous blog post I explained how you can troubleshoot alerts. In this one, I'm just going to post the scripts I used for exporting and importing all alerts in a site collection.

Here is the script for exporting all alerts in a site collection to a csv file:

$site = Get-SPSite "http://2013portal"
$alertResultsCollection = @()

foreach ($web in $site.AllWebs) {
    foreach ($alert in $web.Alerts) {
        $alertURL = $web.Url + "/" + $alert.ListUrl
        $alertResult = New-Object PSObject
        $alertResult | Add-Member -type NoteProperty -name "WebUrl" -value $web.Url
        $alertResult | Add-Member -type NoteProperty -name "ListURL" -value $alertURL
        $alertResult | Add-Member -type NoteProperty -name "AlertTitle" -value $alert.Title
        $alertResult | Add-Member -type NoteProperty -name "ListUrl" -value $alert.ListUrl
        $alertResult | Add-Member -type NoteProperty -name "List" -value $alert.List
        $alertResult | Add-Member -type NoteProperty -name "DeliveryChannel" -value $alert.DeliveryChannels
        $alertResult | Add-Member -type NoteProperty -name "AlertType" -value $alert.AlertType
        $alertResult | Add-Member -type NoteProperty -name "EventType" -value $alert.EventType
        $alertResult | Add-Member -type NoteProperty -name "Frequency" -value $alert.AlertFrequency
        $alertResult | Add-Member -type NoteProperty -name "AlertTime" -value $alert.AlertTime
        $alertResult | Add-Member -type NoteProperty -name "SubscribedUser" -value $alert.User
        $alertResultsCollection += $alertResult
    }
}

$site.Dispose()
$alertResultsCollection

# Export to CSV
$alertResultsCollection | Export-CSV C:\Users\sp2013_farm_admin\Desktop\Alerts.csv

 

And here is the script you can use to import all alerts into one site collection from the csv file:

 

Import-Csv C:\Users\sp2013_farm_admin\Desktop\Alerts.csv | ForEach-Object {
    $webUrl = $_.WebUrl
    $listTitle = $_.List
    $alertTitle = $_.AlertTitle
    $subscribedUser = $_.SubscribedUser
    $alertType = $_.AlertType
    $deliveryChannel = $_.DeliveryChannel
    $eventType = $_.EventType
    $frequency = $_.Frequency

    $web = Get-SPWeb $webUrl
    $list = $web.Lists.TryGetList($listTitle)
    $user = $web.EnsureUser($subscribedUser)
    $newAlert = $user.Alerts.Add()
    $newAlert.Title = $alertTitle
    $newAlert.AlertType = [Microsoft.SharePoint.SPAlertType]::$alertType
    $newAlert.List = $list
    $newAlert.DeliveryChannels = [Microsoft.SharePoint.SPAlertDeliveryChannels]::$deliveryChannel
    $newAlert.EventType = [Microsoft.SharePoint.SPEventType]::$eventType
    $newAlert.AlertFrequency = [Microsoft.SharePoint.SPAlertFrequency]::$frequency
    if ($frequency -ne "Immediate") {
        $AlertTime = $_.AlertTime
        $newAlert.AlertTime = $AlertTime
    }
    $newAlert.Update()
}

 

 

Hope it helps.

Robi Vončina



"Alert Me" feature and SharePoint 2013

October 31, 2013 22:58 by author Robi

 

In the previous few projects we had some issues involving alerts after upgrading or migrating to SharePoint 2013, so I want to explain how you can help yourself when troubleshooting alerts in SP2013.

Here I describe a few of the most common issues we were faced with.

1. The "Alert Me" Feature is not available

If you happen to be in a document library and you do not find the "Alert Me" command, the most likely problem is that you have not set up "Outgoing Email Settings" in Central Administration. Outgoing email can be configured under System Settings, Configure Outgoing E-Mail Settings, or at the URL http://[CA_URL]/_admin/globalemailconfig.aspx.

Another option for configuring this is, of course, PowerShell. The script with which you can set up outgoing email is as follows:

 

# ========== Set variables ==========
$SMTPServer = "devmail"
$emailAddress = "SP2013@dev.local"
$replyToEmail = "robi@dev.local"

# ========== Outgoing Email ==========
$loadasm = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
$spGlobalAdmin = New-Object Microsoft.SharePoint.Administration.SPGlobalAdmin
$spGlobalAdmin.UpdateMailSettings($SMTPServer, $emailAddress, $replyToEmail, 65001)

   

The result of our settings is then visible in Central Administration, and the Alert Me button appears on the ribbon in lists and libraries.
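To double-check the outgoing mail configuration from PowerShell, a small sketch (the web application URL is the one used later in this post):

(Get-SPWebApplication http://2013portal).OutboundMailServiceInstance.Server.Name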

2. You are not able to create alerts for domain email-enabled security groups or distribution groups

In SharePoint 2013 we came across a very interesting case where, through the user interface, you are not able to subscribe a domain security group to alerts. In a document library or list you click the Alert Me button, the dialog box opens, and you can enter the user you would like to subscribe to alerts. In previous versions of SharePoint, names of distribution groups or email-enabled security groups got resolved, while in the new version domain groups do not work; you get "No matching results".

If you want to create alerts for your domain groups, you must use PowerShell:

 

 

<# ================================================

Options for setting the EventType:

SPEventType [Add, Modify, Delete, Discussion, All]

Options for setting the alert frequency;
for Daily and Weekly it is necessary to also set
the alert time:

SPAlertFrequency [Immediate, Daily, Weekly]

================================================ #>

   

$web=Get-SPWeb "http://2013portal"

$list=$web.Lists.TryGetList("Documents")

$user = $web.EnsureUser('DEV\skupinaemail')

$newAlert = $user.Alerts.Add()

$newAlert.Title = $list.Title

$newAlert.AlertType=[Microsoft.SharePoint.SPAlertType]::List

$newAlert.List = $list

$newAlert.DeliveryChannels = [Microsoft.SharePoint.SPAlertDeliveryChannels]::Email

$newAlert.EventType = [Microsoft.SharePoint.SPEventType]::All

$newAlert.AlertFrequency = [Microsoft.SharePoint.SPAlertFrequency]::Daily

$newAlert.AlertTime="12:00"

$newAlert.Update()

 

 

To verify whether the alerts have been successfully set up, open Site Settings / User alerts.

For the purposes of testing and displaying alerts, I created another alert for the same group; the only difference is that in this case I used the "Immediate" alert frequency.

3. The user has set an alert on item adding in a document library, but an immediate alert is not sent

For this example it is necessary to understand how alerts work. Each SharePoint content database has a table called EventCache. This is the table where SharePoint writes events that are important to the search service application, alerts, and so on; in short, the table serves as a temporary repository of current events, which is then used for subsequent operations. In the case of alerts, all subsequent operations for immediate alerts are taken care of by a timer job, which can be found in Central Administration and is called Immediate Alerts. You can also list the alert timer jobs with a simple PowerShell query:

Get-SPTimerJob | ? {$_.Name -like "*alert*"}

   

As we can see from the results of the command, the timer job triggers every 5 minutes. The timer job queries the EventCache table in the content databases. If the table contains records for sending alerts, the alert email is sent and the record gets deleted from the EventCache table.
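You can inspect the job and its schedule yourself; a sketch, assuming the job's internal name is job-immediate-alerts:

Get-SPTimerJob job-immediate-alerts | Select-Object Name, Schedule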

Warning! Running a query directly to the SharePoint database is not supported.

To see what EventCache table contains, you can run this SQL Query against your Content DB:

select * from [SP13_Content_2013Portal].[dbo].[EventCache] (nolock) order by [TimeLastModified] desc

   

So, if I make a change to a document in the document library and run the query, I get the following record as the result:

The query result can also serve as the first verification that alerts work properly. For subsequent verification of alerts, we have to use ULS Log Viewer and set up Verbose logging for Alerts in Central Administration. Alerts logging can be found in the SharePoint Foundation category.

In ULS Log Viewer you have the option of filtering the ULS logs. The easiest way to get all records related to alerts is to set the filter as shown in the image:

If you want to see the records in the ULS logs and do not want to wait 5 minutes, you can manually trigger the Immediate Alerts timer job.

In my case, I'm exploring what's happening with the alerts that user Uroš is subscribed to. I uploaded a new document to the document library and started the Immediate Alerts timer job with PowerShell:

Get-SPTimerJob | ? {$_.name -like "*alert*"} |Start-SPTimerJob

   

In ULS logs you can now see records of Alerts operations:

In this case I see that alerts work as expected. But what happens if we have a library with the feature "Require documents to be checked out before they can be edited"?

Repeat the process: upload a document to the document library and watch in the ULS logs what's going on with the alerts. The following entries appear:

Voilà, I can see that the alerts are being processed correctly. However, because loading a document into a document library is a two-step process, upload followed by entering metadata, it can be concluded that while a document is being uploaded it is treated as checked out. A record is written to the EventCache table where the state of the document is registered as checked out.

With that being said, we can conclude that when the feature "Require documents to be checked out before they can be edited" is enabled on a document library and a user is subscribed to alerts on document added, the alerts are not going to be sent out, as SharePoint treats these documents as checked out, or as the first minor version.

 

In this article I tried to describe some typical errors which may occur in SharePoint 2013 in conjunction with the Alert Me feature and, more importantly, to demonstrate how we can debug and solve them.

After the troubleshooting, don't forget to set the logging level for the Alerts back to the default settings.

   

Robi Vončina
SharePoint Server MVP

   



Windows Azure Storage Tables – part two

September 13, 2013 15:54 by author Rok Bermež

In the previous article we introduced NoSQL data in the cloud and Windows Azure Storage; this time we continue by exploring the Tables aspect of the service.

Many data platforms exist today, each with its own strengths and weaknesses. Many of them follow the NoSQL concept, meaning they do not use an RDBMS (relational database management system) model. In short, there are no tables and SQL statements, but other data structures, typically massive collections of key/value pairs or associative arrays. Popular choices today are MongoDB, Cassandra, HBase, CouchDB, Neo4j, and Windows Azure Tables, the last of which is our focus this time.

Despite their big differences, SQL and NoSQL databases have one thing in common: these technologies are (or can be) available as a cloud service, freeing developers from manually provisioning and de-provisioning database servers. Windows Azure Tables are likewise offered as a service, so we developers do not have to think of them as a separate physical server.

One of the more important characteristics of Windows Azure Tables is that storage is available in three geographically distributed regions: the US, Europe, and Asia. Every Microsoft datacenter complies with the International Organization for Standardization (ISO) 27001, SSAE 16 ISAE 3402, EU Model Clauses, and Health Insurance Portability and Accountability Act (HIPAA) business associate agreement (BAA) standards. Another important characteristic is geo-redundancy, which lets you store data in a second datacenter in the same region, adding another level of protection against natural disasters.

WAS (Windows Azure Storage) performance and capacity are tied to the accounts that manage it; an individual account includes up to 200 TB of storage space. Windows Azure Tables are optimized for incredibly fast query performance under a write-heavy load. You can read more about this at bit.ly/cMAWsZ.


WAS analytics are also available, letting us trace storage requests, analyze usage trends, optimize data access patterns, and log access to a storage account (more at bit.ly/XGLtGt).

The most concise way to express the value of Windows Azure Tables is that they support NoSQL key/value queries, even under a write-heavy load. From a developer's standpoint, Windows Azure Tables are meant for storing large collections of non-uniform objects or for serving high-traffic web pages. Data stored in Windows Azure Tables can be accessed from almost anywhere.

The whole storage system is based on REST (Representational State Transfer), which means that any client capable of HTTP communication can work with the WAS system. The REST API also supports all the operations needed to work with the data.
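As a small illustration (a sketch, not from this article), here is how a table could be queried over the REST API from PowerShell using a pre-generated shared access signature. The account name, table name, and SAS token are hypothetical placeholders:

$account = 'myaccount'
$table = 'Visitors'
$sas = 'sv=2012-02-12&tn=Visitors&sig=...'   # placeholder SAS token with query rights
Invoke-RestMethod -Uri "https://$account.table.core.windows.net/$table()?$sas"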

In the next issue we will see how this works in practice.

 

 



Upgrade Video

August 27, 2013 19:34 by author Robi

For the Ukrainian SharePoint Conference I recorded this video to present at my session. I've decided to make it public now, so please sit back, relax and enjoy it.

   

Upgrading to SharePoint 2013 Video



Upgrading to SharePoint 2013 – part 2

August 27, 2013 00:19 by author Robi

In one of my previous blog posts I talked about basic upgrade scenarios. We looked at how to perform the database attach method and then how to upgrade a site collection.

In this article, I'll introduce a variety of tools that can give you better insight into the farm you are upgrading. You'll also see how you can monitor the upgrade, which site collections are in the process of upgrading, and how the upgrade "queue" can be edited. In the end, I will explain how to install custom solutions and what's new with installing custom solutions on a SharePoint farm.

First of all, I would like to introduce two essential tools for SharePoint administrators, which can be found on CodePlex:

SharePoint Feature Administration and Clean Up

SharePoint Feature Administration and Clean Up is a tool that you can use before you make a backup of the database you want to upgrade. The tool shows all features at the web application, site collection, or individual site level, and it does so without any installation on the server. All you have to do is start the application as a user that has sufficient rights on the entire farm.

The application is particularly useful because it finds features that are either corrupt or whose solution is not installed, and this way it can save you a lot of debugging and browsing after the upgrade to SharePoint 2013.

As shown in the following illustration, the application shows which feature is malfunctioning and, if you select a broken feature, offers a variety of options, among which are:

  • Uninstall
  • Activation
  • Find where this feature is activated in the farm

 

Example: a broken feature that we would like to remove from the farm. Select the feature, click Uninstall and confirm a few dialogs.

    

 

 

The application log shows us what has been done and on which object.

The application also offers a very useful function called "Find faulty features in the farm". This function enumerates the entire farm, locates broken features and removes them. So with one click of a button you can clean your farm, and the database upgrade will show a lot fewer errors than it would against an uncleaned farm.

SharePoint Manager

SharePoint Manager is also an application that does not need to be installed on the server; it is enough to run it on a server with elevated privileges. The application allows you to look at the whole structure of objects in your SharePoint 2010/2013 environment.

For each object in the farm, you can view the properties of the selected object in the details view in the right-hand window, and if you select a field object you can also see the XML schema of the object. You can also change the values of some properties and save them back to the SharePoint configuration database.

Because the application offers a very detailed look at the structure of the SharePoint environment, it is very useful for developers, and I highly recommend it for the maintenance and administration of SharePoint 2010/2013 as well.

Manage site collection upgrade

In the previous article I introduced the upgrade process, including site collection upgrades. A site collection upgrade is run with the following PowerShell command:

Upgrade-SPSite -Identity http://upgrade -VersionUpgrade
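The same command can also just queue the site collection instead of upgrading it right away; a sketch using the -QueueOnly switch discussed below:

Upgrade-SPSite -Identity http://upgrade -VersionUpgrade -QueueOnly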

 

The PowerShell command runs the site collection upgrade. You can add another parameter, -QueueOnly, whereby the site collection is only added to the upgrade queue. Queue maintenance is done by a timer job which runs every 2 minutes and starts the upgrade of the next site collection in the queue. Regardless of whether we add the -QueueOnly parameter or not, each site collection we upgrade appears in the list obtained with the command:

Get-SPSiteUpgradeSessionInfo -ShowInProgress -ShowCompleted -ShowFailed -ContentDatabase [database] | select SiteId, Status

 

If we are upgrading multiple site collections located in more than one database, we can get the list of upgrading site collections as follows:

$imenaBaz = ("baza1","baza2")

foreach ($baza in $imenaBaz) {
    Get-SPSiteUpgradeSessionInfo -ShowInProgress -ShowCompleted -ShowFailed -ContentDatabase $baza | select SiteId, Status
}

 

If you would like to remove a site collection from the list, you can use the command:

Remove-SPSiteUpgradeSessionInfo -Identity http://upgrade

 

Server load throttling during upgrade

During the site collection upgrade process, we can also control how many site collections are being upgraded simultaneously. Two settings are available for managing server load: the first at the web application level, the other on the content database.

$wa = Get-SPWebApplication http://upgrade

$wa.SiteUpgradeThrottleSettings

The command shows how many site collections the web application's application pool can upgrade simultaneously. If the number is exceeded, a site collection is automatically placed in the queue, regardless of whether we add the -QueueOnly parameter or not. It is also automatically placed in the queue if it exceeds the SubWebCountLimit value or the UsageStorageLimit. By default, these two values are set to 10. You can easily increase them, depending on your server hardware, as the limits are set really low.
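A sketch of raising these limits (the values are examples only, not recommendations):

$wa = Get-SPWebApplication http://upgrade
$wa.SiteUpgradeThrottleSettings.AppPoolConcurrentUpgradeSessionLimit = 10
$wa.SiteUpgradeThrottleSettings.UsageStorageLimit = 100
$wa.SiteUpgradeThrottleSettings.SubwebCountLimit = 100
$wa.Update()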

The second option controls concurrent upgrades at the level of the individual database in which the site collections are located. The current settings can be listed with the following commands:

$db = Get-SPContentDatabase SP2013_Content_Upgrade

$db.ConcurrentSiteUpgradeSessionLimit

 

By default, the number of concurrent site collection upgrades is set to 10. Depending on the specifications of your database servers, this setting can easily be increased.
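A sketch of raising the database-level limit (again, the value is an example):

$db = Get-SPContentDatabase SP2013_Content_Upgrade
$db.ConcurrentSiteUpgradeSessionLimit = 20
$db.Update()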

Custom solutions upgrade management

Because SharePoint is a platform that can be used to develop business applications, environments with a lot of installed custom solutions are very common. For a successful upgrade to SharePoint 2013, it is very important that all the solutions installed on SharePoint 2010 are successfully transferred to the new version. The best scenario for custom solutions is getting the source code for the installed solutions, which you can then upgrade for the new version, or obtaining an already upgraded WSP file. However, since SharePoint 2013 also contains the SharePoint 2010 binaries, a 2010 solution might already work on 2013.

It is necessary to be aware of some changes to the security settings of custom solutions. In the past, solutions installed into the BIN folder were treated as "Partial Trusted Code", while in the latest version of SharePoint these solutions are considered "Full Trust".

To install a custom solution with PowerShell, additional parameters are available. If we want to add a SharePoint solution to the solution store, we use the command:

Add-SPSolution -LiteralPath "C:\Users\sp_FarmAdmin\Desktop\WSPs\test.wsp"

 

To install the solution, use the command:

Install-SPSolution -Identity test.wsp -GACDeployment -Force -Verbose -CompatibilityLevel All

 

You can use the new CompatibilityLevel parameter to control which directories the solution is deployed to:

Value                            Effect

"14,15", "AllVersions", "All"    Installs the solution to both the 14 and 15 directories (in the 2013 product; subject to change in future versions)

"14", "OldVersions", "Old"       Installs the solution to the 14 directories only (in the 2013 product; subject to change in future versions)

"15", "NewVersions", "New"       Installs the solution to the 15 directories only (in the 2013 product; subject to change in future versions)

 

Example:

1. 14 hive path – _layouts/Custompage.aspx

2. 15 hive path – _layouts/15/Custompage.aspx

The Global Assembly Cache

There is also the GAC aspect. After the release of .NET Framework 4.0, the GAC was divided into two parts, one for each CLR: C:\Windows\assembly is the location for .NET versions 1.0 to 3.5, and C:\Windows\Microsoft.NET\assembly is the location for all DLLs built with .NET Framework 4.0.

I hope I've written some useful guidelines for a successful upgrade. If you have any additional questions or need help with your upgrade, you can contact me at robi@kompas-xnet.si.



SharePoint Designer 2013 Crashing when using select people/groups from SharePoint site

August 24, 2013 01:56 by author Robi

Yesterday I was working on an issue where a client wanted to use an action in SharePoint Designer 2013, such as Send Email to users, and when Select people/Groups from SharePoint Site was clicked, SharePoint Designer just crashed.

The same thing happened in InfoPath Designer 2013.

 

After investigating the error logs on the SharePoint web servers, I found this error:

System.ServiceModel 4.0.0.0 EventID: 3

WebHost failed to process a request.

Sender Information: System.ServiceModel.ServiceHostingEnvironment+HostingManager/40939712

Exception: System.ServiceModel.ServiceActivationException: The service '/_vti_bin/spclaimproviderwebservice.svc' cannot be activated due to an exception during compilation. The exception message is: This collection already contains an address with scheme http. There can be at most one address per scheme in this collection. If your service is being hosted in IIS you can fix the problem by setting 'system.serviceModel/serviceHostingEnvironment/multipleSiteBindingsEnabled' to true or specifying 'system.serviceModel/serviceHostingEnvironment/baseAddressPrefixFilters'.

Parameter name: item. ---> System.ArgumentException: This collection already contains an address with scheme http. There can be at most one address per scheme in this collection. If your service is being hosted in IIS you can fix the problem by setting 'system.serviceModel/serviceHostingEnvironment/multipleSiteBindingsEnabled' to true or specifying 'system.serviceModel/serviceHostingEnvironment/baseAddressPrefixFilters'.

Parameter name: item

at System.ServiceModel.UriSchemeKeyedCollection.InsertItem(Int32 index, Uri item)

at System.Collections.Generic.SynchronizedCollection`1.Add(T item)

at System.ServiceModel.UriSchemeKeyedCollection..ctor(Uri[] addresses)

at System.ServiceModel.ServiceHost..ctor(Type serviceType, Uri[] baseAddresses)

at System.ServiceModel.Activation.ServiceHostFactory.CreateServiceHost(Type serviceType, Uri[] baseAddresses)

at System.ServiceModel.Activation.ServiceHostFactory.CreateServiceHost(String constructorString, Uri[] baseAddresses)

at System.ServiceModel.ServiceHostingEnvironment.HostingManager.CreateService(String normalizedVirtualPath, EventTraceActivity eventTraceActivity)

at System.ServiceModel.ServiceHostingEnvironment.HostingManager.ActivateService(ServiceActivationInfo serviceActivationInfo, EventTraceActivity eventTraceActivity)

at System.ServiceModel.ServiceHostingEnvironment.HostingManager.EnsureServiceAvailable(String normalizedVirtualPath, EventTraceActivity eventTraceActivity)

--- End of inner exception stack trace ---

at System.ServiceModel.ServiceHostingEnvironment.HostingManager.EnsureServiceAvailable(String normalizedVirtualPath, EventTraceActivity eventTraceActivity)

at System.ServiceModel.ServiceHostingEnvironment.EnsureServiceAvailableFast(String relativeVirtualPath, EventTraceActivity eventTraceActivity)

Process Name: w3wp

Process ID: 7720

 

This error appears because I had set up two web site bindings in IIS for the web application: one was http://intranet and the other one was http://intranet.kompas-xnet.si.

As it turns out, SharePoint 2013 is not particularly happy about that.

 

As the error description says, I tried to set up multipleSiteBindings in the web.config of the web application. You can find more details here:

Supporting Multiple IIS Site Bindings
http://msdn.microsoft.com/en-us/library/ee358763.aspx

I modified the web.config as follows:

<system.serviceModel>

<serviceHostingEnvironment aspNetCompatibilityEnabled="true" multipleSiteBindingsEnabled="true"/>

</system.serviceModel>

 

With multipleSiteBindingsEnabled set to true, SharePoint Designer worked without any issues. But what I found was that the event logs on the web servers started filling up with another error:

System.ServiceModel 4.0.0.0 EventID: 3

WebHost failed to process a request.

Sender Information: System.ServiceModel.ServiceHostingEnvironment+HostingManager/62297830

Exception: System.ServiceModel.ServiceActivationException: The service '/_vti_bin/client.svc' cannot be activated due to an exception during compilation. The exception message is: When 'system.serviceModel/serviceHostingEnvironment/multipleSiteBindingsEnabled' is set to true in configuration, the endpoints are required to specify a relative address. If you are specifying a relative listen URI on the endpoint, then the address can be absolute. To fix this problem, specify a relative uri for endpoint 'http://intranet.kompas-xnet.si/_vti_bin/client.svc'.. ---> System.InvalidOperationException: When 'system.serviceModel/serviceHostingEnvironment/multipleSiteBindingsEnabled' is set to true in configuration, the endpoints are required to specify a relative address. If you are specifying a relative listen URI on the endpoint, then the address can be absolute. To fix this problem, specify a relative uri for endpoint 'http://intranet.kompas-xnet.si/_vti_bin/client.svc'.

at System.ServiceModel.Activation.ApplyHostConfigurationBehavior.ThrowIfAbsolute(Uri uri)

at System.ServiceModel.Activation.ApplyHostConfigurationBehavior.FailActivationIfEndpointsHaveAbsoluteAddress(ServiceHostBase service)

at System.ServiceModel.Description.DispatcherBuilder.ValidateDescription(ServiceDescription description, ServiceHostBase serviceHost)

at System.ServiceModel.Description.DispatcherBuilder.InitializeServiceHost(ServiceDescription description, ServiceHostBase serviceHost)

at System.ServiceModel.ServiceHostBase.InitializeRuntime()

at System.ServiceModel.ServiceHostBase.OnOpen(TimeSpan timeout)

at Microsoft.SharePoint.Client.Services.MultipleBaseAddressWebServiceHost.OnOpen(TimeSpan timeout)

at System.ServiceModel.Channels.CommunicationObject.Open(TimeSpan timeout)

at System.ServiceModel.ServiceHostingEnvironment.HostingManager.ActivateService(ServiceActivationInfo serviceActivationInfo, EventTraceActivity eventTraceActivity)

at System.ServiceModel.ServiceHostingEnvironment.HostingManager.EnsureServiceAvailable(String normalizedVirtualPath, EventTraceActivity eventTraceActivity)

--- End of inner exception stack trace ---

at System.ServiceModel.ServiceHostingEnvironment.HostingManager.EnsureServiceAvailable(String normalizedVirtualPath, EventTraceActivity eventTraceActivity)

at System.ServiceModel.ServiceHostingEnvironment.EnsureServiceAvailableFast(String relativeVirtualPath, EventTraceActivity eventTraceActivity)

Process Name: w3wp

Process ID: 8160

 

So, in the end I just removed the multipleSiteBindingsEnabled setting from web.config and removed the second web site binding from IIS. Now everything works as expected.

Hope it helps someone investigating SharePoint Designer issues.

 

Enjoy

Robi Vončina



About the author

Rok Bermež is a Slovenian Windows Azure MVP. He works as a software engineer and Microsoft Certified Trainer at Kompas Xnet. His primary interests include cloud computing and web development. With extensive experience in development and architecture, he has participated in many projects, and since the CTP release of Windows Azure many of those projects have been based on the Windows Azure platform. Rok has been delivering courses, writing code and speaking at conferences and community events on Microsoft technologies for the last couple of years. You can also find him on Twitter (@Rok_B).
