X Blogs

who you gonna call?

Get data from Folder in Power Query

February 10, 2015 22:26 by author Gašper

It seems that lately Excel users fall into two groups: those who get the new Excel (2010 or 2013) but still use the application as if it were Excel 2003, and those who may have heard of the whole Power BI thing but think the Power BI story can only benefit certain people. That is not true. Power BI is, all in all, a set of brilliant commands. This post will show you only one of them, the Get Data From Folder command in Power Query, and how it can make your life easier. Take the time to read this post and think about how to implement it in your work. Trust me, you will not regret it! Here's what this post will talk about.

If you get new data periodically in any form (txt, csv, xlsx…), then you know that every time it arrives you have to go through the same process to get it into Excel. That process will now be reduced to copying the file into a certain folder and clicking Refresh in Excel. It doesn't get much better than that. But first, let's explain what Power Query is.

Power Query is an Excel Add-In that was created to help you get data into Excel or Power Pivot. It's part of Microsoft Power BI and is even incorporated into the new Microsoft Power BI Designer. Everything you read about in this post refers to Excel 2010 and Excel 2013 for which Power Query is available. And now for the Get data from folder command.

When I first found this command, my enthusiasm for Excel doubled overnight (and that is saying a lot). Here's where it can really help you. Working with Excel will sooner or later get you into a position where you are repeating one and the same actions periodically. In the beginning that is fine, but in time you would much rather be creating great dashboards and charts than repeating the same imports and copying over and over again. Before Power Query was "born", the only tool at your disposal for easing that process was VBA. You could write a macro that did the import and accounted for all the specifics in your data. But now things have gotten even better. What follows is a nice example of that.

Let's say that every month we get a new txt file that is just an export of data that we want in Excel so we can analyze it. This is what those txt files look like.

So every month we have to remove the top six rows, import the columns correctly and add them to the previously imported data and analysis. Here's how you do it with the Get Data From Folder command in Power Query.

On the Power Query tab select From File and then From Folder. This will open a Choose a folder dialog box, where you can choose any folder you like (even folders on SharePoint).

Once you choose the folder, you get what will soon become your favorite window in Excel. It's the Query Editor Window. And here the fun begins.

The above picture also shows, magnified, the two buttons we will use. One hides the dropdown menu of the most used commands in Power Query, and we will use many of them during this process. The other is even more important at this first stage: the double down arrow in the Content column.

The reason we need this button is that at this point, we are seeing the list of files in the folder but not the content of those files. But pressing this button will show exactly that.

Here's what we get after pressing it.

And now the fun begins. We will do some data transforming by mostly using the dropdown menu button we talked about before.

First off, we will delete the top six rows. So from the dropdown menu we choose Remove Top Rows…

As the three dots at the end of the command suggest, you get a dialog box, and all it needs from you is the number of top rows you wish to remove (in our case that is 6).

After that we will choose a simple Use First Row As Headers command.

Now, if you can imagine, this took care of the first six rows of the first txt file. But the same six rows from the other txt files in that folder are still there, appended below. So the next step must be removing those. We will be using the filter in the Name column to do that. First, a little trick: at the beginning the filter only shows a limited number of records, and you must use the Load more command shown below.

Now we remove the blanks (since all data rows contain a name, the blanks must be from the six rows at the beginning of each txt file)…

…and then remove the rows containing "Name" (these are the header rows of the other txt files).

Now that we have the data we need, we just have to tell Excel what that data is, so we set the data types of the columns: select a column, go to Home/Data Type and choose the appropriate data type.

With that done, we attend to the specifics in our data. In the City column, a question mark has replaced the apostrophe so we should revert those changes. We will use the Transform/Replace Values command.

The dialog box will remind you of a classic Find and Replace Window.

And this is what you get

Now we will use the Close & Load command.

OK, so far we have done everything that we normally do every time we get a txt file. More or less the same amount of work, so the question is: what have we gained, and what is so special about this?

First and foremost, at this point we have imported four txt files at once and now have 4000 rows. But suppose we just got a new txt file with new data.

All we need to do is to copy that file to the same folder as the others.

And just right-click the data in Excel and choose Refresh.

And right away we get 1000 new rows from the new txt.

So Power Query will take all the files in that folder and repeat all the steps that we defined before. Brilliant. But there's more. What if we get a txt file that needs a new rule or a change in the old set of rules? All we need to do is right-click the query on the right and choose Edit, and we are back in the Query Editor, where on the right we can delete or modify any of the steps we defined in the original query (each step is one line of the M query sketched below).
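As an aside, the Query Editor records every one of these steps in Power Query's M language, which you can inspect via the Advanced Editor. Here is a rough sketch of what a query like ours could look like – the folder path, delimiter and step names are illustrative, not copied from the screenshots:

let
    // Hypothetical folder – substitute the one you picked in the dialog
    Source = Folder.Files("C:\MonthlyExports"),
    // The double down arrow combines the Content column of all files
    Combined = Binary.Combine(Source[Content]),
    Imported = Csv.Document(Combined, [Delimiter = "#(tab)"]),
    // Remove Top Rows… with 6
    TopRemoved = Table.Skip(Imported, 6),
    // Use First Row As Headers
    Promoted = Table.PromoteHeaders(TopRemoved),
    // Filter the Name column: drop the blanks and the repeated header rows
    Filtered = Table.SelectRows(Promoted, each [Name] <> null and [Name] <> "Name"),
    // Revert the question marks in the City column back to apostrophes
    Replaced = Table.ReplaceValue(Filtered, "?", "'", Replacer.ReplaceText, {"City"})
in
    Replaced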

 

Pure Brilliance. But if the new txt file were totally different from the previous ones, you could define a new query for it and then use the Append command to add the new data to our old query.

Now do you believe me that this is truly a game changer? Just imagine: you do the hard work only once, and then it's just a matter of refreshing your data. And that is why it's called POWER BI and POWER QUERY :)

This is a reposting of the original post from Excel Unplugged which can be found here.

 



Create site collection in target content database

August 23, 2014 20:02 by author Robi

I get a lot of questions from my customers about how to create a site collection in a specific content database. In Central Administration there is no OOTB option to create a site collection and specify which content database it should be created in.

That is the reason I created a script that opens a form where you can select the web application, managed path, content database, language and template for a site collection.

You must enter data as shown in this screenshot.

Mandatory fields are:

  1. Managed path
  2. Content database
  3. Primary site collection admin
  4. Site collection title
  5. Site collection URL – note that this is just the URL after the managed path
  6. Language
  7. Web template – templates are loaded based on language and compatibility level, excluding hidden ones.

***Note

This script supports creating site collections only under wildcard managed paths.
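At its core, the script relies on the New-SPSite cmdlet, which accepts a -ContentDatabase parameter. Here is a minimal sketch of the idea (the URLs, database, account and template names below are illustrative, not taken from the downloadable script):

# List templates for the chosen language and compatibility level (15 = SharePoint 2013), skipping hidden ones
Get-SPWebTemplate -CompatibilityLevel 15 | Where-Object { $_.Lcid -eq 1033 -and -not $_.IsHidden }

# Create the site collection directly in the chosen content database
New-SPSite -Url "http://portal.domena1.loc/sites/hr" `
    -ContentDatabase "WSS_Content_HR" `
    -OwnerAlias "DOMENA1\sp_admin" `
    -Name "HR" -Template "STS#0" -Language 1033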

You can download the script here:

 

And I published the script to Office downloads and scripts as well.

 

Hope you like it.

CreateSiteCollectionInContentDb.ps1 (16.02 kb)



Sorting months chronologically and not alphabetically in a Pivot Table report based on Power Pivot data

May 13, 2014 18:41 by author Gašper

Here is our problem. When you create a Pivot Table in Excel, you can group a date field by month and the sort will be logical (January, February, …). But when you create a Pivot Table based on a Power Pivot table, the grouping does not work! So you have to get to the month names by a different route. We do this with the Format function in Power Pivot, but the problem is that when you put this field in a Pivot Table, it gets sorted alphabetically. That is logical, since the values are text and have nothing to do with dates as far as the Pivot Table is concerned, but it is a problem, since the months are not sorted chronologically. This article will show you how to fix that.

Let's start with a simple table in Excel that has only two columns. One has Date values and the other has a number of visitors on that date. Now we would like to create a Pivot Table report to see how the number of visitors is spread through the months.

Case 1: Pivot Table report based on an Excel Table

First we create a Pivot Table based on an Excel Table

The Pivot Table will show the number of visitors by month. But since we only have dates, we first have to group the dates by months.

And right away we get the desired result.

Case 2: Pivot Table report based on Power Pivot data.

First we add our Table data to Power Pivot the easiest way – by using the Add to Data Model command on the PowerPivot tab.

Now that we have the data in Power Pivot, we can create a Pivot Table report from the Power Pivot window. But when we create a Pivot Table and want to see the analysis by months, we find we just can't select the Group command. It is grayed out…

So to get to the months we use a different trick: we go back to the Power Pivot window and create a calculated column using the Format function. This is the Power Pivot rendition of the Text function from Excel. The syntax is the same; just the names differ.

Excel version: =TEXT(A2, "mmmm")

Power Pivot version: =FORMAT([Date], "MMMM")

(Here A2 stands for a cell holding a date, and [Date] for the date column in the Power Pivot table.)

Using the Format function we now get the month names and a new field to create a Pivot Report by. But when we create it, it looks quite disappointing.

So the numbers are OK, but the sorting is alphabetical and not the kind we want. To get the sorting right, we have to go back to the Power Pivot window and create a new calculated column using the Month function (=MONTH([Date])). This way we get a month number alongside each date and month name.

Now, just adding that column to the Pivot Table report would get rid of our problem, but let's not forget that we want the month names as they were; only the sorting is wrong. And now we have all we need.

In the Power Pivot window, we select a value in the month name column and then select the Sort by Column command on the Home tab and, hey, look at that: you can now say that the Month Name column will be sorted by the Month No. column.

Doing that changes our Pivot Table report instantly.

And we are one step closer to eternal happiness.

 

The most recent version of this post can be found here.



Windows Azure Mobile Services news

April 1, 2014 15:56 by author Rok Bermež

 

Windows Azure Mobile Services recently gained support for .NET and ASP.NET Web API. This combination has made building cloud-hosted mobile backends that much easier.

We can start simply by going to the Azure management portal and creating a new Mobile Service, choosing .NET as the language.

Once the service is created, we land on a handy start page, where a click on the 'download' button gets us a ready-made project based on the Web API template, with a few additional NuGet packages.

If we open the default TodoItemController, we can see how the built-in TableController<T> .NET class is used, which makes it easy to serve data to mobile applications.

Local debugging and development are also supported, and once we are happy with our service, we simply publish it to its cloud location using its 'publish' profile.



The People picker and domain trusts

January 31, 2014 06:58 by author Robi

In the last couple of SharePoint projects I was working with environments where the company was associated with a couple of domains. Since SharePoint has to be configured to work in such environments, I decided to introduce the issue with two typical scenarios and explain how a SharePoint site must be set up so that the people picker can resolve users from the other domains.

Scenario 1 – "Two way trust"

The characteristic of a "Two Way Trust" is that users from domain 1 have rights to access resources in domain 2 and vice versa. This means that users from domain 1 can read data in the Active Directory of domain 2, and users from domain 2 can read data from AD in domain 1. This is very important information, since the configuration of SharePoint Server depends on this characteristic.

Figure 1 shows the scenario where a two-way trust is set up between domain 1 and domain 2. SharePoint Server resides in domain 1 and is used by users from both domains. The problem, as far as the people picker is concerned, is that by default the people picker searches only the Active Directory domain in which SharePoint Server is installed. So if we want to add users from other domains, we need to adjust some settings on the SharePoint Server.

The settings that must be changed in this case apply to the entire web application. Of course, these settings cannot be changed in the user interface, so we need to open the SharePoint 2013 Management Shell and enter a couple of lines of code.

$wa = Get-SPWebApplication http://portal.domena1.loc

# List the current values for the web application
$wa.PeoplePickerSettings.SearchActiveDirectoryDomains

<# ==================================
   Domain 1
   ================================== #>
$newdomain1 = New-Object Microsoft.SharePoint.Administration.SPPeoplePickerSearchActiveDirectoryDomain

# Register the FQDN
$newdomain1.DomainName = 'domena1.loc'

# Register the NetBIOS name
$newdomain1.ShortDomainName = 'domena1'
$wa.PeoplePickerSettings.SearchActiveDirectoryDomains.Add($newdomain1)

<# ==================================
   Domain 2
   ================================== #>
$newdomain2 = New-Object Microsoft.SharePoint.Administration.SPPeoplePickerSearchActiveDirectoryDomain

# Register the FQDN
$newdomain2.DomainName = 'domena2.loc'

# Register the NetBIOS name
$newdomain2.ShortDomainName = 'domena2'
$wa.PeoplePickerSettings.SearchActiveDirectoryDomains.Add($newdomain2)

# Save the changes
$wa.Update()

To configure the people picker, we first need to save the web application object in a variable. Its current values can be listed with the command $wa.PeoplePickerSettings.SearchActiveDirectoryDomains.

To add new domains to search, we have to create a new object of type Microsoft.SharePoint.Administration.SPPeoplePickerSearchActiveDirectoryDomain and then set its values: the first is the FQDN of the domain and the second is the NetBIOS name, as a short domain name. The same procedure must be repeated for every domain that you want to allow in the people picker.

At the end we have to, of course, save the changes stored in the web application object. We have to repeat the procedure on every web application on which we want to change the behavior of the people picker, including Central Administration.
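If you manage several web applications, a short loop can apply the same settings everywhere in one go; a sketch, reusing the two-way-trust settings from above:

# Sketch: register the extra domain on every web application, including Central Administration
Get-SPWebApplication -IncludeCentralAdministration | ForEach-Object {
    $d = New-Object Microsoft.SharePoint.Administration.SPPeoplePickerSearchActiveDirectoryDomain
    $d.DomainName = 'domena2.loc'
    $d.ShortDomainName = 'domena2'
    $_.PeoplePickerSettings.SearchActiveDirectoryDomains.Add($d)
    $_.Update()
}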

Scenario 2 – "One way trust"

A "One way trust" is a way of linking domains where one domain trusts another. For this example I used a one-way trust between domain 1 and domain 2, where domain 1 trusts domain 2, while domain 2 does not trust domain 1. To make it simple, we could say that users from domain 2 can access data in domain 1, while domain 1 users cannot access data in domain 2. Such a scenario is widely used in intranet and extranet environments, where users from the intranet domain can access the extranet domain, while the reverse is not possible.

   

In a one-way-trust scenario we can use the same script that we used with the bidirectional trust, with the difference that we need to provide parameters for access: the username and password of a user who is able to access resources in domain 2.

 

# Create the app password (credential key); run on every server in the farm
stsadm -o setapppassword -password 'Pa$$w0rd'

# ========================================================
$wa = Get-SPWebApplication http://portal.domena1.loc

# List the current values for the web application
$wa.PeoplePickerSettings.SearchActiveDirectoryDomains

<# ==================================
   Domain 1
   ================================== #>
$newdomain1 = New-Object Microsoft.SharePoint.Administration.SPPeoplePickerSearchActiveDirectoryDomain

# Register the FQDN
$newdomain1.DomainName = 'domena1.loc'

# Register the NetBIOS name
$newdomain1.ShortDomainName = 'domena1'
$wa.PeoplePickerSettings.SearchActiveDirectoryDomains.Add($newdomain1)

<# ==================================
   Domain 2
   ================================== #>
$newdomain2 = New-Object Microsoft.SharePoint.Administration.SPPeoplePickerSearchActiveDirectoryDomain

# Set the account that has rights to read from domain 2
$user = "domena2\accountPravice"
$pass = ConvertTo-SecureString 'Pa$$w0rd' -AsPlainText -Force
$credentials = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $user, $pass
$newdomain2.LoginName = $credentials.UserName
$newdomain2.SetPassword($credentials.Password)

# Register the FQDN
$newdomain2.DomainName = 'domena2.loc'

# Register the NetBIOS name
$newdomain2.ShortDomainName = 'domena2'
$wa.PeoplePickerSettings.SearchActiveDirectoryDomains.Add($newdomain2)

# Save the changes
$wa.Update()

 

Because in this scenario the application pool account does not have rights to access resources in domain 2, we need to set the username and password used to access Active Directory objects in domain 2.

The first step is to create the app password for storing credentials, on all servers in our farm. The command stsadm -o setapppassword -password 'Pa$$w0rd' creates a registry key from which the app pool account can read the credential key:

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Shared Tools\Web Server Extensions\15.0\Secure

The next step is to create a connection for each domain separately and also set up a user account for connecting to the specified domain.

For this purpose you can use the account that synchronizes data between SharePoint Server and Active Directory, provided that the farm runs the User Profile Synchronization Service. Otherwise, you can create a normal service account that will be used exclusively for reading data from the specified domain.

The last step is to give the application pool account permission to read the AppCredentialKey registry key. Access to it is limited to certain local server groups, and if the permissions are not set up correctly, the event logs fill up with "Registry access is not allowed" errors and users do not get resolved in the people picker. The users and groups that have access to the registry key by default are:

  • System
  • Network Service
  • WSS_Restricted_WPG_V4
  • Administrators

To grant the application pool access to the AppCredentialKey, you can add the service account to the WSS_Restricted_WPG_V4 group (see the sketch after this list). The accounts that must be in this group are:

  • Application pool account for the web application
  • The Sandbox service account
  • The service account of the Central Administration – provided that it is not in the Administrators group.
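A quick way to add an account to that group on each server in the farm is the classic net localgroup command from an elevated prompt (the account name here is illustrative):

# Add the web application's app pool account to the local group; repeat on every server
net localgroup WSS_Restricted_WPG_V4 domena1\sp_apppool /add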

SharePoint configuration can be quite challenging in some scenarios, so I hope this post will help SharePoint admins ease their work and enhance their understanding of SharePoint. This configuration also applies to SharePoint 2010.

 

Rob Vončina

SharePoint Server MVP



Windows Azure Active Directory

January 23, 2014 00:45 by author Rok Bermež

Windows Azure Active Directory (WAAD) is a comprehensive identity and access management solution hosted in the cloud. It combines core directory services, advanced identity management, security and application access control, and offers developers an approachable platform for providing access control to their applications based on centralized policies and rules.

We can use it for:

  • Managing user accounts through the Windows Azure management portal, in the same place where we manage user access to Windows Azure and other Microsoft online services, such as Microsoft Office 365 and a great number of other, non-Microsoft SaaS applications your organization may already be using.

  • Extending your on-premises Active Directory into the cloud, so that users can authenticate to applications running in the cloud the same way they do inside the company. User data can be synchronized into WAAD automatically using the free DirSync tool; authentication is performed either through federation or password synchronization.

  • A complete single sign-on experience across all Microsoft online services and hundreds of popular non-Microsoft applications. End users can quickly and efficiently launch their applications from a personalized web Access Panel (http://technet.microsoft.com/en-us/library/dn308586.aspx).

  • Enabling Multi-Factor Authentication for Windows Azure AD users, which significantly improves protection for hundreds of cloud services and applications. Convenient verification options include mobile apps, phone calls and SMS messages.

  • Giving developers an effective way to integrate identity management into their applications, with a centralized mechanism for authentication and authorization, using identities hosted in WAAD or social logins such as Microsoft, Facebook, Yahoo! or Google accounts. The Graph API also makes it possible to query the data stored in the directory.

Windows Azure Active Directory Premium (Preview)

For companies with more demanding needs there is Windows Azure Active Directory Premium, currently in preview, which extends the feature set further. In its first phase, Windows Azure Active Directory Premium offers:

  • Self-service password reset for users
  • Group-based provisioning and management for SaaS applications
  • Improved branding
  • Security reports

The Windows Azure Active Directory Premium service will continue to grow to address new identity and access management requirements in the cloud era.



Export – import alerts

November 10, 2013 03:23 by author Robi

In my previous blog post I explained how you can troubleshoot alerts. In this one I'm just going to post the scripts I used for exporting and importing all alerts in a site collection.

Here is the script for exporting all alerts in a site collection to a csv file:

$site = Get-SPSite "http://2013portal"
$alertResultsCollection = @()

foreach ($web in $site.AllWebs) {
    foreach ($alert in $web.Alerts) {
        $alertURL = $web.Url + "/" + $alert.ListUrl
        # Collect the properties needed to recreate the alert later
        $alertResult = New-Object PSObject
        $alertResult | Add-Member -Type NoteProperty -Name "WebUrl" -Value $web.Url
        $alertResult | Add-Member -Type NoteProperty -Name "AlertURL" -Value $alertURL
        $alertResult | Add-Member -Type NoteProperty -Name "AlertTitle" -Value $alert.Title
        $alertResult | Add-Member -Type NoteProperty -Name "ListUrl" -Value $alert.ListUrl
        $alertResult | Add-Member -Type NoteProperty -Name "List" -Value $alert.List
        $alertResult | Add-Member -Type NoteProperty -Name "DeliveryChannel" -Value $alert.DeliveryChannels
        $alertResult | Add-Member -Type NoteProperty -Name "AlertType" -Value $alert.AlertType
        $alertResult | Add-Member -Type NoteProperty -Name "EventType" -Value $alert.EventType
        $alertResult | Add-Member -Type NoteProperty -Name "Frequency" -Value $alert.AlertFrequency
        $alertResult | Add-Member -Type NoteProperty -Name "AlertTime" -Value $alert.AlertTime
        $alertResult | Add-Member -Type NoteProperty -Name "SubscribedUser" -Value $alert.User
        $alertResultsCollection += $alertResult
    }
}

$site.Dispose()
$alertResultsCollection

# Export to CSV
$alertResultsCollection | Export-Csv C:\Users\sp2013_farm_admin\Desktop\Alerts.csv

 

And here is the script you can use to import all alerts in one site collection from a csv file:

 

Import-Csv C:\Users\sp2013_farm_admin\Desktop\Alerts.csv | ForEach-Object {
    $webUrl = $_.WebUrl
    $listTitle = $_.List
    $alertTitle = $_.AlertTitle
    $subscribedUser = $_.SubscribedUser
    $alertType = $_.AlertType
    $deliveryChannel = $_.DeliveryChannel
    $eventType = $_.EventType
    $frequency = $_.Frequency

    # Recreate the alert on the target web for the subscribed user
    $web = Get-SPWeb $webUrl
    $list = $web.Lists.TryGetList($listTitle)
    $user = $web.EnsureUser($subscribedUser)
    $newAlert = $user.Alerts.Add()
    $newAlert.Title = $alertTitle
    $newAlert.AlertType = [Microsoft.SharePoint.SPAlertType]::$alertType
    $newAlert.List = $list
    $newAlert.DeliveryChannels = [Microsoft.SharePoint.SPAlertDeliveryChannels]::$deliveryChannel
    $newAlert.EventType = [Microsoft.SharePoint.SPEventType]::$eventType
    $newAlert.AlertFrequency = [Microsoft.SharePoint.SPAlertFrequency]::$frequency

    # Daily and weekly alerts also need an alert time
    if ($frequency -ne "Immediate") {
        $newAlert.AlertTime = $_.AlertTime
    }

    $newAlert.Update()
}
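To sanity-check the import, you can list the alerts that now exist on a web; a quick sketch:

# Quick verification: list the alerts on the target web
$web = Get-SPWeb "http://2013portal"
$web.Alerts | Select-Object Title, AlertType, EventType, AlertFrequency, User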

 

 

Hope it helps.

Robi Vončina



"Alert Me" feature and SharePoint 2013

October 31, 2013 22:58 by author Robi

 

In the previous few projects we had some issues involving alerts after upgrading or migrating to SharePoint 2013, so I wanted to explain how you can help yourself out when troubleshooting alerts in SP2013.

I describe here a few of the most common issues we were faced with.

1. The "Alert Me" Feature is not available

If you happen to be in a document library and you do not find the "Alert Me" command, the most likely problem is that you have not set up Outgoing E-Mail Settings in Central Administration. Outgoing email can be found under System Settings > Configure Outgoing E-Mail Settings, or at the URL http://[CA_URL]/_admin/globalemailconfig.aspx.

Another option is, of course, to use PowerShell. The script with which you can configure outgoing email is as follows:

 

# ============ Set variables ============
$SMTPServer = "devmail"
$emailAddress = "SP2013@dev.local"
$replyToEmail = "robi@dev.local"

# ============ Outgoing email ============
$loadasm = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
$spGlobalAdmin = New-Object Microsoft.SharePoint.Administration.SPGlobalAdmin
$spGlobalAdmin.UpdateMailSettings($SMTPServer, $emailAddress, $replyToEmail, 65001)

   

The result of our settings is then visible in Central Administration, and the Alert Me button appears on the ribbon in lists and libraries:

2. You are not able to create alerts for email-enabled domain security groups or distribution groups

In SharePoint 2013 we came across a very interesting case where you are not able to subscribe a domain security group to alerts through the user interface. In a document library or list you click the Alert Me button and a dialog box opens where you can enter the user you would like to subscribe to alerts. In previous versions of SharePoint, names of distribution groups or email-enabled security groups got resolved, while in the new version domain groups do not work. You get "No matching results".

If you want to create alerts for your domain groups, you must use PowerShell:

 

 

<# ================================================
   Options for EventType:
     SPEventType [Add, Modify, Delete, Discussion, All]

   Options for the alert frequency – for Daily and
   Weekly you must also set the alert time:
     SPAlertFrequency [Immediate, Daily, Weekly]
   ================================================ #>

$web = Get-SPWeb "http://2013portal"
$list = $web.Lists.TryGetList("Documents")
$user = $web.EnsureUser('DEV\skupinaemail')
$newAlert = $user.Alerts.Add()
$newAlert.Title = $list.Title
$newAlert.AlertType = [Microsoft.SharePoint.SPAlertType]::List
$newAlert.List = $list
$newAlert.DeliveryChannels = [Microsoft.SharePoint.SPAlertDeliveryChannels]::Email
$newAlert.EventType = [Microsoft.SharePoint.SPEventType]::All
$newAlert.AlertFrequency = [Microsoft.SharePoint.SPAlertFrequency]::Daily
$newAlert.AlertTime = "12:00"
$newAlert.Update()

 

 

To verify that the alerts were successfully set up, open Site Settings / User alerts.

For the purposes of testing and displaying alerts, I created another alert for the same group; the only difference is that in this case I used the Immediate frequency.

3. The user has set an alert on item adding in a document library, but an immediate alert is not sent

For this example it is necessary to understand how alerts work. Every SharePoint content database has a table called EventCache. This is the table where SharePoint writes events that are important to the search service application, alerts and so on; in short, the table serves as a temporary repository of current events, which is then used for subsequent operations. In the case of immediate alerts, all subsequent operations are taken care of by a timer job which can be found in Central Administration and is called Immediate Alerts, or you can find it with a simple PowerShell query:

Get-SPTimerJob | ? {$_.Name -like "*alert*"}

   

As we can see from the results of the command, the timer job runs every 5 minutes. The timer job queries the EventCache table in the content databases. If the table contains records for sending alerts, the alert email is sent and the record gets deleted from the EventCache table.

Warning! Running queries directly against a SharePoint database is not supported.

To see what EventCache table contains, you can run this SQL Query against your Content DB:

select * from [SP13_Content_2013Portal].[dbo].[EventCache] (nolock) order by [TimeLastModified] desc

   

So if I make a change to a document in the document library and run the query, I get the following record as a result:

The query result can also be the first verification that alerts work properly. For subsequent verification of alerts, we have to use ULS Log Viewer and set the logging level for alerts to Verbose in Central Administration. Alerts logging can be found in the SharePoint Foundation category.
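You can also flip the logging level from PowerShell; a sketch, assuming the category is named "Alerts" under the SharePoint Foundation area:

# Turn alerts tracing up to Verbose…
Set-SPLogLevel -Identity "SharePoint Foundation:Alerts" -TraceSeverity Verbose

# …and put it back to the defaults once you are done troubleshooting
Clear-SPLogLevel -Identity "SharePoint Foundation:Alerts"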

In ULS Log Viewer you have the option of filtering the ULS logs. The easiest way to get all records related to alerts is to set the filter as shown in the image:

If you want to see the records in the ULS logs and do not want to wait 5 minutes, you must manually trigger the Immediate Alerts timer job.

In my case I'm exploring what's happening with the alerts that user Uroš is subscribed to. I uploaded a new document to the document library and started the Immediate Alerts timer job with PowerShell:

Get-SPTimerJob | ? {$_.name -like "*alert*"} |Start-SPTimerJob

   

In ULS logs you can now see records of Alerts operations:

In this case I can see that alerts work as expected. But what happens if we have a library with the "Require documents to be checked out before they can be edited" setting enabled?

We repeat the process: upload a document to the document library and watch in the ULS logs what's going on with the alerts. The following entries appear:

Voila, I can see that the alerts are being processed correctly. However, because loading a document into the document library is a two-step process (upload, then entering metadata), it can be concluded that while being uploaded the document is treated as checked out. A record is written to the EventCache table in which the state of the document is registered as checked out.

With that being said, we can now conclude that when "Require documents to be checked out before they can be edited" is enabled on a document library and a user is subscribed to alerts on added documents, alerts are not going to be sent out, as SharePoint treats these documents as checked out, that is, as a first minor version.

 

In this article I tried to describe some typical errors which may occur in SharePoint 2013 in conjunction with the Alert Me feature and, what's more important, to demonstrate how to debug and solve them.

After the troubleshooting, don't forget to set the logging level for the Alerts back to the default settings.

   

Robi Vončina
SharePoint Server MVP

   



Windows Azure Storage Tables - Part Two

September 13, 2013 15:54 by author Rok Bermež

In the previous article we gave an introduction to NoSQL data in the cloud and Windows Azure Storage; this time we continue by exploring the Tables aspect of that service.

There are many data platforms these days, each with its own strengths and weaknesses. Many of them follow the NoSQL concept, which means the RDBMS (relational database management system) model is not used. In short, there are no tables and SQL statements, but other data structures instead, usually massive collections of key/value pairs or associative arrays. Popular choices today are MongoDB, Cassandra, HBase, CouchDB, Neo4j and Windows Azure Tables, the last of which we will focus on this time.

Despite great differences, SQL and NoSQL databases have one thing in common: these technologies can be offered as a service in the cloud, freeing developers from manually provisioning and de-provisioning database servers. Windows Azure Tables, too, is offered as a service, so we developers don't have to think of it as some separate physical server.

One of the more important characteristics of Windows Azure Tables is that storage is available in three geographically distributed regions: the US, Europe and Asia. Every Microsoft datacenter complies with the International Organization for Standardization (ISO) 27001, SSAE 16 ISAE 3402, EU Model Clauses and Health Insurance Portability and Accountability Act (HIPAA) business associate agreement (BAA) standards. Another important characteristic is geo-redundancy, which lets you store data in a second datacenter in the same region, adding another level of safety in case of natural disasters.

WAS (Windows Azure Storage) performance and capacity are tied to the accounts used to manage it; a single account includes up to 200 TB of storage space. Windows Azure Tables are optimized to provide incredibly fast query performance under a heavy write load. You can read more about this at bit.ly/cMAWsZ.

WAS analytics are also available, which let us trace storage requests, analyze usage trends, optimize data access patterns and log accesses on a storage account (more at bit.ly/XGLtGt).

The most concise way to express the value of Windows Azure Tables is that they support NoSQL key/value queries, even under a heavy write load. From a developer's standpoint, Windows Azure Tables are meant for storing large collections of non-uniform objects or for serving pages on high-traffic websites. Data stored in Windows Azure Tables can be accessed from almost anywhere.

The entire storage system is based on REST (Representational State Transfer), which means that any client capable of HTTP communication can work with the WAS system. The REST API also supports all the operations needed for working with the data.

In the next issue we will see how this works in practice.

 

 



Upgrade Video

August 27, 2013 19:34 by author Robi

For the Ukrainian SharePoint Conference I recorded this video to present at my session. I've decided to make it public now, so please sit back, relax and enjoy it.

   

Upgrading to SharePoint 2013 Video



About the author

Rok Bermež is a Slovenian Windows Azure MVP. He works as a software engineer and Microsoft Certified Trainer at Kompas Xnet. His primary interests include cloud computing and web development. With extensive experience in development and architecture, he has participated in many projects, and since the CTP release of Windows Azure many of those projects have been based on the Windows Azure platform. Rok has been delivering courses, writing code and speaking at conferences and community events on Microsoft technologies for the last couple of years. You can also find him on Twitter (@Rok_B).
