X Blogs

who you gonna call?

Microsoft Cloud Training

February 17, 2012 23:01 by author joze

Dear Microsoft student,

Expand your existing skills and acquire new skills on Microsoft’s cloud technologies including: Microsoft Office 365, Microsoft Exchange Online, Windows Azure, Windows Intune, Microsoft Hyper-V Server, Microsoft SharePoint Online, Microsoft Dynamics CRM Online, Microsoft System Center 2012 and SQL Azure.

Microsoft has made available over 30 learning resources to enable you to explore these technologies, including eBooks, e-learning clinics, short videos (a.k.a. learning snacks), and classroom training courses.

Many of these valuable resources are free. To name a few:

· Understanding Microsoft Virtualization Solutions (eBook)

· Introduction to SQL Server 2012 (eBook)

· Microsoft® Office 365: Connect and Collaborate Anywhere, Anytime (eBook)

· Introducing Hyper-V in Windows Server 2008 R2 (learning snack)

· SQL Server 2012: Cloud on Your Terms (learning snack)

· Introduction to Microsoft Windows Azure Platform (learning snack)

Microsoft’s cloud-based technologies are relevant to specific job roles. Start here: Microsoft Cloud Training

To understand more, check out the Microsoft Learning Cloud Brochure.

Thank you, and good luck!

Using Windows Azure Connect

February 16, 2012 03:25 by author Rok Bermež

Windows Azure Connect enables Windows Azure users to set up secure, IP-level network connectivity between their Windows Azure hosted services and local (on-premises) resources.

To set it up, you must first connect to the Windows Azure Management portal and enable it for your subscription.

Next you need to get the activation token.

Then open the ServiceDefinition.csdef file and import the Connect module for your roles.

   <Import moduleName="Connect" />
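
For context, this Import element lives in the role's Imports section of ServiceDefinition.csdef; a minimal sketch (the role name is a placeholder):

   <WebRole name="WebRole1">
     <Imports>
       <Import moduleName="Connect" />
     </Imports>
   </WebRole>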

And set the token in the service configuration file, ServiceConfiguration.cscfg:

<Setting name="Microsoft.WindowsAzure.Plugins.Connect.ActivationToken" value="your_activation_token_guid" />
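
This setting belongs under the role's ConfigurationSettings element in ServiceConfiguration.cscfg; a minimal sketch (the role name is a placeholder, other elements of the file are omitted):

   <Role name="WebRole1">
     <ConfigurationSettings>
       <Setting name="Microsoft.WindowsAzure.Plugins.Connect.ActivationToken" value="your_activation_token_guid" />
     </ConfigurationSettings>
   </Role>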

To gain access to local resources, you need to install the Windows Azure Connect Endpoint software, which you can get at https://waconnecttokenpage.cloudapp.net/Default.aspx?token=yourtoken

After Connect Endpoint is installed, it will automatically “activate” itself with the Connect service which should take around 10 to 30 seconds. Once a local machine is activated, it will appear in the Virtual Network of the Management Portal when you select the “Activated Endpoints” node or the “Groups and Roles” node.

Now you can define your network connectivity policy in the Virtual Networks section of the Management Portal.

If the “Interconnected” check box is checked, machines that belong to the group will be able to communicate with each other via Connect; if it is unchecked, they will not.

You can't ping your roles in the cloud because their local firewall prevents it, but there is a fix for that: just add a startup task that adds the firewall rule.

Echo Enable ICMP
netsh advfirewall firewall add rule name="ICMPv6 echo" dir=in action=allow enable=yes protocol=icmpv6:128,any 
exit /b 0
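
For this to run at role start, the script has to ship with the role and be registered as a startup task in ServiceDefinition.csdef; a minimal sketch, assuming the lines above are saved as EnableICMP.cmd in the role project:

   <Startup>
     <Task commandLine="EnableICMP.cmd" executionContext="elevated" taskType="simple" />
   </Startup>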

Connect will automatically track changes made to your Windows Azure role and maintain connectivity. If you increase the number of Windows Azure role instances, Connect will automatically connect those new role instances based on the current network policy. The really bad side of it: when you redeploy the app, you will have to add your new deployment to your network policy manually, since this is currently not available in the management API.


Reduced Pricing on SQL Azure and New 100MB Database Option

February 15, 2012 03:09 by author Rok Bermež

Great news for SQL Azure subscribers! Last Tuesday Microsoft announced significant changes to the pricing structure, resulting in 48% to 75% savings for databases larger than 1GB. Also, the new 100MB database option enables customers to get started with this cloud-based relational DB engine at half the previous price. Find out more here: http://blogs.msdn.com/b/windowsazure/archive/2012/02/14/announcing-reduced-pricing-on-sql-azure-and-new-100mb-database-option.aspx

Windows Azure Compute monitoring & diagnostics

January 11, 2012 03:00 by author Rok Bermež

Windows Azure Diagnostics enables you to collect diagnostic data from an application running in Windows Azure. You can use diagnostic data for debugging and troubleshooting, measuring performance, monitoring resource usage, traffic analysis and capacity planning, and auditing. After the diagnostic data is collected it can be transferred to a Windows Azure storage account for persistence. Transfers can either be scheduled or on-demand.

You can use the Windows Azure Diagnostics plugin for that. Just enable it for your roles in your service definition file:

      <Import moduleName="Diagnostics" />

And configure an Azure storage account as the external storage for the data in the configuration file:

      <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="DefaultEndpointsProtocol=https;AccountName=your_account;AccountKey=your_key" />

To see the logs you can use a very nice tool from Cerebrata called Azure Diagnostics Manager, which can be obtained here: http://www.cerebrata.com/Products/AzureDiagnosticsManager/Default.aspx.

There are a lot more options you can use; you can see most of them here: http://www.windowsazure.com/en-us/develop/net/common-tasks/diagnostics/

But in short, should you choose to do it in the config file (failed request tracing is configured under system.webServer in the web role's web.config), you can do something like this:

<tracing>
  <traceFailedRequests>
    <add path="*">
      <traceAreas>
        <add provider="ASP" verbosity="Verbose" />
        <add provider="ASPNET" areas="Infrastructure,Module,Page,AppServices" verbosity="Verbose" />
        <add provider="ISAPI Extension" verbosity="Verbose" />
        <add provider="WWW Server"
             areas="Authentication,Security,Filter,StaticFile,CGI,Compression,Cache,RequestNotifications,Module"
             verbosity="Verbose" />
      </traceAreas>
      <failureDefinitions statusCodes="400-599" />
    </add>
  </traceFailedRequests>
</tracing>

Or you can do it in code by doing something like this:

// Needs references to Microsoft.WindowsAzure.Diagnostics and Microsoft.WindowsAzure.ServiceRuntime
public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        var diagConfig = DiagnosticMonitor.GetDefaultInitialConfiguration();

        // Sample CPU usage every 5 seconds
        var procTimeConfig = new PerformanceCounterConfiguration();
        procTimeConfig.CounterSpecifier = @"\Processor(*)\% Processor Time";
        procTimeConfig.SampleRate = System.TimeSpan.FromSeconds(5.0);
        diagConfig.PerformanceCounters.DataSources.Add(procTimeConfig);

        // Sample disk throughput every 5 seconds
        var diskBytesConfig = new PerformanceCounterConfiguration();
        diskBytesConfig.CounterSpecifier = @"\LogicalDisk(*)\Disk Bytes/sec";
        diskBytesConfig.SampleRate = System.TimeSpan.FromSeconds(5.0);
        diagConfig.PerformanceCounters.DataSources.Add(diskBytesConfig);

        // Sample the role process's working set every 5 seconds
        var workingSetConfig = new PerformanceCounterConfiguration();
        workingSetConfig.CounterSpecifier =
            @"\Process(" +
            System.Diagnostics.Process.GetCurrentProcess().ProcessName +
            @")\Working Set";
        workingSetConfig.SampleRate = System.TimeSpan.FromSeconds(5.0);
        diagConfig.PerformanceCounters.DataSources.Add(workingSetConfig);

        // Transfer the collected counters to storage every 30 seconds
        diagConfig.PerformanceCounters.ScheduledTransferPeriod =
            TimeSpan.FromSeconds(30);

        // Also collect and transfer the System and Application event logs
        diagConfig.WindowsEventLog.DataSources.Add("System!*");
        diagConfig.WindowsEventLog.DataSources.Add("Application!*");
        diagConfig.WindowsEventLog.ScheduledTransferPeriod =
            TimeSpan.FromSeconds(30);

        // Start the diagnostic monitor with this configuration
        DiagnosticMonitor.Start(
            "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString",
            diagConfig);

        return base.OnStart();
    }
}

If you change the configuration, make sure you restart the diagnostic monitor with:

DiagnosticMonitor.Start(
    "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString",
    diagConfig);

That would be it for now, but feel free to contact me if you have any more questions.

Display members of AD group in SharePoint 2010 - Updated

January 3, 2012 22:19 by author Robi


I updated the Display members of AD group in SharePoint 2010 solution. The solution now has a new look and, as one of my friends, Christian Stahl, suggested, I added links to members and groups that display detailed info about the user.

If you use SharePoint Server 2010 and you have My Sites configured, you will be redirected to the My Site host for details about a user.

If the user has permission to see members of a SharePoint group, the group displays a link to the group membership page, or group settings if you will. You can see the link in the bottom part of the picture.

AD groups now have a new icon, which you can see next to the »CORP\sg_finance« AD group.

When you click on the group link you are redirected to a People and Groups page:


Users that are members of the site directly, not through AD groups, also have links to a details view. If you use SharePoint Foundation 2010, you are redirected to the »Users« list; if you use My Sites, you are redirected to the user's My Site.

Installation instructions:

Install solution

If you want to install the new version of the solution, download the wsp file from CodePlex:


On your SharePoint Server run SharePoint 2010 Management Shell as Administrator and then run the command:

Add-SPSolution -LiteralPath C:\Users\sp_farm\Desktop\Skupine_Web_Part.wsp

To deploy the solution, go to Central Administration, System Settings, Manage Farm Solutions, click »Skupine_Web_Part.wsp« and click Deploy Solution.

You can also deploy the solution with PowerShell by running the following command:

Install-SPSolution -Identity skupine_web_part.wsp -GACDeployment -AllWebApplications

Activate solution

What you need to do next is go to the site collection where you want to use this solution and activate the »Skupine Web part« solution on the »Site collection features« page.
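
Alternatively, you can activate the feature from the SharePoint 2010 Management Shell; a sketch, where the feature name is a placeholder (check the actual name with Get-SPFeature):

Enable-SPFeature -Identity "SkupineWebPart" -Url http://yoursitecollection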

Add web part to a page

To insert the web part into your page, click the Edit page button and then Add a Web Part.

In the »KompasXnet« category, you will find the »Xnet Display Group Members« web part.

Click Add.

When your web part is on the page, it is going to be empty. From the web part menu, select »Edit web part«.

Enter a user name in the format domain\username. This can be any domain user, without any special privileges.

Enter the password for this user.

Enter domain information for querying your domain. If you have a special users container, you can specify just the Organizational Unit.

For example, if you would like to query just the OU »Zaposleni«, you can enter:

OU=Zaposleni,DC=corp,DC=xnet,DC=si

Enter the fully qualified domain name (FQDN) of the domain controller the query will be sent to, for example:

xnet-dc01.corp.xnet.si
Click OK.

Click Stop Editing on the Ribbon on the Page tab.

When you click Stop Editing, the Web part will still be empty. What you need to do is click Browse on the Ribbon and navigate to the page where the Web part is.

Hope you find the improved version of my solution useful. If you have any comments or suggestions, please send me an email.


Display members of AD group in SharePoint 2010

December 10, 2011 02:16 by author Robi


Because I have been asked so many times whether you can show members of AD groups on a SharePoint site, I decided to build a web part that would replace the Site Users web part.


The usage is pretty simple: what you need to do is deploy the wsp package that you can find inside the bin\release folder.


The next step is to add the web part to your site; then you need to enter some information to access Active Directory.


You need to enter:

  • a user name to connect to AD, in the format:
    • domain\username
  • domain information for the LDAP query, in the format:
    • DC=corp,DC=xnet,DC=si
  • the FQDN of the domain controller to which the LDAP query will be sent:
    • xnet-dc01.corp.xnet.si

Hopefully someone will find this web part useful.

You can find solution attached in a zip file.


For comments and suggestions, please contact me.




There were some bug fixes, so I published a new version of the web part.


Skupine Web Part_1.2.zip (219.37 kb)

Automatic SQL Azure backup part 1

November 18, 2011 23:24 by author Rok Bermež

Backup for SQL Azure has been one of the most voted-on features since the beginning. Sure, we had the SQL Migration Wizard, BCP, SSIS, PowerShell cmdlets from Cerebrata, and later a very nice tool from RedGate (that I still use a lot), SQL Azure Backup. All of them have one flaw: they are either impossible or very hard to use for automatic backups that require no on-premises infrastructure.
For a while now, Import and Export CTP functionality has been available through the Windows Azure Management Portal, which exports or imports a SQL DACPAC package. This is exactly the functionality I want to integrate into my cloud applications that use SQL Azure.
The MSDN documentation for that REST API seems to be completely lacking, but fortunately there is a SQL DAC Examples project on CodePlex that we can use to see how it's done.
First, we add a service reference to http://dacdc.cloudapp.net/DACWebService.svc/mex and generate the required proxy classes.
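
If you prefer generating the proxy outside Visual Studio, svcutil against the same MEX endpoint should do the job; the output file name here is just an example:

svcutil.exe http://dacdc.cloudapp.net/DACWebService.svc/mex /out:DacServiceProxy.cs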

Now we can make WebRequests for specific actions (import, export, status) to URLs that are specific to each Windows Azure datacenter. Here is the mapping:

North Central US https://ch1prod-dacsvc.azure.com/DACWebService.svc
South Central US https://sn1prod-dacsvc.azure.com/DACWebService.svc
North Europe https://db3prod-dacsvc.azure.com/DACWebService.svc
West Europe https://am1prod-dacsvc.azure.com/DACWebService.svc
East Asia https://hkgprod-dacsvc.azure.com/DACWebService.svc
Southeast Asia https://sg1prod-dacsvc.azure.com/DACWebService.svc



To create a backup to storage, we need to create an object of type ExportInput and POST it to the datacenter URL + “/Export”.


// Needs System.Net, System.Runtime.Serialization and the Microsoft.WindowsAzure StorageClient assembly
public Guid ExportToStorage(string storageContainer, string fileName, BlobContainerPublicAccessType blobContainerPublicAccessType = BlobContainerPublicAccessType.Off)
{
    var blobCredentials = GetBlobCredentials(storageContainer, fileName, true, blobContainerPublicAccessType);
    var exportRequest = new ExportInput
    {
        BlobCredentials = blobCredentials,
        ConnectionInfo = _connectionInfo
    };
    var req = GetRequest(new Uri(_dataCenterUrl + "/Export"), RequestMethod.POST);
    var serializer = new DataContractSerializer(typeof(ExportInput));
    var requestStream = req.GetRequestStream();
    serializer.WriteObject(requestStream, exportRequest);
    var resp = req.GetResponse();
    return GetGuidFromResponse(resp);
}

private BlobStorageAccessKeyCredentials GetBlobCredentials(string storageContainer, string fileName, bool createIfNotExist = false, BlobContainerPublicAccessType blobContainerPublicAccessType = BlobContainerPublicAccessType.Off)
{
    var storageCredentials = new StorageCredentialsAccountAndKey(_storageConnectionInfo.AccountName, _storageConnectionInfo.AccountKey);
    var storageAccount = new CloudStorageAccount(storageCredentials, _storageConnectionInfo.UseHttps);
    var cloudBlobClient = storageAccount.CreateCloudBlobClient();
    var cloudBlobContainer = cloudBlobClient.GetContainerReference(storageContainer);
    if (createIfNotExist)
    {
        if (cloudBlobContainer.CreateIfNotExist())
        {
            // Apply the requested public access level to the newly created container
            var permissions = cloudBlobContainer.GetPermissions();
            permissions.PublicAccess = blobContainerPublicAccessType;
            cloudBlobContainer.SetPermissions(permissions);
        }
    }
    var cloudBlob = cloudBlobContainer.GetBlobReference(fileName);
    var backupBlobUri = cloudBlob.Uri.ToString();
    var blobCredentials = new BlobStorageAccessKeyCredentials
    {
        StorageAccessKey = _storageConnectionInfo.AccountKey,
        Uri = backupBlobUri,
    };
    return blobCredentials;
}

private HttpWebRequest GetRequest(Uri uri, RequestMethod requestMethod)
{
    var req = (HttpWebRequest)WebRequest.Create(uri);
    req.Method = requestMethod.ToString().ToUpper();
    req.ContentType = "application/xml";
    return req;
}


For import the process is very similar: we create an object of type ImportInput and POST it to the datacenter URL + “/Import”.


public Guid ImportFromStorage(string storageContainer, string fileName, SqlAzureEdition sqlAzureEdition = SqlAzureEdition.Web, SqlAzureSize sqlAzureSize = SqlAzureSize.GB_1, string newDbName = null)
{
    var blobCredentials = GetBlobCredentials(storageContainer, fileName);
    var importRequest = new ImportInput();
    importRequest.BlobCredentials = blobCredentials;
    importRequest.AzureEdition = sqlAzureEdition.ToString().ToLower();
    importRequest.DatabaseSizeInGB = (int)sqlAzureSize;
    // Import into the original database unless a new database name is given
    importRequest.ConnectionInfo = String.IsNullOrEmpty(newDbName)
        ? _connectionInfo
        : new ConnectionInfo { DatabaseName = newDbName, ServerName = _connectionInfo.ServerName, UserName = _connectionInfo.UserName, Password = _connectionInfo.Password };
    var req = GetRequest(new Uri(_dataCenterUrl + "/Import"), RequestMethod.POST);
    var serializer = new DataContractSerializer(typeof(ImportInput));
    var requestStream = req.GetRequestStream();
    serializer.WriteObject(requestStream, importRequest);
    var resp = req.GetResponse();
    return GetGuidFromResponse(resp);
}


Both actions return a GUID representing the request, which we can use to check whether the operation was successful. We do this by making a GET request to the datacenter URL + “/Status?servername={0}&username={1}&password={2}&reqId={3}”. If we want the history of requests and their statuses for this datacenter, we can make the same request without reqId.


public StatusInfo GetStatusInfo(Guid requestId)
{
    var uri = _dataCenterUrl + string.Format(
        "/Status?servername={0}&username={1}&password={2}&reqId={3}",
        _connectionInfo.ServerName, _connectionInfo.UserName, _connectionInfo.Password, requestId);
    var req = GetRequest(new Uri(uri), RequestMethod.GET);
    var response = req.GetResponse();
    var statusInfos = GetStatusInfoFromResponse(response);
    return statusInfos[0];
}
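
Putting it together, here is a minimal sketch of how the library could be driven from a worker role: start an export, then poll the status until it reaches a terminal state. The DacClient class name, its constructor, and the “Completed”/“Failed” status strings are assumptions for illustration, not part of a documented API.

    // Hypothetical wrapper around the methods shown above
    var client = new DacClient(dataCenterUrl, connectionInfo, storageConnectionInfo);
    var requestId = client.ExportToStorage("backups", "mydb-" + DateTime.UtcNow.ToString("yyyyMMddHHmm") + ".bacpac");
    while (true)
    {
        var status = client.GetStatusInfo(requestId);
        if (status.Status == "Completed" || status.Status == "Failed") // assumed status values
            break;
        System.Threading.Thread.Sleep(TimeSpan.FromSeconds(30));
    }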


Here ( SqlAzure.rar (22,71 kb) ) you can download a complete library that you can use in your Azure worker tasks to automatically back up your SQL Azure databases. Next: how to create a scheduler that uses it.


Script for exporting and importing all webs in a site collection

November 15, 2011 20:27 by author Robi

I needed to export and import all webs in a site collection to a new site collection, which was based on a different language template.

In order to automate the process I, of course, used PowerShell. While trying to figure out the best way to name the exported webs, I found this article on the TechNet Forums:



This led me to write the following script for exporting the webs:


#Get all webs in site collection

$webs=Get-SPSite http://razvoj/sites/ang |Get-SPWeb -limit all


#Export each web; the file name encodes the web URL,
#e.g. http://razvoj/sites/ang/subweb becomes razvoj-sites-ang-subweb.cmp
foreach($web in $webs){$filename=$web.url.replace("http://","").replace("-","--").replace("/","-")+".cmp"; Export-SPWeb $web -Path c:\export\$filename -IncludeVersions all -IncludeUserSecurity -Verbose}


When you create a new site collection (on a team site template, for example), there are some lists and libraries that I found best to delete, to avoid errors when importing the webs.

#Delete lists and libraries on the new site collection to avoid errors when importing
#(the new site collection's root web URL is an assumption here - adjust to your environment)
$seznami=(Get-SPWeb http://razvoj/sites/slo).Lists
#Iterate backwards, because deleting from the live collection shifts the indexes
for($i=$seznami.Count-1;$i -ge 0;$i--){$seznami.delete($seznami[$i].id)}

The next step is to get all files in the c:\export folder that have the ».cmp« extension. For this I wrote a script:

#Get all files in folder with .cmp extension
$list=Get-ChildItem c:\export | where {$_.extension -eq ".cmp"}

And the final step is to import all webs into the new site collection:

#Import all files and use file name as reference to where web should be created
foreach($listitem in $list){$importweb=$listitem.name.replace("razvoj-","http://razvoj/").replace("-","/").replace("ang","slo").replace(".cmp",""); import-spweb $importweb -UpdateVersions overwrite -Path c:\export\$listitem -Force -IncludeUserSecurity -Verbose}

Hope this helps.

Robi Vončina

Windows Azure toolkit for Windows 8

September 16, 2011 03:20 by author Rok Bermež

Maximize the potential of your Windows Metro style app with Windows Azure. Building a cloud service to support rich Windows Metro style apps is even easier with the Windows Azure Toolkit for Windows 8.

This toolkit is designed to accelerate development so that developers can start enabling Windows 8 features, such as notifications, for their apps with minimal time and experience. Use this toolkit to start building and customizing your own service to deliver rich Metro style apps.

Learn more: http://blogs.msdn.com/b/hkamel/archive/2012/03/11/windows-azure-toolkit-for-windows-8.aspx

Multiple Websites on one WebRole

August 29, 2011 02:54 by author Rok Bermež

With Windows Azure SDK 1.3, you can run multiple Web Sites in a single Web role. Prior to Windows Azure SDK 1.3, each Web role ran a single Web application. This constraint was largely because Web roles were hosted in IIS Hosted Web Core where a single application was bound to a single HTTP/HTTPS endpoint. Now, Windows Azure supports full IIS capabilities allowing Web roles to support multiple Web sites and Web applications.

By using the Sites element within the service definition file, ServiceDefinition.csdef, you can configure your web role to support multiple web sites and web applications. This is accomplished using the sites, applications, and virtual directories features in Internet Information Services (IIS) 7.0. These features enable each web role in Windows Azure to support configuration for multiple web sites and applications.

Take a look at the sample ServiceDefinition.csdef:
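
A minimal sketch of such a definition (site names, the physical directory, host headers, and endpoint names are placeholders):

<ServiceDefinition name="MyService" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="WebRole1">
    <Sites>
      <Site name="Web">
        <Bindings>
          <Binding name="HttpIn" endpointName="HttpIn" hostHeader="www.contoso.com" />
        </Bindings>
      </Site>
      <Site name="Web2" physicalDirectory="..\WebSite2">
        <Bindings>
          <Binding name="HttpIn" endpointName="HttpIn" hostHeader="www.fabrikam.com" />
        </Bindings>
      </Site>
    </Sites>
    <Endpoints>
      <InputEndpoint name="HttpIn" protocol="http" port="80" />
    </Endpoints>
  </WebRole>
</ServiceDefinition>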

As you can see, all you have to do is set the paths to the other web applications you want to co-host in your role and set their host headers and endpoints. CSPack will do the rest :)

About the author

Rok Bermež is a Slovenian Windows Azure MVP. He works as a Software Engineer and Microsoft Certified Trainer at Kompas Xnet. His primary interests include cloud computing and web development. With extensive experience in development and architecture, he has participated in many projects, and since the CTP release of Windows Azure many of those projects have been based on the Windows Azure platform. Rok has been delivering courses, writing code, and speaking at conferences and community events on Microsoft technologies for the last couple of years. You can also find him on Twitter (@Rok_B).
