X Blogs

who you gonna call?

Reduced Pricing on SQL Azure and New 100MB Database Option

February 15, 2012 03:09 by author Rok Bermež

Great news for SQL Azure subscribers! Last Tuesday Microsoft announced significant changes to the pricing structure, resulting in 48% to 75% savings for databases larger than 1GB. In addition, the new 100MB database option lets customers get started with this cloud-based relational database engine at half the previous price. Find out more here: http://blogs.msdn.com/b/windowsazure/archive/2012/02/14/announcing-reduced-pricing-on-sql-azure-and-new-100mb-database-option.aspx



Windows Azure Compute monitoring & diagnostics

January 11, 2012 03:00 by author Rok Bermež

Windows Azure Diagnostics enables you to collect diagnostic data from an application running in Windows Azure. You can use diagnostic data for debugging and troubleshooting, measuring performance, monitoring resource usage, traffic analysis, capacity planning, and auditing. After the diagnostic data is collected, it can be transferred to a Windows Azure storage account for persistence. Transfers can either be scheduled or on-demand (a sketch of an on-demand transfer appears at the end of this post).

You can use the Windows Azure Diagnostics plugin for that. Just enable it for your roles in the service definition file:

    <Imports>
      <Import moduleName="Diagnostics" />
    </Imports>

And configure an Azure storage account as the external storage for the data in the service configuration file:

    <ConfigurationSettings>
      <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="DefaultEndpointsProtocol=https;AccountName=youraccount;AccountKey=yourkey" />
    </ConfigurationSettings>

To view the logs you can use a very nice tool from Cerebrata called Azure Diagnostics Manager, which can be obtained here: http://www.cerebrata.com/Products/AzureDiagnosticsManager/Default.aspx.

There are many more options available; you can see most of them here: http://www.windowsazure.com/en-us/develop/net/common-tasks/diagnostics/

In short, should you choose to configure it in the config file, you can do something like this:

    <tracing>
      <traceFailedRequests>
        <add path="*">
          <traceAreas>
            <add provider="ASP" verbosity="Verbose" />
            <add provider="ASPNET" areas="Infrastructure,Module,Page,AppServices" verbosity="Verbose" />
            <add provider="ISAPI Extension" verbosity="Verbose" />
            <add provider="WWW Server"
                 areas="Authentication,Security,Filter,StaticFile,CGI,Compression,Cache,RequestNotifications,Module"
                 verbosity="Verbose" />
          </traceAreas>
          <failureDefinitions statusCodes="400-599" />
        </add>
      </traceFailedRequests>
    </tracing>

Or you can do it in code by doing something like this:

using System;
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // Start from the default diagnostics configuration
        var diagConfig = DiagnosticMonitor.GetDefaultInitialConfiguration();

        // Sample overall CPU usage every 5 seconds
        var procTimeConfig = new PerformanceCounterConfiguration();
        procTimeConfig.CounterSpecifier = @"\Processor(*)\% Processor Time";
        procTimeConfig.SampleRate = TimeSpan.FromSeconds(5.0);
        diagConfig.PerformanceCounters.DataSources.Add(procTimeConfig);

        // Sample disk throughput every 5 seconds
        var diskBytesConfig = new PerformanceCounterConfiguration();
        diskBytesConfig.CounterSpecifier = @"\LogicalDisk(*)\Disk Bytes/sec";
        diskBytesConfig.SampleRate = TimeSpan.FromSeconds(5.0);
        diagConfig.PerformanceCounters.DataSources.Add(diskBytesConfig);

        // Sample the working set of the current process every 5 seconds
        var workingSetConfig = new PerformanceCounterConfiguration();
        workingSetConfig.CounterSpecifier =
            @"\Process(" +
            System.Diagnostics.Process.GetCurrentProcess().ProcessName +
            @")\Working Set";
        workingSetConfig.SampleRate = TimeSpan.FromSeconds(5.0);
        diagConfig.PerformanceCounters.DataSources.Add(workingSetConfig);

        // Push the collected counters to storage every 30 seconds
        diagConfig.PerformanceCounters.ScheduledTransferPeriod =
            TimeSpan.FromSeconds(30);

        // Also collect the System and Application event logs
        diagConfig.WindowsEventLog.DataSources.Add("System!*");
        diagConfig.WindowsEventLog.DataSources.Add("Application!*");
        diagConfig.WindowsEventLog.ScheduledTransferPeriod =
            TimeSpan.FromSeconds(30);

        // Start the monitor with the storage account from the plugin setting
        DiagnosticMonitor.Start(
            "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString",
            diagConfig);

        return base.OnStart();
    }
}

If you change the configuration later, make sure you restart the diagnostic monitor with:

DiagnosticMonitor.Start(
    "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString",
    diagConfig);
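
An on-demand transfer can also be requested from outside the role. A minimal sketch, assuming the Microsoft.WindowsAzure.Diagnostics.Management API from the 1.x SDK (the storage account, deployment ID, role and instance names below are placeholders):

    using System;
    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.Diagnostics.Management;

    // Ask a specific role instance to push its buffered trace logs to storage now
    var storageAccount = CloudStorageAccount.Parse(
        "DefaultEndpointsProtocol=https;AccountName=youraccount;AccountKey=yourkey");
    var deploymentDiagnostics = new DeploymentDiagnosticManager(storageAccount, "deploymentId");
    var instanceDiagnostics =
        deploymentDiagnostics.GetRoleInstanceDiagnosticManager("WebRole1", "WebRole1_IN_0");

    var options = new OnDemandTransferOptions
    {
        From = DateTime.UtcNow.AddHours(-1), // last hour of buffered data
        To = DateTime.UtcNow
    };
    Guid transferId = instanceDiagnostics.BeginOnDemandTransfer(DataBufferName.Logs, options);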

That would be it for now, but feel free to contact me if you have any more questions.



Display members of AD group in SharePoint 2010 - Updated

January 3, 2012 22:19 by author Robi

Hi,

I have updated the Display members of AD groups in SharePoint 2010 solution. The solution now has a new look and, as my friend Christian Stahl suggested, I added links to members of groups and to the groups themselves, to display detailed info about a user.

If you use SharePoint Server 2010 and have My Sites configured, you will be redirected to the My Site host for details about a user.

If the user has permission to see members of a SharePoint group, the group displays a link to the group membership page (or group settings, if you will). You can see the link in the bottom part of the picture.

AD groups now have a new icon, which you can see next to the »CORP\sg_finance« AD group.

When you click on the group link, you are redirected to the People and Groups page:

 

Users that are members of the site (rather than AD groups) also have links to a details view. If you use SharePoint Foundation 2010 you are redirected to the »Users« list; if you use My Sites you are redirected to the user's My Site.

Installation instructions:

Install solution

If you want to install the new version of the solution, download the wsp file from CodePlex:

http://sp2010adgroupmembers.codeplex.com/releases/view/79845

On your SharePoint Server run SharePoint 2010 Management Shell as Administrator and then run the command:

Add-SPSolution -LiteralPath C:\Users\sp_farm\Desktop\Skupine_Web_Part.wsp

To deploy the solution, go to Central Administration, System Settings, Manage Farm Solutions, click »Skupine_Web_part.wsp« and click Deploy Solution.

You can also deploy the solution with PowerShell by running the following command:

Install-SPSolution -Identity skupine_web_part.wsp -GACDeployment -AllWebApplications

Activate solution

Next, go to the site collection where you want to use this solution and activate the »Skupine Web part« feature on the »Site collection features« page:
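
Alternatively, a sketch of the PowerShell equivalent (the feature name below is a guess; use Get-SPFeature to find the actual one):

    # Find the feature installed by the solution (the display name is a guess)
    Get-SPFeature | Where-Object { $_.DisplayName -like "*Skupine*" }

    # Activate it on the site collection
    Enable-SPFeature -Identity "SkupineWebPart" -Url http://yourserver/sites/yoursite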

Add web part to a page

To insert the web part into your page, click the Edit page button and then Add a Web Part.

In the »KompasXnet« category, you will find the »Xnet Display Group Members« web part.

Click Add.

Once the web part is on the page, it is going to be empty. From the web part menu, select »Edit web part«.

Enter a user name, in the format domain\username. This can be any domain user, without any special privileges.

Enter the password for this user.

Enter the domain information for querying your domain. If you have a special users container, you can specify just the Organizational Unit.

For example, if you would like to query just the OU »Zaposleni« you can enter:

 

OU=Zaposleni,DC=CORP,DC=XNET,DC=SI

Enter the fully qualified domain name of the domain controller the query will be sent to:

»Xnet-DC01.corp.xnet.si«

Click OK.

Click Stop Editing on the Ribbon on the Page tab.

When you click Stop Editing, the web part will still be empty. Click Browse on the Ribbon and navigate back to the page where the web part is.

Hope you find the improved version of my solution useful. If you have any comments or suggestions, please send me an email at:

robi@kompas-xnet.si



Display members of AD group in SharePoint 2010

December 10, 2011 02:16 by author Robi

Hi,

Because I have been asked so many times whether you can show members of AD groups on a SharePoint site, I decided to build a web part that would replace the Site Users web part.

 

The usage is pretty simple: deploy the wsp package that you can find inside the bin\release folder.

 

The next step is to add the web part to your site and enter some information required to access Active Directory.

 

You need to enter (a sketch of the underlying query follows the list):

  • a user name to connect to AD, in the format:
    • domain\username
  • domain information for the LDAP query, in the format:
    • DC=corp,DC=xnet,DC=si
  • the FQDN of the domain controller to which the LDAP query will be sent:
    • xnet-dc01.corp.xnet.si
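
Under the hood, these settings drive a directory lookup. A minimal sketch of that kind of query using System.DirectoryServices (the credentials, group name and paths are placeholders, not the web part's actual code):

    using System;
    using System.DirectoryServices;

    class AdGroupLister
    {
        static void Main()
        {
            // Hypothetical values matching the web part settings above
            var entry = new DirectoryEntry(
                "LDAP://xnet-dc01.corp.xnet.si/DC=corp,DC=xnet,DC=si",
                @"CORP\someuser", "password");

            using (var searcher = new DirectorySearcher(entry))
            {
                // Find the group, then list the distinguished names of its members
                searcher.Filter = "(&(objectCategory=group)(cn=sg_finance))";
                SearchResult group = searcher.FindOne();
                if (group != null)
                {
                    foreach (object member in group.Properties["member"])
                        Console.WriteLine(member);
                }
            }
        }
    }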

Hopefully someone will find this web part useful.

You can find the solution attached in a zip file.

 

For comments and suggestions please contact me at:

robi@kompas-xnet.si

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

 

There were some bug fixes, so I published a new version of the Web Part.

 

Skupine Web Part_1.2.zip (219.37 kb)



Automatic SQL Azure backup part 1

November 18, 2011 23:24 by author Rok Bermež
Backup for SQL Azure has been one of the most voted-on features since the beginning. Sure, we had the SQL Migration Wizard, BCP, SSIS, PowerShell cmdlets from Cerebrata, and later a very nice tool from RedGate (that I still use a lot) - SQL Azure Backup. All of them share one flaw: they are either impossible or very hard to use for automatic backups that require no on-premises infrastructure.
For a while now, Import and Export CTP functionality has been available through the Windows Azure management portal, exporting or importing a SQL DacPac package. This is exactly the functionality that should be integrated into my cloud applications that use SQL Azure.
The MSDN documentation for that REST API seems to be completely lacking, but fortunately there is a SQL DAC Examples project on CodePlex that we can use to see how it's done.
First, we add a service reference to http://dacdc.cloudapp.net/DACWebService.svc/mex and generate the required proxy classes.

Now we can make WebRequests for specific actions (import, export, status) to URLs that are specific to each Windows Azure datacenter. Here is the mapping:

Region            URL
North Central US  https://ch1prod-dacsvc.azure.com/DACWebService.svc
South Central US  https://sn1prod-dacsvc.azure.com/DACWebService.svc
North Europe      https://db3prod-dacsvc.azure.com/DACWebService.svc
West Europe       https://am1prod-dacsvc.azure.com/DACWebService.svc
East Asia         https://hkgprod-dacsvc.azure.com/DACWebService.svc
Southeast Asia    https://sg1prod-dacsvc.azure.com/DACWebService.svc
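
If you need to pick the endpoint in code, a simple lookup that mirrors the table (a convenience sketch, not part of the DAC API):

    using System.Collections.Generic;

    // DAC service endpoints per datacenter, straight from the table above
    static class DacEndpoints
    {
        public static readonly Dictionary<string, string> ByRegion =
            new Dictionary<string, string>
            {
                { "North Central US", "https://ch1prod-dacsvc.azure.com/DACWebService.svc" },
                { "South Central US", "https://sn1prod-dacsvc.azure.com/DACWebService.svc" },
                { "North Europe",     "https://db3prod-dacsvc.azure.com/DACWebService.svc" },
                { "West Europe",      "https://am1prod-dacsvc.azure.com/DACWebService.svc" },
                { "East Asia",        "https://hkgprod-dacsvc.azure.com/DACWebService.svc" },
                { "Southeast Asia",   "https://sg1prod-dacsvc.azure.com/DACWebService.svc" },
            };
        // usage: var dataCenterUrl = DacEndpoints.ByRegion["West Europe"];
    }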

 

Export

To create a backup in storage, we create an object of type ExportInput and POST it to the datacenter URL + "/Export".

 

public Guid ExportToStorage(string storageContainer, string fileName,
    BlobContainerPublicAccessType blobContainerPublicAccessType = BlobContainerPublicAccessType.Off)
{
    // Build credentials pointing at the target blob, creating the container if needed
    var blobCredentials = GetBlobCredentials(storageContainer, fileName, true, blobContainerPublicAccessType);
    var exportRequest = new ExportInput
    {
        BlobCredentials = blobCredentials,
        ConnectionInfo = _connectionInfo
    };

    // POST the serialized ExportInput to the datacenter's /Export endpoint
    var req = GetRequest(new Uri(_dataCenterUrl + "/Export"), RequestMethod.POST);
    var serializer = new DataContractSerializer(typeof(ExportInput));
    var requestStream = req.GetRequestStream();
    serializer.WriteObject(requestStream, exportRequest);
    requestStream.Close();

    var resp = req.GetResponse();
    return GetGuidFromResponse(resp);
}

private BlobStorageAccessKeyCredentials GetBlobCredentials(string storageContainer, string fileName,
    bool createIfNotExist = false,
    BlobContainerPublicAccessType blobContainerPublicAccessType = BlobContainerPublicAccessType.Off)
{
    var storageCredentials = new StorageCredentialsAccountAndKey(_storageConnectionInfo.AccountName, _storageConnectionInfo.AccountKey);
    var storageAccount = new CloudStorageAccount(storageCredentials, _storageConnectionInfo.UseHttps);
    var cloudBlobClient = storageAccount.CreateCloudBlobClient();
    var cloudBlobContainer = cloudBlobClient.GetContainerReference(storageContainer);
    if (createIfNotExist && cloudBlobContainer.CreateIfNotExist())
    {
        // Only set permissions when we created the container ourselves
        var permissions = cloudBlobContainer.GetPermissions();
        permissions.PublicAccess = blobContainerPublicAccessType;
        cloudBlobContainer.SetPermissions(permissions);
    }

    var cloudBlob = cloudBlobContainer.GetBlobReference(fileName);
    return new BlobStorageAccessKeyCredentials
    {
        StorageAccessKey = _storageConnectionInfo.AccountKey,
        Uri = cloudBlob.Uri.ToString(),
    };
}

private HttpWebRequest GetRequest(Uri uri, RequestMethod requestMethod)
{
    var req = (HttpWebRequest)WebRequest.Create(uri);
    req.Method = requestMethod.ToString().ToUpper();
    req.ContentType = "application/xml";
    return req;
}

Import

For import the process is very similar: we create an object of type ImportInput and POST it to the datacenter URL + "/Import".

 

public Guid ImportFromStorage(string storageContainer, string fileName,
    SqlAzureEdition sqlAzureEdition = SqlAzureEdition.Web,
    SqlAzureSize sqlAzureSize = SqlAzureSize.GB_1, string newDbName = null)
{
    var importRequest = new ImportInput
    {
        BlobCredentials = GetBlobCredentials(storageContainer, fileName),
        AzureEdition = sqlAzureEdition.ToString().ToLower(),
        DatabaseSizeInGB = (int)sqlAzureSize,
        // Import into the original database unless a new name is given
        ConnectionInfo = String.IsNullOrEmpty(newDbName)
            ? _connectionInfo
            : new ConnectionInfo
            {
                DatabaseName = newDbName,
                ServerName = _connectionInfo.ServerName,
                UserName = _connectionInfo.UserName,
                Password = _connectionInfo.Password
            }
    };

    // POST the serialized ImportInput to the datacenter's /Import endpoint
    var req = GetRequest(new Uri(_dataCenterUrl + "/Import"), RequestMethod.POST);
    var serializer = new DataContractSerializer(typeof(ImportInput));
    var requestStream = req.GetRequestStream();
    serializer.WriteObject(requestStream, importRequest);
    requestStream.Close();

    var resp = req.GetResponse();
    return GetGuidFromResponse(resp);
}

Status

Both actions return a GUID representing the submitted operation, which we can use to check whether it completed successfully. We do this by making a GET request to the datacenter URL + "/Status?servername={0}&username={1}&password={2}&reqId={3}". If we want the history of operations and their statuses for the datacenter, we can make the same request without reqId.

 

public StatusInfo GetStatusInfo(Guid requestId)
{
    // GET the status of a single request; omit reqId to get the full history
    string uri = _dataCenterUrl + string.Format(
        "/Status?servername={0}&username={1}&password={2}&reqId={3}",
        _connectionInfo.ServerName, _connectionInfo.UserName,
        _connectionInfo.Password, requestId);
    var req = GetRequest(new Uri(uri), RequestMethod.GET);
    var response = req.GetResponse();
    var statusInfos = GetStatusInfoFromResponse(response);
    return statusInfos[0];
}
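
Putting it together, a worker role could start an export and poll until it finishes. A rough sketch, assuming the helper class from the snippets above is instantiated as dac, and that StatusInfo.Status carries the status strings used by the CodePlex sample ("Completed", "Failed"):

    // Start an export and wait for the DAC service to finish.
    // 'dac' is an instance of the helper class shown above; the container
    // and file names are placeholders.
    Guid requestId = dac.ExportToStorage("backups", "mydb.bacpac");

    StatusInfo status;
    do
    {
        System.Threading.Thread.Sleep(TimeSpan.FromSeconds(30)); // poll politely
        status = dac.GetStatusInfo(requestId);
    } while (status.Status != "Completed" && status.Status != "Failed");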

 

Here ( SqlAzure.rar (22,71 kb) ) you can download a complete library that you can use in your Azure worker roles to automatically back up your SQL Azure databases. Next: how to create a scheduler that uses it.

 



Script for exporting and importing all webs in a site collection

November 15, 2011 20:27 by author Robi

I needed to export and import all webs in a site collection into a new site collection, which was based on a different language template.

In order to automate the process I, of course, used PowerShell. I was trying to figure out what would be the best way to name the exported webs and found this article on the TechNet forums:

http://social.technet.microsoft.com/Forums/en-US/sharepoint2010setup/thread/65e5e221-4d1f-428c-8803-f46f98171651/

 

This led me to the following export script:

 

#Get all webs in the site collection
$webs=Get-SPSite http://razvoj/sites/ang | Get-SPWeb -Limit All

#Export-SPWeb (encode the URL into the file name: "-" becomes "--", "/" becomes "-")
foreach($web in $webs)
{
    $filename=$web.Url.Replace("http://","").Replace("-","--").Replace("/","-")+".cmp"
    Export-SPWeb $web -Path c:\export\$filename -IncludeVersions all -IncludeUserSecurity -Verbose
}

 

When you create a new site collection (from a team site template, for example), it already contains some lists and libraries, which I found is best to delete to avoid errors when importing webs.

#Delete lists and libraries on the new site collection to avoid errors when importing
#(fill $seznami with the target site's lists first; the URL matches the import mapping below)
$seznami=(Get-SPWeb http://razvoj/sites/slo).Lists
while($seznami.Count -gt 0){$seznami.Delete($seznami[0].ID)}

The next step is to get all files in the c:\Export folder with the ».cmp« extension. For this I wrote this script:

#Get all files in folder with .cmp extension
$list=Get-ChildItem c:\export | where {$_.extension -eq ".cmp"}

And the final step is to import all webs into the new site collection:

#Import all files, using the file name as a reference to where the web should be created
foreach($listitem in $list)
{
    $importweb=$listitem.Name.Replace("razvoj-","http://razvoj/").Replace("-","/").Replace("ang","slo").Replace(".cmp","")
    Import-SPWeb $importweb -UpdateVersions overwrite -Path c:\export\$listitem -Force -IncludeUserSecurity -Verbose
}

Hope this helps.

Robi Vončina



Windows Azure toolkit for Windows 8

September 16, 2011 03:20 by author Rok Bermež

Maximize the potential of your Windows Metro style app with Windows Azure. Building a cloud service to support rich Windows Metro style apps is even easier with the Windows Azure Toolkit for Windows 8.

This toolkit is designed to accelerate development so that developers can start enabling Windows 8 features, such as notifications, in their apps with minimal time and experience. Use this toolkit to start building and customizing your own service to deliver rich Metro style apps.

Learn more: http://blogs.msdn.com/b/hkamel/archive/2012/03/11/windows-azure-toolkit-for-windows-8.aspx



Multiple Websites on one WebRole

August 29, 2011 02:54 by author Rok Bermež

With Windows Azure SDK 1.3, you can run multiple web sites in a single web role. Prior to SDK 1.3, each web role ran a single web application. This constraint existed largely because web roles were hosted in IIS Hosted Web Core, where a single application was bound to a single HTTP/HTTPS endpoint. Now, Windows Azure supports full IIS capabilities, allowing web roles to host multiple web sites and web applications.

By using the Sites element within the service definition file, ServiceDefinition.csdef, you can configure your web role to support multiple web sites and web applications. This is accomplished using the sites, applications, and virtual directories features of Internet Information Services (IIS) 7.0.

Take a look at the sample ServiceDefinition.csdef:
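
A minimal sketch (the role, site and endpoint names, the physical directory and the host header below are placeholders):

    <ServiceDefinition name="MultiWebSample" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
      <WebRole name="WebRole1">
        <Sites>
          <!-- The web application project of the role itself -->
          <Site name="Web">
            <Bindings>
              <Binding name="HttpIn" endpointName="HttpIn" />
            </Bindings>
          </Site>
          <!-- A co-hosted site, packaged from a sibling folder and routed by host header -->
          <Site name="SecondSite" physicalDirectory="..\SecondWeb">
            <Bindings>
              <Binding name="HttpIn" endpointName="HttpIn" hostHeader="www.contoso.com" />
            </Bindings>
          </Site>
        </Sites>
        <Endpoints>
          <InputEndpoint name="HttpIn" protocol="http" port="80" />
        </Endpoints>
      </WebRole>
    </ServiceDefinition>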

As you can see, all you have to do is set the path to the other web applications you want to co-host in your role and set their host headers and endpoints. CSPack will do the rest :)



SharePoint Server 2010 Service Pack 1 PowerShell Changes

June 30, 2011 20:24 by author Robi

Here are some PowerShell Changes that came with SP1 for SharePoint Server 2010.

The new cmdlets are listed below (a short example with the deleted-site cmdlets follows the list):

  • Add-SPProfileLeader
  • Get-SPProfileLeader
  • Remove-SPProfileLeader
  • Remove-SPProfileSyncConnection
  • Add-SPProfileSyncConnection
  • Disable-SPHealthAnalysisRule
  • Enable-SPHealthAnalysisRule
  • Get-SPHealthAnalysisRule
  • Get-SPDeletedSite
  • Remove-SPDeletedSite
  • Restore-SPDeletedSite
  • Move-SPSocialComments
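
For example, the new SPDeletedSite cmdlets expose the site-collection recycle bin; a quick sketch (the site path is a placeholder):

    # List site collections that have been deleted but not yet purged
    Get-SPDeletedSite

    # Restore one by its server-relative path (or pipe from Get-SPDeletedSite)
    Restore-SPDeletedSite -Identity "/sites/marketing"

    # Or purge it for good
    Remove-SPDeletedSite -Identity "/sites/marketing"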

There are of course also changes to existing cmdlets; here are a few that will be useful when configuring SharePoint Server 2010:

Used to move an RBS-enabled site collection from one RBS-enabled content database to another RBS-enabled content database without moving the underlying BLOB content. If the content database has more than one RBS provider associated with it, you must specify all providers. The same providers must be enabled on the target content database and the source content database.

    • New Parameter: AnalyticResultCacheMinimumHitCount <Int32>
    • New Parameters: DatabaseServer <string>, DatabaseName <string>, DatabaseFailoverServer <string>, DatabaseSQLAuthenticationCredential <PSCredential>

      This was the only service application that didn't allow us to set the database information when we created it, so we were left with a nasty GUID in the database name.

    • New Parameter: AnalyticResultCacheMinimumHitCount <Int32>
    • New Parameters: DatabaseServer <string>, DatabaseName <string>, DatabaseFailoverServer <string>, DatabaseSQLAuthenticationCredential <PSCredential>, DatabaseUseWindowsAuthentication
    • New Switch Parameter: Recycle


SharePoint 2010 Updates

June 30, 2011 19:22 by author Robi

To keep your SharePoint environment up to date, it is recommended to install Cumulative Updates.

You can find all CUs, as well as Service Packs and info on how updates should be applied, on this page:

http://technet.microsoft.com/en-us/sharepoint/ff800847

To get a better understanding of applying patches, it is recommended to read this TechNet article:

Software updates overview (SharePoint Server 2010)

An important piece of information when updating or administering your server farm is knowing which patch level it is on.

To get the build number, simply open Central Administration and click »Manage Servers in this Farm«:
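
If you prefer PowerShell, the same build number is available from the farm object:

    # Farm build number, the same value Central Administration shows
    (Get-SPFarm).BuildVersion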

You can compare the farm build number to the table below, which I borrowed from Todd Klindt.

Build            Release           Component
14.0.4763.1000   RTM               All components
14.0.4762.1000   RTM               Farm Build Version
14.0.5114.5003   June 2010 CU      SharePoint Foundation 2010
14.0.5114.5003   June 2010 CU      Microsoft Shared Components
14.0.5114.5003   June 2010 CU      Microsoft SharePoint Portal
14.0.5114.5003   June 2010 CU      Microsoft User Profiles
14.0.5114.5003   June 2010 CU      Microsoft Search Server 2010 Core
14.0.5114.5003   June 2010 CU      Microsoft Web Analytics Web Front End Components
14.0.5123.5000   August 2010 CU    SharePoint Foundation 2010
14.0.5123.5000   August 2010 CU    SharePoint Server 2010
14.0.5128.5000   October 2010 CU   SharePoint Server 2010
14.0.5130.5002   December 2010 CU  SharePoint Foundation 2010
14.0.5130.5002   December 2010 CU  SharePoint Server 2010
14.0.5136.5002   February 2011 CU  SharePoint Foundation 2010
14.0.5136.5002   February 2011 CU  SharePoint Server 2010
14.0.5138.5000   April 2011 CU     SharePoint Foundation 2010
14.0.5138.5000   April 2011 CU     SharePoint Server 2010
14.0.5138.5000   April 2011 CU     Project Server 2010
14.0.6029.1000   Service Pack 1    SharePoint Server 2010
14.0.6029.1000   Service Pack 1    Office Web Apps
14.0.6029.1000   Service Pack 1    Project Server 2010
14.0.6029.1000   Service Pack 1    SharePoint Foundation 2010
14.0.6105.5000   June 2011 CU      SharePoint Server 2010
14.0.6105.5000   June 2011 CU      SharePoint Foundation 2010
14.0.6105.5000   June 2011 CU      Project Server 2010

Hope it helps.



About the author

Rok Bermež is a Slovenian Windows Azure MVP. He works as a software engineer and Microsoft Certified Trainer at Kompas Xnet. His primary interests include cloud computing and web development. With extensive experience in development and architecture, he has participated in many projects, and since the CTP release of Windows Azure many of those projects have been based on the Windows Azure platform. Rok has been delivering courses, writing code and speaking at conferences and community events on Microsoft technologies for the last couple of years. You can also find him on Twitter (@Rok_B).
