X Blogs

who you gonna call?

Automatic SQL Azure backup part 1

November 18, 2011 23:24 by author Rok Bermež
Backup for SQL Azure has been one of the most voted-on features since the beginning. Sure, we had the SQL Migration Wizard, BCP, SSIS, PowerShell cmdlets from Cerebrata, and later a very nice tool from RedGate (that I still use a lot) - SQL Azure Backup. All of them share one flaw: they are either impossible or very hard to use for automatic backups that require no on-premises infrastructure.
For a while now, Import and Export CTP functionality has been available through the Windows Azure management portal; it exports or imports a SQL DACPac package. This is exactly the functionality I want to integrate into my cloud applications that use SQL Azure.
The MSDN documentation for that REST API seems to be completely lacking, but fortunately there is a SQL DAC Examples project on CodePlex that we can use to see how it's done.
First, we add a service reference to http://dacdc.cloudapp.net/DACWebService.svc/mex and generate the required proxy classes.

Now we can make web requests for specific actions (import, export, status) to URLs that are specific to each Windows Azure datacenter. Here is the mapping:

Region              URL
North Central US    https://ch1prod-dacsvc.azure.com/DACWebService.svc
South Central US    https://sn1prod-dacsvc.azure.com/DACWebService.svc
North Europe        https://db3prod-dacsvc.azure.com/DACWebService.svc
West Europe         https://am1prod-dacsvc.azure.com/DACWebService.svc
East Asia           https://hkgprod-dacsvc.azure.com/DACWebService.svc
Southeast Asia      https://sg1prod-dacsvc.azure.com/DACWebService.svc
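
If you prefer to keep this mapping in code (for example to pick the datacenter URL used by the methods below), a minimal sketch could look like this - the class is mine, only the URLs come from the table above:

using System.Collections.Generic;

public static class DacServiceEndpoints
{
    // Maps Windows Azure regions to their DAC web service endpoints (values from the table above).
    public static readonly IDictionary<string, string> ByRegion = new Dictionary<string, string>
    {
        { "North Central US", "https://ch1prod-dacsvc.azure.com/DACWebService.svc" },
        { "South Central US", "https://sn1prod-dacsvc.azure.com/DACWebService.svc" },
        { "North Europe",     "https://db3prod-dacsvc.azure.com/DACWebService.svc" },
        { "West Europe",      "https://am1prod-dacsvc.azure.com/DACWebService.svc" },
        { "East Asia",        "https://hkgprod-dacsvc.azure.com/DACWebService.svc" },
        { "Southeast Asia",   "https://sg1prod-dacsvc.azure.com/DACWebService.svc" }
    };
}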

 

Export

To create a backup to storage, we need to create an object of type ExportInput and POST it to the datacenter URL + "/Export".

 

public Guid ExportToStorage(string storageContainer, string fileName, BlobContainerPublicAccessType blobContainerPublicAccessType = BlobContainerPublicAccessType.Off)
        {
            var blobCredentials = GetBlobCredentials(storageContainer, fileName, true, blobContainerPublicAccessType);
            var exportRequest = new ExportInput
            {
                BlobCredentials = blobCredentials,
                ConnectionInfo = _connectionInfo
            };
            var req = GetRequest(new Uri(_dataCenterUrl + "/Export"), RequestMethod.POST);
            var serializer = new DataContractSerializer(typeof(ExportInput));
            var requestStream = req.GetRequestStream();
            serializer.WriteObject(requestStream, exportRequest);
            requestStream.Close();
            var resp = req.GetResponse();
            return GetGuidFromResponse(resp);
        }
        private BlobStorageAccessKeyCredentials GetBlobCredentials(string storageContainer, string fileName, bool createIfNotExist = false, BlobContainerPublicAccessType blobContainerPublicAccessType = BlobContainerPublicAccessType.Off)
        {
            var storageCredentials = new StorageCredentialsAccountAndKey(_storageConnectionInfo.AccountName, _storageConnectionInfo.AccountKey);
            var storageAccount = new CloudStorageAccount(storageCredentials, _storageConnectionInfo.UseHttps);
            var cloudBlobClient = storageAccount.CreateCloudBlobClient();
            var cloudBlobContainer = cloudBlobClient.GetContainerReference(storageContainer);
            if (createIfNotExist)
            {
                if (cloudBlobContainer.CreateIfNotExist())
                {
                    var permissions = cloudBlobContainer.GetPermissions();
                    permissions.PublicAccess = blobContainerPublicAccessType;
                    cloudBlobContainer.SetPermissions(permissions);
                }
            }
            var cloudBlob = cloudBlobContainer.GetBlobReference(fileName);
            var backupBlobUri = cloudBlob.Uri.ToString();
            var blobCredentials = new BlobStorageAccessKeyCredentials
            {
                StorageAccessKey = _storageConnectionInfo.AccountKey,
                Uri = backupBlobUri,
            };
            return blobCredentials;
        }
        private HttpWebRequest GetRequest(Uri uri, RequestMethod requestMethod)
        {
            var req = (HttpWebRequest)WebRequest.Create(uri);
            req.Method = requestMethod.ToString().ToUpper();
            req.ContentType = "application/xml";
            return req;
        }
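
The snippets above also rely on a GetGuidFromResponse helper that is not shown in the post. Here is a minimal sketch of it, assuming the service answers with a single DataContract-serialized Guid in the response body (check this against the SQL DAC Examples code):

        private Guid GetGuidFromResponse(WebResponse response)
        {
            // Assumption: the body contains one Guid serialized with the DataContractSerializer.
            using (var stream = response.GetResponseStream())
            {
                var serializer = new DataContractSerializer(typeof(Guid));
                return (Guid)serializer.ReadObject(stream);
            }
        }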

Import

For import the process is very similar: we create an object of type ImportInput and POST it to the datacenter URL + "/Import".

 

 public Guid ImportFromStorage(string storageContainer, string fileName, SqlAzureEdition sqlAzureEdition = SqlAzureEdition.Web, SqlAzureSize sqlAzureSize=SqlAzureSize.GB_1, string newDbName=null)
        {
            var blobCredentials = GetBlobCredentials(storageContainer,fileName);
            ImportInput importRequest = new ImportInput();
            BlobCredentials creds = blobCredentials;
            importRequest.BlobCredentials = creds;
            importRequest.AzureEdition = sqlAzureEdition.ToString().ToLower();
            importRequest.DatabaseSizeInGB = (int)sqlAzureSize;
            importRequest.ConnectionInfo = (String.IsNullOrEmpty(newDbName)) ? _connectionInfo : new ConnectionInfo() { DatabaseName = newDbName, ServerName = _connectionInfo.ServerName, UserName = _connectionInfo.UserName, Password = _connectionInfo.Password};
            var req = GetRequest(new Uri(_dataCenterUrl + "/Import"), RequestMethod.POST);
            var serializer = new DataContractSerializer(typeof(ImportInput));
            var requestStream = req.GetRequestStream();
            serializer.WriteObject(requestStream, importRequest);
            requestStream.Close();
            var resp = req.GetResponse();
            return GetGuidFromResponse(resp);
        }

Status

Both actions return a GUID representing the current operation, which we can use to check whether it completed successfully. We do this by making a GET request to the datacenter URL + "/Status?servername={0}&username={1}&password={2}&reqId={3}". If we want the history of operations and their statuses for this datacenter, we can make the same request without reqId.

 

        public StatusInfo GetStatusInfo(Guid requestId)
        {
            string uriBuilder = _dataCenterUrl + string.Format("/Status?servername={0}&username={1}&password={2}&reqId={3}", _connectionInfo.ServerName, _connectionInfo.UserName, _connectionInfo.Password, requestId.ToString());
            var req = GetRequest(new Uri(uriBuilder), RequestMethod.GET);
            var response = req.GetResponse();
            var statusInfos = GetStatusInfoFromResponse(response);
            return statusInfos[0];
        }
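
Putting it together, an export followed by polling for its status might look roughly like this. This is only a sketch: the wrapper class name is hypothetical, and the Status and ErrorMessage properties and their values ("Running", "Completed", "Failed") are assumptions you should verify against the generated proxy classes:

        // Hypothetical usage sketch - "DacServiceClient" stands in for whatever class wraps the methods above.
        var client = new DacServiceClient(/* SQL Azure connection info, storage info, datacenter URL */);

        // Kick off the export and remember the request id returned by the service.
        Guid requestId = client.ExportToStorage("backups", "mydb-" + DateTime.UtcNow.ToString("yyyyMMddHHmm") + ".bacpac");

        // Poll the status endpoint until the operation finishes (property names and values are assumptions).
        StatusInfo status;
        do
        {
            System.Threading.Thread.Sleep(TimeSpan.FromSeconds(30));
            status = client.GetStatusInfo(requestId);
        } while (status.Status != "Completed" && status.Status != "Failed");

        if (status.Status == "Failed")
            throw new InvalidOperationException("Export failed: " + status.ErrorMessage);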

 

Here ( SqlAzure.rar (22,71 kb) ) you can download a complete library that you can use in your Azure worker tasks to automatically back up your SQL Azure databases. Next: how to create a scheduler that uses it.

 



Script for exporting and importing all webs in a site collection

November 15, 2011 20:27 by author Robi

I needed to export and import all webs in a site collection into a new site collection, which was based on a different language template.

To automate the process I, of course, used PowerShell. While trying to figure out the best way to name the exported webs, I found this article on the TechNet Forums:

http://social.technet.microsoft.com/Forums/en-US/sharepoint2010setup/thread/65e5e221-4d1f-428c-8803-f46f98171651/

 

This led me to the following script for exporting the webs:

 

#Get all webs in site collection

$webs=Get-SPSite http://razvoj/sites/ang |Get-SPWeb -limit all

#Export-Spweb
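#Each web URL is encoded into the file name, e.g. http://razvoj/sites/ang/subweb -> razvoj-sites-ang-subweb.cmp
#(existing hyphens are doubled so the name can be mapped back to a URL on import)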

foreach($web in $webs){$filename=$web.url.replace("http://","").replace("-","--").replace("/","-")+".cmp"; Export-SPWeb $web -Path c:\export\$filename -IncludeVersions all -IncludeUserSecurity -Verbose}

 

When you create the new site collection, for example from a team site template, some lists and libraries already exist; I found it is best to delete them to avoid errors when importing the webs.

#Delete lists and libraries on the new site collection to avoid errors when importing
#$seznami must hold the lists of the target root web, e.g. $seznami=(Get-SPWeb http://razvoj/sites/slo).Lists (adjust the URL to your new site collection)
#Iterate backwards so deleting items does not shift the remaining indexes
for($i=$seznami.Count-1;$i -ge 0;$i--){$seznami.Delete($seznami[$i].ID)}

The next step is to get all files in the c:\export folder that have the ".cmp" extension. For this I wrote a script:

#Get all files in folder with .cmp extension
$list=Get-ChildItem c:\export | where {$_.extension -eq ".cmp"}

And the final step is to import all webs to a new site collection:

#Import all files and use file name as reference to where web should be created
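#The replaces reverse the export naming, e.g. razvoj-sites-ang-subweb.cmp -> http://razvoj/sites/slo/subweb
#(note: webs whose URLs contain hyphens were exported with doubled hyphens, which this simple reverse mapping does not undo)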
foreach($listitem in $list){$importweb=$listitem.name.replace("razvoj-","http://razvoj/").replace("-","/").replace("ang","slo").replace(".cmp",""); import-spweb $importweb -UpdateVersions overwrite -Path c:\export\$listitem -Force -IncludeUserSecurity -Verbose}

Hope this helps.

Robi Vončina



Windows Azure toolkit for Windows 8

September 16, 2011 03:20 by author Rok Bermež

Maximize the potential of your Windows Metro style app with Windows Azure. Building a cloud service to support rich Windows Metro style apps is even easier with the Windows Azure Toolkit for Windows 8.

This tool is designed to accelerate development so that developers can start enabling Windows 8 features, such as notifications, for their app with minimal time and experience. Use this toolkit to start building and customizing your own service to deliver rich Metro style apps.    

Learn more: http://blogs.msdn.com/b/hkamel/archive/2012/03/11/windows-azure-toolkit-for-windows-8.aspx



Multiple Websites on one WebRole

August 29, 2011 02:54 by author Rok Bermež

With Windows Azure SDK 1.3, you can run multiple Web Sites in a single Web role. Prior to Windows Azure SDK 1.3, each Web role ran a single Web application. This constraint was largely because Web roles were hosted in IIS Hosted Web Core where a single application was bound to a single HTTP/HTTPS endpoint. Now, Windows Azure supports full IIS capabilities allowing Web roles to support multiple Web sites and Web applications.

By using the Sites element within the service definition file, ServiceDefinition.csdef, you can configure your web role to support multiple web sites and web applications. This is accomplished using the sites, applications, and virtual directories features in Internet Information Services (IIS) 7.0. These features enable each web role in Windows Azure to support configuration for multiple web sites and applications.

Take a look at the sample ServiceDefinition.csdef:
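
A minimal sketch of such a ServiceDefinition.csdef follows; the role, site, host and path names here are made up, only the element structure follows the SDK 1.3+ schema:

<ServiceDefinition name="MultiWebSample" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="WebRole1">
    <Sites>
      <!-- The web project that belongs to the role itself -->
      <Site name="Web">
        <Bindings>
          <Binding name="HttpIn" endpointName="HttpIn" hostHeader="www.site1.com" />
        </Bindings>
      </Site>
      <!-- A second web site, co-hosted from a physical path on the build machine -->
      <Site name="Site2" physicalDirectory="..\..\OtherWebSite">
        <Bindings>
          <Binding name="HttpIn" endpointName="HttpIn" hostHeader="www.site2.com" />
        </Bindings>
      </Site>
    </Sites>
    <Endpoints>
      <InputEndpoint name="HttpIn" protocol="http" port="80" />
    </Endpoints>
  </WebRole>
</ServiceDefinition>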

As you can see, all you have to do is set the paths to the other web applications that you want to co-host in your role and set their host headers and endpoints. CSPack will do the rest :)



SharePoint Server 2010 Service Pack 1 PowerShell Changes

June 30, 2011 20:24 by author Robi

Here are some PowerShell Changes that came with SP1 for SharePoint Server 2010.

New cmdlets are:

  • Add-SPProfileLeader
  • Get-SPProfileLeader
  • Remove-SPProfileLeader
  • Remove-SPProfileSyncConnection
  • Add-SPProfileSyncConnection
  • Disable-SPHealthAnalysisRule
  • Enable-SPHealthAnalysisRule
  • Get-SPHealthAnalysisRule
  • Get-SPDeletedSite
  • Remove-SPDeletedSite
  • Restore-SPDeletedSite
  • Move-SPSocialComments

There are of course some changes to the existing cmdlets and here are a few that will be useful when configuring SharePoint Server 2010:

Used to move an RBS-enabled site collection from one RBS-enabled content database to another RBS-enabled content database without moving the underlying BLOB content. If the content database has more than one RBS provider associated with it, you must specify all providers. The same providers must be enabled on the target content database and the source content database.

    • New Parameter:AnalyticResultCacheMinimumHitCount <Int32>
    • New Parameters:DatabaseServer <string>,DatabaseName <string>,DatabaseFailoverServer <string>,DatabaseSQLAuthenticationCredential <PSCredential>

      This was the only Service Application that didn't allow us to set the database information when we created it so we were left with this nasty GUID in the name.

    • New Parameter:AnalyticResultCacheMinimumHitCount <Int32>
    • New Parameters:DatabaseServer <string>,DatabaseName <string>,DatabaseFailoverServer <string>,DatabaseSQLAuthenticationCredential <PSCredential>,DatabaseUseWindowsAuthentication
    • New Switch Parameter:Recycle


SharePoint 2010 Updates

June 30, 2011 19:22 by author Robi

To update your SharePoint environment it is recommended to install Cumulative Updates.

You can find all CUs as well as Service Packs and info on how updates should be applied on this page:

http://technet.microsoft.com/en-us/sharepoint/ff800847

To get better knowledge on applying patches it is recommended to read this TechNet article:

Software updates overview (SharePoint Server 2010)

One important piece of information when updating or administering your server farm is knowing which patch level it is on.

To get the build number, simply open Central Administration and click "Manage servers in this farm":

You can compare the farm build number with the table below, which I borrowed from Todd Klindt.

Build            Release            Component
14.0.4763.1000   RTM                All components
14.0.4762.1000   RTM                Farm Build Version
14.0.5114.5003   June 2010 CU       SharePoint Foundation 2010
14.0.5114.5003   June 2010 CU       Microsoft Shared Components
14.0.5114.5003   June 2010 CU       Microsoft SharePoint Portal
14.0.5114.5003   June 2010 CU       Microsoft User Profiles
14.0.5114.5003   June 2010 CU       Microsoft Search Server 2010 Core
14.0.5114.5003   June 2010 CU       Microsoft Web Analytics Web Front End Components
14.0.5123.5000   August 2010 CU     SharePoint Foundation 2010
14.0.5123.5000   August 2010 CU     SharePoint Server 2010
14.0.5128.5000   October 2010 CU    SharePoint Server 2010
14.0.5130.5002   December 2010 CU   SharePoint Foundation 2010
14.0.5130.5002   December 2010 CU   SharePoint Server 2010
14.0.5136.5002   February 2011 CU   SharePoint Foundation 2010
14.0.5136.5002   February 2011 CU   SharePoint Server 2010
14.0.5138.5000   April 2011 CU      SharePoint Foundation 2010
14.0.5138.5000   April 2011 CU      SharePoint Server 2010
14.0.5138.5000   April 2011 CU      Project Server 2010
14.0.6029.1000   Service Pack 1     SharePoint Server 2010
14.0.6029.1000   Service Pack 1     Office Web Apps
14.0.6029.1000   Service Pack 1     Project Server 2010
14.0.6029.1000   Service Pack 1     SharePoint Foundation 2010
14.0.6105.5000   June 2011 CU       SharePoint Server 2010
14.0.6105.5000   June 2011 CU       SharePoint Foundation 2010
14.0.6105.5000   June 2011 CU       Project Server 2010

Hope it helps.



Crawl BCS Data

June 29, 2011 21:26 by author Robi

Hi,

One of our clients wants to crawl data that they store in a SQL database, so I created a new External Content Type named TestSql to test the solution.

To crawl a BCS content source, you need to grant the default content access account (or whichever account you want to crawl your content source) Execute permissions on the External Content Type. Click the drop-down, select Set Permissions, add your search account and give it Execute permissions.

 

Next, you need to define a new content source in the Search Service Application administration. To do that, click the Content Sources category and then New Content Source.

Give your content source a meaningful name, select Line of Business Data and select the BCS Service Application where your External Content Type is saved. Create crawl schedules and perform a full crawl of your content source.

 

When the full crawl is finished you can search for your line-of-business data. Go to your portal site and search for it. Notice that the link to that data is somewhat strange.

 

To correct this "issue" you need to create profile pages for the BCS data. Go back to your BCS Service Application and, on the ribbon in the Profile Pages section, click Configure. You need to specify a SharePoint site to host the profile pages.

 

When the profile page host is created, you can create or upgrade the profile page. Select the External Content Type and click the Create/Upgrade button.

When the operation is finished, perform a full crawl of this content source and you should now get results similar to this:

 

When you click the link, you get the BCS data directly from your SQL Server:

 

Hope it helps!

Enjoy!



SharePoint jokes

June 16, 2011 23:08 by author Robi

Enjoy laughing at SharePoint people:

http://www.paulswider.com/the-fantastic-40-sharepoint-jokes.html/



NTK 2011 - New in Microsoft Virtualization

May 27, 2011 01:03 by author joze

What's New in Microsoft Virtualization (Novosti v Microsoftovi Virtualizaciji)

Virtualization is becoming part of most business environments - in some places only on the clients, in most also already on the servers.
During the session we will look at what Microsoft has brought to market over the last year, what it has changed and what it has updated. We will cover both Windows Virtual PC and the Hyper-V server, along with a trick or two for managing each of them.

---

Last week I presented at the Slovenian Microsoft conference – NTK, or NT konferenca.

Attached to this post you can find my presentation. You can also find it on the original NTK page.

Novosti v Microsoftovi Virtualizaciji



Azure AppFabric Cache HowTo

May 3, 2011 04:24 by author Rok Bermež
Well, since we now have Azure AppFabric Cache available, let's get a head start on how to use it in your cloud ASP.NET (MVC) application. First, you need to have the AppFabric 1.0 April Refresh SDK installed on your machine, so grab it here.
Next, go to the Windows Azure management portal. Log in, go to AppFabric/Cache, create a new service namespace and wait for the service to be activated.

Then click the 'View Client Configuration' button.

You will get nicely pre-prepared configuration settings (with all that pesky security information included) for your app.
Now we have all the pieces of the puzzle ready; all we have to do is add references to the caching DLLs (located at C:\Program Files\Windows Azure AppFabric SDK\V1.0\Assemblies\NET4.0\Cache) to our application and update web.config with the settings from the previous step.

You always need to put the cache section in the <configSections> node:
<section name="dataCacheClients"
         type="Microsoft.ApplicationServer.Caching.DataCacheClientsSection, Microsoft.ApplicationServer.Caching.Core"
         allowLocation="true"
         allowDefinition="Everywhere" />
and configure your Cache EndPoints:
<dataCacheClients>
  <dataCacheClient name="default">
    <hosts>
      <host name="ntkonferenca.cache.windows.net" cachePort="22233" />
    </hosts>
    <securityProperties mode="Message">
      <messageSecurity authorizationInfo="1AmN0tT371NgUtH@T" />
    </securityProperties>
  </dataCacheClient>
  <dataCacheClient name="SslEndpoint">
    <hosts>
      <host name="ntkonferenca.cache.windows.net" cachePort="22243" />
    </hosts>
    <securityProperties mode="Message" sslEnabled="true">
      <messageSecurity authorizationInfo="1AmN0tT371NgUtH@T" />
    </securityProperties>
  </dataCacheClient>
</dataCacheClients>
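
With the client section in place, application code can also talk to the cache directly through the AppFabric client API. A minimal sketch, assuming the helper class below (mine, not from the SDK) and the parameterless DataCacheFactory picking up the dataCacheClients section from web.config:

using Microsoft.ApplicationServer.Caching;

public static class CacheHelper
{
    // The parameterless DataCacheFactory reads the dataCacheClients configuration section.
    private static readonly DataCacheFactory Factory = new DataCacheFactory();
    private static readonly DataCache Cache = Factory.GetDefaultCache();

    public static void Put(string key, object value)
    {
        Cache.Put(key, value); // cached values must be serializable
    }

    public static object Get(string key)
    {
        return Cache.Get(key);
    }
}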
If your application uses session state and you want it to use Azure AppFabric Cache (which you do), change your <sessionState> node to:
<sessionState mode="Custom" customProvider="AppFabricCacheSessionStoreProvider">
  <providers>
    <add name="AppFabricCacheSessionStoreProvider"
         type="Microsoft.Web.DistributedCache.DistributedCacheSessionStateStoreProvider, Microsoft.Web.DistributedCache"
         cacheName="default"
         useBlobMode="true"
         dataCacheClientName="default" />
  </providers>
</sessionState>
and the same for output caching by changing the <caching> node:
<caching>
  <outputCache enableFragmentCache="true" defaultProvider="DistributedCache">
    <providers>
      <add name="DistributedCache"
           type="Microsoft.Web.DistributedCache.DistributedCacheOutputCacheProvider, Microsoft.Web.DistributedCache"
           cacheName="default"
           dataCacheClientName="default" />
    </providers>
  </outputCache>
  <outputCacheSettings>
    <outputCacheProfiles>
      <add name="default" duration="43000" />
    </outputCacheProfiles>
  </outputCacheSettings>
</caching>
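
If your MVC app already applies output caching through a profile, nothing else changes. Purely as an illustration (the controller and action below are made up, and CacheProfile applies to regular actions, not child actions), the "default" profile above would be consumed like this:

using System;
using System.Web.Mvc;

public class HomeController : Controller
{
    // Output of this action is cached via the "default" profile (duration 43000 seconds);
    // with the provider registered above, the cached entries end up in Azure AppFabric Cache.
    [OutputCache(CacheProfile = "default")]
    public ActionResult Index()
    {
        ViewBag.RenderedAt = DateTime.UtcNow; // handy for checking that responses come from cache
        return View();
    }
}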
That's it - nothing needs to be changed in your code; your app is configured and ready to be published to the cloud. AppFabric Cache is not a free service, even though you won't be charged anything for its use prior to August 1, 2011. After that the prices are:
  • 128 MB cache for $45.00/month
  • 256 MB cache for $55.00/month
  • 512 MB cache for $75.00/month
  • 1 GB cache for $110.00/month
  • 2 GB cache for $180.00/month
  • 4 GB cache for $325.00/month
In my next post, I'll tackle using AppFabric Cache without the prepared providers, through simple Get and Put calls (let's get rid of HttpRuntime.Cache as well).

 



About the author

Rok Bermež is a Slovenian Windows Azure MVP. He works as a Software Engineer and Microsoft Certified Trainer at Kompas Xnet. His primary interests include cloud computing and web development. With extensive experience in development and architecture he has participated in many projects, and since the CTP release of Windows Azure many of those projects have been based on the Windows Azure platform. Rok has been delivering courses, writing code and speaking at conferences and community events on Microsoft technologies for the last couple of years. You can also find him on Twitter (@Rok_B).
