X Blogs

who you gonna call?

Microsoft Cloud Training

February 17, 2012 23:01 by joze

Dear Microsoft student,

Expand your existing skills and acquire new ones on Microsoft's cloud technologies, including Microsoft Office 365, Microsoft Exchange Online, Windows Azure, Windows Intune, Microsoft Hyper-V Server, Microsoft SharePoint Online, Microsoft Dynamics CRM Online, Microsoft System Center 2012 and SQL Azure.

Microsoft has made available over 30 learning resources to help you explore these technologies, including eBooks, e-learning clinics, short videos (a.k.a. learning snacks), and classroom training courses.

Many of these valuable resources are free. To name a few:

· Understanding Microsoft Virtualization Solutions (eBook)

· Introduction to SQL Server 2012 (eBook)

· Microsoft® Office 365: Connect and Collaborate Anywhere, Anytime (eBook)

· Introducing Hyper-V in Windows Server 2008 R2 (learning snack)

· SQL Server 2012: Cloud on Your Terms (learning snack)

· Introduction to Microsoft Windows Azure Platform (learning snack)

Microsoft’s cloud-based technologies are relevant to specific job roles. Start here: Microsoft Cloud Training

To understand more, check out the Microsoft Learning Cloud Brochure.

Thank you, and good luck!



Automatic SQL Azure backup part 1

November 18, 2011 23:24 by Rok Bermež
Backup for SQL Azure has been one of the most voted-on features from the beginning. Sure, we had the SQL Migration Wizard, BCP, SSIS, PowerShell cmdlets from Cerebrata, and later a very nice tool from RedGate (that I still use a lot), SQL Azure Backup. All of them share one flaw: they are either impossible or very hard to use for automatic backups that require no on-premises infrastructure.
For a while now, Import and Export CTP functionality has been available through the Windows Azure management portal; it exports or imports a SQL DACPAC package. This is exactly the functionality I want to integrate into my cloud applications that use SQL Azure.
MSDN documentation for that REST API seems to be completely lacking, but fortunately the SQL DAC Examples project on CodePlex shows how it's done.
First, we add a service reference to http://dacdc.cloudapp.net/DACWebService.svc/mex and generate the required proxy classes.

Now we can make web requests for the specific actions (import, export, status) against URLs that are specific to each Windows Azure datacenter. Here is the mapping:

Region             URL
North Central US   https://ch1prod-dacsvc.azure.com/DACWebService.svc
South Central US   https://sn1prod-dacsvc.azure.com/DACWebService.svc
North Europe       https://db3prod-dacsvc.azure.com/DACWebService.svc
West Europe        https://am1prod-dacsvc.azure.com/DACWebService.svc
East Asia          https://hkgprod-dacsvc.azure.com/DACWebService.svc
Southeast Asia     https://sg1prod-dacsvc.azure.com/DACWebService.svc
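
If you want to pick the endpoint at runtime, this mapping is easy to keep in a small lookup. A minimal sketch follows; the class and method names here are my own, not part of the DAC API:

using System.Collections.Generic;

public static class DacEndpoints
{
    // Hypothetical helper: maps each region's display name to its DAC service URL.
    private static readonly Dictionary<string, string> Urls =
        new Dictionary<string, string>
        {
            { "North Central US", "https://ch1prod-dacsvc.azure.com/DACWebService.svc" },
            { "South Central US", "https://sn1prod-dacsvc.azure.com/DACWebService.svc" },
            { "North Europe", "https://db3prod-dacsvc.azure.com/DACWebService.svc" },
            { "West Europe", "https://am1prod-dacsvc.azure.com/DACWebService.svc" },
            { "East Asia", "https://hkgprod-dacsvc.azure.com/DACWebService.svc" },
            { "Southeast Asia", "https://sg1prod-dacsvc.azure.com/DACWebService.svc" }
        };

    public static string ForRegion(string region)
    {
        return Urls[region];
    }
}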

 

Export

To create a backup in blob storage, we create an object of type ExportInput and POST it to the datacenter's URL + "/Export".

 

public Guid ExportToStorage(string storageContainer, string fileName, BlobContainerPublicAccessType blobContainerPublicAccessType = BlobContainerPublicAccessType.Off)
{
    // Build credentials pointing at the target blob, creating the container if needed.
    var blobCredentials = GetBlobCredentials(storageContainer, fileName, true, blobContainerPublicAccessType);
    var exportRequest = new ExportInput
    {
        BlobCredentials = blobCredentials,
        ConnectionInfo = _connectionInfo
    };
    // POST the serialized ExportInput to the datacenter-specific "/Export" endpoint.
    var req = GetRequest(new Uri(_dataCenterUrl + "/Export"), RequestMethod.POST);
    var serializer = new DataContractSerializer(typeof(ExportInput));
    var requestStream = req.GetRequestStream();
    serializer.WriteObject(requestStream, exportRequest);
    requestStream.Close();
    // The response body contains the GUID identifying this export request.
    var resp = req.GetResponse();
    return GetGuidFromResponse(resp);
}

private BlobStorageAccessKeyCredentials GetBlobCredentials(string storageContainer, string fileName, bool createIfNotExist = false, BlobContainerPublicAccessType blobContainerPublicAccessType = BlobContainerPublicAccessType.Off)
{
    var storageCredentials = new StorageCredentialsAccountAndKey(_storageConnectionInfo.AccountName, _storageConnectionInfo.AccountKey);
    var storageAccount = new CloudStorageAccount(storageCredentials, _storageConnectionInfo.UseHttps);
    var cloudBlobClient = storageAccount.CreateCloudBlobClient();
    var cloudBlobContainer = cloudBlobClient.GetContainerReference(storageContainer);
    if (createIfNotExist && cloudBlobContainer.CreateIfNotExist())
    {
        // Apply the requested access level only when the container was just created.
        var permissions = cloudBlobContainer.GetPermissions();
        permissions.PublicAccess = blobContainerPublicAccessType;
        cloudBlobContainer.SetPermissions(permissions);
    }
    var cloudBlob = cloudBlobContainer.GetBlobReference(fileName);
    var backupBlobUri = cloudBlob.Uri.ToString();
    return new BlobStorageAccessKeyCredentials
    {
        StorageAccessKey = _storageConnectionInfo.AccountKey,
        Uri = backupBlobUri
    };
}

private HttpWebRequest GetRequest(Uri uri, RequestMethod requestMethod)
{
    // All DAC service calls send and receive XML.
    var req = (HttpWebRequest)WebRequest.Create(uri);
    req.Method = requestMethod.ToString().ToUpper();
    req.ContentType = "application/xml";
    return req;
}

Import

For import the process is very similar: we create an object of type ImportInput and POST it to the datacenter's URL + "/Import".

 

public Guid ImportFromStorage(string storageContainer, string fileName, SqlAzureEdition sqlAzureEdition = SqlAzureEdition.Web, SqlAzureSize sqlAzureSize = SqlAzureSize.GB_1, string newDbName = null)
{
    var blobCredentials = GetBlobCredentials(storageContainer, fileName);
    var importRequest = new ImportInput
    {
        BlobCredentials = blobCredentials,
        AzureEdition = sqlAzureEdition.ToString().ToLower(),
        DatabaseSizeInGB = (int)sqlAzureSize,
        // Import over the original database unless a new database name was supplied.
        ConnectionInfo = String.IsNullOrEmpty(newDbName)
            ? _connectionInfo
            : new ConnectionInfo
            {
                DatabaseName = newDbName,
                ServerName = _connectionInfo.ServerName,
                UserName = _connectionInfo.UserName,
                Password = _connectionInfo.Password
            }
    };
    // POST the serialized ImportInput to the datacenter-specific "/Import" endpoint.
    var req = GetRequest(new Uri(_dataCenterUrl + "/Import"), RequestMethod.POST);
    var serializer = new DataContractSerializer(typeof(ImportInput));
    var requestStream = req.GetRequestStream();
    serializer.WriteObject(requestStream, importRequest);
    requestStream.Close();
    var resp = req.GetResponse();
    return GetGuidFromResponse(resp);
}
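
For instance, to restore a backup into a fresh database instead of overwriting the original (the backup instance and the container/file names below are placeholders of my own):

// Restore the BACPAC into a new 1 GB Web edition database named "mydb_restored".
var requestId = backup.ImportFromStorage("backups", "mydb-2011-11-18.bacpac",
    SqlAzureEdition.Web, SqlAzureSize.GB_1, "mydb_restored");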

Status

Both actions return a GUID identifying the request, which we can use to check whether the operation succeeded. We do this by making a GET request to the datacenter's URL + "/Status?servername={0}&username={1}&password={2}&reqId={3}". If we want the history of all requests and their statuses for this datacenter, we can make the same request without the reqId parameter.

 

public StatusInfo GetStatusInfo(Guid requestId)
{
    // Note: the credentials travel in the query string, so this must go over HTTPS.
    var uri = _dataCenterUrl + string.Format(
        "/Status?servername={0}&username={1}&password={2}&reqId={3}",
        _connectionInfo.ServerName, _connectionInfo.UserName, _connectionInfo.Password, requestId);
    var req = GetRequest(new Uri(uri), RequestMethod.GET);
    var response = req.GetResponse();
    var statusInfos = GetStatusInfoFromResponse(response);
    return statusInfos[0];
}
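
The history variant mentioned above is not in the sample, but a sketch is straightforward; this assumes GetStatusInfoFromResponse already returns every entry the service sends back:

// Sketch: omitting reqId should return the status of all requests for this
// server in the datacenter (an assumption based on the API description above).
public StatusInfo[] GetStatusHistory()
{
    var uri = _dataCenterUrl + string.Format(
        "/Status?servername={0}&username={1}&password={2}",
        _connectionInfo.ServerName, _connectionInfo.UserName, _connectionInfo.Password);
    var req = GetRequest(new Uri(uri), RequestMethod.GET);
    var response = req.GetResponse();
    return GetStatusInfoFromResponse(response);
}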

 

Here ( SqlAzure.rar (22,71 kb) ) you can download a complete library that you can use in your Azure worker roles to automatically back up your SQL Azure databases. Next: how to create a scheduler that uses it.
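
Until then, here is roughly how the library can be driven from a worker role. This is only a sketch: the backup instance wraps the methods shown above, and the exact status strings ("Completed", "Failed") are an assumption on my part:

// Kick off a nightly export and poll until the service reports a final state.
var requestId = backup.ExportToStorage("backups",
    string.Format("mydb-{0:yyyy-MM-dd}.bacpac", DateTime.UtcNow));

StatusInfo status;
do
{
    // Exports run asynchronously on the service side; poll every 30 seconds.
    System.Threading.Thread.Sleep(TimeSpan.FromSeconds(30));
    status = backup.GetStatusInfo(requestId);
} while (status.Status != "Completed" && status.Status != "Failed");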

 



Truncating parent tables in MS SQL

July 8, 2009 20:49 by Klemen Slavič

If you've ever tried testing SQL stored procedures and functions, you've undoubtedly run into situations where you needed to clear tables of data to load fixtures into related tables. In MySQL, this would be accomplished by using TRUNCATE on the child tables, going up the hierarchy; in MS SQL, you cannot use TRUNCATE on tables that are referenced by foreign keys.

To remedy the situation, consider the scenario where you have two tables – Child and Parent. The Child table contains a foreign key referencing the primary key in Parent. In this case, you can use the following code snippet to truncate both tables:

TRUNCATE TABLE Child;                 -- child first: it holds the foreign key
DELETE FROM Parent;                   -- TRUNCATE is not allowed on FK-referenced tables
DBCC CHECKIDENT(Parent, RESEED, 0);   -- DELETE does not reset the identity seed

This clears the data from both tables and resets Parent's identity seed to 0, so any new rows inserted into the tables will start their autoincrement numeric primary keys at 1.
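
If you run these resets from test code rather than by hand, a minimal C# sketch using System.Data.SqlClient could look like this (the connection string and table names are placeholders):

using System.Data.SqlClient;

public static class TestDatabase
{
    // Clears Child and Parent and resets Parent's identity seed before loading fixtures.
    public static void ResetTables(string connectionString)
    {
        const string sql =
            "TRUNCATE TABLE Child; " +
            "DELETE FROM Parent; " +
            "DBCC CHECKIDENT(Parent, RESEED, 0);";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}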



Grant execute rights on schema to users

July 8, 2009 18:43 by Rok Bermež

The easiest way is to create a new role, grant it EXECUTE, and then just add users to that role:

CREATE ROLE db_executor
GRANT EXECUTE TO db_executor
-- then add each user to the role (the user name below is just an example):
EXEC sp_addrolemember 'db_executor', 'SomeUser'



About the author

Rok Bermež is a Slovenian Windows Azure MVP. He works as a Software Engineer and Microsoft Certified Trainer at Kompas Xnet, and his primary interests include cloud computing and web development. With extensive experience in development and architecture, he has participated in many projects, and since the CTP release of Windows Azure many of those projects have been based on the Windows Azure platform. Rok has been delivering courses, writing code and speaking at conferences and community events on Microsoft technologies for the last couple of years. You can also find him on Twitter (@Rok_B).
