Microsoft Cloud Training

by joze 17. February 2012 14:01
Dear Microsoft student,

Expand your existing skills and acquire new skills on Microsoft's cloud technologies, including Microsoft Office 365, Microsoft Exchange Online, Windows Azure, Windows Intune, Microsoft Hyper-V Server, Microsoft SharePoint Online, Microsoft Dynamics CRM Online, Microsoft System Center 2012, and SQL Azure. Microsoft has made available over 30 learning resources to enable you to explore these technologies, including eBooks, e-learning clinics, short videos (a.k.a. learning snacks), and classroom training courses. Many of these valuable resources are free. To name a few:

· Understanding Microsoft Virtualization Solutions (eBook)
· Introduction to SQL Server 2012 (eBook)
· Microsoft® Office 365: Connect and Collaborate Anywhere, Anytime (eBook)
· Introducing Hyper-V in Windows Server 2008 R2 (learning snack)
· SQL Server 2012: Cloud on Your Terms (learning snack)
· Introduction to Microsoft Windows Azure Platform (learning snack)

Microsoft's cloud-based technologies are relevant to specific job roles. Start here: Microsoft Cloud Training. To understand more, check out the Microsoft Learning Cloud Brochure.

Thank you, and good luck!

Tags:

SQL | Azure | Office 365 | Hyper-V

Automatic SQL Azure backup part 1

by Rok Bermež 18. November 2011 14:24
Backup for SQL Azure has been one of the most voted-on feature requests since the beginning. Sure, we had the SQL Migration Wizard, BCP, SSIS, PowerShell cmdlets from Cerebrata, and later a very nice tool from Red Gate (that I still use a lot) - SQL Azure Backup. All of them share one flaw: they are either impossible or very hard to use for automatic backups that require no on-premises infrastructure.

For a while now, Import and Export CTP functionality has been available through the Windows Azure management portal, which exports or imports a SQL DACPac package. This is exactly the functionality I want to integrate into my cloud applications that use SQL Azure. The MSDN documentation for that REST API seems to be completely lacking, but fortunately there is a SQL DAC Examples project on CodePlex that we can use to see how it's done.

First, we add a service reference to http://dacdc.cloudapp.net/DACWebService.svc/mex and generate the required proxy classes. Now we can make WebRequests for the specific actions (import, export, status) to URLs that are specific to each Windows Azure datacenter. Here is the mapping:

Region            URL
North Central US  https://ch1prod-dacsvc.azure.com/DACWebService.svc
South Central US  https://sn1prod-dacsvc.azure.com/DACWebService.svc
North Europe      https://db3prod-dacsvc.azure.com/DACWebService.svc
West Europe       https://am1prod-dacsvc.azure.com/DACWebService.svc
East Asia         https://hkgprod-dacsvc.azure.com/DACWebService.svc
Southeast Asia    https://sg1prod-dacsvc.azure.com/DACWebService.svc

Export

To create a backup in blob storage, we create an object of type ExportInput and POST it to the datacenter URL + "/Export":

    public Guid ExportToStorage(string storageContainer, string fileName, BlobContainerPublicAccessType blobContainerPublicAccessType = BlobContainerPublicAccessType.Off)
    {
        // Point the service at the blob the backup should be written to.
        var blobCredentials = GetBlobCredentials(storageContainer, fileName, true, blobContainerPublicAccessType);
        var exportRequest = new ExportInput
        {
            BlobCredentials = blobCredentials,
            ConnectionInfo = _connectionInfo
        };

        // Serialize the request as XML and POST it to the datacenter's /Export endpoint.
        var req = GetRequest(new Uri(_dataCenterUrl + "/Export"), RequestMethod.POST);
        var serializer = new DataContractSerializer(typeof(ExportInput));
        var requestStream = req.GetRequestStream();
        serializer.WriteObject(requestStream, exportRequest);
        requestStream.Close();

        // The response body carries a GUID identifying the asynchronous operation.
        var resp = req.GetResponse();
        return GetGuidFromResponse(resp);
    }

    private BlobStorageAccessKeyCredentials GetBlobCredentials(string storageContainer, string fileName, bool createIfNotExist = false, BlobContainerPublicAccessType blobContainerPublicAccessType = BlobContainerPublicAccessType.Off)
    {
        var storageCredentials = new StorageCredentialsAccountAndKey(_storageConnectionInfo.AccountName, _storageConnectionInfo.AccountKey);
        var storageAccount = new CloudStorageAccount(storageCredentials, _storageConnectionInfo.UseHttps);
        var cloudBlobClient = storageAccount.CreateCloudBlobClient();
        var cloudBlobContainer = cloudBlobClient.GetContainerReference(storageContainer);
        if (createIfNotExist)
        {
            // Create the container on first use and apply the requested access level.
            if (cloudBlobContainer.CreateIfNotExist())
            {
                var permissions = cloudBlobContainer.GetPermissions();
                permissions.PublicAccess = blobContainerPublicAccessType;
                cloudBlobContainer.SetPermissions(permissions);
            }
        }
        var cloudBlob = cloudBlobContainer.GetBlobReference(fileName);
        var backupBlobUri = cloudBlob.Uri.ToString();
        var blobCredentials = new BlobStorageAccessKeyCredentials
        {
            StorageAccessKey = _storageConnectionInfo.AccountKey,
            Uri = backupBlobUri,
        };
        return blobCredentials;
    }

    private HttpWebRequest GetRequest(Uri uri, RequestMethod requestMethod)
    {
        var req = (HttpWebRequest)WebRequest.Create(uri);
        req.Method = requestMethod.ToString().ToUpper();
        req.ContentType = "application/xml";
        return req;
    }
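As a quick illustration, a worker role could kick off a nightly export with a call like the following; the "backups" container name, the file-naming scheme, and the backup object holding the methods above are illustrative assumptions, not part of the post's library:

    // Illustrative sketch only: the container name and file-name pattern are assumptions.
    var requestId = backup.ExportToStorage(
        "backups",
        string.Format("mydb-{0:yyyy-MM-dd}.bacpac", DateTime.UtcNow));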
Import

For import the process is very similar: we create an object of type ImportInput and POST it to the datacenter URL + "/Import":

    public Guid ImportFromStorage(string storageContainer, string fileName, SqlAzureEdition sqlAzureEdition = SqlAzureEdition.Web, SqlAzureSize sqlAzureSize = SqlAzureSize.GB_1, string newDbName = null)
    {
        var blobCredentials = GetBlobCredentials(storageContainer, fileName);
        var importRequest = new ImportInput
        {
            BlobCredentials = blobCredentials,
            AzureEdition = sqlAzureEdition.ToString().ToLower(),
            DatabaseSizeInGB = (int)sqlAzureSize,
            // Import into the original database unless a new name was supplied.
            ConnectionInfo = String.IsNullOrEmpty(newDbName)
                ? _connectionInfo
                : new ConnectionInfo
                {
                    DatabaseName = newDbName,
                    ServerName = _connectionInfo.ServerName,
                    UserName = _connectionInfo.UserName,
                    Password = _connectionInfo.Password
                }
        };

        var req = GetRequest(new Uri(_dataCenterUrl + "/Import"), RequestMethod.POST);
        var serializer = new DataContractSerializer(typeof(ImportInput));
        var requestStream = req.GetRequestStream();
        serializer.WriteObject(requestStream, importRequest);
        requestStream.Close();
        var resp = req.GetResponse();
        return GetGuidFromResponse(resp);
    }

Status

Both actions return a GUID representing the submitted operation, which we can use to check whether it succeeded. We do this by making a GET request to the datacenter URL + "/Status?servername={0}&username={1}&password={2}&reqId={3}". If we want the history of all operations and their statuses for this datacenter, we can make the same request without the reqId parameter.

    public StatusInfo GetStatusInfo(Guid requestId)
    {
        string uriBuilder = _dataCenterUrl + string.Format(
            "/Status?servername={0}&username={1}&password={2}&reqId={3}",
            _connectionInfo.ServerName, _connectionInfo.UserName, _connectionInfo.Password, requestId.ToString());
        var req = GetRequest(new Uri(uriBuilder), RequestMethod.GET);
        var response = req.GetResponse();
        var statusInfos = GetStatusInfoFromResponse(response);
        // The first entry is the status of the requested operation.
        return statusInfos[0];
    }

Here ( SqlAzure.rar (22,71 kb) ) you can download a complete library that you can use in your Azure worker tasks to automatically back up your SQL Azure databases. Next: how to create a scheduler that uses it.
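Until then, the core of that scheduler's loop will be polling the status endpoint until the operation finishes. Here is a minimal sketch, assuming the generated StatusInfo proxy exposes a Status string with "Completed" and "Failed" as terminal values (both assumptions - verify against the generated proxy classes); requestId is the GUID returned by ExportToStorage or ImportFromStorage:

    // Poll until the asynchronous operation reaches a terminal state.
    // "Completed"/"Failed" and the Status property are assumptions based on
    // typical service responses - check the generated proxy classes.
    // Requires: using System.Threading;
    StatusInfo status;
    do
    {
        Thread.Sleep(TimeSpan.FromSeconds(30)); // the service processes requests in the background
        status = backup.GetStatusInfo(requestId);
    } while (status.Status != "Completed" && status.Status != "Failed");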

Tags:

c# | Azure | SQL

Truncating parent tables in MS SQL

by Rok Bermež 8. July 2009 11:49
If you've ever tried testing SQL stored procedures and functions, you've undoubtedly run into situations… [More]

Tags:

SQL

Grant execute rights on schema to users

by Rok Bermež 8. July 2009 09:43
The easiest way is to create a new role, grant it execute rights, and then just add users to that role:

    CREATE ROLE db_executor
    GRANT EXECUTE TO db_executor

    -- then add members, e.g. (SomeUser is a placeholder for an actual database user):
    EXEC sp_addrolemember 'db_executor', 'SomeUser'

Tags:

SQL
