Using Windows Azure Connect

by Rok Bermež 15. February 2012 18:25
Windows Azure Connect enables Windows Azure users to set up secure, IP-level network connectivity between their Windows Azure hosted services and local (on-premises) resources. To set it up, you must first connect to the Windows Azure Management portal and enable it for your subscription. Next you need to get the activation token. Then open the ServiceDefinition.csdef file and import the Connect module for your roles. <Imports> <Import moduleName="Connect" /> </Imports> And set the token in the csconfig file. <Setting name="Microsoft.WindowsAzure.Plugins.Connect.ActivationToken" value="your_ activation_token_guid" /> To gain access to local resources you need to install Windows Azure Connect Endpoint software that you get on https://waconnecttokenpage.cloudapp.net/Default.aspx?token=yourtoken After Connect Endpoint is installed, it will automatically “activate” itself with the Connect service which should take around 10 to 30 seconds. Once a local machine is activated, it will appear in the Virtual Network of the Management Portal when you select the “Activated Endpoints” node or the “Groups and Roles” node. Now you have to you can define your network connectivity policy in the Virtual Networks section of the Management Portal. If the “Interconnected” check box is checked, then machines that belong to the group will be able to communicate with each other via Connect. If it is set to false, then machines in the group will not be able to communicate with each other. You cant ping your roles in the cloud because their local firewall prevents it, but there is a fix for that. Just add a startup task that ads a firewall rule to the firewall. Echo Enable ICMP netsh advfirewall firewall add rule name="ICMPv6 echo" dir=in action=allow enable=yes protocol=icmpv6:128,any exit /b 0 Connect will automatically track changes made to your Windows Azure role and maintain connectivity. If you increase the number of Windows Azure role instances, Connect will automatically connect those new role instances based on the current network policy. The REALLY bad side of it is when you redeploy the app, you will have to add your new deployment to your network policy manualy, since we currently dont have this available in management api.  

Tags:

Azure

Automatic SQL Azure backup part 1

by Rok Bermež 18. November 2011 14:24
Backup for SQL Azure was one of the most voted-on features since the beginning. Sure, we had the SQL Migration Wizard, BCP, SSIS, PowerShell cmdlets from Cerebrata, and later a very nice tool from Red Gate (that I still use a lot), SQL Azure Backup. All of them share one flaw: they are either impossible or very hard to use for automatic backups that require no on-premises infrastructure. For a while now, Import and Export CTP functionality has been available through the Windows Azure Management Portal, which exports or imports a SQL DacPac package. This is exactly the functionality that should be integrated into my SQL Azure cloud applications. The MSDN documentation for that REST API seems to be completely lacking, but fortunately there is a SQL DAC Examples project on CodePlex that we can use to see how it's done.

First, we add a service reference to http://dacdc.cloudapp.net/DACWebService.svc/mex and generate the required proxy classes. Now we can make web requests for specific actions (import, export, status) to URLs that are specific to each Windows Azure datacenter. Here is the mapping:

Region              URL
North Central US    https://ch1prod-dacsvc.azure.com/DACWebService.svc
South Central US    https://sn1prod-dacsvc.azure.com/DACWebService.svc
North Europe        https://db3prod-dacsvc.azure.com/DACWebService.svc
West Europe         https://am1prod-dacsvc.azure.com/DACWebService.svc
East Asia           https://hkgprod-dacsvc.azure.com/DACWebService.svc
Southeast Asia      https://sg1prod-dacsvc.azure.com/DACWebService.svc

Export

To create a backup in storage, we create an object of type ExportInput and POST it to the datacenter URL + "/Export":

public Guid ExportToStorage(string storageContainer, string fileName,
    BlobContainerPublicAccessType blobContainerPublicAccessType = BlobContainerPublicAccessType.Off)
{
    // Build blob credentials (creating the container if needed) and POST an
    // ExportInput to the datacenter's /Export endpoint.
    var blobCredentials = GetBlobCredentials(storageContainer, fileName, true, blobContainerPublicAccessType);
    var exportRequest = new ExportInput
    {
        BlobCredentials = blobCredentials,
        ConnectionInfo = _connectionInfo
    };
    var req = GetRequest(new Uri(_dataCenterUrl + "/Export"), RequestMethod.POST);
    var serializer = new DataContractSerializer(typeof(ExportInput));
    var requestStream = req.GetRequestStream();
    serializer.WriteObject(requestStream, exportRequest);
    requestStream.Close();
    var resp = req.GetResponse();
    return GetGuidFromResponse(resp);
}

private BlobStorageAccessKeyCredentials GetBlobCredentials(string storageContainer, string fileName,
    bool createIfNotExist = false,
    BlobContainerPublicAccessType blobContainerPublicAccessType = BlobContainerPublicAccessType.Off)
{
    var storageCredentials = new StorageCredentialsAccountAndKey(_storageConnectionInfo.AccountName, _storageConnectionInfo.AccountKey);
    var storageAccount = new CloudStorageAccount(storageCredentials, _storageConnectionInfo.UseHttps);
    var cloudBlobClient = storageAccount.CreateCloudBlobClient();
    var cloudBlobContainer = cloudBlobClient.GetContainerReference(storageContainer);
    if (createIfNotExist && cloudBlobContainer.CreateIfNotExist())
    {
        // Apply the requested public-access level only when we created the container.
        var permissions = cloudBlobContainer.GetPermissions();
        permissions.PublicAccess = blobContainerPublicAccessType;
        cloudBlobContainer.SetPermissions(permissions);
    }
    var cloudBlob = cloudBlobContainer.GetBlobReference(fileName);
    return new BlobStorageAccessKeyCredentials
    {
        StorageAccessKey = _storageConnectionInfo.AccountKey,
        Uri = cloudBlob.Uri.ToString(),
    };
}

private HttpWebRequest GetRequest(Uri uri, RequestMethod requestMethod)
{
    var req = (HttpWebRequest)WebRequest.Create(uri);
    req.Method = requestMethod.ToString().ToUpper();
    req.ContentType = "application/xml";
    return req;
}

Import

For import the process is very similar: we create an object of type ImportInput and POST it to the datacenter URL + "/Import":

public Guid ImportFromStorage(string storageContainer, string fileName,
    SqlAzureEdition sqlAzureEdition = SqlAzureEdition.Web,
    SqlAzureSize sqlAzureSize = SqlAzureSize.GB_1,
    string newDbName = null)
{
    var blobCredentials = GetBlobCredentials(storageContainer, fileName);
    var importRequest = new ImportInput
    {
        BlobCredentials = blobCredentials,
        AzureEdition = sqlAzureEdition.ToString().ToLower(),
        DatabaseSizeInGB = (int)sqlAzureSize,
        // Import into the original database unless a new name is supplied.
        ConnectionInfo = String.IsNullOrEmpty(newDbName)
            ? _connectionInfo
            : new ConnectionInfo
              {
                  DatabaseName = newDbName,
                  ServerName = _connectionInfo.ServerName,
                  UserName = _connectionInfo.UserName,
                  Password = _connectionInfo.Password
              }
    };
    var req = GetRequest(new Uri(_dataCenterUrl + "/Import"), RequestMethod.POST);
    var serializer = new DataContractSerializer(typeof(ImportInput));
    var requestStream = req.GetRequestStream();
    serializer.WriteObject(requestStream, importRequest);
    requestStream.Close();
    var resp = req.GetResponse();
    return GetGuidFromResponse(resp);
}

Status

Both actions return a GUID representing the submitted operation, which we can use to check whether the operation succeeded. We do this by making a GET request to the datacenter URL + "/Status?servername={0}&username={1}&password={2}&reqId={3}". If we want the history of operations and their statuses for this datacenter, we make the same request without reqId:

public StatusInfo GetStatusInfo(Guid requestId)
{
    var uri = _dataCenterUrl + string.Format(
        "/Status?servername={0}&username={1}&password={2}&reqId={3}",
        _connectionInfo.ServerName, _connectionInfo.UserName, _connectionInfo.Password, requestId);
    var req = GetRequest(new Uri(uri), RequestMethod.GET);
    var response = req.GetResponse();
    var statusInfos = GetStatusInfoFromResponse(response);
    return statusInfos[0];
}

Here (SqlAzure.rar (22,71 kb)) you can download a complete library that you can use in your Azure worker tasks to automatically back up your SQL Azure databases. Next: how to create a scheduler that uses it.
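To put the library to work from a worker role, a loop like the following could kick off an export and poll the status endpoint until the service reports a terminal state. This is a minimal sketch: the SqlAzureBackup wrapper name, its constructor arguments, and the StatusInfo.Status values ("Completed"/"Failed") are assumptions about the downloadable library, not verified API:

using System;
using System.Threading;

// Hypothetical wrapper exposing the methods shown above.
var backup = new SqlAzureBackup(/* datacenter URL, SQL Azure connection info, storage connection info */);

// Kick off an export to blob storage; the service returns a request GUID immediately.
Guid requestId = backup.ExportToStorage("backups",
    "mydb-" + DateTime.UtcNow.ToString("yyyyMMddHHmm") + ".bacpac");

// Poll the /Status endpoint until the operation reaches a terminal state.
StatusInfo status;
do
{
    Thread.Sleep(TimeSpan.FromSeconds(30));
    status = backup.GetStatusInfo(requestId);
} while (status.Status != "Completed" && status.Status != "Failed");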

Tags:

c# | Azure | SQL

Azure AppFabric Cache HowTo

by Rok Bermež 2. May 2011 19:24
Well since we now have Azure AppFabric Cache available, let’s get a head start on how to use it in your Cloud ASP.NET (MVC) application. First, you need to have AppFabric 1.0 April Refresh SDK installed on your machine so grab it at here. Next, go to Windows Azure Management portal. Log in, go to AppFabric/Cache and create new service namespace: wait for the service to be activated. Then click 'View Client Configuration Button' you will get a nice pre prepared configuration settings (with all those pesky security information included) for your app: now we have all the pieces of the puzzle ready, all we have to do is to add references to caching dlls (located at C:\Program Files\Windows Azure AppFabric SDK\V1.0\Assemblies\NET4.0\Cache) to our application and change web.config with the settings given by previous step. You would always need to put cache section in <configSections> node: <section name="dataCacheClients" type="Microsoft.ApplicationServer.Caching.DataCacheClientsSection, Microsoft.ApplicationServer.Caching.Core" allowLocation="true" allowDefinition="Everywhere"/> and configure your Cache EndPoints: <dataCacheClients> <dataCacheClient name="default"> <hosts> <host name="ntkonferenca.cache.windows.net" cachePort="22233" /> </hosts> <securityProperties mode="Message"> <messageSecurity authorizationInfo="1AmN0tT371NgUtH@T"> </messageSecurity> </securityProperties> </dataCacheClient> <dataCacheClient name="SslEndpoint"> <hosts> <host name="ntkonferenca.cache.windows.net" cachePort="22243" /> </hosts> <securityProperties mode="Message" sslEnabled="true"> <messageSecurity authorizationInfo="1AmN0tT371NgUtH@T"> </messageSecurity> </securityProperties> </dataCacheClient> </dataCacheClients> If your application uses Session State and you want it to use Azure AppFabric Cache (which you do) you change your <sessionState> node to: <sessionState mode="Custom" customProvider="AppFabricCacheSessionStoreProvider"> <providers> <add name="AppFabricCacheSessionStoreProvider" type="Microsoft.Web.DistributedCache.DistributedCacheSessionStateStoreProvider, Microsoft.Web.DistributedCache" cacheName="default" useBlobMode="true" dataCacheClientName="default" /> </providers> </sessionState> and the same for OutputCache by changing <caching> note: <caching> <outputCacheSettings enableFragmentCache="true" defaultProvider="DistributedCache"> <outputCacheProfiles> <add name="default" duration="43000" /> <add name="DistributedCache" type="Microsoft.Web.DistributedCache.DistributedCacheOutputCacheProvider, Microsoft.Web.DistributedCache" cacheName="default" dataCacheClientName="default" /> </outputCacheProfiles> </outputCacheSettings> </caching> That’s it, nothing needs to be changed in your code, your app is configured and ready to be published to the cloud. AppFabric Cache is not a free service even if you won’t be charged anything for its use prior to August 1, 2011. After that the prices are: 128 MB cache for $45.00/month 256 MB cache for $55.00/month 512 MB cache for $75.00/month 1 GB cache for $110.00/month 2 GB cache for $180.00/month 4 GB cache for $325.00/month In my next post, Ill tackle with using AppFabric Cache without using prepared providers by simple GET and PUT statements (lets get rid of HttpRuntime.Cache as well).  

Tags:

.Net | c# | Azure

Windows Azure AppFabric Caching Service Released

by Rok Bermež 30. April 2011 10:52
Finally the Azure thingy I’ve been waiting for the most. From the official source: Today we are excited to announce that the Caching service has been released as a production service.   The Caching service is a distributed, in-memory, application cache service that accelerates the performance of Windows Azure and SQL Azure applications by allowing you to keep data in-memory and saving you the need to retrieve that data from storage or database. We provide 6 different cache size options for you to choose from, varying from 128MB to 4GB.  In order for customers to be able to start using the service and evaluate your needs we are running a promotion period in which we will not be charging for the service for billing periods prior to August 1, 2011. To learn more about the Caching service please use the following resources: ·        Windows Azure AppFabric Caching availability announced!! blog post, including more information on pricing. ·        Video: Introduction to the Windows Azure AppFabric Cache ·        Video: Windows Azure AppFabric Caching – How to Set-up and Deploy a Simple Cache ·        Windows Azure AppFabric FAQ on MSDN ·        MSDN Documentation The service is already available in our production environment at: http://appfabric.azure.com. For questions on the Caching service please visit the Windows Azure Storage Forum. Customers can take advantage of our free trial offer to get started with the Caching service and Windows Azure AppFabric. Just click on the image below and get started today!  

Tags:

.Net | Azure

Windows Azure Cache Dependency

by Rok Bermež 10. April 2011 23:07
We are supposed to get Windows AppFabric Cache real soon, but till than form time to time we need to synchronize content cached inside our Web roles. SqlDependency is one way, but it cannot solve all problems (especialy those that are not based on SQL data). So to help with the matter I wrote AzureStorageCacheDependency that uses Azure storage to know when data is outdated and cache shloul be cleared. If anyone is in need of something similar, here it goes: public class AzureStorageCacheDependency : System.Web.Caching.CacheDependency { private System.Threading.Timer _timer; private int _poolTime; private CloudBlob _blob; private string _key; public AzureStorageCacheDependency(string connectionString, string conteinerAddress, string blobAddress, int poolTime = 5000) { _poolTime = poolTime; using (AzureContext azureContext = new AzureContext(true)) { var storageAccount = CloudStorageAccount.Parse(connectionString); CloudBlobClient blobStorage = storageAccount.CreateCloudBlobClient(); CloudBlobContainer container = blobStorage.GetContainerReference(conteinerAddress.ToLower()); container.CreateIfNotExist(); _blob = container.GetBlockBlobReference(blobAddress.ToLower()); if (!Exists(_blob)) { Reset(); } else { _key = _blob.DownloadText(); } } _timer = new Timer(new System.Threading.TimerCallback(CheckDependencyCallback), this, 0, _poolTime); } public void Reset() { _key = Guid.NewGuid().ToString(); _blob.UploadText(_key); } private void CheckDependencyCallback(object sender) { if (!Exists(_blob) || _key != _blob.DownloadText()) { NotifyDependencyChanged(this, EventArgs.Empty); _timer.Dispose(); } } public static bool Exists(CloudBlob blob) { try { blob.FetchAttributes(); return true; } catch (StorageClientException e) { if (e.ErrorCode == StorageErrorCode.ResourceNotFound) { return false; } else { throw; } } } } public class AzureContext : IDisposable { HttpContext _oldHttpContext; bool _restoreOldHttpContext = false; public AzureContext(bool forceSettingContextToNull = false) { if (forceSettingContextToNull) { _oldHttpContext = HttpContext.Current; HttpContext.Current = null; _restoreOldHttpContext = true; } else { try { HttpResponse response = HttpContext.Current.Response; } catch (HttpException) { _oldHttpContext = HttpContext.Current; HttpContext.Current = null; _restoreOldHttpContext = true; } } } public void Dispose(bool disposing) { if (disposing) { if (_restoreOldHttpContext) { HttpContext.Current = _oldHttpContext; } } } public void Dispose() { Dispose(true); } ~AzureContext() { Dispose(false); } }

Tags:

.Net | c# | ASP.NET | Azure | Web
