Tag Archive: Cloud


For the past couple of months, I have been testing some cloud-related applications to take full advantage of cloud storage, shared sessions across different nodes, and cloud-to-local connectivity via a virtual network using Azure Connect. Each presents its own challenges. Finding a provider with a high level of uptime and the ability to scale is very important. So far I have been pretty happy with Windows Azure's ability to scale to different application sizes, as well as its range of technologies that bridge the gap in each of these areas.

With cloud-related technologies it can be genuinely difficult to figure out how to handle things that are taken for granted when the servers sit inside your own network and are completely manageable locally. This ranges from simple things like remoting into a machine, all the way to managing session state between nodes as they scale out.

Two other obstacles are maintaining dependencies on servers that require certain software to be installed at runtime, and ensuring greater security of important business logic and services required for a software solution. The two major ways of dealing with this type of issue are to either:

A) Deploy custom images to the cloud that are prebuilt with the software you need to run in the cloud.

B) Make a library available outside of the cloud that can be called from the cloud, housing software dependencies and requirements that might not be feasible in the cloud.

Each has its own positives and negatives.

First, a custom cloud image can take some time to deploy. The image not only needs to be deployed, but also updated after deployment: it contains server software that is critical to keep up to date, so even an already-built image can take a while to roll out just for maintenance. Uploading these deployments can also take some time, since the images can be several gigabytes in size.

Second, a local library adds more moving parts to the system, which can make some issues harder to debug because there are more places where things can break. One nice thing, though, is that exceptions thrown locally show up in the local event viewer, so you essentially get another tier of error reporting off of the cloud (for dependencies within that module). The big positive is that you can keep libraries and services whose software and dependency requirements just don't make sense for the cloud. For example, say you have an order module that sends information to a backend system but requires two or three other dependencies that live on your network. A pure cloud solution really isn't feasible: you would have to make multiple connections, install the software on a custom image and deploy it, test the connection between those nodes on every instance, and ensure the remote nodes can handle that many connections as the number of instances grows.

Keeping this in mind, sometimes it makes sense to go the second route: connect a remote library or service to your instances to provide those much-needed dependencies, reliably and responsibly. To do so, we need a connector that lets the cloud reach services inside the network, and this is where Azure Connect comes in.

Azure Connect is a program that works with Windows Azure to help bridge the gap between endpoints (your servers, on your own network) and your published software. These are the steps I took to set up Azure Connect between my local desktop and my published Azure role:

  1. Created the WCF project and published it via localhost to a directory on my local computer.
  2. Created a cloud project that accesses the local service by its qualified name, [computer].[mydomain].com (see the sketch after this list).
  3. Ensured it worked properly on localhost and that the service was accessible to other computers on my network.
  4. Published the cloud project and ensured it worked fine without the connection to the endpoint being referenced.
  5. Downloaded the endpoint client from the Azure portal, installed it, and confirmed I was connected.
  6. Moved my endpoint into a group.
  7. Connected the group to the role I want to connect to.
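
As a rough sketch of step 2, assuming Service1Client is the proxy generated from the local WCF service and myserver.mydomain.com is just a stand-in for your machine's fully qualified name, the cloud project points its client endpoint at the on-premises box something like this:

Code Snippet

  using System.ServiceModel;

  // Sketch only: Service1Client is the generated proxy, and the address uses the
  // machine's fully qualified name so Azure Connect can route the call from the
  // cloud role back to the local server.
  Service1Client myService = new Service1Client(
      new BasicHttpBinding(),
      new EndpointAddress("http://myserver.mydomain.com/Service1.svc"));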

The tutorial I followed, though, doesn't go into how to troubleshoot nodes or how to ping external entities. To enable ping, you need a startup command that opens the firewall. To do so, create a file named 'enableping.cmd' and insert the following into it:

 

Code Snippet

  Echo Enable ICMP
  netsh advfirewall firewall add rule name="ICMPv6in" dir=in action=allow enable=yes protocol=icmpv6
  exit /b 0

 

Now that you have the file, you must tell the application to run this command on startup. In the service definition of your cloud project, inside the WebRole configuration, add:

 

Code Snippet

  <Startup>
    <Task commandLine="enableping.cmd" executionContext="elevated" taskType="simple"/>
  </Startup>

This will allow you to ping the cloud. To enable pinging from the cloud to your own server, you MUST run the same command on your server. To do so, open a command prompt window and run the same command:

  netsh advfirewall firewall add rule name="ICMPv6in" dir=in action=allow enable=yes protocol=icmpv6

Once this is done, you should be able to ping back and forth between your server and the cloud; if not, there is something else wrong with your network setup. Checking that the Azure Connect icon shows you are fully connected, with no issues, on both the cloud and the server is an important step in verifying this is working.

Another issue I ran into was that port 80 on my computer was not open for the traffic trying to consume the local service I had created. Once port 80 was opened, I had no more issues connecting from the cloud to the service running on a local IIS deployment within my network. Note that this service was not made available outside of my network, so Azure Connect is what provides this connectivity.

Every day we hit challenges with new technologies, and the cloud is definitely a challenging environment. There are a lot of things that can (and probably will) fail along the way. The only way to make it through is to keep going even when you hit a wall. Eventually you will get through!

I have been working on several projects lately that utilize the cloud for many different things, including service, session, database, and client hosting. The ultimate goal is to host solutions in the cloud in a highly available manner, with the flexibility to allocate more resources when needed. I have run into a case where I would like to pass extra information into WCF operations that lets me confirm the caller's identity and manage some other things behind the scenes, without passing this information as parameters.

My original thinking was to somehow extend the functionality of WCF itself to allow adding values that would be set once at the start of a session.

An example being:

 

Code Snippet

  Service1Client myService = new Service1Client();
  myService.itemToPass = "Hello World!";

 

I didn't really find anything that could meet my needs! It amazes me that there wasn't more information out there about this type of data passing, since it seems a lot more reasonable to offer something like this than to pass the same value to every method within a service. So I asked on a couple of forums, and one post suggested possibly using a static variable for this, so I finally found a solution!

So, at the service level, I declared a private static variable that stores the value globally and makes it available to other methods within the service. I also created a setter method so the variable can be set when the connection is established.

Code Snippet

  public class Service1 : IService1 {
      private static int customVar { get; set; }

      public void setCustomVar(int CustomVar) {
          customVar = CustomVar;
      }

      // …
  }
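
Keep in mind that for the generated client proxy to call that setter, the operation also has to be exposed on the service contract. A minimal sketch of the matching interface, showing only the new operation:

Code Snippet

  using System.ServiceModel;

  [ServiceContract]
  public interface IService1 {
      // Exposes the setter so the client can push the value once after connecting.
      [OperationContract]
      void setCustomVar(int CustomVar);

      // … the rest of the service operations stay as they were.
  }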

 

You could ultimately pass multiple things through the service this way, but remember to limit this to values that MUST be available globally. From here, it is just a matter of connecting to the service on the client and setting this variable.

 

Code Snippet

  Service1Client myService = new Service1Client();
  myService.setCustomVar(12345);

 

Now that the variable is set, it can be utilized for the lifetime of the connection to the service. This is very useful for values you want available globally but don't want to pass as a parameter on every call!
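
As a quick sketch of what that looks like on the service side (GetData here is just a hypothetical operation, not one from the snippets above), any other method on Service1 can simply read the static value instead of taking it as a parameter:

Code Snippet

  // Hypothetical operation on Service1 that uses the value set earlier via
  // setCustomVar, so it never has to be passed in with each call.
  public string GetData(int value) {
      return string.Format("You entered: {0} (customVar = {1})", value, customVar);
  }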