
Using C++ in an Azure Worker Role for a Compute Intensive Task

Recently a situation came up with an application I was working on. My client had implemented a proprietary algorithm in a legacy desktop application. The algorithm is very compute intensive. As it happens, it was an actuarial calculation, but the specifics don’t really matter. The computation was written in highly optimized C code.

In thinking about how to port this application to Azure it was clear to me that I did not want to rewrite all that code! In fact, that was not even an option. The client wanted to be sure that the exact code they knew and trusted was what was executing. The solution I chose was to host the application natively in a worker role. That way I could pretty much drop the existing code in, run it through the C++ compiler and linker, and party on from within Visual Studio.

This post highlights some of the considerations that exist when implementing a worker role that hosts an application in this way. For my example here I have replaced the actuarial calculation with another intensive task: calculate pi to an arbitrarily large number of decimal places. For that algorithm I adapted some nifty C code posted on Planet Source Code. I also borrowed liberally from a project I found in the MSDN Code Gallery. Both of these are excellent resources.

The first thing I did was to create a new C++ project in Visual Studio. Then I dropped the algorithm code into a new source file. After verifying that the code compiled properly and functioned as expected I was ready to add a new cloud service project to the solution.

Since the C++ application (in this case called “MyPI”) is to be hosted in the worker role, I added the executable to the worker role project as a linked item. Setting its “Copy to Output Directory” property to “Copy if Newer” ensures that the latest bits for that program always get included in the deployment package.

Figure 1 – MyPI Solution

The architecture of the solution is pretty straightforward. The web role asks for the desired precision of the calculation and places that value onto a queue, which is read by the worker role. The worker role launches the native process with the precision passed as a command line argument. Standard output is redirected into a StreamReader object, which is read into a string variable in the worker role. This string (which may be very long!) is then uploaded to an Azure blob. Back in the web role, the blob is read back into a string and used to populate a text box.


As usual the actual production version of the code (and ultimate deployment to Azure!) was a little more complicated than presented here. Still, this simple demo gives an overview of how it is possible to leverage existing code written in C/C++ to port computationally intensive legacy applications to the Azure cloud.

Happy Coding!

Kevin

Worker Role Communication in Windows Azure, Part 1

In an earlier post we talked about the “Asynchronous Web/Worker Role Pattern” in Windows Azure. In this pattern the web roles are “client facing” and expose an HTTP endpoint. In Azure a web role is essentially a virtual machine running IIS, while a worker role is a background process that is usually not visible to the outside world. The web roles communicate with the worker roles via asynchronous messages passed through a queue. Even though it is relatively simple, this basic architecture is at the heart of many highly scalable Azure applications.

There are, however, other ways in which roles can communicate, internally or externally, which allows for considerable flexibility when implementing Azure based solutions. A worker role, for example, could be directly exposed to the outside world (through a load balancer, of course!) and communicate over TCP or another protocol. This might be useful in a solution that required, for some reason, a highly customized application server exposed over the Internet. It is also possible for worker roles to communicate directly with other worker roles internally. That internal communication is often done over TCP, but other protocols can be used as well.

In the first of this two part series we will explore the basics of exposing an external (i.e. available over the Internet) TCP endpoint on a worker role. We will use this to implement a simple client/server version of a “math service”. Note that this is not necessarily something you would want to do in practice! It is used here simply as an example of how to enable the communication. In the real world the “math service” could be replaced by whatever custom application service was required.

As usual, to get started we use Visual Studio 2010 to create a Cloud Service project and add a single worker role to it. We can then define an “InputEndpoint” (which is Microsoft’s term for an endpoint that is visible to the external world). An InputEndpoint has a name, a protocol and a port number.

This definition is done in the ServiceDefinition.csdef file (some details omitted for clarity):

<?xml version="1.0" encoding="utf-8"?>
<ServiceDefinition … >
  <WorkerRole name="Square.Service">
    <ConfigurationSettings>
      <Setting … />
    </ConfigurationSettings>
    <Endpoints>
      <InputEndpoint name="SquareEndpoint" protocol="tcp" port="2345" />
    </Endpoints>
  </WorkerRole>
</ServiceDefinition>

Note that the port number (in this case I have arbitrarily chosen 2345) is the port the load balancer listens on. You need to make an API call to get the endpoint actually used internally to Azure, as follows:

RoleEnvironment.CurrentRoleInstance.InstanceEndpoints["SquareEndpoint"].IPEndpoint;

Hopefully a demo will make this clearer:

Note that this demo is very minimal and does not necessarily demonstrate good design or programming practice. Its sole purpose is to show how an Azure worker role can communicate with the external world over TCP.

By opening up role communication in this way Microsoft has given us options to implement a wide variety of architectures in Azure. In a future post we will examine how roles can communicate directly with other roles internally.

Kevin Kell

