CHAPTER 12

image

Workflows in Windows Azure

What is the cloud? How can it help my business? These are the questions I hear from customers who are interested in either moving to the cloud or contemplating an investment in using the cloud to start a business. Generally, what I tell these customers is that the cloud levels the playing field: companies of different sizes have access to the same IT infrastructure and computing power, available on demand. Limitations on computing power and scalability are therefore no longer a concern.

Once cloud technology started becoming more popular, workflow enthusiasts began experimenting with the capabilities the cloud could offer for building and running workflows within Windows Azure. If you are not familiar with the cloud, think of it as the ability to run software on servers that are hosted off premise, that is, not physically located within an organization. The servers are virtual but behave just like physical servers; you can even remote desktop into them. The difference is that virtual servers run in memory, and a single physical server can have many virtual servers running within it.

What does the cloud mean to you from a developer’s perspective? Cloud computing offers developers the freedom to do what we do best, which is developing software. A major concern I have always had when delivering software solutions to clients is the physical infrastructure. I have never been interested in setting it up, and happily the cloud removes the stress of having to worry about how this is done.

One of the main goals of Windows Azure is to provide a development experience that does not change the way developers write software. I can write software just as I always do and decide later if and when I want to deploy my software to the cloud. Windows Azure also provides a way for my software to grow as the usage of my applications grows. This is done through configuration rather than by adding processors or memory. Once a change is made to the configuration, the results are immediate, so there is no waiting around until someone gets the time to make the change. The same thing applies to scaling down an application. Peak times for clients accessing my applications may be seasonal or tied to an event such as the Super Bowl. In order to save money, I might want to scale down my application so it does not use as many resources, since I am billed based on application utilization.

Now that you have an idea of how cloud computing works, the rest of this chapter will explain the advantages of exposing workflows through the cloud using Windows Azure by demonstrating different scenarios for building and running workflows within the cloud. In this chapter, I will cover configuring Windows Azure and the different components within Azure that can be used to host workflows within the cloud. Once a foundation for Azure is established, I will go over different business scenarios and the patterns and practices for providing solutions using workflows hosted within Azure.

Windows Azure

Windows Azure is Microsoft's cloud technology, and it supports several delivery models for software solutions that run within the cloud. The delivery models are as follows:

  • Software as a Service (SaaS): Services provided through software that clients can subscribe to. These services are usually domain specific, such as finance, sales, and human resources. As new clients subscribe to the service, there is little to no overhead for the software service provider.
  • Platform as a Service (PaaS): Provides development services that can be either tied together or used separately for building SaaS applications. Some of the essential services that Windows Azure provides are services for running web applications and worker services for running background business processes.
  • Infrastructure as a Service (IaaS): Provides the components for hosting virtual hardware and networks. Azure allows virtual machines to be hosted and scaled out depending on the growth of the business it supports.

9781430243830_Fig12-01.jpg

Figure 12-1.  Services provided through Windows Azure

Each of these service offerings is illustrated in Figure 12-1. You may be asking yourself, what is the difference between the cloud and simply finding a nearby data center that provides server leasing? Companies that lease servers typically offer two models. In the first, the client leasing the server manages the software and the upkeep of the server. In the second, the hosting company manages the server in return for a fee, so the client does not need to worry about the maintenance. Cloud technology is different because the maintenance is handled automatically. In fact, the client does not have to worry about leasing a physical or virtual server because the fee incurred is strictly for the usage of the software running within the cloud. In most cases, this can be significantly cheaper than leasing servers.

image Note   There is currently no support for WF in Windows Azure web sites.

Azure Portal

The latest release of Windows Azure was announced by Microsoft at the TechEd North America 2012 conference. One of the most noticeable changes is the look and feel of the Azure Portal. The Azure Portal is the main web site used for working with the many features that Windows Azure provides. The previous portal was built using Microsoft's Silverlight; the new portal has been designed using JavaScript and HTML5, the latest release of HTML. The portal's web address is http://manage.windowsazure.com/, but in order to work within the Azure Portal, you must have a Windows Live account. Figure 12-2 illustrates the login screen that a user is redirected to in order to either create a new account or sign in with an existing one.

9781430243830_Fig12-02.jpg

Figure 12-2.  Logging into the Azure Portal

While I was writing this chapter, Microsoft promoted a Windows Azure 90-day free trial. To check if it is still available, visit www.windowsazure.com/en-us/pricing/free-trial/. With the free trial you get access to the following:

  • 750 small compute hours a month
  • 10 shared web sites
  • 1GB SQL relational database
  • 20GB with 1,000,000 storage transactions
  • Unlimited bandwidth inbound with 20GB outbound

Once a new or existing account is used to log in, you can access the portal. If the account used to log in is eligible for the promotion, the first step is to create a new Azure account (see Figure 12-3).

9781430243830_Fig12-03.jpg

Figure 12-3.  Creating an Azure account

A mobile phone number is required to verify the new account. Entering a mobile number causes a verification code to be sent via a phone call or text message. Once the code is received, it can be entered to verify the account (see Figure 12-4).

9781430243830_Fig12-04.jpg

Figure 12-4.  Account verification

The final step is billing information. Since this is a trial, you will not be billed; however, if you decide to upgrade, you can use the credit card information as the source of payment. After the account is set up and the trial is approved, you can view the new Azure Portal, as illustrated in Figure 12-5.

9781430243830_Fig12-05.jpg

Figure 12-5.  New Azure Portal

The new portal has been designed to perform better than the previous one and navigation between all the services is more intuitive. If you have worked with Azure before, some of the features are still consistent with what was provided earlier; however, there are some new features that were just released within the Azure preview. First, the trial comes with 10 free web sites. Building web sites was not a feature that was provided until now. The second major feature is the ability to create VMs or virtual machines within Azure, without having to worry about any of the complexities of building them. Base operating systems can be selected and built within minutes, without having to install the operating system manually. Even though these new features are off topic for what will be covered in this chapter, you should be aware of them as you start learning more about what Azure has to offer.

Cloud Services

Figure 12-3 indicates that the trial offering for using Windows Azure offers 750 hours of cloud services. Cloud services provide the platform services, or Platform as a Service (PaaS), for building software. Cloud services alleviate the effort of having to focus on the infrastructure that the software requires to run, so focus can remain on developing software. Once software solutions are built, cloud services provide

  • High availability (HA) for making sure that the hosted software is always available and is resilient to system failure.
  • Administration for hosting the software, which is minimal.
  • Scalability for the software as its usage grows from demand.

Cloud services use virtualization to run instances of Windows Server for hosting applications. This differs from the IaaS offering of Azure VMs because administration of the server instance(s) is provided automatically through Azure, rather than you having to administer and provision an actual virtual machine manually. The tradeoff for not having to manage the virtual instances behind cloud services is less customization and control than you get over virtual machines created as IaaS. Next, let's discuss each virtual instance type that can be created (also known as a cloud service role).

Web Roles

Web roles run within IIS and are designed to host web applications that provide services to clients over HTTP or HTTPS; therefore, a web role is typically where the code for a web application resides when using cloud services. More than one web role can be created and run at a given time.

Worker Roles

A worker role runs separately from a web role and is typically used to run background business processes on behalf of the application. A worker role executes continuously; for example, as a client uses specific functionality within a web application hosted in a web role, tasks can be generated that the worker role handles asynchronously. When a worker role is added to a project within Visual Studio, the default WorkerRole class illustrated in Listing 12-1 is provided; it simply loops every 10 seconds and writes "Working" to the compute emulator UI (which will be discussed later).

Listing 12-1.  Default WorkerRole Code

using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using System.Net;
using System.Threading;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.StorageClient;

namespace WorkerRole1
{
    public class WorkerRole : RoleEntryPoint
    {
        public override void Run()
        {
            // This is a sample worker implementation. Replace with your logic.
            Trace.WriteLine("WorkerRole1 entry point called", "Information");

            while (true)
            {
                Thread.Sleep(10000);
                Trace.WriteLine("Working", "Information");
            }
        }

        public override bool OnStart()
        {
            // Set the maximum number of concurrent connections
            ServicePointManager.DefaultConnectionLimit = 12;

            // For information on handling configuration changes
            // see the MSDN topic at http://go.microsoft.com/fwlink/?LinkId=166357.

            return base.OnStart();
        }
    }
}

image Tip   Workflow instance state should not be maintained within Azure roles because Azure manages (and may recycle) the virtual instances. Instead, instance state should be maintained within the services provided for data management, like queues, blobs, tables, and SQL databases.

Role Scalability

Roles can be run either together or separately, depending on the requirements of the software that is hosted; however, because they are essentially virtual machines at the core, a charge applies for each hour a role is running. There can also be more than one of each type of role running simultaneously, and each instance of the same type of role runs the same code. The reason for running more than one role, either a web or worker role, is to provide better performance and high availability in the case of instance failure. Both web and worker roles provide the code and configuration for handling the different parts of software functionality.

Roles can also be configured to scale based on client utilization. The size of the VM can be changed based on the number of CPU cores, the memory, and the file system size dedicated to running a VM instance. Table 12-1 illustrates the different sizes that can be selected for scaling cloud service roles.

Table 12-1. VM Size for Cloud Services

image

Data Management

There isn't always a one-size-fits-all data store when it comes to designing software. One important reason is that software can scale better when different methods for managing data are strategically implemented. Windows Azure provides three different and unique choices for managing different data within applications. It may seem that these data storage options are only for applications hosted within Windows Azure, but that is not the case. Each of the storage methods discussed here can also be used by applications not running within Windows Azure, like applications running within on-premise data centers or on client devices like mobile phones, laptops, and tablets.

Table Storage

Table storage is used for storing large amounts of data that can be organized in a tabular format, but the data stored does not need to be structured as relational data within a database like SQL Server. For instance, the other day my wife, who works in the education field, was asking me about storing student data within a spreadsheet. I was impressed at how well she had organized each column within the spreadsheet to represent a characteristic of the student data, and I thought that table storage would make a good candidate for it. Another example for using table storage is storing logging information that records an application's performance. Although a case can be made to store logging data within a database, it might make more sense to hold large amounts of logging data within table storage. For one thing, it will help the application scale because there is less demand on the database; moreover, because space is plentiful, it can be a cheaper option as well.
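
To give a sense of what working with table storage looks like in code, here is a minimal sketch using the Microsoft.WindowsAzure.StorageClient library that ships with the SDK. The LogEntry entity, the logentries table name, and the LogWriter helper class are my own illustrations, not part of the chapter's sample.

using System;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

// Hypothetical logging entity; PartitionKey and RowKey come from TableServiceEntity.
public class LogEntry : TableServiceEntity
{
    public LogEntry() { }
    public LogEntry(string application, string id) : base(application, id) { }
    public string Message { get; set; }
    public DateTime Logged { get; set; }
}

public static class LogWriter
{
    public static void Write(CloudStorageAccount account, LogEntry entry)
    {
        // Create the table on first use, then add the entity and save it.
        var tableClient = account.CreateCloudTableClient();
        tableClient.CreateTableIfNotExist("logentries");
        var context = tableClient.GetDataServiceContext();
        context.AddObject("logentries", entry);
        context.SaveChanges();
    }
}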

Blob Storage

During your development career, you may have written software that required interaction with media files. One viable approach might be to use a file system to store the files so that the software could access them on demand. However, over time, as the need for storage starts growing, using the file system can become unmanageable. Blob storage is Windows Azure's solution for storing files within the cloud, and it also provides a cheaper model for implementing file storage. One good example of using blob storage is to provide image illustrations for products sold from an e-commerce site. Each image associated with a single product can be stored within blob storage rather than taking up space within the file system or being serialized as binary within the database.
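
As a rough sketch of the blob storage API (again using the StorageClient library; the productimages container name and the helper method are illustrative assumptions), uploading a product image might look like this:

using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

public static class ProductImageStore
{
    public static void Upload(CloudStorageAccount account, string productId, string localPath)
    {
        // Get (or create) a container to hold the product images.
        var blobClient = account.CreateCloudBlobClient();
        var container = blobClient.GetContainerReference("productimages");
        container.CreateIfNotExist();

        // Upload the local image file as a blob named after the product.
        var blob = container.GetBlobReference(productId + ".jpg");
        blob.UploadFile(localPath);
    }
}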

SQL Databases

So far I have discussed data storage methods that are a part of Windows Azure and therefore use APIs to interact with the respective storage types. Windows Azure also provides SQL databases, which are just like the SQL Server databases that developers are already familiar with for storing data that is relational. Relational data consists of different tables that share relationships. For example, an employee table can logically be related or have a foreign key relationship to a job role table, which associates an employee with a job position the employee has within a company.

image Note   Both blob and table storage have APIs that can be extended to manage storage. As I will demonstrate later, blob storage can be used to work with workflows in Azure.

Azure Development Tools

Non-Microsoft technologies can now be developed and hosted within Azure. The latest release of Azure includes development tools that are available for technologies outside of the .NET stack. Microsoft provides all of the tools, including client libraries, source code, and SDKs for working in Azure; go to www.windowsazure.com/en-us/develop/downloads/. The SDK for working with Azure within VS2012 is also available from that page or can be downloaded directly from http://go.microsoft.com/fwlink/?LinkId=254364&clcid=0x409. The file name for the June 2012 release of the SDK is VWDOrVs11AzurePack_RC.3f.3f.3fnew.exe and it needs to be installed (see Figure 12-6).

9781430243830_Fig12-06.jpg

Figure 12-6.  Installing VS2012 Windows Azure SDK

The Windows Azure SDK installs emulators for locally testing software written to be hosted within Azure. Figure 12-7 illustrates the emulators running within the taskbar for Windows 7.

9781430243830_Fig12-07.jpg

Figure 12-7.  Emulators running within the taskbar

Azure Workflows

Running workflows within the cloud sheds new light on exposing software and services to more clients. There is no boundary in the cloud like there might be for software that runs within an organization’s network domain. It is very easy to write software and expose it to the Internet via the cloud. Since workflows can be exposed as services, a broader range of clients can now subscribe to workflow services that are published out to the cloud. Even though Azure has been around for a couple of years, hosting workflows within Azure is nothing new. In fact, hosting workflows within the cloud is something that die-hard workflow developers have been doing for some time. WF and Azure make the perfect marriage, and as each of the technologies matures, Microsoft will continue to make sure that the relationship between the two grows stronger by providing more functionality to support the two technologies and make the integration process easier.

image Note   It is important to mention that, at the time of writing this book, the guest OS images within Azure do not support .NET 4.5. The release of WF4.5 does not change the development story or hosting patterns from what I will share and demonstrate.

Workflow Hosting Patterns

Combining these two technologies brings the best qualities of each. WF supports a viable solution for the following:

  • Declaratively building business logic.
  • Enabling long-running business processes.
  • Empowering non-technical software end users to drive business logic during runtime.

On the other hand, Azure provides the capabilities for WF services to be managed off-premise, or outside of a business's data center, and it also provides all of the resources and infrastructure needed for scalability. The WF runtime requires a process to execute within, and workflows can be hosted using the following WF hosts:

  • WorkflowApplication
  • WorkflowServiceHost
  • WorkflowInvoker

Table 12-2 shows some common patterns for hosting workflows within Azure. Quite commonly it may seem more natural to host workflows using WorkflowApplication within an ASP.NET or MVC application. This is a common pattern for applications built outside the cloud. In this case, the ASP.NET/MVC application is required to run within a web role so it can be hosted within the cloud. WorkflowApplication can also be used for hosting workflows within a worker role so the features of the WF runtime (like tracking, bookmarks, etc.) can be utilized.

Table 12-2. Common WF Hosting Patterns for Azure

image

WorkflowServiceHost is generally hosted directly within a web role, so the WF runtime can use the web role's IIS infrastructure, such as Windows Activation Services (WAS). WorkflowServiceHost can also be self-hosted within a worker role, and this has been a pattern for working around the issue where, after a web role is recycled, workflow instances can no longer be reloaded from the persistence store. The persistence store has a table called LockOwnersTable so that a workflow can be owned by only one host at a time. During a web role deployment, the instance name becomes part of the host name and is written as the lock owner name. If the VM gets recycled, the lock owner name can change, and workflow instances then cannot be reloaded from the persistence store. Once .NET 4.5 is added to the guest OS for Azure, this issue will be addressed.
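
To make the self-hosting pattern concrete, the following is a rough sketch only; the MyWorkflowService.xamlx file, the WFEndpoint endpoint name, and the persistence connection string are assumptions used for illustration rather than code from this chapter's sample.

using System;
using System.ServiceModel.Activities;
using System.Xaml;
using Microsoft.WindowsAzure.ServiceRuntime;

public static class SelfHostedWorkflowService
{
    public static WorkflowServiceHost Open()
    {
        // Load the workflow service definition deployed with the worker role.
        var service = (WorkflowService)XamlServices.Load("MyWorkflowService.xamlx");

        // Build the base address from an internal endpoint defined on the worker role.
        var endpoint =
            RoleEnvironment.CurrentRoleInstance.InstanceEndpoints["WFEndpoint"].IPEndpoint;
        var baseAddress = new Uri(string.Format("http://{0}/MyWorkflowService", endpoint));

        var host = new WorkflowServiceHost(service, baseAddress);

        // Optionally wire up durable instancing against a SQL persistence store.
        host.DurableInstancingOptions.InstanceStore =
            new System.Activities.DurableInstancing.SqlWorkflowInstanceStore(
                "connection string to the persistence database");

        host.Open();
        return host;
    }
}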

Hosting Non-Durable Workflows

The scenario used in this section will demonstrate hosting non-durable workflows, or workflows that are stateless in nature. This means that the workflows will not maintain state through using persistence and are not anticipated to run for long periods of time. The scenario is built around a pawn shop that processes customers who want to pawn items. WF is a great candidate technology to integrate pawn shops with local law enforcement because workflows could be federated, so each pawn shop would have to follow the same rules or business logic.

A common pattern used for hosting workflows within Azure is hosting the WF runtime using a worker role, which was briefly introduced earlier in this chapter. In order to add a worker role to a VS2012 solution, a new project must be created using a Cloud project template, which resides within the Visual C# project template. If the .NET Framework 4.5 is selected, there won’t be any project templates to choose from; however, changing the framework to .NET Framework 4 will allow the Windows Azure Cloud Service project to be selected. This also indicates that project templates that use .NET Framework 4.5 cannot be built for Windows Azure (see Figure 12-8).

9781430243830_Fig12-08.jpg

Figure 12-8.  Adding a new Windows Azure Cloud Service

After selecting the OK button, the roles that can be added to the project are presented. Figure 12-9 indicates that a worker role is being selected to run as a background process within the cloud service. After selecting the role and clicking the arrow button pointing to the Windows Azure Cloud Service solution listbox on the right side, the worker role called WorkerRole1 is added. Right-clicking on the selected worker role allows it to be renamed if desired.

9781430243830_Fig12-09.jpg

Figure 12-9.  Adding a worker role to a solution

image Caution  When adding a new Worker Role project, you might notice that the Microsoft.WindowsAzure.StorageClient reference is not added to the project, even though the namespace is included as a using statement within the provided default code. To fix this bug, add a reference to the missing assembly from the path C:\Program Files\Microsoft SDKs\Windows Azure\.NET SDK\2012-06\bin.

Starting the solution after adding the worker role will execute the WorkerRole code in Listing 12-1. As VS2012 starts the Azure emulators, a message from the taskbar indicates that the emulators have started. The compute emulator can be viewed by right-clicking on the blue Azure icon within the taskbar, as indicated in Figure 12-7, and selecting "Show Compute Emulator UI." The compute emulator will show that WorkerRole1, built within VS2012, is running. Clicking on WorkerRole1 will show a small console window with trace information being logged. Clicking the top of the console window expands it so the trace activity can be viewed more easily.

9781430243830_Fig12-10.jpg

Figure 12-10.  Viewing activity within the compute emulator

Notice the trace information, “Information: Working”, represented in Figure 12-10. This indicates that the default WorkerRole code is executing (see Listing 12-2).

Listing 12-2.  Tracing Execution within the Worker Role

while (true)
{
    Thread.Sleep(10000);
    Trace.WriteLine("Working", "Information");
}

Since the worker role is always executing, technically it is a great place to host a workflow. To do so, the code in Listing 12-2 can be changed to the code in Listing 12-3, which executes a workflow for processing items that a customer brings into a pawn shop to be pawned. A Customer object, with the customer's date of birth set, is passed into the workflow as a WF argument. Depending on how the workflow receives data for execution, this is probably not an efficient way to process business logic using data that is supplied externally to the workflow. Listing 12-3 indicates that a workflow will be executed every 10 seconds, which leaves very little control over when a workflow's execution begins and ends, because executing a workflow instance could take longer than the 10-second interval. Listing 12-3 also reveals a shortcoming regarding how data will be provided externally to the workflow. Another approach might be calling external services or reading data from other external data sources like files or databases from within the workflow, but even this implementation limits control over the workflow's execution. The workflow needs a consistent way of receiving data that can be passed in as a WF argument. This is where queued messages play a significant role within Azure.

Listing 12-3.  Invoking a Workflow in Azure

while (true)
{
    try
    {
        var activity = new Apress.Chapter12.WF.ProcessPawnedItems(); //Pawn shop workflow
        var inargs = new Dictionary<string, object>
        {
            {
                "argNewCustomer",
                new Apress.Chapter12.DataModel.Customer()
                {
                    DOB = Convert.ToDateTime("4-9-1995")
                }
            }
        };
        WorkflowInvoker wfInvoker = new WorkflowInvoker(activity);
        wfInvoker.Invoke(inargs);
    }
    catch (Exception ex)
    {
        Trace.WriteLine(ex.Message, "Exception");
    }
    Thread.Sleep(10000);
}

image Tip   If you are not running Visual Studio as an administrator while running an Azure project, VS2012 will display a message before it starts the solution that says, "The Windows Azure compute emulator must be run elevated. Please restart Visual Studio in elevated administrator mode in order to run the project." To run VS2012 as an administrator, close the current instance of VS2012, then right-click the VS2012 icon and select "Run as Administrator."

Queuing Data for Workflows

Although the infinite loop within the worker role is required for continuous execution, it should not be relied upon to trigger workflow execution. One way to take control over workflow execution is to write custom code that feeds external data to the workflow using WF arguments. The external data can come from service calls or external data stores; however, it is good practice to separate out this functionality rather than implement it within the worker role.

Azure provides an out-of-the-box queuing implementation called Windows Azure Queues for effectively processing data internally within applications that are hosted on Azure. A queue is essentially a stack of messages, one after the next; however, Azure Queues do not guarantee "first-in, first-out" (FIFO) ordering. Queues are highly effective for providing reliable messaging because when an executing process goes down, the queue can continue to accumulate messages until the process is restarted or a new process is brought online to handle the messages that have been created in the queue.

Initiating Queues in Web Roles

To illustrate how Azure queues can be used, I will demonstrate how to add a web role, which will serve as the web front end for receiving pawned items from customers; the process is similar to how the worker role was added earlier. The web role can be added to the same project that was created for the worker role. By right-clicking on the Roles folder in the Windows Azure Cloud Service project and selecting New Web Role Project, a new web role can be added (see Figure 12-11).

9781430243830_Fig12-11.jpg

Figure 12-11.  Adding a web role to the Roles container

Selecting an ASP.NET web role, as illustrated in Figure 12-12, will add a project that will probably look familiar compared to a standard ASP.NET project.

9781430243830_Fig12-12.jpg

Figure 12-12.  Add New Role Project screen

Figure 12-13 shows the WebRole1 project that was added and all of the default pages and folders that are included.

9781430243830_Fig12-13.jpg

Figure 12-13.  Added ASP.NET web role to a solution

In this scenario, the message queue will be filled with messages that are created within the web application. To illustrate how messages are created, the code in Listing 12-4 is added to the Default.aspx web page, along with appropriately named textboxes within the HTML, to build customer information for pawned items.

Listing 12-4.  Initiating and Creating Messages for the Queue

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.UI;
using System.Web.UI.WebControls;
using Apress.Chapter12.DataModel;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.StorageClient;

namespace WebRole1
{
    public partial class _Default : Page
    {
        private const string QueueName = "customerqueue";
        private CloudStorageAccount StorageAccount = null;
        private CloudQueue InitiateAzureQueue()
        {
            CloudQueue CustomerQueue = null;
            try
            {
                StorageAccount =
CloudStorageAccount.Parse(RoleEnvironment.GetConfigurationSettingValue("DataConnectionString"));
                var queueClient = StorageAccount.CreateCloudQueueClient();
                CustomerQueue =
                queueClient.GetQueueReference(QueueName);
                CustomerQueue.CreateIfNotExist();
            }
            catch (Exception ex)
            {
                throw ex;
            }
            return CustomerQueue;
        }
        protected void Page_Load(object sender, EventArgs e)
        {
        }
        protected void cmdSubmit_Click(object sender, EventArgs e)
        {
            try
            {
                var customer = new Customer()
                {
                    DOB = Convert.ToDateTime(txtDOB.Text),
                    DriversLicenseNumber = txtDriverLicense.Text,
                    FirstName = txtFirstName.Text,
                    LastName = txtLastName.Text,
                    OwnersSSN = txtSSN.Text,
                    CustomerPawns = new List<CustomerPawn>
                   {
                        new CustomerPawn()
                        {
                            PawnedItems = new List<PawnedItem>
                            {
                                new PawnedItem{
                                    ItemName = txtItemName.Text,
                                    PawnedAmount = Convert.ToDecimal(txtAmount.Text),
                                    ModelNumber = txtModelNumber.Text
                                }
                            }
                        }
                   }
                };

                var customerQueue = InitiateAzureQueue();
                var newMessage = new CloudQueueMessage(customer.ToJson());
                customerQueue.AddMessage(newMessage);
            }
            catch (Exception ex)
            {
                throw ex;
            }
        }
    }
}

While reviewing Listing 12-4, the first thing to mention is the following line of code:

StorageAccount =
   CloudStorageAccount.Parse(RoleEnvironment.GetConfigurationSettingValue("DataConnectionString"));

Note that DataConnectionString represents the setting used to indicate that development storage should be used. Development storage is provided with the Azure SDK that was installed earlier. It uses a local database simulating managed resources like queues, blobs, etc. The setting has to be created within ServiceConfiguration.Local.cscfg and ServiceConfiguration.Cloud.cscfg, as illustrated in Listing 12-5.

Listing 12-5.  Configuring Development Storage for Testing Azure Locally

<?xml version="1.0" encoding="utf-8"?>
<ServiceConfiguration serviceName="Apress.Chapter12" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="1" osVersion="*" schemaVersion="2012-05.1.7">
  <Role name="WorkerRole1">
    <Instances count="1" />
    <ConfigurationSettings>
      <Setting name="DataConnectionString" value="UseDevelopmentStorage=true" />
      <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="" />
    </ConfigurationSettings>
  </Role>
  <Role name="WebRole1">
    <Instances count="1" />
    <ConfigurationSettings>
      <Setting name="DataConnectionString" value="UseDevelopmentStorage=true" />
      <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="UseDevelopmentStorage=true" />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>

image Caution  It is important to make sure that the ConfigurationSettings are the same between ServiceConfiguration.Local.cscfg and ServiceConfiguration.Cloud.cscfg, so the setting for DataConnectionString has been added in both files. If not, Visual Studio will indicate that they are not in synch and the solution will not run.

The rest of the code for setting up a queue in Listing 12-4 is pretty straightforward. The InitiateAzureQueue() function initiates a new queue called customerqueue, which allows messages to be added to a queue using an Azure storage account. In this case, development storage is used locally to simulate how the queues function within Azure, but once this code is deployed to the cloud, a storage account must be created to host the queue within Azure. After the storage account is established, a CloudQueueClient object is created and used to get a reference to the customerqueue queue. If the queue does not exist, then it is created. After the queue is created, the CloudQueue object is returned and messages can be added to it.

Within the cmdSubmit_Click event handler for the Default.aspx page, a new Customer object is instantiated and loaded with data about the item the customer wants to pawn. After the Customer object is loaded, the queue is initiated by calling InitiateAzureQueue, and a new CloudQueueMessage is created and added to the queue. The message that is queued consists of JSON, which is created by serializing the Customer object. The code in Listing 12-6 illustrates the code used for serializing a Customer object to JSON and deserializing JSON string data back to a Customer object. Listing 12-6 uses extension methods, which allow the ToJson function to be called from any Customer object, as demonstrated in Listing 12-4. The FromJson function can be called from any string object and will be demonstrated later, when the message is read from the queue within a worker role. The magic of serialization and deserialization is handled by using the DataContractJsonSerializer.

Listing 12-6.  Serializing a Customer Object and Deserializing Back to an Object

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;

using System.Runtime.Serialization;
using System.Runtime.Serialization.Json;

namespace Apress.Chapter12.DataModel
{
    public static class extCustomer
    {
        public static string ToJson(this Customer customer)
        {
            using (var ms = new MemoryStream())
            {
                DataContractJsonSerializer serializer = new DataContractJsonSerializer(typeof(Customer));
                serializer.WriteObject(ms, customer);
                ms.Position = 0;
                return new StreamReader(ms).ReadToEnd();
            }
        }

        public static object FromJson(this string Json)
        {
            DataContractJsonSerializer serializer = new DataContractJsonSerializer(typeof(Customer));
            var retCustomer =
                serializer.ReadObject(new MemoryStream(Encoding.Default.GetBytes(Json)));
            return retCustomer;
        }
    }
}

image Note   JSON stands for JavaScript Object Notation and provides a lean way to serialize data into a formatted string data type.

Consuming Queues within Worker Roles

Now that the queue has been created and messages can be added to it, a worker role can be used to consume the messages. Listing 12-7 illustrates some changes within the WorkerRole code compared to Listing 12-3. The WorkerRole code in Listing 12-7 is written to process external data from the queue and build WF arguments for the workflow based on the messages read from the queue. The first two lines of code look familiar because this is the same code used to initialize communication with an Azure queue. The last function in Listing 12-7, InitiateAzureQueue(), uses the same code that the web page used in Listing 12-4 to initialize the queue. The code within the Run() method for processing the queue is the important part. After a customerQueue object is initialized by calling InitiateAzureQueue(), customerQueue.GetMessage() is called to retrieve customer data represented as JSON. If a message has not been added to the queue, the processing thread goes idle for 10 seconds.

Making the processing thread sleep for a bit after checking for a message is good practice and reduces the number of transactions, which in turn saves money on your monthly Azure bill. A common algorithm for managing this polling rate is called truncated exponential backoff: the wait between polls is increased each time the queue is found empty, up to a maximum, so that transactions are used effectively. When a message is found, it is deserialized back into a Customer object so it can be passed as a WF argument that the workflow can use as data for processing. After the workflow, hosted through the WorkflowInvoker, processes successfully, DeleteMessage is called so that the message is removed from the queue and a new message can be retrieved from the queue.
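
The following is a minimal sketch of what truncated exponential backoff could look like in the worker role's polling loop; the starting interval, the maximum interval, and the ProcessMessage method are illustrative assumptions rather than code from the chapter's sample.

// Illustrative truncated exponential backoff for polling the Azure queue.
TimeSpan wait = TimeSpan.FromSeconds(1);        // starting interval (assumed)
TimeSpan maxWait = TimeSpan.FromSeconds(60);    // truncation point (assumed)

while (true)
{
    var message = customerQueue.GetMessage();
    if (message == null)
    {
        Thread.Sleep(wait);
        // Double the wait, but never exceed the maximum ("truncated").
        wait = TimeSpan.FromSeconds(Math.Min(wait.TotalSeconds * 2, maxWait.TotalSeconds));
    }
    else
    {
        ProcessMessage(message);                // hypothetical processing method
        customerQueue.DeleteMessage(message);
        wait = TimeSpan.FromSeconds(1);         // reset after a successful read
    }
}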

Listing 12-7.  Consuming Queues Using a WorkerRole

public class WorkerRole : RoleEntryPoint
    {
        private const string QueueName = "customerqueue";
        private CloudStorageAccount StorageAccount = null;

        public override void Run()
        {
            // This is a sample worker implementation. Replace with your logic.
            Trace.WriteLine("WorkerRole1 entry point called", "Information");

            Trace.WriteLine("Initiating storage account", "Information");
            StorageAccount =
CloudStorageAccount.Parse(RoleEnvironment.GetConfigurationSettingValue("DataConnectionString"));
           var customerQueue = InitiateAzureQueue();
            
            while (true)
            {
                try
                {

                    var wfData = customerQueue.GetMessage();
                    if (wfData == null)
                    {
                        Thread.Sleep(10000);
                        Trace.WriteLine("No message in queue! Waiting 10 seconds...", "Information");
                    }
                    else
                    {
                        try
                        {
                            var newCustomer = wfData.AsString.FromJson() as Customer;
                            var activity = new Apress.Chapter12.WF.ProcessPawnedItems();
                            var inargs =
                                new Dictionary<string, object> { { "argNewCustomer", newCustomer } };
                            WorkflowInvoker wfInvoker = new WorkflowInvoker(activity);
                            wfInvoker.Invoke(inargs);
                            customerQueue.DeleteMessage(wfData);
                            Trace.WriteLine("Processed Message!", "Information");

                        }
                        catch (Exception ex)
                        {
                            throw ex;
                        }
                    }
                }
                catch (Exception ex)
                {
                    Trace.WriteLine(ex.Message,"Exception");
                }
            }
        }

        public override bool OnStart()
        {
            // Set the maximum number of concurrent connections
            ServicePointManager.DefaultConnectionLimit = 12;

            // For information on handling configuration changes
            // see the MSDN topic at http://go.microsoft.com/fwlink/?LinkId=166357.

            return base.OnStart();
        }

        private CloudQueue InitiateAzureQueue()
        {
            CloudQueue CustomerQueue = null;
            try
            {
                var queueClient = StorageAccount.CreateCloudQueueClient();

                CustomerQueue =
                queueClient.GetQueueReference(QueueName);
                CustomerQueue.CreateIfNotExist();
            }
            catch (Exception ex)
            {
                throw ex;
            }
            return CustomerQueue;
        }
    }

Now that a workflow can accept WF arguments built from the queue of new customers, let's take a look at the workflow that will process customer orders within the pawn shop. Figure 12-14 illustrates a flowchart workflow that uses a custom WF activity that inherits from CodeActivity<TResult>. Listing 12-8 shows the code used for the SaveCustomer activity.

9781430243830_Fig12-14.jpg

Figure 12-14.  Workflow for saving new customers and their pawned items

Listing 12-8.  SaveCustomer Activity Code

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Activities;
using Apress.Chapter12.DataModel;

namespace Apress.Chapter12.WF
{
    public sealed class SaveCustomer : CodeActivity<bool>
    {
        public InArgument<Customer> NewCustomer { get; set; }

        protected override bool Execute(CodeActivityContext context)
        {
            bool retVal = false;
            try
            {
                var newCustomer = context.GetValue(this.NewCustomer);

                using (var pawning = new Pawning())
                {
                    pawning.Customers.Add(newCustomer);
                    pawning.SaveChanges();
                }
                retVal = true;
            }
            catch (Exception ex)
            {
                throw ex;
            }
            return retVal;
        }
    }
}

The code that actually adds a customer and pawned items is defined within a separate C# project, Apress.Chapter12.DataModel. It uses Entity Framework 4 and a Code First model for building the database and tables based on the entity classes that are defined, and it handles the data plumbing for SQL Server.

There are three entities used for defining a customer and items that the customer pawns. Customer.cs, illustrated in Listing 12-9, defines the properties that need to be captured for a customer.

Listing 12-9.  Customer.cs Class

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.ComponentModel.DataAnnotations;
using System.Collections.ObjectModel;

namespace Apress.Chapter12.DataModel
{
    public class Customer
    {
        public Customer()
        {
            CustomerPawns = new Collection<CustomerPawn>();
        }
        [Key]
        public int CustomerId { get; set; }
        [Required]
        public string FirstName { get; set; }
        [Required]
        public string LastName { get; set; }
        [Required]
        public string OwnersSSN { get; set; }
        [Required]
        public string DriversLicenseNumber { get; set; }
        [Required]
        public DateTime DOB { get; set; }

        public virtual  ICollection<CustomerPawn> CustomerPawns { get; set; }
    }
}

PawnedItem.cs, illustrated in Listing 12-10, defines the properties needed to save information about the items that a customer is pawning.

Listing 12-10.  PawnedItem.cs Class

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.ComponentModel.DataAnnotations;

namespace Apress.Chapter12.DataModel
{
    public class PawnedItem
    {
        public PawnedItem()
        {
            DatePawned = DateTime.Now;
        }
        [DatabaseGenerated(DatabaseGeneratedOption.Identity)]
        public int PawnedItemId { get; set; }
        [Key]
        public int CustomerPawnId { get; set; }
        [Required]
        public string ModelNumber { get; set; }
        [Required]
        public string ItemName { get; set; }
        public DateTime DatePawned { get; set; }
    }
}

CustomerPawn.cs, illustrated in Listing 12-11, defines the properties needed to bridge the relationship between the customer and the items that are being pawned.

Listing 12-11.  CustomerPawn.cs Class

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.ComponentModel.DataAnnotations;
using System.Collections.ObjectModel;

namespace Apress.Chapter12.DataModel
{
    public class CustomerPawn
    {
        public CustomerPawn()
        {
            PawnedItems = new Collection<PawnedItem>();
        }

        [Key]
        public int CustomerPawnId { get; set; }
        [Required]
        public int CustomerId { get; set; }
        [Required]
        public virtual ICollection<PawnedItem> PawnedItems { get; set; }

    }
}

Listing 12-12 shows the code that Entity Framework uses to define the database tables and relationships between each of the tables.

Listing 12-12.  Pawning.cs Class

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Data.Entity;

namespace Apress.Chapter12.DataModel
{
  public sealed class Pawning : DbContext
  {
     public DbSet<CustomerPawn> CustomerPawns { get; set; }
     public DbSet<PawnedItem> PawnedItems { get; set; }
     public DbSet<Customer> Customers { get; set; }

     protected override void OnModelCreating(DbModelBuilder modelBuilder)
     {
         modelBuilder.Entity<CustomerPawn>()
             .HasMany<PawnedItem>(ItemsPawned => ItemsPawned.PawnedItems);

         modelBuilder.Entity<PawnedItem>()
             .HasKey(p => new { p.CustomerPawnId, p.PawnedItemId });

         modelBuilder.Entity<Customer>()
             .HasMany<CustomerPawn>(pawnedItems => pawnedItems.CustomerPawns);
     }
  }
}

Now when the solution is run, customer data entered within the web role is queued and then processed out of the queue within the worker role. The worker role then sends the queued data to a workflow, which processes the information about the customer and the item being pawned.

Cloud Workflows

So far I have walked through running workflows within VS2012 with the aid of the Azure SDK to simulate how the application should execute when running in Azure. Once the application is deployed to Azure, it should execute the same way it executed while running within VS2012 using the Azure SDK. However, there are some configurations that require tweaks so that the application actually uses the features available within Azure, such as SQL databases and storage.

To add a SQL database, log on to the Windows Azure portal at http://manage.windowsazure.com/. When the portal's main screen loads, select the New menu located at the bottom left of the portal. After the menu slides up, select SQL Database. Once a SQL Database has been added within Azure, VS2012 can run applications that hit the Azure database, and Visual Studio or SQL Server Management Studio can be used to manage and query the database. To secure access to the database, firewall rules for a SQL database must be added for client computers through Azure's portal; they simply tell Azure which IP addresses can access the database.

9781430243830_Fig12-15.jpg

Figure 12-15.  Creating a SQL Database in Azure

Figure 12-15 shows that after selecting Quick Create, the server name for the new database has already been chosen, so now the database name needs to be added (and it will be PawnShop). The next couple of steps configure the database and security; however, once the database is created, the portal will show, within the SQL Databases item, that the database is available. At this point, the database can be accessed locally if the firewall rules have been added. This enables Entity Framework's Code First model to perform the data plumbing within the cloud. It is important to add the database connection string within the connectionStrings section of an app.config file added to the WorkerRole1 project (see Figure 12-15). After setting the connection string illustrated in Listing 12-13 and running the solution, data will be added to the Azure database. The data can then be checked via SQL Server Management Studio using the Azure database server.

Listing 12-13.  Connection String Used to Connect EF Code First to the Azure Database

<connectionStrings>
    <add name="Pawning" providerName="System.Data.SqlClient" connectionString="Server=tcp:l1yab5tqj4.database.windows.net,1433;Database=PawnShop;User ID=<UserID>;Password=<Password>;Trusted_Connection=False;Encrypt=True;Connection Timeout=30;" />
  </connectionStrings>

Configuring Azure Storage

In order to utilize the different storage capabilities offered through Azure, a storage account needs to be created. To add one to an Azure subscription, select Storage and click Quick Create. Type in a mandatory, unique, all-lowercase name for the URL and then select a Region/Affinity Group. It is important to make sure that the affinity group is the same as the one used by the other accounts within Azure (see Figure 12-16).

9781430243830_Fig12-16.jpg

Figure 12-16.  Creating a storage account in Azure

After a storage account has been created, the key that was automatically generated with the account must be used when accessing the storage account (see Figure 12-17).

9781430243830_Fig12-17.jpg

Figure 12-17.  Managing storage keys

Clicking Manage Keys displays the two keys that were generated. There are two keys so that one can be regenerated while the other stays active and can still be used in production. Regenerating the keys frequently provides better security for the storage account.

When running Azure solutions within Visual Studio, the Azure SDK handles the storage features for development. These settings are configured using the ServiceConfiguration files within the cloud project. Opening ServiceConfiguration.Cloud.cscfg and replacing UseDevelopmentStorage=true with a connection string containing the primary access key shown in Figure 12-17 provides the credentials necessary for Azure to use the storage account created in Figure 12-16. The DefaultEndpointsProtocol needs to be specified as either http or https, which indicates the protocol the storage account will use to connect, and the AccountName indicates the account used for storage (see Listing 12-14).

Listing 12-14.  Example of a Storage Connection String

DefaultEndpointsProtocol=https;AccountName=[AccountName];AccountKey=[key]
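
For example, the DataConnectionString setting from Listing 12-5 might end up looking like the following in ServiceConfiguration.Cloud.cscfg (the account name and key shown here are placeholders):

<Setting name="DataConnectionString"
         value="DefaultEndpointsProtocol=https;AccountName=mystorageaccount;AccountKey=[Primary access key]" />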

Publishing to Azure

Deploying solutions to the cloud has become even easier with the Azure SDK v1.7 for Visual Studio. Right-clicking an Azure project within Visual Studio provides a Publish menu command; however, the project first needs to know the Azure subscription under which the published project is intended to run. If credentials do not exist on the computer used for building the solution, Visual Studio will prompt the user to log on to the Azure Portal so the credentials can be downloaded. Figure 12-18 indicates that there are no Azure subscription credentials installed on the local machine.

9781430243830_Fig12-18.jpg

Figure 12-18.  Selecting credentials for an Azure subscription

Clicking "Sign in to download credentials" opens a new browser window so credentials can be downloaded by signing into the Azure Portal. After signing in, you are prompted to download the credentials file, as illustrated in Figure 12-19.

9781430243830_Fig12-19.jpg

Figure 12-19.  Downloading credentials for an Azure subscription

The page in Figure 12-19 lists the steps required to import the credentials and choose the right subscription, but here is the cool part: the pawn shop solution requires a cloud service in Azure to run, so a cloud service must be set up before the solution can be deployed. This can be done through the Azure Portal, but if the cloud service does not exist, Visual Studio is smart enough to check during the deployment process and provide a way to create it as well.

It is best to select a location for cloud services that is within the same region as the other components that will be used, like the SQL Server and storage that were created earlier (see Figure 12-20).

9781430243830_Fig12-20.jpg

Figure 12-20.  Selecting a region for a new cloud service

After the cloud service is created, Visual Studio automatically starts the deployment. The Windows Azure Activity Log displays the status and progress as the solution is being deployed. If the Azure Portal is open at the same time the cloud service is being created through Visual Studio, refreshing the portal page will show the latest status of the cloud service. Once the solution is deployed to the cloud service, the site URL, which is found on the Dashboard portal page for the cloud service, can be used for browsing the site (see Figure 12-21).

9781430243830_Fig12-21.jpg

Figure 12-21.  Deployed solution within a cloud service

image Tip   Creating cloud services also creates a storage account with the same name that is defined for the cloud service, so there may not be a need to create a separate storage account first.

Figure 12-22 illustrates the web page running within the cloud. Now that the solution is published, the web application running within Azure can process customers and items pawned by recording data from the web page, queuing the data and passing it to the workflow so the data can then be stored within SQL Server. Figure 12-23 shows how to view the data within a SQL database within Azure and shows that the information entered in Figure 12-22 has been successfully processed using the workflow.

9781430243830_Fig12-22.jpg

Figure 12-22.  Entering data about an item being pawned

9781430243830_Fig12-23.jpg

Figure 12-23.  Azure SQL database with data loaded using a workflow

Workflows in Blob Storage

At this point, workflows have been used to define business logic declaratively, using a visual representation of the business flow that should be followed for off-premise software, but this is just the beginning of the story. To really take advantage of the power of combining WF with Azure, you need to push the combination a step further. Let's look at what WF and Azure can really provide developers when building solutions off-premise.

In the beginning of the book, I mentioned that WF provides the framework for adapting to business processes as they change. The current pawn shop application does not support a dynamic business process model because the workflow cannot be redefined when the need arises to change the business process; instead, it must be redeployed to Azure. To provide a dynamic workflow model, workflows can be built within a rehosted WF designer and then stored as XAML, which defines the workflow, using Azure's blob storage. For instance, the current workflow used within the pawn shop example for storing data about the customer and the item pawned is the minimal functionality that needs to take place. What happens when additional conditions are introduced, like items being pawned that were stolen? This is where the business process for checking items that are being pawned will need to change over time, and WF can provide the capability to adapt to the business process without having to recompile or redeploy the application.

In addition to applications that have evolving business processes, rehosting the WF designer can also cater to applications within the cloud that need to support a large number of workflows to handle many business processes. These workflows can be built by non-technical users to provide functionality, so it is important to make sure that the workflows have a reliable repository for storage and retrieval as they are needed to run within one or more applications.

Earlier in the chapter I gave an overview of blob storage and how it can be used for storing files. Since workflows can also be XAML files, blob storage also provides a great place to manage workflows so they can be executed within Azure. Azure provides a standard API for working with blob storage, and it can be customized to meet the needs of custom applications running within Azure and applications running outside of Azure as well. For instance, Listing 12-7 illustrates code that limits workflow execution because there is no way to dynamically change the workflow while a worker role is executing. By using blob storage, a worker role can instead check for a workflow stored as a blob and retrieve it so it can be executed. While the worker role is executing, an application outside of Azure that rehosts the WF designer can modify the workflow and save it in blob storage so the workflow can be retrieved later. New logic can be performed from an updated workflow after it is retrieved from blob storage within the worker role.
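
The following is a minimal sketch of what that retrieval could look like inside the worker role from Listing 12-7 (it assumes using statements for System.Activities, System.Activities.XamlIntegration, System.IO, and System.Collections.Generic); the workflows container and ProcessPawnedItems.xaml blob names are assumptions for illustration, and StorageAccount and newCustomer are the same members used in Listing 12-7.

private void RunWorkflowFromBlob(Customer newCustomer)
{
    // Download the workflow definition stored as a blob (container and blob names assumed).
    var blobClient = StorageAccount.CreateCloudBlobClient();
    var container = blobClient.GetContainerReference("workflows");
    var workflowBlob = container.GetBlobReference("ProcessPawnedItems.xaml");
    string xaml = workflowBlob.DownloadText();

    // Rehydrate the XAML into an executable activity and invoke it with the queued customer.
    Activity activity = ActivityXamlServices.Load(new StringReader(xaml));
    var inargs = new Dictionary<string, object> { { "argNewCustomer", newCustomer } };
    WorkflowInvoker.Invoke(activity, inargs);
}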

Blob Storage Loaded from a Rehosted Designer

Chapter 10 demonstrated how to rehost the WF designer, so some of the code from Chapter 10 needs to be modified so that workflows built in the rehosted WF designer can be stored as blobs within Azure instead of locally on the file system. The first step to using blob storage within an application is to reference Microsoft.WindowsAzure.StorageClient and System.Configuration within the project and then add the following using statements:

using Microsoft.WindowsAzure.StorageClient;
using Microsoft.WindowsAzure;
using System.Configuration;

The next step is to make sure that the SaveCustomer activity is provided within the rehosted activity toolbox. Instead of copying the code into the solution, references can be made to the appropriate assemblies (in this case, Apress.Chapter12.WF and Apress.Chapter12.DataModel), and then a Window.Resources entry called PawnShopAssembly can be added within the XAML markup.

<Window.Resources>
        <sys:String x:Key="WFAssembly">System.Activities, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35</sys:String>
        <sys:String x:Key="PawnShopAssembly">Apress.Chapter12.WF, Version=1.0.0.0, Culture=neutral</sys:String>
    </Window.Resources>

Finally, you must add the SaveCustomer activity to the toolbox using the XAML markup.

<WFC:ToolboxItemWrapper  AssemblyName="{StaticResource PawnShopAssembly}">
                            <WFC:ToolboxItemWrapper.ToolName>
                                Apress.Chapter12.WF.SaveCustomer
                            </WFC:ToolboxItemWrapper.ToolName>
                        </WFC:ToolboxItemWrapper>

A Customer WF argument can also be set as a default argument with the code in Listing 12-15. As a new workflow is initialized, it will already have an argument added as an InArgument so a new Customer object can be passed into the workflow.

Listing 12-15.  Setting a Default Customer Argument

private ActivityBuilder BuildBaseActivity()
        {
            try
            {
                ActivityBuilder builder = new ActivityBuilder
                {
                    // Use the workflow name entered in the designer
                    Name = txtWorkflowName.Text,
                    Implementation = new Flowchart(),
                    Properties =
                    {
                        new DynamicActivityProperty
                        {
                             Name = "argNewCustomer",
                             Type = typeof(InArgument<Apress.Chapter12.DataModel.Customer>),
                             Attributes =
                             {
                                 new RequiredArgumentAttribute(),
                             },
                             Value = new InArgument<Apress.Chapter12.DataModel.Customer>()
                        }
                    }
                };

                return builder;
            }
            catch (Exception)
            {

                throw;
            }
        }

image Tip   Make sure to add references within the project rehosting the WF designer to the appropriate assemblies. In this case, Apress.Chapter12.DataModel and Apress.Chapter12.WF must be referenced in order for the SaveCustomer activity to be available within the rehosted designer.

At this point, the workflow author can take advantage of the SaveCustomer activity using a default Customer WF argument for the workflow. Saving the workflow requires initializing blob storage when the application starts: a class-level field holds the CloudBlobContainer object, and it is set within the class’s constructor, RehostedWFThroughMarkup.

CloudBlobContainer _blobContainer = null;
        
public RehostedWFThroughMarkup()
{
   InitializeComponent();
   _blobContainer = InitiateBlobStorage();
}

The code in Listing 12-16 is used to initialize the blob storage. ConfigurationManager.AppSettings allows blob storage to be configured using a configuration file; in this case, an app.config file is used. The GetContainerReference() call assigns a unique name, workflows, to the blob container so that it can be uniquely referenced. If the workflows container does not exist, it is created, and public access is turned off so that only clients with the right credentials can reach it.

Listing 12-16.  Initializing Blob Storage

private CloudBlobContainer InitiateBlobStorage()
        {
            CloudBlobContainer blobContainer = null;
            try
            {
                var StorageAccount =
                CloudStorageAccount.Parse(ConfigurationManager.AppSettings["DataConnectionString"]);

                var blobClient = StorageAccount.CreateCloudBlobClient();
                blobContainer = blobClient.GetContainerReference("workflows");

                // Retry transient storage failures up to 3 times, 5 seconds apart
                blobClient.RetryPolicy = RetryPolicies.Retry(3, TimeSpan.FromSeconds(5));

                if (blobContainer.CreateIfNotExist())
                {
                    // The container was just created, so make sure public access is turned off
                    var permissions = new BlobContainerPermissions() { PublicAccess = BlobContainerPublicAccessType.Off };
                    blobContainer.SetPermissions(permissions);
                }

                return blobContainer;
            }
            catch (Exception)
            {
                throw;
            }
        }

The app.config file configures access to the blob storage account by specifying a connection string within the appSettings section of the file. The account key and name can be found within the Azure Portal.

<appSettings>
    <add key="DataConnectionString" value="DefaultEndpointsProtocol=https;AccountName=<AccountName>;AccountKey=<AccountKey>"/>
  </appSettings>

Finally, the code used to save the XAML file representing a workflow using the rehosted WF designer is illustrated in Listing 12-17. It is the same code from Chapter 10, except the code for saving the XAML file to the file system has been commented out and replaced with new code that allows the generated file to be saved within blob storage. A reference to the blob container is obtained and then the XAML representing the workflow is uploaded to blob storage.

Listing 12-17.  Saving Workflow XAML Markup in Blob Storage

private void cmdSave_Click(object sender, RoutedEventArgs e)
        {
            try
            {
                if (!string.IsNullOrWhiteSpace(txtWorkflowName.Text))
                {
                    if (_wd!=null)
                    {
                        _wd.Flush();
                        //if (_wd.Text.Length>0)
                        //{
                        //    var wfFile = Directory.GetCurrentDirectory() + WorkflowFolder + txtWorkflowName.Text + ".xaml";

                        //    if (!Directory.Exists(Directory.GetCurrentDirectory() + WorkflowFolder))
                        //        Directory.CreateDirectory(Directory.GetCurrentDirectory() + WorkflowFolder);
                        //    //else //Directory exists
                        //    //    File.OpenWrite(wfFile);
                            
                        //    _wd.Save(wfFile);
                        //}
                        var blobRef = _blobContainer.GetBlobReference(txtWorkflowName.Text + ".xaml");
                        blobRef.UploadText(_wd.Text);
                    }
                    else
                        MessageBox.Show("Please press 'Create' to build a new workflow!");
                }
                else
                {
                    MessageBox.Show("Please enter a workflow name!");
                }
            }
            catch (Exception)
            {
                throw;
            }
        }

Consuming Blob Storage within a Worker Role

Now that workflows are loaded within the blob storage, the worker role can easily consume them so they can be executed. Listing 12-18 illustrates in bold the code required to consume a workflow from blob storage. A blob reference is made to PawnShopWorkflow.xaml, which is the name of the workflow that is saved within the rehosted workflow designer. Then the workflow is downloaded into the MemoryStream object so it can be dynamically loaded as an Activity object and executed using WorkflowInvoker.

Listing 12-18.  Consuming Workflows from Blob Storage within a WorkerRole

public override void Run()
        {
            // This is a sample worker implementation. Replace with your logic.
            Trace.WriteLine("WorkerRole1 entry point called", "Information");

            Trace.WriteLine("Initiating storage account", "Information");
            StorageAccount =
CloudStorageAccount.Parse(RoleEnvironment.GetConfigurationSettingValue("DataConnectionString"));
            var customerQueue = InitiateAzureQueue();
            
            var blobContainer = InitiateBlobStorage();

            while (true)
            {
                try
                {

                    var wfData = customerQueue.GetMessage();
                    if (wfData == null)
                    {
                        Thread.Sleep(10000);
                        Trace.WriteLine("No message in queue! Waiting 10 seconds...", "Information");
                    }
                    else
                    {
                        try
                        {
                  var newBlob = blobContainer.GetBlobReference("PawnShopWorkflow.xaml");
                            var ms = new System.IO.MemoryStream();
                            newBlob.DownloadToStream(ms);
                            ms.Position = 0;
                            
                            var activity = ActivityXamlServices.Load(ms);

                            var newCustomer = wfData.AsString.FromJson() as Customer;
                            //var activity = new Apress.Chapter12.WF.ProcessPawnedItems();
                            
                            var inargs =
                                new Dictionary<string, object> { { "argNewCustomer", newCustomer } };
                            WorkflowInvoker wfInvoker = new WorkflowInvoker(activity);
                            wfInvoker.Invoke(inargs);
                            customerQueue.DeleteMessage(wfData);
                            Trace.WriteLine("Processed Message!", "Information");

                        }
                        catch (Exception ex)
                        {
                            Trace.WriteLine(ex.Message, "Information");
                        }
                    }
                }
                catch (Exception ex)
                {
                    Trace.WriteLine(ex.Message, "Information");
                }
            }
        }
Now the worker role reads a workflow serialized within blob storage, so a workflow authored in the rehosted designer and saved with the name PawnShopWorkflow, as illustrated in Figure 12-24, can be consumed. The workflow is downloaded from blob storage, and as new customer information is gathered through the web page, the queued customer data is retrieved within the worker role and processed by the workflow.

9781430243830_Fig12-24.jpg

Figure 12-24.  Saving a workflow to blob storage using a rehosted designer

Now let’s pretend that local pawn shops have had a problem with items being pawned by customers under 21. The rules for processing new customers are handled through the workflow, so the logic can be changed to make sure that only customers 21 and older can pawn items. Figure 12-25 shows that the workflow has been modified by adding an If activity that checks whether the date of birth passed in on the Customer object indicates the customer is 21 or over. I also added a new WF argument that returns a string to the workflow host, along with Assign activities that tell the workflow host whether the customer is old enough to pawn items or too young. To handle the new WF argument, called argResponse, the worker role code for hosting the workflow was modified (see Listing 12-19).

Listing 12-19.  Tracing the Returned WF Argument argResponse

try
       {
          var args = wfInvoker.Invoke(inargs);

          if (args["argResponse"]!=null)
             Trace.WriteLine(args["argResponse"]);
       }
       catch (Exception ex)
       {
          Trace.WriteLine(ex.Message, "Information");

       }
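
For completeness, the argResponse argument can also be seeded into the base workflow definition the same way argNewCustomer was in Listing 12-15; in the chapter, the argument was simply added while editing the workflow in the rehosted designer. The following is a minimal sketch that assumes the ActivityBuilder (builder) from Listing 12-15.

// A minimal sketch (not from the chapter's code): adding an OutArgument<string>
// named argResponse to the ActivityBuilder created in Listing 12-15.
builder.Properties.Add(new DynamicActivityProperty
{
    Name = "argResponse",
    Type = typeof(OutArgument<string>),
    Value = new OutArgument<string>()
});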

9781430243830_Fig12-25.jpg

Figure 12-25.  PawnShopWorkflow.xaml is modified to check if customer is at least 21

Figure 12-26 shows that when a customer under 21 tries to pawn an item, the feedback from the workflow saying “Customer is not 21” is traced within the worker role.

9781430243830_Fig12-26.jpg

Figure 12-26.  Logging WorkerRole information about a customer being too young

Service Bus and Workflows

Microsoft’s Service Bus provides an infrastructure of brokered or relayed messages that supports one-way messaging. Service Bus queues provide first-in, first-out (FIFO) message delivery. Applications can be loosely coupled by exchanging messages through the Service Bus rather than integrating message exchange directly within the applications. Service Bus queues are somewhat similar to Azure Queues: Azure Queues are built around reliable message storage and are beneficial for persisting messages between services within the same network environment, while Service Bus queues provide a broader messaging infrastructure around brokering and routing messages to appropriate destinations. Service Bus also supports applications that use multiple communication protocols and/or network environments.

Instead of using Azure Queues for queuing messages that will be processed through a workflow, Service Bus messages can be processed using the code in Listing 12-20. There is a new cloud project template that integrates a worker role with a Service Bus queue, as illustrated in Figure 12-27. The template kick-starts the boilerplate code needed to receive a message over the Service Bus. If you don’t see the project template, make sure to download Windows Azure Tools for Microsoft Visual Studio version 1.7.

9781430243830_Fig12-27.jpg

Figure 12-27.  Worker role template that integrates Service Bus queue code

Listing 12-20 illustrates the updated code for processing Service Bus messages within the worker role. The OnStart method sets up the Service Bus configuration and makes sure the queue PawnQueue is created. The Run method then checks for new messages on the queue. When a message is received, its body must be deserialized explicitly; in this case, receivedMessage.GetBody<Customer>() deserializes the message into a Customer object so it can be passed as a parameter to the workflow. After the workflow processes the message, the message is completed by calling its Complete() method.

Listing 12-20.  Using Service Bus Messages to Process Workflows

        // The name of your queue
        const string QueueName = "PawnQueue";

        // QueueClient is thread-safe. Recommended that you cache
        // rather than recreating it on every request
        QueueClient Client;
        bool IsStopped;

        public override void Run()
        {
            var blobContainer = InitiateBlobStorage();

            while (!IsStopped)
            {
                try
                {
                    // Receive the message
                    BrokeredMessage receivedMessage = null;
                    receivedMessage = Client.Receive();

                    if (receivedMessage != null)
                    {
                        var newBlob = blobContainer.GetBlobReference("PawnShopWorkflow.xaml");
                        var ms = new System.IO.MemoryStream();
                        newBlob.DownloadToStream(ms);
                        ms.Position = 0;

                        var activity = ActivityXamlServices.Load(ms);

                        var newCustomer = receivedMessage.GetBody<Customer>();
                        
                        var inargs =
                            new Dictionary<string, object> { { "argNewCustomer", newCustomer } };
                        WorkflowInvoker wfInvoker = new WorkflowInvoker(activity);
                        try
                        {
                            var args = wfInvoker.Invoke(inargs);

                            if (args["argResponse"]!=null)
                                Trace.WriteLine(args["argResponse"]);
                        }
                        catch (Exception ex)
                        {
                            Trace.WriteLine(ex.Message, "Information");
                        }
                        // Process the message
                        Trace.WriteLine("Processing", receivedMessage.SequenceNumber.ToString());
                        receivedMessage.Complete();
                    }
                }
                catch (MessagingException e)
                {
                    if (!e.IsTransient)
                    {
                        Trace.WriteLine(e.Message);
                        throw;
                    }

                    Thread.Sleep(10000);
                }
                catch (OperationCanceledException e)
                {
                    if (!IsStopped)
                    {
                        Trace.WriteLine(e.Message);
                        throw;
                    }
                }
            }
        }

        public override bool OnStart()
        {
            // Set the maximum number of concurrent connections
            ServicePointManager.DefaultConnectionLimit = 12;

            // Create the queue if it does not exist already
            string connectionString = CloudConfigurationManager.GetSetting("Microsoft.ServiceBus.ConnectionString");
            var namespaceManager = NamespaceManager.CreateFromConnectionString(connectionString);
            if (!namespaceManager.QueueExists(QueueName))
            {
                namespaceManager.CreateQueue(QueueName);
            }

            // Initialize the connection to Service Bus Queue
            Client = QueueClient.CreateFromConnectionString(connectionString, QueueName);
            IsStopped = false;
            return base.OnStart();
        }

        public override void OnStop()
        {
            // Close the connection to Service Bus Queue
            IsStopped = true;
            Client.Close();
            base.OnStop();
        }

The web application previously created the Azure Queue messages processed within the worker role; Listing 12-21 shows the updated code in bold so that the web application now creates a BrokeredMessage instead. In order for a client application to send messages to the Service Bus, it has to use the right credentials. Once the QueueClient is created using the appropriate connection string, it can send a message to the Service Bus; the message is automatically serialized so it can be processed within the worker role.

Listing 12-21.  Configuring and Sending Service Bus Messages

protected void cmdSubmit_Click(object sender, EventArgs e)
        {
            try
            {
                var customer = new Customer()
                {
                    DOB = Convert.ToDateTime(txtDOB.Text),
                    DriversLicenseNumber = txtDriverLicense.Text,
                    FirstName = txtFirstName.Text,
                    LastName = txtLastName.Text,
                    OwnersSSN = txtSSN.Text,
                    CustomerPawns = new List<CustomerPawn>
                   {
                        new CustomerPawn()
                        {
                            PawnedItems = new List<PawnedItem>
                            {
                                new PawnedItem{
                                    ItemName = txtItemName.Text,
                                    PawnedAmount = Convert.ToDecimal(txtAmount.Text),
                                    ModelNumber = txtModelNumber.Text
                                }
                            }
                        }
                   }
                };

                string connectionString = CloudConfigurationManager.GetSetting("Microsoft.ServiceBus.ConnectionString");
                var client = QueueClient.CreateFromConnectionString(connectionString, "PawnQueue");
                var brokerMessage = new BrokeredMessage(customer);
                client.Send(brokerMessage);
                
                //Setting up the Azure Queue
                //var customerQueue = InitiateAzureQueue();
                //var newMessage = new CloudQueueMessage(customer.ToJson());
                //customerQueue.AddMessage(newMessage);

            }
            catch (Exception)
            {
                throw;
            }
        }

Up to this point in the chapter, I have demonstrated how workflows can be authored and modified at runtime and hosted within an Azure worker role so that other applications can execute the workflows to drive business logic. Workflows were first authored and later enhanced through an application that rehosts the WF designer, and they were stored in Azure’s blob storage. The workflows were then used to process business logic for data that was queued up using Azure Queues and also the Azure Service Bus from a web application hosted in Azure. The next part of the chapter changes focus to authoring longer-running workflows that can be persisted within Azure.

Hosting Durable Workflows

For the remainder of the chapter, the focus will be on hosting workflows that are durable or considered to be long-running. I will cover the following topics:

  • Persistence in Azure
  • Hosting patterns and practices
  • Using the Service Bus
  • Hosting workflows for the future

Cloud Persistence

Setting up WF persistence in Azure’s cloud is similar to how you do it on-premise or locally; however, there are some important differences. For instance, the Microsoft .NET Framework 4 Runtime Update 4.0.1 fixed the following issues:

  • SqlWorkflowInstanceStoreSchema.sql used allow_page_locks, which is not a supported keyword in SQL Azure and would cause the installation to fail. The update removes allow_page_locks from the script.
  • When network issues occurred or the connection was lost, there could be reliability problems because the retry logic was not tailored for SQL Azure. To handle this, a new public property called MaxConnectionRetries is available on the SqlWorkflowInstanceStore class; it changes the number of reconnection attempts made when connecting to SQL Server. By default, there are 3 attempts at 1-second intervals. For SQL Azure, a value of 15 or greater is recommended.

Since VS2012 is installed, the 4.0.1 update is not required, and the scripts supplied with .NET 4.5 can be used. The first step is to create another database from the Azure Portal. This can also be done remotely by connecting to the Azure database server through SQL Management Studio. For a recap on creating a database in Azure, review Figure 12-15 from earlier in the chapter, and review Chapter 8 for illustrations on running the supplied persistence scripts within SQL Management Studio. Remember that the scripts should be run in the following order:

  1. SqlWorkflowInstanceStoreSchema.sql
  2. SqlWorkflowInstanceStoreLogic.sql
  3. SqlWorkflowInstanceStoreSchemaUpgrade.sql

Each script should run without errors and indicate that it completed successfully. After the scripts have run, refresh SQL Management Studio to validate that the database has been updated appropriately.

image Tip   Remember that persistence can be configured. To make sure that maxConnectionRetries is set to a minimum of 15, use <sqlWorkflowInstanceStore maxConnectionRetries="15".../>.
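
If the instance store is created in code rather than declared in configuration, the equivalent setting is the MaxConnectionRetries property on SqlWorkflowInstanceStore. The following is a minimal sketch; the connection string values are placeholders.

using System.Activities.DurableInstancing;

// A minimal sketch (placeholder connection string): creating the instance store in
// code and raising the reconnection attempts for SQL Azure.
var instanceStore = new SqlWorkflowInstanceStore(
    "Server=tcp:<ServerName>.database.windows.net,1433;Database=WFAzurePersist;" +
    "User ID=<UserId>;Password=<Password>;Encrypt=True;Connection Timeout=30;")
{
    // The default is 3 attempts at 1-second intervals; 15 or more is recommended for SQL Azure
    MaxConnectionRetries = 15
};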

Hosting Workflow Services in Azure

Hosting services in Azure provides more flexibility around working with customers and greater visibility for services that are exposed within the cloud. Existing WCF services can easily be hosted within Azure, and WF is ideal for building durable services using VS2012 and the latest release of the Azure tools, version 1.7. This section will demonstrate how to set up a WCF service hosted in Azure that is long running and maintains state because its service logic is authored as a workflow. The scenario centers around making sure a pawn shop has legitimate customers by establishing an approval process. First, a customer fills out an application to pawn an item. The pawnshop can then approve or deny the customer. If the customer is approved, the workflow enters the customer information in SQL Server. The workflow in Figure 12-28 needs to be durable because the approval process can take a long period of time, so persistence will be used to make sure the workflow can run over the duration of time it takes to approve or reject a customer.

9781430243830_Fig12-28.jpg

Figure 12-28.  Pawnshop approval process

Although the workflow in Figure 12-28 will be hosted in Azure, Visual Studio makes the process smooth and easy; in fact, the workflow services demonstrated in Chapter 11 could be moved to the cloud just as easily. Microsoft’s goal for Azure is to build on the existing knowledge developers have of writing software. Therefore, to get started, we will build a new solution using the WCF Workflow Service Application template, as illustrated in Figure 12-29. Next, the Apress.Chapter12.DataModel project, which was built earlier in the chapter, needs to be added.

9781430243830_Fig12-29.jpg

Figure 12-29.  Creating a new WCF Workflow Service Application

The first operation, CreateCustomerApplication, uses a ReceiveAndSendReply and accepts a parameter called NewCustomer of type Customer. A string is sent back indicating that the customer application has been received (see Figure 12-30).

9781430243830_Fig12-30.jpg

Figure 12-30.  CreateCustomerApplication ReceiveAndSendReply

Correlation is established using the OwnersSSN property on the Customer object.

Next, a Receive activity is used to approve a customer application. The operation it creates is ApproveCustomer, which takes two parameters: Approval is a Boolean, and SSN is a string that passes the social security number as a correlation token to the workflow so it can resume (see Figure 12-31).

9781430243830_Fig12-31.jpg

Figure 12-31.  ApproveCustomer operation exposed using a Receive activity

The last part of the workflow uses an If activity that checks whether the customer is approved to pawn items or rejected. If the customer application is approved, the SaveCustomer activity used earlier in the chapter saves the customer and the item the customer wants to pawn to SQL Server (see Figure 12-32).

9781430243830_Fig12-32.jpg

Figure 12-32.  SaveCustomer activity is used if the customer application is approved

Figure 12-33 illustrates the complete workflow that will be deployed to Azure, but it still has to be configured to use the persistence store that was created in SQL Azure.

9781430243830_Fig12-33.jpg

Figure 12-33.  Complete pawnshop approval process

The workflow service project comes with a default web.config that can be modified for configuring persistence. Listing 12-22 shows the modified web.config for configuring WF persistence and setting up the connection string so customer data can be written to the Pawnshop database in SQL Server.

Listing 12-22.  Web.config Used to Configure Persistence

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <configSections>
    <!-- For more information on Entity Framework configuration, visit http://go.microsoft.com/fwlink/?LinkID=237468-->
    <section name="entityFramework" type="System.Data.Entity.Internal.ConfigFile.EntityFrameworkSection, EntityFramework, Version=4.4.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" requirePermission="false" />
  </configSections>
  <connectionStrings>
    <add name="Pawning" providerName="System.Data.SqlClient" connectionString="Server=<ServerName>;Database=PawnShop;User ID=<UserId>;Password=<password>;Trusted_Connection=False;Encrypt=True;Connection Timeout=30;" />
  </connectionStrings>
  <system.web>
    <compilation debug="true" strict="false" explicit="true" targetFramework="4.0"></compilation>
    <customErrors mode="Off"></customErrors>
  </system.web>
  <system.serviceModel>
    <behaviors>
      <serviceBehaviors>
        <behavior>
          <sqlWorkflowInstanceStore connectionString="Server=tcp:l1yab5tqj4.database.windows.net,1433;Database=WFAzurePersist;User ID=<UserId>;Password=<password>;Trusted_Connection=False;Encrypt=True;Connection Timeout=30;" hostLockRenewalPeriod="00:00:30" runnableInstancesDetectionPeriod="00:02:00" instanceCompletionAction="DeleteAll" instanceLockedExceptionAction="AggressiveRetry" instanceEncodingOption="GZip" />
          <workflowIdle timeToPersist="00:00:05" timeToUnload="00:00:30" />
          <!-- To avoid disclosing metadata information, set the values below to false before deployment -->
          <serviceMetadata httpGetEnabled="true" httpsGetEnabled="false" />
          <!-- To receive exception details in faults for debugging purposes, set the value below to true.  Set to false before deployment to avoid disclosing exception information -->
          <serviceDebug includeExceptionDetailInFaults="true" />
        </behavior>
      </serviceBehaviors>
    </behaviors>
    <serviceHostingEnvironment aspNetCompatibilityEnabled="true" multipleSiteBindingsEnabled="true" />
  </system.serviceModel>
  <system.webServer>
    <modules runAllManagedModulesForAllRequests="true" />
  </system.webServer>
  <entityFramework>
    <defaultConnectionFactory type="System.Data.Entity.Infrastructure.SqlConnectionFactory, EntityFramework" />
  </entityFramework>
</configuration>

Now that persistence has been configured, a web role will be used to host the workflow within Azure using IIS. The Azure tools for Visual Studio make this very easy: right-click the workflow service project and select Add Windows Azure Cloud Service Project, as illustrated in Figure 12-34.

9781430243830_Fig12-34.jpg

Figure 12-34.  Adding a web role for hosting the workflow service project

After the new Azure service project is added, it will now contain the workflow service project as an Azure web role. Opening ServiceDefinition.csdef will show that the role is indeed a web role, as illustrated in Listing 12-23. The workflow is now ready to be pushed to Azure.

Listing 12-23.  ServiceDefinition.csdef for the New Azure Project

<?xml version="1.0" encoding="utf-8"?>
<ServiceDefinition name="Apress.Chapter12.WF.Service.Azure" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition" schemaVersion="2012-05.1.7">
  <WebRole name="Apress.Chapter12.WF.Service" vmsize="Small">
    <Sites>
      <Site name="Web">
        <Bindings>
          <Binding name="Endpoint1" endpointName="Endpoint1" />
        </Bindings>
      </Site>
    </Sites>
    <Endpoints>
      <InputEndpoint name="Endpoint1" protocol="http" port="80" />
    </Endpoints>
    <Imports>
      <Import moduleName="Diagnostics" />
    </Imports>
  </WebRole>
</ServiceDefinition>

Earlier in the chapter, Figure 12-18 illustrated how to set up an Azure subscription so solutions can be deployed to Azure. Once the solution compiles, it is ready to be deployed by right-clicking the new cloud project and then clicking Publish, as illustrated in Figure 12-35.

9781430243830_Fig12-35.jpg

Figure 12-35.  Publishing a project to Azure through Visual Studio

After publishing completes, Internet Explorer can be used to check that the service has been created correctly. Figure 12-36 shows how to go to the URL that Azure provides and add the workflow name and extension at the end of the URL. In this case, the workflow is named wfPawnService.xamlx, so the address is http://flowfocusservice.cloudapp.net/wfPawnService.xamlx.

9781430243830_Fig12-36.jpg

Figure 12-36.  Checking that the service has been created on Azure

The web application created earlier in the chapter can be reused by simply running it locally in Visual Studio to interact with the new service. The first step is to add a service reference to the ASP.NET project created earlier. So far, ASP.NET has been used to send messages to Azure Queues and Service Bus queues, and now it will work directly with the workflow service hosted within Azure. After adding the service reference to the web project, the bold code in Listing 12-24 can be used to communicate with the service.

Listing 12-24.  Code for Submitting Customer Applications and Approving Them

protected void cmdSubmit_Click(object sender, EventArgs e)
        {
            var pawnService = new PawnSvc.PawnServiceClient();
            
            try
            {
                var customer = new Customer
                {
                    DOB = Convert.ToDateTime(txtDOB.Text),
                    DriversLicenseNumber = txtDriverLicense.Text,
                    FirstName = txtFirstName.Text,
                    LastName = txtLastName.Text,
                    OwnersSSN = txtSSN.Text,
                    CustomerPawns = new List<CustomerPawn>
                   {
                        new CustomerPawn()
                        {
                            PawnedItems = new List<PawnedItem>
                            {
                                new PawnedItem{
                                    ItemName = txtItemName.Text,
                                    PawnedAmount = Convert.ToDecimal(txtAmount.Text),
                                    ModelNumber = txtModelNumber.Text
                                }
                            }
                        }
                   }
                };

                
                lblAppResults.Text = pawnService.CreateCustomerApplication(customer);
                //Setting up the Service Bus Queue
                //string connectionString = CloudConfigurationManager.GetSetting("Microsoft.ServiceBus.ConnectionString");
                //var client = QueueClient.CreateFromConnectionString(connectionString, "PawnQueue");
                //var brokerMessage = new BrokeredMessage(customer);
                //client.Send(brokerMessage);
                
                //Setting up the Azure Queue
                //var customerQueue = InitiateAzureQueue();
                //var newMessage = new CloudQueueMessage(customer.ToJson());
                //customerQueue.AddMessage(newMessage);
            }
            catch (Exception)
            {
                throw;
            }
        }

        protected void cmdApprove_Click(object sender, EventArgs e)
        {
            try
            {
                var pawnService = new PawnSvc.PawnServiceClient();
                pawnService.ApproveCustomer(txtSSN.Text,chkApprove.Checked);
            }
            catch (Exception)
            {
                
                throw;
            }
        }

Running the ASP.NET application locally, data can be submitted to the service based on what the customer enters. Feedback from the service shows that the application was submitted: the “Your application has been received” string is returned and loaded into the label on the page (see Figure 12-37).

9781430243830_Fig12-37.jpg

Figure 12-37.  Sending a customer application to the service

image Caution  There is no Distributed Transaction Coordinator (DTC) within Azure, which WF4 relies on heavily. As a result, a best practice when using workflow services is to make sure the PersistBeforeSend property on SendReply is checked in order to ensure consistency between the response sent to the client and the persisted workflow.
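
In the workflow designer, PersistBeforeSend is just a checkbox on the SendReply activity. When a Receive/SendReply pair is authored in code instead, the equivalent is to set the property directly. The following is a minimal sketch with hypothetical variable names; the operation name mirrors the one used in this chapter.

using System.ServiceModel.Activities;

// A minimal sketch (hypothetical variable names): the designer normally generates this
// pair, where PersistBeforeSend appears as a checkbox on the SendReply activity.
var receiveApplication = new Receive
{
    OperationName = "CreateCustomerApplication",
    CanCreateInstance = true
};

var replyToApplication = new SendReply
{
    Request = receiveApplication,
    // Persist the workflow before the reply is sent so the client's view and the
    // persisted state stay consistent, since no DTC is available in Azure.
    PersistBeforeSend = true
};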

Opening SQL Management Studio and connecting to the SQL database hosted within Azure for persisting workflows shows that there is one persisted workflow instance. This customer application has been persisted, as illustrated in Figure 12-38. By clicking the Approve checkbox and then clicking the Approve button, the record in Figure 12-38 will disappear because the workflow has now completed.

9781430243830_Fig12-38.jpg

Figure 12-38.  Checking for persisted workflow instances

Checking the tables that are in the Pawnshop database will now show that the records have been added from the workflow and that the data matches the data submitted in Figure 12-37 (see Figure 12-39).

9781430243830_Fig12-39.jpg

Figure 12-39.  Records added from the workflow service hosted in Azure

Workflow Manager (Workflow 1.0 Beta)

There is a new workflow manager, currently in preview, for supporting a new workflow model. Creatively, it is called Workflow Manager and it provides hosting and management for WF4.5. Its core focus is to provide features for SharePoint 2013 and Office 365 requirements; however, the intent is for Workflow Manager to support additional scenarios in future releases.

Workflow Manager’s goal is to handle the architectural challenges of building workflow solutions within the cloud so developers can focus on the overall business logic. For instance, multi-tenant solutions have specific challenges around segmenting, scale, and managing resources. These challenges are being driven by independent software vendors (ISVs) who are building SaaS solutions in Azure, so multi-tenant support will include runtime isolation mechanisms, resource utilization throttles, and features for storing activities and workflows for multiple tenants.

Hosting workflow solutions and getting support from the WF runtime requires WorkflowApplication or WorkflowServiceHost. Workflow Manager builds on these hosting models and provides a new managed host for running workflows. The key capabilities Workflow Manager focuses on are

  • High density and multi-tenancy: Safe, high efficiency, and performance workflow instance execution.
  • Elastic scale: Scaling up/down based on required system capability.
  • Activity/workflow artifact management: Uploading these artifacts through a REST API or client library, plus support for versioning and updating definitions.
  • Tracking and monitoring: Provided through the REST API, client library, and Azure Portal for service health, configuration, and status of running workflows.
  • Instance management: Health monitoring and management, performance and scale management.
  • Fully declarative authoring: Expanded activity library, expression translation, and a new declarative data modeling feature.
  • REST and Service Bus messaging: Integrated messaging capabilities for Azure messaging models of REST web services and Azure’s Service Bus. This includes inbound/outbound message coordination with workflow persistence for reliability and integrity.
  • Managed service reliability: Around-the-clock operations support, fault-tolerant design, cross-data center disaster recovery, service and platform upgrades and management, and standards compliance.

To learn more about Workflow 1.0 Beta, please visit http://msdn.microsoft.com/en-us/library/windowsazure/jj193528(v=azure.10).aspx.

Summary

This chapter focused on building workflows hosted within the cloud using Windows Azure. First, I gave a brief background on the cloud and introduced Windows Azure as Microsoft’s technology platform for building cloud-based solutions. The new Azure portal was in preview at the time the chapter was written, so the chapter walked through the preview portal to show how to get started with Windows Azure using a free 90-day trial. I covered features of Azure that assist in hosting workflows within the cloud, such as SQL Azure, worker and web roles, and blob storage. I also covered cloud services, which provide the components for hosting the WF runtime through web and worker roles, using the different WF runtime hosts and patterns for hosting durable and non-durable workflows. Finally, the chapter talked about the direction Microsoft is taking in providing a standard model for hosting workflows on-premise and within Azure.
