As a new member of the Vertica integration family, I have spent my first month in the company learning from some of the great minds in the integration industry. At Vertica we have some of the best in the field, and it has quickly become clear to me that many of my new colleagues are very enthusiastic about BizTalk. However, I have joined the Azure Integration Services (AIS) team, which consists of some of those very same people who have dedicated their careers to BizTalk. Given that AIS is predicted to replace their beloved BizTalk (not now, and not soon, but at some point), this tells me that AIS is the tool of the future, and a tool that many firms will have to become familiar with. Better late than never, yes, but also better on time than too late.
In my short integration career, I have already been part of one of the first AIS integration projects at Vertica, and I would like to share what I have learned about designing integration solutions in Azure so far. My hope is that my short time in the industry will help me speak in a language everybody understands, as I have not yet been fully infected with the disease of using too many big words and abbreviations. Although, with the steep learning curve at Vertica, I will probably soon be doing the same thing. In this post I want to talk about how you can design a good integration solution using AIS and XSLT.
From BizTalk to Azure
Before we can take advantage of all the possibilities in Azure, we want the service to do the job we need it to do: the job that BizTalk has done for so many companies over the last 15-20 years. In integration, that job is enabling us to send and receive data to and from our customers, no matter the shape or format. But how do we do that in Azure? First, it depends on whether the sender and receiver are on-premises or in the cloud, for which Azure offers different solutions. In this post I only want to focus on the basic integration actions, without worrying about where and how we get the data.
It’s all about the Logic Apps
Azure is so many things. The cloud is the tool, but the sky is the limit! And it will only get better and bigger over time, adding new features that you can take advantage of (if you are present in the environment, of course). AIS is a set of Azure services assembled by Microsoft to provide the applications needed for integration, and the center of this collection is Logic Apps. Logic Apps lets you build easy-to-understand flows that use out-of-the-box connectors from Azure, letting you connect to the services you need to build your solutions. If you need something that isn't available out of the box, Azure also gives you the chance to create your own functions through Azure Functions. These two are the main components we have used in our AIS integration flows, supported by other tools that I intend to write about in later posts as I learn to master them.
How do we do integration in Azure?
We want to make the solution as loosely coupled as possible, with the goal of a flexible design that makes it easy to react to business changes. This is done by creating many Logic Apps, one for each task in a flow. When somebody sends a type of document, we create a Logic App; when the same sender sends a different type of document, we create another Logic App; and when somebody receives a document, we create yet another. In this way we get a specific Logic App for each task, making it easy to detect errors and easy to add new components. We then use Blob Storage to store the documents between Logic Apps, enabling the next actor in a flow to access the document through connections in the Logic Apps. Blob Storage is Azure's storage solution in the cloud, and a connection is what a Logic App uses to access other services. In an integration flow there is always a sender and at least one receiver of data. Each side needs a Logic App to perform the required actions.
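To make the blob handoff more concrete, here is a minimal sketch of the pattern in Python using the azure-storage-blob SDK. In the real flows the Logic Apps' built-in Blob Storage connector does this work; the container name, blob name and connection-string variable below are made-up examples, not our actual setup.

```python
"""Minimal sketch of the blob handoff between two Logic Apps.

In the real flow the Logic Apps' built-in Blob Storage connector does this;
here the same pattern is shown with the azure-storage-blob SDK. The container
name, blob name and environment variable are made-up examples.
"""
import os
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string(os.environ["STORAGE_CONNECTION_STRING"])
container = service.get_container_client("internal-documents")

# Sender side: store the document after it has been transformed to the internal format.
internal_xml = b"<InternalOrder><OrderNumber>123</OrderNumber></InternalOrder>"
container.upload_blob(name="orders/order-123.xml", data=internal_xml, overwrite=True)

# Receiver side: the receiving Logic App picks the same document up again.
received = container.get_blob_client("orders/order-123.xml").download_blob().readall()
```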
The Logic App on the sender’s side of the flow
The Logic App on the sender's side of the flow is basically responsible for translating the document to our own internal format, storing this new document in a blob, and then telling any receiving Logic Apps that there is a document for them to collect. We use XSLT to translate the sender's original document to our internal format, which follows a set of rules that we can count on and use further along in the flow. I was only recently introduced to XSLT, but I have already become a big fan of the transformation language, which plays a big part in most of our integrations. When this Logic App has done its job and transformed the document, it triggers a Logic App on the receiving side of the flow.
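To give an idea of what such a translation involves, here is a small, hypothetical example: a made-up sender order document is mapped to an equally made-up internal format with a few lines of XSLT, applied here with Python and lxml. The element names are invented for illustration and are not our actual internal schema.

```python
"""Hypothetical example of an XSLT translation from a sender's format to an
internal format. Element names are invented; real stylesheets are of course
specific to each sender."""
from lxml import etree

SENDER_XML = b"""
<Order number="123">
  <Customer>Contoso</Customer>
</Order>"""

# A tiny stylesheet mapping the sender's elements onto the internal names.
STYLESHEET = b"""
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/Order">
    <InternalOrder>
      <OrderNumber><xsl:value-of select="@number"/></OrderNumber>
      <CustomerName><xsl:value-of select="Customer"/></CustomerName>
    </InternalOrder>
  </xsl:template>
</xsl:stylesheet>"""

transform = etree.XSLT(etree.fromstring(STYLESHEET))
internal_doc = transform(etree.fromstring(SENDER_XML))
print(etree.tostring(internal_doc, pretty_print=True).decode())
```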
The Logic App on the receiver’s side of the flow
This Logic App has a set of filters that enable it to listen for messages from other Logic Apps. If its filters match the message sent by the sender's Logic App, it triggers. When triggered, this Logic App's primary job is to transform our internal document to the native format of the receiver, again through XSLT. This leaves us with a native document that fits the internal criteria of the receiver. What we do with this document then depends on how the receiver wants to receive the message, which, as mentioned before, will not be covered in this post, as AIS offers different possibilities for doing this. The model below shows a simple version of the main flow, with the main responsibilities of the Logic Apps involved.
1. Logic App basic integration flow example
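The triggering between the two sides relies on the filters mentioned above. Conceptually the idea is as simple as the sketch below; in our flows the actual filtering is done by the messaging layer (for example Service Bus subscription filters), not by hand-written code, and the property names here are made up.

```python
"""Purely conceptual sketch of the receiver-side filtering described above.
In practice the messaging layer evaluates the filters; this just illustrates
the idea with made-up property names."""

def matches(message_properties: dict, required: dict) -> bool:
    """A receiving Logic App only triggers when every property it filters on
    has the expected value on the incoming message."""
    return all(message_properties.get(key) == value for key, value in required.items())

# Hypothetical message announced by the sender-side Logic App.
message = {"DocumentType": "Order", "Sender": "Contoso", "BlobPath": "orders/order-123.xml"}

# Hypothetical filter belonging to one receiving Logic App.
order_receiver_filter = {"DocumentType": "Order"}

if matches(message, order_receiver_filter):
    print("Trigger the receiving Logic App and transform", message["BlobPath"])
```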
In this solution we use a couple of Azure Functions to handle the data manipulation we need to do. The most important function is the one that transforms to and from our internal XML format via XSLT. Another is used to handle different kinds of charsets. In the picture below you can see an example of what an integration Logic App on the sending side of a flow can look like on the surface. In this case it listens to a Service Bus as its trigger and notifies a Service Bus at the end, but that doesn't have to be the case.
2. Logic App example
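As a side note on the charset function mentioned above, here is a hypothetical sketch of the kind of helper such an Azure Function might wrap: normalizing incoming documents to UTF-8 before the XSLT step. The function name and the fallback charsets are assumptions for illustration, not our actual implementation.

```python
"""Hypothetical sketch of the charset handling mentioned above. The function
name and the fallback charset list are made up for illustration."""

def to_utf8(payload: bytes, declared_charset: str | None = None) -> bytes:
    """Decode the payload using its declared charset (or a few common
    fallbacks) and re-encode it as UTF-8."""
    candidates = [declared_charset] if declared_charset else []
    candidates += ["utf-8", "windows-1252", "iso-8859-1"]
    for charset in candidates:
        try:
            return payload.decode(charset).encode("utf-8")
        except (UnicodeDecodeError, LookupError):
            continue
    raise ValueError("Could not decode payload with any known charset")

# Example: a Danish order line received as ISO-8859-1.
print(to_utf8("æøå".encode("iso-8859-1"), declared_charset="iso-8859-1"))
```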
This sums up one way to do integration with XSLT and Azure, but it is important to say that the right solution depends on the business circumstances and the tasks at hand. It is clear to me that AIS brings an exciting future for integration, and I look forward to being a part of it and to learning more about the great possibilities of AIS and the rest of the Azure environment. I intend to write more posts like this about how we do integration in Azure soon, where subjects like "Logging", "Routing" and "On-premises possibilities" will be on the agenda, adding more as we gain experience and learn as an AIS team. See you then!