
Azure Logic Apps – Creating a Simple Logic App

Time to read: 6 minutes. By Arun Sirpal (Microsoft MVP), Editorial Contributor

Azure Logic Apps is a cloud service that forms a core part of Microsoft's serverless computing platform and provides great flexibility when developing a wide range of solutions.

Azure Logic Apps

It does this by acting as the “glue” that lets disparate systems connect with each other for a common goal. This is done by building workflows in which connectors (see the full list of connectors at https://docs.microsoft.com/en-gb/azure/connectors/apis-list) are used rather than custom-written code. From this you can automate, orchestrate, and schedule processes across your enterprise.

There are many advantages to this technology. You do not have to worry about hosting, scaling, managing, maintaining, or monitoring your apps; Microsoft handles all of that. Generally, you have two options when it comes to pricing: a consumption-based model, where you pay only for the resources you use, or a fixed-price model using an ISE (Integration Service Environment).

Taken from official Microsoft documentation here are a few scenarios you can automate with Azure Logic Apps:

  • Process and route orders across on-premises systems and cloud services.
  • Send email notifications with Office 365 when events happen in various systems, apps, and services.
  • Move uploaded files from an SFTP or FTP server to Azure Storage.
  • Monitor tweets for a specific subject, analyse the sentiment, and create alerts or tasks for items that need review.

To showcase the capability of Azure Logic Apps I am going to use a modified version of the last example shown above and extract tweets from my Twitter account when people tweet with the #Azure or #Microsoft hashtags. This is known as the trigger. From there I use the Cognitive Services text sentiment analysis API, which returns a numeric score between 0 and 1. This is based on a machine learning classification algorithm where scores close to 1 indicate a positive sentiment about the tweet in question and scores close to 0 indicate a negative sentiment.
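To make the 0-to-1 score easier to reason about, you can bucket it into coarse labels. A minimal sketch is below; note that the 0.4/0.6 cut-off values are my own illustrative thresholds, not something defined by the API.

```python
def label_sentiment(score: float) -> str:
    """Map a 0-1 sentiment score to a coarse label.

    The 0.4/0.6 cut-offs are illustrative choices, not part of the API.
    """
    if score >= 0.6:
        return "positive"
    if score <= 0.4:
        return "negative"
    return "neutral"

print(label_sentiment(0.98))  # positive
print(label_sentiment(0.5))   # neutral
print(label_sentiment(0.25))  # negative
```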

The APIs are ever evolving. Microsoft has now developed version 3 of the sentiment API, currently in preview, which gives us the choice of using sentiment labelling to define the sentiment at the document and sentence level. Whilst not applicable to the Twitter example, you can see how handy this feature could be for certain scenarios.
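To give a feel for the labelling feature, here is a trimmed, illustrative version-3-style response and a helper that pulls out the document-level labels. The field names follow the preview documentation, but the values and the helper function are my own example, not output from a real call.

```python
# A trimmed, illustrative v3 sentiment response; the values are made up.
sample_response = {
    "documents": [
        {
            "id": "1",
            "sentiment": "positive",  # document-level label
            "sentences": [
                {"sentiment": "positive", "offset": 0, "length": 30},
                {"sentiment": "neutral", "offset": 31, "length": 20},
            ],
        }
    ]
}

def document_labels(response: dict) -> dict:
    """Return a mapping of document id -> document-level sentiment label."""
    return {doc["id"]: doc["sentiment"] for doc in response["documents"]}

print(document_labels(sample_response))  # {'1': 'positive'}
```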

Going back to my solution: with the workflow defined, I wanted to put that data into an Azure SQL Database table and link it to Power BI to report on where in the world tweets were coming from. All this is made possible using only three connectors and took 15 minutes to build with the Azure Logic Apps designer. The designer can be used in both the Azure Portal and Visual Studio (2015, 2017, and 2019, Community edition and higher).

This is what the workflow looks like in designer mode. At a high level, I connect to the Twitter account, extract the text from the tweet, apply the sentiment analysis, and then load the data into a table.

Azure Logic Apps Workflow

For the Twitter data I check for new tweets every 30 seconds (see below) and then apply the sentiment API to the tweet text.
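Behind the designer, the polling interval lives in the Logic App's workflow definition. A rough sketch of the trigger section is below; the trigger name is a placeholder and the connector-specific inputs are elided.

```json
"triggers": {
    "When_a_new_tweet_is_posted": {
        "type": "ApiConnection",
        "recurrence": {
            "frequency": "Second",
            "interval": 30
        },
        "inputs": { "comment": "connector-specific settings elided" }
    }
}
```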

New tweet

When you set up the sentiment analysis section you will need to get an API key from https://azure.microsoft.com/en-us/try/cognitive-services/ using the Cognitive Services language API, as shown below.

API

Simply state what you want analysed, which for this example is the text of the tweet.

Text of the tweet

This is an example of the JSON output generated by the Azure Logic App:

{
  "author": "Test",
  "location": "Minneapolis, MN",
  "sentiment": 0.5,
  "tweetdesc": "Azure.Source #azureblogfeed #Azure #AzureStack #Cloud #CloudOps #Microsoft @Azure"
}
So, you can see the author, location, sentiment score, and the actual tweet. This then gets moved into a table with a matching structure; obviously I made sure the column mappings between the JSON and the SQL table were correct. The above text was rated 0.5, a neutral sentiment.
Within my Azure SQL Database, I created the table below to hold all the data that I ultimately want to report on.

CREATE TABLE [dbo].[TweetMe](
    [id] [int] IDENTITY(1,1) NOT NULL,
    [createdDate] [datetime2] NULL,
    [tweetdesc] [varchar](512) NULL,
    [sentiment] [float] NULL,
    [author] [varchar](512) NULL,
    [location] [varchar](128) NULL,
    PRIMARY KEY CLUSTERED ([id] ASC)
        WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY]
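To illustrate the column mapping between the Logic App's JSON output and this table without needing an Azure SQL instance, here is a sketch using Python's standard-library sqlite3 as a stand-in. The schema is simplified (SQLite has no datetime2 or IDENTITY), and the createdDate value is a placeholder; in the real workflow it comes from the tweet metadata.

```python
import json
import sqlite3

# Simplified SQLite stand-in for the Azure SQL table above.
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE TweetMe (
           id INTEGER PRIMARY KEY AUTOINCREMENT,
           createdDate TEXT,
           tweetdesc TEXT,
           sentiment REAL,
           author TEXT,
           location TEXT)"""
)

# The JSON emitted by the Logic App, as in the earlier example.
payload = json.loads(
    '{"author": "Test", "location": "Minneapolis, MN", '
    '"sentiment": 0.5, "tweetdesc": "Azure.Source #azureblogfeed #Azure"}'
)

# Map the JSON fields onto the table columns with a parameterised insert.
conn.execute(
    "INSERT INTO TweetMe (createdDate, tweetdesc, sentiment, author, location) "
    "VALUES (?, ?, ?, ?, ?)",
    ("2019-01-01T00:00:00",  # placeholder; really the tweet's created date
     payload["tweetdesc"], payload["sentiment"],
     payload["author"], payload["location"]),
)

row = conn.execute(
    "SELECT author, location, sentiment FROM TweetMe ORDER BY createdDate DESC"
).fetchone()
print(row)  # ('Test', 'Minneapolis, MN', 0.5)
```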

Once the Logic App completes you will want to see the green ticks as shown below.

Logic Apps ticks

When connecting to the Azure SQL Database and issuing basic queries you will see data arriving (I removed the author column), and it will update automatically based on the 30-second check trigger.

SELECT * FROM [dbo].[TweetMe] ORDER BY createdDate DESC;

Tweet table

The analysis of the text is accurate. For example, a tweet saying “The more I use Azure Powershell/CLi to automate infrastructure and various tasks – the more I think AzureRM is becoming outdated. #microsoft #azure #unsure” returned a score of 0.25, whereas “We have a great partnership with #Microsoft. Thanks #Azure #cloud” returned 0.98.
Connecting the table to Power BI is the last stage. For this blog post I opted for Power BI Desktop.

Power BI Desktop

Quite simply, I select Azure SQL Database as the data source and write a query to pull in the data from the table.

Azure get data

Using the ArcGIS map visualisation, you can see the tweets coming in from across the world.

ArcGIS Map

Hopefully this blog post has shown you how straightforward it is to connect different systems together to build a solution.


About the Author

Arun Sirpal, writing here as a freelance blogger, is a four-time former Data Platform MVP, specialising in Microsoft Azure and Database technology. A frequent writer, his articles have been published on SQL Server Central and Microsoft TechNet alongside his own personal website. During 2017/2018 he worked with the Microsoft SQL Server product team on testing the vNext adaptive query processing feature and other Azure product groups. Arun is a member of Microsoft’s Azure Advisors and SQL Advisors groups and frequently talks about Azure SQL Database.

Education, Membership & Awards

Arun graduated from Aston University in 2007 with a BSc (Hon) in Computer Science and Business. He went on to work as an SQL Analyst, SQL DBA, and later as a Senior SQL DBA, DBA Team Lead, and now Cloud Solution Architect. Alongside his professional progress, Arun became a member of the Professional Association for SQL Server. He became a Microsoft Most Valuable Professional (MVP) in November 2017 and has since won it for the fourth time.

You can find Arun online at:
