Containerised ASP.NET Core WebAPI With Docker On Mac.

.NET Core is the biggest change to the platform since .NET was introduced. It is fully open source, modular, and supported on Windows, Linux and macOS. In this post I am going to give it a test ride by creating a containerised C# application with the latest .NET Core.

Docker containers allow teams to build, test, replicate and run software regardless of where it is deployed. Docker containers assure teams that software will always act the same no matter where it is – there’s no inconsistency in behaviour, which allows for more accurate and reliable testing.

The main advantage of using Docker containers is that they mimic running an application in a virtual machine, minus the hassle and resource neediness of a virtual machine. Available for Windows and Linux systems, Docker containers are lightweight and simple, taking up only a small amount of operating system capacity. They start up in seconds and require little disk space or memory.

Docker installation packages are available for Mac and Windows and can be downloaded from the official channel.

Prerequisites:

  1. Install Visual Studio for Mac
  2. Install Docker for Mac

Here we will build an ASP.NET Core WebAPI and host/run it in a container with the help of Docker.

1. Create New Project: choose the ASP.NET Core WebAPI template in Visual Studio 2017.

D1.png

Now provide the project name, solution name and other details, like where to save the project.

D2.png

Our newly created project structure looks as below.

D3.png

Here we have a very simple case where the service only returns some information about employees, as our main objective is to host this tiny application in a container.

2. Add Docker Support to the Application

Now add a Dockerfile to the project and write the instructions describing how the Docker image is built from the ASP.NET Core base image.

 

Below are the instructions issued to the daemon to create the Docker image.
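A minimal sketch of such a Dockerfile for an ASP.NET Core 2.x WebAPI could look like the following (the base image tag and the assembly name FirstApiWithDocker.dll are assumptions based on the project name, so adjust them to your project):

# Runtime base image from the ASP.NET Core 2.x era this post targets
FROM microsoft/aspnetcore:2.0
WORKDIR /app
# Copy the published output (run 'dotnet publish -c Release' first)
COPY ./bin/Release/netcoreapp2.0/publish .
# The aspnetcore base image listens on port 80
EXPOSE 80
ENTRYPOINT ["dotnet", "FirstApiWithDocker.dll"]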

 

3. Open Terminal on Mac:

Search for “Terminal” on the Mac and open a new window.

D6.png

Once we click on the new window option, a command window appears and we are all set to issue Docker commands to create the Docker image.

D7.png

4. Navigate to the application folder by issuing the change directory command “cd” and make sure we are inside the application folder. We can verify by issuing the “ls” command: if all the project items are listed, we are in the right place.
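For example (the folder path here is hypothetical):

cd ~/Projects/firstapiwithdocker
ls   # should list the .csproj file and the Dockerfile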

D8.png

5. Create Docker Image:

The docker build command builds Docker images from a Dockerfile and a “context”. A build’s context is the set of files located in the specified PATH or URL. The build process can refer to any of the files in the context.

Command: docker build -t <image-name> .

Here our image name is “firstapiwithdocker”, so the command is:

docker build -t firstapiwithdocker .

At this point we can see the daemon accept the command and start creating the Docker image from the Dockerfile instructions.

D9.png

At last we can see the image has been successfully created and tagged with the “latest” keyword. If we don’t provide any tag, the daemon tags the image with “latest”.

D10.png

6. List All Docker Images:

Now we have to verify whether the required image has been created, so the below command is issued to list all the images. We can see all the important information about the images, like image name, tag, image ID, created date and size. In the image below we can see our newly created image listed along with the base images.

Command: docker images

D11.png

7. Run the Image Within a Container:

So far we have successfully created the Docker image for our WebAPI solution, containing all the necessary files; now we have to create a container to run this image.

The below command will create the container.

Syntax: docker run -d -p <host-port>:<container-port> --name <container-name> <image-name>

Example: docker run -d -p 9000:80 --name FirstContainer firstapiwithdocker

Once we execute the above command, a new container is created and we get back a long container ID, which means the container has been created successfully.

D12.png

Now list all containers, and we can see all the important metadata about them, like container ID, image name, command, created date, container status, ports and container name.

Here our newly created container is running, mapping port 9000 on the host to port 80 in the container.

D13.png

Let’s hit the URL “http://localhost:9000/api/values” in a browser or Postman to verify that our application is running in the container.
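We can also check from the terminal itself:

curl http://localhost:9000/api/values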

Below is the result from the WebAPI, which is running in a container instead of on the local machine.

D14

8. Push the Docker Image to Docker Hub:

Docker Cloud uses Docker Hub as its native registry for storing both public and private repositories. Once you push your images to Docker Hub, they are available in Docker Cloud.

We need to create a Docker Hub account to push the image to public/private repositories.

D15

 

The Docker image should be tagged with a fully qualified name before issuing the push command, so the below command will tag the image with the name “rakeshmahur/webapicore-sample”.

Syntax: docker tag <source-image> <target-image>

Example:  docker tag firstapiwithdocker rakeshmahur/webapicore-sample

Now log in to Docker Hub from the terminal window by issuing the “docker login” command and provide the Docker Hub account details (username/password).

D16.png

Once the Docker Hub credentials are validated successfully, a confirmation message appears in the window and we are now able to push the image to Docker Hub.

D17

Issue the docker push command to push the Docker image to Docker Hub.

docker push rakeshmahur/webapicore-sample

Once we execute the above command, we can see our local Docker image pushed to the Docker Hub repository and listed there, and everyone can pull this image and start working with it.

D19.png

Docker Commands

Below are some important and commonly used commands; refer to the Docker documentation for more details and a more exhaustive list of flags.

  • docker build -t <tag> .
    • Builds an image from a given Dockerfile. While still useful when handling individual images, ultimately docker-compose will build your project’s images.
  • docker exec -it <container> <command>
    • Runs a command in a running container. More than anything else, I’ve used exec to run a bash session (docker exec -it <container> /bin/bash).
  • docker image ls
    • Lists images on your machine.
  • docker image prune
    • Removes unused images from your machine. Especially when building new images, I’ve found myself constantly wanting a clean slate. Combining prune with other commands helps clear up the clutter.
  • docker inspect <container>
    • Outputs JSON-formatted details about a given container. More than anything else I look for the IP address via (docker inspect <container> | grep IPAddress).
  • docker pull <image>
    • Downloads a given image from a remote repository. For development purposes, docker-compose will abstract this away, but if you want to run an external tool or run a project on a new machine you’ll use pull.
  • docker ps
    • Without any flags, this lists all running containers on your machine. I’m constantly tossing on the ‘-a’ flag to see what containers I have across the board. While you are building a new image you inevitably have containers spawned from it exit prematurely due to some runtime error. You’ll need to do ‘docker ps -a’ to look them up.
  • docker push <image>
    • Once you have an image ready to be distributed/deployed, you’ll use push to release it to either Docker Hub or a private repository.
  • docker rm <container>
    • Removes a stopped container from your system. You need to run docker stop first if it is running.
  • docker rmi <image>
    • Removes an image. You may need to add the ‘--force’ flag to force removal if it is in use (provided you know what you are doing).
  • docker run <image>
    • Runs a command in a new container. Learning the various flags for the run command will be extremely useful. The flags I’ve been using heavily are as follows:
      • --rm – Removes the container after you end the process
      • -it – Run the container interactively
      • --entrypoint – Override the default command the image specifies
      • -v – Maps a host volume into the container. For development, this allows us to use the image’s full environment and tools, but provide it our source code instead of production build files.
      • -p – Maps a custom port (i.e. 8080:80)
      • --name – Gives the container a human-readable name, which eases troubleshooting
      • --no-cache – Forces docker to re-evaluate each step instead of using the cache (strictly a docker build flag, used when rebuilding the image)
  • docker version
    • Outputs both the client and server versions of Docker being run. This isn’t the same as ‘-v’.
  • docker volume ls
    • While there are variants on volumes, so far I mostly use the ‘ls’ command to list current volumes for troubleshooting. I’m sure there will be more to come with using volumes.

 

 

Microsoft Azure | Automate Common Tasks With Azure Logic Apps.


Introduction

Logic Apps are a relatively new feature of Microsoft Azure that makes it simple to build complex workflows using one or more of the over 175 different connectors. Since Logic Apps are serverless, we do not need to worry about server sizing: the platform scales to meet our demand, and we are only charged for what we use.

Azure Logic Apps are hosted in Microsoft Azure, so there’s no infrastructure component and no on-premises servers or virtual machines to manage. On that basis, and in terms of how we pay for it, it is essentially serverless.

A Logic App is launched by a trigger, and there is no server configuration involved for us as developers or managers of these Logic Apps. It is also serverless in the pricing sense, in that it is paid for in terms of how many actions our Logic Apps take.

Logic Apps have a no-code designer for rapid application creation. It gives us an easy way, without writing code, to do integration in the cloud and automate it. There’s no need to spend hours or days on setup: with a simple user interface, it helps us get started in a matter of minutes. This simplification should help broaden the user base and make more people willing to use Azure.

How Logic Apps Work:

A logic app starts with a trigger whenever specific criteria are met or an event happens. Each time the trigger fires, the Logic Apps engine creates a logic app instance that runs the workflow’s actions. These actions can also include data conversions and flow controls, such as conditional statements, switch statements, loops and branching.

Benefits of Logic Apps:

  • Visually build workflows with easy-to-use tools.
  • Get started faster with logic app templates.
  • Connect disparate systems across different environments.
  • First-class support for enterprise integration and B2B scenarios.
  • Write once, reuse often.
  • Pay only for what we use.

Let’s take a simple example with Microsoft OneDrive. Assume there is a specific folder on OneDrive that allows uploading different kinds of files, like XML, JSON, Word, PPT etc. The folder owner is interested in a specific file format, not all files, and wants a notification whenever such a file is uploaded to the OneDrive folder.

The above use case requires integrating OneDrive with an email client. This problem can be solved with any programming technique, but that requires more configuration and development expertise. Logic Apps doesn’t require much of that: there are hundreds of connectors available to integrate with external tools.

Get Started:

Log in to the Azure portal and select the Logic Apps service under the Enterprise Integration category.

L2.jpg

L8.jpg

There are many templates available to start designing logic apps; select one as per the business requirement, or select the blank template for custom designing. Here we will go with the blank template.

L9.jpg

Once we create the logic app, go to the Logic App Designer and start designing the business workflow.

L3.jpg

Now start adding connectors as per the business requirement. Here we will select the Microsoft OneDrive connector as the starting point of the logic app workflow.

To do so, click on the “New Step” button and add a new action; after that we will see a list of connectors.

L4.jpg

Here is the list of connectors; we will search for Microsoft OneDrive to make a connection with OneDrive.

L5.jpg

L6.jpg

Every connector has two types of options: triggers and actions. A trigger is always the starting point of a workflow, responding to specific events on resources. In our case we are going to monitor our OneDrive folder.

L7.jpg

Once we have added the OneDrive connector, we have to connect with valid credentials; here we have to use our Microsoft user ID & password for OneDrive authentication.

L10.jpg

Once we click on the sign-in button, we are redirected to the Microsoft login page.

L11.jpg

Once we authenticate successfully we are able to see all the folders on OneDrive. Select the folder that requires monitoring; here I go with “Azure Testing Files” because all files are uploaded to this folder.

L12.jpg

L13.jpg

Now add the next step: click on the + sign and add a conditional action.

L14.jpg

Conditions are similar to the “if-else” statements of C# or any other language.

L15.jpg

First we have to define the condition by taking input parameters from the previous step, which is the OneDrive connector. We have to check the uploaded file’s extension to see whether it is an XML file or not.
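In code view the whole condition boils down to a single expression built with the endsWith() workflow function. A sketch of what it might look like (the exact dynamic-content path to the file name depends on the OneDrive trigger, so treat the property path here as illustrative):

@endsWith(triggerOutputs()?['headers']?['x-ms-file-name'], '.xml')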

L16.jpg

If the above condition matches, the send-mail functionality notifies about the uploaded XML file. Here we can see the send-mail logic defined inside the true branch with the Gmail connector.

L17.jpg

Now we have to define an action for the false branch, for when the conditional statement does not match the uploaded file. In this case we only monitor XML files, so if any other file is uploaded the logic app moves it from the main folder to an archive folder.

Let’s take a OneDrive connector in the false branch and write the file-archive logic; have a look at the complete file-archival logic.

L19.jpg

Now we are ready with the folder monitoring workflow; let’s look at the completely designed logic app.

L20

Trigger the Logic App:

To test the logic app, I have set the recurrence interval to 10 seconds, which means that every 10 seconds the OneDrive connector scans the OneDrive folder; if any new file has been uploaded to the folder, the logic app automatically triggers and starts executing the complete workflow.
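In code view that interval is just the trigger’s recurrence setting, roughly:

"recurrence": { "frequency": "Second", "interval": 10 }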

Below is a snapshot from OneDrive that contains two folders. One folder (“Azure Testing Files”) is used for uploading files and the other (“Archive Azure Testing”) is used for archived files.

L21.jpg

Verify the true condition: if an XML file is uploaded, then a mail should be sent.

As of now both folders are empty. Now upload any XML file to “Azure Testing Files” and we will see the logic app automatically trigger, detect this file and start executing all connectors. As per our condition, if the file is XML then the true branch should execute and send mail to the folder owner. Let’s catch all the events.

An XML file is uploaded to the OneDrive folder.

L22.jpg

Now go to the logic app and see the activity logs.

L23.jpg

If we open the highlighted run details above, the visual designer pane opens with each connector’s execution status. Here we can see every connector’s status in depth.

L24.jpg

As the Gmail connector executed successfully, Gmail should send a mail to the folder owner. Below is the received mail, sent by the logic app.

L25

Verify the false condition: if the uploaded file is not an XML file, then the false branch is triggered.

Let’s upload files of another format (other than XML). The true branch will not execute and no email will be sent by the Gmail connector; instead the false branch triggers and the OneDrive connector moves the file from the upload folder to the archive folder.

This time an image file is uploaded.

L26.jpg

See the logic app execution details.

L27.jpg

After successful execution we can see the uploaded file moved from the upload folder to the archive folder.

L28.jpg

Conclusion

Logic Apps are not only about integration and orchestration, but also about connectivity to other services. These services can be Azure based, SAP applications or third-party solutions such as OneDrive and Dropbox. They can even be custom-built applications running on-premises, such as a web API.

Event-Based Solution with Azure Event Grid & Logic Apps.


Introduction

Event Grid is a new service in Azure that connects applications together so that they can talk to each other in a distributed environment. This way of working decouples application components, which enables more scalability, maintainability and extensibility.

At a basic level it’s similar to a message queue service, like Azure Service Bus topics, enabling a publish/subscribe model.

Azure Event Grid is a different kind of messaging service: it is built to enable event-based architectures, like those used with microservices architectures, to be built more easily.

Azure Event Grid can be described as an event broker that has one or more event publishers and subscribers. Event publishers are currently Azure Blob storage, resource groups, subscriptions, event hubs, custom events etc. On the other side there are consumers of events (subscribers) like Azure Functions, Logic Apps and WebHooks.

E1.jpg

 

Employee Onboarding System Example:

Let’s take the employee onboarding system example, a very common and essential process in every organization. Whenever new employees are hired there are some hiring formalities, like saving the employee details in the company database (Azure database, on-premises database etc.), sending a welcome email notifying the employee about the joining date and other formalities, and also sending calendar invites to both parties (employee and HR) as a reminder of the joining date.

Other parties may also be included in this process, like the transport department, IT department etc., to set up all the necessary things at the time of employee onboarding.

So the employee information is common to all parties, and on the basis of this information they have to execute some set of actions, like email creation, ID card issue, IT asset arrangement and seat allocation. There should therefore be a mechanism that can broadcast the same information to all interested parties.

Here Azure Event Grid comes into the picture and helps us solve this problem in a very efficient manner, while Logic Apps removes the hurdles of integrating all the external pieces, like the mail client and the database logic, without much coding.

Process Flow:

E3.jpg

Below are the important steps to develop the above solution:

  1. MVC application to submit employee details.
  2. Set up the event grid in Azure.
  3. Set up the logic app in Azure.
  4. Add the logic app as a subscriber to the event grid.
  5. Integrate everything from step 1 to step 4.

 

1) Create the Employee Onboarding Portal (MVC Application)

Primarily, the web application receives all employee details from end users, generates a JSON-formatted message and sends it to the Azure Event Grid topic instead of inserting it directly into the Azure database. This message is then broadcast to all the topic’s subscribers and processed. In our case the logic app is the topic subscriber, so the message is received by the logic app, which starts executing the related actions like sending mail, sending the calendar invite and so on.

Here I am taking a simple MVC application with a simple form for demonstration.

e9.jpg

2) Set Up Azure Event Grid

Log in to the Azure portal, select New and search for Event Grid Topic. Fill in all the mandatory details like name, subscription, region and resource group.

e4.jpg

Once we finish the event grid configuration, we navigate to the user interface below, where we have to set up event subscriptions. In our case the logic app will be our event listener and needs to be mapped here, but we will set that up later in the next steps.

e5.jpg

3) Set Up the Logic App Service:

Logic Apps helps us build, schedule and automate processes as workflows, so we can integrate apps, data, systems and services across enterprises or organizations. Logic Apps simplifies how we design and create scalable solutions for app integration, data integration, system integration and enterprise application integration (EAI).

Choose the “Logic App” service from the available services in the Azure portal and fill in all the required information.

e6

e7

Now press the create button and fill in all the required details like logic app name, resource group, region etc. The logic app name should be globally unique, and the Azure portal automatically indicates whether the name is unique or not.

e8.jpg

Now navigate to the logic app and start designing the process flow with logic app connectors. Below is the completely designed logic app flow; we will talk about each connector step by step.

e10.jpg

Let’s talk about each connector in detail.

  • Event Grid Connector: Logic Apps has an Event Grid connector that is able to connect with Event Grid and receive all broadcast messages. Here it will receive the employee details, i.e. the JSON-formatted message submitted by the MVC application to the event grid.

 e11.jpg

  • Parse JSON Connector: this connector parses the message received from Event Grid on the basis of the provided message schema. The output of the Event Grid connector becomes the input of the Parse JSON connector. If the message parses successfully, it is passed on to the next step, which is responsible for inserting the data into the database.

 e12.jpg
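For reference, the fields of this message are the ones set by the MVC controller shown later in this post, so the payload handed to the Parse JSON connector looks roughly like this (the employee fields are placeholders for whatever EmployeeModel contains):

[
  {
    "id": "<guid>",
    "eventType": "NewPost",
    "subject": "blog/posts",
    "eventTime": "2017-08-20T23:14:22+1000",
    "data": {
      "EmployeeId": 101,
      "EmployeeName": "John",
      "DateOfJoining": "2017-09-01T09:00:00Z"
    }
  }
]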

  • SQL Server Connector: the SQL connector enables us to communicate with the database. Using valid database credentials we can make a connection with the database and select the target tables in which the message details need to be stored.

 E13.jpg

Once the connection is established successfully with the server, we can see all the tables of that server. Here we need to map our message properties to the table columns, to store the message in the SQL tables.

When we click on each property, the message-property selection area lets us choose the right property from all those available. Below we can see multiple properties offered for selection, and we may pick one or more properties to store in a particular column; here I select EmployeeId to store in the employeeid column.

e15.jpg

E14.jpg

  • Google Calendar & Gmail Connectors: we have successfully configured the event grid, message parsing and the database connection. Now we have to set up two important tasks: sending the calendar invite and sending the email. Here we add both connectors as parallel branches so that they execute simultaneously once the message is saved successfully in the database.

First we have to click on the (+) symbol and choose “Add parallel branch”, then select the Google Calendar connector; repeat the same steps to select the Gmail connector.

         e16.jpg

Now we have to provide all the required information, like a valid Gmail email ID and the other message details.

E17.jpg

Now our logic app is completely ready to use, but one small integration is still pending: connecting the Azure event grid to the logic app.

4) Add the Logic App as a Subscriber to the Event Grid: up to this point we have created the event grid, the MVC application and the logic app. Now we have to add our logic app as a subscriber to the event grid, so that every message is broadcast to the logic app and triggers it. Let’s see the steps below:

  • Go to the newly created event grid topic in the Azure portal, click on the “Event Subscription” button at the top of the expanded view and fill in the mandatory details as below.

E18

To find the subscriber endpoint, go back to our logic app. There is an option called “Callback URL”; this is the actual logic app calling endpoint. We need to copy it and paste it into the above form.

e19.jpg

Once we add the logic app as a subscriber, it shows up in the event grid’s subscriber list, as below:

e20.jpg

Great, our integration is done up to this step. Now a little bit of C# code needs to be added at the controller level to convert the employee joining details into a JSON message and send it to the event grid. Let’s see how we can do that.

5) MVC Application & Event Grid Integration:

Here the MVC application works as the message initiator and sends the employee details to the event grid by calling the event grid topic endpoint as a service. Once this message is sent, the event grid broadcasts the same message to its subscribers; our subscriber is the logic app, so the logic app triggers and starts executing all its components, like saving the data in the DB and sending mail to the employee’s email ID.

We can find the event grid endpoint on the event grid details page, as below.

e21.jpg

Look at the MVC controller from which the application calls the event grid and sends the message:

EmployeeController.cs


public class EmployeeController : Controller
{
    // GET: Employee
    public ActionResult Index()
    {
        return View();
    }

    [HttpPost]
    public ActionResult Index(EmployeeModel employeeModel)
    {
        // Shift the joining date to 9 AM and normalise it to UTC
        TimeSpan hours = new System.TimeSpan(0, 9, 0, 0);
        employeeModel.DateOfJoining = employeeModel.DateOfJoining.Add(hours).ToUniversalTime();

        // Wrap the employee details in the event grid message envelope
        EventGridModel evmModel = new EventGridModel();
        evmModel.id = Guid.NewGuid().ToString();
        evmModel.eventType = "NewPost";
        evmModel.subject = "blog/posts";
        evmModel.eventTime = "2017-08-20T23:14:22+1000";
        evmModel.data = employeeModel;

        List<EventGridModel> egmList = new List<EventGridModel>();
        egmList.Add(evmModel);
        SendMessageToEventGrid(egmList);
        return View();
    }

Here the private method “SendMessageToEventGrid” is called; in this method we use the event grid endpoint to post the JSON message. Let’s look at the implementation.

E22.jpg
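The screenshot above shows the actual implementation. As a rough sketch, assuming Newtonsoft.Json is used for serialization and the endpoint/key values are taken from the topic’s page, SendMessageToEventGrid could look like this:

using System.Collections.Generic;
using System.Net.Http;
using System.Text;
using Newtonsoft.Json;

private void SendMessageToEventGrid(List<EventGridModel> eventList)
{
    // Topic endpoint and access key come from the event grid topic's
    // overview/access keys pages (the values below are placeholders)
    var topicEndpoint = "https://<topic-name>.<region>-1.eventgrid.azure.net/api/events";
    var topicKey = "<topic-access-key>";

    using (var client = new HttpClient())
    {
        // Custom topics authenticate publishers through the aeg-sas-key header
        client.DefaultRequestHeaders.Add("aeg-sas-key", topicKey);

        // Event Grid expects an array of events posted to the topic endpoint
        var payload = new StringContent(JsonConvert.SerializeObject(eventList), Encoding.UTF8, "application/json");
        client.PostAsync(topicEndpoint, payload).Result.EnsureSuccessStatusCode();
    }
}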

By now we have set up every component, so let’s see how things work when details are submitted by the MVC application. Once the details are submitted successfully, the message is broadcast to all subscribers by the event grid, and the logic app triggers and sends the calendar invite and email to the employee.

MVC application with employee details:

E23.jpg

Once we press the create button, the message is sent to the event grid and the event grid triggers the logic app. Below is the logic app run status, where we can see all connectors executed successfully (click on the highlighted details and see the green sign on each connector).

We can see the complete process took only 5 seconds to execute, including all connectors.

E24.jpg

Logic app run status in expanded view:

E25.jpg

Below are the email and calendar invite from my inbox, where we can see the email successfully sent by the logic app.

E26

Calendar Invite:

E27

 

I hope you like this. Keep learning and sharing.

 

 

 

 

 

 

Dead Letter Queue (DLQ) Message Handling in Azure Topics & Subscriptions


Download Project: AzureTopic&Subscription

Azure Service Bus queues and topic subscriptions provide an additional sub-queue called the dead-letter queue (DLQ). Expired messages move from the queue or subscription to the dead-letter queue for further investigation and processing.

Dead-letter queues are the safest bet when we are using Azure Service Bus. If for whatever reason a message could not be processed by a receiver, we can move such a message to the dead-letter queue. This would mean we may need manual intervention to determine why the message is not getting processed. A listener for the dead-letter queue would just need to facilitate that by logging the message to some database or a log file, so that a user can take a look at it later.
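For example, a receiver can explicitly dead-letter a message it cannot process. A minimal sketch using the same Microsoft.ServiceBus.Messaging SDK as the samples below (ProcessOrder is a hypothetical handler):

BrokeredMessage message = subscriptionClient.Receive();
try
{
    ProcessOrder(message.GetBody<Order>());
    message.Complete();
}
catch (Exception ex)
{
    // Moves the poison message to the subscription's dead-letter sub-queue
    message.DeadLetter("ProcessingFailed", ex.Message);
}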

No additional effort is required to create the dead-letter sub-queue for a queue or subscription; it is created automatically when the main entity, either an Azure Service Bus queue or a topic/subscription, is created.

t23.jpg

Every topic/subscription has a dead-letter queue whose name ends with $DeadLetterQueue.

Naming Convention of DLQ:

<TopicName>/Subscriptions/<YourSubscriptionName>/$DeadLetterQueue

TopicName: any qualified topic name created in the Azure Service Bus.

SubscriptionName: the subscription name created under the Azure topic.

Let’s assume we have created a topic named “OrderTopic” and a subscription named “EastSubscription” under it; then the dead-letter queue path will be:

OrderTopic/Subscriptions/EastSubscription/$DeadLetterQueue

In C# we can find this path at run time by using the Azure Service Bus SDK; the method “FormatDeadLetterPath” returns the full path of the DLQ.

C# Code


var Dlqpath = SubscriptionClient.FormatDeadLetterPath("OrderTopic", "EastSubscription");

Reading DLQ messages:

DLQ messages can be read by an Azure subscription client that interacts with the dead-letter sub-queue of a specific subscription. “$DeadLetterQueue” needs to be appended to the subscription name when creating the subscription client in order to start reading messages from the sub-queue.


 var DLQPath = "/$DeadLetterQueue";

SubscriptionClient sClient = SubscriptionClient.CreateFromConnectionString(TopicConfigurations.Namespace, topicName, description.Name + DLQPath);

Dead-letter messages can be inspected with Azure Service Bus Explorer. Every subscription name is shown with two numbers (X, X) in Service Bus Explorer: the first part represents the active message count in the subscription and the second part represents the dead-letter message count. We will see this in detail with an example.

 

t25.jpg

Enterprise applications rely on processing every order and message that gets placed in a system. To make sure your application doesn’t miss a message when integrating with service bus, you must be sure to code for DLQs.

Let’s take a look with an example. Assume we have an Azure topic named “OrderTopic” with two subscriptions (EastSubscription & NorthSubscription). A publisher utility continuously publishes order details to the topic, but at the same time the receiver utility is not running or not receiving messages properly for some reason. To ensure reliable message delivery we have set up a separate utility that actively monitors these dead-letter messages and processes them or re-enqueues/redirects them somewhere else.

To test the above use case, we will run only the publisher and the dead-letter message processing utility, so that messages are not processed by any receiver and move to the dead-letter sub-queue, where they are later processed by the DLQ message processing utility.

Let’s capture some information before executing the publisher utility. Below you can see (0,0) shown with each subscription name, which means there are no active messages and no dead-letter messages in the service bus subscriptions.

t26.jpg

Now run the publisher utility, watch 4 order details being published to the Azure topic, and verify the same with Azure Service Bus Explorer.

t27.jpg

In the snapshot below we can see (2,0) with both subscription names, meaning there are 2 messages in the active state and no messages moved to the dead-letter sub-queue. As we have set the message expiry time to 1 minute, after one minute these messages expire and move to the DLQ. Wait 1 minute for these messages to expire.

t28.jpg

After one minute these messages expire, since we have not run the receiver utility, and move to the DLQ; (0,2) is then shown with both subscription names, meaning there are no active messages and 2 messages in the dead-letter queue. See the snapshot below, taken after 1 minute.

t29.jpg

Now execute the dead-letter processing utility, which reads the DLQ messages and processes them.

t30.jpg

Once all messages have been read from the DLQ by the utility, the subscriptions’ dead-letter sub-queues are cleared and the dead-letter message counts reset to 0.

t31.jpg

All the code together for the DLQ message processor:


using System;
using Common;
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;

namespace DeadLetterQueueProcessor
{
    class DlqMessageProcessor
    {
        static NamespaceManager nameSpaceManager;

        static void Main(string[] args)
        {
            nameSpaceManager = NamespaceManager.CreateFromConnectionString(TopicConfigurations.Namespace);
            ReadDLQMessages("OrderTopic");
            Console.ReadLine();
        }

        public static void ReadDLQMessages(String topicName)
        {
            // Path suffix of the dead-letter queue; every subscription has one
            var DLQPath = "/$DeadLetterQueue";

            foreach (SubscriptionDescription description in nameSpaceManager.GetSubscriptions(topicName))
            {
                var Dlqpath = SubscriptionClient.FormatDeadLetterPath(topicName, description.Name);

                Console.ForegroundColor = ConsoleColor.Yellow;
                Console.WriteLine("=======================================");
                Console.WriteLine("---Order Receiving From DeadLetter Queue [" + Dlqpath + "]---");
                Console.WriteLine("=======================================");

                // The subscription client is created against the dead-letter queue path
                SubscriptionClient sClient = SubscriptionClient.CreateFromConnectionString(TopicConfigurations.Namespace, topicName, description.Name + DLQPath);
                while (true)
                {
                    BrokeredMessage bMessage = sClient.Receive(TimeSpan.FromSeconds(1));
                    if (bMessage != null)
                    {
                        Order order = bMessage.GetBody<Order>();
                        Console.ForegroundColor = ConsoleColor.Red;
                        Console.Write(" Request Received, ProductName: {0},Zone : {1},CustomerName: {2},DeliveryAddress: {3} \n\n",
                            order.ProductName, order.Zone, order.CustomerName, order.DeliveryAddress);

                        Console.ForegroundColor = ConsoleColor.Yellow;

                        bMessage.Complete();
                    }
                    else
                    {
                        break;
                    }
                }

                sClient.Close();
            }
        }
    }
}

Sample project structure

t32.jpg

When working with the Azure Service Bus, knowing about the dead-letter queue can save you a lot of time, and it is best practice to handle dead-letter queues in enterprise integration projects.

Read more about Azure Service Bus Explorer here.

 

 

Filtered Subscriptions with Azure Service Bus Topics


Download Complete Project: AzureTopicSubscriptionfilters

Azure Service Bus topics allow multiple subscribers to receive the same messages. So if we post a message to a topic, one subscriber might send an order confirmation email, while another subscriber to the same event might handle payments.

The way this is done is that you create multiple “subscriptions” on the topic, and then you can listen for messages on those subscriptions, in the same way that you would listen for messages on a regular queue.

But what if your subscription is only interested in a subset of messages that are posted to that topic? Well, this is where filters come in. When we create a subscription, we can specify the properties of the messages we are interested in.

What is a Rule Filter?

As the filter’s type implies, it allows defining a SQL-92 standard expression in its constructor that governs which messages are filtered out.

There are the following types of filters:

  1. SqlFilter – the filter that a number of other filters derive from, such as TrueFilter and FalseFilter.
  2. TrueFilter – this is the default filter, provided through the default rule that is generated for us when a rule or filter is not explicitly provided when creating a subscription. Ultimately, this generates the SQL-92 expression 1=1 and subscribes to all messages of the associated topic.
  3. FalseFilter – the antithesis of a TrueFilter, generating a SQL-92 expression of 1=0; a subscription with this filter will never subscribe to any messages of the associated topic.
  4. CorrelationFilter – this filter subscribes the subscription to all messages with a particular CorrelationId property (see the sketch after this list).
    Note: be aware that the comparison values in your SQL expression are case-sensitive, while the property names are not (e.g. “Zone= ‘East’” is not the same as “zone= ‘east’”).
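As an illustration of the fourth type, a CorrelationFilter rule can be attached in the same way a SqlFilter is attached later in this post (the rule name and correlation id here are made up):

// The subscription will only receive messages whose CorrelationId is "east-orders"
RuleDescription corrRule = new RuleDescription("EastCorrelationRule", new CorrelationFilter("east-orders"));
nameSpaceManager.CreateSubscription(new SubscriptionDescription("OrderTopic", "EastSubscription"), corrRule);

The publisher would then set brokerMessage.CorrelationId = "east-orders"; before sending.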

In an earlier post we saw how default rules work and how the same message is broadcast to all subscriptions. Now we will see how custom rules work and how specific messages are received by a subscriber through custom filtering rules.

Example:

Let’s consider an order processing system where orders are published to a topic and some region-wise subscriptions are set up on the topic. Instead of broadcasting all orders to all subscriptions, only region-specific orders are broadcast to the matching subscription.

In our case orders belong to two regions (East and North) and two subscriptions are set up to subscribe to these messages, i.e. EastSubscription receives only East region orders and NorthSubscription receives North region orders.

T18

Adding rules to the subscriptions:

Subscriptions are not limited to one rule, however; we can add additional rules to an existing subscription.


public static void CreateTopicUnderServiceBus()
{
    var TopicName = "OrderTopic";
    string[] arrSubscription = new string[] { "NorthSubscription", "EastSubscription" };

    // Create the topic if it does not exist yet
    if (!nameSpaceManager.TopicExists(TopicName))
    {
        nameSpaceManager.CreateTopic(TopicName);
    }

    // Create the subscriptions
    foreach (string subscription in arrSubscription)
    {
        if (!nameSpaceManager.SubscriptionExists(TopicName, subscription))
        {
            SubscriptionDescription subscriptionDesc = new SubscriptionDescription(TopicName, subscription)
            {
                EnableDeadLetteringOnMessageExpiration = true,
                EnableDeadLetteringOnFilterEvaluationExceptions = true,
                DefaultMessageTimeToLive = TimeSpan.FromMinutes(5),
                LockDuration = TimeSpan.FromSeconds(60),
            };
            var zone = subscription.Replace("Subscription", "");

            // Rule so that each subscription receives only the messages of its own zone
            RuleDescription ruleDescForSubscriptions = new RuleDescription("ZoneFilter", new SqlFilter("Zone='" + zone + "'"));

            nameSpaceManager.CreateSubscription(subscriptionDesc, ruleDescForSubscriptions);
        }
    }
}

Here the RuleDescription class is used to create a filter rule. The above code adds two filter rules to the subscriptions: “Zone='East'” and “Zone='North'”.

Along with creating the filter rules, an additional property also needs to be sent with the brokered message.


public static void PublishOrder(TopicClient topicClient, Order order)
{
    String displayText = string.Format("Order : orderId={0},customerName={1},ProductName={2},DeliveryAddress={3},Zone={4} " +
        " Sent To Topic Successfully \n\n",
        order.OrderId, order.CustomerName, order.ProductName, order.DeliveryAddress, order.Zone);

    // Create the brokered message for the order and stamp the Zone property
    // that the subscription filter rules match against
    BrokeredMessage brokerMessage = new BrokeredMessage(order);
    brokerMessage.Properties.Add("Zone", order.Zone);
    topicClient.Send(brokerMessage);
    Console.Write(displayText);
}

See the “Zone” property set on the brokered message.

Complete Code For Message Publisher:


using Common;
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;
using System;
using System.Threading;

namespace MessageSender
{
    class Publisher
    {
        static NamespaceManager nameSpaceManager;

        static void Main()
        {
            nameSpaceManager = NamespaceManager.CreateFromConnectionString(TopicConfigurations.Namespace);
            CreateTopicUnderServiceBus();

            TopicClient tClient = TopicClient.CreateFromConnectionString(TopicConfigurations.Namespace, "OrderTopic");

            Console.ForegroundColor = ConsoleColor.Yellow;
            Console.WriteLine("======================================================");
            Console.WriteLine("-----------Publishing Start---------------------------");
            Console.WriteLine("======================================================");

            Console.ForegroundColor = ConsoleColor.Green;
            PublishOrder(tClient, new Order()
            {
                OrderId = 5656,
                CustomerName = "Vivek Jadon",
                ProductName = "Iphone6",
                DeliveryAddress = "MG Road Gurgaon",
                Zone = "East"
            });
            PublishOrder(tClient, new Order()
            {
                OrderId = 5657,
                CustomerName = "Rakesh",
                ProductName = "Samsung S8",
                DeliveryAddress = "12/3 Ring Road Delhi",
                Zone = "East",
            });
            PublishOrder(tClient, new Order()
            {
                OrderId = 5658,
                CustomerName = "Prakash Nayal",
                ProductName = "OnePlus 5T",
                DeliveryAddress = "12/3 Ring Road Delhi",
                Zone = "North",
            });
            PublishOrder(tClient, new Order()
            {
                OrderId = 5659,
                CustomerName = "Gautom Anand",
                ProductName = "Samsung S8",
                DeliveryAddress = "12/3 Ring Road Delhi",
                Zone = "North",
            });

            Console.ForegroundColor = ConsoleColor.Yellow;
            Console.WriteLine("-----------Publishing End---------------------------");
            Console.ReadLine();
        }

        public static void CreateTopicUnderServiceBus()
        {
            var TopicName = "OrderTopic";
            string[] arrSubscription = new string[] { "NorthSubscription", "EastSubscription" };

            // Create the topic if it does not exist yet
            if (!nameSpaceManager.TopicExists(TopicName))
            {
                nameSpaceManager.CreateTopic(TopicName);
            }

            // Create the subscriptions
            foreach (string subscription in arrSubscription)
            {
                if (!nameSpaceManager.SubscriptionExists(TopicName, subscription))
                {
                    SubscriptionDescription subscriptionDesc = new SubscriptionDescription(TopicName, subscription)
                    {
                        EnableDeadLetteringOnMessageExpiration = true,
                        EnableDeadLetteringOnFilterEvaluationExceptions = true,
                        DefaultMessageTimeToLive = TimeSpan.FromMinutes(5),
                        LockDuration = TimeSpan.FromSeconds(60),
                    };
                    var zone = subscription.Replace("Subscription", "");

                    // Rule so that each subscription receives only the messages of its own zone
                    RuleDescription ruleDescForSubscriptions = new RuleDescription("ZoneFilter", new SqlFilter("Zone='" + zone + "'"));

                    nameSpaceManager.CreateSubscription(subscriptionDesc, ruleDescForSubscriptions);
                }
            }
        }

        public static void PublishOrder(TopicClient topicClient, Order order)
        {
            String displayText = string.Format("Order : orderId={0},customerName={1},ProductName={2},DeliveryAddress={3},Zone={4} " +
                " Sent To Topic Successfully \n\n",
                order.OrderId, order.CustomerName, order.ProductName, order.DeliveryAddress, order.Zone);

            // Create the brokered message for the order and stamp the Zone property
            BrokeredMessage brokerMessage = new BrokeredMessage(order);
            brokerMessage.Properties.Add("Zone", order.Zone);
            topicClient.Send(brokerMessage);
            Console.Write(displayText);
        }
    }
}

Complete code for message receiver:


using Common;
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;
using System;

namespace MessageReciever
{
    class Subscriber
    {
        static NamespaceManager nameSpaceManager;

        static void Main()
        {
            nameSpaceManager = NamespaceManager.CreateFromConnectionString(TopicConfigurations.Namespace);
            ReadMessageFromSubscription("OrderTopic");
            Console.ReadLine();
        }

        public static void ReadMessageFromSubscription(string TopicName)
        {
            foreach (SubscriptionDescription description in nameSpaceManager.GetSubscriptions(TopicName))
            {
                SubscriptionClient sClient = SubscriptionClient.CreateFromConnectionString(TopicConfigurations.Namespace, "OrderTopic", description.Name);

                Console.ForegroundColor = ConsoleColor.Yellow;
                Console.WriteLine("=======================================");
                Console.WriteLine("---Order Receiving From [" + description.Name + "]---");
                Console.WriteLine("=======================================");
                while (true)
                {
                    BrokeredMessage bMessage = sClient.Receive(TimeSpan.FromSeconds(2));

                    if (bMessage != null)
                    {
                        Order order = bMessage.GetBody<Order>();
                        Console.ForegroundColor = ConsoleColor.Green;
                        Console.Write(" Request Received, ProductName: {0},Zone : {1},CustomerName: {2},DeliveryAddress: {3} \n\n",
                            order.ProductName, order.Zone, order.CustomerName, order.DeliveryAddress);

                        Console.ForegroundColor = ConsoleColor.Yellow;

                        bMessage.Complete();
                    }
                    else
                    {
                        break;
                    }
                }
                sClient.Close();
            }
        }
    }
}

Once the messages are sent to the topic, the subscriptions should start showing the appropriate message counts. In our case, if we send messages with East and North zones, then as per the rules set, EastSubscription should receive only the 2 East region messages and NorthSubscription should receive the 2 North region messages.

Let’s run the complete application and then talk about the output.

Here we can see 4 orders published to the order topic: 2 orders belong to the East region and 2 orders to the North region. The important thing is that this time all 4 messages were not broadcast to both subscriptions, because we created filter rules for the two regions.

t19

Take a look at the Azure portal; here we can see each subscription received 2 messages, according to its filter rule.

t20.jpg

Now see the topic & subscription details with Azure Service Bus Explorer. We can see a custom filter named “ZoneFilter” has been created in both subscriptions with the filter rule.

t21.jpg

Of course, you can get away without using filters at all if you just set up plain subscriptions and only respond to messages of interest in your handlers. But using filters will reduce network traffic and save the unnecessary work of processing messages that aren’t of interest.

Read more about Service Bus Explorer: Link

Microsoft Azure: Service Bus Topic & Subscription With Default Filtering Rule.


Download complete project: AzureTopicSubscription

In contrast to queues, in which each message is processed by a single consumer, topics and subscriptions provide a one-to-many form of communication, following the publish/subscribe pattern.

Useful for scaling to very large numbers of recipients: each published message is made available to each subscription registered with the topic. Messages are sent to a topic (and delivered to one or more associated subscriptions) and received from subscriptions. Filter rules can also be set on a per-subscription basis.

t1

Default Filters in a Topic:

When subscriptions are created on the topic with no filter defined for them, they subscribe to all messages that are sent to the topic.

The TrueFilter is the default filter, provided through the default rule that is generated for us when a rule or filter is not explicitly provided when creating a subscription. Ultimately, this generates the SQL-92 expression 1=1 and subscribes to all messages of the associated topic.

Default Rule Name: “$Default”

Default Filter Name: “TrueFilter”

Default SQL Expression: “1=1”
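We can confirm the default rule from C# as well; a small sketch using the same NamespaceManager as in the samples below:

// For a subscription created without an explicit filter this prints "$Default"
foreach (RuleDescription rule in nameSpaceManager.GetRules("OrderTopic", "EastSubscription"))
{
    Console.WriteLine(rule.Name);
}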

There are multiple ways to create a topic and subscription in Azure Service Bus:

  1. Azure Portal
  2. PowerShell
  3. Language-specific SDK, e.g. the C# Azure SDK

1. Create Topic & Subscription Through Azure Portal

Log in to the Azure portal and choose the “New” option from the left menu.

t2

Once you navigate to the service bus pane, fill in all the required information: the service bus name (which should be globally unique), an appropriate price tier, the resource group (new or existing) and the location.

t3

Once you press the create button, your newly created service bus appears in your resource list and you can go inside it for further configuration.

For this demonstration I have created a service bus named “Rakesh-ServiceBus”. As we can see below, there are two options available: one for Queue and another for Topic.

t4

Click on the “Topic” option in the top menu and enter a few details, like the topic name, the topic max size (the default is 1 GB), the message time-to-live in days, etc.

t5

2. Create Topic & Subscription Through the C# Azure SDK:

Let’s say you have already created a service bus in your Azure account and now you have to create topics and subscriptions through C#.

Here’s a simple example of how to achieve this with C# programming.

Creating Topics:

Topic creation is straightforward; below is the code to create one.


nameSpaceManager = NamespaceManager.CreateFromConnectionString(TopicConfigurations.Namespace);

var TopicName = "OrderTopic";
// Create the topic only if it does not already exist
if (!nameSpaceManager.TopicExists(TopicName))
{
    nameSpaceManager.CreateTopic(TopicName);
}

Creating Subscriptions:

Here two subscriptions (NorthSubscription and EastSubscription) are created under the topic named “OrderTopic”.


// Create the subscriptions
if (!nameSpaceManager.SubscriptionExists(TopicName, "NorthSubscription"))
{
    nameSpaceManager.CreateSubscription(TopicName, "NorthSubscription");
}
if (!nameSpaceManager.SubscriptionExists(topicPath: TopicName, name: "EastSubscription"))
{
    nameSpaceManager.CreateSubscription(TopicName, "EastSubscription");
}

Deleting a Topic:


var TopicName = "OrderTopic";
// Delete the topic if it exists
if (nameSpaceManager.TopicExists(TopicName))
{
    nameSpaceManager.DeleteTopic(TopicName);
}

Deleting a Subscription:


// Delete a subscription
if (nameSpaceManager.SubscriptionExists(TopicName, "NorthSubscription"))
{
    nameSpaceManager.DeleteSubscription(TopicName, "NorthSubscription");
}

Putting it all together with the publisher scenario:

We take a simple order processing example where a publisher sends order details to a specific topic and multiple subscribers (receivers) read the order details from the topic.

As I already mentioned, the default filter 1=1 is applied to the subscriptions, so the same message will be sent to both subscriptions because we have not defined any specific filter (we will see this in detail in the next article).

t6

Sending Order Details to Topic:

A console application named “MessageSender” is created to send the order details to the Azure topic. Below is the code to send messages. I will show the code in parts, but you can download the complete sample project.


using Common;
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;
using System;
using System.Threading;

namespace MessageSender
{
    class Publisher
    {
        static NamespaceManager nameSpaceManager;

        static void Main()
        {
            Thread.Sleep(1000);
            nameSpaceManager = NamespaceManager.CreateFromConnectionString(TopicConfigurations.Namespace);
            CreateTopicUnderServiceBus();

            TopicClient tClient = TopicClient.CreateFromConnectionString(TopicConfigurations.Namespace, "OrderTopic");

            Console.ForegroundColor = ConsoleColor.Yellow;
            Console.WriteLine("======================================================");
            Console.WriteLine("-----------Publishing Start---------------------------");
            Console.WriteLine("======================================================");

            Console.ForegroundColor = ConsoleColor.Green;
            PublishOrder(tClient, new Order()
            {
                OrderId = 5656,
                CustomerName = "Vivek Jadon",
                ProductName = "Iphone6",
                DeliveryAddress = "MG Road Gurgaon",
                Zone = "East"
            });
            PublishOrder(tClient, new Order()
            {
                OrderId = 5657,
                CustomerName = "Rakesh",
                ProductName = "Samsung S8",
                DeliveryAddress = "12/3 Ring Road Delhi",
                Zone = "East",
            });
            PublishOrder(tClient, new Order()
            {
                OrderId = 5658,
                CustomerName = "Prakash Nayal",
                ProductName = "OnePlus 5T",
                DeliveryAddress = "12/3 Ring Road Delhi",
                Zone = "North",
            });
            PublishOrder(tClient, new Order()
            {
                OrderId = 5659,
                CustomerName = "Gautom Anand",
                ProductName = "Samsung S8",
                DeliveryAddress = "12/3 Ring Road Delhi",
                Zone = "North",
            });

            Console.ForegroundColor = ConsoleColor.Yellow;
            Console.WriteLine("-----------Publishing End---------------------------");
            Console.ReadLine();
        }

        public static void CreateTopicUnderServiceBus()
        {
            var TopicName = "OrderTopic";
            string[] arrSubscription = new string[] { "NorthSubscription", "EastSubscription" };

            // Create the topic if it does not exist yet
            if (!nameSpaceManager.TopicExists(TopicName))
            {
                nameSpaceManager.CreateTopic(TopicName);
            }

            // Create the subscriptions (no explicit rule, so the default 1=1 rule applies)
            foreach (string subscription in arrSubscription)
            {
                if (!nameSpaceManager.SubscriptionExists(TopicName, subscription))
                {
                    SubscriptionDescription subscriptionDesc = new SubscriptionDescription(TopicName, subscription)
                    {
                        EnableDeadLetteringOnMessageExpiration = true,
                        EnableDeadLetteringOnFilterEvaluationExceptions = true,
                        DefaultMessageTimeToLive = TimeSpan.FromMinutes(5),
                        LockDuration = TimeSpan.FromSeconds(30)
                    };

                    nameSpaceManager.CreateSubscription(subscriptionDesc);
                }
            }
        }

        public static void PublishOrder(TopicClient topicClient, Order order)
        {
            String displayText = string.Format("Order : orderId={0},customerName={1},ProductName={2},DeliveryAddress={3},Zone={4} " +
                " Sent To Topic Successfully \n\n",
                order.OrderId, order.CustomerName, order.ProductName, order.DeliveryAddress, order.Zone);

            // Create the brokered message for the order
            BrokeredMessage brokerMessage = new BrokeredMessage(order);
            topicClient.Send(brokerMessage);
            Console.Write(displayText);
        }
    }
}

Let’s run only the message sender console application (not executing any message receiving application) and see what resources and activity show up on the Azure portal.

A few things captured before executing the sender application: you can see there is no topic or subscription created in the service bus “Rakesh-ServiceBus”.

T10.jpg

Now execute the message sender application and see what happens.

  • As you can see in the snapshot below, the order details are successfully published to the Azure topic.

t11.jpg

  • Go to the Azure portal and you will find a topic named “OrderTopic” with two subscriptions, “EastSubscription” and “NorthSubscription”, created successfully. Notice also that both subscriptions received all 4 order details, because we created the subscriptions without any filter rules; if you do not create any rule, the default rule (1=1) is applied.

t12.jpg

Let’s look more in depth using Azure Service Bus Explorer. Service Bus Explorer provides a richer user interface than the Azure portal for inspecting messages: we can see whether a message is in the active state or in the dead-letter queue, and create and modify filter rules.

T17.jpg

 

Subscribing to messages from the topic:

Subscribers are never directly connected to Azure topics. Any subscriber that wants to receive messages from a topic first has to subscribe to a subscription under that topic and read the messages from that subscription.

Putting it all together with the subscriber scenario:

Again we take a console application and install the necessary NuGet packages to support the Azure Service Bus SDK.


using Common;
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;
using System;

namespace MessageReciever
{
    class Subscriber
    {
        static NamespaceManager nameSpaceManager;

        static void Main()
        {
            nameSpaceManager = NamespaceManager.CreateFromConnectionString(TopicConfigurations.Namespace);
            ReadMessageFromSubscription("OrderTopic");
        }

        public static void ReadMessageFromSubscription(string TopicName)
        {
            Console.ForegroundColor = ConsoleColor.Yellow;
            Console.WriteLine("======================================================");
            Console.WriteLine("-----------Order Receiving Start---------------------------");
            Console.WriteLine("======================================================");
            foreach (SubscriptionDescription description in nameSpaceManager.GetSubscriptions(TopicName))
            {
                SubscriptionClient sClient = SubscriptionClient.CreateFromConnectionString(TopicConfigurations.Namespace, "OrderTopic", description.Name);
                while (true)
                {
                    // Wait briefly for the next message; a null result means the
                    // subscription is drained and we move on to the next one
                    BrokeredMessage bMessage = sClient.Receive(TimeSpan.FromSeconds(2));
                    if (bMessage == null)
                    {
                        break;
                    }
                    Order order = bMessage.GetBody<Order>();
                    Console.ForegroundColor = ConsoleColor.Green;
                    Console.Write(" Request Received, ProductName: {0},Zone : {1},CustomerName: {2},DeliveryAddress: {3} \n\n",
                        order.ProductName, order.Zone, order.CustomerName, order.DeliveryAddress);

                    Console.ForegroundColor = ConsoleColor.Yellow;
                    bMessage.Complete();
                }
                sClient.Close();
            }
            Console.WriteLine("-----------Order Receiving End---------------------------");
        }
    }
}

Now execute the sender and receiver applications together and you will find that the sender starts publishing messages to the topic and the receiver application starts reading those messages from both subscriptions.

T15.jpg

 

We’ve learned that two of the most powerful features of topics and subscriptions are the ability to distribute messages to multiple interested parties (subscriptions) and the ability of those parties to filter out the messages they are specifically interested in. There is still a good bit to cover on the topic of Azure Service Bus.

A full code example is attached; you can find the link at the top of this post.

 

 

Securing Sensitive App Settings Using Azure Key Vault.

Download Complete Project: WebApiWithAzureKeyVault

Why Azure Key Vault?

Almost every Azure app has some kind of cryptographic key, storage account key, sensitive setting, password, or connection string.

For example, consider a web app that requires a connection string to an Azure SQL Database. Storing this sensitive information in an App.config file could result in it being checked in to a source-code control system and unintentionally exposed to many developers.

Compare this to using Azure Key Vault, where the App.config file only contains a reference to this sensitive data, and is controlled by the access policy of Azure Key Vault.

Below is the insecure approach that is commonly used in Azure-based solutions:

A1

You can see here that all of the secret information sits in the Web.config file in plain text; if someone gained access to the server, they could easily steal all of this sensitive information. Usually these configuration files are also checked in to repository systems such as TFS or GitHub along with the other project files, so anyone with access to those repositories can also see the secrets.

By using Key Vault you can securely store data and avoid having these sensitive pieces of information stored in source code which may then be compromised.

The Microsoft Azure cloud platform provides a secure secrets management service, Azure Key Vault, to store sensitive information. It is a multi-tenant service for developers to store and use sensitive data for their application in Azure.

The Azure Key Vault service can store three types of items: secrets, keys, and certificates.

  • Secrets are any sequence of bytes under 10 KB, such as connection strings, account keys, or the passwords for PFX files (private key files). An authorized application can retrieve a secret for use in its operation.
  • Keys are cryptographic material, either imported into Key Vault or generated when a service requests Key Vault to do so. An authorized cloud service can ask Key Vault to perform one or more cryptographic operations with a key on its behalf.
  • An Azure Key Vault certificate is simply a managed X.509 certificate. What’s different is that Azure Key Vault offers life-cycle management capabilities. Like keys, a service can request Azure Key Vault to create a certificate. When Azure Key Vault creates the certificate, it creates a related private key and password: the password is stored as a secret while the private key is stored as a key. Certificates can roll over before they expire, with notifications before these operations happen.

Application flow with key vault

A30

Steps Required:

  1. Create A Key Vault
  2. Create a Secret
  3. Register an App in Azure Active Directory
  4. Create an API Key for the App
  5. Give App-Specific Permissions to Access Key Vault
  6. Configure your Dot Net Application

1. Create a key vault

Log in to the Azure portal and add the new service ‘Key vault’. If ‘Key vaults’ is not already in your list, click on ‘More services’ and use the filter to find it. Select ‘Key vaults’, fill in all the mandatory information and press the ‘Create’ button.

A2.JPG

A3

2. Create Secrets:

To do this, click on ‘Secrets’ under ‘Settings’ on the left, or under ‘Assets’ in the Overview panel. Once the ‘Secrets’ panel opens up, click the ‘Add’ button at the top so you can create a new one.

Activation and expiry dates can be used if you only want the secret to be accessed for a specific period of time. When you are finished, click ‘Create’.

A4

The key vault DNS name will be used as the Key Vault URL in the application; it is the address to which the application sends its requests for keys and secrets.

A15.jpg

3.Register An App In Azure Active Directory

Now you have data protected by Key Vault; the next step is to give your application secure access to this data.

Again go to the Azure portal and search for “Azure Active Directory”. Inside AD, select ‘App registrations’ under the ‘Manage’ panel on the left. This is where you will configure the access and permissions your application will have when accessing Key Vault programmatically.

A5

In my case I have already created a Web API app named “DubaiProperties-Api”, which runs under an Azure App Service, and I have registered that same application in Azure Active Directory so that it can read secure keys/secrets from the key vault.

A7

4. Create An Api Key For Registered App

From the ‘App registrations’ menu, you should see your newly created app listed.

A8.JPG

Click on the registered application and copy the ‘Application ID’ that you should be able to see under ‘Essentials’.

A9.JPG

Select ‘Keys’ from the ‘API Access’ section on the right. Give the key a meaningful description that explains its purpose, then set an expiration setting. Click ‘Save’ and your API key ‘Value’ will be presented to you. Copy this key value now, because once you navigate away it will never be shown again.

You can always create a new one if you forget to copy it.

A10

A11

5.Give App-Specific Permissions to Access Key Vault

Return to the key vault and select ‘Access policies’ under the ‘Settings’ panel. Click the ‘Add new’ button, then click the ‘Select principal’ option to open a new blade. Enter the Application ID of the app registered in Azure AD into this field, and select the app when it is presented to you. Click the ‘Select’ button at the bottom to confirm. You can now configure the permissions that you wish to grant the application.

A13.jpg

Only assign the necessary permissions. As it is only Secrets that your app needs access to (and read-only access at that), I would suggest picking ‘Get’ and ‘List’ under the ‘Secret permissions’ option. This is all you need to do, so click ‘OK’ to complete this step.

A14

Now the key vault configuration is ready: it stores the secret keys and can be referenced by any application.

6.Configure your Dot Net Application

All of the Key Vault and Active Directory administration tasks are now complete; next you need to set up the .NET application to consume its secret keys from Key Vault instead of defining them in App.config or somewhere else in the application.

Let’s start with the Web API project that needs to use some secrets stored in the key vault.

Initially there are a few things that need to be configured in your application: the application ID, the Key Vault URL and the app registration key. All of this information was captured above during the application registration in AD and the key vault creation.

Below are the settings for Web.config:

A16.jpg
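Since the screenshot is the only record of these settings, here is a minimal sketch of what the appSettings section might look like; the key names (ClientId, ClientSecret, KeyVaultUrl) are assumptions and should match whatever names your helper code reads:

<appSettings>
  <!-- Assumed key names; align them with what your helper class reads. -->
  <add key="ClientId" value="application-id-from-azure-ad" />
  <add key="ClientSecret" value="api-key-from-app-registration" />
  <add key="KeyVaultUrl" value="https://your-vault-name.vault.azure.net/" />
</appSettings>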

Now add the NuGet packages for Azure Key Vault to the application.

A17.jpg

Create a helper class that interacts with Azure Key Vault through the Azure SDK and fetches all the required secret keys for use in the application.

A18.jpg
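The helper itself is only shown as an image, so below is a minimal sketch of such a class. It assumes the Microsoft.Azure.KeyVault and Microsoft.IdentityModel.Clients.ActiveDirectory (ADAL) packages and the appSettings names suggested earlier; treat it as an outline rather than the exact code from the attached project:

using System.Configuration;
using System.Threading.Tasks;
using Microsoft.Azure.KeyVault;
using Microsoft.IdentityModel.Clients.ActiveDirectory;

public static class KeyVaultHelper
{
    // Fetches a secret's current value from the vault by name.
    public static async Task<string> GetSecretAsync(string secretName)
    {
        var client = new KeyVaultClient(
            new KeyVaultClient.AuthenticationCallback(GetAccessTokenAsync));
        var secret = await client.GetSecretAsync(
            ConfigurationManager.AppSettings["KeyVaultUrl"], secretName);
        return secret.Value;
    }

    // Authenticates against Azure AD using the registered app's ID and API key.
    private static async Task<string> GetAccessTokenAsync(
        string authority, string resource, string scope)
    {
        var credential = new ClientCredential(
            ConfigurationManager.AppSettings["ClientId"],
            ConfigurationManager.AppSettings["ClientSecret"]);
        var context = new AuthenticationContext(authority);
        AuthenticationResult result = await context.AcquireTokenAsync(resource, credential);
        return result.AccessToken;
    }
}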

Now use this helper class in our Web API controller.

A19.jpg
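In the same hedged spirit, the controller action behind the screenshot might look roughly like this; the route and property names are assumptions, while “DemoSecretKey” matches the secret used later in this post:

using System.Threading.Tasks;
using System.Web.Http;

public class KeyVaultController : ApiController
{
    // GET api/keyvault
    public async Task<IHttpActionResult> Get()
    {
        // "DemoSecretKey" is the secret created in the portal earlier.
        string secretValue = await KeyVaultHelper.GetSecretAsync("DemoSecretKey");
        return Ok(new { SecretName = "DemoSecretKey", SecretValue = secretValue });
    }
}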

Now publish the Web API project to the Azure App Service. The Web API application should work after deployment.

A21.jpg

Now check the full Swagger URL for the deployed API and see all of the API's controllers with all their HTTP verbs.

A22

Expand the Get method of the key vault controller and make a request to read the secret key values from Azure Key Vault.

A23.jpg

Below is the response of the Web API with the key vault values.

A24.jpg

If you analyse the whole codebase, you will not find any secret keys configured in the application configuration files or in the application settings section of the Azure App Service. The key values come directly from the key vault, which is a separate location altogether.

If you add a new version of the same key in the key vault, you do not need to make any changes in your application: fetching a secret by name, without specifying a version, always returns the latest version.

Let’s create a new version of the same key with a different value. Go back to the secrets section under the key vault ‘Settings’ pane, where you will find all the defined keys.

A25.jpg

Click on the key for which you want to create a new version with a new value; choose “DemoSecretKey” to update its value. Once you click on it, you will see all versions of the selected key. Currently a single version exists.

You never see the values of the secret keys; they are hidden from everyone.

A26.jpg

Click on “New Version” and select ‘Manual’ from the drop-down. Enter the new secret value for the key and save it.

A27.jpg

Now you can see the new version added with the updated value; the previous version is also retained by Azure Key Vault.

A28.jpg

Let’s test our Web API: it should read the new, updated value from the key vault. The important thing is that I have not made any changes to the deployed Web API.

A29.jpg

That’s it! This configuration should enable you to protect your sensitive information in Key Vault and then provide a .NET application with secure access to that data.

Summary

The Azure Key Vault is an excellent service and a welcome addition to the overall Azure services family. It promotes the secure management of cryptographic keys without the associated overhead, which is an important step towards adopting and implementing better security within our applications. In a future article, we’ll look at using the .NET SDK to create and manage cryptographic keys as well.