Power Platform Fundamentals PL-900 Certification exam

In general, any certification gives individuals practical exposure across the areas needed to become a proficient worker. Certified professionals also build more relevant professional networks that help them set career goals. Since last year I have concentrated on the different Azure certifications, and this year I have a target to complete certifications in other areas too. As a start for this year, I took the PL-900 exam today, and I would say it is one of the easiest exams if you have prior experience building mobile applications and a general idea of how to solve business problems.

Microsoft’s Power Platform

Microsoft’s Power Platform is a low-code platform (an environment with graphical user interfaces rather than traditional scripts and programming languages) powered by Microsoft Azure (the cloud computing platform), enabling organisations to analyse data from multiple sources, act on it through custom applications and automate business processes. The Power Platform contains four key components (Power Apps, Power BI, Power Virtual Agents, and Power Automate) and integrates with two main ecosystems (Office 365 and Dynamics 365).

PL-900: Microsoft Power Platform Fundamentals

Last November the new PL-900: Microsoft Power Platform Fundamentals exam was released in beta, and it recently became generally available.

After reading the “Skills Measured” section, I realized that it covered a lot of what I have implemented in the past as a developer, and I had some experience with Power BI as well.

Based on the content, I was confident enough to take the exam, but I wanted to make sure I was completely prepared for the topics I was not familiar with, so I did some learning on Dynamics 365 and the Common Data Service (CDS) on Microsoft Learn.

Exam Materials and Tips:

You only need to spend 3-4 hours reviewing the Power Platform Fundamentals learning path if you are already familiar with the topics. It’s a pretty straightforward exam; if you read the course on Microsoft Learn, you should be good to go. If you are aware of all the components of the Power Platform, you can probably take the exam as is.

Link to Exam: Here
Released: 4th November 2019 (GA 18/02/2020)
MS Learn: Modules available

I would request anyone preparing for PL-900 to have a good grasp of the following subjects:

  • Common Data Service
  • Difference between Power BI Desktop and Power BI Service
  • Power Apps Portals
  • AI Builder Models
  • Difference between Business Rules and Business Process Flows
  • How Power Platform works with Dynamics 365

It is not a hard exam if you come from a developer background; even so, get well prepared for this one! As always, get hands-on. Let’s be citizen developers together. Cheers!

Corona Escape – Game with Python and Cosmosdb

I have been gaming since 2003. I remember those sleepless nights and how much fun I had playing PC games. I always wanted to be a game designer from my childhood days, and I built a lot of small games during my university days. After a very long time, I invested some time and built a simple game using Python and Azure Cosmos DB. In this blog post I want to walk through how the game “Corona Escape” is built.


Pre-requisites:

  • Python 3 Installed
  • VScode or Pycharm
  • Azure Subscription

Game Structure :

The coronavirus is fairly new and has taken the world by shock. It has been two months since the outbreak started, and it has shown that it is not as deadly as the SARS virus. “Corona Escape” is built using Pygame, a library for beginners to cut their teeth on: it makes it easy to get comfortable with programming and the process of game development, and to feel successful while making games. It is also a great rapid prototyping tool. The game is very similar to any jump game: the idea is to escape from the virus for as long as you can; the user is provided with a capsule to move faster and a mask to escape from the virus. I will not go into detail on the logic, as the source code is published here.
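To make the jump mechanic concrete, here is a minimal sketch of the kind of per-frame physics update a runner/jump game like this uses. The names and constants below are illustrative assumptions, not taken from the actual game source:

```python
# Minimal sketch of a jump-game physics step (illustrative, not the real game code).
GRAVITY = 1          # downward acceleration applied each frame
JUMP_VELOCITY = -12  # negative = moving up in screen coordinates
GROUND_Y = 300       # assumed y-coordinate of the ground line

def update_player(y, velocity, jumping):
    """Advance the player one frame; returns the new (y, velocity, jumping)."""
    velocity += GRAVITY
    y += velocity
    if y >= GROUND_Y:  # landed back on the ground
        y, velocity, jumping = GROUND_Y, 0, False
    return y, velocity, jumping

# Start a jump from the ground and simulate a few frames.
y, v, jumping = GROUND_Y, JUMP_VELOCITY, True
for _ in range(5):
    y, v, jumping = update_player(y, v, jumping)
print(y < GROUND_Y)  # True: still airborne after 5 frames
```

In the real game, Pygame's event loop would call an update like this once per tick and redraw the player sprite at the new position.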

Corona Escape Game

The architecture below is fairly simple: Cosmos DB stores the data, and Application Insights gathers the user details (type of device, location, etc.). If you plan to expand the game, you could add other components to the architecture, such as Azure Functions.

The highest score is pushed to a text file and to Azure Cosmos DB so that it can be shared across users around the world. The related code resides in cosmos.py and looks as follows,

def getLeaderBoard(self):
        options = {}
        options['enableCrossPartitionQuery'] = False
        options['maxItemCount'] = 100
        query = {'query': 'SELECT * FROM server s'}

        results = self.client.QueryItems(self.container['_self'], query, options)

        for result in results:
            # display each leaderboard entry
            print(result['message'])

def pushData(self,username,highscore):
        data = self.client.CreateItem(self.container['_self'], {
            "username": str(username),
            "highscore": str(highscore),
            "message" : str(username) + " got " + str(highscore)
        })


Make sure to create a cosmosdb account with the SQL API and pass those credentials under config.

self.config = {
            'ENDPOINT': 'your endpoint',
            'PRIMARYKEY': 'your cosmosdb primary key',
            'DATABASE': 'your db',
            'CONTAINER': 'your container'
        }
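The snippets above omit the exact shape of the document that pushData sends to Cosmos DB. As a sketch, the field names below mirror the pushData snippet, but the helper function itself is hypothetical:

```python
def build_score_doc(username, highscore):
    """Build the score document pushed to Cosmos DB, mirroring pushData above."""
    return {
        "username": str(username),
        "highscore": str(highscore),
        "message": str(username) + " got " + str(highscore),
    }

doc = build_score_doc("sajeetharan", 420)
print(doc["message"])  # sajeetharan got 420
```

In the game, a dict like this is what gets passed to client.CreateItem so every player's high score ends up in the shared collection.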

How to run the Game:

  • Clone the repository from here
  • Make sure to install the dependencies using pip such as pygame
  • Run the game with the command python main.py

Hope this helps someone who wants to build games using Python and Cosmos DB. Play the game and add your comments below. Cheers!

How to build a Facial Recognition solution for fraud prevention using Azure AI and Cosmosdb

Biometric face recognition is the process and ability of a biometric machine to identify and recognize the face of an individual, either to grant access to a secured system or to find out the details of a person by matching the face with existing data in the machine’s system.

Facial recognition is full of potential and can easily be incorporated to increase the security measures of any device or object. Apart from all the excitement, this technology is still developing: the more faces that are fed into the algorithm, the more accurate it becomes. There is no need to be afraid of facial technology as long as it is used for ethical purposes and with safe practices.

Industries around the globe have already started to use face detection for several purposes. Continuing the series of my ideas that began with the architecture for the traffic problem, this post focuses on one solution on Azure, using services such as Cognitive Services, Blob Storage, Event Grid, Azure Functions and Cosmos DB to build an architecture that solves the above-mentioned use case.

Overall Architecture:

Components used:

  • Azure CosmosDB
  • Azure Functions
  • EventGrid
  • Cognitive Face API
  • Storage (Queue,Blob)


The above architecture is self-explanatory; it comprises two main flows:

  • Training Faces of Individuals
  • Identifying faces of Individuals

Every process in the above case is implemented as an Azure Function, with a separate function for each operation. The main operations, such as RegisterUser, TrainFace and TriggerTrain, are simple Azure Functions in the above diagram. Images are uploaded to Blob Storage using a SAS token, face detection is done using Cognitive Services, and references are stored in Cosmos DB. Event Grid is used to route events as they occur, for example, to train on an image whenever a user uploads one during registration and to tag the face of the individual in the database.
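The identification flow ultimately comes down to a decision: the Face API's Identify operation returns candidate persons with confidence scores, and your function has to decide whether any candidate is a confident enough match. A sketch of that decision, with an assumed threshold and hypothetical field names:

```python
# Illustrative only: the Face API Identify call returns candidate persons with
# confidence scores; a function like this decides whether to accept a match.
CONFIDENCE_THRESHOLD = 0.7  # assumed cut-off; tune for your fraud-prevention scenario

def pick_match(candidates, threshold=CONFIDENCE_THRESHOLD):
    """Return the best candidate's personId, or None if no confident match."""
    if not candidates:
        return None
    best = max(candidates, key=lambda c: c["confidence"])
    return best["personId"] if best["confidence"] >= threshold else None

candidates = [
    {"personId": "a1", "confidence": 0.55},
    {"personId": "b2", "confidence": 0.91},
]
print(pick_match(candidates))  # b2
```

In the architecture above, a helper like this would sit in the identify function, and the returned personId would be looked up against the references stored in Cosmos DB.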

I have used the above architecture in one case study and hope it will help someone out there who wants to build a similar solution. Cheers!

Deploy Angular9 App to Azure with Github Actions

It’s been almost a week since Angular 9 was released, which included several new features such as the new Ivy renderer, SSR, type checking, etc. In this post, I will go through how to build and deploy an Angular 9 application as a static website to Azure with GitHub Actions.


Pre-requisites:

  • Github Account
  • Azure Account
  • VSCode

Step 1: Install latest Angular CLI

In order to create the Angular application, you need to install the latest Angular CLI, which can be done with the following command,

 npm install -g @angular/cli@latest

Step 2: Create new Angular9 App

Run the following command to create the Angular9 app with the default template. Let’s name the application as ga-azure

ng new ga-azure

Step 3: Install the Hexa.run CLI

We will use the Hexa.run package by Wassim Chegham to deploy the application to Azure. Next, in your Angular project, install the Hexa.run CLI; here it is installed globally, and it also appears in the package.json of this project.

npm i -g @manekinekko/hexa

Step 4: Login to Azure account

The next step is to create the necessary resources to deploy the Angular application. We will deploy it as a static website on Azure Storage, which is the optimum option to host a single-page application (SPA) on Azure. Hosting a SPA in pure Storage is by far the cheapest and most efficient way of running it in Azure.

You can log in to your Azure account with the command,

npm run hexa:login

which will list the available subscriptions, from which you need to pick the one where you want to deploy the application.

Step 5: Initiate the Hexa Settings

The next step is to initiate the configuration needed for the deployment of the application. Run the Hexa CLI command as follows,

npm run hexa:init

which will ask for a few inputs such as the project name, storage account name and the destination folder. Eventually, you will see a new file generated, hexa.json, which will look like the below,

{
  "subscription": {
    "name": "Azure Demo"
  },
  "project": {
    "location": "westeurope",
    "name": "ga-azure"
  },
  "storage": {
    "location": "westeurope",
    "name": "gaazure10940"
  },
  "hosting": {
    "folder": "./gist/ga-azure"
  }
}

Now you have everything needed to deploy the application with GitHub Actions.

Step 6: Generate Service Principal

You need to use the service principal identity mechanism to authorize the deployment. To generate the service principal using Hexa, run the command below,

npm run hexa:ci

Hexa.run will automatically:

  1. create an Azure resource group (or let you choose an existing one)
  2. create the Azure storage account
  3. configure the storage account and make it static-website ready
  4. upload the Angular bundle
  5. print the URL generated by the Azure service

It will also generate the necessary credentials as JSON; make a note of them.

{
  appId: 'xx4362xx-aaxx-40xx-8bxx-xx6ea0c351xx',
  displayName: 'ga-azure',
  name: 'http://ga-azure',
  password: 'xxce72xx-1axx-44xx-81xx-35xxb15xxa1e',
  tenant: 'xxf988xx-86xx-41xx-91xx-2d7cd011dbxx'
}

Step 7 : Commit and push to the Github Repository

Once you are done with all the above steps, you can commit your changes and push them to the remote repository on GitHub.

git remote add origin https://github.com/sajeetharan/ga-azure.git
git add -A && git commit -m "First commit"
git push origin master

Step 8 : Create Github Actions Workflow

Now it’s time to create the Actions workflow. You can create a new workflow by navigating to Actions and clicking New Workflow. There are a few sample template workflows available; in this case we will use our own workflow, so click on “set up a workflow yourself”.

Setting up own workflow GitHub Actions

Immediately you will see a new workflow.yml file created, where you need to add the steps and actions needed to deploy the app. Here is what the workflow file looks like after adding all the steps.

name: Deploy to Azure with Hexa.run
on:
  push:
    branches:
    - master
    - release/*

jobs:
  build:
    runs-on: ubuntu-latest

    strategy:
      matrix:
        node-version: [12.x]

    steps:
    - uses: actions/checkout@v1
    - name: Use Node.js ${{ matrix.node-version }}
      uses: actions/setup-node@v1
      with:
        node-version: ${{ matrix.node-version }}
    - name: npm install
      run: |
        npm install
    - name: npm build, and deploy
      run: |
        npm run hexa:login
        npm run build -- --prod
        npm run hexa:deploy

As you can see, the steps are quite simple: install the dependencies, build, and deploy the Angular application using the hexa:deploy command.

You also need to configure, in the GitHub repository, the secrets that were generated in step 6. You can create a new secret by navigating to Settings and then Secrets. You need to define the secrets below, which are associated with the service principal.

Github Secrets

The rest of the workflow is easy to understand, as it covers the environment and the trigger (whenever someone pushes changes to master or a release branch, a build should run).

Step 9 : See Github Actions in Action

As soon as you save workflow.yml, a new build is triggered and the steps are executed, which you can watch in the Actions tab as follows,

Deploy using hexa:run

Once the application is deployed, you will be able to access it at the generated URL, which will look like https://gaazure10940.z6.web.core.windows.net/
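The generated URL follows the static-website endpoint pattern for Azure Storage accounts. A tiny sketch of that pattern; note the zone segment (z6 here) varies per storage account, so treat it as an assumption:

```python
def static_site_url(account, zone="z6"):
    """Primary static-website endpoint for a storage account.
    The zone segment differs per account; 'z6' matches the example above."""
    return f"https://{account}.{zone}.web.core.windows.net/"

print(static_site_url("gaazure10940"))  # https://gaazure10940.z6.web.core.windows.net/
```

In practice you would copy the exact URL that Hexa prints after deployment rather than construct it yourself.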

That’s all you need to do in order to deploy an Angular application to Azure. If you need to include end-to-end testing or other tasks, you can simply modify the workflow. GitHub Actions is definitely a future to believe in! Try this out and let me know if you have any queries! Cheers!

Automate Azure Resources with PowerApps

Microsoft’s Power Platform is a low-code platform which enables organisations to analyse data from multiple sources, act on it through applications and automate business processes. The Power Platform contains three main components (Power Apps, Power BI and Microsoft Flow) and integrates with two main ecosystems (Office 365 and Dynamics 365). In this blog we will look in detail at Power Apps, which helps to quickly build apps that connect to data and run on the web and mobile devices.

Overview :

We will see how to build a simple Power App to automate the scaling of resources in Azure in a simple yet powerful fashion, helping you save money in the cloud using multiple Microsoft tools. In this post, we will use Azure Automation runbooks to hold the Azure PowerShell scripts, Microsoft Flow as the orchestration tool and Microsoft Power Apps as the simple interface. If you have an Azure environment with a lot of resources, managing the scaling becomes a hassle if you do not have auto-scaling implemented. The following Power App will help you understand how easy it is to build the above scenario.

Prerequisites :

We will use the following Azure resources to showcase how scaling and automation can save a lot of cost.

  • AppService Plan
  • Azure SQL database
  • Virtual Machines

Step 1: Create and Update the Automation Account

First we need to create the Azure Automation account. Navigate to Create a Resource -> search for Automation Account in the search bar, and create a new resource as follows,

Once you have created the resource, you also need to install the modules required for this particular demo, as shown below.

Installing necessary Modules

Step 2: Create Azure PowerShell RunBooks

The next step is to create the runbooks by going to the Runbooks blade. For this tutorial let's create six runbooks, one for each resource and purpose. We need to create a PowerShell script for each of these types:

  • Scale Up SQL
  • Scale Up ASP
  • Start VM
  • Scale Down SQL
  • Scale Down ASP
  • Stop VM

Creating RunBook

The PowerShell scripts for each of these can be found in the GitHub repository. We need to create a runbook for each of the scripts.

All the runbooks above can be scheduled to run for a desired length of time, on particular days of the week and time frames, or continuously with no expiry. For an enterprise non-production environment, you would want it to scale down at the end of business hours and at the weekend.
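The scale-down-out-of-hours policy above boils down to a simple time check. The runbooks themselves are PowerShell, but the decision can be sketched in a few lines; the business hours below are assumed values, not from the post:

```python
from datetime import datetime

BUSINESS_START, BUSINESS_END = 8, 18  # assumed business hours, Monday-Friday

def should_scale_up(now: datetime) -> bool:
    """True during business hours on weekdays; runbooks like Scale Down SQL
    would be scheduled for the times when this is False."""
    return now.weekday() < 5 and BUSINESS_START <= now.hour < BUSINESS_END

print(should_scale_up(datetime(2020, 2, 24, 10, 0)))  # Monday 10:00 -> True
print(should_scale_up(datetime(2020, 2, 22, 10, 0)))  # Saturday -> False
```

In Azure Automation you would express the same policy as two schedules (one linked to the scale-up runbooks, one to the scale-down ones) rather than evaluating it in code.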

Step 3: Create and Link Microsoft Flow with Azure Automation Run books

Once the PowerShell scripts and runbooks above have been tested and published, we can move on to the next step and create the flow. Navigate to Flow and create a new app from the template.

New Flow from the template

Select the PowerApps Button template; the first step we need to add is the automation job. When you search for “automation” you will see the list of actions available. Select Create Job and pick the runbook you created above. If you want all the actions in one app, you can add them one by one; if not, you need to create a separate flow for each one.

Here, I have created one with the ScaleUpDB job, which will execute the scale-up command of the database.

Once you are done with all the steps, save the flow with a suitable name.

Step 4 : Create the Power App and link it to your Flow button

Once the PowerApps flow buttons are created, log in to Microsoft PowerApps with a work/school account. Navigate to Power Apps, which gives you a way to create a blank canvas for mobile or tablet. You can then begin to customise the PowerApp with text labels, colour and buttons as below.

PowerApps Name

In this case we will have buttons to increase/decrease the count of instances of the SQL database; my app looked like the below, with a few labels and buttons.


Next, link the flow to the button of the Power App by navigating to Action -> Power Automate.

Linking Button Action

Once both scale up/scale down actions are linked, save the app and publish it.

Step 5 : Verify Automation

In order to verify that things are working correctly, click on scale up and scale down a few times, then navigate to the Azure portal and open the Automation account we created.

Navigate to the overview tab to see the requests for each job made via the power app as below.

Jobs executed.

In order to look at the history navigate to the Jobs blade

Jobs Execution

Further, you can build a customized app for different environments with different screens using Power Apps. With the help of Azure Alerts, whenever you get an alert regarding heavy usage of resources or spikes, with a single click of a button you will be able to scale the resources up and down as you need.

Improvements and things to consider:

This is just a starting point to explore more on this functionality, but there are improvements you could add to make this really useful.

(i) Sometimes the Azure Automation action fails to start the runbook. When you implement this, the flow needs to handle that condition.

(ii) Sometimes a runbook action will be successful, but the runbook execution itself errored. Consider using try/catch blocks in the PowerShell and outputting the final result as a JSON object that can be parsed and further handled by the flow.

(iii) You should update the code to use the Az modules rather than AzureRM.
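The structured-result idea in (ii) is worth making concrete. The runbooks themselves are PowerShell, but the shape of a parse-friendly result, and what the flow would check before alerting, looks the same in any language. A sketch with hypothetical names:

```python
import json

def runbook_result(succeeded, detail):
    """Wrap a runbook outcome so the flow can branch on a single 'status' field.
    (Illustrative; in the real runbook this would be emitted from PowerShell.)"""
    return json.dumps({
        "status": "Succeeded" if succeeded else "Failed",
        "detail": detail,
    })

def flow_should_alert(result_json):
    """The check the flow's parse step would apply to decide whether to alert."""
    return json.loads(result_json)["status"] != "Succeeded"

print(flow_should_alert(runbook_result(False, "SQL scale-up timed out")))  # True
```

In Microsoft Flow this corresponds to a Parse JSON action on the runbook output followed by a condition on the status field.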

Note : The user who executes the PowerApp also needs permission to execute runbooks in the Automation Account.

With this app, it becomes handy for the operations team to manage scaling without logging in to the portal. Hope it helps someone out there! Cheers.

Create Github Issue Reporter with Azure Function and CosmosDB

Many times you may have wanted a single view/dashboard of all the GitHub issues created across your open-source repositories. I have almost 150 repositories, and it becomes really hard to find which issues are the priority ones to fix. In this post we will see how you can create a single dashboard/report page to view all your GitHub issues using an Azure Function (3.x with TypeScript) and Azure Cosmos DB.


You will need an Azure subscription and a GitHub account. If you do not have an Azure subscription you can simply create one with the free trial, which provides you with 12 months of free services. We will use an Azure Function and Cosmos DB to build this solution.

Step 1 : Create Resource Group

In order to manage and deploy the function app and Cosmos DB, we first need to create a resource group. You can create one named “gh-issue-report”.

Step 2: Create the Azure Cosmosdb Account

To store the related data of the GitHub issues, we need to create a Cosmos DB account. To do so, navigate to the Azure portal and click Create a Resource. Search for Azure Cosmos DB on the marketplace and create the account as follows.

CosmosDB Creation

Step 3:  Create the Function app

In my previous blog, I mentioned how to create an Azure Function. Here is an image of the function app I created.

Creating Function App

Create Typescript Function:

As you can see, I have selected the runtime stack as Node.js, which will be used to run the function written in TypeScript. Open Visual Studio Code (make sure you have already installed VSCode with the Functions core tools and extension). Press Ctrl + Shift + P to create a new function project and select the language as TypeScript.

Create Typescript Function

Select the template as Timer trigger, as we need to run the function every 5 minutes, and configure the cron expression (0 */5 * * * *) as well. (You can use a custom time.)

Give the function name as gitIssueReport, and you will see the function getting created with the necessary files.
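The NCRONTAB expression used by the timer trigger has six fields (second, minute, hour, day, month, day-of-week), so `0 */5 * * * *` fires at second 0 of every fifth minute. A quick sketch of just the two fields that matter here (illustrative, not a full NCRONTAB parser):

```python
def fires(second, minute, expr="0 */5 * * * *"):
    """Check the second and minute fields of '0 */5 * * * *' against a time.
    Only handles the 'N' and '*/N' forms used in this expression."""
    sec_field, min_field = expr.split()[:2]
    step = int(min_field.split("/")[1])
    return second == int(sec_field) and minute % step == 0

print(fires(0, 10))  # True  (fires at xx:10:00)
print(fires(0, 12))  # False (12 is not a multiple of 5)
```

So the function runs at :00, :05, :10, and so on, regardless of when it was deployed.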

Step 4 : Add Dependencies to the Function App

Let’s add the necessary dependencies to the project. We will use bluebird as a dependency to handle the requests, and the gh-issues-api library to interact with GitHub and get the necessary issues. You need to add these in the package.json file under dependencies.

 "dependencies": {
    "@types/node": "^13.7.0",
    "bluebird": "^3.4.7",
    "gh-issues-api": "0.0.2"
  }

You can view the whole package.json here.

Step 5: Set Output Binding

Let’s set the output binding to Cosmos DB to write the issues to the collection. You can set it by modifying function.json as

    {
      "type": "cosmosDB",
      "name": "issueReport",
      "databaseName": "gh-issues",
      "collectionName": "open-issues",
      "createIfNotExists": true,
      "connectionStringSetting": "gh-issue_DOCUMENTDB",
      "direction": "out"
    }

The type cosmosDB denotes the database output binding, and you can see the database name and collection configured.

Step 6 : Code to Retrieve the Github Repository Issues

The actual logic of the function is as follows,

import Promise = require('bluebird');

import {
  GHRepository,
  IssueType,
  IssueState,
  IssueActivity,
  IssueActivityFilter,
  IssueLabelFilter,
  FilterCollection
} from 'gh-issues-api';

export function index(context: any, myTimer: any) {
  var timeStamp = new Date().toISOString();

  if (myTimer.isPastDue) {
    context.log('Function trigger timer is past due!');
  }

  const repoName = process.env['repositoryName'];
  const repoOwner = process.env['repositoryOwner'];
  const labels = [
    'build issue',
    'investigation required',
    'help wanted'
  ];

  const repo = new GHRepository(repoOwner, repoName);
  var report = {
    name: repoName,
    at: new Date().toISOString()
  };

  context.log('Issues for ' + repoOwner + '/' + repoName, timeStamp);
  repo.loadAllIssues().then(() => {
    // Count open issues per label of interest.
    var promises = labels.map(label => {
      var filterCollection = new FilterCollection();
      filterCollection.label = new IssueLabelFilter(label);
      return repo.list(IssueType.All, IssueState.Open, filterCollection)
        .then(issues => report[label] = issues.length);
    });
    // Issues with no activity in the last 7 days count as stale.
    var last7days = new Date(Date.now() - 604800000);
    var staleIssuesFilter = new IssueActivityFilter(IssueActivity.Updated, last7days);
    staleIssuesFilter.negated = true;
    var staleFilters = new FilterCollection();
    staleFilters.activity = staleIssuesFilter;
    promises.push(
      repo.list(IssueType.Issue, IssueState.Open).then(issues => report['total'] = issues.length),
      repo.list(IssueType.PullRequest, IssueState.Open).then(issues => report['pull_request'] = issues.length),
      repo.list(IssueType.All, IssueState.Open, staleFilters).then(issues => report['stale_7days'] = issues.length)
    );

    return Promise.all(promises);
  }).then(() => {
    var reportAsString = JSON.stringify(report);
    context.bindings.issueReport = reportAsString;
    context.done();
  });
}

You can see that the report document is written to Cosmos DB through the output binding named issueReport.
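One detail in the code worth unpacking is the magic number 604800000: it is exactly seven days in milliseconds, which defines the "stale" window. A quick Python sketch of the same check (the helper is illustrative, not from the function):

```python
from datetime import datetime, timedelta

# 604800000 ms in the code above is exactly a 7-day window.
SEVEN_DAYS_MS = 7 * 24 * 60 * 60 * 1000
print(SEVEN_DAYS_MS)  # 604800000

def is_stale(last_updated: datetime, now: datetime) -> bool:
    """Mirror of the negated activity filter: no update for more than 7 days."""
    return now - last_updated > timedelta(days=7)

print(is_stale(datetime(2020, 2, 1), datetime(2020, 2, 20)))  # True
```

The negated IssueActivityFilter expresses the same idea declaratively: keep only issues whose last update falls outside the window.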

Step 7: Deploy the Function

Deploy the function app to Azure with the keys Ctrl+Shift+P and select Deploy to Function App.

Deploy Function App

Step 8 : Verify/Install the Dependencies

Once the deployment is successful, navigate to the Azure portal and open the function app to make sure everything looks good. If you don’t see the dependencies, install them manually by navigating to the Kudu console of the function app.

Note : Make sure to stop the Function app before you head over to Kudu.

Click on the Platform features tab. Under Development Tools, click Advanced tools (Kudu). Kudu will open on its own in a new window.

Navigate to KUDU console

In the top menu of the Kudu console, click Debug Console and select CMD.

In the command prompt, navigate to D:\home\site\wwwroot by using the command cd site\wwwroot and pressing Enter. Once you’re in wwwroot, run the command npm i bluebird to install the package, and do the same for gh-issues-api.

Step 9: Set Environment Variables (Repository)

As you can see in the above code, we read two environment variables, the repository name and the repository owner, which are needed to fetch the issue information. You can set those variables on the Azure portal as follows.

Navigate to the Overview tab for your function and click Configuration. As you can see below I’ve configured those values.

Function App Settings

Step 10: Verify the Output Binding

Just to make sure the settings in function.json have been reflected, navigate to Functions, select the function, and check that all the binding values are correct. If not, create a new Cosmos DB output binding to the account you created in Step 2, as described in Step 5.

Step 11 : Run and Test the Function

Now it’s time to see the function app running and issues being reported. Navigate to your function app and click Run. You can see the function running as shown below.

Run Function App

Step 12: Check Live App Metrics

If you see any errors, you can always navigate to the Monitor section of the function app and select Live app metrics.

Live metrics of the function app

Step 13: Verify the data in Cosmos DB

If everything goes well, you can navigate to the Cosmos DB account and open the collection with Data Explorer.

Data Explorer Cosmosdb

You will see that there are many documents inserted in the collection.

Cosmosdb collection with Github repository Issues

Now you can modify this function to retrieve the issues from all of your repositories and use the data stored in the Cosmos DB collection to build a dashboard that shows the issues by priority. You can also make use of this post to send a notification to someone about an issue.

Hope this simple function helps someone build a dashboard out of the data collected and become more productive. Cheers!

Azure Tip : Enable Debug Mode on Azure Portal

I was checking out a cool feature on the Azure portal today. I usually spend two hours per day evaluating new features or building something new on Azure. For a lot of developers, the Azure portal is the go-to place to manage all their Azure resources and services. What I most often hear from developers is that the portal takes a long time to load when you log in, and sometimes it feels slow. Today I figured out a way to debug the portal loading time. If you are a developer who considers the Azure portal your website and wants to know the duration for each view of the page, press the keyboard shortcut Ctrl + Alt + D while logged in to the Azure portal, and you will see the load time and other useful information for every tile.

Azure Portal Load Time

You can enable or disable certain features on the portal by toggling the Experiments. You can also enable the Debug Hub to see if there are any exceptions or issues while loading portal elements.

Enable/Disable certain features

One other tip I would like to highlight here is the keyboard shortcuts that you can use specifically within the Azure portal.

To see them all, you can open the Keyboard shortcut help item in the Help Menu on the top-right of the Portal.

Shortcut keys

Hope this helps you figure out whether there is slowness while loading the Azure portal. If you are new to Azure and want to get started, explore my tool Azure360. Share this tip with your colleagues.

Investigating Azure Data Lake Storage and its Multi-protocol Access

We have a wide variety of options to store data in Microsoft Azure. Nevertheless, every storage option has a unique purpose for its existence. In this blog, we will discuss ADLS (Azure Data Lake Storage) and the multi-protocol access that Microsoft introduced in 2019.

Introduction to ADLS (Azure Data Lake Storage)

According to the Microsoft definition, it is an enterprise-wide hyper-scale repository for big data analytics workloads that enables you to capture data of any size and ingestion speed in one single place for operational and exploratory analytics.

The main purpose of its existence is to enable analytics on the stored data (it may be of any type structured, semi-structured and unstructured data) and provide enterprise-grade capabilities like scalability, manageability, reliability, etc.

What is it built on?

ADLS is built on top of Azure Blob Storage. Blob Storage is one of the storage services under the suite of storage accounts. Blob Storage lets you store any type of data; it does not need to be of a specific data type.

Does the functionality of ADLS sound like the Blob storage?

From the above paragraphs, it looks like ADLS and Blob storage have the same functionality, because both services can be used to store any type of data. But, as I said before, every service has a purpose for its existence. Let us explore the differences between ADLS and Blob storage below.

Difference between ADLS and Blob storage


Purpose

ADLS is optimized for analytics on the data stored in it, whereas Blob storage is the usual way of storing file-based information in Azure for data which will not be accessed very often, also called cold storage.


Cost

In both storage options, we pay for the data stored and for I/O operations. In the case of ADLS, the cost is slightly higher than Blob.

Support for Web HDFS interface

ADLS supports a standard WebHDFS interface and can access files and directories in Hadoop. Blob storage does not support this feature.

I/O performance

ADLS is built for running large-scale systems that require massive read throughput when queried at any pace. Blob storage is used to store data that will be accessed infrequently.

Encryption at rest

Since ADLS went GA, it has supported encryption at rest. It encrypts data flowing in public networks and at rest. Blob Storage does not support encryption at rest. See more details on the comparison here.

Now, without any further delay let us dig on the Multi-protocol access for ADLS.

Multi-protocol access for ADLS

This is one of the significant announcements that Microsoft made in 2019 as far as ADLS is concerned. Multi-protocol access to the same data allows you to leverage existing object storage capabilities on Data Lake Storage accounts, which are hierarchical namespace-enabled storage accounts built on top of Blob storage. This allows you to put all your different types of data in the data lake so that users can make the best use of your data as the use case evolves.

Multi-protocol access is achieved via the Azure Blob storage API and the Azure Data Lake Storage API. The convergence of the two existing services, ADLS Gen1 and Blob storage, paved the path to a new term: Azure Data Lake Storage Gen2.

Expanded feature set

With the announcement of multi-protocol access, existing Blob features such as access tiers and lifecycle management policies are now unlocked for ADLS. Furthermore, much of the feature set and ecosystem support of Blob storage is now available for your data lake storage.

This could be a great shift, because your blob data can now be used for analytics. The best thing is that you don’t need to update your existing applications to get access to the data stored in Data Lake Storage. Moreover, you can leverage the power of both your analytics and object storage applications to use your data most effectively.
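As a sketch of one of those newly unlocked Blob features, an access tier can now be applied to an object in a Gen2 account through the Blob API. The account, container, and file names below are hypothetical:

```shell
# Move a rarely-read file in an ADLS Gen2 account to the Cool
# access tier via the Blob API (names are hypothetical).
az storage blob set-tier \
    --account-name mydatalake \
    --container-name raw \
    --name logs/2019/app.log \
    --tier Cool
```

The same idea extends to lifecycle management policies, which can age data down to cooler tiers automatically.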

While exploring the expanded feature set, one of the best things I found is that ADLS can now be integrated with Azure Event Grid.

Yes, we have one more publisher on the list for Azure Event Grid. Event Grid can now consume events generated from Azure Data Lake Storage Gen2 and route them to its subscribers, with webhooks, Azure Event Hubs, Azure Functions, and Logic Apps as endpoints.

Modern Data Warehouse scenario

The above image depicts a use case scenario of ADLS integration with Event Grid. First off, data arrives from different sources such as logs, media, files, and business apps. That data lands in ADLS via Azure Data Factory, and the Event Grid subscription listening to ADLS is triggered once the data arrives. The event is then routed via Event Grid and Azure Functions to Azure Databricks. A Databricks job processes the file and writes the output back to Azure Data Lake Storage Gen2. Meanwhile, Azure Data Lake Storage Gen2 pushes a notification to Event Grid, which triggers an Azure Function to copy the data to Azure SQL Data Warehouse. Finally, the data is served via Azure Analysis Services and Power BI.


In this blog, we have seen an introduction to Azure Data Lake Storage and the difference between ADLS and Blob storage. We then investigated multi-protocol access, one of the new entrants in ADLS. Finally, we looked into one of the expanded feature sets: the integration of ADLS with Azure Event Grid, and its use case scenario.

I hope you enjoyed reading this article. Happy Learning!

Image Credits: Microsoft

This article was contributed to my site by Nadeem Ahamed and you can read more of his articles from here.

Azure Tip : Start/Stop all VMs in a Resource Group with PowerShell/Bash

There are cases where you want to start or stop all VMs in a particular resource group in parallel within Azure. You can set this up with Azure Automation using scheduled actions. Another way is to use PowerShell or the Azure CLI.

If you are using PowerShell, you can simply run

Get-AzVM -ResourceGroupName 'MyResourceGroup' | Start-AzVM

If you are using Azure CLI/Bash

az vm start --ids $(az vm list -g MyResourceGroup --query "[].id" -o tsv)

Where MyResourceGroup is the name of your ResourceGroup. Happy Azurifying!
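The stop direction works the same way. A quick sketch with the Azure CLI, with one caveat: `az vm stop` only powers the guest OS off and you keep paying for the compute, whereas `az vm deallocate` releases the compute so billing for it stops:

```shell
# Deallocate every VM in the resource group so compute charges stop
# (use "az vm stop" instead if you only want to power them off).
az vm deallocate --ids $(az vm list -g MyResourceGroup --query "[].id" -o tsv)
```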

Deploy Web App to a Sub-folder on Azure

I have come across the question “How to deploy a web app within a sub-folder on Azure” on Stack Overflow many times. Even though there is official documentation, the question has not been addressed in general. With virtual directories, you can keep your web sites in separate folders and use the ‘virtual directories and applications’ settings in Azure to publish two different projects under the same site.

However, say you have an ASP.NET Core/Angular app that you want to deploy to a sub-folder inside an Azure Web App (App Service). When you publish it, you may get an error as follows,

System.TypeLoadException: Method ‘get_Settings’ in type ‘Microsoft.Web.LibraryManager.Build.HostInteraction’ from assembly ‘Microsoft.Web.LibraryManager.Build

You can resolve the above issue by updating the NuGet package Microsoft.Web.LibraryManager.Build in your project.
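A quick sketch of that fix from the command line, run inside the project directory (this pulls the latest stable version of the package):

```shell
# Update the project's reference to Microsoft.Web.LibraryManager.Build
dotnet add package Microsoft.Web.LibraryManager.Build
```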

One other thing you should be aware of: go to portal > demo-site App Service > Configuration > Path mappings > Virtual applications and directories, and add the following,

Virtual Path         Physical Path                    Type
/folder              site\wwwroot\folder              Folder
/folder/sub-folder   site\wwwroot\folder\sub-folder   Application

Now publish from Visual Studio. If you only need to publish to a first-level folder, i.e. to your-site\folder, then all you have to do is change the Type to Application in the path mappings for /folder and skip the sub-folder entry, since you don’t need it. Also correct the Site Name and Destination URL in the publish profile accordingly.

Hope there will be no more questions on the same. Happy Coding!
