Create & Test Azure Bot Service (Bot Service – Part 1)

You can use Azure Bot Service to create bots. It provides the core components, including the Bot Builder SDK for developing bots and the Bot Framework for connecting them to channels. In this topic, you will learn how to use Bot Service to create a new bot that uses the Bot Builder SDK.

Create Bot Service

  • Create a new Resource Group (optional).
  • Search for and add a Web App Bot.

  • In the Bot Service window, provide the requested information about your bot.

  • Click the Web App Bot you just created.
  • Click Test in Web Chat to test the service. Enter a message and your bot should respond.

This is how you can create a basic Web App Bot or Functions Bot by using Bot Service and verify the bot’s functionality by using the built-in Web Chat control within Azure.

Next, you will learn how to manage your bot and start working with its source code. Stay tuned!

Useful Resources

Azure Bot Service Pricing

Bot Builder Samples

Azure Bot Service Documentation

Bot Framework Additional Resources

Credits to original post.

Start & Stop Azure VM Automatically

For all the benefits of using virtual machines on Azure, one challenge is managing the cost of running them. This is especially important for developers and consultants, who may not need their virtual machines to run all day.

Using Azure Automation, you can easily schedule the start and stop timings for your virtual machines. This article describes the steps to follow to achieve this.

Add Automation Service

  • Create a Resource Group (optional).
  • In the resource group, click Add to add a new service.
  • Search for Automation.

  • Create an Automation Account. Specify the settings and click Create to add the Automation account.

NOTE:
Select Yes for the “Create Azure Run As Account” option if this is the first Automation account you are creating.

Configure Runbooks

  • Open the Automation account you just created.
  • Click Runbooks under Process Automation.

NOTE: If there are default published runbooks, you can optionally delete them.

  • Click Browse Gallery.

  • Search and Import the following Graphical Runbooks:
    • Start Azure V2 VMs
    • Stop Azure V2 VMs

  • Specify a Name for each runbook.
  • Edit the runbooks and Publish them.

Schedule the Runbook

  • Edit the Runbook.
  • Click Schedule.

  • Create a new schedule and specify the settings. The image below shows the settings for Starting the VM.

NOTE: Similarly, you can also provide scheduling settings for stopping the VM.

  • Click Start to start the runbook. Specify the following parameters and click OK:
    • RESOURCEGROUPNAME: Specify the name of the resource group where the VM resides.
    • VMNAME: Specify the name of the VM that you would like to start or stop.
    • AZURECONNECTIONASSETNAME: (optional)

The runbooks are ready and will start and stop the VMs at the scheduled times as per your settings.
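The scheduling logic the two runbooks implement can be sketched in a few lines of Python. Note that the 07:00 start and 19:00 stop times and the weekdays-only rule below are illustrative assumptions, not values taken from the article:

```python
from datetime import datetime, time

# Illustrative schedule (assumed values): start VMs at 07:00, stop at 19:00, weekdays only.
START_AT = time(7, 0)
STOP_AT = time(19, 0)

def desired_vm_state(now: datetime) -> str:
    """Return 'running' if the VM should be up at `now`, else 'stopped'."""
    if now.weekday() >= 5:           # Saturday (5) and Sunday (6): keep the VM off
        return "stopped"
    if START_AT <= now.time() < STOP_AT:
        return "running"
    return "stopped"

print(desired_vm_state(datetime(2018, 3, 5, 9, 30)))   # a Monday morning -> running
print(desired_vm_state(datetime(2018, 3, 10, 9, 30)))  # a Saturday -> stopped
```

In Azure Automation itself this decision is made by the schedules you attached above, so the runbooks only need to perform the start or stop action.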

NOTE: If you are using a dynamic public IP, it will change every time the VM starts. It’s better to set up a DNS name and use that in your RDP connection to the VM.

That’s it, enjoy!

Azure Functions – Resize Images Uploaded to Blob Storage

Scenario

I have been capturing images with my PowerApp, which are uploaded to Blob Storage. My PowerApp uses the built-in phone camera to capture images, so each image is more than 2 MB.

I use these images in one of my reports, generated periodically via a Logic App (and also converted to PDF). Sometimes there can be more than 40 images in my report and, as you can imagine, the system takes a long time just to download the images because of their size, and my report generation fails.

Now, to avoid this situation, I can do a couple of things:

  • Set my camera to use the lowest image-size settings. However, with today’s devices even the lowest settings generate images in MBs. Also, it is not always possible. Try setting that as a best practice for your customers!
  • I can use the Camera control (provided by PowerApps), which usually captures images in KBs. However, there are various limitations: I can’t use (at least for now) flash, zoom in, zoom out, and many other features that my device provides.

Solution

So the most appropriate solution is to create a function that monitors the Blob Storage where my images are uploaded, and whenever an image exceeds a certain size, resizes it and stores it in the same place.
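The decision the function will apply to each uploaded blob can be sketched as follows. The 500 KB threshold and the 400 px maximum width mirror the values used later in the run.csx sample:

```python
# A sketch of the per-blob decision the function applies.
SIZE_THRESHOLD_BYTES = 500_000  # resize anything larger than ~500 KB
MAX_WIDTH_PX = 400              # target maximum width after resizing

def plan_for_blob(size_bytes: int) -> str:
    """Decide what to do with an uploaded image of the given size."""
    if size_bytes > SIZE_THRESHOLD_BYTES:
        return f"resize to max width {MAX_WIDTH_PX}px and overwrite"
    return "leave as-is"

print(plan_for_blob(3_710_000))  # a ~3.71 MB upload gets resized
print(plan_for_blob(120_000))    # a small image is left alone
```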

Credits to original post.

Prerequisite

To replicate this scenario, we will create a Blob Storage to store our images and use Microsoft Azure Storage Explorer.

Blob Storage

Create a Blob Storage:

  • Specify a unique Name
  • Select Account Kind
  • Select Location
  • Select Replication
  • Select Subscription
  • Create a new Resource Group or use existing
  • Click Create.

Within the storage, use the Blob Service and create a Container where the images will be stored.

Microsoft Azure Storage Explorer

Download and install Microsoft Azure Storage Explorer and configure it to access the Blob Storage. We will use it to simulate the upload of a sample image.

Create a Function App Service

  • Use Function App to create a new function:
    • Specify unique App Name
    • Select Subscription
    • Create a new Resource Group or use existing
    • Select Hosting Plan
    • Select Location
    • Create a new Storage or use existing
    • Click Create

Create a Function Trigger

  • Create a new function trigger based on Blob Trigger template using C#
  • Specify Name of the function trigger
  • Specify the Path; this is the path to the source of the images
  • Select the Storage Account Connection
  • Click Create

NOTE: If you do not see your storage account connection, click New.

The portal will create a binding for your script that will allow you to process files created at the path specified.

The connection string for your storage account will automatically be created and added to your Function App. To see how this is connected, inspect the function.json file in your function through the View Files tool pane.
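For reference, the binding stored in function.json typically looks like the sketch below. The container name images and the AzureWebJobsStorage connection setting name are illustrative assumptions; yours will reflect the Path and Storage Account Connection you chose:

```json
{
  "bindings": [
    {
      "name": "inputImage",
      "type": "blobTrigger",
      "direction": "in",
      "path": "images/{imageName}",
      "connection": "AzureWebJobsStorage"
    }
  ],
  "disabled": false
}
```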

You can also click on the Integrate menu and check the settings:

Add New Output Binding

We need to add another parameter binding, this time for output, so that we can save out the resized image back to Azure Blob Storage (same location in this example).

  • Click on the Integrate menu
  • Click New Output to create output binding.

  • Select Azure Blob Storage

  • Specify Blob Parameter Name
  • Specify Path (in this example it is the same path where we need to save the output)
  • Select Storage Account Connection

Note: In the path, we are reusing the {imageName} parameter. This was the name of the file coming into our function, and we will save the output with the same filename in the same location, essentially replacing the original file. If you want, you can specify a different path to store the resized file.

Add Files to Make the Function Work

Project.json

  • Under View Files, add a new file called project.json with the following code. This is how Function Apps restore packages from NuGet.
{
  "frameworks": {
    "net46": {
      "dependencies": {
        "ImageResizer": "4.0.4"
      }
    }
  }
}

  • Save the file.

Run.csx

  • Navigate to run.csx, the script file for the endpoint.
  • Use the following code. We are using the ImageResizer library to execute a resize operation with one stream as the source and the other as the output.
#r "System.Drawing"

using ImageResizer;
using System.Drawing;
using System.Drawing.Imaging;

public static void Run(Stream inputImage, string imageName, Stream resizedImage, TraceWriter log)
{
    log.Info($"C# Blob trigger function processed blob\n Name: {imageName}\n Size: {inputImage.Length} bytes");

    var settings = new ImageResizer.ResizeSettings
    {
        MaxWidth = 400
    };

    if (inputImage.Length > 500000)
    {
        ImageResizer.ImageBuilder.Current.Build(inputImage, resizedImage, settings);
    }
    else
    {
        log.Info($"Name: {imageName}\n Size: {inputImage.Length} bytes is too small; ignoring file");
    }
}

  • Save the file.

Check the log to see if the compilation was a success.

Test the Solution

  • In this example we are uploading an image file of size around 3.71 MB.

  • Using Azure Storage Explorer, upload the sample image via the Upload button. Note the size.

  • The function sees that an image of larger size has been uploaded and runs the logic to resize the image.
  • Refresh Azure Storage Explorer after a few seconds and check the size of the image; it will have been reduced as per the logic.

Using Azure Functions to Call Dynamics 365

Credits: original post by Nishant Rana (big thanks!).

Here is an example of a simple Azure Function that references the CRM assemblies and creates a contact record in CRM.

Log in to Azure Portal, and perform the following actions:

  • Optionally, create a Resource Group so that it is easy to maintain the resources.
  • Search for “Function App” and create one. NOTE: The App Name of the Function App should be a globally unique name.

  • Add (+) a new function (GenericWebHook-Csharp template)

  • This creates a new function (“CallCRMApps” in this example).
  • Select the function, and under View Files (right end of your screen), add a new file and give it a name, for example project.json. NOTE: It is within this file that we will reference the NuGet packages we need in our function.

  • Click on project.json and write the following code:

NOTE: At the time of writing this article, the latest version of the Microsoft.CrmSdk.CoreAssemblies NuGet package was 9.0.0.7.

PROJECT.JSON

{
  "frameworks": {
    "net46": {
      "dependencies": {
        "Microsoft.CrmSdk.CoreAssemblies": "9.0.0.7"
      }
    }
  }
}

  • Click Save and Run and notice the Log (bottom of the screen).

  • Now click the run.csx file and write the following sample code for the Azure Function. Replace the placeholder organization service with your D365 Organization Service endpoint address, as well as the username and password for your D365 environment.

RUN.CSX

using System.Net;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Client;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("C# HTTP trigger function processed a request.");

    // Parse query parameters (case-insensitive)
    string firstname = req.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Compare(q.Key, "firstname", true) == 0)
        .Value;

    string lastname = req.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Compare(q.Key, "lastname", true) == 0)
        .Value;

    if (firstname == null || lastname == null)
    {
        return req.CreateResponse(HttpStatusCode.BadRequest,
            "Please pass firstname and lastname on the query string");
    }

    // Authenticate against the D365 organization service
    IServiceManagement<IOrganizationService> orgServiceManagement =
        ServiceConfigurationFactory.CreateManagement<IOrganizationService>(
            new Uri("<your organization service>"));

    AuthenticationCredentials authCredentials = new AuthenticationCredentials();
    authCredentials.ClientCredentials.UserName.UserName = "<username>";
    authCredentials.ClientCredentials.UserName.Password = "<password>";

    AuthenticationCredentials tokenCredentials = orgServiceManagement.Authenticate(authCredentials);
    OrganizationServiceProxy organizationProxy =
        new OrganizationServiceProxy(orgServiceManagement, tokenCredentials.SecurityTokenResponse);

    // Create the contact record
    Entity contact = new Entity("contact");
    contact.Attributes["firstname"] = firstname;
    contact.Attributes["lastname"] = lastname;
    var contactId = organizationProxy.Create(contact);

    return req.CreateResponse(HttpStatusCode.OK, "Contact created in CRM " + contactId.ToString());
}

NOTE: To get your organization service URL, in your D365 environment click Settings > Customizations > Developer Resources and copy the endpoint address under Organization Service.
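The query-parameter lookup in the C# code above is case-insensitive (string.Compare with ignoreCase: true). The same lookup can be sketched in Python; the URL below is a made-up example, not a real endpoint:

```python
from urllib.parse import urlsplit, parse_qsl

def get_query_param(url: str, key: str):
    """Case-insensitive lookup of a query-string parameter, mirroring the
    C# GetQueryNameValuePairs + string.Compare(..., true) pattern."""
    for k, v in parse_qsl(urlsplit(url).query):
        if k.lower() == key.lower():
            return v
    return None

url = "https://example.azurewebsites.net/api/CallCRMApps?FirstName=John&lastname=Doe"
print(get_query_param(url, "firstname"))  # John (matches FirstName)
print(get_query_param(url, "lastname"))   # Doe
```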

Test Function

Under the Test tab, add query parameters for the First Name and Last Name (click + Add parameter) and click Run.

If your query is OK, you will get a Status: 200 OK and the output stating that a new Contact is created in D365.

In D365, search for the newly created contact.

Check Traffic with a Scheduler-Based Logic App

The following article takes you through the steps to build a logic app with a scheduler trigger, which runs every weekday and checks the travel time, including traffic, between two places. If the time exceeds a specific limit, the logic app sends an email with the travel time and the extra time necessary to reach your destination. (Original post by Esther Fan. Big thanks!)

Prerequisite

So what are the steps involved?

  • Create a blank logic app.
  • Add a trigger that works as a scheduler for your logic app.
  • Add an action that gets the travel time for a route.
  • Add an action that creates a variable, converts the travel time from seconds to minutes, and saves that result in the variable.
  • Add a condition that compares the travel time against a specified limit.
  • Add an action that sends email if the travel time exceeds the limit.

Create Resource Group (optional)

  • Create a Resource Group, so that you can manage your resources easily (optional).

Create Logic App

  • In your resource group, click Add.
  • Search for Logic Apps, select the relevant option and click Create.

Add Scheduler Trigger

Every logic app must start with a trigger, which fires when a specific event happens or when new data meets a specific condition.

  • On the Logic App Designer, search and add Schedule – Recurrence.
  • Optionally, click the ellipses (…) button and select Rename. Rename the trigger with this description: “Check travel time every weekday morning”.
  • Inside the trigger, choose Show advanced options.
  • Provide the schedule and recurrence details for your trigger as shown:

This trigger fires every weekday, every 1 minute, starting at 7:00 AM and ending at 5:00 PM. You may want to choose a longer interval, or your mailbox will be flooded. The Preview box shows the recurrence schedule.
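Behind the designer, the Recurrence trigger is stored in the workflow definition roughly as in the sketch below. This version is simplified to fire at the top of each hour rather than every minute (a per-minute schedule would list all 60 values under minutes), and the trigger name reflects the rename suggested above:

```json
{
  "triggers": {
    "Check_travel_time_every_weekday_morning": {
      "type": "Recurrence",
      "recurrence": {
        "frequency": "Week",
        "interval": 1,
        "schedule": {
          "weekDays": ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday"],
          "hours": [7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17],
          "minutes": [0]
        }
      }
    }
  }
}
```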

For more information, see Schedule tasks and workflows and Workflow actions and triggers.

NOTE: To hide the trigger’s details for now, click inside the shape’s title bar. Also, save your logic app at regular intervals.

Get the Travel Time for a Route

Add an action that gets the travel time between two places. Logic Apps provides a connector for the Bing Maps API so that you can easily get this information. Before you start this task, make sure that you have a Bing Maps API key as described in prerequisites.

  • In the Logic App Designer, under your trigger, choose + New step > Add an action.
  • Search for “maps”, and select this action: Bing Maps – Get route.
  • If you don’t have a Bing Maps connection, you’re asked to create a connection. Provide these connection details and choose Create. Provide a Connection Name and API Key.
  • Optionally rename the action with this description: “Get route and travel time with traffic”.

For more information about these parameters, see Calculate a route.

Create Variable to Store Travel Time

By default, the previous Get route action returns the current travel time with traffic in seconds through the Travel Duration Traffic field. By converting and storing this value as minutes instead, you make the value easier to reuse later without converting again.

  • Under the Get route action, choose + New step > Add an action.
  • Search for “variables” and select this action: Variables – Initialize variable.
  • Optionally rename this action with this description: “Create variable to store travel time”.
  • To create the expression for the Value field, click inside the field so that the dynamic content list appears. If necessary, widen your browser until the list appears. In the dynamic content list, choose Expression.
    • In the expression editor, enter this expression: div(,60).
    • Put your cursor inside the expression between the left parenthesis (() and the comma (,). Choose Dynamic content.
    • In the dynamic content list, select Travel Duration Traffic.
    • After the field resolves inside the expression, choose OK.
    • The result would be something like this: div(body('Get_route_and_travel_time_with_traffic')?['travelDurationTraffic'],60)
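The div(…, 60) expression is integer division of the Travel Duration Traffic value (seconds) by 60. A Python sketch of the same conversion (the 1341-second input is just an illustrative value):

```python
def to_minutes(travel_duration_traffic_seconds: int) -> int:
    """Equivalent of the workflow expression div(..., 60):
    integer division of a duration in seconds by 60."""
    return travel_duration_traffic_seconds // 60

print(to_minutes(1341))  # 22 minutes
print(to_minutes(59))    # 0 minutes (sub-minute durations round down)
```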

Compare Travel Time with Limit

Now, add a condition that checks whether the current travel time is greater than a specific limit.

  • Under the previous action, choose + New step > Add a condition.
  • Optionally rename the condition with this description: “If travel time exceeds limit”
  • Build a condition that checks whether travelTime exceeds your specified limit as described and shown here:
  • Inside the condition, click inside the Choose a value box, which is on the left (wide browser view) or on top (narrow browser view).
  • From either the dynamic content list or the parameter list, select the travelTime field under Variables.
  • In the comparison box, select this operator: is greater than
  • In the Choose a value box on the right (wide view) or bottom (narrow view), enter this limit: 10

Send Email When Limit Exceeded

Now, add an action that emails you when the travel time exceeds your limit. This email includes the current travel time and the extra time necessary to travel the specified route.

  • In the condition’s If true branch, choose Add an action.
  • Search for “send email” and select the email connector and the “send email action” that you want to use. In this example, I have selected a personal account using Gmail – Send email. It will ask you to authenticate. You can also use Outlook.com for personal live accounts and Office 365 Outlook for Azure work or school accounts.
  • Optionally rename the action with this description: “Send email with travel time”
  • In the To box, enter the recipient’s email address. For testing purposes, use your email address.
  • In the Subject box, specify the email’s subject, and include the travelTime variable.
    • Enter the text “Current travel time (minutes):” with a trailing space.
    • From either the parameter list or the dynamic content list, select travelTime under Variables.
  • In the Body box, specify the content for the email body.
    • Enter the text “Add extra travel time (minutes):” with a trailing space.
    • If necessary, widen your browser until the dynamic content list appears. In the dynamic content list, choose Expression.
    • In the expression editor, enter this expression so that you can calculate the number of minutes that exceed your limit: sub(,15).
    • Put your cursor inside the expression between the left parenthesis (() and the comma (,). Choose Dynamic content.
    • Under Variables, select travelTime.
    • After the field resolves inside the expression, choose OK.
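Putting the condition and the email expressions together, the branch behaves like this Python sketch. It mirrors the article's values exactly: the condition compares against 10, while the body expression subtracts 15:

```python
TRAVEL_TIME_LIMIT = 10  # minutes, as entered in the condition above

def email_lines(travel_time: int):
    """Sketch of the 'If travel time exceeds limit' branch.
    Returns None when no email would be sent, otherwise the subject
    and body lines, with extra time computed via sub(travelTime, 15)."""
    if travel_time <= TRAVEL_TIME_LIMIT:
        return None  # condition is false: the logic app does nothing
    extra = travel_time - 15  # sub(travelTime, 15)
    return (f"Current travel time (minutes): {travel_time}",
            f"Add extra travel time (minutes): {extra}")

print(email_lines(8))   # None: under the limit, no email
print(email_lines(25))  # subject and body lines for a 25-minute trip
```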

Save your Logic App.

Run your Logic App

To manually start your logic app, on the designer toolbar, choose Run. If the current travel time stays under your limit, your logic app does nothing else and waits for the next interval before checking again. But if the current travel time exceeds your limit, you get an email with the current travel time and the number of minutes above your limit. Here is an example email that your logic app sends:

Obviously, you can make a better email body than shown below, but I hope you get the idea!

Using the Time Series API in Microsoft Dynamics NAV 2017

Introduction

With the Microsoft Dynamics NAV 2017 release, Dynamics NAV enters the world of machine learning, and the first step in this journey is bringing the Time Series API to Dynamics NAV developers.

In this article you will see how to get an Azure Machine Learning experiment, publish an endpoint, and use the Time Series API to get predictions.

The following steps are required:

  • Create a model.
  • Publish an endpoint for the model so that we can access it.
  • Train the model and get predictions.
  • Use the information to our advantage in the application.

Process

Create Model

There is a publicly available model prepared by the Microsoft team, aimed at time series predictions. With a few clicks we can create a model from this public template.

We will open this publicly available model in Azure Machine Learning Studio, copy the experiment to our personal workspace, and then validate it.

To access the model, click here.

Click Open in Studio.

You will be asked to select a workspace. You can choose to select a free or standard workspace and sign in to Microsoft Azure Machine Learning Studio.

To copy the experiment from the gallery, you can choose the default settings.

After a couple of seconds, the experiment will be available in your Microsoft Azure Machine Learning Studio workspace. You can browse around the model, zooming in or out to see the different steps.

Before the model can be deployed as a web service, it must be validated. To do this, click the Run button.

Publish Endpoint

Now that the model has been created and tested, we will create an endpoint and generate an API key and a request URI so that we can consume the model.

In Microsoft Azure Machine Learning Studio, click the Deploy Web Service button. The system will deploy the machine learning experiment as a web service and provide an API key, which can be consumed by a wide range of services.

When the deployment finishes, the Web services dashboard opens. Perform the following steps:

Copy the API key (and paste it somewhere, such as Notepad).

Select the REQUEST/RESPONSE link to open the API page. On this page we require the Request URI, which you can also copy into Notepad.

So at the end we have our API Key and Request URI:

Get Predictions

We will call the Time Series API from the Microsoft Dynamics NAV development environment to get predictions on data coming from the Item Ledger Entry table, and also check the quality of the predictions programmatically before they are presented to the end user.

Open the development environment and create a new codeunit (say 50010, Forecast Sample).

In the Global Variables, create two text constants as shown below for the API Key (say KEYTXT) and the Request URI (say URITXT), and paste the values that you copied earlier.

Create a new function, say CalculateForecast and specify the following Parameters, Return Value and local variables:

The TimeSeriesLibrary variable will help prepare the data, submit it to Azure Machine Learning, and get back the prediction. The actual implementation is a .NET DLL; that DLL is wrapped by the TimeSeriesLibrary variable, which exposes C/AL functions to developers.

In this scenario we would like to get a prediction for Item sales, so we will get the data from the Item Ledger Entry table.

Enter the following code:

The whole task can be performed with four functions of the TimeSeriesLibrary variable:

  • The Initialize function sets up the connection.
  • The PrepareData function transforms any table data into a dataset that is ready for submission.
  • The Forecast function calls Azure Machine Learning with the prediction interval (in this case one month).
  • The GetForecast function returns a dataset with the forecast values.
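TimeSeriesLibrary itself is a .NET DLL calling Azure Machine Learning, but its flow can be sketched conceptually: aggregate history into periods, forecast the next one, and report an uncertainty band. The naive mean/standard-deviation forecast below is only an illustration of the shape of the flow, not the algorithm Azure ML actually uses:

```python
from statistics import mean, stdev

def prepare_data(ledger_entries):
    """PrepareData (conceptually): aggregate (period, quantity) entries
    into one total per period, ordered by period."""
    totals = {}
    for period, qty in ledger_entries:
        totals[period] = totals.get(period, 0) + qty
    return [totals[p] for p in sorted(totals)]

def forecast(history):
    """Forecast/GetForecast (conceptually): predict the next period's value
    with a ~95% band (here crudely mean +/- 2 standard deviations)."""
    value = mean(history)
    delta = 2 * stdev(history)
    return value, delta

# Toy item-ledger history: (period number, quantity sold)
history = prepare_data([(1, 2), (1, 1), (2, 4), (3, 3), (4, 3)])
value, delta = forecast(history)
print(round(value, 2), round(delta, 2))
```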

As an example, we will run the forecast for item 8908-W (Computer – Highline Package). Run the codeunit and check the output:

As output we can see the calculated forecast: the system expects sales in the next month to be about 3.

So how good is the prediction?

Let us create a list page based around Time Series Forecast.

Go back to the codeunit and create another function, say CalculateForecastBulk, with the following code, which will run the page:

This should run the forecast for all the items. Go ahead and run the codeunit.

The following page appears with a number of columns:

  • Group ID contains the item number.
    • Period No. is the predicted period number.
  • The Value is the predicted Quantity.
    • The Delta defines the range within which the predicted value will fall with a probability of 95%. For example, for the highlighted item it means that with 95% probability the sales next month will be between 1.5 and 5, with a central point of 3.27.
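In other words, the 95% interval is simply the predicted value plus or minus the Delta. A quick sketch (the 3.27/1.77 numbers are illustrative, chosen to match the highlighted row's 1.5–5 range):

```python
def prediction_interval(value: float, delta: float):
    """The Delta column: the predicted value falls in
    [value - delta, value + delta] with ~95% probability."""
    return value - delta, value + delta

low, high = prediction_interval(3.27, 1.77)
print(round(low, 1), round(high, 1))  # 1.5 5.0
```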

Use the Information

Now let us use the received information in the application, for example to create a Purchase Order.

Again we move to the codeunit and create a new function called CreatePurchaseOrder, which will create a purchase order depending on the variables that we pass.

NOTE: Call this function from the CalculateForecastBulk function.

If you look closely at the function, you will notice that we calculate the ratio between the predicted Quantity and the Delta. If it is bigger than Bar (which is passed as a parameter), the system skips the line.

So depending on the predictions on our model, lines will either be added to our purchase order or not.
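Since the C/AL code is not shown, here is one plausible reading of that check as a Python sketch: a line is included only when the prediction is confident enough, i.e. when the Delta-to-Quantity ratio does not exceed Bar. The specific numbers are illustrative:

```python
def include_in_purchase_order(quantity: float, delta: float, bar: float) -> bool:
    """Assumed interpretation of the CreatePurchaseOrder check: skip the
    line when the prediction is too uncertain, i.e. when the ratio
    delta/quantity exceeds the Bar parameter."""
    if quantity <= 0:
        return False  # nothing (or a negative amount) predicted: skip
    return delta / quantity <= bar

print(include_in_purchase_order(quantity=3.27, delta=1.77, bar=0.7))  # confident enough
print(include_in_purchase_order(quantity=2.0, delta=1.9, bar=0.7))    # too uncertain
```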

So go ahead and run the codeunit and check the result. It should create a Purchase Order.

Use the following Forecast Sample codeunit.