Azure Search SDK in Government

So I’ve been working on a demo project using Azure Search, and if you’ve followed this blog for a while, you know I do a lot of work that requires Azure Government. Recently I needed to implement a search that would be called via an Azure Function and required passing latitude and longitude to facilitate searching within a specific distance. So I started to build my Azure Function using the SDK, and what I ended up with looked a lot like this.

First, to be able to interact with my search service, I needed to install the following NuGet package:

Microsoft.Azure.Search

And upon doing so, I found some pretty good documentation here for building the search client. So I built out a GeoSearchProvider class that looked like the following:

NOTE: I use a custom class called IConfigurationProvider, which encapsulates my configuration store. In most cases that’s Key Vault, but it can be a variety of other options.
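Judging purely by how it’s used in the code below, that configuration abstraction presumably boils down to something like this (a sketch on my part, not the actual class):

using System.Threading.Tasks;

// Hypothetical shape of the configuration abstraction, inferred from usage below.
public interface IConfigurationProvider
{
    Task<string> GetSetting(string name);
}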

using System;
using System.Threading.Tasks;
using Microsoft.Azure.Search;
using Microsoft.Azure.Search.Models;
using Microsoft.Extensions.Logging;

public class GeoSearchProvider : IGeoSearchProvider
{
    readonly IConfigurationProvider _configurationProvider;

    public GeoSearchProvider(IConfigurationProvider configurationProvider)
    {
        _configurationProvider = configurationProvider;
    }

    public async Task<DocumentSearchResult<SearchResultModel>> RunSearch(string text, string latitude, string longitude, string kmdistance, ILogger log)
    {
        // Fall back to a default distance from configuration when none is provided.
        if (String.IsNullOrEmpty(kmdistance))
        {
            kmdistance = await _configurationProvider.GetSetting("SearchDefaultDistance");
        }

        // Pull the search service settings from the configuration store.
        var serviceName = await _configurationProvider.GetSetting("SearchServiceName");
        var serviceApiKey = await _configurationProvider.GetSetting("SearchServiceApiKey");
        var indexName = await _configurationProvider.GetSetting("SearchServiceIndex");

        SearchIndexClient indexClient = new SearchIndexClient(serviceName, indexName, new SearchCredentials(serviceApiKey));

        var parameters = new SearchParameters()
        {
            Select = new[] { "...{list of fields}..." },
            // WKT geography points are ordered longitude first, then latitude.
            Filter = string.Format("geo.distance(location, geography'POINT({0} {1})') le {2}", longitude, latitude, kmdistance),
            // Required for results.Count below to be populated.
            IncludeTotalResultCount = true
        };

        var logmessage = await _configurationProvider.GetSetting("SearchLogMessage");

        try
        {
            var results = await indexClient.Documents.SearchAsync<SearchResultModel>(text, parameters);

            log.LogInformation(string.Format(logmessage, text, latitude, longitude, kmdistance, results.Count.ToString()));

            return results;
        }
        catch (Exception ex)
        {
            log.LogError(ex.Message);
            log.LogError(ex.StackTrace);
            throw; // rethrow without resetting the stack trace
        }
    }
}

The above code seems pretty straightforward and will run just fine to get back my search results. I even built in logic so that if I don’t give it a distance, it will take a default from the configuration store. Pretty slick.
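Since the whole point was to call this from an Azure Function, here’s a rough sketch of what that function wrapper might look like. To be clear, this is my illustration, not the original code: the function name, route, query parameters, and the dependency injection registration for IGeoSearchProvider are all assumptions.

using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

// Hypothetical HTTP-triggered function that fronts the provider.
// Assumes IGeoSearchProvider is registered via Functions dependency injection.
public class GeoSearchFunction
{
    private readonly IGeoSearchProvider _searchProvider;

    public GeoSearchFunction(IGeoSearchProvider searchProvider)
    {
        _searchProvider = searchProvider;
    }

    [FunctionName("GeoSearch")]
    public async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "get")] HttpRequest req,
        ILogger log)
    {
        // Pull the search text and coordinates off the query string.
        string text = req.Query["text"];
        string latitude = req.Query["latitude"];
        string longitude = req.Query["longitude"];
        string distance = req.Query["distance"];

        var results = await _searchProvider.RunSearch(text, latitude, longitude, distance, log);
        return new OkObjectResult(results);
    }
}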

And I pretty quickly ran into a problem: a “Host Not found” error.

And I racked my brain on this for a while before I discovered the cause. By default, the Azure Search SDK talks to Azure Commercial, not Azure Government. After picking through the documentation, I found the fix: there is a property called SearchDnsSuffix, which lets you set the DNS suffix used to reach the search service. By default it is “search.windows.net”. I changed my code to the following:

using System;
using System.Threading.Tasks;
using Microsoft.Azure.Search;
using Microsoft.Azure.Search.Models;
using Microsoft.Extensions.Logging;

public class GeoSearchProvider : IGeoSearchProvider
{
    readonly IConfigurationProvider _configurationProvider;

    public GeoSearchProvider(IConfigurationProvider configurationProvider)
    {
        _configurationProvider = configurationProvider;
    }

    public async Task<DocumentSearchResult<SearchResultModel>> RunSearch(string text, string latitude, string longitude, string kmdistance, ILogger log)
    {
        // Fall back to a default distance from configuration when none is provided.
        if (String.IsNullOrEmpty(kmdistance))
        {
            kmdistance = await _configurationProvider.GetSetting("SearchDefaultDistance");
        }

        // Pull the search service settings from the configuration store.
        var serviceName = await _configurationProvider.GetSetting("SearchServiceName");
        var serviceApiKey = await _configurationProvider.GetSetting("SearchServiceApiKey");
        var indexName = await _configurationProvider.GetSetting("SearchServiceIndex");
        var dnsSuffix = await _configurationProvider.GetSetting("SearchSearchDnsSuffix");

        SearchIndexClient indexClient = new SearchIndexClient(serviceName, indexName, new SearchCredentials(serviceApiKey));
        // Point the client at the right cloud; the default suffix is "search.windows.net".
        indexClient.SearchDnsSuffix = dnsSuffix;

        var parameters = new SearchParameters()
        {
            Select = new[] { "...{list of fields}..." },
            // WKT geography points are ordered longitude first, then latitude.
            Filter = string.Format("geo.distance(location, geography'POINT({0} {1})') le {2}", longitude, latitude, kmdistance),
            // Required for results.Count below to be populated.
            IncludeTotalResultCount = true
        };

        //TODO - Define sorting based on distance (see the sketch after this block)

        var logmessage = await _configurationProvider.GetSetting("SearchLogMessage");

        try
        {
            var results = await indexClient.Documents.SearchAsync<SearchResultModel>(text, parameters);

            log.LogInformation(string.Format(logmessage, text, latitude, longitude, kmdistance, results.Count.ToString()));

            return results;
        }
        catch (Exception ex)
        {
            log.LogError(ex.Message);
            log.LogError(ex.StackTrace);
            throw; // rethrow without resetting the stack trace
        }
    }
}
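As for the sorting TODO above, the SDK’s SearchParameters exposes an OrderBy collection, so a minimal sketch (assuming the same location field and coordinate variables as above) would order by the same distance expression used in the filter:

// Minimal sketch for the sorting TODO: return the nearest results first.
// Assumes the same "location" field and coordinates as the filter above.
parameters.OrderBy = new[]
{
    string.Format("geo.distance(location, geography'POINT({0} {1})') asc", longitude, latitude)
};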

I then set “SearchSearchDnsSuffix” to “search.azure.us” for Government, and it all immediately worked.

Weekly – 4/13

So here we are, another week in quarantine. We had Easter, and my family was still able to make it special without seeing extended family as much as we’d like. The one thing we did was have my parents and my in-laws hide eggs for my kids (cleaned, of course); then we arrived, stayed outside, and the kids hunted for the eggs. It actually was a lot of fun and a good way to do it while social distancing.

Down to business..

Fun Stuff:

So I love storytelling and have always found it fascinating. I’m a movie fan, a comic fan, a TTRPG player, etc. I’ve tried my hand at writing several times, but I’ve never released anything; my writing loses out to competing priorities. Still, it is cool to see ideas for new ways to tell stories. This week I saw @CSharpFritz post on Twitter about how he likes to write, and he mentioned RenPy, a Python-based framework for writing choose-your-own-adventure games and releasing them as mobile apps. I researched it a bunch over the weekend, and it’s pretty awesome.

Power BI Embedded … confused?

So I wanted to write this post because I’ve gotten a lot of questions about this over the past year. Power BI Embedded is a pretty awesome tool. The idea is this: “I want to get cool visualizations into my application, how do I do that?” The answer is Power BI Embedded, and here’s a video for those unfamiliar with the product.

But for me, the question that usually comes next is the one I want to cover here: “How do I get this?” There’s a lot of confusion when it comes to Power BI, and that’s because it can be licensed from a couple of different places.

Explaining the types of Power BI:

There are essentially three flavors of Power BI:

Power BI Pro: These are individual licenses for those who will be working on the back end to build visualizations, and for those who will be provisioning capacity in Azure.

Power BI Premium: This service is designed around providing dedicated capacity for running data refreshes and visualizations for your Power BI implementation. It allows for managing workspaces in the Power BI portal, and it also supports the embedded functionality. The primary difference is that this is an Office 365 SKU, so a partner has to purchase licenses through their reseller to add capacity. Each license (EM1, EM2, EM3, P1, P2, P3) provides a different amount of capacity, found here.

One item worth mentioning on the above SKUs is that you will see the cores separated into “back-end” and “front-end”: the back-end cores are responsible for data refreshes, and the front-end cores handle visualization. This is important because if you implement an EM1 SKU, you are sharing a single core between the two, which can cause issues with timeouts.

Power BI Embedded: This service is targeted more at ISVs and leverages Azure to provide the capacity. The SKUs are basically identical, but the primary difference is that you can add capacity through the Azure portal, and it is allocated on a consumption model. Ultimately this can be cheaper, and capacity is easier to add should you need more.

At its core, how does this work?

Power BI functions on the idea of workspaces, which are created in the Power BI portal; PBIX files containing data sets and visualizations are then uploaded to them. Once a workspace is available, capacity has to be added for the processing. This capacity can come from Office SKUs or Azure, depending on how you configure it.

So let me answer some questions about what you might want to do.

I want to render visualizations in my application, how do I do that?

For this use case you really want Power BI Embedded, with a few Power BI Pro licenses. Purchasing Power BI Premium requires working with a reseller to buy licenses and then working strictly within an Office portal to create the capacity. You will also be paying for a lot of features you really don’t care about.

For Independent Software Vendors (ISVs) it makes a lot more sense to just buy Power BI Embedded; it’s transacted in the Azure portal, which makes it very easy to create capacity and scale up as needed.

You will need Power BI Pro licenses as well, for the following use cases, but these are really cheap (a few dollars at the time of this post):

  • Any developer who will be building visualizations.
  • Any operations person (or service account) that will be provisioning or managing capacity.
  • Any service accounts that will be handling communication between the application UI and Power BI. This is required because without a license you will be throttled on the number of tokens you can generate (a rough sketch of that token call follows this list).
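To make that last bullet concrete, here is a rough sketch of the kind of token-generation call such a service account ends up making, using the Microsoft.PowerBI.Api client library. The helper name, the AAD token acquisition, and the workspace/report IDs are all assumptions for illustration, and exact method signatures vary by package version:

using System;
using System.Threading.Tasks;
using Microsoft.PowerBI.Api;
using Microsoft.PowerBI.Api.Models;
using Microsoft.Rest;

// Hypothetical helper; aadAccessToken is assumed to come from an AAD login
// performed as the service account.
public static class EmbedTokenHelper
{
    public static async Task<EmbedToken> GetEmbedTokenAsync(string aadAccessToken, Guid workspaceId, Guid reportId)
    {
        var credentials = new TokenCredentials(aadAccessToken, "Bearer");

        // Commercial endpoint shown; US Government uses https://api.powerbigov.us instead.
        using (var client = new PowerBIClient(new Uri("https://api.powerbi.com"), credentials))
        {
            // Generate a view-only embed token the application UI can use to render the report.
            var request = new GenerateTokenRequest(accessLevel: "View");
            return await client.Reports.GenerateTokenInGroupAsync(workspaceId, reportId, request);
        }
    }
}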

What is the difference between Gov and Commercial in Power BI?

So for an implementation of Power BI, you will require Power BI Pro licenses for the following:

  • Developers working on Power BI visualizations
  • Administrators who manage the Power BI Workspaces
  • Service Accounts from Apps that leverage Workspaces

For Government specifically, you cannot access the Power BI Embedded functionality in the portal without a Power BI Pro license.

One thing worth mentioning: if you are purchasing Power BI Pro licenses with the intention of using Government, you will need Power BI Pro GCC High, as these are the only licenses that can attach to your Azure AD accounts in the Government cloud.

How do I purchase Power BI licenses?

Here is a link that talks you through purchasing Power BI Premium. For Power BI Embedded, here’s a link that explains the process in the Azure portal.

How do I know how much capacity I need?

There is a great link here that talks about the different SKUs for Power BI Embedded; specifically, it empowers you to choose the appropriate memory and vCore configuration to provision for your workload.

So the question becomes, “How do I know how much I need?”

The capacity required really depends on four elements:

  1. The amount of data being sent over and consumed.
  2. The complexity of any transformations done within Power BI.
  3. The complexity of the visualization.
  4. The demand on the application.

Here’s a whitepaper that was put out for Power BI Capacity planning for Embedded.

Hope that helps?

Hope that helps you make sense of Power BI licensing. There is a lot of confusion around it, and hopefully this clears things up.

Creating Terraform Scripts from existing resources in Azure Government

Lately I’ve been doing a lot of work with Terraform, and one of the questions I’ve gotten a lot is about the ability to create Terraform scripts based on existing resources.

So the use case is the following: you are working on projects, or are part of an organization, that has a lot of resources in Azure, and you want to start using Terraform for a variety of reasons:

  • Being able to iterate on your infrastructure
  • Consistency of environment management
  • Code History of changes

The good news is there is a tool for that. The tool can be found here on GitHub, along with a list of prerequisites. I’ve used this tool in Azure Commercial and have been really happy with the results, and I wanted to use it with Azure Government.

NOTE => The prerequisites are listed on the az2tf tool’s page, but one they didn’t list that I needed to install was jq, via “apt-get install jq”.

Next we need to configure our environment for running Terraform. For me, that meant the environment I had already configured for Terraform work. In the Git repo, there is a PC Setup document that walks you through how to configure your environment with VS Code and Terraform. I was then able to clone the Git repo and execute the az2tf tool using an Ubuntu subsystem on my Windows 10 machine.

Now, the tool, az2tf, was built to work with Azure Commercial, and there is one change that has to be made for it to leverage Azure Government.

Once you have the environment created and the prerequisites are all present, you can open a “Terminal” window in VS Code and connect to Azure Government (for example, by running “az cloud set --name AzureUSGovernment” followed by “az login”).

In the ./scripts/resources.sh and ./scripts/resources2.sh files, you will find the following on line 9:

ris=`printf "curl -s  -X GET -H \"Authorization: Bearer %s\" -H \"Content-Type: application/json\" https://management.azure.com/subscriptions/%s/resources?api-version=2017-05-10" $bt $sub`

Please change this line to the following:

ris=`printf "curl -s  -X GET -H \"Authorization: Bearer %s\" -H \"Content-Type: application/json\" https://management.usgovcloudapi.net/subscriptions/%s/resources?api-version=2017-05-10" $bt $sub`

You can then run the “az2tf” tool by running the following command in the terminal:

./az2tf.sh -s {Subscription ID} -g {Resource Group Name}

This will generate the scripts, and you will see a new folder created in the structure named “tf.{Subscription ID}”; inside it will be all of the configuration scripts to set up the environment.

Terraform Kubernetes Template

Hello all, I wanted to get a quick blog post out based on something I worked on that is finally seeing the light of day. I’ve been doing a lot of work with Terraform, and one of the use cases I found was standing up a Kubernetes cluster. Specifically, I’ve been working with Azure Government, which does not have AKS available. So how can I build a Kubernetes cluster, minimize the lift of creating it, and make it easy to add nodes afterwards? The end result of that goal is here.

Below is a description of the project. If you’d like to contribute, please do; I have some ideas for phase 2 of this that I’m going to build out, but I’d love to see what others come up with.

Intention:

The purpose of this template is to provide an easy-to-use approach to using an infrastructure-as-a-service deployment to stand up a Kubernetes cluster on Microsoft Azure. The goal is that you can start fresh with a standardized approach and preconfigured master and worker nodes.

How does it work?

This template creates a master node and as many worker nodes as you specify, and during creation it will automatically execute the scripts required to join those nodes to the cluster. The benefit of this is that once your cluster is created, all that is required to add additional nodes is to increase the count of the “lkwn” VM type and reapply the template. This will cause the new VMs to be created, and the cluster will start orchestrating them automatically.

This template can also be built into a CI/CD pipeline to automatically provision the kubernetes cluster prior to pushing pods to it.

This guide is designed to help you navigate the use of this template to standup and manage the infrastructure required by a kubernetes cluster on azure. You will find the following documentation to assist:

  • Configure Terraform Development Environment: This page provides details on how to set up your local machine to leverage this template and do development using Terraform, Packer, and VS Code.
  • Use this template: This document walks you through how to leverage this template to build out your kubernetes environment.
  • Understanding the template: This page describes how to understand the Terraform template being used and walks you through its structure.

Key Contributors!

A special thanks to the following people who contributed to this template:
Brandon Rohrer, who introduced me to this template structure and how it works, and who assisted with optimizing the functionality provided by this template.