
Salt-Api with Saltstack in 2018

This article is about SaltStack. SaltStack, also known as Salt, is a configuration management and orchestration tool. It uses a central repository to provision new servers and other IT infrastructure, to make changes to existing ones, and to install software in IT environments, including physical and virtual servers as well as the cloud.
This post is specific to salt-api, which is part of open Salt. Many of the articles about salt-api on the internet date from the 2014–2016 period, when salt-api was a separate project from salt-master; it has since been merged into the Salt core.
This is a walkthrough of using salt-api on Ubuntu 16/17/18.

Step 1

Install the salt-api package. This installs all the dependencies salt-master needs to host the salt-api HTTP server.
sudo apt-get install salt-api
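You can verify the install afterwards; the --version flag is assumed to follow the usual Salt CLI pattern:
salt-api --version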

Step 2

Configure external auth
salt-api is flexible and can integrate with PAM, LDAP and other authentication backends:
external_auth:
  pam:
    saltuser:
      - '.*'
      - '@wheel'   # to allow access to all wheel modules
      - '@runner'  # to allow access to all runner modules
      - '@jobs'    # to allow access to the jobs runner and/or wheel module
What this config says:
External auth is 'pam'.
saltuser is the Linux system user, which has the following permissions:
.* – everything
'@wheel' – allow access to all wheel modules
'@runner' – allow access to all runner modules
'@jobs' – allow access to the jobs modules
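Since PAM authenticates against a real system user, saltuser must exist on the box. If it doesn't yet, a typical way to create it (username taken from the config above):
sudo useradd -m saltuser
sudo passwd saltuser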

Step 3

Configure the rest_cherrypy server, which takes the port (8000) plus the ssl_crt and ssl_key certificate paths. I have created self-signed certificates with openssl for this. Note that disable_ssl: True turns SSL off, which is only suitable for local testing.
rest_cherrypy:
  port: 8000
  ssl_crt: /etc/salt/tls/certs/localhost.crt
  ssl_key: /etc/salt/tls/certs/localhost.key
  disable_ssl: True
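For reference, here is a typical openssl one-liner to generate the self-signed certificate and key (paths assumed to match the config above):
sudo mkdir -p /etc/salt/tls/certs
sudo openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
    -subj "/CN=localhost" \
    -keyout /etc/salt/tls/certs/localhost.key \
    -out /etc/salt/tls/certs/localhost.crt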
These changes were made in a separate file:
/etc/salt/master.d/salt-api.conf
The complete file looks something like this:
external_auth:
  pam:
    saltuser:
      - '.*'
      - '@wheel'   # to allow access to all wheel modules
      - '@runner'  # to allow access to all runner modules
      - '@jobs'    # to allow access to the jobs runner and/or wheel module

rest_cherrypy:
  port: 8000
  ssl_crt: /etc/salt/tls/certs/localhost.crt
  ssl_key: /etc/salt/tls/certs/localhost.key
  disable_ssl: True

Step 4

Restart the salt-master service
# systemctl restart salt-master
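The REST interface runs as its own service on the Ubuntu packages, so restart that as well:
# systemctl restart salt-api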

Step 5

Test with curl:
curl -si localhost:8000/login \
-c ~/cookies.txt \
-H "Accept: application/json" \
-H "Content-type: application/json" \
-d '{
    "username": "saltuser",
    "password": "saltuser",
    "eauth": "pam"
}'
The output should be similar to this:
> -c ~/cookies.txt \
> -H "Accept: application/json" \
> -H "Content-type: application/json" \
> -d '{
>     "username": "saltuser",
>     "password": "saltuser",
>     "eauth": "pam"
> }'
HTTP/1.1 200 OK
Content-Length: 206
Access-Control-Expose-Headers: GET, POST
Vary: Accept-Encoding
Server: CherryPy/3.5.0
Allow: GET, HEAD, POST
Access-Control-Allow-Credentials: true
Date: Mon, 09 Jul 2018 10:03:26 GMT
Access-Control-Allow-Origin: *
X-Auth-Token: 464914c055cfa5529865564567eb7782554af025
Content-Type: application/json
Set-Cookie: session_id=464914c055cfa5529865564567eb7782554af025; expires=Mon, 09 Jul 2018 20:03:26 GMT; Path=/
{"return": [{"perms": [".*", "@wheel", "@runner", "@jobs"], "start": 1531130606.41419, "token": "464914c055cfa5529865564567eb7782554af025", "expire": 1531173806.414191, "user": "saltuser", "eauth": "pam"}]}

Step 6

Test the function test.ping with curl:
curl -b ~/cookies.txt -si localhost:8000/ \
    -H "Accept: application/json" \
    -d client='local' -d tgt='*' -d fun='test.ping'
If you get output listing your minions, you are all set with salt-api on the 2018.3 version of Salt.
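Instead of the cookie jar, you can also pass the token from the /login response in the X-Auth-Token header; a sketch using the token from the output above:
curl -si localhost:8000/ \
    -H "Accept: application/json" \
    -H "X-Auth-Token: 464914c055cfa5529865564567eb7782554af025" \
    -d client='local' -d tgt='*' -d fun='test.ping'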

Transactions in Microservices

When working with a microservice architecture, one usually finds that managing data is the hardest part.

In monolithic applications, you would leverage database transactions to achieve data consistency.

But with microservices, each service is a separate process, and achieving consistency is a big task.

I have created Google Slides comparing the different patterns that can be used to achieve data consistency, also known as eventual consistency, across microservices.

Globalization and localization in ASP.NET Core – Part 3

If you just landed on this page, it is part of a series of posts on localization / globalization:

Part 1 – Introduction

Part 2 – ASP.NET Core ways of Localization / Globalization

Part 3 – Old trustworthy ResourceManager

If you are following the series of posts, in this post I talk about working with ResourceManager.

The ResourceManager class is nothing new; it has been there since the inception of the .NET Framework. It was the de facto mechanism for working with localized resource files before ASP.NET Core introduced IStringLocalizer.

In this post, I want to demonstrate that ResourceManager still works in .NET Core, if you don't want to go with the fuzzy IStringLocalizer.

Let's dive into the code. In the sample application, you will find two methods in HelloController. In the Local method:

 // requires: using System.Reflection; using System.Resources;
 string rsFileName = "api1.Resources.rs1";
 ResourceManager rm =
       new ResourceManager(rsFileName, Assembly.GetExecutingAssembly());
 string msg = rm.GetString("Hello");

Here, it creates a new instance of ResourceManager, which takes the basename and the assembly where the resource files are embedded.

The basename is the root name of the resource file, without its extension but including any fully qualified namespace. For example, the root name for the resource file named MyApplication.MyResource.en-US.resources is MyApplication.MyResource.

The assembly needs to be the one where the resource files are embedded. In our case it is the same one that contains the running code (HelloController). I can get the current assembly with Assembly.GetExecutingAssembly(), which returns the api1.dll assembly.

In our case, the resource files that ResourceManager will read are named rs1.resx and placed in the "Resources" folder. Hence, the basename is api1.Resources.rs1.

ResourceManager.GetString looks at the current culture on the running thread and looks for the corresponding culture-specific resource file to get the value of the key "Hello".

(Screenshot: culture-specific resource files in the project)

If a culture-specific resource is not found, say for "de-DE", it falls back to the default resource, the one without any culture suffix, to get the value of the key. If the key isn't found there either, GetString returns null.

If you need the resource value for a culture other than the current culture of the running thread, you can do so by specifying the culture in the GetString method.

string msg3 = rm.GetString("Hello", new System.Globalization.CultureInfo("fr-FR"));

As simple as that.

Now, level 2. I had a requirement where the code reading the resource was in a different assembly/dll than the one where the resource files were located.

If you look into the lib method, it calls the Lib1.ResRdr library to get the message from the resource file. But rs1.resx is in api1.dll, not in Lib1.ResRdr.dll.

 public string lib()
 {
   string msg = Lib1.ResRdr.Messenger.GetHello();
   return msg;
 }

If you look into the GetHello() method:

public static string GetHello()
 {
   // The assembly that called into this library (api1.dll or api2.dll),
   // not this library itself
   Assembly asm = Assembly.GetCallingAssembly();
   string rsFileName = asm.GetName().Name + ".Resources.rs1";
   ResourceManager rm = new ResourceManager(rsFileName, asm);
   return rm.GetString("Hello");
 }

It creates a new ResourceManager, passing the basename and the assembly that contains the embedded resources. Since this is a separate dll, it gets the calling assembly via Assembly.GetCallingAssembly(), which returns api1.dll or api2.dll, whichever called Lib1.ResRdr.dll. Note that if you had used Assembly.GetExecutingAssembly() here, you would get Lib1.ResRdr.dll itself.

For the basename, since the resources are embedded in the calling dll, the root namespace needs to be taken from that assembly's name, i.e. api1 or api2. So the basename would be api1.Resources.rs1 or api2.Resources.rs1.

The rest is taken care of by ResourceManager: it reads the remote assembly's embedded resources and its satellite assemblies to get the translated value.

That's all for globalization and localization in ASP.NET Core. Hope you found this series of posts informative.

Globalization and localization in ASP.NET Core – Part 2

If you just landed on this page, it is part of a series of posts on localization / globalization:

Part 1 – Introduction

Part 2 – ASP.NET Core ways of Localization / Globalization

Part 3 – Old trustworthy ResourceManager

In this post, I highlight the ASP.NET Core way of localization / globalization. There is already a good post on this in the official Microsoft docs for ASP.NET Core at https://docs.microsoft.com/en-us/aspnet/core/fundamentals/localization, and I am not going to repeat what's in that post, but I will share a few highlights from it.

In order to work with IStringLocalizer, you need to add the following to Startup.cs:

// Add Localization services to the system
services.AddLocalization(options => options.ResourcesPath = "Resources");

// Configure supported cultures and localization options
services.Configure<RequestLocalizationOptions>(options =>
{
    var supportedCultures = new[]
    {
        new CultureInfo("en-US"),
        new CultureInfo("de-DE")
    };

    // State what the default culture for your application is.
    // This will be used if no specific culture can be determined for
    // a given request.
    options.DefaultRequestCulture = new RequestCulture("en-US", "en-US");

    // You must explicitly state which cultures your application supports.
    // These are the cultures the app supports for formatting
    // numbers, dates, etc.
    options.SupportedCultures = supportedCultures;

    // These are the cultures the app supports for UI strings,
    // i.e. we have localized resources for.
    options.SupportedUICultures = supportedCultures;
});

In Configure in Startup.cs:

// Configure localization.
var locOptions = app.ApplicationServices.GetService<IOptions<RequestLocalizationOptions>>();
app.UseRequestLocalization(locOptions.Value);

This code (UseRequestLocalization) sets the current culture and the current UI culture on the request execution thread.

By default, it determines the culture from the query string, a cookie, or the Accept-Language request header. Read the first post for some insight.

In order to understand better, I have created a sample application and put it on GitHub at https://github.com/sbrakl/aspnetcoreglobalization, where you can see the execution in action.

In HelloController.cs, in the local method of api1, you can see this code:

string cul = Thread.CurrentThread.CurrentCulture.Name;
string culUI = Thread.CurrentThread.CurrentUICulture.Name;

If you debug the application and your Startup.cs initialization code is correct, you will see the thread culture being set.

In the same method, the localized string is fetched from the IStringLocalizer _localizer:

msg = _localizer["Hello"];

This _localizer is injected into the controller by ASP.NET Core dependency injection, and it reads from the HelloController resources.
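For completeness, a minimal sketch of how the localizer is typically injected (constructor shape assumed, not copied from the sample):

using Microsoft.Extensions.Localization;

public class HelloController : Controller
{
    private readonly IStringLocalizer<HelloController> _localizer;

    // ASP.NET Core DI resolves IStringLocalizer<HelloController>
    // through ResourceManagerStringLocalizerFactory
    public HelloController(IStringLocalizer<HelloController> localizer)
    {
        _localizer = localizer;
    }
}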

(Screenshot: HelloController resource files under the Resources folder)

The placement and naming of these resource files are important; otherwise, the localizer won't be able to read them. Naming and placement are covered in detail at https://docs.microsoft.com/en-us/aspnet/core/fundamentals/localization#resource-file-naming. That's all you need to work with IStringLocalizer.

If you are as curious as me to find out how IStringLocalizer works, you need to check out its official GitHub repository, https://github.com/aspnet/Localization

.NET Core is open source, and you can read its code. When dependency injection (DI) is asked to inject an IStringLocalizer, it calls ResourceManagerStringLocalizerFactory, which in turn creates an instance of ResourceManagerStringLocalizer.

When ResourceManagerStringLocalizerFactory creates the instance of ResourceManagerStringLocalizer, it passes in a new instance of ResourceManager:

protected virtual ResourceManagerStringLocalizer
    CreateResourceManagerStringLocalizer(
        Assembly assembly,
        string baseName)
{
    return new ResourceManagerStringLocalizer(
        new ResourceManager(baseName, assembly),
        assembly,
        baseName,
        _resourceNamesCache,
        _loggerFactory.CreateLogger<ResourceManagerStringLocalizer>());
}

ResourceManager then in turn works with the resource files to get the translated text for the key:

msg = _localizer["Hello"];

Here, "Hello" is the key, and whatever value is in the resource file (the resource + culture file, to be specific) is what you get back.

IStringLocalizer -> DI -> ResourceManagerStringLocalizerFactory -> ResourceManagerStringLocalizer -> ResourceManager

That's the flow of how IStringLocalizer translates messages.

A few interesting points here:

Type and Resource File Naming

IStringLocalizer takes a type, i.e.

IStringLocalizer<HelloController> _localizer

HelloController is the type. Your resource files need to carry the HelloController name, and there is no default-language resource file. What does that mean?

In the ResourceManager era, the default-language resource file was the one without a culture suffix; for example, an abc resource would be named:

"abc.resx"
"abc.fr-FR.resx" //French
"abc.es-ES.resx" //Spanish

"abc.resx" is the default: if any culture other than fr-FR or es-ES is passed, this file is used.

But when working with IStringLocalizer, your default language is set in Startup.cs (see the Startup.cs code at the top). Hence, your resource files should be named:

"HelloController.en-US.resx"
"HelloController.fr-FR.resx"
"HelloController.es-ES.resx"


Location of resource files

In the Startup.cs code, you mention the resources path, i.e. where to find all the resources. It doesn't need to be "Resources"; it could be anything. When the assembly is built, the resource files go into the dll as embedded resources, and ASP.NET Core builds satellite assemblies for the different cultures you have mentioned.

(Screenshot: resource files in the build output directory)

By definition, satellite assemblies do not contain code, except for that which is auto generated. Therefore, they cannot be executed as they are not in the main assembly. However, note that satellite assemblies are associated with a main assembly that usually contains default neutral resources (for example, en-US). A big benefit is that you can add support for a new culture or replace/update satellite assemblies without recompiling or replacing the application’s main assembly.
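In the build output, this typically looks like the following (framework folder name assumed):

bin/Debug/netcoreapp2.0/
    api1.dll                    <-- main assembly, neutral/default resources
    de-DE/api1.resources.dll    <-- German satellite assembly
    en-US/api1.resources.dll    <-- US English satellite assembly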

(Screenshot: dotPeek view of the assembly)

ResourceManager finds the resources embedded in the assembly / satellite assemblies based on the resources path and the type name.

(Screenshot: resource names inside the assembly)

Further

This all works well when you are doing the translation in the same assembly where the resource files are located. But in my case, the resource files were in one assembly, and the code reading them was in another. This is where ASP.NET Core's IStringLocalizer faded and became limiting for me. I had to resort to the good old ResourceManager, which I describe in the next post.

Globalization and localization in ASP.NET Core – Part 1

This post is about my experience with globalization and localization in an ASP.NET Core application.

This is a series of posts on localization / globalization:

Part 1 – Introduction

Part 2 – ASP.NET Core ways of Localization / Globalization

Part 3 – Old trustworthy ResourceManager

First, for the folks new to this, let me explain what globalization and localization are.

Say you need a web application that caters to different languages, say English and French. A greeting like "Hello Friend" needs to be translated to "salut l'ami" in French. How do we do it? Simple: create files of name/value pairs, one for English and another for French.

TextTranslate-English.txt -> Greeting = "Hello Friend"

TextTranslate-French.txt -> Greeting = "salut l'ami"

In code -> var message = TextTranslate["Greeting"]

Based on the language, it picks from the respective file. This is the idea behind globalization and localization. The .NET folks figured this out long ago, back in the .NET 1.1 Framework, and the same can be done with resource files in .NET.

In Visual Studio, you can add a resource file to your project.

This resource file contains keys and values. Say here the key is "Hello" and the value is "Hello-en-us".

Similarly, you can add resource files for different cultures.

In .NET, a language involves more than text: how you write dates, format numbers, currency symbols, etc. So this whole collection is called a culture.

e.g. US English = 12/25/2017, UK English = 25/12/2017

Cultures are denoted by two-letter codes like en, or region-qualified ones like en-US and en-GB, i.e. US English and UK English. es-ES is Spanish (Spain) and es-MX is Spanish (Mexico). If just es is mentioned, it defaults to Spanish (Spain); en defaults to en-US.

The class that handles this is known as CultureInfo. The thread that runs your code has two properties, CurrentCulture and CurrentUICulture.

Culture is the .NET representation of the default user locale of the system. This controls default number and date formatting and the like.

UICulture refers to the default user interface language, a setting introduced in Windows 2000. This is primarily regarding the UI localization/translation part of your app.

You can get the running thread's culture with the following code:

 CultureInfo cul = Thread.CurrentThread.CurrentCulture;
 CultureInfo culUI = Thread.CurrentThread.CurrentUICulture;

Now, in ASP.NET, say your server is running in the UK and you get a request from France. How does the ASP.NET application know it has to serve French text?

Fortunately, in ASP.NET Core, there is middleware that does this for us. By default, it has three ways:

  • QueryString
  • Cookie
  • Accept-Language header

QueryString

http://localhost:5000/?culture=es-MX&ui-culture=es-MX
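For example, you can exercise this provider from the command line (local URL assumed from the sample app):
curl "http://localhost:5000/?culture=es-MX&ui-culture=es-MX"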

Cookie

The culture can also be read from a cookie.

The cookie format is c=%LANGCODE%|uic=%LANGCODE%, where c is Culture and uic is UICulture, for example: c=en-GB|uic=en-US

You can write the cookie with the following code:

HttpContext.Response.Cookies.Append(
    CookieRequestCultureProvider.DefaultCookieName,
    CookieRequestCultureProvider.MakeCookieValue(requestCulture),
    new CookieOptions { Expires = DateTimeOffset.UtcNow.AddYears(1) });

Headers

When the browser makes a request to a website, it sends an Accept-Language header, as seen in the Chrome debug window -> Network -> Request Headers.

(Screenshot: Accept-Language request header in Chrome dev tools)

You can read more about it here, https://docs.microsoft.com/en-us/aspnet/core/fundamentals/localization#implement-a-strategy-to-select-the-languageculture-for-each-request

This middleware sets the thread's CurrentCulture and CurrentUICulture using either the query string, the cookie, or the Accept-Language header.

Once the current culture is set, DateTime.ToString() returns the date as per the current culture: in en-US it would be 12/25/2017, and in en-GB it would be 25/12/2017.
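A quick console sketch illustrating the effect (a hypothetical standalone demo, not from the sample app):

using System;
using System.Globalization;

class CultureDemo
{
    static void Main()
    {
        var date = new DateTime(2017, 12, 25);
        // The short date pattern differs per culture
        Console.WriteLine(date.ToString("d", new CultureInfo("en-US"))); // 12/25/2017
        Console.WriteLine(date.ToString("d", new CultureInfo("en-GB"))); // 25/12/2017
    }
}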

In the next post, we will see how to implement the ASP.NET Core localization package and how to use it.

Publish dotnet core app using MSDeploy

Recently, I got a task to set up a CI/CD pipeline for a .NET Core project. CI was easy; CD had challenges.

In CD, I needed to deploy on Windows as well as Linux. In this post, I am going to describe the Windows side. Microsoft did a good job documenting ASP.NET Core and IIS integration: https://docs.microsoft.com/en-us/aspnet/core/publishing/iis

What it says is: first install IIS, install the .NET Core Windows Server Hosting bundle on the hosting system, and then create the ASP.NET Core website. This is all part of a one-time installation on the target machine.

The next part is deploying new versions of the same ASP.NET Core web application again and again. This part needs to be automated as a continuous deployment (CD) pipeline.

Challenges are

  1. You build the code on one machine and deploy on another
  2. In my case, due to IT policy, I wasn't allowed to create network share folders for deployment
  3. It needs to be secure, so that only users with the right permissions can deploy

Now, how do I copy website code from one machine to another machine's IIS without a network share folder? As I was looking around for a solution, I found Microsoft had already solved this problem way back in 2010 with Web Deploy.

Microsoft has always had confusing naming conventions: they market it as Web Deploy, but internally call it MSDeploy. This tool is the unsung hero of deployment. When it comes to automated deployment, people talk of Chef and Puppet, but MSDeploy isn't acknowledged much.

Okay, what does this tool do? It lets you publish a website remotely, without manually transferring files to that machine, and set up the website the way you want. That means it can create the app pool, set folder permissions, configure the port, configure the bindings, etc., not just transfer code files.

Now, there are a few good resources over the web, especially from Vishal Joshi (http://vishaljoshi.blogspot.in/search?q=msdeploy) and Sayed Ibrahim Hashimi (http://sedodream.com/SearchView.aspx?q=msdeploy), who have written extensively on how to work with MSDeploy.

You need to install the Web Deploy 3.6 remote agent service on the target machine. If you can't figure out what this is, read through the following blog post on installing MSDeploy: http://chamindac.blogspot.in/2016/05/deploy-aspnet-core-10-to-remote-iis.html

Here, the build server is shown in black and the target deployment server in red.

(Diagram: build server deploying to the target server via Web Deploy)

On the build server, run the following command to publish the .NET Core website: dotnet publish -o {path to publish folder}

dotnet publish -o D:\publishcode

Here, I want to deploy the website as "Default Web Site/WebApp" on the target server.

(Screenshot: the WebApp application under Default Web Site in IIS Manager)

Run this command on the build machine to deploy. Here, the username and password are those of a local administrative user on the target machine.

servernameorip can be a computer name like "WL-TPM-348" or the machine's IP, like "10.20.30.15".

By default, MSDeploy is installed in the C:\Program Files (x86)\IIS\Microsoft Web Deploy V3 folder. If it's installed elsewhere, use that path for msdeploy.exe.

D:>"C:\Program Files (x86)\IIS\Microsoft Web Deploy V3\msdeploy.exe" -verb:sync -source:iisApp="D:\publishcode" -dest:iisApp="Default Web Site/WebApp",ComputerName="servernameorip",UserName='ServerUser',Password='password',AuthType='NTLM' -allowUntrusted

Mind you, this is a bare-metal command to deploy code. It doesn't use a SiteManifest.xml, which would configure the IIS web application's app pool, ports, bindings, etc. Nor is it parameterized. But it's a good example to get started.

Hope you find this post useful.

Monitoring Azure VMs with Power BI

I have been in a situation where we were running performance tests on Azure VMs and needed to monitor their CPUs.

The Azure portal (as of Feb 2017) provides a nice metrics blade to monitor most resources in Azure, like Web Apps, Storage Accounts, SQL Azure, etc. I needed to monitor IaaS VMs.

I was running Linux VMs, and there is a nice post in the Azure docs that shows how to enable Linux diagnostics on a VM and collect the metrics.

This worked for me, but it had two issues. First, the Azure metrics blade gives just three filters for time duration: past hour, entire day, and past week. There is no provision for a custom duration, say, CPU for yesterday between 1 pm and 3 pm.

(Screenshot: the limited duration filters in the Azure metrics blade)

Second, I had created a custom dashboard on the Azure portal for monitoring all my VMs' CPUs, but I couldn't share it with non-Azure users. Our performance testers, managers, etc. didn't have Azure subscriptions or any Azure knowledge. Creating read-only users with access to the particular dashboard's resource group, and putting them through the learning curve just to view CPU, was a big ask.

I knew the monitoring readings from the Linux diagnostics agent are saved in Azure Storage Tables,

(Screenshot: Linux diagnostics tables in Azure Storage, as seen in the Visual Studio 2015 Cloud Explorer)

and I can use Power BI to connect to Azure Storage Tables. A Power BI report can be published to the web and viewed without an Azure subscription. So most of my needs were answered.

I created a Power BI report where I connect to the Azure tables. Here, before you click Load, you need to filter the records; otherwise, it would download the entire table's data into Power BI, which would run into gigabytes.
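A rough sketch of that filter in Power Query M (the storage account and table names here are placeholders; the diagnostics tables carry a Timestamp column):

let
    Source   = AzureStorage.Tables("https://mystorageacct.table.core.windows.net"),
    Metrics  = Source{[Name = "LinuxCpuVer2v0"]}[Data],  // hypothetical table name
    // keep only the past week so the whole table isn't pulled down
    LastWeek = Table.SelectRows(Metrics, each [Timestamp] >= Date.AddDays(DateTime.LocalNow(), -7))
in
    LastWeek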

I edited the query to add a filter for the past week's data and then loaded the filtered data into the Power BI data model (yes, Power BI stores its own data). Once the data was loaded into the data model, I needed to add a few new calculated columns, which I used to define my new time hierarchy.

By default, Power BI provides a time hierarchy of Year, Quarter, Month, and Date. But for this data I wanted Month, Date, Hour, and Minute, built from the calculated columns sketched below.
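The calculated columns themselves are plain DAX along these lines (table and column names assumed):

Month  = MONTH(Metrics[Timestamp])
Day    = DAY(Metrics[Timestamp])
Hour   = HOUR(Metrics[Timestamp])
Minute = MINUTE(Metrics[Timestamp])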

I created my new time hierarchy and built the report out of it. It is a basic report using column charts that show max CPU by minute, hour, day, and month; i.e., even if the CPU hit 100% for only 5 minutes in an hour, the cumulative hour shows 100%. The same logic rolls up to day and month. It helps us in DevOps get a big-picture perspective of VM usage.

I am aware there are better tools on the market, like Azure OMS, and third-party ones like Datadog and Sysdig. But this is more of a DIY project than using those tools.

A word of caution when using Power BI with Azure Table Storage: every time Power BI hits the Azure Storage table to fetch data, there are egress charges on Azure Storage. You can use something called Power BI Embedded and host the Power BI report in the same region as your storage account to avoid these charges.

I have captured the whole process in a YouTube video, which covers:

  1. Connecting to the Azure table
  2. Filtering the data from the Azure table
  3. Adding columns in the data model for the new time hierarchy
  4. Creating the report with the new data model
  5. Exploring drill-down in Power BI

If you have suggestions or comments, do let me know.