
Securing Microservices with Public Key Infrastructure

Problem Statement

OAuth

When you design services based on a microservice architecture, authentication would typically be federated: some OAuth service or STS (Security Token Service) would grant a token to the client.

In my scenario, there were a bunch of microservices, each nicely encapsulated in a Docker container, each portable and able to move from one host to another for high availability and redundancy. There was an API Gateway which served as the facade for these microservices and handled service discovery and a bunch of other stuff.

Authentication was one job the API Gateway did for us. It would receive the JWT token from the client and check its authenticity, scope and validity. Once all was good, it would forward the call to the microservice it was designated for. Each microservice, depending on the operation, would call other microservices, collate the data and give back the response.

Now the problem arises: how do we authenticate calls between microservices? There were a couple of options.

Option 1

Let inter-microservice calls happen without authentication

Unfortunately, this wasn't a choice for us, as these services were deployed on public shared infrastructure.

Option 2

Let each service do token authentication with the forwarded JWT token

This would have been an option, but it was overkill for us. Every microservice would need token authentication code. Plus, it would be an anti-pattern for a microservice architecture, as a cross-cutting concern like authentication seeps into the services. If the JWT token data structure changes, changes need to be made in all the microservices.

This was a no-go for us. We needed some lightweight authentication just to validate that a call came from our own clan of microservices.

Option 3

This led to our third option: PKI-based authentication.

Here it's simple: all microservices run on HTTPS and require a client certificate for authentication. Each service checks the client certificate against its own CA certificate, and if it's valid, it lets the call happen.

It was secure, as all communication was over HTTPS, and simpler to implement than JWT authentication.

Solution

Here's how we did it.

We had microservices coded in Python and Node.js. The callers were ASP.NET Core web apps and a few data analytics services.

A simplistic Flask application is shown below.

Python Code

#!/usr/bin/env python

from flask import Flask
from werkzeug import serving
import ssl
import sys

HTTPS_ENABLED = True
VERIFY_USER = True
API_HOST = "0.0.0.0"
API_PORT = 8000
API_CRT = "server.crt"  # server certificate
API_KEY = "server.key"  # server private key
API_CA_T = "ca.crt"     # CA certificate used to verify client certificates

app = Flask(__name__)

@app.route("/")
def main():
    return "Top-level content"

context = None
if HTTPS_ENABLED:
    context = ssl.SSLContext(ssl.PROTOCOL_TLSv1_2)
    if VERIFY_USER:
        # Require a client certificate and validate it against our CA
        context.verify_mode = ssl.CERT_REQUIRED
        context.load_verify_locations(API_CA_T)

    try:
        context.load_cert_chain(API_CRT, API_KEY)
    except Exception as e:
        sys.exit("Error starting flask server. "
                 "Missing cert or key. Details: {}".format(e))

serving.run_simple(API_HOST, API_PORT, app, ssl_context=context)

Here I have shown the werkzeug server, which is good for a development environment. In production you should use a proper WSGI server such as gunicorn or uWSGI, and this HTTPS setup then moves into that server's configuration.
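For example, gunicorn exposes the same TLS and client-verification settings on its command line. A minimal sketch, assuming the Flask app above lives in myapp.py and uses the same certificate files; --cert-reqs 2 corresponds to ssl.CERT_REQUIRED:

gunicorn --bind 0.0.0.0:8000 \
    --certfile server.crt --keyfile server.key \
    --ca-certs ca.crt --cert-reqs 2 \
    myapp:app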

For the Node.js API services, here is the option to create the HTTPS server:

NodeJS Code

var fs = require('fs');
var https = require('https');

var options = {
    key: fs.readFileSync('server.key'),
    cert: fs.readFileSync('server.crt'),
    ca: fs.readFileSync('ca.crt'),
    requestCert: true,          // ask the client for a certificate
    rejectUnauthorized: true    // drop clients whose certificate isn't signed by our CA
};

https.createServer(options, function (req, res) {
    // Log the verified client certificate's common name with each request
    console.log(new Date() + ' ' +
        req.connection.remoteAddress + ' ' +
        req.socket.getPeerCertificate().subject.CN + ' ' +
        req.method + ' ' + req.url);

    res.writeHead(200);
    res.end("hello world\n");
}).listen(4433);
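Either server can be smoke-tested with curl by presenting a client certificate. The file names here are assumptions: the client certificate must be signed by the same CA the server trusts, and ca.crt is also used to verify the server's certificate:

curl --cacert ca.crt --cert client.crt --key client.key \
    https://localhost:4433/

Leaving out --cert and --key should make the TLS handshake fail, which is exactly the point of this setup.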

The callers were dotnet applications. They would typically use some NuGet package like RestSharp, and the code would be as follows.

CSharp (C#) Code

public static IRestResponse<User> AsyncHttpRequestLogIn(string path, string method, object obj)
{
    var client = new RestClient(Constants.BASE_URL + path); // https:....
    var request = method.Equals("POST") ? new RestRequest(Method.POST) : new RestRequest(Method.GET);
    request.RequestFormat = RestSharp.DataFormat.Json;

    // The path to the client certificate. A PFX bundles the certificate
    // and its private key; "pfx-password" is a placeholder for whatever
    // password protects your PFX file.
    string certificatePath = "cer/clientcert.pfx";
    var certificate = new X509Certificate2(certificatePath, "pfx-password");

    client.ClientCertificates = new X509CertificateCollection() { certificate };

    request.AddBody(obj);

    IRestResponse<User> response = client.Execute<User>(request);
    return response;
}

This way, I had secured my inter-microservice communication using certificates. Note, the code I have shown here is minimalist, without the complexity of a production environment, but good enough to guide the way.

Salt-Api with Saltstack in 2018

This article is related to SaltStack. SaltStack, also known as Salt, is a configuration management and orchestration tool. It uses a central repository to provision new servers and other IT infrastructure, to make changes to existing ones, and to install software in IT environments, including physical and virtual servers as well as the cloud.
This post is specific to salt-api, which is part of open-source Salt. Many of the articles on the internet about salt-api are from the 2014–2016 period, when salt-api was a separate project from salt-master; the two have since been merged into the Salt core.
This is a walkthrough of using salt-api on Ubuntu 16/17/18.

Step 1

Install the salt-api package; this installs all the dependencies that salt-master needs to host the salt-api HTTP server.
sudo apt-get install salt-api

Step 2

Configure external auth
salt-api is flexible and can integrate PAM, LDAP and other authentication backends.
external_auth:
  pam:
    saltuser:
      - '.*'
      - '@wheel'   # to allow access to all wheel modules
      - '@runner'  # to allow access to all runner modules
      - '@jobs'    # to allow access to the jobs runner and/or wheel module
What this config says is:
External auth is 'pam'
saltuser is the Linux system user, which has the following permissions:
'.*' - everything
'@wheel' - wheel module permissions
'@runner' - runner module permissions
'@jobs' - jobs module permissions

Step 3

Configure rest_cherrypy with the port (8000) and the ssl_crt / ssl_key certificate paths. I have created self-signed certificates with openssl for this; the command is shown after the complete config file below. Note that with disable_ssl: True, the API is actually served over plain HTTP, which is why the curl tests below talk to localhost:8000 without https.
rest_cherrypy:
  port: 8000
  ssl_crt: /etc/salt/tls/certs/localhost.crt
  ssl_key: /etc/salt/tls/certs/localhost.key
  disable_ssl: True
These changes were done in a separate file:
/etc/salt/master.d/salt-api.conf
The complete file looks something like this:
external_auth:
  pam:
    saltuser:
      - '.*'
      - '@wheel'   # to allow access to all wheel modules
      - '@runner'  # to allow access to all runner modules
      - '@jobs'    # to allow access to the jobs runner and/or wheel module

rest_cherrypy:
  port: 8000
  ssl_crt: /etc/salt/tls/certs/localhost.crt
  ssl_key: /etc/salt/tls/certs/localhost.key
  disable_ssl: True
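The self-signed certificate and key referenced by ssl_crt and ssl_key can be generated with openssl. A minimal sketch; the subject, key size and validity period here are just example values:

sudo mkdir -p /etc/salt/tls/certs
sudo openssl req -x509 -nodes -newkey rsa:2048 -days 365 \
    -subj "/CN=localhost" \
    -keyout /etc/salt/tls/certs/localhost.key \
    -out /etc/salt/tls/certs/localhost.crt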

Step 4

Restart the salt-master service
# systemctl restart salt-master

Step 5

Test with curl
curl -si localhost:8000/login \
-c ~/cookies.txt \
-H "Accept: application/json" \
-H "Content-type: application/json" \
-d '{
    "username": "saltuser",
    "password": "saltuser",
    "eauth": "pam"
}'
The output would be similar to this:
HTTP/1.1 200 OK
Content-Length: 206
Access-Control-Expose-Headers: GET, POST
Vary: Accept-Encoding
Server: CherryPy/3.5.0
Allow: GET, HEAD, POST
Access-Control-Allow-Credentials: true
Date: Mon, 09 Jul 2018 10:03:26 GMT
Access-Control-Allow-Origin: *
X-Auth-Token: 464914c055cfa5529865564567eb7782554af025
Content-Type: application/json
Set-Cookie: session_id=464914c055cfa5529865564567eb7782554af025; expires=Mon, 09 Jul 2018 20:03:26 GMT; Path=/
{“return”: [{“perms”: [“.*”, “@wheel”, “@runner”, “@jobs”], “start”: 1531130606.41419, “token”: “464914c055cfa5529865564567eb7782554af025”, “expire”: 1531173806.414191, “user”: “saltuser”, “eauth”: “pam”}]}

Step 6

Test the test.ping function with curl
curl -b ~/cookies.txt -si localhost:8000/ \
    -H "Accept: application/json" \
    -d client='local' -d tgt='*' -d fun='test.ping'
If you get output, you are all set with salt-api on the 2018.3 version of Salt.
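For reference, a successful reply carries one entry per responding minion; with a single minion (the minion ID here is hypothetical), the JSON body looks something like:

{"return": [{"my-minion": true}]}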

Transactions in Microservices

When working with microservice architecture, one usually finds that managing data is the hardest part.

In monolithic applications, you would leverage database transactions to achieve consistency of the data.

But in microservices, each service is a different process, and achieving consistency is a big task.

I have created Google slides where I compare the different patterns that can be used to achieve data consistency, also known as eventual consistency, across microservices.

Globalization and localization in ASP.NET Core – Part 3

If you just landed on this page, this page is part of a series of posts on localization / globalization:

Part 1 – Introduction

Part 2 – ASP.NET Core ways of Localization / Globalization

Part 3 – Old trustworthy ResourceManager

If you are following the series of posts: in this post, I talk about working with ResourceManager.

The ResourceManager class is nothing new; it has been there since the inception of the .NET Framework. It was the de facto mechanism for working with localized resource files before ASP.NET Core introduced IStringLocalizer.

In this post, I want to demonstrate that ResourceManager still works in dotnet core, if you don't want to go with the fuzzy IStringLocalizer.

Let's dive into the code. In the sample application, you will find two methods in HelloController. In the Local method:

 string rsFileName = "api1.Resources.rs1";
 ResourceManager rm = 
       new ResourceManager(rsFileName, Assembly.GetExecutingAssembly());
 string msg = rm.GetString("Hello");

Here, it creates a new instance of ResourceManager, which takes a basename and the assembly where the resource files are embedded.

Basename is the root name of the resource file without its extension, but including any fully qualified namespace name. For example, the root name for the resource file named MyApplication.MyResource.en-US.resources is MyApplication.MyResource.

The assembly needs to be the one where the resource files are embedded. In our case it is the same as the one where the running code (HelloController) is present. I can get the current assembly, which contains the code, via Assembly.GetExecutingAssembly(), which returns the api1.dll assembly.

In our case, the resource files which ResourceManager will read are named rs1.resx and placed in the "Resources" folder. Hence, the basename will be api1.Resources.rs1.

ResourceManager.GetString looks at the current culture of the running thread and looks for the corresponding culture-specific resource file to get the value of the key "Hello".

[Image: resource files for different cultures]

If a culture-specific resource file is not found, say for "de-de", it will fall back to the default resource file without any culture suffix to get the value of the key. If the key isn't found in the resource file at all, GetString returns null.

If you need to get a resource value for a culture other than the current culture of the running thread, you can do so by specifying the culture in the GetString method:

string msg3 = rm.GetString("Hello", new System.Globalization.CultureInfo("fr-FR"));

As simple as that.

Now, level 2. I had a requirement where the code reading the resources was in a different assembly/dll than the one where the resource files were located.

If you look into the lib method, it calls the Lib1.ResRdr library to get the resource file message. But rs1.resx is in api1.dll, not in Lib1.ResRdr.dll.

 public string lib()
 {
   string msg = "";
   msg = Lib1.ResRdr.Messenger.GetHello(); 
   return msg;
 }

If you look into the GetHello() method:

public static string GetHello()
 {
   string msg = "";
 
   Assembly asm = Assembly.GetCallingAssembly();
   string rsFileName = asm.GetName().Name + ".Resources.rs1";
   ResourceManager rm = new ResourceManager(rsFileName, asm);
   var msg2 = rm.GetString("Hello");
   msg = msg2;
   return msg;
 }

It creates a new ResourceManager, passing the basename and the assembly which contains the embedded resources. Since it is a separate dll, it gets the calling assembly via Assembly.GetCallingAssembly(), which returns api1.dll or api2.dll, whichever called Lib1.ResRdr.dll. Note that if you had used Assembly.GetExecutingAssembly() here, you would get Lib1.ResRdr.dll.

For the basename, since the resources are embedded in the calling dll, the root namespace needs to be taken from the assembly name, i.e. api1 or api2. So the basename would be api1.Resources.rs1 or api2.Resources.rs1.

The rest is taken care of by ResourceManager: it reads the remote assembly's embedded resources and its satellite assemblies to get the translated value.

That's all for globalization and localization in ASP.NET Core. Hope you found this series of posts informative.

Globalization and localization in ASP.NET Core – Part 2

If you just landed on this page, this page is part of a series of posts on localization / globalization:

Part 1 – Introduction

Part 2 – ASP.NET Core ways of Localization / Globalization

Part 3 – Old trustworthy ResourceManager

In this post, I will be touching on the ASP.NET Core way of localization / globalization. There is already a good post on this in the official Microsoft Docs for ASP.NET Core at https://docs.microsoft.com/en-us/aspnet/core/fundamentals/localization, and I am not going to repeat what's in that post, but I will share a few highlights of that article.

In order to work with IStringLocalizer, importantly, you need to add the following in Startup.cs:

// Add Localization services to the system
services.AddLocalization(options => options.ResourcesPath = "Resources");

// Configure supported cultures and localization options
services.Configure<RequestLocalizationOptions>(options =>
{
    var supportedCultures = new[]
    {
        new CultureInfo("en-US"),
        new CultureInfo("de-DE")
    };

    // State what the default culture for your application is.
    // This will be used if no specific culture can be determined for
    // a given request.
    options.DefaultRequestCulture = new RequestCulture("en-US", "en-US");

    // You must explicitly state which cultures your application supports.
    // These are the cultures the app supports for formatting
    // numbers, dates, etc.
    options.SupportedCultures = supportedCultures;

    // These are the cultures the app supports for UI strings,
    // i.e. we have localized resources for.
    options.SupportedUICultures = supportedCultures;
});

In the Configure method in Startup.cs:

// Configure localization.
var locOptions = app.ApplicationServices.GetService<IOptions<RequestLocalizationOptions>>();
app.UseRequestLocalization(locOptions.Value);

This code (UseRequestLocalization) sets the current culture and current UI culture on the request execution thread.

By default, it determines the culture from the QueryString, a Cookie, and the Accept-Language request header. Read the first post in this series for more insight.

In order to understand this better, I have created a sample application and put it on GitHub at https://github.com/sbrakl/aspnetcoreglobalization, where you can see the execution in action.

In HelloController.cs, in the local method of api1, you can see this code:

string cul = Thread.CurrentThread.CurrentCulture.Name;
string culUI = Thread.CurrentThread.CurrentUICulture.Name;

If you debug this application and your Startup.cs initialization code is correct, you will see the thread culture being set.

In the same method, it gets the localized string from the IStringLocalizer _localizer.

msg = _localizer["Hello"];

This _localizer is injected into the controller by ASP.NET Core dependency injection, and will read from the HelloController resources, as sketched below.
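The injection itself is plain constructor injection; a minimal sketch of what the controller looks like, with names assumed to match the sample app:

using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Localization;

[Route("api/[controller]")]
public class HelloController : Controller
{
    private readonly IStringLocalizer<HelloController> _localizer;

    // ASP.NET Core DI supplies the typed localizer at construction time
    public HelloController(IStringLocalizer<HelloController> localizer)
    {
        _localizer = localizer;
    }

    [HttpGet("local")]
    public string Local()
    {
        // Look up the "Hello" key in the HelloController resources
        // for the request's current culture
        return _localizer["Hello"];
    }
}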

[Image: HelloController resource files in the Resources folder]

The placement and naming of these resource files are important; otherwise, the localizer won't be able to read them. Naming and placement are covered to a great extent at https://docs.microsoft.com/en-us/aspnet/core/fundamentals/localization#resource-file-naming. That's all you need to work with IStringLocalizer.

If you are curious, just as I was, to find out how this IStringLocalizer works, then you need to check out its official GitHub repository, https://github.com/aspnet/Localization

Dotnet Core is open source, and you can read its code. When dependency injection (DI) is asked to inject an IStringLocalizer, it calls the ResourceManagerStringLocalizerFactory, which in turn creates an instance of ResourceManagerStringLocalizer.

When ResourceManagerStringLocalizerFactory creates the instance of ResourceManagerStringLocalizer, it passes in a new instance of ResourceManager:

protected virtual ResourceManagerStringLocalizer CreateResourceManagerStringLocalizer(
    Assembly assembly,
    string baseName)
{
    return new ResourceManagerStringLocalizer(
        new ResourceManager(baseName, assembly),
        assembly,
        baseName,
        _resourceNamesCache,
        _loggerFactory.CreateLogger<ResourceManagerStringLocalizer>());
}

ResourceManager then in turn works with the resource files to get the translated text for the key:

msg = _localizer["Hello"];

Here, "Hello" is the key, and whatever value is in the resource file (the resource + culture file, to be specific) is what you get back.

IStringLocalizer -> DI -> ResourceManagerStringLocalizerFactory -> ResourceManagerStringLocalizer -> ResourceManager

That's the flow of how IStringLocalizer translates the messages.

A few interesting points here.

Type and Resource File Naming

IStringLocalizer takes a type, i.e.

IStringLocalizer<HelloController> _localizer

HelloController is the type. Your resource files need to carry the HelloController name, and there is no default-language resource file. What does that mean?

In the ResourceManager era, the default-language resource file would have no culture suffix; for example, an abc resource would be named:

"abc.resx"
"abc.FR-fr.resx" //French
"abc.ES-es.resx" //Spanish

"abc.resx" is the default: if any culture other than "FR-fr" or "ES-es" is passed, this file is used.

But when working with IStringLocalizer, your default language is set in Startup.cs (see the Startup.cs code at the top). Hence, your resource files should be named:

"HelloController.en-US.resx"
"HelloController.FR-fr.resx"
"HelloController.ES-es.resx"

Location of resource files

In the Startup.cs code, you mention the resources path, i.e. where to find all the resources. It doesn't need to be "Resources"; it could be anything. When the assembly is built, the resource files go as embedded resources into the dll. ASP.NET Core builds satellite assemblies for the different cultures you have mentioned.

[Image: resource files in the build output directory]

By definition, satellite assemblies do not contain code, except for that which is auto generated. Therefore, they cannot be executed as they are not in the main assembly. However, note that satellite assemblies are associated with a main assembly that usually contains default neutral resources (for example, en-US). A big benefit is that you can add support for a new culture or replace/update satellite assemblies without recompiling or replacing the application’s main assembly.

[Image: dotPeek view of the assembly]

ResourceManager finds the resources embedded in the assembly and its satellite assemblies based on the resource path and type name.

[Image: resource name inside the assembly]

Further

It is all good when you are doing translation in the same assembly where the resource files are located. But in my case, the resource files were in one assembly, and the code to read them was in another assembly. This is where ASP.NET Core's IStringLocalizer faded and became limiting for me. I needed to resort to the good old ResourceManager, which I describe in the other post.

Globalization and localization in ASP.NET Core – Part 1

This post is about my experience with globalization and localization in an ASP.NET Core application.

This is a series of posts on localization / globalization:

Part 1 – Introduction

Part 2 – ASP.NET Core ways of Localization / Globalization

Part 3 – Old trustworthy ResourceManager

First, for the fresher folks, let me explain what globalization and localization are.

Say you need a web application which caters to different languages, say English and French. A greeting like "Hello Friend" needs to be translated to "salut l'ami" in French. How do we do it? Simple: create files with name-value pairs, one for English and the other for French.

TextTranslate-English.txt -> Greeting = "Hello Friend"

TextTranslate-French.txt -> Greeting = "salut l'ami"

In code -> var message = TextTranslate["Greeting"]

Based on the language, it will pick the value from the respective file. This is the idea of globalization and localization. The .NET folks figured this out long ago, in the .NET 1.1 Framework, and the same can be done with resource files in .NET.

In Visual Studio, you can add resource file to your project.

This resource file contains keys and values. Say, here the key is "Hello" and the value is "Hello-en-us".

Similarly, you can add resource files for different cultures.

In dotnet, a language is more than text: it's how you write dates, format numbers, currency symbols, etc. So the collection of all this is called a culture.

e.g. US English = 12/25/2017, UK English = 25/12/2017

Cultures are denoted by two-letter codes like en, or language-region pairs like en-US and en-GB, i.e. US English and UK English. es-ES is Spanish (Spain) and es-MX is Spanish (Mexico). If just es is mentioned, it defaults to Spanish (Spain); en defaults to en-US.

And the class which handles this is known as CultureInfo. The thread that runs your code has two properties: CurrentCulture and CurrentUICulture.

Culture is the .NET representation of the default user locale of the system. This controls default number and date formatting and the like.

UICulture refers to the default user interface language, a setting introduced in Windows 2000. This is primarily regarding the UI localization/translation part of your app.

You can get the running thread's culture with the following code:

 CultureInfo cul = Thread.CurrentThread.CurrentCulture;
 CultureInfo culUI = Thread.CurrentThread.CurrentUICulture;

Now, in ASP.NET, say your server is running in the UK and you get a request from France. How does the ASP.NET application know that it has to serve French text?

Fortunately, in ASP.NET Core, there is middleware which does this for us. By default, it has three ways:

  • QueryString
  • Cookie
  • AcceptLanguage Header

QueryString

http://localhost:5000/?culture=es-MX&ui-culture=es-MX

Cookie

You can read the cookie as follows: the format is c=%LANGCODE%|uic=%LANGCODE%, where c is Culture and uic is UICulture, for example: c=en-UK|uic=en-US

You can write the cookie with the following code:

HttpContext.Response.Cookies.Append(
 CookieRequestCultureProvider.DefaultCookieName,
 CookieRequestCultureProvider.MakeCookieValue(requestCulture),
 new CookieOptions { Expires = DateTimeOffset.UtcNow.AddYears(1) });

Headers

When the browser makes a request to a website, it sends the accept-language header, as seen in the Chrome debug window -> Network -> Request Headers.

[Image: Accept-Language request header in Chrome]

You can read more about it here, https://docs.microsoft.com/en-us/aspnet/core/fundamentals/localization#implement-a-strategy-to-select-the-languageculture-for-each-request

This middleware sets the thread's CurrentCulture and CurrentUICulture using either the QueryString, the Cookie, or the Accept-Language header.

Once the current culture is set, DateTime.ToString() will return the date per the current culture: in en-US it would be 12/25/2017 and in en-GB it would be 25/12/2017, as the snippet below demonstrates.
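A quick way to see this outside a web app is a small console sketch (both culture names here are standard .NET culture tags):

using System;
using System.Globalization;

class CultureDemo
{
    static void Main()
    {
        var date = new DateTime(2017, 12, 25);

        // Same date, two cultures, two formats
        Console.WriteLine(date.ToString("d", new CultureInfo("en-US"))); // 12/25/2017
        Console.WriteLine(date.ToString("d", new CultureInfo("en-GB"))); // 25/12/2017
    }
}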

In the next post, we will see how to implement the ASP.NET Core localization package and how to use it.

Publish dotnet core app using MSDeploy

Recently, I got a task to set up a CI-CD pipeline for a dotnet core project. CI was easy; CD had challenges.

In CD, I needed to deploy on Windows as well as Linux. In this post, I am going to describe the Windows side. Microsoft did a good job documenting ASP.NET Core and IIS integration: https://docs.microsoft.com/en-us/aspnet/core/publishing/iis

What it says is: first install IIS, install the .NET Core Windows Server Hosting bundle on the hosting system, and then create the ASP.NET Core website. This is all part of a one-time installation on the target machine.

The next part is deploying new versions of the same ASP.NET Core web application again and again. This part needs to be automated as the continuous deployment (CD) pipeline.

Challenges are

  1. You build the code on one machine and deploy it on another
  2. In my case, due to IT policy, I wasn't allowed to create network share folders for deployment
  3. It needs to be secure, so that only users with permission can deploy

Now, how do I copy website code from one machine to another machine's IIS without a network share folder? As I was looking around for a solution, I found Microsoft had already solved this problem way back in 2010 with Web Deploy.

Microsoft has always had confusing naming conventions: they market it as Web Deploy, but internally call it MSDeploy. This tool is the unsung hero of deployment. When it comes to automated deployment, people talk of Chef and Puppet, but MSDeploy isn't acknowledged much.

Okay, what does this tool do? You can publish a website remotely without needing to transfer files manually to that machine, and set up the website the way you want. That means it can create the app pool, set folder permissions, configure the port, configure the bindings, etc., not just transfer code files.

Now, there are a few good resources over the web, especially from Vishal Joshi http://vishaljoshi.blogspot.in/search?q=msdeploy and Sayed Ibrahim Hashimi http://sedodream.com/SearchView.aspx?q=msdeploy, who have written extensively on how to work with MSDeploy.

You need to install the Web Deploy 3.6 remote agent service on the target machine. Can't figure out what this is? Read through the following blog post to install MSDeploy: http://chamindac.blogspot.in/2016/05/deploy-aspnet-core-10-to-remote-iis.html
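Once installed, make sure the remote agent service is actually running on the target machine. Assuming the default service name, a quick check from an elevated prompt is:

net start msdepsvc

If the service is already running, the command simply reports that.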

Here, the build server is shown in black and the target deployment server in red.

[Image: MSDeploy build server and target deployment server]

On the build server, you need to run the following command to publish the dotnet core website: dotnet publish -o {path to publish folder}

dotnet publish -o D:\publishcode

Here, I want to deploy the website to "Default Web Site/WebApp" on the target server.

[Image: WebApp under Default Web Site in IIS]

Use this command to deploy from the build machine. Here, the username and password are for a local administrative user of the target machine.

servernameorip can be a computer name like "WL-TPM-348" or the IP of the machine like "10.20.30.15".

By default, MSDeploy is installed in the C:\Program Files (x86)\IIS\Microsoft Web Deploy V3 folder. If it's installed elsewhere, use that path for msdeploy.exe.

D:>"C:\Program Files (x86)\IIS\Microsoft Web Deploy V3\msdeploy.exe" -verb:sync -source:iisApp="D:\publishcode" -dest:iisApp="Default Web Site/WebApp",ComputerName="servernameorip",UserName='ServerUser',Password='password',AuthType='NTLM' -allowUntrusted

Mind you, this is a bare-metal command to deploy code. It doesn't have a SiteManifest.xml, which would configure the IIS web application's app pool, ports, bindings, etc. Neither is it parameterized. But it is a good example to get started.

Hope you find this post useful.