Posts

Showing posts from 2018

Moving to my own website kimsereylam.com

New content is now posted on my personal website: https://www.kimsereylam.com. Enjoy!

Load test your API with Vegeta

Vegeta is an open source HTTP load testing tool. Today I'll demonstrate how quickly and easily we can load test our API endpoint using it, in three parts: get Vegeta, set up a target file, generate reports.

1. Get Vegeta. Vegeta binaries are available on GitHub Releases. For Windows, all we need to do is download the Windows executable and unzip it, for example under C:\vegeta. vegeta.exe is the executable we will be using. To make sure it works as expected, we can display the usage guide by executing vegeta.exe without any arguments:

> vegeta.exe
Usage: vegeta [global flags] <command> [command flags]

global flags:
  -cpus int
        Number of CPUs to use (default 4)
  -profile string
        Enable profiling of [cpu, heap]
  -version
        Print version and exit

attack command:
  -body string
        Requests body file
  -cert string
  ...

The main concept in Vegeta is the target. A target represents an endpoint which will be load…
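A target file is plain text with one request per line. A minimal sketch of the workflow (the URLs, rate and duration are hypothetical; the attack itself needs the vegeta binary on the PATH and a live endpoint, so it is left commented):

```shell
# A target file lists one request per line: METHOD URL.
cat > targets.txt <<'EOF'
GET http://localhost:8080/api/health
GET http://localhost:8080/api/orders
EOF

# Attack the targets at 50 requests/second for 30 seconds, then render a report:
# vegeta attack -targets=targets.txt -rate=50 -duration=30s > results.bin
# vegeta report results.bin

cat targets.txt
```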

Create React App with Mobx in Typescript

I have been using state management frameworks for the past few years, mainly with Angular and NGRX. Today we will see how we can get started with Create React App using MobX as a state management system: bootstrap a fresh application, create components, create an observable state / store, create observer components.

1. Bootstrap a fresh application. Start by installing the latest version of NPM, then use npx to set up a fresh project:

npx create-react-app my-app --typescript

Then navigate to /my-app and run npm start. This will start the application in a development server with live reload. The command run by npm start is defined in package.json under scripts > start and runs react-scripts start.

cd my-app
npm start

The application should now build and run properly, and any changes made to the application should be reflected in the browser. We now have all the necessary tools to start writing code in React.

2. Create components…
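The observable store / observer pair in the last two steps can be sketched without React or MobX: a store holds state and re-runs registered observers on every change, which is essentially what MobX automates for components. The names here (CounterStore, observe) are illustrative, not the MobX API:

```typescript
// Dependency-free sketch of the observable-store idea that MobX automates:
// observers register with the store and are notified whenever state changes.
type Observer = () => void;

class CounterStore {
  private observers: Observer[] = [];
  private _count = 0;

  get count(): number { return this._count; }

  increment(): void {
    this._count += 1;
    this.observers.forEach((o) => o()); // notify, as MobX does via reactions
  }

  observe(o: Observer): void { this.observers.push(o); }
}

const store = new CounterStore();
let rendered = "";
store.observe(() => { rendered = `count: ${store.count}`; }); // like an observer component re-rendering
store.increment();
// rendered is now "count: 1"
```

In real MobX the notification wiring disappears: state is marked observable and components wrapped in observer re-render automatically.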

Setup HTTPS with Nginx on Azure Ubuntu VM

Today we will see how we can set up HTTPS using Certbot's Nginx integration on an Azure Ubuntu VM. This post is composed of three steps: prepare the VM, install Nginx, install Certbot.

1. Prepare the VM. We start by creating an Azure VM on Ubuntu 18.04, with either password or SSH authentication, and allowing HTTP, HTTPS and SSH. Once done, we can select a custom DNS name for our VM. This makes it easier to SSH in, and it will also be required for our SSL certificate setup. We set the Assignment to Static, then choose a DNS name label. Here we choose azure-test-vm, therefore the VM will be accessible at azure-test-vm.southcentralus.cloudapp.azure.com. We should now be able to SSH into the VM using:

ssh kimserey@azure-test-vm.southcentralus.cloudapp.azure.com

2. Install Nginx. Next, once we are in the VM, we can install Nginx:

sudo apt-get update
sudo apt-get install nginx

Once installed, as we already h…
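Once Nginx is answering on the DNS name, the Certbot step typically boils down to installing the Nginx plugin and requesting a certificate. A sketch for Ubuntu (package names vary by release; in 2018 the ppa:certbot/certbot PPA was commonly added first):

```shell
sudo apt-get update
sudo apt-get install -y certbot python3-certbot-nginx

# Request and install a certificate for the VM's DNS name;
# the --nginx plugin edits the matching server block and reloads Nginx.
sudo certbot --nginx -d azure-test-vm.southcentralus.cloudapp.azure.com
```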

Moving from chaining to piping in rxjs 6.x

Last month I updated all my NPM packages and realised that everything was broken! The reason was that rxjs, from 5.x to 6.x, moved away from extensions on the observable prototype in favor of pipeable functions; a major breaking change where the majority, if not all, of codebases built on rxjs need to be changed. Today we will explore the reasoning behind this change: reasons, example of migration, learnings.

1. Reasons. The reasons given in the official documentation are: any library importing a patch operator augmenting the observable prototype would also augment it for all consumers. For example, when a library imports the map extension with import 'rxjs/add/operator/map', it makes map available for all code importing that library. This could bring confusion due to the inconsistent way of importing extensions: some places would not need to import map (those where the library is used) while in other places of the code we would…
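The shift from chaining to piping can be illustrated without rxjs itself: a pipeable operator is just a function from a source to a new source, and pipe composes them. A dependency-free sketch over arrays, with the real rxjs 6 form shown in the comment:

```typescript
// rxjs 6 equivalent (requires the rxjs package):
//   import { of } from "rxjs";
//   import { map, filter } from "rxjs/operators";
//   of(1, 2, 3, 4).pipe(map(x => x * 2), filter(x => x > 4)).subscribe(console.log);

// Dependency-free sketch: an "operator" is a plain function T[] -> U[].
type Op<T, U> = (source: T[]) => U[];

const map = <T, U>(f: (x: T) => U): Op<T, U> => (src) => src.map(f);
const filter = <T>(p: (x: T) => boolean): Op<T, T> => (src) => src.filter(p);

// pipe composes operators left to right, like Observable.prototype.pipe.
const pipe = <T>(source: T[], ...ops: Op<any, any>[]): any[] =>
  ops.reduce((acc, op) => op(acc), source as any[]);

const result = pipe(
  [1, 2, 3, 4],
  map((x: number) => x * 2),
  filter((x: number) => x > 4)
);
// result: [6, 8]
```

Since operators are plain functions, nothing patches the Observable prototype, which is exactly what the 6.x change was after.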

Entity Framework Core Performance Optimization

Last year I talked about Entity Framework Core. It is an easy and feature-rich ORM which makes working with databases in a .NET environment typesafe. But even though it makes things easy, there are ambiguous cases which can catch us off guard. Today we will see four of these cases and how to deal with them: client evaluation, iteration, Include and ThenInclude, NoTracking. For the following examples, I will be using SQLite with Entity Framework Core.

tl;dr:
- Make sure that the query constructed in C# uses functions that can be translated to SQL.
- Make sure that there isn't an abnormal amount of queries created and that the code does not iterate item per item.
- Use Include and ThenInclude to load object relations needed after query execution; before execution (inside the query itself) they are not needed.
- Use NoTracking for read-only queries to disable change tracking on entities and yield better performance.

1. Client evaluation. The following example illust…
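Two of the tl;dr points, eager loading and no-tracking, can be sketched in a single query. Blog, Post and Comment are hypothetical entities introduced for illustration; the relevant part is the query shape:

```csharp
// Eager-load relations so they are populated after execution,
// and disable change tracking for a read-only query.
var blogs = context.Blogs
    .Include(b => b.Posts)            // load each Blog's Posts relation
        .ThenInclude(p => p.Comments) // and each Post's Comments
    .AsNoTracking()                   // read-only: skip the change tracker
    .ToList();
```

Without Include, navigation properties would simply be null after materialization; without AsNoTracking, every returned entity would be registered in the change tracker even though it is never updated.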

Create a Navigation loading bar for Angular with PrimeNG

In Angular, it is common practice to execute commands prior to routing to a page, using guards and resolvers. Guards prevent the routing from occurring until a condition turns true, and resolvers prevent the routing from occurring until data is returned. These actions can take time to complete, and in the meantime the component will not load, leaving the user with an impression of unresponsiveness. Today we will see how we can implement a navigation loading bar for Angular using the PrimeNG progress bar component, in two parts: set up an Angular project, PrimeNG progress bar. If you are unfamiliar with the Angular router, you can have a look at my previous blog post explaining the features of the router: https://kimsereyblog.blogspot.com/2017/06/how-to-use-angular-router.html

1. Setup an Angular project. We start by creating a project and installing PrimeNG:

npm install primeng --save
npm install primeicons --save

Next we a…
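At its core, the loading bar is a mapping from router events to a visibility flag, which the template then binds to PrimeNG's p-progressBar (typically with mode="indeterminate"). A framework-free sketch of that mapping; the event names mirror Angular's router events, but this is not the Angular API itself:

```typescript
// Event kinds mirroring Angular's router events.
type RouterEventKind =
  | "NavigationStart"
  | "NavigationEnd"
  | "NavigationCancel"
  | "NavigationError";

// Show the bar when navigation starts; hide it on any terminal event,
// including cancellations and errors, so the bar never gets stuck.
function showLoadingBar(kind: RouterEventKind): boolean {
  return kind === "NavigationStart";
}

const events: RouterEventKind[] = ["NavigationStart", "NavigationEnd"];
const flags = events.map(showLoadingBar);
// flags: [true, false]
```

In the real component, one subscribes to router.events and assigns the flag inside the subscription.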

Monitor Upstream Response Time with Nginx and CloudWatch

Last week we saw how we could set up CloudWatch to push logs from our application to CloudWatch. Apart from the application logs, another type of log worth looking into is the access log from Nginx. Nginx being the entry point of the application, all traffic in and out goes through it, and today we will see how we can leverage its access logs to monitor the response time of our application from CloudWatch, in three parts: set up Nginx to log upstream response time, set up logs to be pushed to CloudWatch, set up metrics filters on CloudWatch.

1. Setup Nginx to log upstream response time. By default Nginx does not log the upstream response time. In order to print it, we can create a new log format which includes it and configure logs to be written to a file using that format. We can add our log configuration in a file called log_format.conf placed under /etc/nginx/conf.d/, and nginx will include the configu…
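Such a log format typically extends the default combined fields with $upstream_response_time (and often $request_time). A sketch of log_format.conf; the format name and rt=/urt= labels are illustrative, while the variables are standard Nginx ones:

```nginx
# /etc/nginx/conf.d/log_format.conf (illustrative)
log_format timed '$remote_addr - $remote_user [$time_local] '
                 '"$request" $status $body_bytes_sent '
                 'rt=$request_time urt=$upstream_response_time';

access_log /var/log/nginx/access.log timed;
```

The rt=/urt= prefixes make it easy to write a CloudWatch metric filter pattern that extracts the timings later.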

Serilog with AWS Cloudwatch on Ubuntu

A few weeks ago we saw how to configure Serilog to work with different environments. At the end of that post, we saw briefly how to get the structured logs synced to CloudWatch. Today we will explore the configuration in more detail: the unified CloudWatch agent, literate and JSON logs with Serilog, debugging the CloudWatch agent.

1. Unified CloudWatch agent. The unified CloudWatch agent can be installed by following the official documentation: https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/UseCloudWatchUnifiedAgent.html. There is a previous version of the CloudWatch agent; this new version, introduced in December 2017, unifies the collection of metrics and logs for CloudWatch under the same configuration. To install the agent, execute the following commands:

mkdir ~/tmp
cd ~/tmp
wget https://s3.amazonaws.com/amazoncloudwatch-agent/linux/amd64/latest/AmazonCloudWatchAgent.zip
unzip AmazonCloudWatchAgent.zip
sudo ./install.sh

This will install…
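Once installed, the agent reads a JSON configuration describing which files to ship. A minimal sketch for a Serilog JSON log file; the file path, log group and stream names are hypothetical, while the key structure follows the agent's documented schema:

```shell
# Typically saved as /opt/aws/amazon-cloudwatch-agent/etc/amazon-cloudwatch-agent.json
cat > amazon-cloudwatch-agent.json <<'EOF'
{
  "logs": {
    "logs_collected": {
      "files": {
        "collect_list": [
          {
            "file_path": "/var/log/myapp/log.json",
            "log_group_name": "myapp",
            "log_stream_name": "{instance_id}"
          }
        ]
      }
    }
  }
}
EOF

# Sanity-check that the configuration is valid JSON before feeding it to the agent.
python3 -m json.tool amazon-cloudwatch-agent.json > /dev/null && echo "valid json"
```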

Basic Authentication with Nginx

Basic authentication provides an easy way to password-protect an endpoint on our server. Today we will see how we can create a password file and use it to enable basic authentication on Nginx: create a password, enable basic authentication, reuse configuration. If you haven't seen Nginx before, you can have a look at my previous blog post on getting started with Nginx.

1. Create a password. To create a password, we will use the apache2-utils tool htpasswd:

sudo apt-get update
sudo apt-get install apache2-utils

htpasswd allows us to create passwords stored encrypted in a file:

sudo htpasswd -c /etc/myapp/.htpasswd user1

-c is used to create the file. If we want to add more users, we omit the parameter:

sudo htpasswd /etc/myapp/.htpasswd user2

Now if we navigate to /etc/myapp/ we will find our .htpasswd file containing usernames together with encrypted passwords.

2. Enable basic authentication. Enabling basic authenti…
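The .htpasswd format is simply one user:hash entry per line. As an aside, when apache2-utils is not available, openssl can produce the same Apache APR1 hash that htpasswd uses by default (the user name and password here are examples):

```shell
# Generate a user:hash entry compatible with htpasswd's default APR1/MD5 scheme.
printf 'user1:%s\n' "$(openssl passwd -apr1 's3cret')" > .htpasswd
cat .htpasswd   # e.g. user1:$apr1$<salt>$<hash>
```

On the Nginx side, the file is then referenced with the auth_basic and auth_basic_user_file directives inside the server or location block to protect.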

Gitlab CI/CD with pipeline, artifacts and environments

Almost a year ago I wrote about how we could set up CI/CD with the GitLab pipeline. I showed a very simple three-stage build/test/deploy pipeline. Since then GitLab has considerably improved their CI tool with features simplifying release management. Today we will revisit pipelines and introduce a few concepts which help in managing releases: pipeline, releases, artifacts, environments.

1. Pipeline. Pipelines are defined as jobs. Each job can be part of a stage in the pipeline, and multiple jobs can run concurrently if they are part of the same stage. The pipeline is defined in a .gitlab-ci.yml file placed at the root of the application. We can set up our own runner or use a shared runner from GitLab. The shared runner runs on Docker, therefore it's possible to use the dotnet image and build our dotnet application. Here is the pipeline we will be using as example:

image: microsoft/dotnet:latest

stages:
  - build
  - test
  - pa…
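A complete pipeline combining the concepts above might look as follows; the job names, scripts and environment name are illustrative, while the keywords (stages, artifacts, environment) are standard .gitlab-ci.yml syntax:

```yaml
image: microsoft/dotnet:latest

stages:
  - build
  - test
  - deploy

build:
  stage: build
  script:
    - dotnet publish -c Release -o publish
  artifacts:
    paths:
      - publish/          # published output passed on to later jobs

test:
  stage: test
  script:
    - dotnet test

deploy_staging:
  stage: deploy
  script:
    - echo "deploy publish/ to staging"
  environment:
    name: staging         # tracked under the project's Environments page
```

Artifacts make the build output available to downstream jobs, and the environment keyword lets GitLab track which release is deployed where.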

Setup a Jenkins Pipeline for local development environment in Docker container

CI/CD pipelines allow us to automatically build, test and deploy code changes. With a Jenkins pipeline, the pipeline itself is generated from a file called the Jenkinsfile which, usually, is source-controlled together with the source code repository. When we need to push new changes, what we would usually do is test locally and then commit to the repository. From there, the Jenkins pipeline triggers: it builds and tests on the integration server and deploys to a testable environment (DEV/QA). But what do we do when the changes we are making are to the Jenkinsfile itself? How do we test the validity of the Jenkinsfile locally, or more simply, how do we try out a Jenkins pipeline in a sandbox to learn how to write a Jenkinsfile? Today we will see how we can set up a sandbox with a full CI/CD deployment which can be quickly brought up and torn down for testing: Jenkins server via Docker, Jenkins pipeline, Sim…
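One way to sketch such a throwaway sandbox is a small compose file around the official jenkins/jenkins:lts image; the service layout is an assumption for illustration, and running it requires Docker:

```shell
# Define a throwaway Jenkins server as a compose file.
cat > docker-compose.yml <<'EOF'
services:
  jenkins:
    image: jenkins/jenkins:lts
    ports:
      - "8080:8080"
    volumes:
      - jenkins_home:/var/jenkins_home
volumes:
  jenkins_home:
EOF

# Bring the sandbox up (requires Docker):  docker compose up -d
# Tear it down, wiping its state:          docker compose down -v

cat docker-compose.yml
```

The named volume keeps Jenkins state between restarts, while `down -v` discards it so every experiment starts clean.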