Wednesday, March 20, 2024

Convert Java Project from Log4j 1 to Log4j2

While working on old Java projects we often find Log4j 1.x in use. Since Log4j 2 is the current version, upgrading means changing all the files to use the new package names.

This is a tedious job. OpenRewrite offers a solution; follow the below steps to convert your project from Log4j 1.x to Log4j 2.

Steps:
  • Navigate to the project folder in command prompt
  • Run the below command
mvn -U org.openrewrite.maven:rewrite-maven-plugin:run -Drewrite.recipeArtifactCoordinates=org.openrewrite.recipe:rewrite-logging-frameworks:RELEASE -Drewrite.activeRecipes=org.openrewrite.java.logging.log4j.Log4j1ToLog4j2
  • This converts all imports to Log4j 2 packages, removes the Log4j 1.x dependency & adds the Log4j 2 dependencies automatically in pom.xml
  • Add the LMAX Disruptor dependency in pom.xml as below
<dependency>
    <groupId>com.lmax</groupId>
    <artifactId>disruptor</artifactId>
    <version>3.4.4</version>
</dependency>
  • Create log4j2.xml (under the src\main\resources folder); a sample is given below. The assumption here is that log files are created within Tomcat's logs folder. Replace <AppName> with the app name.

<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN">
    <Appenders>
        <Console name="Console" target="SYSTEM_OUT">
            <PatternLayout pattern="%d{dd-MM-yyyy HH:mm:ss} [%t] %-5p %c - %m%n"/>
        </Console>
        <!-- Generate rolling log for router with per hour interval policy -->
        <RollingFile name="ProcessorRollingFile" fileName="${sys:catalina.home}/logs/<AppName>.log" filePattern="${sys:catalina.home}/logs/$${date:yyyy-MM-dd}/<AppName>-%d{yyyy-MM-dd-HH}-%i.log.gz">
            <PatternLayout>
                <!--<pattern>%d{ISO8601} [%t] %p %c %L - %m%n</pattern>-->
                <pattern>%d{dd-MM-yyyy HH:mm:ss} [%t] %-5p %c - %m%n</pattern>
            </PatternLayout>
            <Policies>
                <SizeBasedTriggeringPolicy size="500 MB"/>
            </Policies>
            <DefaultRolloverStrategy max="100"/>
        </RollingFile>
        <!-- Register Async appender -->
        <Async name="AsyncRollingFile">
            <AppenderRef ref="ProcessorRollingFile"/>
        </Async>
    </Appenders>
    <Loggers>
        <AsyncLogger name="root" level="WARN" additivity="false">
            <AppenderRef ref="AsyncRollingFile"/>
        </AsyncLogger>
    </Loggers>
</Configuration>


If you skip tests while running Maven for the project, the same flag needs to be applied when executing the Maven command for OpenRewrite. Likewise, if you use a profile, it needs to be added to the Maven command. If the profile name is Live, the full command with tests skipped will look like below:

mvn -U org.openrewrite.maven:rewrite-maven-plugin:run -Drewrite.recipeArtifactCoordinates=org.openrewrite.recipe:rewrite-logging-frameworks:RELEASE -Drewrite.activeRecipes=org.openrewrite.java.logging.log4j.Log4j1ToLog4j2 -Dmaven.test.skip=true -PLive


Wednesday, February 28, 2024

Creating Tag for Docker Image

Once we build an application image & push it to the Docker registry, it is tagged as latest.

The next time we rebuild & push the image, the latest tag gets overwritten.

But if the latest image gives any error, we should be able to deploy the previous one.

Hence we need to create a separate tag for the current latest image before pushing the new image.

Below are the commands to do the same:

sudo docker tag <docker repo>/testwebapp:latest <docker repo>/testwebapp:prev

sudo docker push <docker repo>/testwebapp:prev

This creates a tag named prev (you can choose any name) for the present latest image.

These commands should be run before pushing the new image to Docker.

Monday, January 22, 2024

Convert SVN Project to Git Project

Here we are going to see how to convert an SVN project to a Git project on the local filesystem.

Steps:
  • Pre-requisite: Git must be preinstalled on your machine
  • Open a command prompt & run the below command
  • git svn clone -r HEAD <SVN Codebase URL>
  • This creates a folder with the same name as the app, containing a .git directory

Sunday, January 21, 2024

Semgrep

Semgrep is a SAST (Static Application Security Testing) tool.

Steps to get the SAST report:

  1. Check out the code into a local directory from GitHub.
  2. Go to https://semgrep.dev/login/ & create a login
  3. docker run -it returntocorp/semgrep semgrep login
  4. Open the URL provided in a browser to activate the token
  5. From a command prompt, navigate to the local folder where the code is checked out from GitHub
  6. Copy the token & run the below command with it
  7. docker run -e SEMGREP_APP_TOKEN=<token> --rm -v "<local repo>:/src" returntocorp/semgrep semgrep ci
  8. Check the report in the Semgrep UI
Additional Info (For SVN repos):
Semgrep presently supports only Git projects.
Hence if you are using SVN as the code repository, first convert the SVN project to a Git project (details at http://souravdalal.blogspot.com/2024/01/convert-svn-project-to-git-project.html)

Once done, you can run the above steps to generate the report.
If you get an error like "Unable to infer repo_url. Set SEMGREP_REPO_URL environment variable or run in a valid git project with remote origin defined", then add the Git remote using the below command

git remote add origin https://github.com/<repo_name>

In case you want to dump the report to a local file, use the below command

docker run -e SEMGREP_APP_TOKEN=<token> --rm -v "<local repo>:/src" returntocorp/semgrep semgrep ci > semgrep_report.txt


Thursday, January 18, 2024

Trivy Code Vulnerability report

Trivy provides a third-party library vulnerability report along with detection of security keys exposed in your code.

The tool also provides the version in which the vulnerability is fixed.

You can use the below steps to get a report after checking out the code from your repo:

Go to https://github.com/aquasecurity/trivy/releases/download/v0.48.3/trivy_0.48.3_windows-64bit.zip

Download the zip

Extract the folder

Go to <Extracted Folder>\trivy_0.48.3_windows-64bit

Open command line from above folder

Run the below command

trivy fs <codebase path in local m/c> > <app_name>_sec_rpt.txt

Further reading:

https://trivy.dev/


Sunday, January 7, 2024

How to manage Docker images in Github Packages

Instead of Docker Hub, the GitHub Container Registry (GHCR) can also be used for image management.

You need to follow the below steps to do that:

1. Log in to GHCR from the Docker CLI using the below command. Replace YOUR_GITHUB_USERNAME & YOUR_PERSONAL_ACCESS_TOKEN with your own values.

docker login ghcr.io -u YOUR_GITHUB_USERNAME -p YOUR_PERSONAL_ACCESS_TOKEN

2. Build the Docker image locally

docker build -t ghcr.io/OWNER/IMAGE_NAME:TAG .

3. Push the docker image to GHCR

docker push ghcr.io/OWNER/IMAGE_NAME:TAG

Links for further readings:

https://cto.ai/blog/build-and-deploy-a-docker-image-on-ghcr/

Monday, December 25, 2023

How to access Multiple Git Account from Single Computer

You might have more than one GitHub account: one personal & another for the office.

To use both of them on a single Windows PC you can do the following:

Pre-requisite: TortoiseGit is installed


Open the terminal in Windows & run the below command

git config --global credential.useHttpPath true

Now, if you try to check out a repo, it will ask for your GitHub username & password.

For the password, use a Personal Access Token.

Personal Access Token can be generated from Profile--> Settings-->Developer Settings-->Personal access tokens (classic) 

Friday, October 27, 2023

SonarQube Postgres Installation

SonarQube with JDK 17 & Postgres as DB backend

DB:

Install Postgres Server 16.x

Execute the below SQL Script from Postgres DB:

CREATE USER sonar ;

ALTER USER sonar WITH PASSWORD 'sonar';

CREATE DATABASE sonardb WITH ENCODING 'UTF8';

ALTER DATABASE sonardb OWNER TO sonar;


SonarQube:

Add a SONAR_JAVA_PATH environment variable with a value like <Java_Path>\Java\jdk17.0.8_8\bin\javaw.exe


In sonar.properties 

1. Update JDBC url

sonar.jdbc.url=jdbc:postgresql://localhost:5432/sonardb?currentSchema=public

2. Update the below 2 properties too

sonar.jdbc.username=sonar

sonar.jdbc.password=sonar


Once done with above steps; 

Navigate to <Sonar Path>\sonarqube-10.2.1.78527\bin\windows-x86-64

From a command prompt, run StartSonar.bat to start.

SonarQube should be accessible at:

http://localhost:9000/




Monday, August 14, 2023

How to Zip the Response for Java based Web app

For Spring 4.x based web applications, we can use the following to add compression on the response.

Add the below dependency in pom.xml

<dependency>
    <groupId>net.sf.ehcache</groupId>
    <artifactId>ehcache-web</artifactId>
    <version>2.0.4</version>
</dependency>

Then in web.xml add the below filter declaration:

<filter>
    <filter-name>GzipFilter</filter-name>
    <filter-class>net.sf.ehcache.constructs.web.filter.GzipFilter</filter-class>
</filter>
<filter-mapping>
    <filter-name>GzipFilter</filter-name>
    <url-pattern>/*</url-pattern>
</filter-mapping>
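To get a feel for how much gzip can shrink a typical markup-heavy response, here is a small stand-alone sketch using only the JDK's GZIPOutputStream (an illustration; the filter above applies the same compression transparently to the servlet response):

```java
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPOutputStream;

public class GzipSizeDemo {
    public static void main(String[] args) throws Exception {
        // A repetitive payload, similar in character to HTML/JSON responses
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 200; i++) {
            sb.append("<tr><td>row</td></tr>");
        }
        byte[] raw = sb.toString().getBytes(StandardCharsets.UTF_8);

        // Compress the payload the same way a gzip response filter would
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
            gz.write(raw);
        }
        byte[] zipped = bos.toByteArray();

        System.out.println("raw=" + raw.length + " gzipped=" + zipped.length);
        System.out.println("smaller=" + (zipped.length < raw.length)); // prints smaller=true
    }
}
```

Repetitive markup compresses very well, which is why enabling the filter noticeably reduces response sizes.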

Monday, July 31, 2023

Creating POJO from JSON using Manifold System

Many times we need to create POJOs from JSON samples. This is a tedious job, as we need to create a class & then add the properties.

Is there a better way, where Java can automatically generate the class by parsing the JSON file?

Yes, Manifold System comes to the rescue. (http://manifold.systems/)

Let's see an example:

Pre-requisite: 

1. Java 8

2. Intellij IDE

Steps:

Plugin Install:

Open Intellij IDE

Go to Settings->Plugins -> Marketplace

Find Manifold & install the same.


Dependency Addition in Maven Project 

First add the below dependencies in pom.xml

<properties>
    <maven.compiler.source>8</maven.compiler.source>
    <maven.compiler.target>8</maven.compiler.target>
    <manifold.version>2022.1.29</manifold.version>
</properties>

Dependency:

<dependency>
    <groupId>systems.manifold</groupId>
    <artifactId>manifold-json-rt</artifactId>
    <version>${manifold.version}</version>
</dependency>

Build Config:

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>3.8.0</version>
            <configuration>
                <source>8</source>
                <target>8</target>
                <encoding>UTF-8</encoding>
                <compilerArgs>
                    <!-- Configure manifold plugin -->
                    <arg>-Xplugin:Manifold</arg>
                </compilerArgs>
                <!-- Add the processor path for the plugin -->
                <annotationProcessorPaths>
                    <path>
                        <groupId>systems.manifold</groupId>
                        <artifactId>manifold-json</artifactId>
                        <version>${manifold.version}</version>
                    </path>
                    <path>
                        <groupId>systems.manifold</groupId>
                        <artifactId>manifold-strings</artifactId>
                        <version>${manifold.version}</version>
                    </path>
                </annotationProcessorPaths>
            </configuration>
        </plugin>

        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-surefire-plugin</artifactId>
            <version>2.22.2</version>
        </plugin>
    </plugins>
</build>

If Lombok is already used in the project then use below configuration:
 <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.8.0</version>
                <configuration>
                    <source>8</source>
                    <target>8</target>
                    <encoding>UTF-8</encoding>
                    <compilerArgs>
                        <!-- Configure manifold plugin -->
                        <arg>-Xplugin:Manifold</arg>
                    </compilerArgs>
                    <!-- Add the processor path for the plugin -->
                    <annotationProcessorPaths>

                        <processorPath>
                            <groupId>org.projectlombok</groupId>
                            <artifactId>lombok</artifactId>
                            <version>1.18.24</version>
                        </processorPath>
                        <processorPath>
                            <groupId>systems.manifold</groupId>
                            <artifactId>manifold-json</artifactId>
                            <version>${manifold.version}</version>
                        </processorPath>
                        <processorPath>
                            <groupId>systems.manifold</groupId>
                            <artifactId>manifold-strings</artifactId>
                            <version>${manifold.version}</version>
                        </processorPath>


                    </annotationProcessorPaths>
                </configuration>
            </plugin>


Code:

Create a sample JSON file in src/main/resources folder.

The file should be placed following the package structure.

The file name determines the generated class name.

E.g.

Create a folder named restapi in src/main/resources/

Create a file named User.json with below content:

{
  "firstName": "Sourav",
  "lastName": "Dalal"
}
A new class, User, will be generated dynamically.
The class will live under the package restapi; you can use any package structure of your choice.
You can access the class like below:

User testUser = User.create();
testUser.setFirstName("X");
testUser.setLastName("Y");
System.out.println(testUser.write().toJson());

Happy Coding !! :)
Further Reading:
http://manifold.systems/articles/articles.html
https://debugagent.com/revolutionize-json-parsing-in-java-with-manifold




Monday, June 26, 2023

How to use RxJava to process multiple threads in parallel & in sequence

 Hi,

In this blog I will discuss 2 scenarios where RxJava threading can be helpful:

1. Run Multiple threads in parallel & wait for all of them to complete

2. Run threads sequentially (one after another)


First, add the below dependency in Maven

<dependency>
    <groupId>io.reactivex.rxjava2</groupId>
    <artifactId>rxjava</artifactId>
    <version>2.2.21</version>
</dependency>
Below, two Completable tasks are created:
import io.reactivex.Completable;
import io.reactivex.schedulers.Schedulers;

Completable cp1 = Completable.fromAction(() -> {
    // business logic
}).subscribeOn(Schedulers.io());

Completable cp2 = Completable.fromAction(() -> {
    // business logic
}).subscribeOn(Schedulers.io());
Now for scenario 1:
cp1.mergeWith(cp2).blockingAwait(); // cp1 & cp2 run concurrently; blockingAwait waits for both tasks to complete

For scenario 2:
cp1.andThen(cp2).subscribe(); // cp2 runs after cp1 completes
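For comparison, the same two scenarios can be sketched with the JDK's own CompletableFuture, with no extra dependency (an illustrative sketch, not from the RxJava API): allOf mirrors mergeWith(...).blockingAwait(), and thenRun mirrors andThen.

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.CopyOnWriteArrayList;

public class TaskOrderDemo {
    public static void main(String[] args) {
        List<String> log = new CopyOnWriteArrayList<>();

        // Scenario 1: run two tasks in parallel & wait for both to finish
        CompletableFuture<Void> t1 = CompletableFuture.runAsync(() -> log.add("t1"));
        CompletableFuture<Void> t2 = CompletableFuture.runAsync(() -> log.add("t2"));
        CompletableFuture.allOf(t1, t2).join(); // like mergeWith(...).blockingAwait()

        // Scenario 2: run tasks one after another
        CompletableFuture.runAsync(() -> log.add("first"))
                .thenRun(() -> log.add("second")) // like andThen(...)
                .join();

        System.out.println(log.subList(2, 4)); // prints [first, second]
    }
}
```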

Thursday, June 8, 2023

How to view Wifi Password in Network

Many times you may need to check the password set up in a Wi-Fi network. You can use the following steps:

Open Windows command prompt

Type in the below command, replacing the Wi-Fi network name

netsh wlan show profile <wifi_nw_name> key=clear

In the output, check the Key Content value under Security settings.


Monday, May 8, 2023

Externalization of Logs from Docker Container

For Docker, if we want to persist any data after the container is removed, we can use the following command format:

docker run -d -p 9090:8080 -v <OS File Path>:<Docker container Path> tomcat-test-webapp

Here the logs will be stored at the OS file/folder path instead of inside the Docker container.

docker build -t tomcat-test-webapp .

docker run -d -p 9090:8080 -v F:/docker_data/logs:/usr/local/tomcat/logs tomcat-test-webapp


The -v option creates a volume that persists on the OS even after the container is removed.


With container name & Catalina Options it should look like below:

docker run -d --name testDockerWebApp -p 9090:8080 -v F:/docker_data/logs:/usr/local/tomcat/logs -e CATALINA_OPTS="-Xms512M -Xmx512M" tomcat-test-webapp

Format with log, connection pool, properties file externalization:

docker run -d --name <image_name> -p <external_port>:<Docker Internal Port> -v /opt/app_log/<app_name>:/usr/local/tomcat/logs -v /opt/app_cp/<app_name>:/usr/local/tomcat/conf/Catalina/localhost -v /opt/app_prop/<app_name>:/usr/local/properties/<app_name> -e CATALINA_OPTS="-Xms512M -Xmx512M" <docker_hub_image_name>

Friday, March 24, 2023

Deploy WAR file in Docker Container

Pre-requisite: Docker Desktop for Windows to be installed & started

Steps:

Create a WAR file [e.g. TestWebApp.war]

Create a Dockerfile

Place both of them (WAR & Dockerfile) in same folder (e.g. D:\DevOps)

Navigate to that folder (D:\DevOps) & open command prompt

Run the below command to build the image. Here tomcat-test-webapp is the name of the image

docker build -t tomcat-test-webapp .

To run the image use below command

docker run -d -p 8080:8080 tomcat-test-webapp

In case the port needs to be different, use the below format

docker run -d -p <custom port>:8080 tomcat-test-webapp

Content of Dockerfile

FROM tomcat:9.0.52-jdk8-corretto

COPY ./TestWebApp.war /usr/local/tomcat/webapps

EXPOSE 8080

CMD ["/usr/local/tomcat/bin/catalina.sh","run"]


https://www.youtube.com/watch?v=B9vy3DMHo2I

Pushing Image to Docker Hub

https://www.cloudbees.com/blog/using-docker-push-to-publish-images-to-dockerhub


Pulling Images from Docker Hub & Run

docker login

docker pull <docker_user_name>/tomcat-test-webapp:latest

docker run -d -p 9080:8080 <docker_user_name>/tomcat-test-webapp

To run the image with memory settings or other Catalina options, use the below command:

docker run -d -p 9080:8080 -e CATALINA_OPTS="-Xms512M -Xmx512M"  <docker_user_name>/tomcat-test-webapp


Auto Deploy the changes from Docker Hub:

WATCHTOWER_POLL_INTERVAL is set in seconds.


docker run -d --name watchtower -e REPO_USER=<> -e REPO_PASS=<> -e WATCHTOWER_POLL_INTERVAL=30 -e WATCHTOWER_CLEANUP=true -v /var/run/docker.sock:/var/run/docker.sock containrrr/watchtower <container_name>


https://alexgallacher.com/auto-update-docker-containers-by-setting-up-watchtower/

https://containrrr.dev/watchtower/

https://www.geekyhacker.com/how-to-use-spotify-docker-maven-plugin/


Wednesday, January 25, 2023

Software Links

 1. Mockoon https://mockoon.com/

2. PlantUML

3. Junit Auto test case generator (https://www.diffblue.com/community-edition/download/)

Thursday, January 12, 2023

Opensearch Install Windows

OpenSearch is an Elasticsearch fork from Amazon.

Download Opensearch from https://opensearch.org/downloads.html

Extract opensearch-2.4.1-windows-x64.zip in a folder in Windows

OpenSearch ships with JDK 17.

Set JAVA_HOME for JDK 17 in <OpenSearch Extracted Folder>/opensearch-2.4.1/bin/opensearch-env.bat

set JAVA_HOME=<OpenSearch Extracted Folder>/opensearch-2.4.1/jdk

Open <OpenSearch Extracted Folder>/opensearch-2.4.1/config/opensearch.yml

Add the below line to disable the secure connection

plugins.security.disabled: true

Now run <OpenSearch Extracted Folder>/opensearch-2.4.1/opensearch-windows-install.bat from the command prompt.

Once started, navigate to http://localhost:9200/



Monday, November 28, 2022

Running a continuous job using Reactor

In this section we will execute a job at a certain interval using the Spring Reactor framework. The job is defined in the subscribe method.

Maven Dependency:

<dependency>
    <groupId>io.projectreactor</groupId>
    <artifactId>reactor-core</artifactId>
    <version>3.4.24</version>
</dependency>

Code Snippet:

Running a job at a certain interval (10 sec) forever.

Mono.just("SD").delaySubscription(Duration.ofSeconds(10)).repeat(()->true).subscribe(a->System.out.println(a));
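As a rough JDK-only comparison (an illustration, not from the original post), a ScheduledExecutorService can run the same kind of periodic job; the interval here is shortened to 100 ms just so the demo finishes quickly.

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class PeriodicJobDemo {
    public static void main(String[] args) throws Exception {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        CountDownLatch threeRuns = new CountDownLatch(3);

        // Run the job every 100 ms (the Reactor example uses 10 s)
        scheduler.scheduleAtFixedRate(() -> {
            System.out.println("job ran");
            threeRuns.countDown();
        }, 0, 100, TimeUnit.MILLISECONDS);

        // For the demo, stop after the job has run 3 times;
        // the Reactor version with repeat(() -> true) runs forever instead.
        threeRuns.await(5, TimeUnit.SECONDS);
        scheduler.shutdown();
    }
}
```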

Wednesday, September 14, 2022

Installing Kotlin on Eclipse

Recently I faced a problem installing the Kotlin plugin for the Eclipse IDE. As per a StackOverflow post, the plugin has been removed. The newly forked project that can be used to install Kotlin in Eclipse is:

https://github.com/bvfalcon/kotlin-eclipse-2022


Sunday, April 17, 2022

AWS Translate using AWS JS SDK

 In this post, I will discuss how to use AWS Javascript SDK V3 for AWS Translate.

Prerequisite: Node.js should be pre-installed

Steps:

  • Create a folder e.g. AWSTranslateApp
  • From command prompt navigate to folder "AWSTranslateApp"
  • Type npm init
  • This will create a Node project
  • Install AWS JS SDK by typing "npm install @aws-sdk/client-translate" under folder "AWSTranslateApp"
  • Now open the "AWSTranslateApp" folder in VS Code
  • Create a file translateExample.js

Below is the code snippet for translateExample.js

const { TranslateClient, TranslateTextCommand } = require("@aws-sdk/client-translate");

async function getTranslatedData() {
  const client = new TranslateClient({
    region: "<region_name>",
    credentials: {
      accessKeyId: "<access_key>",
      secretAccessKey: "<secret_key>"
    }
  });

  const params = {
    SourceLanguageCode: "en",
    TargetLanguageCode: "es",
    Text: "Hello, world"
  };

  const command = new TranslateTextCommand(params);
  const data = await client.send(command);
  const jsonData = data.TranslatedText;
  console.log(jsonData);
}

getTranslatedData();


  • Replace the region, access key, secret key with the actual values. In my case the region was 'us-east-1'.
  • Once done, you can now run the program using below command from command prompt

        node translateExample

References:

https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/clients/client-translate/index.html



Saturday, April 9, 2022

AWS Service Call from Java SDK

 Here in this post I will discuss how to use AWS Translate Service from Java Code.

Steps:

1. Add the below dependency in pom.xml

<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-translate</artifactId>
    <version>1.12.194</version>
</dependency>

2. Add below imports in Java File

import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.translate.AmazonTranslate;
import com.amazonaws.services.translate.AmazonTranslateClient;
import com.amazonaws.services.translate.model.TranslateTextRequest;
import com.amazonaws.services.translate.model.TranslateTextResult;

3. Below is the code snippet; provide the accessKey & secretKey to access the service

BasicAWSCredentials credentials = new BasicAWSCredentials("<access_key>",
"<secret_key>");
AmazonTranslate translate = AmazonTranslateClient.builder()
.withCredentials(new AWSStaticCredentialsProvider(credentials)).withRegion(Regions.US_EAST_1).build();
TranslateTextRequest request = new TranslateTextRequest().withText("Hello, world").withSourceLanguageCode("en")
.withTargetLanguageCode("es");
TranslateTextResult result = translate.translateText(request);
System.out.println(result.getTranslatedText());

AWS Service Call from Postman

Many times you need to call AWS services via Postman for testing.

I am using Amazon Translate Service, below are the configuration made to call the service:

In Postman choose the Post method for call

Service URL: https://translate.us-east-1.amazonaws.com/

In "Authorization" tab choose "Type" "AWS Signature"

Provide AccessKey, SecretKey, AWS Region, Service Name

The region I am using is "us-east-1" , Service Name will be "translate"

Then move to the "Headers" tab & add the following headers

Content-Type: application/x-amz-json-1.1

X-Amz-Target: AWSShineFrontendService_20170701.TranslateText

N.B. X-Amz-Date will be generated by Postman automatically

Under "Body", select "raw", and add the following sample body:

{
    "SourceLanguageCode": "en",
    "TargetLanguageCode": "es",
    "Text": "Hello, world"
}

Now hit the Send button & check the result.

Clicking "Code" in Postman, you can also get the code for Java/Node.js and many other languages.

Helpful Links:

https://stackoverflow.com/questions/59128739/how-to-use-aws-translate-translatetext-api-via-postman

https://docs.aws.amazon.com/translate/latest/dg/API_Reference.html

Wednesday, March 30, 2022

Maven Repository configuration in pom.xml

Many times you might fail to download Maven dependencies in Jenkins due to network issues. In that case, the Maven repository needs to be configured in pom.xml.

Below is the snippet you can use to configure maven repo.

<project>

..................

    <repositories>
        <repository>
            <id>central</id>
            <name>Central Repository</name>
            <url>https://repo.maven.apache.org/maven2</url>
            <layout>default</layout>
            <snapshots>
                <enabled>false</enabled>
            </snapshots>
        </repository>
    </repositories>

    <pluginRepositories>
        <pluginRepository>
            <id>central</id>
            <name>Central Repository</name>
            <url>https://repo.maven.apache.org/maven2</url>
            <layout>default</layout>
            <snapshots>
                <enabled>false</enabled>
            </snapshots>
            <releases>
                <updatePolicy>never</updatePolicy>
            </releases>
        </pluginRepository>
    </pluginRepositories>

</project>

Thursday, March 10, 2022

Calling external services using RestTemplate

Though microservices with Spring Boot are a very popular architecture today, along with new apps we also need to maintain legacy apps.

In legacy apps, we will often find external HTTP calls. In my case, most of the time the legacy code uses Apache Commons HttpClient or plain Java HTTP calls.

One common mistake is leaving the HTTP connection open after making the call instead of closing it. This causes problems when the called service is down; in those cases the calling app will have memory issues.

To solve this, we can use Spring RestTemplate instead of other HTTP clients. The code is tiny & crisp, and the closing of the connection is handled by Spring.

Steps:

1. Add below maven dependency:


<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-web</artifactId>
    <version>3.0.6.RELEASE</version>
</dependency>

2. Below is sample code. The assumption is that the code sends a JSON request & receives a JSON response.


import java.util.Arrays;
import org.springframework.http.HttpEntity;
import org.springframework.http.HttpHeaders;
import org.springframework.http.HttpMethod;
import org.springframework.http.MediaType;
import org.springframework.http.ResponseEntity;
import org.springframework.http.client.SimpleClientHttpRequestFactory;
import org.springframework.web.client.RestTemplate;

public static String callRestService(String serviceUrl, String strRQ) {
    SimpleClientHttpRequestFactory clientHttpRequestFactory = new SimpleClientHttpRequestFactory();
    // Connect timeout in milliseconds
    clientHttpRequestFactory.setConnectTimeout(1000);
    // Read timeout in milliseconds
    clientHttpRequestFactory.setReadTimeout(1000);

    RestTemplate restTemplate = new RestTemplate(clientHttpRequestFactory);
    HttpHeaders headers = new HttpHeaders();
    headers.setAccept(Arrays.asList(MediaType.APPLICATION_JSON));
    headers.setContentType(MediaType.APPLICATION_JSON);
    HttpEntity<String> entity = new HttpEntity<String>(strRQ, headers);
    ResponseEntity<String> response = restTemplate.exchange(serviceUrl, HttpMethod.POST, entity, String.class);
    return response.getBody();
}


Tuesday, August 24, 2021

TechBlog Links

As a good software engineer, one needs to stay updated on emerging technologies & the use cases where they apply.

To gain more knowledge, one should follow technology blogs; below are a few of my favorite ones:

1. https://blog.allegro.tech/

2. https://doordash.engineering/

3. https://engineering.cerner.com/

4. https://booking.design/

5. https://netflixtechblog.com/

6. https://medium.com/expedia-group-tech

7. https://comcast.github.io/blog.html

8. https://shekhargulati.com/

9. https://www.appsdeveloperblog.com/keycloak-rest-api-create-a-new-user/



Saturday, August 14, 2021

Routing Http Calls through Proxy

A proxy server is part of the network backbone of any corporate network. There are 2 types of proxy setup:

1. Forward Proxy: Used for outbound traffic going from your network to the Internet. It is also called a Client Side Proxy.

2. Reverse Proxy: Used for inbound calls where traffic comes from the Internet into your network.



In this article, we will discuss Forward Proxy setup & how to route calls through the Forward Proxy from Java HTTP client code.

Step 1: Forward Proxy Setup in Windows 

There are many open source Forward Proxies available, like Apache Httpd, Squid, etc.

I chose Squid for the proxy setup as it is very easy to set up.

First download Squid from https://squid.diladele.com/ & install the msi

This will be installed as a Windows service.

Step 2: Post Installation configuration of Squid

Once installed, you will find the Squid tray icon.

Click the "Open Squid Configuration" option.

Add the below line at the end of the configuration file; this will speed up traffic.

dns_v4_first on

Step 3: Client calls from Java routing through Proxy

Suppose you want to call https://www.google.com/ from Java Client.

In this example, we will use Spring RestTemplate.

Create a new Maven project.

3.1. Add the below dependencies

<dependency>
    <groupId>org.apache.httpcomponents</groupId>
    <artifactId>httpcore</artifactId>
    <version>4.4.13</version>
</dependency>

<!-- https://mvnrepository.com/artifact/org.springframework/spring-web -->
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-web</artifactId>
    <version>4.3.30.RELEASE</version>
</dependency>


3.2. Sample Code for Proxy call:

import java.net.InetSocketAddress;
import java.net.Proxy;
import java.net.Proxy.Type;
import org.springframework.http.ResponseEntity;
import org.springframework.http.client.SimpleClientHttpRequestFactory;
import org.springframework.web.client.RestTemplate;

public class ProxyHttpClient {

    private static String PROXY_SERVER_HOST = "localhost";
    private static int PROXY_SERVER_PORT = 3128;

    public static void main(String[] args) {
        Proxy proxy = new Proxy(Type.HTTP, new InetSocketAddress(PROXY_SERVER_HOST, PROXY_SERVER_PORT));
        SimpleClientHttpRequestFactory requestFactory = new SimpleClientHttpRequestFactory();
        requestFactory.setProxy(proxy);
        RestTemplate restTemplate = new RestTemplate(requestFactory);
        ResponseEntity<String> responseEntity = restTemplate.getForEntity("https://www.google.com/", String.class);
        String bodyStr = responseEntity.getBody();
        System.out.println("bodyStr:" + bodyStr);
    }
}

Links:

Squid Setup

Proxy Concept

Wednesday, June 9, 2021

Event Handling in Spring

Suppose you are working on an Order Management system. Once an order is placed, the system needs to do the following tasks:

1. Send email notification to customer

2. Send request to Payment processing system to make payment.

Generally, in the traditional way of programming, once the order is placed we call the below 2 methods:

sendEmailToCustomer()

makePayment()

Now suppose the product owner gives you a requirement to also send an email notification to the seller once the order is placed. To do that, you now need to introduce another method, sendEmailToSeller, along with the above 2 methods.

This approach has a drawback. If the order is created from multiple places, we need to introduce this change in all these places.

We can handle the same problem with an event-driven approach. We can consider order creation as an event; it becomes the producer of the event, & sending email to the customer, making the payment & sending email to the seller become the event consumers.

The Spring framework comes with in-built support for event-driven processing. It requires 3 elements for an event:

1. the Event itself

2. Publisher of the event

3. Consumer/Listener of the event

All of these are handled in Spring framework in an elegant way. 

Prerequisite: 

Java 8

Spring framework version: 4.3.30.RELEASE

Maven dependency:

<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-core</artifactId>
    <version>4.3.30.RELEASE</version>
</dependency>
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-context</artifactId>
    <version>4.3.30.RELEASE</version>
</dependency>

Event: The event can be any Java Bean model class; for brevity the getters & setters are removed. You can also add the @Getter & @Setter annotations from the Lombok library.

public class OrderEvent {
    private String itemName;
    private int quantity;
}

Event Publisher: Spring comes with a built-in ApplicationEventPublisher interface defined in org.springframework.context.

You can publish the event like below: 

@Service
public class OrderEventProducer {

    @Autowired
    private ApplicationEventPublisher publisher;

    public void publishTestEvent() {
        OrderEvent order = new OrderEvent();
        order.setItemName("Pen");
        order.setQuantity(5);
        System.out.println("Publishing order");
        publisher.publishEvent(order);
    }
}

Event Listener: Once the event is published, it can be consumed. The consumers are called event listeners. Spring comes with the below features for event listeners/consumers:
1. A consumer can be asynchronous: add the @EnableAsync annotation at class level and the @Async annotation on the method that should run asynchronously.
2. When there are multiple consumers, their order can be set with the @Order annotation.
3. Any method can be marked as an event listener with the @EventListener annotation.
4. The listener method must take the same event type as parameter as was published via ApplicationEventPublisher.

The code snippet will look like below.

@Component
@EnableAsync
public class OrderEventListener {
@Async
@EventListener
@Order(1)
public void sendEmailToCustomer(OrderEvent event) {
System.out.println("Starting email sending");
delay();
System.out.println("sendEmail:" + event);
}

@EventListener
@Order(2)
public void makePayment(OrderEvent event) {
System.out.println("makePayment:" + event);
}

private void delay() {
    try {
        TimeUnit.SECONDS.sleep(10);
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
}
}

Now, you can easily add other consumer methods for the same event, taking OrderEvent as parameter. No code change is required at the producer end.

Code Example:


Further Reading:


Tuesday, June 8, 2021

Caching using Hazelcast

Caching is one of the important aspects of system design, as it enhances performance.

In many of my applications I have used Ehcache as the cache provider with Spring applications. One of the problems with Ehcache is that the cache resides in a single node.

Let's consider the below scenario:

Suppose we have an application with a method that provides Book details for the ISBN given as input. The findBookByIsbn method is costly, so a cache is implemented.

Now, we call findBookByIsbn for isbn 1 and, through the load balancer, the call goes to Node 1, which fetches the data from the database and stores it in its cache.

Then another call is made to findBookByIsbn for isbn 1, and this time the load balancer routes it to Node 2. Node 2 again fetches the data from the database and stores it in its cache.

Hence, for the same data (isbn=1) the database call is made again, because the cache resides in each node separately.

The architecture is depicted in the image below.
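The duplicated database call can be sketched in plain Java (a hypothetical simulation: two HashMaps stand in for the two node-local caches, and a counter stands in for the expensive DB call):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.atomic.AtomicInteger;

// Simulates the per-node cache problem: Node 1 and Node 2 each keep a
// private cache, so the same ISBN triggers the costly DB load twice.
public class PerNodeCacheDemo {
    static final AtomicInteger dbCalls = new AtomicInteger();

    static String loadBookFromDb(String isbn) {
        dbCalls.incrementAndGet();          // simulate the costly DB call
        return "Book-" + isbn;
    }

    static String findBookByIsbn(Map<String, String> nodeLocalCache, String isbn) {
        // Cache-aside: return the cached value, or load and cache it on a miss.
        return nodeLocalCache.computeIfAbsent(isbn, PerNodeCacheDemo::loadBookFromDb);
    }

    public static void main(String[] args) {
        Map<String, String> node1Cache = new HashMap<>();   // cache on Node 1
        Map<String, String> node2Cache = new HashMap<>();   // cache on Node 2

        findBookByIsbn(node1Cache, "1");    // routed to Node 1: DB hit
        findBookByIsbn(node2Cache, "1");    // same ISBN routed to Node 2: DB hit again
        System.out.println("DB calls for isbn=1: " + dbCalls.get());
    }
}
```

With a replicated cache, the second lookup would be served from Node 2's copy of the data instead of hitting the database.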



Now, you can solve this problem by creating an Embedded Distributed Cache (aka Replicated Cache). In this case, the cache on Node 1 interacts with Node 2 and replicates the data. The architecture will look like below:





This technique can be implemented using Ehcache with JGroups.


EHCache Replicated Cache Tutorial Links:


As the Ehcache-JGroups combination needs a lot of manual configuration, another good alternative is Hazelcast. In this note, I am going to give you the steps required to use Hazelcast as the cache manager.

Pre-requisite:

Required Java version: 8

Spring Framework version used: 4.3.30.RELEASE

The cached data type should implement the Serializable interface
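The Serializable requirement matters because Hazelcast serializes cached values to store and replicate them across nodes. A hypothetical Book model satisfying it could look like this (the class and its fields are illustrative, not from the demo repo):

```java
import java.io.Serializable;

// Illustrative cached value type; implements Serializable so Hazelcast
// can serialize it for storage and replication across cluster members.
public class Book implements Serializable {
    private static final long serialVersionUID = 1L;

    private final String isbn;
    private final String title;

    public Book(String isbn, String title) {
        this.isbn = isbn;
        this.title = title;
    }

    public String getIsbn() { return isbn; }
    public String getTitle() { return title; }
}
```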

Step #1: Add the Maven dependencies for Hazelcast Spring integration & Spring context support

<dependency>
    <groupId>com.hazelcast</groupId>
    <artifactId>hazelcast-spring</artifactId>
    <version>4.2</version>
</dependency>
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-context-support</artifactId>
    <version>4.3.30.RELEASE</version>
</dependency>

Step #2: Define the method. The method must be defined in a Spring bean class (a class annotated with @Service/@Component, or defined in XML)

@Cacheable("bookIsbnCache")
public Book findBookByIsbn(String isbn) {
    // DB / Service call goes here
}

Step #3: Define the cache in application context xml

<beans xmlns="http://www.springframework.org/schema/beans"
    xmlns:context="http://www.springframework.org/schema/context"
    xmlns:p="http://www.springframework.org/schema/p"
    xmlns:cache="http://www.springframework.org/schema/cache"
    xmlns:hz="http://www.hazelcast.com/schema/spring"
    xmlns:mvc="http://www.springframework.org/schema/mvc"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="
        http://www.springframework.org/schema/beans
        http://www.springframework.org/schema/beans/spring-beans.xsd
        http://www.springframework.org/schema/context
        http://www.springframework.org/schema/context/spring-context.xsd
        http://www.springframework.org/schema/mvc
        http://www.springframework.org/schema/mvc/spring-mvc.xsd
        http://www.springframework.org/schema/cache
        http://www.springframework.org/schema/cache/spring-cache.xsd
        http://www.hazelcast.com/schema/spring
        http://www.hazelcast.com/schema/spring/hazelcast-spring.xsd">

<!-- Other bean definition-->

<cache:annotation-driven
cache-manager="cacheManager" />

<hz:hazelcast id="instance">
<hz:config>

<hz:cluster-name>TestHzCluster</hz:cluster-name>

<!--  used for clustering.
<hz:network port="5701" port-auto-increment="false">
<hz:join>
<hz:multicast enabled="false" />
<hz:tcp-ip enabled="true">
<hz:members>x.x.x.x, y.y.y.y</hz:members>
</hz:tcp-ip>
</hz:join>
</hz:network>
-->

<hz:map name="bookIsbnCache" time-to-live-seconds="60"
in-memory-format="BINARY">
<hz:eviction eviction-policy="LRU"
max-size-policy="PER_NODE" size="100" />
</hz:map>
</hz:config>
</hz:hazelcast>

<bean id="cacheManager"
class="com.hazelcast.spring.cache.HazelcastCacheManager">
<constructor-arg ref="instance" />
</bean>

</beans>

That's it. You can deploy your code on different ports on localhost, and you will be able to see that the cache is replicated among the different nodes.

Below is the link for working demo:

https://github.com/souravdalal/SpringHazelcastCacheDemo

Further Reading:

Cache Topologies:




Hazelcast with Spring Boot:


