Integrate ELK stack into Spring Boot application

Salitha Chathuranga
8 min read · Sep 6, 2022


Let’s learn to work with Elasticsearch, Logstash and Kibana

Hello guys! Have you heard about the ELK stack before? Simply put, ELK stands for Elasticsearch, Logstash, and Kibana.

  • Elasticsearch is a NoSQL database that is based on the Lucene search engine.
  • Logstash is a log pipeline tool that accepts inputs from various sources, executes different transformations, and exports the data to various targets. It is a dynamic data collection pipeline with an extensible plugin ecosystem and strong Elasticsearch synergy.
  • Kibana is a visualization UI layer that works on top of Elasticsearch.

We can integrate this stack with any back-end application to achieve centralized logging. ELK lets us analyze logs more efficiently and supports more complex search scenarios: it provides log aggregation and efficient searching.

This article will demonstrate how you can integrate ELK Stack with a Spring Boot application to collect, process, store, and view the logs.

ELK Stack Architecture

Create a Spring Boot app with logging

I will create just 2️⃣ simple REST endpoints. Their only purpose is to generate logs when those APIs are called.

Create a fresh project with the starter-web dependency.

I have added Log4j2 for logging and removed the default logging that comes with the starter-web dependency, using a Maven exclusion.

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.7.3</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>
    <groupId>com.elk</groupId>
    <artifactId>spring-boot-elkdemo</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>spring-boot-elkdemo</name>
    <description>Demo project for Spring Boot ELK stack</description>
    <properties>
        <java.version>1.8</java.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
            <exclusions>
                <exclusion>
                    <groupId>org.springframework.boot</groupId>
                    <artifactId>spring-boot-starter-logging</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-log4j2</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>

</project>

Controller:

import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class ElkDemoController {

    private static final Logger logger = LogManager.getLogger(ElkDemoController.class);

    @GetMapping(path = "/welcome")
    public ResponseEntity<String> welcome() {
        logger.debug("Welcome to ELK demo service");
        return ResponseEntity.ok("Hello ELK Integration!!!");
    }

    @GetMapping(path = "/users/{name}")
    public ResponseEntity<String> getUserByName(@PathVariable String name) {
        if (name.equals("ADMIN")) {
            logger.debug("Access by ADMIN triggered");
            return ResponseEntity.ok("Access Granted to " + name);
        } else {
            logger.error("Access denied for: " + name);
            return new ResponseEntity<>(("Access Denied for " + name), HttpStatus.BAD_REQUEST);
        }
    }
}
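
With the app running (Spring Boot serves on port 8080 by default), you can hit both endpoints to generate logs, for example with curl:

curl http://localhost:8080/welcome
curl http://localhost:8080/users/ADMIN
curl http://localhost:8080/users/guest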

Logging Config:

Create a log4j2.xml file inside the resources folder. It defines the log patterns, how log files should be created, how console logs should be shown, and more.

RollingFile is an appender that rotates the log file. I have added a size-based constraint: when a log file grows beyond 10MB, a new file is created. The rolled file name format will be elkdemo-<date>.log.

<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="DEBUG">
    <Properties>
        <Property name="LOG_PATTERN">
            [%-5level] %d{yyyy-MM-dd HH:mm:ss.SSS} [%t] %c{1} - %msg%n
        </Property>
        <Property name="BASE_PATH">/home/salitha/Desktop/elk/logs</Property>
    </Properties>
    <Appenders>
        <Console name="ConsoleAppender" target="SYSTEM_OUT" follow="true">
            <PatternLayout pattern="${LOG_PATTERN}"/>
        </Console>
        <RollingFile name="FileAppender" fileName="${BASE_PATH}/elkdemo.log"
                     filePattern="${BASE_PATH}/elkdemo-%d{yyyy-MM-dd}.log">
            <PatternLayout>
                <Pattern>${LOG_PATTERN}</Pattern>
            </PatternLayout>
            <Policies>
                <SizeBasedTriggeringPolicy size="10MB"/>
            </Policies>
            <DefaultRolloverStrategy max="10"/>
        </RollingFile>
    </Appenders>
    <Loggers>
        <Logger name="com.elk.demo" level="DEBUG" additivity="false">
            <AppenderRef ref="FileAppender"/>
            <AppenderRef ref="ConsoleAppender"/>
        </Logger>
        <Root level="DEBUG">
            <AppenderRef ref="FileAppender"/>
        </Root>
    </Loggers>
</Configuration>

🔴 You can change the log XML settings as you want. For example, you can set it up to log only the error logs. Currently I have enabled debug logs as well, to have more logs in Kibana. Also explore log patterns: they let you change the way logs are displayed.
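
For example, to keep only error logs, you could raise the logger level in the XML above:

<!-- Only ERROR (and more severe) events will be logged -->
<Logger name="com.elk.demo" level="ERROR" additivity="false">
    <AppenderRef ref="FileAppender"/>
    <AppenderRef ref="ConsoleAppender"/>
</Logger>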

When we call any API, log lines should be written into the file we configured. According to my log XML, it will be /home/salitha/Desktop/elk/logs/elkdemo.log. Logs should be visible in the console too, like this:
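
For illustration, a line produced by the LOG_PATTERN above looks roughly like this (the timestamp and thread name will vary):

[DEBUG] 2022-09-06 10:15:42.123 [http-nio-8080-exec-1] ElkDemoController - Welcome to ELK demo service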

Now we have a Spring Boot app with logging. Let’s start the ELK stack on our machines and connect it. 💪

🔴 NOTE: Download Elasticsearch, Logstash, and Kibana from the Elastic website and extract each archive. The steps below are run from inside each extracted folder.

Start Elasticsearch

  • Go inside the /bin folder and run the relevant executable. Since I’m using Linux, the command is:
./elasticsearch
  • Open http://localhost:9200 in your browser. You should get a response like this, which means ES is up! 😎
{
  "name": "salitha-X542UQ",
  "cluster_name": "elasticsearch",
  "cluster_uuid": "twoAo9udTI6FzQuzVd1FOw",
  "version": {
    "number": "7.14.1",
    "build_flavor": "default",
    "build_type": "tar",
    "build_hash": "66b55ebfa59c92c15db3f69a335d500018b3331e",
    "build_date": "2021-08-26T09:01:05.390870785Z",
    "build_snapshot": false,
    "lucene_version": "8.9.0",
    "minimum_wire_compatibility_version": "6.8.0",
    "minimum_index_compatibility_version": "6.0.0-beta1"
  },
  "tagline": "You Know, for Search"
}

Start Kibana

  • We have to let Kibana know where our ES cluster is. Find the config/kibana.yml file inside the extracted folder, then uncomment this line:
elasticsearch.hosts: ["http://localhost:9200"]
  • Go inside the /bin folder and run the relevant executable to start Kibana:
./kibana
  • Open http://localhost:5601. You should see the beautiful Kibana dashboard UI now! 😎
Kibana dashboard home page

Start Logstash

  • We need to configure Logstash to read our log file and ship it to ES under an index, so that Kibana can visualize it.
  • Go inside /config in the extracted Logstash folder. Create a file called logstash.conf and add this configuration:
input {
  file {
    path => "/home/salitha/Desktop/elk/logs/elkdemo.log"
    start_position => "beginning"
  }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "elkdemo"
  }
}

— input: where we get the logs from. It should be the log file written by our Spring Boot application, so we put the path of that log file here.

🔴 IMPORTANT: the input file path here and the log file path in the Spring Boot app’s log4j2 XML must be the same!

— output: where we send the logs. It should be Elasticsearch, since we are going to view them through Kibana later. The codec here formats the console output. We provide the hosts for ES, and index is the name of the index that will store the logs; I have named it elkdemo.
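
As an optional extension (not part of the original setup), you could add a filter block between input and output with a grok pattern that matches the LOG_PATTERN from log4j2.xml, so the level, thread, and class arrive in Elasticsearch as separate searchable fields. A sketch:

filter {
  grok {
    # Matches lines like: [DEBUG] 2022-09-06 10:15:42.123 [thread] ClassName - message
    match => { "message" => "\[%{LOGLEVEL:level}%{SPACE}\] %{TIMESTAMP_ISO8601:timestamp} \[%{DATA:thread}\] %{DATA:class} - %{GREEDYDATA:log_message}" }
  }
}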

  • Go back to the /bin folder and start Logstash with our conf file. The command is:
./logstash -f ../config/logstash.conf

🔴 Call our APIs now and see what happens. As soon as you call any API, you will see Logstash writing all the logs to the console in JSON format, as we expected. 😍
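
With the rubydebug codec, each event printed to the console looks roughly like this (field values here are illustrative):

{
       "message" => "[DEBUG] 2022-09-06 10:15:42.123 [http-nio-8080-exec-1] ElkDemoController - Welcome to ELK demo service",
      "@version" => "1",
    "@timestamp" => 2022-09-06T04:45:42.345Z,
          "path" => "/home/salitha/Desktop/elk/logs/elkdemo.log",
          "host" => "salitha-X542UQ"
}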

Let’s go and setup kibana dashboard index. 💪

Create Index Pattern

Now we have Logstash connected to the application. Since our ELK stack is up and running, we should be able to see these logs on the Kibana dashboard!

In ES, we record data into indices, and Kibana needs an index pattern defined to show that data. Let’s create an index pattern for our index, elkdemo.

Go to Management > Stack Management > Kibana > Index Patterns from the left sidebar in the Kibana UI. When we enter the index name, it will be suggested automatically; that is because Logstash, with our custom conf, has already created an index called elkdemo from the input log file. Look at the image below…

Then select the suggested index and click Next. Select the timestamp field from the dropdown, click Next again, and save the pattern. After creation, it is saved under Index Patterns in Kibana.

Discover application logs

Go to the “Discover” page from the menu. On the left side, you will get a dropdown to select the pattern; select the elkdemo pattern there.

There you go! You have the application logs! 😍❤️

Let’s filter some logs. I filtered the logs containing the word “welcome”. Click “Add filter” and select the message field. The operator should be “is”; enter the word in the next input. All matching logs will then be shown with the word welcome highlighted.

ES dashboard for elkdemo index

We can expand each log and see details…

Expanded ES log event

If you want only the message field content to be shown for all logs, select the message field under “Available Fields”. Then you will see only the message content.

Logs in which only the message field is selected for displaying

Query in ES

We can query the logs using Kibana Dev Tools. Select the “Dev Tools” option from the left sidebar. Elasticsearch has its own query language, the Query DSL. The link below provides a query guide: https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl.html

I will execute a very simple query: retrieving all the logs in the index. Enter this query in Dev Tools. A search query is a GET call with the format indexName/_search:

GET /elkdemo/_search

What did you get? In ES, results normally come back under the “hits” object in the response. Since we query all logs, we do not need to send a query body with the GET call. See the image below; you will see the logs stored under the hits object.
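
For reference, the response body has roughly this shape (values here are illustrative):

{
  "took": 3,
  "timed_out": false,
  "hits": {
    "total": { "value": 12, "relation": "eq" },
    "max_score": 1.0,
    "hits": [
      {
        "_index": "elkdemo",
        "_id": "abc123",
        "_score": 1.0,
        "_source": {
          "message": "[DEBUG] 2022-09-06 10:15:42.123 [http-nio-8080-exec-1] ElkDemoController - Welcome to ELK demo service",
          "@timestamp": "2022-09-06T04:45:42.345Z",
          "path": "/home/salitha/Desktop/elk/logs/elkdemo.log"
        }
      }
    ]
  }
}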

Next, let’s see the query to retrieve all logs that include the word welcome. This time we have to provide a query body with a filter.

GET /elkdemo/_search
{
  "query": {
    "bool": {
      "filter": [
        {
          "term": {
            "message": "welcome"
          }
        }
      ]
    }
  }
}
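
As another illustration beyond the article’s example, a match query scores documents by relevance instead of just filtering:

GET /elkdemo/_search
{
  "query": {
    "match": {
      "message": "access denied"
    }
  }
}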

Like this, you can slice the data with queries however you want; explore querying further on your own. It is another way of exploring logs, besides the Kibana dashboard UI. ES querying is really a separate topic of its own, so I won’t go deeper here, but this shows a taste of what ES queries can do.

That’s all guys! 💪 This is how we can integrate a Spring Boot app with the ELK stack! It can be any other web app as well: if you can enable back-end logs, you can connect it to the ELK stack. It doesn’t always have to be Java. After integrating ELK, we can see all our logs in one centralized place. Isn’t that great? 😎 We can even create a separate index for each microservice and view their logs in Kibana, and then we have our own logging dashboard! 😎 So, try to set up ELK with a Spring Boot app and do some work with logging. Let me know your thoughts after that.

Bye guys!!! ❤️
