We covered and explained the Elastic Stack, which consists of Logstash, Elasticsearch, and Kibana. Together, the three components handle data collection, data processing, analysis, and data visualization. We also covered the installation and configuration of all Elastic Stack components.
We configured Logstash to collect logs from the Linux authentication log file, process them to extract the message and the timestamp, and either store the results in a CSV file or send them to Elasticsearch and Kibana for later analysis and visualization.
The Elastic Stack can be used for both data analytics and cyber security incident analysis, similarly to Splunk. We used the lab material from the TryHackMe Logstash: Data Processing Unit room.
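As a reference, below is a minimal sketch of the kind of pipeline described above. It assumes the standard path /var/log/auth.log and a local, unsecured Elasticsearch instance; the CSV output path and index name are illustrative, not taken from the room.

input {
  file {
    path => "/var/log/auth.log"               # Linux authentication log
    start_position => "beginning"             # read the file from the start
  }
}
filter {
  grok {
    # split each line into a syslog timestamp and the remaining message
    match => { "message" => "%{SYSLOGTIMESTAMP:timestamp} %{GREEDYDATA:log_message}" }
  }
}
output {
  csv {
    path => "/tmp/auth_logs.csv"              # hypothetical CSV destination
    fields => ["timestamp", "log_message"]
  }
  elasticsearch {
    hosts => ["http://localhost:9200"]        # default Elasticsearch port
    index => "auth_logs"                      # illustrative index name
  }
}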
Highlights
What is Elastic Stack?
The Elastic Stack is a collection of open-source components linked together to help users take data from any source, in any format, and search, analyze, and visualize it in real time.
Elastic Search
Elasticsearch is a full-text search and analytics engine used to store JSON-formatted documents. It is an important component used to store and analyze data and to perform correlations on it.
It is built on top of Apache Lucene and provides a scalable solution for full-text search, structured querying, and data analysis.
Elasticsearch exposes a RESTful API for interacting with the data.
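Since Elasticsearch listens on port 9200 by default (see the room answers below), the API can be queried directly from the shell. This sketch assumes a local instance with security disabled; an Elasticsearch 8.x install with security enabled would need HTTPS and credentials instead.

curl http://localhost:9200                   # cluster name and version info
curl "http://localhost:9200/_cat/indices?v"  # list the stored indices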
Logstash
Logstash is a data processing engine used to take data from different sources, apply filters to it or normalize it, and then send it to a destination such as Elasticsearch or a listening port.
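Every Logstash pipeline follows the same three-stage skeleton. A minimal sketch that reads events typed at the terminal and prints them back out:

input {
  stdin {}      # read events from standard input
}
filter {
  # filters are optional; without one, events pass through unchanged
}
output {
  stdout {}     # write events to standard output
}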
Kibana
Kibana is a web-based data visualization tool that works with Elasticsearch to analyze, investigate, and visualize data streams in real time. It allows users to create multiple visualizations and dashboards for better visibility.
Room Answers
What is the default port Elasticsearch runs on?
9200
What version of Elasticsearch did we install in this task?
8.8.1
What is the command used to check the status of Elasticsearch service?
systemctl status elasticsearch.service
What is the default value of the network.host variable found in the elasticsearch.yml file?
192.168.0.1
What is the configuration reload interval set by default in logstash.yml?
3s
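For reference, the corresponding settings in logstash.yml look like the following; the values shown are the documented defaults, with automatic reloading disabled unless switched on:

config.reload.automatic: false    # reload the pipeline when the config changes
config.reload.interval: 3s        # how often to check for changes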
What is the Logstash version we just installed?
8.8.1
What is the default port Kibana runs on?
5601
How many files are found in /etc/kibana/ directory?
3
Which plugin is used to perform mutation of event fields?
Mutate
Which plugin can be used to drop events based on certain conditions?
Drop
Which plugin parses unstructured log data using custom patterns?
Grok
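A short sketch combining the three filter plugins above; the grok pattern, field names, and drop condition are illustrative:

filter {
  grok {
    # parse unstructured text with a custom pattern
    match => { "message" => "%{IP:src_ip} %{WORD:action}" }
  }
  mutate {
    # rename a field extracted by grok
    rename => { "src_ip" => "source_ip" }
  }
  if [action] == "DEBUG" {
    drop {}     # discard events matching the condition
  }
}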
Is index a mandatory option in Elasticsearch plugin? (yay / nay)
nay
Look at the file input plugin documentation; which field is required in the configuration?
path
Look at the filter documentation for CSV; what is the third field option mentioned?
Columns
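A sketch of the csv filter using the columns option; the column names are illustrative:

filter {
  csv {
    separator => ","
    columns => ["timestamp", "source_ip", "action"]    # names for the parsed fields
  }
}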
Which output plugin is used in the above example?
elasticsearch
If we want to read a constant stream from a TCP port 5678, which input plugin will be used?
TCP
According to the Logstash documentation, codec plugins are used to change the data representation. Which codec plugin is used for CSV-based data representation?
csv
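Combining the last two answers, a sketch of an input that reads a constant stream from TCP port 5678 and decodes each line as CSV; note that the csv codec may need to be installed separately as a plugin:

input {
  tcp {
    port => 5678        # listening port from the question above
    codec => "csv"      # decode incoming lines as CSV
  }
}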
Which filter plugin is used to remove empty fields from the events?
prune
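A sketch of one way to use prune; here the blacklist_values option drops a field when its value is empty, with the field name illustrative:

filter {
  prune {
    # remove "source_ip" when its value matches the empty-string pattern
    blacklist_values => [ "source_ip", "^$" ]
  }
}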
Which filter plugin is used to rename, replace, and modify event fields?
mutate
Add the plugin option name needed to rename the field 'src_ip' to 'source_ip', i.e. the fixed third line of the configuration below.
filter {
  mutate {
    plugin_name = { "field_1" = "field_2" }
  }
}
rename => { "src_ip" => "source_ip" }
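Substituting the answer into the skeleton gives the completed filter:

filter {
  mutate {
    rename => { "src_ip" => "source_ip" }
  }
}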
Can we use multiple output plugins at the same time in order to send data to multiple destinations? (yay / nay)
yay
Which Logstash output plugin is used to print events to the console?
stdout
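A sketch using stdout with the rubydebug codec, which pretty-prints each event for debugging:

output {
  stdout {
    codec => rubydebug    # human-readable, pretty-printed events
  }
}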
Update the following configuration to send the event to the Syslog server.
output {
  plugin {
    field1 => "my_host.com"
    field2 => 514
  }
}
syslog,host,port
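Substituting the answers into the template gives:

output {
  syslog {
    host => "my_host.com"    # placeholder hostname from the template
    port => 514
  }
}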
Which command is used to run the logstash.conf configuration from the command line?
logstash -f logstash.conf
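Logstash can also validate a configuration file without running the pipeline, using the documented --config.test_and_exit flag:

logstash -f logstash.conf --config.test_and_exit    # check the syntax, then exit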
What will be the input, filter, and output plugins in the below-mentioned configuration if we want to read comma-separated values from the command line, apply the filter, and output the result back to the terminal for testing?
input {
  input_plugin {}
}
filter {
  filter_plugin {}
}
output {
  output_plugin {}
}
stdin,csv,stdout
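Filling in the plugin names from the answer gives:

input {
  stdin {}      # read comma-separated values typed at the terminal
}
filter {
  csv {}        # parse each line as CSV
}
output {
  stdout {}     # print the parsed result back to the terminal
}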
Video Walkthrough