We covered the Elastic Stack, which consists of Logstash, Elasticsearch, and Kibana. Together, these three components handle data collection, processing, analysis, and visualization. We also covered the installation and configuration of all Elastic Stack components.
We configured Logstash to collect logs from the Linux authentication log file, process them to extract the message and timestamp, and either store the results in a CSV file or send them to Elasticsearch and Kibana for later analysis and visualization.
The Elastic Stack can be used for both data analytics and cyber security incident analysis, similarly to Splunk. We used the lab material from the TryHackMe Logstash: Data Processing Unit room.
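The pipeline described above can be sketched roughly as follows. This is a hedged illustration, not the exact lab configuration: the file path is the standard Debian/Ubuntu auth log location, and the grok pattern is an assumed example of splitting out the timestamp and message.

```
# Illustrative sketch: read the Linux auth log, extract the timestamp
# and message, and ship the events to Elasticsearch.
input {
  file {
    path => "/var/log/auth.log"    # 'path' is the required option of the file input
    start_position => "beginning"
  }
}
filter {
  grok {
    # Assumed pattern: split a syslog-style line into timestamp and message
    match => { "message" => "%{SYSLOGTIMESTAMP:timestamp} %{GREEDYDATA:log_message}" }
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]   # default Elasticsearch port
  }
}
```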
Blue Team Study Notes
The Complete Hands-On Web Application Penetration Testing Course
Highlights
What is the Elastic Stack?
The Elastic Stack is a collection of open-source components linked together to help users take data from any source, in any format, and search, analyze, and visualize it in real time.
Elasticsearch
Elasticsearch is a full-text search and analytics engine used to store JSON-formatted documents. It is a key component used to store data, analyze it, perform data correlation, and more.
It is built on top of Apache Lucene and provides a scalable solution for full-text search, structured querying, and data analysis.
Elasticsearch exposes RESTful APIs for interacting with the data.
Logstash
Logstash is a data-processing engine used to take data from different sources, apply filters to it or normalize it, and then send it on to a destination, which could be Kibana or a listening port.
Kibana
Kibana is a web-based data visualization tool that works with Elasticsearch to analyze, investigate, and visualize data streams in real time. It lets users build multiple visualizations and dashboards for better visibility.
Room Answers
What is the default port Elasticsearch runs on?
9200
What version of Elasticsearch have we installed in this task?
8.8.1
What is the command used to check the status of Elasticsearch service?
systemctl status elasticsearch.service
What is the default value of the network.host variable found in the elasticsearch.yml file?
192.168.0.1
What is the configuration reload interval set by default in logstash.yml?
3s
What is the Logstash version we just installed?
8.8.1
What is the default port Kibana runs on?
5601
How many files are found in /etc/kibana/ directory?
3
Which plugin is used to perform mutation of event fields?
Mutate
Which plugin can be used to drop events based on certain conditions?
Drop
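As a sketch of how the drop plugin is used conditionally (the field name and value tested here are assumptions for illustration):

```
filter {
  # Drop any event whose log level field is DEBUG (field name illustrative)
  if [loglevel] == "DEBUG" {
    drop { }
  }
}
```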
Which plugin parses unstructured log data using custom patterns?
Grok
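A minimal grok sketch, assuming a simple "LEVEL message" log line; the field names are illustrative:

```
filter {
  grok {
    # Parse e.g. "ERROR disk full" into loglevel and msg fields
    match => { "message" => "%{LOGLEVEL:loglevel} %{GREEDYDATA:msg}" }
  }
}
```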
Is index a mandatory option in Elasticsearch plugin? (yay / nay)
no
Look at the file input plugin documentation; which field is required in the configuration?
path
Look at the filter documentation for CSV; what is the third field option mentioned?
Columns
Which output plugin is used in the above example?
elasticsearch
If we want to read a constant stream from a TCP port 5678, which input plugin will be used?
tcp
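Such a TCP listener could be configured as:

```
input {
  tcp {
    port => 5678   # listen for a constant stream on TCP port 5678
  }
}
```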
According to the Logstash documentation, the codec plugins are used to change the data representation. Which codec plugin is used for the CSV based data representation?
csv
Which filter plugin is used to remove empty fields from the events?
prune
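A hedged sketch of prune in use; the field name is an assumption, and the `blacklist_values` option (a hash of field name to regex) removes fields whose value matches the pattern:

```
filter {
  prune {
    # Remove the field if its value is an empty string (field name illustrative)
    blacklist_values => { "src_ip" => "^$" }
  }
}
```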
Which filter plugin is used to rename, replace, and modify event fields?
mutate
Add the plugin name to rename the field 'src_ip' to 'source_ip', and give the fixed third line.
filter {
  mutate {
    plugin_name = { "field_1" = "field_2" }
  }
}
rename => { "src_ip" => "source_ip" }
Can we use multiple output plugins at the same time in order to send data to multiple destinations? (yay / nay)
yay
Which Logstash output plugin is used to print events to the console?
stdout
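For example, printing events to the console for debugging:

```
output {
  stdout {
    codec => rubydebug   # pretty-print each event to the console
  }
}
```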
Update the following configuration to send the event to the Syslog server.
output {
  plugin {
    field1 => "my_host.com"
    field2 => 514
  }
}
syslog,host,port
Which command is used to run the logstash.conf configuration from the command line?
logstash -f logstash.conf
What will be the input, filter, and output plugins in the below-mentioned configuration if we want to get the comma-separated values from the command line, and apply the filter and output back to the terminal to test the result?
input {
  input_plugin {}
}
filter {
  filter_plugin {}
}
output {
  output_plugin {}
}
stdin,csv,stdout
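Filled in with the answer plugins, the test pipeline would look like:

```
input {
  stdin { }    # read comma-separated values typed on the terminal
}
filter {
  csv { }      # parse each line as CSV
}
output {
  stdout { }   # print the result back to the terminal
}
```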
Video Tutorial