
How to use Elasticsearch in Linux for log analysis and search

王林
Release: 2023-06-18 09:01:49

In today's Internet era, we are faced with a huge amount of data, especially in servers and applications. Logs are an essential way to manage this data and help us better understand what is happening to our applications and servers. Elasticsearch is a popular tool for log aggregation, analysis, and search. Its high scalability and adaptability make it a leader in data processing and log analysis. In this article, we will learn how to use Elasticsearch in Linux for log analysis and search.

1. Install Elasticsearch

The easiest way to install Elasticsearch is to add Elastic's official package repository and install Elasticsearch from it. How you add the repository depends on the Linux distribution you are using. On Ubuntu, you can use the following commands:

$ wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
$ sudo apt-get install apt-transport-https
$ echo "deb https://artifacts.elastic.co/packages/7.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-7.x.list
$ sudo apt-get update && sudo apt-get install elasticsearch
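After the installation completes, you can enable and start the service and check that it responds. A quick sanity check, assuming a systemd-based Ubuntu system:

$ sudo systemctl enable elasticsearch
$ sudo systemctl start elasticsearch
# Elasticsearch may take a few seconds to start before it responds
$ curl -X GET "localhost:9200"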
2. Configure Elasticsearch

By default, Elasticsearch listens on localhost on ports 9200 (the HTTP API) and 9300 (inter-node transport), but you can change this. The configuration file is located at /etc/elasticsearch/elasticsearch.yml. In this file, you can configure settings such as the cluster name, node name, listening address, and cluster discovery.

As an example, the following is a simple Elasticsearch configuration file:

cluster.name: my_cluster_name
node.name: my_node_name
path.data: /var/lib/elasticsearch
path.logs: /var/log/elasticsearch
network.host: 127.0.0.1
http.port: 9200
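Changes to elasticsearch.yml only take effect after a restart. To apply the configuration and confirm the node is healthy:

$ sudo systemctl restart elasticsearch
$ curl -X GET "localhost:9200/_cluster/health?pretty"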
3. Import log files

There are two ways to import log data into Elasticsearch: importing it manually, or using Logstash. In this article, we will use Logstash to import logs.

The easiest way to install Logstash is also from Elastic's repository. Assuming you are running Elasticsearch on an Ubuntu system, you can install Logstash using the following commands:

$ echo "deb https://artifacts.elastic.co/packages/7.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-7.x.list
$ sudo apt-get update && sudo apt-get install logstash

After the installation is complete, create a file with a ".conf" extension in the /etc/logstash/conf.d directory; this file defines how the imported log data is handled. The following is a simple example configuration file:

input {
  file {
    path => "/var/log/myapp.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }

  date {
    match => [ "timestamp", "dd/MMM/YYYY:HH:mm:ss Z" ]
  }

  geoip {
    source => "clientip"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  stdout { codec => rubydebug }
}

In the configuration file, we specify the path of the log file to read, set start_position to "beginning" so the file is read from the start, and point sincedb_path at /dev/null so Logstash keeps no record of how far it has read. In the filter section, we parse each line with the grok COMBINEDAPACHELOG pattern, parse the timestamp with the date filter, and resolve the client IP address with the geoip filter; the output section sends the results to Elasticsearch and echoes them to stdout for debugging.
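Before starting Logstash, it is worth validating the pipeline file. A minimal sketch, assuming the configuration above was saved as /etc/logstash/conf.d/myapp.conf (a file name chosen here for illustration) on a systemd-based system:

# Check the pipeline configuration for syntax errors, then exit
$ sudo /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/myapp.conf --config.test_and_exit

# Start the Logstash service, which loads every pipeline under /etc/logstash/conf.d
$ sudo systemctl start logstash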

4. Search and analyze logs

Once you have imported your log data into Elasticsearch, you can use Elasticsearch's query and aggregation capabilities to search and analyze the data. Elasticsearch's REST API provides a variety of query and aggregation tools that can be called using curl, Postman, or any other REST client.

The following is an example of a basic search query that returns all log entries whose message field contains "error" or "exception":

curl -X GET "localhost:9200/_search?q=message:error%20OR%20message:exception&filter_path=hits.hits._source"

For more advanced searches, for example searching a specific field or filtering results with regular expressions, you can use Elasticsearch's own query language, the Query DSL. Here is an example of a more advanced search query:

{
  "query": {
    "regexp": {
      "message": "WARN.*"
    }
  }
}
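To run a Query DSL request like this one, send the JSON body to the _search endpoint, for example with curl:

curl -X GET "localhost:9200/_search?pretty" -H 'Content-Type: application/json' -d'
{
  "query": {
    "regexp": {
      "message": "WARN.*"
    }
  }
}'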

The regular expression "WARN.*" matches all log messages that begin with "WARN".
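Aggregations are used in the same way. As a minimal sketch, the following counts log entries per HTTP status code, assuming the response field produced by the COMBINEDAPACHELOG pattern was indexed with Elasticsearch's default dynamic mapping (which adds a .keyword sub-field):

curl -X GET "localhost:9200/_search?pretty" -H 'Content-Type: application/json' -d'
{
  "size": 0,
  "aggs": {
    "status_codes": {
      "terms": { "field": "response.keyword" }
    }
  }
}'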

Conclusion

In this article, we walked through how to use Elasticsearch in Linux for log analysis and search. Elasticsearch is a powerful tool that can help us process and analyze large amounts of log data, which is very useful when troubleshooting failures, detecting potential problems early, or simply understanding what is happening in our applications and servers.
