Update Custom Logs Package Readme (#12222)

* update docs
* add pr link to changelog
* update manifest.yml
* Apply suggestions from code review

Co-authored-by: Mike Birnstiehl <114418652+mdbirnstiehl@users.noreply.github.com>
1 parent: 020ab93. Commit: 06144b4. Showing 3 changed files with 33 additions and 10 deletions.
# Custom Logs Package

The **Custom Logs** package is used to ingest arbitrary log files and parse their contents using [Ingest Pipelines](https://www.elastic.co/guide/en/elasticsearch/reference/current/ingest.html). Follow the steps below to set up and use this package.

## Get started

1. **Install Elastic Agent**
   Install an [Elastic Agent](https://www.elastic.co/guide/en/fleet/current/install-fleet-managed-elastic-agent.html) on the machine from which you want to collect logs.

2. **Identify the Log Location**
   Identify the log location on that machine, for example, `/tmp/custom.log`.
   - If you need to include multiple log files or an entire directory, consider using wildcard patterns such as `/tmp/*.log` to capture all `.log` files, or `/tmp/*` to include all file types.
   - Note that the [System integration](https://docs.elastic.co/en/integrations/system) ingests `/var/log/*.log`. You do not need to add this path if the System integration is in use.

3. **Enroll the Custom Logs Integration**
   - Add the **Custom Logs** integration to your installed Elastic Agent.
   - Provide an Integration name. A descriptive name will make managing this integration in the Kibana UI more intuitive.
   - Configure the path to match the location(s) identified in the previous step.
   - Provide a dataset name that reflects the purpose of your logs (for example, `python` for Python application logs).

4. **Verify Data in Discover**
   - Open [Discover](https://www.elastic.co/guide/en/kibana/current/discover.html) in Kibana and filter the `logs-*` indices to your dataset name (e.g., `logs-python`) to confirm that the raw log data is being ingested. A query you can run from Dev Tools instead is sketched after this list.

5. **Configure Parsing Rules**
   - Use [Ingest Pipelines](https://www.elastic.co/guide/en/elasticsearch/reference/current/ingest.html) to define parsing rules; an example pipeline is sketched after this list.
   - See [Parse and route logs](https://www.elastic.co/guide/en/serverless/current/observability-parse-log-data.html) for examples of how to extract structured fields and reroute log data to specific data streams.

6. **Create a Custom Dashboard**
   - Use [Kibana](https://www.elastic.co/guide/en/kibana/current/create-a-dashboard-of-panels-with-web-server-data.html) to build a dashboard for analyzing incoming log data based on your specific needs.
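
For the verification in step 4, the following Dev Tools query is a quick alternative to the Discover UI. It is a minimal sketch that assumes the dataset name `python` and the default namespace; adjust the data stream name to match your setup.

```console
# Fetch the most recent document from the (hypothetical) logs-python-default data stream
GET logs-python-default/_search
{
  "size": 1,
  "sort": [{ "@timestamp": "desc" }]
}
```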
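
As a sketch of the parsing rules in step 5, the pipeline below uses a [grok](https://www.elastic.co/blog/slow-and-steady-how-to-build-custom-grok-patterns-incrementally) processor to split a plain-text line such as `2024-05-01T12:00:03Z ERROR connection refused` into structured fields. The pipeline name, dataset, and pattern are illustrative assumptions rather than part of this package; see the links above for how to attach a pipeline to your data stream.

```console
# Illustrative pipeline for a hypothetical "python" dataset; adapt the name and pattern to your logs
PUT _ingest/pipeline/logs-python-example
{
  "description": "Example: extract timestamp, level, and message from a plain-text log line",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{TIMESTAMP_ISO8601:event.created} %{LOGLEVEL:log.level} %{GREEDYDATA:message}"]
      }
    },
    {
      "date": {
        "field": "event.created",
        "formats": ["ISO8601"]
      }
    }
  ]
}
```

You can dry-run such a pipeline against sample lines with the [simulate pipeline API](https://www.elastic.co/guide/en/elasticsearch/reference/current/simulate-pipeline-api.html) before attaching it.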

## ECS Field Mapping

This integration includes an ECS Dynamic Template, so any fields following the [ECS](https://www.elastic.co/guide/en/ecs/current/index.html) schema will automatically receive the correct index field mappings without additional manual configuration.
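
For example, a log event shipped as JSON with ECS field names, like the hypothetical document below, is mapped correctly without any manual mapping changes:

```json
{
  "@timestamp": "2024-05-01T12:00:03.000Z",
  "log": { "level": "error" },
  "service": { "name": "my-python-app" },
  "message": "connection refused"
}
```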