Update Custom Logs Package Readme (#12222)
* update docs

* add pr link to changelog

* update manifest.yml

* Apply suggestions from code review

Co-authored-by: Mike Birnstiehl <114418652+mdbirnstiehl@users.noreply.github.com>

---------

Co-authored-by: Mike Birnstiehl <114418652+mdbirnstiehl@users.noreply.github.com>
bmorelli25 and mdbirnstiehl authored Jan 3, 2025
1 parent 020ab93 commit 06144b4
Showing 3 changed files with 33 additions and 10 deletions.
5 changes: 5 additions & 0 deletions packages/log/changelog.yml
@@ -1,4 +1,9 @@
 # newer versions go on top
+- version: "2.3.3"
+  changes:
+    - description: Update documentation
+      type: enhancement
+      link: https://github.com/elastic/integrations/pull/12222
 - version: "2.3.2"
   changes:
     - description: Update package spec to V3
36 changes: 27 additions & 9 deletions packages/log/docs/README.md
@@ -1,15 +1,33 @@
 # Custom Logs Package
 
-The Custom Logs package is used for ingesting arbitrary log files and manipulating their content/lines by using Ingest Pipelines configuration.
+The **Custom Logs** package is used to ingest arbitrary log files and parse their contents using [Ingest Pipelines](https://www.elastic.co/guide/en/elasticsearch/reference/current/ingest.html). Follow the steps below to set up and use this package.
 
-In order to use the package, please follow these steps:
+## Get started
 
-1. [Setup / Install Elastic Agent](https://www.elastic.co/guide/en/fleet/current/install-fleet-managed-elastic-agent.html) at the machine where the logs should be collected from
-2. Identify the log location at that machine e.g. `/tmp/custom.log`. Note that `/var/log/*.log` is fully ingested by the [System](https://docs.elastic.co/en/integrations/system), no need to add this path if the [System](https://docs.elastic.co/en/integrations/system) integration is already used
-3. Enroll Custom Logs integration and add it to the installed agent. Give the dataset a name that fits to the log purpose, e.g. `python` for logs from a Python app. Make sure to configure the path from the step 2
-4. Check that the raw log data is coming in via [Discover](https://www.elastic.co/guide/en/kibana/current/discover.html) by filtering the `logs-*` indices to the dataset name given in step 3, e.g. `logs-python`
-5. Configure the parsing rules via [Ingest Pipelines](https://www.elastic.co/guide/en/elasticsearch/reference/current/ingest.html), e.g. JSON Parsing or [grok](https://www.elastic.co/blog/slow-and-steady-how-to-build-custom-grok-patterns-incrementally) parsing
-6. Create a [custom dashboard](https://www.elastic.co/guide/en/kibana/current/create-a-dashboard-of-panels-with-web-server-data.html) that analyzes the incoming log data for your needs
+1. **Install Elastic Agent**
+   Install an [Elastic Agent](https://www.elastic.co/guide/en/fleet/current/install-fleet-managed-elastic-agent.html) on the machine from which you want to collect logs.
+
+2. **Identify the Log Location**
+   Identify the log location on that machine, for example, `/tmp/custom.log`.
+   - If you need to include multiple log files or an entire directory, consider using wildcard patterns such as `/tmp/*.log` to capture all `.log` files, or `/tmp/*` to include all file types.
+   - Note that the [System integration](https://docs.elastic.co/en/integrations/system) ingests `/var/log/*.log`. You do not need to add this path if the System integration is in use.
+
+3. **Enroll the Custom Logs Integration**
+   - Add the **Custom Logs** integration to your installed Elastic Agent.
+   - Provide an Integration name. A descriptive name will make managing this integration in the Kibana UI more intuitive.
+   - Configure the path to match the location(s) identified in the previous step.
+   - Provide a dataset name that reflects the purpose of your logs (for example, `python` for Python application logs).
+
+4. **Verify Data in Discover**
+   - Open [Discover](https://www.elastic.co/guide/en/kibana/current/discover.html) in Kibana and filter the `logs-*` indices to your dataset name (e.g., `logs-python`) to confirm that the raw log data is being ingested.
+
+5. **Configure Parsing Rules**
+   - Use [Ingest Pipelines](https://www.elastic.co/guide/en/elasticsearch/reference/current/ingest.html) to define parsing rules.
+   - See [Parse and route logs](https://www.elastic.co/guide/en/serverless/current/observability-parse-log-data.html) for examples of how to extract structured fields and reroute log data to specific data streams.
+
+6. **Create a Custom Dashboard**
+   - Use [Kibana](https://www.elastic.co/guide/en/kibana/current/create-a-dashboard-of-panels-with-web-server-data.html) to build a dashboard for analyzing incoming log data based on your specific needs.
 
 ## ECS Field Mapping
-This integration includes the ECS Dynamic Template, all fields that follows the ECS Schema will get assigned the correct index field mapping and does not need to be added manually.
+
+This integration includes an ECS Dynamic Template, so any fields following the [ECS](https://www.elastic.co/guide/en/ecs/current/index.html) schema will automatically receive the correct index field mappings without additional manual configuration.
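For context on step 5 of the updated README: below is a minimal sketch of what such a parsing pipeline could look like, assuming a hypothetical `python` dataset whose log lines follow an `<ISO8601 timestamp> <level> <message>` format. The pipeline name (`logs-python@custom`), the grok pattern, and the sample document are illustrative assumptions, not part of this commit:

```
PUT _ingest/pipeline/logs-python@custom
{
  "description": "Illustrative parsing rules for a hypothetical logs-python dataset",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{TIMESTAMP_ISO8601:ts} %{LOGLEVEL:log.level} %{GREEDYDATA:message}"]
      }
    },
    {
      "date": {
        "field": "ts",
        "formats": ["ISO8601"]
      }
    },
    {
      "remove": {
        "field": "ts",
        "ignore_missing": true
      }
    }
  ]
}

POST _ingest/pipeline/logs-python@custom/_simulate
{
  "docs": [
    { "_source": { "message": "2025-01-03T10:15:00Z INFO application started" } }
  ]
}
```

Because `log.level` is an ECS field, the ECS Dynamic Template described above maps it without manual mapping changes. The `@custom` pipeline naming convention is how recent stack versions attach user-defined processing to an integration's data stream, though the exact name to use depends on your stack version.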
2 changes: 1 addition & 1 deletion packages/log/manifest.yml
@@ -4,7 +4,7 @@ title: Custom Logs
 description: >-
   Collect custom logs with Elastic Agent.
 type: input
-version: 2.3.2
+version: 2.3.3
 categories:
   - custom
   - custom_logs
