docs: add logs to navigation (GreptimeTeam#1062)
nicecui committed Jul 15, 2024
1 parent aa2f2ae commit 967c698
Showing 19 changed files with 43 additions and 34 deletions.
1 change: 0 additions & 1 deletion .gitignore
Original file line number Diff line number Diff line change
@@ -1,5 +1,4 @@
# Logs
- logs
*.log
npm-debug.log*
yarn-debug.log*
2 changes: 1 addition & 1 deletion docs/nightly/en/reference/sql/create.md
@@ -221,7 +221,7 @@ CREATE TABLE IF NOT EXISTS logs(
) ENGINE=mito;
```

- For more information on using full-text indexing and search, refer to the [Log Query Documentation](/user-guide/log/log-query.md).
+ For more information on using full-text indexing and search, refer to the [Log Query Documentation](/user-guide/logs/query-logs.md).
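For context, the column list elided above pairs a full-text indexed string column with the table's time index. A minimal sketch (the column names `message` and `ts` are illustrative, and the `FULLTEXT` column option is assumed from the linked documentation, not shown in this hunk):

```sql
CREATE TABLE IF NOT EXISTS logs (
    -- assumed: FULLTEXT marks the column for full-text indexing
    `message` STRING FULLTEXT,
    `ts` TIMESTAMP TIME INDEX
) ENGINE=mito;
```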

### Region partition rules

7 changes: 7 additions & 0 deletions docs/nightly/en/summary.yml
@@ -58,6 +58,13 @@
- query
- define-time-window
- expression
+ - Logs:
+ - overview
+ - quick-start
+ - pipeline-config
+ - manage-pipelines
+ - write-logs
+ - query-logs
- Client-Libraries:
- overview
- go
6 changes: 0 additions & 6 deletions docs/nightly/en/user-guide/log/overview.md

This file was deleted.

@@ -1,9 +1,9 @@
- # Managing Pipelines
+ # Manage Pipelines

In GreptimeDB, each `pipeline` is a collection of data processing units used for parsing and transforming the ingested log content. This document provides guidance on creating and deleting pipelines to efficiently manage the processing flow of log data.


- For specific pipeline configurations, please refer to the [Pipeline Configuration](log-pipeline.md) documentation.
+ For specific pipeline configurations, please refer to the [Pipeline Configuration](pipeline-config.md) documentation.

## Create a Pipeline

7 changes: 7 additions & 0 deletions docs/nightly/en/user-guide/logs/overview.md
@@ -0,0 +1,7 @@
+ # Overview
+
+ - [Quick Start](./quick-start.md): Provides an introduction on how to quickly get started with the GreptimeDB log service.
+ - [Pipeline Configuration](./pipeline-config.md): Provides in-depth information on each specific configuration of pipelines in GreptimeDB.
+ - [Managing Pipelines](./manage-pipelines.md): Explains how to create and delete pipelines.
+ - [Writing Logs with Pipelines](./write-logs.md): Provides detailed instructions on efficiently writing log data by leveraging the pipeline mechanism.
+ - [Query Logs](./query-logs.md): Describes how to query logs using the GreptimeDB SQL interface.
@@ -381,7 +381,7 @@ Specify which field is the Tag column using `index: tag`. Refer to the [Transfor

#### The Fulltext column

- Specify which field will be used for full-text search using `index: fulltext`. This index greatly improves the performance of [log search](./log-query.md). Refer to the [Transform Example](#transform-example) below for syntax.
+ Specify which field will be used for full-text search using `index: fulltext`. This index greatly improves the performance of [log search](./query-logs.md). Refer to the [Transform Example](#transform-example) below for syntax.
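A minimal transform entry using this option might look like the following sketch (the `message` field name is illustrative; the exact keys follow the Transform examples referenced above):

```yaml
transform:
  - fields:
      - message      # field exposed to full-text search
    type: string
    index: fulltext  # build a full-text index on this column
```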

### The `on_failure` field

@@ -1,4 +1,4 @@
- # Log Query
+ # Query Log

This document provides a guide on how to use GreptimeDB's query language for effective searching and analysis of log data.
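For instance, the full-text searches visible elsewhere in this commit use the `MATCHES` function; a minimal query (table and column names follow the examples in this document) looks like:

```sql
-- match log rows whose message column contains the term 'error'
SELECT * FROM logs WHERE MATCHES(message, 'error');
```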

@@ -87,7 +87,7 @@ A full-text index is essential for full-text search, especially when dealing wit

In the Pipeline configuration, you can specify a column to use a full-text index. Below is a configuration example where the `message` column is set with a full-text index:

- <!-- In the Pipeline configuration, you can [specify a column to use a full-text index](./log-pipeline.md#index-field). Below is a configuration example where the `message` column is set with a full-text index: -->
+ <!-- In the Pipeline configuration, you can [specify a column to use a full-text index](./pipeline-config.md#index-field). Below is a configuration example where the `message` column is set with a full-text index: -->

```yaml
processors:
  # … (remaining pipeline configuration elided in this hunk)
```
@@ -51,7 +51,7 @@ Here, `name` is the name of the Pipeline, and `version` is the Pipeline version.

This Pipeline includes one Processor and three Transforms. The Processor uses the Rust time format string `%Y-%m-%d %H:%M:%S%.3f` to parse the timestamp field in the logs, and then the Transforms convert the `id1` and `id2` fields to `int32` type, the `type` and `logger` fields to `string` type with an index of "tag", the `log` field to `string` type with an index of "fulltext", and the `time` field to a time type with an index of "timestamp".
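A rough sketch of what such a pipeline file might look like, assuming the key names match the pipeline configuration document (the `date` processor name and the exact nesting are assumptions, not shown in this hunk):

```yaml
# Parse the time field with a Rust-style format string,
# then type and index the remaining fields.
processors:
  - date:
      fields:
        - time
      formats:
        - "%Y-%m-%d %H:%M:%S%.3f"

transform:
  - fields:
      - id1
      - id2
    type: int32
  - fields:
      - type
      - logger
    type: string
    index: tag        # tag-indexed columns
  - fields:
      - log
    type: string
    index: fulltext   # full-text searchable column
  - fields:
      - time
    type: time
    index: timestamp  # time index of the table
```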

- Refer to the [Pipeline Introduction](log-pipeline.md) for specific syntax details.
+ Refer to the [Pipeline Introduction](pipeline-config.md) for specific syntax details.

## Query Pipelines

@@ -115,7 +115,7 @@ The above command returns the following result:

In the above example, we successfully wrote 4 log entries to the `public.logs` table.

- Please refer to [Writing Logs with Pipeline](write-log.md) for specific syntax for writing logs.
+ Please refer to [Writing Logs with Pipeline](write-logs.md) for specific syntax for writing logs.

## `logs` table structure

@@ -182,4 +182,4 @@ As you can see, the logs have been stored as structured logs after applying type
## Conclusion

By following the above steps, you have successfully created a pipeline, written logs, and performed queries. This is just the tip of the iceberg in terms of the capabilities offered by GreptimeDB.
- Next, please continue reading [Pipeline Configuration](log-pipeline.md) and [Managing Pipelines](manage-pipeline.md) to learn more about advanced features and best practices.
+ Next, please continue reading [Pipeline Configuration](pipeline-config.md) and [Managing Pipelines](manage-pipelines.md) to learn more about advanced features and best practices.
@@ -2,7 +2,7 @@

This document describes how to write logs to GreptimeDB by processing them through a specified pipeline using the HTTP interface.

- Before writing logs, please read the [Pipeline Configuration](log-pipeline.md) and [Managing Pipelines](manage-pipeline.md) documents to complete the configuration setup and upload.
+ Before writing logs, please read the [Pipeline Configuration](pipeline-config.md) and [Managing Pipelines](manage-pipelines.md) documents to complete the configuration setup and upload.

## HTTP API

@@ -20,7 +20,7 @@ This interface accepts the following parameters:

- `db`: The name of the database.
- `table`: The name of the table.
- - `pipeline_name`: The name of the [pipeline](./log-pipeline.md).
+ - `pipeline_name`: The name of the [pipeline](./pipeline-config.md).
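Putting the three parameters together as a sketch (the `public`, `logs`, and `test` values are borrowed from the quick-start example elsewhere in this commit, and the plain-text body is an assumption; the body format is covered in the next section):

```shell
# Assemble the ingestion URL from its query parameters.
db="public"; table="logs"; pipeline_name="test"
url="http://localhost:4000/v1/events/logs?db=${db}&table=${table}&pipeline_name=${pipeline_name}"
echo "$url"
# A write request would then POST raw log lines to it, for example:
# curl -X "POST" "$url" -d 'one raw log line'
```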

## Body data format

2 changes: 1 addition & 1 deletion docs/nightly/zh/reference/sql/create.md
@@ -226,7 +226,7 @@ CREATE TABLE IF NOT EXISTS logs(
) ENGINE=mito;
```

- For more on using full-text indexing and full-text search, see the [Log Query documentation](/user-guide/log/log-query.md)
+ For more on using full-text indexing and full-text search, see the [Log Query documentation](/user-guide/logs/query-logs.md)

### Region partition rules

1 change: 1 addition & 0 deletions docs/nightly/zh/summary-i18n.yml
@@ -8,6 +8,7 @@ Client-Libraries: 客户端库
Write-Data: 写入数据
Query-Data: 读取数据
Continuous-Aggregation: 持续聚合
+ Logs: 日志
Python-Scripts: Python 脚本
Operations: 运维操作
Deploy-on-Kubernetes: 部署到 Kubernetes
6 changes: 0 additions & 6 deletions docs/nightly/zh/user-guide/log/overview.md

This file was deleted.

@@ -3,7 +3,7 @@
In GreptimeDB, each `pipeline` is a collection of data-processing units used to parse and transform ingested log content. This document explains how to create and delete pipelines so that the log-processing flow can be managed efficiently.


- For the specific pipeline configuration, see the [Pipeline Configuration](log-pipeline.md) documentation
+ For the specific pipeline configuration, see the [Pipeline Configuration](pipeline-config.md) documentation

## Create a Pipeline

7 changes: 7 additions & 0 deletions docs/nightly/zh/user-guide/logs/overview.md
@@ -0,0 +1,7 @@
+ # Overview
+
+ - [Quick Start](./quick-start.md): How to quickly get started with the GreptimeDB log service.
+ - [Pipeline Configuration](./pipeline-config.md): An in-depth look at each specific pipeline configuration in GreptimeDB.
+ - [Manage Pipelines](./manage-pipelines.md): How to create and delete pipelines.
+ - [Write Logs with Pipelines](./write-logs.md): How to write log data efficiently using the pipeline mechanism.
+ - [Query Logs](./query-logs.md): How to query logs through the GreptimeDB SQL interface.
@@ -385,7 +385,7 @@ GreptimeDB supports the following three index types for fields:

#### The Fulltext column

- Use `index: fulltext` to specify which field will be used for full-text search. This index greatly improves the performance of [log search](./log-query.md); see the [Transform examples](#transform-示例) below for the syntax.
+ Use `index: fulltext` to specify which field will be used for full-text search. This index greatly improves the performance of [log search](./query-logs.md); see the [Transform examples](#transform-示例) below for the syntax.

### The `on_failure` field

@@ -1,4 +1,4 @@
- # Log Query
+ # Query Logs

This document describes how to search and analyze log data using the query language provided by GreptimeDB.

@@ -87,7 +87,7 @@ SELECT * FROM logs WHERE MATCHES(message, '"He said \"hello\""');

In the pipeline configuration, a column can be set to use a full-text index. Below is a configuration example in which the `message` column is set up with a full-text index:

- <!-- In the pipeline configuration, you can [specify a column to use a full-text index](./log-pipeline.md#index-字段). Below is a configuration example in which the `message` column is set up with a full-text index: -->
+ <!-- In the pipeline configuration, you can [specify a column to use a full-text index](./pipeline-config.md#index-字段). Below is a configuration example in which the `message` column is set up with a full-text index: -->

```yaml
processors:
  # … (remaining pipeline configuration elided in this hunk)
```
@@ -50,7 +50,7 @@ curl -X "POST" "http://localhost:4000/v1/events/pipelines/test" -F "file=@pipeli

This pipeline contains one processor and three transforms. The processor uses the Rust time-format string `%Y-%m-%d %H:%M:%S%.3f` to parse the timestamp field in the logs; the transforms then convert the id1 and id2 fields to int32, convert the level, content, and logger fields to string, and finally convert the timestamp field to a time type and set it as the timestamp index.

- See the [Pipeline introduction](log-pipeline.md) for the specific syntax.
+ See the [Pipeline introduction](pipeline-config.md) for the specific syntax.



@@ -115,7 +115,7 @@ curl -X "POST" "http://localhost:4000/v1/events/logs?db=public&table=logs&pipeli

In the example above, we successfully wrote 4 log entries to the `public.logs` table.

- See [Write Logs with Pipelines](write-log.md) for the specific log-writing syntax.
+ See [Write Logs with Pipelines](write-logs.md) for the specific log-writing syntax.

## The `logs` table structure

@@ -183,4 +183,4 @@ SELECT * FROM public.logs;
## Conclusion

By following the steps above, you have successfully created a pipeline, written logs, and run queries. This is only the tip of the iceberg of what GreptimeDB offers.
- Next, continue reading [Pipeline Configuration](log-pipeline.md) and [Manage Pipelines](manage-pipeline.md) to learn about more advanced features and best practices.
+ Next, continue reading [Pipeline Configuration](pipeline-config.md) and [Manage Pipelines](manage-pipelines.md) to learn about more advanced features and best practices.
@@ -2,7 +2,7 @@

This document describes how to write logs to GreptimeDB through the HTTP interface, processing them with a specified pipeline.

- Before writing logs, read the [Pipeline Configuration](log-pipeline.md) and [Manage Pipelines](manage-pipeline.md) documents to complete the configuration setup and upload.
+ Before writing logs, read the [Pipeline Configuration](pipeline-config.md) and [Manage Pipelines](manage-pipelines.md) documents to complete the configuration setup and upload.

## HTTP API

@@ -21,7 +21,7 @@ curl -X "POST" "http://localhost:4000/v1/events/logs?db=<db-name>&table=<table-n

- `db`: the database name.
- `table`: the table name.
- - `pipeline_name`: the name of the [pipeline](./log-pipeline.md).
+ - `pipeline_name`: the name of the [pipeline](./pipeline-config.md).

## Body data format

