
tiup: bump version to v6.1 (#8868) #8875

Merged
tiup: bump version to v6.1
Signed-off-by: Ran <huangran.alex@gmail.com>
ran-huang authored and ti-chi-bot committed Jun 10, 2022
commit 612928e67814f5c767299a5cae6837e16ce69125
12 changes: 6 additions & 6 deletions production-deployment-using-tiup.md
@@ -140,12 +140,12 @@ To prepare the TiUP offline component package, manually pack an offline componen

If you want to adjust an existing offline mirror (such as adding a new version of a component), take the following steps:

-1. When pulling an offline mirror, you can get an incomplete offline mirror by specifying specific information via parameters, such as the component and version information. For example, you can pull an offline mirror that includes only the offline mirror of TiUP v1.9.3 and TiUP Cluster v1.9.3 by running the following command:
+1. When pulling an offline mirror, you can get an incomplete offline mirror by specifying specific information via parameters, such as the component and version information. For example, you can pull an offline mirror that includes only the offline mirror of TiUP v1.10.0 and TiUP Cluster v1.10.0 by running the following command:

{{< copyable "shell-regular" >}}

```bash
-tiup mirror clone tiup-custom-mirror-v1.9.3 --tiup v1.9.3 --cluster v1.9.3
+tiup mirror clone tiup-custom-mirror-v1.10.0 --tiup v1.10.0 --cluster v1.10.0
```

If you only need the components for a particular platform, you can specify them using the `--os` or `--arch` parameters.
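For instance, a platform-restricted clone might look like the following sketch (the mirror directory name and the linux/amd64 values are examples, not requirements):

```shell
# Clone only the linux/amd64 packages of TiUP v1.10.0 and TiUP Cluster v1.10.0
tiup mirror clone tiup-custom-mirror-v1.10.0 \
    --tiup v1.10.0 --cluster v1.10.0 \
    --os linux --arch amd64
```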
@@ -177,10 +177,10 @@ To prepare the TiUP offline component package, manually pack an offline componen
{{< copyable "shell-regular" >}}

```bash
-tiup mirror merge tiup-custom-mirror-v1.9.3
+tiup mirror merge tiup-custom-mirror-v1.10.0
```

-5. When the above steps are completed, check the result by running the `tiup list` command. In this document's example, the outputs of both `tiup list tiup` and `tiup list cluster` show that the corresponding components of `v1.9.3` are available.
+5. When the above steps are completed, check the result by running the `tiup list` command. In this document's example, the outputs of both `tiup list tiup` and `tiup list cluster` show that the corresponding components of `v1.10.0` are available.
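As a sketch of that check, you can list both components directly and look for the expected version in each output:

```shell
# Confirm that the merged mirror now serves the expected version of both components
tiup list tiup
tiup list cluster
```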

#### Step 2: Deploy the offline TiUP component

@@ -309,13 +309,13 @@ Then execute the `deploy` command to deploy the TiDB cluster:
{{< copyable "shell-regular" >}}

```shell
-tiup cluster deploy tidb-test v6.0.0 ./topology.yaml --user root [-p] [-i /home/root/.ssh/gcp_rsa]
+tiup cluster deploy tidb-test v6.1.0 ./topology.yaml --user root [-p] [-i /home/root/.ssh/gcp_rsa]
```

In the above command:

- The name of the deployed TiDB cluster is `tidb-test`.
-- You can see the latest supported versions by running `tiup list tidb`. This document takes `v6.0.0` as an example.
+- You can see the latest supported versions by running `tiup list tidb`. This document takes `v6.1.0` as an example.
- The initialization configuration file is `topology.yaml`.
- `--user root`: Log in to the target machine through the `root` user to complete the cluster deployment. You can also use other users with `ssh` and `sudo` privileges to complete the deployment.
- `[-i]` and `[-p]`: optional. If you have configured passwordless login to the target machine, these parameters are not required. Otherwise, choose one of the two. `[-i]` specifies the private key of the `root` user (or another user specified by `--user`) that has access to the target machine. `[-p]` prompts you to enter the user password interactively.
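As a sketch, the two authentication styles look like this (the cluster name, version, and key path are carried over from the example above):

```shell
# Interactive password authentication
tiup cluster deploy tidb-test v6.1.0 ./topology.yaml --user root -p

# Key-based authentication with an explicit private key
tiup cluster deploy tidb-test v6.1.0 ./topology.yaml --user root -i /home/root/.ssh/gcp_rsa
```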
8 changes: 4 additions & 4 deletions quick-start-with-tidb.md
@@ -80,10 +80,10 @@ As a distributed system, a basic TiDB test cluster usually consists of 2 TiDB in
{{< copyable "shell-regular" >}}

```shell
-tiup playground v6.0.0 --db 2 --pd 3 --kv 3
+tiup playground v6.1.0 --db 2 --pd 3 --kv 3
```

-This command downloads a cluster of the specified version, such as v6.0.0, to the local machine and starts it. To view the latest version, run `tiup list tidb`.
+This command downloads a cluster of the specified version, such as v6.1.0, to the local machine and starts it. To view the latest version, run `tiup list tidb`.

This command returns the access methods of the cluster:

@@ -201,10 +201,10 @@ As a distributed system, a basic TiDB test cluster usually consists of 2 TiDB in
{{< copyable "shell-regular" >}}

```shell
-tiup playground v6.0.0 --db 2 --pd 3 --kv 3
+tiup playground v6.1.0 --db 2 --pd 3 --kv 3
```

-This command downloads a cluster of the specified version, such as v6.0.0, to the local machine and starts it. To view the latest version, run `tiup list tidb`.
+This command downloads a cluster of the specified version, such as v6.1.0, to the local machine and starts it. To view the latest version, run `tiup list tidb`.

This command returns the access methods of the cluster:

4 changes: 2 additions & 2 deletions scale-tidb-using-tiup.md
@@ -264,9 +264,9 @@ If you want to remove a TiKV node from the `10.0.1.5` host, take the following s
```

```
-Starting /root/.tiup/components/cluster/v1.9.3/cluster display <cluster-name>
+Starting /root/.tiup/components/cluster/v1.10.0/cluster display <cluster-name>
TiDB Cluster: <cluster-name>
-TiDB Version: v6.0.0
+TiDB Version: v6.1.0
ID Role Host Ports Status Data Dir Deploy Dir
-- ---- ---- ----- ------ -------- ----------
10.0.1.3:8300 cdc 10.0.1.3 8300 Up data/cdc-8300 deploy/cdc-8300
36 changes: 18 additions & 18 deletions tiup/tiup-cluster.md
@@ -16,7 +16,7 @@ tiup cluster
```

```
-Starting component `cluster`: /home/tidb/.tiup/components/cluster/v1.9.3/cluster
+Starting component `cluster`: /home/tidb/.tiup/components/cluster/v1.10.0/cluster
Deploy a TiDB cluster for production

Usage:
@@ -112,20 +112,20 @@ tidb_servers:
...
```

-Save the file as `/tmp/topology.yaml`. If you want to use TiDB v6.0.0 and your cluster name is `prod-cluster`, run the following command:
+Save the file as `/tmp/topology.yaml`. If you want to use TiDB v6.1.0 and your cluster name is `prod-cluster`, run the following command:

{{< copyable "shell-regular" >}}

```shell
-tiup cluster deploy -p prod-cluster v6.0.0 /tmp/topology.yaml
+tiup cluster deploy -p prod-cluster v6.1.0 /tmp/topology.yaml
```

During the execution, TiUP asks you to confirm your topology again and requires the root password of the target machine (the `-p` flag indicates entering the password interactively):

```bash
Please confirm your topology:
TiDB Cluster: prod-cluster
-TiDB Version: v6.0.0
+TiDB Version: v6.1.0
Type Host Ports Directories
---- ---- ----- -----------
pd 172.16.5.134 2379/2380 deploy/pd-2379,data/pd-2379
@@ -162,10 +162,10 @@ tiup cluster list
```

```
-Starting /root/.tiup/components/cluster/v1.9.3/cluster list
+Starting /root/.tiup/components/cluster/v1.10.0/cluster list
Name User Version Path PrivateKey
---- ---- ------- ---- ----------
-prod-cluster tidb v6.0.0 /root/.tiup/storage/cluster/clusters/prod-cluster /root/.tiup/storage/cluster/clusters/prod-cluster/ssh/id_rsa
+prod-cluster tidb v6.1.0 /root/.tiup/storage/cluster/clusters/prod-cluster /root/.tiup/storage/cluster/clusters/prod-cluster/ssh/id_rsa
```

## Start the cluster
@@ -193,9 +193,9 @@ tiup cluster display prod-cluster
```

```
-Starting /root/.tiup/components/cluster/v1.9.3/cluster display prod-cluster
+Starting /root/.tiup/components/cluster/v1.10.0/cluster display prod-cluster
TiDB Cluster: prod-cluster
-TiDB Version: v6.0.0
+TiDB Version: v6.1.0
ID Role Host Ports Status Data Dir Deploy Dir
-- ---- ---- ----- ------ -------- ----------
172.16.5.134:3000 grafana 172.16.5.134 3000 Up - deploy/grafana-3000
@@ -264,9 +264,9 @@ tiup cluster display prod-cluster
```

```
-Starting /root/.tiup/components/cluster/v1.9.3/cluster display prod-cluster
+Starting /root/.tiup/components/cluster/v1.10.0/cluster display prod-cluster
TiDB Cluster: prod-cluster
-TiDB Version: v6.0.0
+TiDB Version: v6.1.0
ID Role Host Ports Status Data Dir Deploy Dir
-- ---- ---- ----- ------ -------- ----------
172.16.5.134:3000 grafana 172.16.5.134 3000 Up - deploy/grafana-3000
@@ -374,12 +374,12 @@ Global Flags:
-y, --yes Skip all confirmations and assumes 'yes'
```

-For example, the following command upgrades the cluster to v6.0.0:
+For example, the following command upgrades the cluster to v6.1.0:

{{< copyable "shell-regular" >}}

```bash
-tiup cluster upgrade tidb-test v6.0.0
+tiup cluster upgrade tidb-test v6.1.0
```

## Update configuration
@@ -558,14 +558,14 @@ tiup cluster audit
```

```
-Starting component `cluster`: /home/tidb/.tiup/components/cluster/v1.9.3/cluster audit
+Starting component `cluster`: /home/tidb/.tiup/components/cluster/v1.10.0/cluster audit
ID Time Command
-- ---- -------
-4BLhr0 2022-03-01T13:25:09+08:00 /home/tidb/.tiup/components/cluster/v1.9.3/cluster deploy test v6.0.0 /tmp/topology.yaml
-4BKWjF 2022-02-28T23:36:57+08:00 /home/tidb/.tiup/components/cluster/v1.9.3/cluster deploy test v6.0.0 /tmp/topology.yaml
-4BKVwH 2022-02-28T23:02:08+08:00 /home/tidb/.tiup/components/cluster/v1.9.3/cluster deploy test v6.0.0 /tmp/topology.yaml
-4BKKH1 2022-02-28T16:39:04+08:00 /home/tidb/.tiup/components/cluster/v1.9.3/cluster destroy test
-4BKKDx 2022-02-28T16:36:57+08:00 /home/tidb/.tiup/components/cluster/v1.9.3/cluster deploy test v6.0.0 /tmp/topology.yaml
+4BLhr0 2022-06-10T13:25:09+08:00 /home/tidb/.tiup/components/cluster/v1.10.0/cluster deploy test v6.1.0 /tmp/topology.yaml
+4BKWjF 2022-06-08T23:36:57+08:00 /home/tidb/.tiup/components/cluster/v1.10.0/cluster deploy test v6.1.0 /tmp/topology.yaml
+4BKVwH 2022-06-08T23:02:08+08:00 /home/tidb/.tiup/components/cluster/v1.10.0/cluster deploy test v6.1.0 /tmp/topology.yaml
+4BKKH1 2022-06-08T16:39:04+08:00 /home/tidb/.tiup/components/cluster/v1.10.0/cluster destroy test
+4BKKDx 2022-06-08T16:36:57+08:00 /home/tidb/.tiup/components/cluster/v1.10.0/cluster deploy test v6.1.0 /tmp/topology.yaml
```

The first column is `audit-id`. To view the execution log of a certain command, pass its `audit-id` to the `tiup cluster audit` command as follows:
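For example, using the first `audit-id` shown in the output above (substitute an ID from your own audit list):

```shell
# Replay the execution log of the recorded deploy command
tiup cluster audit 4BLhr0
```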
2 changes: 1 addition & 1 deletion tiup/tiup-component-cluster-deploy.md
@@ -13,7 +13,7 @@ tiup cluster deploy <cluster-name> <version> <topology.yaml> [flags]
```

- `<cluster-name>`: the name of the new cluster, which cannot be the same as the existing cluster names.
-- `<version>`: the version number of the TiDB cluster to deploy, such as `v6.0.0`.
+- `<version>`: the version number of the TiDB cluster to deploy, such as `v6.1.0`.
- `<topology.yaml>`: the prepared [topology file](/tiup/tiup-cluster-topology-reference.md).
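A minimal sketch of the command, reusing the `prod-cluster` name and `/tmp/topology.yaml` file from elsewhere in this document (both are examples):

```shell
# Deploy TiDB v6.1.0 with interactive password authentication (-p)
tiup cluster deploy -p prod-cluster v6.1.0 /tmp/topology.yaml
```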

## Options
12 changes: 6 additions & 6 deletions tiup/tiup-component-management.md
@@ -69,12 +69,12 @@ Example 2: Use TiUP to install the nightly version of TiDB.
tiup install tidb:nightly
```

-Example 3: Use TiUP to install TiKV v6.0.0.
+Example 3: Use TiUP to install TiKV v6.1.0.

{{< copyable "shell-regular" >}}

```shell
-tiup install tikv:v6.0.0
+tiup install tikv:v6.1.0
```

## Upgrade components
@@ -127,12 +127,12 @@ Before the component is started, TiUP creates a directory for it, and then puts

If you want to start the same component multiple times and reuse the previous working directory, you can use `--tag` to specify the same name when the component is started. After the tag is specified, the working directory will *not be automatically deleted* when the instance is terminated, which makes it convenient to reuse the working directory.

-Example 1: Operate TiDB v6.0.0.
+Example 1: Operate TiDB v6.1.0.

{{< copyable "shell-regular" >}}

```shell
-tiup tidb:v6.0.0
+tiup tidb:v6.1.0
```

Example 2: Specify the tag with which TiKV operates.
@@ -218,12 +218,12 @@ The following flags are supported in this command:
- If the version is ignored, adding `--all` means to uninstall all versions of this component.
- If the version and the component are both ignored, adding `--all` means to uninstall all components of all versions.

-Example 1: Uninstall TiDB v6.0.0.
+Example 1: Uninstall TiDB v6.1.0.

{{< copyable "shell-regular" >}}

```shell
-tiup uninstall tidb:v6.0.0
+tiup uninstall tidb:v6.1.0
```

Example 2: Uninstall TiKV of all versions.
6 changes: 3 additions & 3 deletions tiup/tiup-mirror.md
@@ -87,9 +87,9 @@ The `tiup mirror clone` command provides many optional flags (might provide more

If you want to clone only one version (not all versions) of a component, use `--<component>=<version>` to specify this version. For example:

-- Execute the `tiup mirror clone <target-dir> --tidb v6.0.0` command to clone the v6.0.0 version of the TiDB component.
-- Run the `tiup mirror clone <target-dir> --tidb v6.0.0 --tikv all` command to clone the v6.0.0 version of the TiDB component and all versions of the TiKV component.
-- Run the `tiup mirror clone <target-dir> v6.0.0` command to clone the v6.0.0 version of all components in a cluster.
+- Execute the `tiup mirror clone <target-dir> --tidb v6.1.0` command to clone the v6.1.0 version of the TiDB component.
+- Run the `tiup mirror clone <target-dir> --tidb v6.1.0 --tikv all` command to clone the v6.1.0 version of the TiDB component and all versions of the TiKV component.
+- Run the `tiup mirror clone <target-dir> v6.1.0` command to clone the v6.1.0 version of all components in a cluster.

After cloning, signing keys are set up automatically.
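After cloning, a typical next step is to point TiUP at the new mirror before installing components from it (a sketch; the local directory name is an example):

```shell
# Switch TiUP to the freshly cloned local mirror
tiup mirror set ./tiup-custom-mirror-v1.10.0

# Switch back to the official mirror when finished
tiup mirror set https://tiup-mirrors.pingcap.com
```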

4 changes: 2 additions & 2 deletions tiup/tiup-playground.md
@@ -19,9 +19,9 @@ If you directly execute the `tiup playground` command, TiUP uses the locally ins

This command actually performs the following operations:

-- Because this command does not specify the version of the playground component, TiUP first checks the latest version of the installed playground component. Assume that the latest version is v1.9.3, then this command works the same as `tiup playground:v1.9.3`.
+- Because this command does not specify the version of the playground component, TiUP first checks the latest version of the installed playground component. Assume that the latest version is v1.10.0, then this command works the same as `tiup playground:v1.10.0`.
- If you have not used TiUP playground to install the TiDB, TiKV, and PD components, the playground component installs the latest stable version of these components, and then starts their instances.
-- Because this command does not specify the version of the TiDB, PD, and TiKV component, TiUP playground uses the latest version of each component by default. Assume that the latest version is v6.0.0, then this command works the same as `tiup playground:v1.9.3 v6.0.0`.
+- Because this command does not specify the version of the TiDB, PD, and TiKV component, TiUP playground uses the latest version of each component by default. Assume that the latest version is v6.1.0, then this command works the same as `tiup playground:v1.10.0 v6.1.0`.
- Because this command does not specify the number of instances of each component, TiUP playground, by default, starts the smallest cluster, which consists of one TiDB instance, one TiKV instance, one PD instance, and one TiFlash instance.
- After starting each TiDB component, TiUP playground reminds you that the cluster is successfully started and provides you some useful information, such as how to connect to the TiDB cluster through the MySQL client and how to access the [TiDB Dashboard](/dashboard/dashboard-intro.md).
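For example, once the playground reports that the cluster is up, a typical connection from a second shell looks like this (assuming the playground's default TiDB port 4000):

```shell
# Connect with the MySQL client to the playground's TiDB instance
mysql --host 127.0.0.1 --port 4000 -u root
```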
