bump version 6.6 tiup (pingcap#12520)
* bump version 6.6 tiup

* Update upgrade-tidb-using-tiup.md

* Update upgrade-tidb-using-tiup.md

* tiup to 1.11.3
TomShawn authored Feb 20, 2023
1 parent 87111d9 commit 659952f
Showing 9 changed files with 59 additions and 58 deletions.
12 changes: 6 additions & 6 deletions production-deployment-using-tiup.md
@@ -139,12 +139,12 @@ Method 2: Manually pack an offline component package using `tiup mirror clone`.

If you want to adjust an existing offline mirror (such as adding a new version of a component), take the following steps:

- 1. When pulling an offline mirror, you can get an incomplete offline mirror by specifying specific information via parameters, such as the component and version information. For example, you can pull an offline mirror that includes only the offline mirror of TiUP v1.11.0 and TiUP Cluster v1.11.0 by running the following command:
+ 1. When pulling an offline mirror, you can get an incomplete offline mirror by specifying specific information via parameters, such as the component and version information. For example, you can pull an offline mirror that includes only the offline mirror of TiUP v1.11.3 and TiUP Cluster v1.11.3 by running the following command:

{{< copyable "shell-regular" >}}

```bash
- tiup mirror clone tiup-custom-mirror-v1.11.0 --tiup v1.11.0 --cluster v1.11.0
+ tiup mirror clone tiup-custom-mirror-v1.11.3 --tiup v1.11.3 --cluster v1.11.3
```

If you only need the components for a particular platform, you can specify them using the `--os` or `--arch` parameters.
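
For example, a minimal sketch that limits the custom mirror above to Linux on amd64 (the platform values are illustrative only):

```shell
tiup mirror clone tiup-custom-mirror-v1.11.3 --tiup v1.11.3 --cluster v1.11.3 --os linux --arch amd64
```
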
@@ -176,10 +176,10 @@ Method 2: Manually pack an offline component package using `tiup mirror clone`.
{{< copyable "shell-regular" >}}

```bash
- tiup mirror merge tiup-custom-mirror-v1.11.0
+ tiup mirror merge tiup-custom-mirror-v1.11.3
```

- 5. When the above steps are completed, check the result by running the `tiup list` command. In this document's example, the outputs of both `tiup list tiup` and `tiup list cluster` show that the corresponding components of `v1.11.0` are available.
+ 5. When the above steps are completed, check the result by running the `tiup list` command. In this document's example, the outputs of both `tiup list tiup` and `tiup list cluster` show that the corresponding components of `v1.11.3` are available.
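
For example, a quick check (a sketch; the component names follow the text above):

```shell
tiup list tiup
tiup list cluster
```
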
#### Deploy the offline TiUP component
@@ -334,13 +334,13 @@ Before you run the `deploy` command, use the `check` and `check --apply` command
{{< copyable "shell-regular" >}}

```shell
- tiup cluster deploy tidb-test v6.5.0 ./topology.yaml --user root [-p] [-i /home/root/.ssh/gcp_rsa]
+ tiup cluster deploy tidb-test v6.6.0 ./topology.yaml --user root [-p] [-i /home/root/.ssh/gcp_rsa]
```

In the `tiup cluster deploy` command above:

- `tidb-test` is the name of the TiDB cluster to be deployed.
- - `v6.5.0` is the version of the TiDB cluster to be deployed. You can see the latest supported versions by running `tiup list tidb`.
+ - `v6.6.0` is the version of the TiDB cluster to be deployed. You can see the latest supported versions by running `tiup list tidb`.
- `topology.yaml` is the initialization configuration file.
- `--user root` indicates logging into the target machine as the `root` user to complete the cluster deployment. The `root` user is expected to have `ssh` and `sudo` privileges to the target machine. Alternatively, you can use other users with `ssh` and `sudo` privileges to complete the deployment.
- `[-i]` and `[-p]` are optional. If you have configured login to the target machine without password, these parameters are not required. If not, choose one of the two parameters. `[-i]` is the private key of the root user (or other users specified by `--user`) that has access to the target machine. `[-p]` is used to input the user password interactively.
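
As a sketch of the non-root alternative described above (the `tidb` user name and key path are illustrative only):

```shell
tiup cluster deploy tidb-test v6.6.0 ./topology.yaml --user tidb -i /home/tidb/.ssh/id_rsa
```
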
4 changes: 2 additions & 2 deletions scale-tidb-using-tiup.md
@@ -274,9 +274,9 @@ This section exemplifies how to remove a TiKV node from the `10.0.1.5` host.
```
```
- Starting /root/.tiup/components/cluster/v1.11.0/cluster display <cluster-name>
+ Starting /root/.tiup/components/cluster/v1.11.3/cluster display <cluster-name>
TiDB Cluster: <cluster-name>
- TiDB Version: v6.5.0
+ TiDB Version: v6.6.0
ID Role Host Ports Status Data Dir Deploy Dir
-- ---- ---- ----- ------ -------- ----------
10.0.1.3:8300 cdc 10.0.1.3 8300 Up data/cdc-8300 deploy/cdc-8300
2 changes: 1 addition & 1 deletion ticdc/deploy-ticdc.md
@@ -106,7 +106,7 @@ When you upgrade a TiCDC cluster, you need to pay attention to the following:
- Since v6.3.0, TiCDC supports rolling upgrade. During the upgrade, the replication latency is stable and does not fluctuate significantly. Rolling upgrade takes effect automatically if the following conditions are met:

- TiCDC is v6.3.0 or later.
- - TiUP is v1.11.0 or later.
+ - TiUP is v1.11.3 or later.
- At least two TiCDC instances are running in the cluster.
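
When these conditions are met, the rolling upgrade is triggered by the usual upgrade command; a sketch (the cluster name is a placeholder):

```shell
tiup cluster upgrade <cluster-name> v6.6.0
```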

## Modify TiCDC cluster configurations using TiUP
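
A minimal sketch of this workflow, assuming the standard `tiup cluster edit-config` and `tiup cluster reload` subcommands (the cluster name is a placeholder):

```shell
# Edit the cluster configuration, then reload only the TiCDC (cdc) nodes.
tiup cluster edit-config <cluster-name>
tiup cluster reload <cluster-name> -R cdc
```
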
36 changes: 18 additions & 18 deletions tiup/tiup-cluster.md
@@ -17,7 +17,7 @@ tiup cluster
```

```
- Starting component `cluster`: /home/tidb/.tiup/components/cluster/v1.11.0/cluster
+ Starting component `cluster`: /home/tidb/.tiup/components/cluster/v1.11.3/cluster
Deploy a TiDB cluster for production
Usage:
@@ -119,20 +119,20 @@ tidb_servers:
...
```

- Save the file as `/tmp/topology.yaml`. If you want to use TiDB v6.5.0 and your cluster name is `prod-cluster`, run the following command:
+ Save the file as `/tmp/topology.yaml`. If you want to use TiDB v6.6.0 and your cluster name is `prod-cluster`, run the following command:

{{< copyable "shell-regular" >}}

```shell
- tiup cluster deploy -p prod-cluster v6.5.0 /tmp/topology.yaml
+ tiup cluster deploy -p prod-cluster v6.6.0 /tmp/topology.yaml
```

During the execution, TiUP asks you to confirm your topology again and requires the root password of the target machine (the `-p` flag means inputting password):

```bash
Please confirm your topology:
TiDB Cluster: prod-cluster
- TiDB Version: v6.5.0
+ TiDB Version: v6.6.0
Type Host Ports OS/Arch Directories
---- ---- ----- ------- -----------
pd 172.16.5.134 2379/2380 linux/x86_64 deploy/pd-2379,data/pd-2379
@@ -172,10 +172,10 @@ tiup cluster list
```

```
- Starting /root/.tiup/components/cluster/v1.11.0/cluster list
+ Starting /root/.tiup/components/cluster/v1.11.3/cluster list
Name User Version Path PrivateKey
---- ---- ------- ---- ----------
- prod-cluster tidb v6.5.0 /root/.tiup/storage/cluster/clusters/prod-cluster /root/.tiup/storage/cluster/clusters/prod-cluster/ssh/id_rsa
+ prod-cluster tidb v6.6.0 /root/.tiup/storage/cluster/clusters/prod-cluster /root/.tiup/storage/cluster/clusters/prod-cluster/ssh/id_rsa
```

## Start the cluster
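
A sketch of starting the cluster deployed above:

```shell
tiup cluster start prod-cluster
```
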
@@ -203,9 +203,9 @@ tiup cluster display prod-cluster
```

```
- Starting /root/.tiup/components/cluster/v1.11.0/cluster display prod-cluster
+ Starting /root/.tiup/components/cluster/v1.11.3/cluster display prod-cluster
TiDB Cluster: prod-cluster
- TiDB Version: v6.5.0
+ TiDB Version: v6.6.0
ID Role Host Ports OS/Arch Status Data Dir Deploy Dir
-- ---- ---- ----- ------- ------ -------- ----------
172.16.5.134:3000 grafana 172.16.5.134 3000 linux/x86_64 Up - deploy/grafana-3000
@@ -277,9 +277,9 @@ tiup cluster display prod-cluster
```

```
- Starting /root/.tiup/components/cluster/v1.11.0/cluster display prod-cluster
+ Starting /root/.tiup/components/cluster/v1.11.3/cluster display prod-cluster
TiDB Cluster: prod-cluster
- TiDB Version: v6.5.0
+ TiDB Version: v6.6.0
ID Role Host Ports OS/Arch Status Data Dir Deploy Dir
-- ---- ---- ----- ------- ------ -------- ----------
172.16.5.134:3000 grafana 172.16.5.134 3000 linux/x86_64 Up - deploy/grafana-3000
@@ -390,12 +390,12 @@ Global Flags:
-y, --yes Skip all confirmations and assumes 'yes'
```

- For example, the following command upgrades the cluster to v6.5.0:
+ For example, the following command upgrades the cluster to v6.6.0:

{{< copyable "shell-regular" >}}

```bash
- tiup cluster upgrade tidb-test v6.5.0
+ tiup cluster upgrade tidb-test v6.6.0
```

## Update configuration
@@ -574,14 +574,14 @@ tiup cluster audit
```

```
- Starting component `cluster`: /home/tidb/.tiup/components/cluster/v1.11.0/cluster audit
+ Starting component `cluster`: /home/tidb/.tiup/components/cluster/v1.11.3/cluster audit
ID Time Command
-- ---- -------
- 4BLhr0 2022-12-29T13:25:09+08:00 /home/tidb/.tiup/components/cluster/v1.11.0/cluster deploy test v6.5.0 /tmp/topology.yaml
- 4BKWjF 2022-12-29T23:36:57+08:00 /home/tidb/.tiup/components/cluster/v1.11.0/cluster deploy test v6.5.0 /tmp/topology.yaml
- 4BKVwH 2022-12-29T23:02:08+08:00 /home/tidb/.tiup/components/cluster/v1.11.0/cluster deploy test v6.5.0 /tmp/topology.yaml
- 4BKKH1 2022-12-29T16:39:04+08:00 /home/tidb/.tiup/components/cluster/v1.11.0/cluster destroy test
- 4BKKDx 2022-12-29T16:36:57+08:00 /home/tidb/.tiup/components/cluster/v1.11.0/cluster deploy test v6.5.0 /tmp/topology.yaml
+ 4BLhr0 2022-02-09T23:55:09+08:00 /home/tidb/.tiup/components/cluster/v1.11.3/cluster deploy test v6.6.0 /tmp/topology.yaml
+ 4BKWjF 2022-02-09T23:36:57+08:00 /home/tidb/.tiup/components/cluster/v1.11.3/cluster deploy test v6.6.0 /tmp/topology.yaml
+ 4BKVwH 2022-02-09T23:02:08+08:00 /home/tidb/.tiup/components/cluster/v1.11.3/cluster deploy test v6.6.0 /tmp/topology.yaml
+ 4BKKH1 2022-02-09T16:39:04+08:00 /home/tidb/.tiup/components/cluster/v1.11.3/cluster destroy test
+ 4BKKDx 2022-02-09T16:36:57+08:00 /home/tidb/.tiup/components/cluster/v1.11.3/cluster deploy test v6.6.0 /tmp/topology.yaml
```

The first column is `audit-id`. To view the execution log of a certain command, pass the `audit-id` of a command as the flag as follows:
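
For instance, a sketch using one of the `audit-id` values shown above:

```shell
tiup cluster audit 4BLhr0
```
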
2 changes: 1 addition & 1 deletion tiup/tiup-component-cluster-deploy.md
@@ -13,7 +13,7 @@ tiup cluster deploy <cluster-name> <version> <topology.yaml> [flags]
```

- `<cluster-name>`: the name of the new cluster, which cannot be the same as the existing cluster names.
- - `<version>`: the version number of the TiDB cluster to deploy, such as `v6.5.0`.
+ - `<version>`: the version number of the TiDB cluster to deploy, such as `v6.6.0`.
- `<topology.yaml>`: the prepared [topology file](/tiup/tiup-cluster-topology-reference.md).
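
A sketch of a typical invocation of the syntax above (the cluster name and topology path are placeholders):

```shell
tiup cluster deploy tidb-test v6.6.0 /tmp/topology.yaml --user root -p
```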

## Options
12 changes: 6 additions & 6 deletions tiup/tiup-component-management.md
@@ -70,12 +70,12 @@ Example 2: Use TiUP to install the nightly version of TiDB.
tiup install tidb:nightly
```

- Example 3: Use TiUP to install TiKV v6.5.0.
+ Example 3: Use TiUP to install TiKV v6.6.0.

{{< copyable "shell-regular" >}}

```shell
- tiup install tikv:v6.5.0
+ tiup install tikv:v6.6.0
```

## Upgrade components
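
A sketch of the upgrade commands this section covers, assuming the standard `tiup update` subcommand:

```shell
tiup update tidb    # upgrade a single component to its latest version
tiup update --all   # upgrade all installed components
```
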
@@ -128,12 +128,12 @@ Before the component is started, TiUP creates a directory for it, and then puts

If you want to start the same component multiple times and reuse the previous working directory, you can use `--tag` to specify the same name when the component is started. After the tag is specified, the working directory will *not be automatically deleted* when the instance is terminated, which makes it convenient to reuse the working directory.

- Example 1: Operate TiDB v6.5.0.
+ Example 1: Operate TiDB v6.6.0.

{{< copyable "shell-regular" >}}

```shell
- tiup tidb:v6.5.0
+ tiup tidb:v6.6.0
```

Example 2: Specify the tag with which TiKV operates.
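
A sketch of such a command, assuming an illustrative tag name `experiment`:

```shell
tiup --tag experiment tikv:v6.6.0
```
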
@@ -219,12 +219,12 @@ The following flags are supported in this command:
- If the version is ignored, adding `--all` means to uninstall all versions of this component.
- If the version and the component are both ignored, adding `--all` means to uninstall all components of all versions.

- Example 1: Uninstall TiDB v6.5.0.
+ Example 1: Uninstall TiDB v6.6.0.

{{< copyable "shell-regular" >}}

```shell
- tiup uninstall tidb:v6.5.0
+ tiup uninstall tidb:v6.6.0
```

Example 2: Uninstall TiKV of all versions.
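
A sketch using the `--all` flag described above:

```shell
tiup uninstall tikv --all
```
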
6 changes: 3 additions & 3 deletions tiup/tiup-mirror.md
@@ -87,9 +87,9 @@ The `tiup mirror clone` command provides many optional flags (might provide more

If you want to clone only one version (not all versions) of a component, use `--<component>=<version>` to specify this version. For example:

- - Execute the `tiup mirror clone <target-dir> --tidb v6.5.0` command to clone the v6.5.0 version of the TiDB component.
- - Run the `tiup mirror clone <target-dir> --tidb v6.5.0 --tikv all` command to clone the v6.5.0 version of the TiDB component and all versions of the TiKV component.
- - Run the `tiup mirror clone <target-dir> v6.5.0` command to clone the v6.5.0 version of all components in a cluster.
+ - Execute the `tiup mirror clone <target-dir> --tidb v6.6.0` command to clone the v6.6.0 version of the TiDB component.
+ - Run the `tiup mirror clone <target-dir> --tidb v6.6.0 --tikv all` command to clone the v6.6.0 version of the TiDB component and all versions of the TiKV component.
+ - Run the `tiup mirror clone <target-dir> v6.6.0` command to clone the v6.6.0 version of all components in a cluster.

After cloning, signing keys are set up automatically.

4 changes: 2 additions & 2 deletions tiup/tiup-playground.md
@@ -20,9 +20,9 @@ If you directly execute the `tiup playground` command, TiUP uses the locally ins

This command actually performs the following operations:

- - Because this command does not specify the version of the playground component, TiUP first checks the latest version of the installed playground component. Assume that the latest version is v1.11.0, then this command works the same as `tiup playground:v1.11.0`.
+ - Because this command does not specify the version of the playground component, TiUP first checks the latest version of the installed playground component. Assume that the latest version is v1.11.3, then this command works the same as `tiup playground:v1.11.3`.
- If you have not used TiUP playground to install the TiDB, TiKV, and PD components, the playground component installs the latest stable version of these components, and then starts these instances.
- - Because this command does not specify the version of the TiDB, PD, and TiKV component, TiUP playground uses the latest version of each component by default. Assume that the latest version is v6.5.0, then this command works the same as `tiup playground:v1.11.0 v6.5.0`.
+ - Because this command does not specify the version of the TiDB, PD, and TiKV component, TiUP playground uses the latest version of each component by default. Assume that the latest version is v6.6.0, then this command works the same as `tiup playground:v1.11.3 v6.6.0`.
- Because this command does not specify the number of each component, TiUP playground, by default, starts the smallest cluster, which consists of one TiDB instance, one TiKV instance, one PD instance, and one TiFlash instance.
- After starting each TiDB component, TiUP playground reminds you that the cluster is successfully started and provides you some useful information, such as how to connect to the TiDB cluster through the MySQL client and how to access the [TiDB Dashboard](/dashboard/dashboard-intro.md).
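
To pin the version or instance counts explicitly instead of relying on the defaults, a sketch (the `--db`, `--pd`, and `--kv` flags are assumed here to set the number of instances):

```shell
tiup playground v6.6.0 --db 2 --pd 3 --kv 3
```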
