Update README.md
bkribbs15 committed May 17, 2021
1 parent 5fa10f3 commit 336de60
This Node.js application is part of the Inventory Service for the IBM Cloud Native Toolkit Journey. It allows users to produce a message to a Kafka topic, notifying all consumers that an update to an item in inventory has occurred.
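For illustration, an inventory-update event published to the topic might look like the following sketch. The field names are hypothetical, not the service's actual schema:

```javascript
// Hypothetical inventory-update event; the real schema in this repo may differ.
const event = {
  itemId: 'abc-123',
  action: 'stock_updated',
  quantity: 42,
  timestamp: new Date().toISOString(),
};

// Kafka producers send string (or Buffer) values, so serialize to JSON.
const message = { key: event.itemId, value: JSON.stringify(event) };
console.log(message.value);
```

Keying the message by `itemId` keeps all updates for one item on the same partition, so consumers see them in order.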

<h2 align="Left">
Confluent Setup
</h2>

<details>
<summary><span style="font-size:18px">Operator Setup</span></summary>

Follow the instructions at the following link to set up [Confluent](https://github.ibm.com/ben-cornwell/confluent-operator) on OpenShift.

Be sure to record the `global.sasl.plain.username` and `global.sasl.plain.password` from the `values` file in the `confluent-operator` directory for the `Secret Creation` step below.

Once the operator has finished installing, copy the `confluentCA.key` and `confluentCA.pem` files and move them to a convenient location. Both will be needed for the `Secret Creation` step as well.

</details>

<details>

<summary><span style="font-size:18px">Secret Creation</span></summary>

Secrets are needed to connect your Kafka client to the running instance of Kafka. **Two** secrets must be created.

The first will be named `confluent-kafka-cert`. Use the following command to create the secret:

Once the operator has finished installing, copy the `confluentCA.key` and `confluentCA.pem` and move it to a convient location for you to access. Both will be needed for the `Secret Creation` step as well.
```bash
oc create secret tls confluent-kafka-cert --cert='./~PATH TO PEM~/confluentCA.pem' --key='./~PATH TO KEY~/confluentCA.key' -n NAMESPACE
```

*Replace `PATH TO PEM` and `PATH TO KEY` with the directory paths to the files, and `NAMESPACE` with the namespace where you want the secret deployed.*

The second secret will be named `kafka-operator-key`. Use the following command to create it:

```bash
oc create secret generic kafka-operator-key --from-literal=username=GLOBAL.SASL.PLAIN.USERNAME --from-literal=password=GLOBAL.SASL.PLAIN.PASSWORD -n NAMESPACE
```

*Replace `GLOBAL.SASL.PLAIN.USERNAME` and `GLOBAL.SASL.PLAIN.PASSWORD` with the values recorded in the `Operator Setup` step, and `NAMESPACE` with the namespace where you want the secret deployed.*
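Once created, you can sanity-check what a secret holds. Kubernetes stores secret data base64-encoded, so decode it when inspecting; the `oc get secret` line is shown as a comment because it needs a live cluster, and the sample values below are purely illustrative:

```shell
# On a live cluster you could inspect the stored username with:
#   oc get secret kafka-operator-key -n NAMESPACE -o jsonpath='{.data.username}' | base64 -d
# The stored data is plain base64, for example:
echo -n 'test' | base64          # prints dGVzdA==
echo -n 'dGVzdA==' | base64 -d   # prints test
```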

</details>

<details>
<summary><span style="font-size:18px">Client Configuration</span></summary>

First, we need to set up the `clusterDev` configuration for the newly deployed services.

Open the file `/src/env/clusterDev.js`. **Modify** the following capitalized parameters to match your deployment.

```javascript
kafka: {
  TOPIC: 'YOUR TOPIC',
  BROKERS: ['kafka.NAMESPACE.svc:9071'],
  GROUPID: 'GROUPID',
  CLIENTID: 'CLIENTID',
  SASLMECH: 'plain',
  CONNECTIONTIMEOUT: 3000,
  AUTHENTICATIONTIMEOUT: 1000,
  REAUTHENTICATIONTHRESHOLD: 10000,
  RETRIES: 3,
  MAXRETRYTIME: 5
}
```

Check out the [documentation](https://kafka.js.org/docs/configuration) for details about the other parameters.
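As a sketch of how these values might map onto a KafkaJS client, the helper below follows the option names in the KafkaJS docs linked above; the exact wiring in this repo may differ, and the sample values are hypothetical:

```javascript
// Hypothetical mapping from the env-file keys to KafkaJS client options.
const cfg = {
  TOPIC: 'inventory-topic',
  BROKERS: ['kafka.my-namespace.svc:9071'],
  GROUPID: 'inventory-group',
  CLIENTID: 'inventory-client',
  SASLMECH: 'plain',
  CONNECTIONTIMEOUT: 3000,
  AUTHENTICATIONTIMEOUT: 1000,
  REAUTHENTICATIONTHRESHOLD: 10000,
  RETRIES: 3,
  MAXRETRYTIME: 5,
};

function toKafkaJsOptions(c) {
  return {
    clientId: c.CLIENTID,
    brokers: c.BROKERS,
    connectionTimeout: c.CONNECTIONTIMEOUT,
    authenticationTimeout: c.AUTHENTICATIONTIMEOUT,
    reauthenticationThreshold: c.REAUTHENTICATIONTHRESHOLD,
    retry: { retries: c.RETRIES, maxRetryTime: c.MAXRETRYTIME },
    // The sasl/ssl credentials come from the secrets created earlier.
  };
}

console.log(toKafkaJsOptions(cfg).clientId); // prints inventory-client
```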

</details>

<h2 align="Left">
Local Setup
</h2>

<details>
<summary><span style="font-size:18px">Kafka Setup</span></summary>

Make sure you have an instance of Kafka running, either locally or remotely.

Follow the instructions [here](https://kafka.apache.org/quickstart) to run Kafka locally.

</details>

<details>
<summary><span style="font-size:18px">Local Client Configuration</span></summary>

First, we need to set up the `localDev` configuration for running the service locally.

Open the file `/src/env/localDev.js`. **Modify** the following capitalized parameters to match your deployment.

```javascript
kafka: {
  TOPIC: 'YOUR TOPIC',
  BROKERS: ['localhost:9092'],
  GROUPID: 'GROUPID',
  CLIENTID: 'CLIENTID',
  CONNECTIONTIMEOUT: 3000,
  AUTHENTICATIONTIMEOUT: 1000,
  REAUTHENTICATIONTHRESHOLD: 10000,
  RETRIES: 3,
  MAXRETRYTIME: 5
}
```
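A common pattern, and only an assumption about how this app wires things up, is to select between the `localDev` and `clusterDev` env files by environment variable:

```javascript
// Hypothetical loader choosing an env config by NODE_ENV; the actual
// loader in /src/env may differ.
const configs = {
  localDev: { BROKERS: ['localhost:9092'] },
  clusterDev: { BROKERS: ['kafka.my-namespace.svc:9071'] },
};

function loadConfig(env = process.env.NODE_ENV || 'localDev') {
  const cfg = configs[env];
  if (!cfg) throw new Error(`Unknown environment: ${env}`);
  return cfg;
}

console.log(loadConfig('localDev').BROKERS[0]); // prints localhost:9092
```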

</details>

<details>

<summary><span style="font-size:18px">Setting Up the Client</span></summary>

Install the dependencies:

```bash
npm install
```

To start the server, run:

```bash
npm run dev
```

Access the Swagger page at `http://localhost:3000`.

</details>

<h2 align="Left">
Strimzi Setup
</h2>

<details>
<summary><span style="font-size:18px">Coming Soon...</span></summary>

</details>

<h2 align="Left">
Contributors
</h2>
