Commit 4363bec

Author: vikasrohit
Merge pull request #4 from topcoder-platform/feature/use-projects-api
Feature - Replace direct Project DB queries with Projects API calls
2 parents: 7fc28ba + 88e9494

16 files changed (+226, -130 lines)

.gitignore

Lines changed: 1 addition & 0 deletions

@@ -5,3 +5,4 @@ node_modules
 .env
 .nyc_output
 coverage/
+docker/api.env

ReadMe.md

Lines changed: 12 additions & 18 deletions

@@ -5,7 +5,6 @@
 - nodejs https://nodejs.org/en/ (v8)
 - Kafka
 - Informix
-- Postgres
 - Docker, Docker Compose

 ## Configuration

@@ -25,9 +24,15 @@ The following parameters can be set in config files or in env variables:
 - UPDATE_PROJECT_TOPIC: update project Kafka topic, default value is 'project.action.update'
 - DELETE_PROJECT_TOPIC: delete project member Kafka topic, default value is 'project.action.delete'
 - INFORMIX: Informix database configuration parameters, refer `config/default.js` for more information
-- POSTGRES: Postgres database configuration parameters, refer `config/default.js` for more information
+- AUTH0_URL: AUTH0 URL, used to get M2M token
+- AUTH0_PROXY_SERVER_URL: AUTH0 proxy server URL, used to get M2M token
+- AUTH0_AUDIENCE: AUTH0 audience, used to get M2M token
+- TOKEN_CACHE_TIME: AUTH0 token cache time, used to get M2M token
+- AUTH0_CLIENT_ID: AUTH0 client id, used to get M2M token
+- AUTH0_CLIENT_SECRET: AUTH0 client secret, used to get M2M token
+- PROJECTS_API: the Topcoder Projects API URL

-generally, we only need to update INFORMIX_HOST, KAFKA_URL and POSTGRES_URL via environment variables, see INFORMIX_HOST, KAFKA_URL and POSTGRES_URL parameter in docker/sample.api.env
+Generally, we only need to update INFORMIX_HOST, KAFKA_URL, PROJECTS_API and the M2M-related configuration via environment variables; see the parameters in docker/sample.api.env

 There is a `/health` endpoint that checks for the health of the app. This sets up an expressjs server and listens on the environment variable `PORT`. It's not part of the configuration file and needs to be passed as an environment variable

@@ -68,17 +73,6 @@ We will use Topcoder Informix database setup on Docker.

 Go to `docker-ifx` folder and run `docker-compose up`

-## Postgres database setup
-
-- Checkout tc-project-service `v5-upgrade` branch
-```bash
-git clone https://github.com/topcoder-platform/tc-project-service.git
-git checkout v5-upgrade
-```
-- Modify `dbConfig.masterUrl` in `config/default.json`
-- Run command `npm install` to install dependencies
-- Run command `npm run sync:db` to create tables on Postgres database
-
 ## Local deployment
 - Given the fact that the library used to access Informix DB depends on Informix Client SDK.
 We will run the application on Docker using a base image with Informix Client SDK installed and properly configured.

@@ -88,15 +82,15 @@ For deployment, please refer to next section 'Local Deployment with Docker'

 To run the Legacy Project Processor using docker, follow the steps below

-1. Make sure that Kafka, Postgres and Informix are running as per instructions above.
+1. Make sure that Kafka, the Project Service and Informix are running as per the instructions above.

 2. Go to `docker` folder

-3. Rename the file `sample.api.env` to `api.env` And properly update the IP addresses to match your environment for the variables : KAFKA_URL, INFORMIX_HOST and POSTGRES_URL( make sure to use IP address instead of hostname ( i.e localhost will not work)).Here is an example:
+3. Rename the file `sample.api.env` to `api.env`, then update the M2M-related configuration and the IP addresses for the variables KAFKA_URL, INFORMIX_HOST and PROJECTS_API to match your environment (make sure to use an IP address instead of a hostname, i.e. localhost will not work). Here is an example:
 ```
 KAFKA_URL=192.168.31.8:9092
 INFORMIX_HOST=192.168.31.8
-POSTGRES_URL=postgres://postgres:password@192.168.31.8:5432/postgres
+PROJECTS_API=192.168.31.8:8001/v5
 ```

 4. Once that is done, go to run the following command

@@ -109,7 +103,7 @@ docker-compose up

 ## Running e2e tests
 You need to run `docker-compose build` if you modify source files.
-Make sure run `docker-compose up` in `docker` folder once to make sure application will install dependencies and run successfully with Kafka, Postgres and Informix.
+Make sure to run `docker-compose up` in the `docker` folder once to confirm that the application installs its dependencies and runs successfully with Kafka and Informix.

 To run e2e tests
 Modify `docker/docker-compose.yml` with `command: run test` (uncomment it) and run `docker-compose up` in `docker` folder
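For reference, a completed `docker/api.env` for the setup in step 3 might look like the following, combining the example IP addresses above with the Auth0 placeholders from `docker/sample.api.env` (fill in real Auth0 values for your environment):

```
KAFKA_URL=192.168.31.8:9092
INFORMIX_HOST=192.168.31.8
PROJECTS_API=192.168.31.8:8001/v5
AUTH0_CLIENT_ID=<AUTH0 CLIENT ID>
AUTH0_CLIENT_SECRET=<AUTH0 CLIENT SECRET>
AUTH0_URL=<AUTH0 URL>
AUTH0_AUDIENCE=<AUTH0 AUDIENCE>
AUTH0_PROXY_SERVER_URL=<AUTH0 PROXY SERVER URL>
```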

Verification.md

Lines changed: 36 additions & 27 deletions

@@ -18,42 +18,42 @@ npm run test-data
 1. start kafka-console-producer to write messages to `project.action.create` topic:
 `bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.action.create`
 2. write message:
-`{ "topic": "project.action.create", "originator": "project-api", "timestamp": "2018-07-02T00:00:00", "mime-type": "application/json", "payload": { "resource": "project", "id": 1000, "name": "Develop website", "description": "<h>Test</h><p>This is description</p>", "directProjectId": null, "billingAccountId": 70015983, "type": "Web Application", "createdBy": 132458 } }`
+`{ "topic": "project.action.create", "originator": "project-api", "timestamp": "2018-07-02T00:00:00", "mime-type": "application/json", "payload": { "resource": "project", "id": 1, "name": "Develop website", "description": "<h>Test</h><p>This is description</p>", "directProjectId": null, "billingAccountId": 70015983, "type": "Web Application", "createdBy": 132458 } }`
 3. check the app console to verify message has been properly handled.
 4. Again, write another message (directProjectId is provided this time):
-`{ "topic": "project.action.create", "originator": "project-api", "timestamp": "2018-07-02T00:00:00", "mime-type": "application/json", "payload": { "resource": "project", "id": 1001, "name": "<h1>Test Project</h1>", "description": "<h>Test</h><p>This is description</p>", "directProjectId": 500, "billingAccountId": null, "type": "Web", "createdBy": 132458 } }`
+`{ "topic": "project.action.create", "originator": "project-api", "timestamp": "2018-07-02T00:00:00", "mime-type": "application/json", "payload": { "resource": "project", "id": 2, "name": "<h1>Test Project</h1>", "description": "<h>Test</h><p>This is description</p>", "directProjectId": 500, "billingAccountId": null, "type": "Web", "createdBy": 132458 } }`
 5. check the app console to verify message has been properly handled.
 6. Try to write an invalid message:
-`{ "topic": "project.action.create", "originator": "project-api", "timestamp": "2018-07-02T00:00:00", "mime-type": "application/json", "payload": { "resource": "project", "id": 1001, "name": "<h1>Test Project</h1>", "description": "<h>Test</h><p>This is description</p>", "directProjectId": 500, "billingAccountId": 100, "type": "Web", "createdBy": 132458 } }`
+`{ "topic": "project.action.create", "originator": "project-api", "timestamp": "2018-07-02T00:00:00", "mime-type": "application/json", "payload": { "resource": "project", "id": 2, "name": "<h1>Test Project</h1>", "description": "<h>Test</h><p>This is description</p>", "directProjectId": 500, "billingAccountId": 100, "type": "Web", "createdBy": 132458 } }`
 7. You will see error message in the app console.
 8. start kafka-console-producer to write messages to `project.action.update` topic:
 `bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.action.update`
 9. write message:
-`{ "topic": "project.action.update", "originator": "project-api", "timestamp": "2018-07-02T00:00:00", "mime-type": "application/json", "payload": { "resource": "project", "id": 1001, "directProjectId": 500, "billingAccountId": 70015984, "updatedBy": 132458 } }`
+`{ "topic": "project.action.update", "originator": "project-api", "timestamp": "2018-07-02T00:00:00", "mime-type": "application/json", "payload": { "resource": "project", "id": 2, "directProjectId": 500, "billingAccountId": 70015984, "updatedBy": 132458 } }`
 10. check the app console to verify message has been properly handled.
 11. Try to write an invalid message:
-`{ "topic": "project.action.update", "originator": "project-api", "timestamp": "2018-07-02T00:00:00", "mime-type": "application/json", "payload": { "resource": "project", "id": 1001, "directProjectId": 500, "billingAccountId": 1, "updatedBy": 132458 } }`
+`{ "topic": "project.action.update", "originator": "project-api", "timestamp": "2018-07-02T00:00:00", "mime-type": "application/json", "payload": { "resource": "project", "id": 2, "directProjectId": 500, "billingAccountId": 1, "updatedBy": 132458 } }`
 12. You will see error message in the app console.
 13. start kafka-console-producer to write messages to `project.action.create` topic:
 `bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.action.create`
 14. write messages:
-`{ "topic": "project.action.create", "originator": "project-api", "timestamp": "2018-07-02T00:00:00", "mime-type": "application/json", "payload": { "resource": "project.member", "projectId": 1001, "userId": 132457, "role": "copilot", "createdBy": 132458 } }`
+`{ "topic": "project.action.create", "originator": "project-api", "timestamp": "2018-07-02T00:00:00", "mime-type": "application/json", "payload": { "resource": "project.member", "projectId": 2, "userId": 132457, "role": "copilot", "createdBy": 132458 } }`

-`{ "topic": "project.action.create", "originator": "project-api", "timestamp": "2018-07-02T00:00:00", "mime-type": "application/json", "payload": { "resource": "project.member", "projectId": 1001, "userId": 124835, "role": "manager", "createdBy": 132458 } }`
+`{ "topic": "project.action.create", "originator": "project-api", "timestamp": "2018-07-02T00:00:00", "mime-type": "application/json", "payload": { "resource": "project.member", "projectId": 2, "userId": 124835, "role": "manager", "createdBy": 132458 } }`

-`{ "topic": "project.action.create", "originator": "project-api", "timestamp": "2018-07-02T00:00:00", "mime-type": "application/json", "payload": { "resource": "project.member", "projectId": 1001, "userId": 124836, "role": "account_manager", "createdBy": 132458 } }`
+`{ "topic": "project.action.create", "originator": "project-api", "timestamp": "2018-07-02T00:00:00", "mime-type": "application/json", "payload": { "resource": "project.member", "projectId": 2, "userId": 124836, "role": "account_manager", "createdBy": 132458 } }`

 15. check the app console to verify messages have been properly handled.
 16. Repeat step 14 again.
 17. You will see error messages in the app console.
 18. start kafka-console-producer to write messages to `project.action.delete` topic:
 `bin/kafka-console-producer.sh --broker-list localhost:9092 --topic project.action.delete`
 19. write messages:
-`{ "topic": "project.action.delete", "originator": "project-api", "timestamp": "2018-07-02T00:00:00", "mime-type": "application/json", "payload": { "resource": "project.member", "projectId": 1001, "userId": 132457, "role": "copilot", "deletedBy": 132458 } }`
+`{ "topic": "project.action.delete", "originator": "project-api", "timestamp": "2018-07-02T00:00:00", "mime-type": "application/json", "payload": { "resource": "project.member", "projectId": 2, "userId": 132457, "role": "copilot", "deletedBy": 132458 } }`

-`{ "topic": "project.action.delete", "originator": "project-api", "timestamp": "2018-07-02T00:00:00", "mime-type": "application/json", "payload": { "resource": "project.member", "projectId": 1001, "userId": 124835, "role": "manager", "deletedBy": 132458 } }`
+`{ "topic": "project.action.delete", "originator": "project-api", "timestamp": "2018-07-02T00:00:00", "mime-type": "application/json", "payload": { "resource": "project.member", "projectId": 2, "userId": 124835, "role": "manager", "deletedBy": 132458 } }`

-`{ "topic": "project.action.delete", "originator": "project-api", "timestamp": "2018-07-02T00:00:00", "mime-type": "application/json", "payload": { "resource": "project.member", "projectId": 1001, "userId": 124836, "role": "account_manager", "deletedBy": 132458 } }`
+`{ "topic": "project.action.delete", "originator": "project-api", "timestamp": "2018-07-02T00:00:00", "mime-type": "application/json", "payload": { "resource": "project.member", "projectId": 2, "userId": 124836, "role": "account_manager", "deletedBy": 132458 } }`

 20. check the app console to verify messages have been properly handled.
 21. Repeat step 14 again.

@@ -77,20 +77,29 @@ select * from projects;

 ## E2E tests coverage

-103 passing (3m)
+```
+103 passing (2m)

-File                  |  % Stmts | % Branch |  % Funcs |  % Lines | Uncovered Line #s
-----------------------|----------|----------|----------|----------|------------------
-All files             |    98.23 |    91.98 |      100 |    98.21 |
- config               |      100 |    89.74 |      100 |      100 |
-  default.js          |      100 |    89.74 |      100 |      100 | 8,25,36
-  test.js             |      100 |      100 |      100 |      100 |
- src                  |    98.57 |       85 |      100 |    98.51 |
-  app.js              |    98.41 |       85 |      100 |    98.39 | 85
-  bootstrap.js        |      100 |      100 |      100 |      100 |
-  constants.js        |      100 |      100 |      100 |      100 |
- src/common           |    92.59 |    70.83 |      100 |    92.59 |
-  helper.js           |      100 |      100 |      100 |      100 |
-  logger.js           |    90.63 |       65 |      100 |    90.63 | 32,55,60,84,98,118
- src/services         |    99.67 |    99.04 |      100 |    99.66 |
-  ProcessorService.js |    99.67 |    99.04 |      100 |    99.66 | 875
+
+> legacy-project-processor@1.0.0 cover:report /legacy-project-processor
+> nyc report --reporter=html --reporter=text
+
+----------------------|----------|----------|----------|----------|-------------------|
+File                  |  % Stmts | % Branch |  % Funcs |  % Lines | Uncovered Line #s |
+----------------------|----------|----------|----------|----------|-------------------|
+All files             |    96.75 |    91.01 |    96.72 |    96.72 |                   |
+ config               |      100 |    93.75 |      100 |      100 |                   |
+  default.js          |      100 |    93.75 |      100 |      100 |              8,25 |
+  test.js             |      100 |      100 |      100 |      100 |                   |
+ src                  |       90 |       75 |    71.43 |    89.55 |                   |
+  app.js              |    88.89 |       75 |       60 |    88.71 |... 87,88,89,90,92 |
+  bootstrap.js        |      100 |      100 |      100 |      100 |                   |
+  constants.js        |      100 |      100 |      100 |      100 |                   |
+ src/common           |     92.5 |    70.83 |      100 |     92.5 |                   |
+  helper.js           |      100 |      100 |      100 |      100 |                   |
+  logger.js           |    90.63 |       65 |      100 |    90.63 |32,55,60,84,98,118 |
+ src/services         |    99.35 |    98.04 |      100 |    99.35 |                   |
+  ProcessorService.js |    99.33 |    98.04 |      100 |    99.33 |           712,882 |
+  ProjectService.js   |      100 |      100 |      100 |      100 |                   |
+----------------------|----------|----------|----------|----------|-------------------|
+```

config/default.js

Lines changed: 9 additions & 8 deletions

@@ -31,12 +31,13 @@ module.exports = {
     POOL_MAX_SIZE: parseInt(process.env.IFX_POOL_MAX_SIZE) || 10 // use connection pool in processor, the pool size
   },

-  // postgres database configuration
-  POSTGRES: {
-    URL: process.env.POSTGRES_URL || 'postgres://coder:mysecretpassword@dockerhost:5432/projectsdb', // url
-    MAX_POOL_SIZE: parseInt(process.env.POSTGRES_MAX_POOL_SIZE) || 50, // max pool size
-    MIN_POOL_SIZE: parseInt(process.env.POSTGRES_MIN_POOL_SIZE) || 4, // min pool size
-    IDLE_TIME_OUT: parseInt(process.env.POSTGRES_IDLE_TIME_OUT) || 1000, // idle time
-    PROJECT_TABLE_NAME: 'projects' // project table name
-  }
+  // used to get M2M token
+  AUTH0_URL: process.env.AUTH0_URL,
+  AUTH0_PROXY_SERVER_URL: process.env.AUTH0_PROXY_SERVER_URL,
+  AUTH0_AUDIENCE: process.env.AUTH0_AUDIENCE,
+  TOKEN_CACHE_TIME: process.env.TOKEN_CACHE_TIME,
+  AUTH0_CLIENT_ID: process.env.AUTH0_CLIENT_ID,
+  AUTH0_CLIENT_SECRET: process.env.AUTH0_CLIENT_SECRET,
+
+  PROJECTS_API: process.env.PROJECTS_API || 'http://localhost:8001/v5'
 }
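To show how these new settings fit together, here is a minimal sketch of an M2M-authenticated call to the Projects API. It assumes the `tc-core-library-js` m2m helper (added to dependencies in this commit) and `superagent` for the HTTP request; the `getProject` wrapper below is illustrative only, not the repo's actual ProjectService code:

```js
const config = require('config')
const superagent = require('superagent')
const m2mAuth = require('tc-core-library-js').auth.m2m

// token factory built from the new AUTH0_* settings; tokens are cached for TOKEN_CACHE_TIME
const m2m = m2mAuth({
  AUTH0_URL: config.AUTH0_URL,
  AUTH0_AUDIENCE: config.AUTH0_AUDIENCE,
  TOKEN_CACHE_TIME: config.TOKEN_CACHE_TIME,
  AUTH0_PROXY_SERVER_URL: config.AUTH0_PROXY_SERVER_URL
})

// illustrative helper: read a project from the v5 Projects API instead of querying Postgres directly
async function getProject (projectId) {
  const token = await m2m.getMachineToken(config.AUTH0_CLIENT_ID, config.AUTH0_CLIENT_SECRET)
  const res = await superagent
    .get(`${config.PROJECTS_API}/projects/${projectId}`)
    .set('Authorization', `Bearer ${token}`)
  return res.body
}
```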

docker/api.env

Lines changed: 0 additions & 3 deletions
This file was deleted.

docker/docker-compose.yml

Lines changed: 4 additions & 4 deletions

@@ -6,8 +6,8 @@ services:
     build:
       context: ../
       dockerfile: docker/Dockerfile
-    # env_file:
-    #   - api.env
-    # command: run start
+    env_file:
+      - api.env
+    command: run start
     # command: run test
-    command: run test:cov
+    # command: run test:cov

docker/sample.api.env

Lines changed: 8 additions & 3 deletions

@@ -1,3 +1,8 @@
-KAFKA_URL=<KAFKA URL>
-INFORMIX_HOST=<INFORMIX HOST>
-POSTGRES_URL=<POSTGRES URL>
+KAFKA_URL=host.docker.internal:9092
+INFORMIX_HOST=host.docker.internal
+PROJECTS_API=host.docker.internal:8001/v5
+AUTH0_CLIENT_ID=<AUTH0 CLIENT ID>
+AUTH0_CLIENT_SECRET=<AUTH0 CLIENT SECRET>
+AUTH0_URL=<AUTH0 URL>
+AUTH0_AUDIENCE=<AUTH0 AUDIENCE>
+AUTH0_PROXY_SERVER_URL=<AUTH0 PROXY SERVER URL>

package.json

Lines changed: 9 additions & 9 deletions

@@ -9,21 +9,24 @@
     "lint:fix": "standard --fix",
     "init-db": "node scripts/init-db.js",
     "test-data": "node scripts/test-data.js",
-    "test": "mocha test/helper.test.js && mocha test/processor.test.js --timeout 20000 --exit",
+    "test": "mocha test/helper.test.js && mocha --require test/prepare.js test/processor.test.js --timeout 20000 --exit",
     "helper:test": "nyc --silent mocha test/helper.test.js --exit",
-    "processor:test": "nyc --silent --no-clean mocha test/processor.test.js --timeout 20000 --exit",
+    "processor:test": "nyc --silent --no-clean mocha --require test/prepare.js test/processor.test.js --timeout 20000 --exit",
     "cover:report": "nyc report --reporter=html --reporter=text",
     "test:cov": "npm run helper:test && npm run processor:test && npm run cover:report"
   },
   "author": "TCSCODER",
   "license": "none",
   "devDependencies": {
-    "should": "^13.2.3",
     "mocha": "^6.1.4",
+    "mocha-prepare": "^0.1.0",
+    "nock": "^11.7.0",
     "nyc": "^14.1.1",
-    "superagent": "^5.1.0",
+    "q": "^1.5.1",
+    "should": "^13.2.3",
+    "sinon": "^7.3.2",
     "standard": "^12.0.1",
-    "sinon": "^7.3.2"
+    "superagent": "^5.1.2"
   },
   "dependencies": {
     "@hapi/joi": "^15.1.0",

@@ -34,10 +37,7 @@
     "ifxnjs": "^8.0.1",
     "lodash": "^4.17.11",
     "no-kafka": "^3.4.3",
-    "pg": "^7.11.0",
-    "pg-hstore": "^2.3.3",
-    "q": "^1.5.1",
-    "sequelize": "^5.9.0",
+    "tc-core-library-js": "github:appirio-tech/tc-core-library-js#v2.6.3",
     "topcoder-healthcheck-dropin": "^1.0.3",
     "winston": "^3.2.1"
   },
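The new `mocha-prepare` and `nock` devDependencies suggest the processor tests now stub the Projects API over HTTP instead of seeding Postgres. A minimal sketch of a `test/prepare.js` wired in via `--require` could look like the following; the intercepted routes and canned responses are assumptions for illustration, not the repo's actual test fixtures:

```js
const prepare = require('mocha-prepare')
const nock = require('nock')

prepare(function (done) {
  // before the processor tests run, intercept every call to the Projects API host
  nock(/.*/)
    .persist()
    .get(/\/projects\/\d+$/) // e.g. GET <PROJECTS_API>/projects/2
    .reply(200, { id: 2, directProjectId: 500, billingAccountId: 70015983 })
    .patch(/\/projects\/\d+$/) // e.g. PATCH <PROJECTS_API>/projects/2 with the legacy directProjectId
    .reply(200, {})
  done()
}, function (done) {
  // after the tests finish, remove all interceptors
  nock.cleanAll()
  done()
})
```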

scripts/init-db.js

Lines changed: 1 addition & 2 deletions

@@ -3,11 +3,10 @@
  */

 require('../src/bootstrap')
-const { getPostgresConnection, getInformixConnection } = require('../src/common/helper')
+const { getInformixConnection } = require('../src/common/helper')
 const logger = require('../src/common/logger')

 async function initDB () {
-  await getPostgresConnection().query(`delete from projects`)
   const connection = await getInformixConnection()
   try {
     await connection.queryAsync(`delete from tcs_catalog:direct_project_metadata_audit`)

scripts/test-data.js

Lines changed: 1 addition & 4 deletions

@@ -3,13 +3,10 @@
  */

 require('../src/bootstrap')
-const { getPostgresConnection, getInformixConnection } = require('../src/common/helper')
+const { getInformixConnection } = require('../src/common/helper')
 const logger = require('../src/common/logger')

 async function insertData () {
-  await getPostgresConnection().query(`delete from projects`)
-  await getPostgresConnection().query(`insert into projects(id, name, description, terms, type, status, "createdBy", "updatedBy", version, "lastActivityAt", "lastActivityUserId") values(1000, 'name-1', 'description-1', '{1}', 'test', 'draft', 8547899, 8547899, '1.0', now(), '8547899')`)
-  await getPostgresConnection().query(`insert into projects(id, "directProjectId", name, description, terms, type, status, "createdBy", "updatedBy", version, "lastActivityAt", "lastActivityUserId") values(1001, 500, 'name-2', 'description-2', '{1}', 'test', 'draft', 8547899, 8547899, '1.0', now(), '8547899')`)
   const connection = await getInformixConnection()
   try {
     await connection.queryAsync(`delete from tcs_catalog:copilot_profile`)
