Pryv.io core server app components, i.e. what runs on each server node and handles user data.
Prerequisites:

- `make` and support for C/C++ compilation
  - Linux: e.g. `sudo apt-get install build-essential`
  - MacOS: e.g. `xcode-select --install` (installs command line tools)
- Node.js 16
- MongoDB 4.2 (needs at least 4GB of free disk space for the initial database)
  - MacOS & Linux: already included in `scripts/setup-dev-env`
- InfluxDB 1.2
  - Linux: `scripts/setup-influx`
  - MacOS: e.g. `brew install influxdb@1`
- nats-server
  - Linux: `scripts/setup-nats-server`
  - MacOS: e.g. `brew install nats-server`
- graphicsmagick (for image events preview)
  - Linux: e.g. `sudo apt-get install graphicsmagick`
  - MacOS: e.g. `brew install graphicsmagick`
- just
Then:

- `just setup-dev-env` to set up the local file structure and install MongoDB
- `just install [--no-optional]` to install node modules
The project is structured as a monorepo with components (a.k.a. workspaces in NPM), each component defining its `package.json`, tests, configs, etc. in `components/<name>/`.
Scripts are run with just:

`just <command> [params]`
Notes:

- Running `just` with no argument displays the commands defined in `justfile`, which should cover all usual needs. (Typical usage examples are included throughout this document.)
- `just` works consistently from anywhere within the project directory.
- No NPM scripts.
Everything should be accessible from the project root, including running commands on a particular component (typically via `just <command> <component> ...`). We keep things consistent across components, with as much as possible defined just once at the root level; in particular:

- All NPM dependencies are kept in the root `package.json`
- Only basic properties are kept in each component's `package.json`
The code follows the Semi-Standard style.
- Run `just lint` to check linting on the entire repo
- Run `just lint-changes` to only check modified files
- Add the `--fix` option to either of the above to automatically fix issues when possible
The servers and the tests depend on NATS server, MongoDB and InfluxDB. Run `just start-deps` to get them all running at once.
Note: audit tests related to syslog fail on ARM-64 (M1) Macs.
`just test <component> [...params]`

- `component` is an existing component's name, or `all` to run tests on all components
- Extra parameters at the end are passed on to Mocha (default settings are defined in `.mocharc.js` files)
- Replace `test` with `test-detailed`, `test-debug` or `test-cover` for common presets

For example:

- `just test all` to test all components using default settings
- `just test-detailed api-server --bail` to test component `api-server` with detailed output, stopping on the first test failure

Useful Mocha parameters:

- `--bail` stops on the first test failure
- `--grep <text>` only runs tests matching the given text (typically used with test ids, see below)

See the Mocha documentation for the full reference.
With env variables:

- `LOGS=<level>` to show spawned server instances' output (level: `info`, `warn` or `error`)
- `DEBUG="*"` to show debug information
`just tag-tests`

to tag yet-untagged test cases with a (hopefully) unique id for unambiguous reference. The script issues warnings if duplicate ids or programmatically generated ids are found (the latter escape the duplicates check). Note as well that the dev-site build will fail if there are missing or duplicate ids in the generated test results (see below).
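For illustration, here is a minimal sketch of the checks such a script performs. The bracketed `[XXXX]` id convention used below is an assumption for the example, not the actual Pryv.io format:

```javascript
// Sketch only: detect untagged titles and duplicate ids in a list of test titles.
// The [XXXX] id convention is hypothetical, not the actual Pryv.io tag format.
const TAG_RE = /^\[([A-Z0-9]{4})\]/;

function findTagIssues (titles) {
  const seen = new Set();
  const untagged = [];
  const duplicates = [];
  for (const title of titles) {
    const m = TAG_RE.exec(title);
    if (m == null) { untagged.push(title); continue; } // would get an id assigned
    if (seen.has(m[1])) duplicates.push(m[1]); // would trigger a warning
    seen.add(m[1]);
  }
  return { untagged, duplicates };
}

const issues = findTagIssues([
  '[AB12] must return events',
  '[AB12] must refuse invalid tokens', // duplicate id
  'must paginate results' // untagged
]);
console.log(issues.untagged); // ['must paginate results']
console.log(issues.duplicates); // ['AB12']
```

Once tagged, a test case can be targeted unambiguously, e.g. `just test api-server --grep AB12`.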
`just test-results-(init-repo|generate|upload)`

Test results are kept in the dev-test-results repository and published on the dev site.

- `just test-results-init-repo` to checkout the repository locally
- `just test-results-generate` to run the test suite and save the results to `test-results/service-core/${TAG_VERSION}/${TIMESTAMP}-service-core.json`
- `just test-results-upload` to upload the results
Add your breakpoints, then run `just test-debug` to run tests in debug mode.
For debugging by hand, old-school:

- Print server 500 errors: uncomment the line containing `uncomment to log 500 errors on test running using InstanceManager` in `…/errorHandling.js`
- Print server `console.log` output: uncomment the line with `stdio: 'inherit'` in `…/InstanceManager.js`
Run `just trace` to start the tracing service (Jaeger).
Run `just run api-server migrate` to trigger data migration. Migrations are defined in the storage component.
See dedicated README.
Run `just security-assessment` to run the security assessment and write output to `security-assessment` (assumes coverage data to be present).

See the other `just security-assessment-*` commands for what's available. Some require additional software such as OWASP ZAP, Docker engine and Grype.
Sometimes it's necessary to work on core and e.g. @pryv/boiler or @pryv/datastore at the same time.

- Open the working copies of core and the desired package(s) in the same workspace (e.g. for VSCode, from the parent folder, run `code service-core pryv-datastore`)
- From `service-core`, run `npx link {package working copy path}` (e.g. `npx link ../pryv-datastore`)

When you're done with the side-by-side work, run `just install` to clean up and resume using the regular package dependencies.
Components supporting configuration load their settings from (last takes precedence):

- Default values, as defined in the base configuration or the component's own
- A JSON file specified by setting `config`, defaulting to `config/{env}.json` (`env` can be `production`, `development` or `test`); typically used for per-environment settings. This variant is deprecated and will be phased out.
- An additional "overrides" JSON file specified by setting `configOverrides`; typically used for confidential settings (e.g. keys, secrets). This variant is deprecated and will be phased out.
- Environment variables; default naming scheme: `PRYV_SETTING_KEY_PATH` (for example, `PRYV_DATABASE_HOST` for `database` → `host`). This variant is deprecated and will be phased out.
- Command-line options as `--key=value`; default naming scheme: `setting:key:path` (for example, `database:host` for `database` → `host`). This variant is deprecated and will be phased out.

To specify a configuration file, use `--config` with a relative or absolute path. This will be the way to configure Pryv.io for the near future.
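To illustrate the precedence order above, here is a self-contained sketch of how the sources combine. This is not the actual Pryv.io loader; apart from the `database` → `host` setting mentioned above, all names and values are hypothetical:

```javascript
// Illustrative sketch of the settings precedence described above; not the actual loader.
function deepMerge (target, source) {
  for (const [key, value] of Object.entries(source)) {
    if (value != null && typeof value === 'object' && !Array.isArray(value)) {
      target[key] = deepMerge(target[key] || {}, value);
    } else {
      target[key] = value;
    }
  }
  return target;
}

// Set a value at a nested path like ['database', 'host']
function setPath (obj, path, value) {
  let node = obj;
  for (const key of path.slice(0, -1)) node = node[key] = node[key] || {};
  node[path[path.length - 1]] = value;
}

function loadSettings ({ defaults = {}, file = {}, env = {}, args = [] }) {
  const settings = deepMerge({}, defaults); // 1. default values
  deepMerge(settings, file); // 2. JSON file ("config" setting)
  for (const [name, value] of Object.entries(env)) { // 3. PRYV_* environment variables
    if (name.startsWith('PRYV_')) {
      setPath(settings, name.slice('PRYV_'.length).toLowerCase().split('_'), value);
    }
  }
  for (const arg of args) { // 4. --setting:key:path=value options (last wins)
    const m = /^--([^=]+)=(.*)$/.exec(arg);
    if (m != null) setPath(settings, m[1].split(':'), m[2]);
  }
  return settings;
}

const settings = loadSettings({
  defaults: { database: { host: 'localhost', port: 27017 } },
  env: { PRYV_DATABASE_HOST: 'db.internal' },
  args: ['--database:host=db.example.com']
});
console.log(settings.database.host); // 'db.example.com' (command line wins)
console.log(settings.database.port); // 27017 (untouched default)
```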
Those components also accept the following command-line options:

- `--help` displays all available configuration settings as a schema structure (and exits)
- `--printConfig` prints the configuration settings actually loaded (e.g. for debugging purposes)
It is possible to extend the API and previews servers with custom code, via the configuration keys defined under `customExtensions`:

- `defaultFolder`: The folder in which custom extension modules are searched for by default. Unless defined by its specific setting (see other settings in `customExtensions`), each module is loaded from there by its default name (e.g. `customAuthStepFn.js`), or ignored if missing. Defaults to `{app root}/custom-extensions`.
- `customAuthStepFn`: A Node module identifier (e.g. `/custom/auth/function.js`) implementing a custom auth step (such as authenticating the caller id against an external service). The function is passed the method context, which it can alter, and a callback to be called with either no argument (success) or an error (failure). If this setting is not empty and the specified module cannot be loaded as a function, server startup will fail. Undefined by default.

  ```javascript
  // Example of customAuthStepFn.js
  module.exports = function (context, callback) {
    // do whatever is needed here (check LDAP, custom DB, etc.)
    doCustomParsingAndValidating(context, function (err, parsedCallerId) {
      if (err) { return callback(err); }
      context.originalCallerId = context.callerId;
      context.callerId = parsedCallerId;
      callback();
    });
  };
  ```
Available context properties (as of now):

- `username` (string)
- `user` (object): the user object (properties include `id`)
- `accessToken` (string): as read in the `Authorization` header or `auth` parameter
- `callerId` (string): optional additional id passed after `accessToken` in auth after a separating space (auth format is thus `<access-token>[ <caller-id>]`)
- `access` (object): the access object (see API doc for structure)
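Since the auth step only receives the context and a callback, it can be exercised outside the server with a mock context carrying the properties above. The normalization logic below is purely hypothetical; only the context shape follows the documented contract:

```javascript
// Sketch: exercising a custom auth step with a mock method context.
// The trim/lowercase "validation" is hypothetical; the context shape follows the list above.
const customAuthStep = function (context, callback) {
  if (context.callerId == null) return callback(new Error('missing caller id'));
  context.originalCallerId = context.callerId;
  context.callerId = context.callerId.trim().toLowerCase();
  callback(); // no argument = success
};

const context = {
  username: 'alice',
  user: { id: 'u1' },
  accessToken: 'token-from-authorization-header',
  callerId: ' Device-42 ', // passed after the access token, space-separated
  access: { id: 'a1' }
};

customAuthStep(context, function (err) {
  if (err != null) throw err;
  console.log(context.callerId); // 'device-42'
  console.log(context.originalCallerId); // ' Device-42 '
});
```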
The default event types definitions (`components/business/src/types/event-types.default.json`) must be kept up-to-date.

Run `just update-event-types` to fetch them from the "reference" version published online. (The API server also tries to update these asynchronously at startup, but falls back to the default definitions in the meantime and if the online version is unavailable or corrupted.)
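The startup behavior just described (serve the bundled defaults immediately, replace them when the online fetch succeeds, keep them when it fails) can be sketched as follows. This is not the actual implementation: the fetch function is a stand-in, the update really runs asynchronously, and the type entries are simplified placeholders:

```javascript
// Sketch of the "defaults now, update later, fall back on failure" pattern.
const defaultTypes = { 'mass/kg': { type: 'number' } }; // stand-in for event-types.default.json
let eventTypes = defaultTypes; // defaults are served immediately at startup

// The real update is asynchronous; kept synchronous here for brevity.
function refreshEventTypes (fetchReferenceTypes) {
  try {
    const fresh = fetchReferenceTypes();
    if (fresh != null && typeof fresh === 'object') eventTypes = fresh; // replace on success
  } catch (err) {
    // online version unavailable or corrupted: keep the current definitions
  }
}

refreshEventTypes(() => { throw new Error('network down'); });
console.log(eventTypes === defaultTypes); // true: defaults kept on failure

refreshEventTypes(() => ({ 'mass/kg': { type: 'number' }, 'length/cm': { type: 'number' } }));
console.log(Object.keys(eventTypes).length); // 2: replaced on success
```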
If you're running into a lot of test failures, it may be because your Mongo database is empty, so try running `just test storage` first.
If you are getting multiple seemingly unrelated errors after switching branches, try `just clean`.
If you are trying to run `docker <some command>` and getting the following error:

    docker: Got permission denied while trying to connect to the Docker daemon socket at unix:///var/run/docker.sock: Post http://%2Fvar%2Frun%2Fdocker.sock/v1.26/containers/create: dial unix /var/run/docker.sock: connect: permission denied.
    See 'docker run --help'.

add the current user to the docker group: `sudo usermod -a -G docker $USER`. After running this command, log out of your account and log back in (reboot if needed). Then run `docker run hello-world` to check that it works.
Delete your local InfluxDB data files and restart InfluxDB:

    rm ~/.influxdb/data/*
    influxd

Or increase the number of allowed open files using `ulimit -n 1024` (or more if needed).
Note: the release GitHub workflow has been archived in `archives`; it needs to be rewritten to publish on Docker Hub.