You should have installed:

- Docker
- Docker Compose
- MongoDB Database Tools (specifically `mongoimport`, to add the dummy data to the database)
- Python 3
All of the commands should be executed from the `deploy` directory.

```bash
cd deploy
docker-compose up -d db
docker-compose up -d mongo-express
```
With `mongo-express` we can see the contents of the database at http://localhost:8081.
To load the database, we execute the following commands:

```bash
mongoimport --jsonArray --uri "mongodb://root:example@127.0.0.1:27017/beacon?authSource=admin" --file data/analyses*.json --collection analyses
mongoimport --jsonArray --uri "mongodb://root:example@127.0.0.1:27017/beacon?authSource=admin" --file data/biosamples*.json --collection biosamples
mongoimport --jsonArray --uri "mongodb://root:example@127.0.0.1:27017/beacon?authSource=admin" --file data/cohorts*.json --collection cohorts
mongoimport --jsonArray --uri "mongodb://root:example@127.0.0.1:27017/beacon?authSource=admin" --file data/datasets*.json --collection datasets
mongoimport --jsonArray --uri "mongodb://root:example@127.0.0.1:27017/beacon?authSource=admin" --file data/individuals*.json --collection individuals
mongoimport --jsonArray --uri "mongodb://root:example@127.0.0.1:27017/beacon?authSource=admin" --file data/runs*.json --collection runs
mongoimport --jsonArray --uri "mongodb://root:example@127.0.0.1:27017/beacon?authSource=admin" --file data/genomicVariations*.json --collection genomicVariations
```
This loads the JSON files inside the `data` folder into the MongoDB database. You can also use `make load` as a convenience alias.
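If you prefer to script the imports yourself, the loop below is a minimal Python sketch of the same commands; the collection names and file globs are taken verbatim from the `mongoimport` calls above.

```python
# Minimal sketch: one mongoimport run per matching file, per collection.
import glob
import subprocess

URI = "mongodb://root:example@127.0.0.1:27017/beacon?authSource=admin"
COLLECTIONS = ["analyses", "biosamples", "cohorts", "datasets",
               "individuals", "runs", "genomicVariations"]

for collection in COLLECTIONS:
    for path in glob.glob(f"data/{collection}*.json"):
        subprocess.run(
            ["mongoimport", "--jsonArray", "--uri", URI,
             "--file", path, "--collection", collection],
            check=True,  # abort on the first failed import
        )
```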
You can create the necessary indexes by running the following Python script:

```bash
# Install the dependencies
pip3 install pymongo
python3 reindex.py
```
This step might require a bit of tinkering, since some ontologies used in the dummy data will fail to load. We recommend skipping this step unless you know what you are doing.
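For reference, the indexing boils down to `create_index` calls with pymongo. The sketch below is illustrative only; the actual fields indexed by `reindex.py` may differ.

```python
# Illustrative sketch, not the contents of reindex.py: index an assumed
# "id" field on each collection loaded above.
from pymongo import MongoClient

client = MongoClient("mongodb://root:example@127.0.0.1:27017/beacon?authSource=admin")
db = client.beacon

for name in ["analyses", "biosamples", "cohorts", "datasets",
             "individuals", "runs", "genomicVariations"]:
    db[name].create_index("id")
```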
You can automatically fetch the ontologies that the database is using with the following script:
```bash
# Install the dependencies
pip3 install pymongo tqdm
python3 fetch_ontologies.py
```
If you have the ontologies loaded, you can automatically extract the filtering terms from the data in the database using the following utility script:
```bash
# Install the dependencies
pip3 install pymongo tqdm owlready2 progressbar
python3 extract_filtering_terms.py
```
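Conceptually, extracting filtering terms means collecting the distinct ontology terms that occur in the data. The snippet below sketches that idea with pymongo; the field path `biosampleStatus.id` is an assumed example, not necessarily one the script inspects.

```python
# Toy sketch of the idea behind extract_filtering_terms.py.
from pymongo import MongoClient

client = MongoClient("mongodb://root:example@127.0.0.1:27017/beacon?authSource=admin")
# "biosampleStatus.id" is a hypothetical field path used for illustration.
terms = client.beacon.biosamples.distinct("biosampleStatus.id")
print(terms)  # distinct ontology CURIEs found in the collection
```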
Once the database is set up, you can bring up the beacon with the following command:

```bash
docker-compose up -d beacon
```
Check the logs until the beacon is ready to be queried:
```bash
docker-compose logs -f beacon
```
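If you would rather block until the API answers than watch the logs, a small polling loop does the trick. It assumes the `/api/` root responds once the beacon is up, which is an inference from the endpoints used below.

```python
# Poll the beacon until it responds to HTTP requests.
import time
import requests

while True:
    try:
        if requests.get("http://localhost:5050/api/", timeout=2).ok:
            print("beacon is up")
            break
    except requests.RequestException:
        pass  # not up yet; retry
    time.sleep(2)
```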
You can query the beacon using GET or POST. Below you can find some usage examples. For simplicity (and readability), we will be using HTTPie.
Querying this endpoint should return the 13 variants of the beacon (paginated):
```bash
http GET http://localhost:5050/api/g_variants/
```
You can also add request parameters to the query, like so:
```bash
http GET 'http://localhost:5050/api/g_variants/?start=9411499,9411644&end=9411609'
```
This should return 3 genomic variants.
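If you prefer Python over HTTPie, the same GET query can be made with the `requests` library:

```python
import requests

resp = requests.get(
    "http://localhost:5050/api/g_variants/",
    params={"start": "9411499,9411644", "end": "9411609"},
)
print(resp.json())
```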
You can use POST to make the previous query, with a `request.json` file like this one:
```json
{
    "meta": {
        "apiVersion": "2.0"
    },
    "query": {
        "requestParameters": {
            "start": [ 9411499, 9411644 ],
            "end": [ 9411609 ]
        },
        "filters": [],
        "includeResultsetResponses": "HIT",
        "pagination": {
            "skip": 0,
            "limit": 10
        },
        "testMode": false,
        "requestedGranularity": "count"
    }
}
```
You can execute:
```bash
http POST http://localhost:5050/api/g_variants/ --json < request.json
```
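The equivalent request from Python, reusing the same `request.json` file:

```python
import json
import requests

with open("request.json") as f:
    body = json.load(f)

resp = requests.post("http://localhost:5050/api/g_variants/", json=body)
print(resp.json())
```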
But you can also use complex filters:
```json
{
    "meta": {
        "apiVersion": "2.0"
    },
    "query": {
        "filters": [
            {
                "id": "UBERON:0001256",
                "scope": "biosamples",
                "includeDescendantTerms": false
            }
        ],
        "includeResultsetResponses": "HIT",
        "pagination": {
            "skip": 0,
            "limit": 10
        },
        "testMode": false,
        "requestedGranularity": "count"
    }
}
```
You can execute:
```bash
http POST http://localhost:5050/api/biosamples/ --json < request.json
```
The beacon will then apply the ontology filter to narrow down the results.
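As a variation, flipping `includeDescendantTerms` to `true` should also match biosamples annotated with child terms of `UBERON:0001256` (assuming the relevant ontology has been loaded). From Python, the whole request looks like this:

```python
# Same ontology filter as above, but also matching descendant terms.
import requests

body = {
    "meta": {"apiVersion": "2.0"},
    "query": {
        "filters": [
            {
                "id": "UBERON:0001256",
                "scope": "biosamples",
                "includeDescendantTerms": True,  # flipped from the example above
            }
        ],
        "includeResultsetResponses": "HIT",
        "pagination": {"skip": 0, "limit": 10},
        "testMode": False,
        "requestedGranularity": "count",
    },
}
print(requests.post("http://localhost:5050/api/biosamples/", json=body).json())
```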