Commit ac83025 (1 parent: 401221c)
Showing 8 changed files with 823 additions and 283 deletions.
@@ -0,0 +1,33 @@
#!/usr/bin/env python

from time import sleep
import requests
import json
import sys
import os

if __name__ == '__main__':

    # uniprot url for node attributes
    url_base = 'https://rest.uniprot.org/uniprotkb/search?query=xref:string-'
    pid_file = sys.argv[1]
    results = {}

    with open(pid_file) as f:
        for pid in f:

            # strip the trailing newline so it does not leak into the query URL or the result keys
            pid = pid.strip()
            if not pid:
                continue

            # throttle requests to the UniProt REST API
            sleep(1)
            url = url_base + pid + '&format=json'
            result = requests.get(url).text
            result = json.loads(result)

            # keep the first matching entry, or an empty dict if there was no match
            if len(result['results']) == 0:
                result = {}
            else:
                result = result['results'][0]

            results[pid] = result

    # save to file in results directory
    with open(os.path.join('results', pid_file + '.json'), 'w') as f:
        json.dump(results, f)
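A minimal sketch of reading the saved results back after the script has run; 'protein_ids.txt' is a placeholder for whatever identifier file was actually passed as sys.argv[1]:

    import json
    import os

    # hypothetical input file name; the script was called with one STRING protein id per line
    pid_file = 'protein_ids.txt'

    with open(os.path.join('results', pid_file + '.json')) as f:
        results = json.load(f)

    # keys are the protein ids; values are the first matching UniProtKB entry, or {} if there was no match
    for pid, entry in results.items():
        print(pid, 'found' if entry else 'no match')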
@@ -0,0 +1,17 @@
Workflow:
Get node properties from UniProt
Get node values during infection from ViPR
Make initial networks in Python → NetworkX objects
Trim networks with SPRAS
Network measures → Store values on nodes, edges, whole graph
Network perturbations → Store values on nodes and edges

Per job:
One network → SPRAS → Output network

Per job:
One network → One network measure → Dictionary of resulting values
How long does each measure take? Some take a long time, so larger networks may need another solution.

Per job:
One network → Measure complexity → Remove one graph object → Measure complexity → Resulting change (value)
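A minimal sketch of that last per-job pattern, assuming the networks are NetworkX graphs as noted above; the function name is illustrative, node removal stands in for "remove one graph object", and graph density stands in for whichever complexity measure is actually used:

    import networkx as nx

    def perturbation_deltas(graph, measure=nx.density):
        # remove each node in turn and record how the measure changes
        baseline = measure(graph)
        deltas = {}
        for node in list(graph.nodes):
            trimmed = graph.copy()
            trimmed.remove_node(node)
            # store the resulting change keyed by the removed node
            deltas[node] = measure(trimmed) - baseline
        return deltas

    # toy example: per-node change in density on a small built-in graph
    print(perturbation_deltas(nx.karate_club_graph()))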
@@ -0,0 +1,29 @@
import urllib.request
import os
import gzip
import shutil


class FileDownloader:

    def single_file(self, source):

        if source == 'string_edges':

            # check to see if file already exists
            # TODO: check file version too
            file = os.path.join('data', 'input_files', 'string_edges.txt.gz')
            if os.path.exists(file):
                return None

            url = 'http://viruses.string-db.org/download/protein.links.full.v10.5.txt.gz'
            print('Downloading file from: ' + url)

            # download file
            urllib.request.urlretrieve(url, os.path.join('data', 'input_files', 'string_edges.txt.gz'))

            # unzip
            with gzip.open(os.path.join('data', 'input_files', 'string_edges.txt.gz'), 'rb') as f_in:
                with open(os.path.join('data', 'input_files', 'string_edges.txt'), 'wb') as f_out:
                    shutil.copyfileobj(f_in, f_out)

            print('Done!')
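A minimal usage sketch, assuming the class above is importable (the module name is not visible in this diff view, so file_downloader below is a placeholder). Note that single_file expects the data/input_files directory to already exist, since urllib.request.urlretrieve will not create it:

    from file_downloader import FileDownloader  # hypothetical module name

    downloader = FileDownloader()
    # downloads and unzips the STRING edge file, or returns None if the .gz file is already present
    downloader.single_file('string_edges')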