
Deployment fixes #3423

Merged · 15 commits · Jul 5, 2024
some updates
antgonza committed Jul 3, 2024
commit 5886d11a18a40eec241878472343f52499bee4ad
3 changes: 2 additions & 1 deletion CHANGELOG.md
```diff
@@ -1,6 +1,6 @@
 # Qiita changelog
 
-Version 2024.02
+Version 2024.07
 ---------------
 
 Deployed on July 15th, 2024
@@ -16,6 +16,7 @@ Deployed on July 15th, 2024
 * Added `current_human_filtering` to the prep-information and `human_reads_filter_method` to the artifact to keep track of the method used to human-reads-filter the raw artifact and know if it is up to date with what is expected via the best practices.
 * Added `reprocess_job_id` to the prep-information so we keep track if a preparation has been reprocessed with another job.
 * Other general fixes, like [#3385](https://github.com/qiita-spots/qiita/pull/3385), [#3397](https://github.com/qiita-spots/qiita/pull/3397), [#3399](https://github.com/qiita-spots/qiita/pull/3399), [#3400](https://github.com/qiita-spots/qiita/pull/3400), [#3409](https://github.com/qiita-spots/qiita/pull/3409), [#3410](https://github.com/qiita-spots/qiita/pull/3410).
+* On June 14th, 2024 we modified the SPP to use XXXX to filter human-reads from the per-sample-FASTQ loaded to Qiita.
 
 
 Version 2024.02
```
14 changes: 6 additions & 8 deletions qiita_db/util.py
```diff
@@ -49,13 +49,13 @@
 from bcrypt import hashpw, gensalt
 from functools import partial
 from os.path import join, basename, isdir, exists, getsize
-from os import walk, remove, listdir, rename, stat
+from os import walk, remove, listdir, rename, stat, makedirs
 from glob import glob
 from shutil import move, rmtree, copy as shutil_copy
 from openpyxl import load_workbook
 from tempfile import mkstemp
 from csv import writer as csv_writer
-from datetime import datetime
+from datetime import datetime, timedelta
 from time import time as now
 from itertools import chain
 from contextlib import contextmanager
@@ -64,18 +64,15 @@
 import hashlib
 from smtplib import SMTP, SMTP_SSL, SMTPException
 
-from os import makedirs
 from errno import EEXIST
 from qiita_core.exceptions import IncompetentQiitaDeveloperError
 from qiita_core.qiita_settings import qiita_config
 from subprocess import check_output
 import qiita_db as qdb
 
-
 from email.mime.multipart import MIMEMultipart
 from email.mime.text import MIMEText
 
-from datetime import timedelta
 import matplotlib.pyplot as plt
 import numpy as np
 import pandas as pd
```
```diff
@@ -2742,7 +2739,7 @@ def update_resource_allocation_table(weeks=1, test=None):
         slurm_external_id = sei
     if sd is not None:
         start_date = sd
-    dates = [start_date, start_date + timedelta(weeks)]
+    dates = [start_date, start_date + timedelta(weeks=weeks)]
 
     sql_command = """
         SELECT
```
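The `dates` change fixes a subtle bug: `timedelta`'s first positional parameter is `days`, so `timedelta(weeks)` advanced the window by `weeks` days rather than `weeks` weeks. Naming the parameter gives the intended interval. A small illustration (dates chosen arbitrarily for the example):

```python
from datetime import datetime, timedelta

start_date = datetime(2024, 7, 1)
weeks = 2

# Buggy: the positional argument is interpreted as *days*.
wrong_end = start_date + timedelta(weeks)        # 2024-07-03, only 2 days later
# Fixed: naming the parameter makes it *weeks*.
right_end = start_date + timedelta(weeks=weeks)  # 2024-07-15, 14 days later
```

With the bug, the query window covered only a fraction of the intended range, silently dropping jobs from the resource-allocation table.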
```diff
@@ -2780,8 +2777,9 @@
     sacct = [
         'sacct', '-p',
         '--format=JobID,ElapsedRaw,MaxRSS,Submit,Start,End,CPUTimeRAW,'
-        'ReqMem,AllocCPUs,AveVMSize', '--starttime', dates[0], '--endtime',
-        dates[1], '--user', 'qiita', '--state', 'CD']
+        'ReqMem,AllocCPUs,AveVMSize', '--starttime',
+        dates[0].strftime('%Y-%m-%d'), '--endtime',
+        dates[1].strftime('%Y-%m-%d'), '--user', 'qiita', '--state', 'CD']
 
     if test is not None:
         slurm_data = test
```
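The sacct hunk formats the two `datetime` objects before they enter the argument list: subprocess argument vectors must contain strings (or bytes), not `datetime` instances, and `YYYY-MM-DD` is a form sacct accepts for `--starttime`/`--endtime`. A standalone sketch of the fixed list construction (the dates here are illustrative, not from the PR):

```python
from datetime import datetime, timedelta

start_date = datetime(2024, 6, 24)
dates = [start_date, start_date + timedelta(weeks=1)]

# Every element must be a plain string before the list is handed to
# check_output; strftime renders the window bounds as YYYY-MM-DD.
sacct = [
    'sacct', '-p',
    '--format=JobID,ElapsedRaw,MaxRSS,Submit,Start,End,CPUTimeRAW,'
    'ReqMem,AllocCPUs,AveVMSize', '--starttime',
    dates[0].strftime('%Y-%m-%d'), '--endtime',
    dates[1].strftime('%Y-%m-%d'), '--user', 'qiita', '--state', 'CD']
```

Note the two adjacent string literals in `--format=...` are joined by implicit concatenation into a single argument, so the field list reaches sacct unbroken.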