diff --git a/.github/workflows/docs.yml b/.github/workflows/docs.yml
index b52284706..9b4900265 100644
--- a/.github/workflows/docs.yml
+++ b/.github/workflows/docs.yml
@@ -3,7 +3,7 @@ name: Build Docs
 on:
   push:
     branches:
-      - master
+      - ak-link
   pull_request:

 jobs:
@@ -14,9 +14,9 @@ jobs:
       matrix:
         python:
           - 3.8
-          #- 3.9
-          #- 3.10
-          #- 3.11
+          - 3.9
+          - "3.10"
+          - 3.11
     steps:
       - name: Check out repository
         uses: actions/checkout@v4
diff --git a/docs/source/cmdline/delete.rst b/docs/source/cmdline/delete.rst
index 9fceae2a1..0ffbee6bd 100644
--- a/docs/source/cmdline/delete.rst
+++ b/docs/source/cmdline/delete.rst
@@ -10,7 +10,7 @@ Delete datasets and assets from the server.
 Each argument must be either a file path pointing to an asset file or
 directory in a local datasets (in which case the corresponding assets are
 deleted on the remote server) or a :ref:`resource identifier ` pointing to a
-remote asset, directory, or entire Dandiset.
+remote asset, directory, or entire dataset.

 Options
 -------
diff --git a/docs/source/cmdline/download.rst b/docs/source/cmdline/download.rst
index b2b6855a3..bcdaf774d 100644
--- a/docs/source/cmdline/download.rst
+++ b/docs/source/cmdline/download.rst
@@ -5,7 +5,7 @@

     lincbrain [] download [] ...

-Download one or more Dandisets, assets, or folders of assets from DANDI.
+Download one or more datasets, assets, or folders of assets from LINC.

 See :ref:`resource_ids` for allowed URL formats.
diff --git a/docs/source/cmdline/index.rst b/docs/source/cmdline/index.rst
index c20099927..eeae5f58d 100644
--- a/docs/source/cmdline/index.rst
+++ b/docs/source/cmdline/index.rst
@@ -7,5 +7,5 @@ Command-Line Interface
 .. toctree::
    :glob:

-   lincbrain
+   dandi
    *
diff --git a/docs/source/cmdline/instances.rst b/docs/source/cmdline/instances.rst
index b79c635b9..dd96025b0 100644
--- a/docs/source/cmdline/instances.rst
+++ b/docs/source/cmdline/instances.rst
@@ -14,11 +14,11 @@ Example output:
 .. code:: yaml

     dandi:
-      api: https://api.dandiarchive.org/api
-      gui: https://gui.dandiarchive.org
+      api: https://api.lincbrain.org/api
+      gui: https://lincbrain.org
     dandi-api-local-docker-tests:
       api: http://localhost:8000/api
       gui: http://localhost:8085
     dandi-staging:
-      api: https://api-staging.dandiarchive.org/api
-      gui: https://gui-staging.dandiarchive.org
+      api: https://staging-api.lincbrain.org/api
+      gui: https://staging.lincbrain.org
diff --git a/docs/source/cmdline/ls.rst b/docs/source/cmdline/ls.rst
index aeddab34f..78f29ffcf 100644
--- a/docs/source/cmdline/ls.rst
+++ b/docs/source/cmdline/ls.rst
@@ -5,7 +5,7 @@

     lincbrain [] ls [] [ ...]

-List :file:`*.nwb` files' and Dandisets' metadata.
+List :file:`*.nwb` files' and datasets' metadata.

 The arguments may be either :ref:`resource identifiers ` or paths to local
 files/directories.
@@ -42,7 +42,7 @@ Options

 .. option:: -r, --recursive

-    Recurse into Dandisets/directories. Only :file:`*.nwb` files will be
+    Recurse into datasets/directories. Only :file:`*.nwb` files will be
     considered.

 .. option:: --schema
diff --git a/docs/source/cmdline/move.rst b/docs/source/cmdline/move.rst
index dc3e40808..4dff0ad55 100644
--- a/docs/source/cmdline/move.rst
+++ b/docs/source/cmdline/move.rst
@@ -24,10 +24,10 @@ single source path is renamed to the given destination path.

 Alternatively, if the ``--regex`` option is given, then there must be exactly
 two arguments on the command line: a `Python regular expression`_ and a
-replacement string, possibly containing regex backreferences. :program:`dandi
+replacement string, possibly containing regex backreferences. :program:`lincbrain
 move` will then apply the regular expression to the path of every asset in the
 current directory recursively (using paths relative to the current directory,
-if in a subdirectory of a Dandiset); if a path matches, the matching portion is
+if in a subdirectory of a dataset); if a path matches, the matching portion is
 replaced with the replacement string, after expanding any backreferences.

 .. _Python regular expression: https://docs.python.org/3/library/re.html
@@ -76,7 +76,7 @@ Options

     Whether to operate on the local dataset in the current directory, a remote
     dataset (either one specified by the ``--dandiset`` option or else the one
-    corresponding to the local Dandiset), or both at once. If ``auto`` (the
+    corresponding to the local dataset), or both at once. If ``auto`` (the
     default) is given, it is treated the same as ``remote`` if a ``--dandiset``
     option is given and as ``both`` otherwise.

@@ -106,7 +106,7 @@ Examples
   To rename the file only in the local or remote instance, insert ``--work-on
   local`` or ``--work-on remote`` after ``move``.

-- When not working in a local clone of a Dandiset, a file can be renamed in a
+- When not working in a local clone of a dataset, a file can be renamed in a
   remote dataset on a server by providing a resource identifier for the dataset
   to the ``--dandiset`` option. For example, in order to operate on dataset
   123456 on the main ``lincbrain`` instance, use::
diff --git a/docs/source/cmdline/organize.rst b/docs/source/cmdline/organize.rst
index 06a3d000b..3f1e0d934 100644
--- a/docs/source/cmdline/organize.rst
+++ b/docs/source/cmdline/organize.rst
@@ -1,4 +1,4 @@
-.. _dandi_organize:
+.. _lincbrain_organize:

 :program:`lincbrain organize`
 =============================
@@ -35,17 +35,17 @@ In addition, an "obj" key with a value corresponding to the crc32 checksum of
 "object_id" is added if the aforementioned keys and the list of modalities are
 not sufficient to disambiguate different files.

-You can visit https://dandiarchive.org for a growing collection of
-(re)organized dandisets.
+You can visit https://lincbrain.org for a growing collection of
+(re)organized datasets.

 Options
 -------

 .. option:: -d, --dandiset-path

-    The root directory of the Dandiset to organize files under. If not
-    specified, the Dandiset under the current directory is assumed. For
-    'simulate' mode, the target Dandiset/directory must not exist.
+    The root directory of the dataset to organize files under. If not
+    specified, the dataset under the current directory is assumed. For
+    'simulate' mode, the target dataset/directory must not exist.

 .. option:: -f, --files-mode [dry|simulate|copy|move|hardlink|symlink|auto]
diff --git a/docs/source/cmdline/service-scripts.rst b/docs/source/cmdline/service-scripts.rst
index 0c9f113ad..c043f0aa0 100644
--- a/docs/source/cmdline/service-scripts.rst
+++ b/docs/source/cmdline/service-scripts.rst
@@ -17,8 +17,8 @@ utility operations.

 Recompute & update the metadata for NWB assets on a remote server.

-```` must point to a draft Dandiset or one or more assets inside a draft
-Dandiset. See :ref:`resource_ids` for allowed URL formats.
+```` must point to a draft dataset or one or more assets inside a draft
+dataset. See :ref:`resource_ids` for allowed URL formats.

 Running this command requires the fsspec_ library to be installed with the
 ``http`` extra (e.g., ``pip install "fsspec[http]"``).
@@ -38,7 +38,7 @@ Options

     - ``newer-schema-version`` (default) — when the ``schemaVersion`` in the
       asset's current metadata is missing or older than the schema version
-      currently in use by DANDI
+      currently in use by LINC

     - ``always`` — always

@@ -58,12 +58,12 @@ Options

 .. option:: -d, --dandiset

-    Specify the ID of the Dandiset to operate on. This option is required.
+    Specify the ID of the dataset to operate on. This option is required.

 .. option:: -i, --dandi-instance

-    DANDI instance (either a base URL or a known instance name) where the
-    Dandiset is located [default: ``dandi``]
+    LINC Data Platform instance (either a base URL or a known instance name) where the
+    dataset is located [default: ``lincbrain``]

 .. option:: -e, --existing [ask|overwrite|skip]
diff --git a/docs/source/cmdline/shell-completion.rst b/docs/source/cmdline/shell-completion.rst
index 2d75ea9ed..ed0c49575 100644
--- a/docs/source/cmdline/shell-completion.rst
+++ b/docs/source/cmdline/shell-completion.rst
@@ -3,7 +3,7 @@

 ::

-    dandi [] shell-completion []
+    lincbrain [] shell-completion []

 Emit a shell script for enabling command
 completion.

@@ -12,8 +12,8 @@ completion.

 Example::

-    $ source <(dandi shell-completion)
-    $ dandi --
+    $ source <(lincbrain shell-completion)
+    $ lincbrain --

 Options
 -------
diff --git a/docs/source/cmdline/upload.rst b/docs/source/cmdline/upload.rst
index 183615946..897335a59 100644
--- a/docs/source/cmdline/upload.rst
+++ b/docs/source/cmdline/upload.rst
@@ -13,7 +13,7 @@ paths (or the current directory, if no paths are specified) or a parent
 directory thereof.

 Local datasets should pass validation. For that, the assets should first be
-organized using the :ref:`dandi_organize` command.
+organized using the :ref:`lincbrain_organize` command.

 By default, all :file:`*.nwb`, :file:`*.zarr`, and :file:`*.ngff` assets in the
 dataset (ignoring directories starting with a period) will be considered for
@@ -78,4 +78,4 @@ set to a nonempty value.

 .. option:: --upload-dandiset-metadata

-    Update Dandiset metadata based on the local :file:`dandiset.yaml` file
+    Update dataset metadata based on the local :file:`dandiset.yaml` file
diff --git a/docs/source/conf.py b/docs/source/conf.py
index 11f9ece7e..9b924401d 100644
--- a/docs/source/conf.py
+++ b/docs/source/conf.py
@@ -20,8 +20,8 @@ import lincbrain

 project = "lincbrain"
-copyright = "2021-2023, DANDI Team"
-author = "DANDI Team"
+copyright = "2021-2024, LINC Team"
+author = "LINC Team"

 # The full version, including alpha/beta/rc tags
 version = lincbrain.__version__
diff --git a/docs/source/index.rst b/docs/source/index.rst
index db8eb36ed..4d193e250 100644
--- a/docs/source/index.rst
+++ b/docs/source/index.rst
@@ -1,9 +1,13 @@
-Welcome to the dandi documentation
-==================================
+Welcome to the lincbrain documentation
+======================================

-The `dandi `_ library provides both a
-command line interface (CLI) and a Python API for interacting with `DANDI
-Archive `_.
+The `lincbrain `_ library provides both a
+command line interface (CLI) and a Python API for interacting with `LINC Data
+Platform `_.
+
+The `lincbrain` CLI tool is a fork of the `dandi` CLI tool (https://github.com/dandi/dandi-cli);
+thus, a handful of references to `dandi` and `Dandisets` remain throughout these docs.
+Nevertheless, these docs and the instructions they contain should be regarded as accurate.

 .. toctree::
    :maxdepth: 2
diff --git a/docs/source/modref/consts.rst b/docs/source/modref/consts.rst
index a83ece87a..a7f7e1ac7 100644
--- a/docs/source/modref/consts.rst
+++ b/docs/source/modref/consts.rst
@@ -1,5 +1,5 @@
-``dandi.consts``
-================
+``lincbrain.consts``
+====================

 .. automodule:: lincbrain.consts
diff --git a/docs/source/modref/dandiapi.rst b/docs/source/modref/dandiapi.rst
index f0b81705f..70d133daf 100644
--- a/docs/source/modref/dandiapi.rst
+++ b/docs/source/modref/dandiapi.rst
@@ -1,15 +1,15 @@
 .. module:: lincbrain.dandiapi

-``dandi.dandiapi``
-==================
+``lincbrain.dandiapi``
+======================

 This module provides functionality for interacting with a LINC Data Platform
 server via the REST API.

 Interaction begins with the creation of a `DandiAPIClient` instance, which can
 be used to retrieve `RemoteDandiset` objects (representing
-Dandisets on the server) and `BaseRemoteAsset` objects (representing assets
+datasets on the server) and `BaseRemoteAsset` objects (representing assets
 without any data associating them with their Dandisets). `RemoteDandiset`
 objects can, in turn, be used to retrieve `RemoteAsset` objects (representing
-assets associated with Dandisets). Aside from `DandiAPIClient`, none of these
+assets associated with datasets). Aside from `DandiAPIClient`, none of these
 classes should be instantiated directly by the user.

 All operations that merely fetch data from the server can be done without
@@ -21,7 +21,7 @@ method.

 Example code for printing the metadata of all assets with "two-photon" in
 their ``metadata.measurementTechnique[].name`` for the latest published version of
-every Dandiset:
+every dataset:

 .. literalinclude:: /examples/dandiapi-example.py
    :language: python
@@ -33,7 +33,7 @@ be passed to functions of pynwb etc.

 .. literalinclude:: /examples/dandiapi-as_readable.py
    :language: python

-You can see more usages of DANDI API to assist with data streaming at
+You can see more usages of LINC API to assist with data streaming at
 `PyNWB: Streaming NWB files `_.

 Client
 ------

 .. autoclass:: DandiAPIClient
    :show-inheritance:

-Dandisets
----------
+Datasets
+--------

 .. autoclass:: RemoteDandiset()
diff --git a/docs/source/modref/dandiarchive.rst b/docs/source/modref/dandiarchive.rst
index 99359ed91..1862e3803 100644
--- a/docs/source/modref/dandiarchive.rst
+++ b/docs/source/modref/dandiarchive.rst
@@ -1,5 +1,5 @@
-``dandi.dandiarchive``
-======================
+``lincbrain.dandiarchive``
+==========================

 .. automodule:: lincbrain.dandiarchive
    :member-order: bysource
diff --git a/docs/source/modref/files.rst b/docs/source/modref/files.rst
index c2d892632..609343bbe 100644
--- a/docs/source/modref/files.rst
+++ b/docs/source/modref/files.rst
@@ -1,5 +1,5 @@
-``dandi.files``
-===============
+``lincbrain.files``
+===================

 .. automodule:: lincbrain.files
    :show-inheritance:
diff --git a/docs/source/modref/index.rst b/docs/source/modref/index.rst
index 1dd4b19c7..2965f9e90 100644
--- a/docs/source/modref/index.rst
+++ b/docs/source/modref/index.rst
@@ -27,7 +27,7 @@ Such interfaces mirror :ref:`Command-Line Interfaces `.
 Mid-level user interfaces
 ==========================

-These interfaces provide object-oriented interfaces to manipulate Dandisets and assets in the
+These interfaces provide object-oriented interfaces to manipulate datasets and assets in the
 archive.

 .. toctree::
diff --git a/docs/source/modref/misctypes.rst b/docs/source/modref/misctypes.rst
index 366491796..c9b0018d3 100644
--- a/docs/source/modref/misctypes.rst
+++ b/docs/source/modref/misctypes.rst
@@ -1,4 +1,4 @@
-``dandi.misctypes``
-===================
+``lincbrain.misctypes``
+=======================

 .. automodule:: lincbrain.misctypes
diff --git a/docs/source/modref/support.digests.rst b/docs/source/modref/support.digests.rst
index 98291ba6b..98d1406a5 100644
--- a/docs/source/modref/support.digests.rst
+++ b/docs/source/modref/support.digests.rst
@@ -1,4 +1,4 @@
-``dandi.support.digests``
-=========================
+``lincbrain.support.digests``
+=============================

 .. automodule:: lincbrain.support.digests
diff --git a/docs/source/modref/utils.rst b/docs/source/modref/utils.rst
index 11fa65385..bfff44d2c 100644
--- a/docs/source/modref/utils.rst
+++ b/docs/source/modref/utils.rst
@@ -1,4 +1,4 @@
-``dandi.utils``
-===============
+``lincbrain.utils``
+===================

 .. automodule:: lincbrain.utils
diff --git a/docs/source/ref/urls.rst b/docs/source/ref/urls.rst
index a1cc1a058..4d4e9b423 100644
--- a/docs/source/ref/urls.rst
+++ b/docs/source/ref/urls.rst
@@ -1,16 +1,16 @@
-.. currentmodule:: dandi.dandiarchive
+.. currentmodule:: lincbrain.dandiarchive

 .. _resource_ids:

 Resource Identifiers
 ====================

-``dandi`` commands and Python functions accept URLs and URL-like identifiers in
-the following formats for identifying Dandisets, assets, and asset collections.
+``lincbrain`` commands and Python functions accept URLs and URL-like identifiers in
+the following formats for identifying datasets, assets, and asset collections.

 Text in [brackets] is optional. A ``server`` field is a base API or GUI URL
-for a DANDI Archive instance. If an optional ``version`` field is omitted from
-a URL, the given Dandiset's most recent published version will be used if it
+for a LINC Data Platform instance. If an optional ``version`` field is omitted from
+a URL, the given dataset's most recent published version will be used if it
 has one, and its draft version will be used otherwise.

 - :samp:`https://identifiers.org/DANDI:{dandiset-id}[/{version}]`
@@ -18,11 +18,11 @@ has one, and its draft version will be used otherwise.
   to one of the other URL formats

 - :samp:`DANDI:{dandiset-id}[/{version}]` (case insensitive)
-  — Refers to a Dandiset on the main archive instance named "dandi".
+  — Refers to a dataset on the main archive instance named "lincbrain".
   `parse_dandi_url()` converts this format to a `DandisetURL`.

-- Any ``https://gui.dandiarchive.org/`` or
-  ``https://*dandiarchive-org.netlify.app/`` URL which redirects to
+- Any ``https://staging.lincbrain.org/`` or
+  ``https://staging--lincbrain-org.netlify.app/`` URL which redirects to
   one of the other URL formats

 - :samp:`https://{server}[/api]/[#/]dandiset/{dandiset-id}[/{version}][/files]`
@@ -52,22 +52,22 @@ has one, and its draft version will be used otherwise.
   format to an `AssetIDURL`.

 - :samp:`https://{server}[/api]/dandisets/{dandiset-id}/versions/{version}/assets/?path={path}`
-  — Refers to all assets in the given Dandiset whose paths begin with the
+  — Refers to all assets in the given dataset whose paths begin with the
   prefix ``path``. `parse_dandi_url()` converts this format to an
   `AssetPathPrefixURL`.

 - :samp:`https://{server}[/api]/dandisets/{dandiset-id}/versions/{version}/assets/?glob={path}`
-  — Refers to all assets in the given Dandiset whose paths match the glob
+  — Refers to all assets in the given dataset whose paths match the glob
   pattern ``path``. `parse_dandi_url()` converts this format to an
   `AssetGlobURL`.

 - :samp:`dandi://{instance-name}/{dandiset-id}[@{version}]` (where
-  ``instance-name`` is the name of a registered Dandi Archive instance) —
-  Refers to a Dandiset. `parse_dandi_url()` converts this format to a
+  ``instance-name`` is the name of a registered LINC Data Platform instance) —
+  Refers to a dataset. `parse_dandi_url()` converts this format to a
   `DandisetURL`.

 - :samp:`dandi://{instance-name}/{dandiset-id}[@{version}]/{path}` (where
-  ``instance-name`` is the name of a registered Dandi Archive instance)
+  ``instance-name`` is the name of a registered LINC Data Platform instance)

   - If the ``glob``/``--path-type glob`` option is in effect, the URL refers
     to a collection of assets whose paths match the glob pattern ``path``, and