
tested parameter signature overwriting docstrings #723

Merged: 12 commits, Apr 11, 2022
2 changes: 1 addition & 1 deletion .pre-commit-config.yaml
@@ -9,6 +9,6 @@ repos:
- id: check-yaml
- id: check-added-large-files
- repo: https://github.com/psf/black
rev: 20.8b1
rev: 22.3.0
hooks:
- id: black
8 changes: 7 additions & 1 deletion CONTRIBUTING.md
@@ -140,10 +140,16 @@ We also recommend using tools `pre-commit` that automates the checking process b
### Docstring

All public methods require a docstring and type annotations. It is recommended to add docstrings for all functions. The docstrings should follow the [`Comments and Docstrings` section](https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings) of the Google Python Style Guide. We will include a pylint plugin called [docparams](https://github.com/PyCQA/pylint/blob/main/pylint/extensions/docparams.rst) to validate the parameters of docstrings:
* parameters and their types
* parameters ~~and their types~~
* types are only required in function signatures, and Sphinx will build parameter type hyperlinks based on them (see the example docstring below).
* return value and its type
* exceptions raised

You should take special care of the indentation in your documentation. Make sure the indents are consistent and follow the Google Style Guide. All sections other than the heading should maintain a hanging indent of two or four spaces. Refer to the examples [here](https://google.github.io/styleguide/pyguide.html#383-functions-and-methods) for what is expected and what the requirements are for different sections like `args`, `lists`, `returns`, etc. Invalid indentation might trigger errors in `sphinx-build` and cause confusing rendering of the documentation. You can run `sphinx-build` locally to check that the generated docs look reasonable.
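To make both points concrete, here is a minimal sketch of a docstring following these conventions (the function and its parameters are hypothetical and not part of this PR): types live only in the signature, and each section body keeps a consistent hanging indent.

```python
def count_tokens(text: str, lowercase: bool = True) -> int:
    r"""Count whitespace-separated tokens in a piece of text.

    Args:
        text: The text to tokenize. No type is repeated here; the
            signature already declares it, and sphinx builds the type
            hyperlink from that annotation.
        lowercase: Whether to lowercase the text before counting.
            Default is True.

    Returns:
        The number of tokens found in the text.

    Raises:
        ValueError: If the text is empty.
    """
    if not text:
        raise ValueError("text must not be empty")
    tokens = text.lower().split() if lowercase else text.split()
    return len(tokens)
```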

Another aspect to note is the format of links and cross-references to Python objects. Make sure to follow the [sphinx cross-referencing syntax](https://www.sphinx-doc.org/en/master/usage/restructuredtext/roles.html#xref-syntax). ~~The references will be checked by [sphinx-build nit-picky mode](https://www.sphinx-doc.org/en/master/man/sphinx-build.html#cmdoption-sphinx-build-n) which raises warnings for all the missing and unresolvable links.~~
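For example, object references inside docstrings use Sphinx roles such as `:class:` and `:meth:` rather than plain backticks. The method below is hypothetical, but the referenced classes appear in the hunks later in this diff:

```python
def get_pack(self) -> "BasePack":
    r"""Return the pack that owns this entry.

    Returns:
        The :class:`~forte.data.base_pack.BasePack` containing this
        :class:`~forte.data.ontology.core.Entry`. The leading ``~`` makes
        sphinx render only the final component of the dotted path.
    """
    ...
```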
3 changes: 2 additions & 1 deletion docs/conf.py
@@ -43,6 +43,7 @@
"myst_parser",
"sphinxcontrib.spelling",
"nbsphinx",
"sphinx_autodoc_typehints",
]

# Add any paths that contain templates here, relative to this directory.
@@ -361,7 +362,7 @@
"""

autodoc_member_order = "bysource"
autodoc_typehints = "none"
autodoc_typehints = "signature"

napoleon_numpy_docstring = False

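Read together, these two hunks suggest the docs build now delegates type rendering to the extension. A sketch of the relevant `docs/conf.py` fragment after this change (other settings elided; this is not the full file) might look like:

```python
# docs/conf.py -- sketch of only the settings touched by this PR
extensions = [
    "myst_parser",
    "sphinxcontrib.spelling",
    "nbsphinx",
    "sphinx_autodoc_typehints",  # pulls parameter types from annotations
]

autodoc_member_order = "bysource"
# "signature" keeps type hints visible in the rendered signature
# (the previous "none" suppressed them), so docstrings need not repeat types.
autodoc_typehints = "signature"
```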
2 changes: 2 additions & 0 deletions docs/requirements.txt
@@ -44,3 +44,5 @@ nltk==3.6.6

nbsphinx==0.8.8
jinja2<=3.0.3

sphinx_autodoc_typehints
5 changes: 5 additions & 0 deletions docs/spelling_wordlist.txt
@@ -155,3 +155,8 @@ granularities
doesn
structuralizing
pseudocode
EntryType
LinkType
GroupType
PathLike
ElementType
8 changes: 4 additions & 4 deletions forte/common/resources.py
@@ -44,7 +44,7 @@ def save(
r"""Save the resources specified by :attr:`keys` in binary format.

Args:
keys (Optional) list or dict:
keys: list or dict:

- If :attr:`keys` is a list, the objects corresponding to those keys
are saved
@@ -53,7 +53,7 @@ def save(
the object corresponding to that key
- If :attr:`keys` is None, all objects in this resource will be
saved.
output_dir (Optional): str
output_dir:
A directory specifying the location to save the resources.
"""

@@ -104,14 +104,14 @@ def load(
r"""Load the resources specified by :attr:`keys`.

Args:
keys (Union[List[str], DeserializeDict]) list or dict:
keys: list or dict:

- If :attr:`keys` is a list, the objects corresponding to those keys
are loaded
- If :attr:`keys` is a dict mapping from a key to a deserialize
function, then the deserialize function will be used to load
the object corresponding to that key
path (Optional): str
path: str
A directory specifying the location to load the resources from.
"""

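As a usage sketch only — how a `Resources` object is constructed and populated is not shown in this diff, so the default construction and the `"vocab"` key below are assumptions — the documented `save`/`load` pair would be driven roughly like this:

```python
from forte.common.resources import Resources

resources = Resources()  # assumed default construction; not shown in this PR

# keys=None saves every object held by the resource into the directory.
resources.save(keys=None, output_dir="cached_resources")

# A list of keys loads back only those objects ("vocab" is a made-up key).
resources.load(keys=["vocab"], path="cached_resources")
```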
20 changes: 10 additions & 10 deletions forte/data/base_extractor.py
@@ -175,7 +175,7 @@ def vocab(self, vocab: Vocabulary):
externally.

Args:
vocab (Vocabulary): The vocabulary to be assigned.
vocab: The vocabulary to be assigned.

Returns:

@@ -247,8 +247,8 @@ def update_vocab(
element into `self._vocab`.

Args:
pack (DataPack): The input data pack.
context (Annotation): The context is an Annotation entry where
pack: The input data pack.
context: The context is an Annotation entry where
features will be extracted within its range. If None, then the
whole data pack will be used as the context. Default is None.
"""
@@ -262,8 +262,8 @@ def extract(
datapack.

Args:
pack (DataPack): The input data pack that contains the features.
context (Annotation): The context is an Annotation entry where
pack: The input data pack that contains the features.
context: The context is an Annotation entry where
features will be extracted within its range. If None, then the
whole data pack will be used as the context. Default is None.

@@ -282,8 +282,8 @@ def pre_evaluation_action(
the entry. By default, this function will not do anything.

Args:
pack (DataPack): The datapack that contains the current instance.
context (Optional[Annotation]): The context is an Annotation entry
pack: The datapack that contains the current instance.
context: The context is an Annotation entry
where data are extracted within its range. If None, then the
whole data pack will be used as the context. Default is None.
"""
@@ -314,11 +314,11 @@ def add_to_pack(
3. Add the element to corresponding entry based on the need.

Args:
pack (DataPack): The datapack to add predictions back.
predictions (Any): This is the output of the model, the format of
pack: The datapack to add predictions back.
predictions: This is the output of the model, the format of
which will be determined by the predict function defined in the
Predictor.
context (Optional[Annotation]): The context is an Annotation
context: The context is an Annotation
entry where predictions will be added to. This has the same
meaning with `context` as in
:meth:`~forte.data.base_extractor.BaseExtractor.extract`.
22 changes: 11 additions & 11 deletions forte/data/base_pack.py
@@ -90,7 +90,7 @@ class BasePack(EntryContainer[EntryType, LinkType, GroupType]):
:class:`~forte.data.multi_pack.MultiPack`.

Args:
pack_name (str, Optional): a string name of the pack.
pack_name: a string name of the pack.

"""

@@ -233,9 +233,9 @@ def add_entry(
:class:`~forte.data.base_pack.BasePack` object. Allow duplicate entries in a pack.

Args:
entry (Entry): An :class:`~forte.data.ontology.core.Entry`
entry: An :class:`~forte.data.ontology.core.Entry`
object to be added to the pack.
component_name (str): A name to record that the entry is created by
component_name: A name to record that the entry is created by
this component.

Returns:
@@ -252,7 +252,7 @@ def _add_entry(self, entry: Entry) -> EntryType:
:class:`~forte.data.base_pack.BasePack` object. Allow duplicate entries in a pack.

Args:
entry (Entry): An :class:`~forte.data.ontology.core.Entry`
entry: An :class:`~forte.data.ontology.core.Entry`
object to be added to the pack.

Returns:
@@ -266,7 +266,7 @@ def add_all_remaining_entries(self, component: Optional[str] = None):
pack manually.

Args:
component (str): Overwrite the component record with this.
component: Overwrite the component record with this.

Returns:
None
@@ -404,8 +404,8 @@ def on_entry_creation(
its `__init__` function is called.

Args:
entry (Entry): The entry to be added.
component_name (str): A name to record that the entry is created by
entry: The entry to be added.
component_name: A name to record that the entry is created by
this component.

Returns:
@@ -532,7 +532,7 @@ def get_entries_from(self, component: str) -> Set[EntryType]:
Look up all entries from the `component` as an unordered set

Args:
component (str): The component (creator) to get the entries. It is
component: The component (creator) to get the entries. It is
normally the full qualified name of the creator class, but it
may also be customized based on the implementation.

@@ -549,7 +549,7 @@ def get_ids_from(self, components: List[str]) -> Set[int]:
each creator iteratively and combine the result.

Args:
components (List[str]): The list of components to find.
components: The list of components to find.

Returns:
The list of entry ids that are created from these components.
@@ -586,8 +586,8 @@ def get_entries_of(
use :meth:`forte.data.base_pack.BasePack.get`.

Args:
entry_type (Type[EntryType]): The type of the entry you are looking for.
exclude_sub_types (bool): Whether to ignore the inherited sub type
entry_type: The type of the entry you are looking for.
exclude_sub_types: Whether to ignore the inherited sub type
of the provided `entry_type`. Default is True.

Returns:
6 changes: 3 additions & 3 deletions forte/data/base_reader.py
@@ -44,13 +44,13 @@ class BaseReader(PipelineComponent[PackType], ABC):
r"""The basic data reader class. To be inherited by all data readers.

Args:
from_cache (bool, optional): Decide whether to read from cache
from_cache: Decide whether to read from cache
if cache file exists. By default (``False``), the reader will
only read from the original file and use the cache file path
for caching, it will not read from the ``cache_directory``.
If ``True``, the reader will try to read a datapack from the
caching file.
cache_directory (str, optional): The base directory to place the
cache_directory: The base directory to place the
path of the caching files. Each collection is contained in one
cached file, under this directory. The cached location for each
collection is computed by
@@ -61,7 +61,7 @@ class BaseReader(PipelineComponent[PackType], ABC):
A collection is the data returned by
:meth:`~forte.data.base_reader.BaseReader._collect`.

append_to_cache (bool, optional): Decide whether to append write
append_to_cache: Decide whether to append write
if cache file already exists. By default (``False``), we
will overwrite the existing caching file. If ``True``, we will
append the datapack to the end of the caching file.
46 changes: 23 additions & 23 deletions forte/data/base_store.py
@@ -38,9 +38,9 @@ def add_annotation_raw(self, type_name: str, begin: int, end: int) -> int:
returns the ``tid`` for the inserted entry.

Args:
type_name (str): The index of Annotation sortedlist in ``self.__elements``.
begin (int): Begin index of the entry.
end (int): End index of the entry.
type_name: The index of Annotation sortedlist in ``self.__elements``.
begin: Begin index of the entry.
end: End index of the entry.
Returns:
``tid`` of the entry.
"""
@@ -56,9 +56,9 @@ def add_link_raw(
index of the entry in the ``type_name`` list.

Args:
type_name (str): The index of Link list in ``self.__elements``.
parent_tid (int): ``tid`` of the parent entry.
child_tid (int): ``tid`` of the child entry.
type_name: The index of Link list in ``self.__elements``.
parent_tid: ``tid`` of the parent entry.
child_tid: ``tid`` of the child entry.

Returns:
``tid`` of the entry and its index in the ``type_name`` list.
@@ -76,8 +76,8 @@ def add_group_raw(
index of the entry in the ``type_name`` list.

Args:
type_name (str): The index of Group list in ``self.__elements``.
member_type (str): Fully qualified name of its members.
type_name: The index of Group list in ``self.__elements``.
member_type: Fully qualified name of its members.

Returns:
``tid`` of the entry and its index in the ``type_name`` list.
@@ -91,9 +91,9 @@ def set_attribute(self, tid: int, attr_name: str, attr_value: Any):
``attr_name`` with ``attr_value``.

Args:
tid (int): Unique Id of the entry.
attr_name (str): Name of the attribute.
attr_value (any): Value of the attribute.
tid: Unique Id of the entry.
attr_name: Name of the attribute.
attr_value: Value of the attribute.
"""
raise NotImplementedError

@@ -104,8 +104,8 @@ def set_attr(self, tid: int, attr_id: int, attr_value: Any):
Called by `set_attribute()`.

Args:
tid (int): Unique id of the entry.
attr_id (int): Id of the attribute.
tid: Unique id of the entry.
attr_id: Id of the attribute.
attr_value: value of the attribute.

"""
@@ -118,8 +118,8 @@ def get_attribute(self, tid: int, attr_name: str):
``tid``.

Args:
tid (int): Unique id of the entry.
attr_name (str): Name of the attribute.
tid: Unique id of the entry.
attr_name: Name of the attribute.

Returns:
The value of ``attr_name`` for the entry with ``tid``.
@@ -132,8 +132,8 @@ def get_attr(self, tid: int, attr_id: int):
of ``attr_id`` of this entry. Called by `get_attribute()`.

Args:
tid (int): Unique id of the entry.
attr_id (int): Id of the attribute.
tid: Unique id of the entry.
attr_id: Id of the attribute.

Returns:
The value of ``attr_id`` for the entry with ``tid``.
@@ -146,7 +146,7 @@ def delete_entry(self, tid: int):
r"""This function removes the entry with ``tid`` from the data store.

Args:
tid (int): Unique id of the entry.
tid: Unique id of the entry.

"""

@@ -158,7 +158,7 @@ def get_entry(self, tid: int) -> Tuple[List, str]:
``type_name``.

Args:
tid (int): Unique id of the entry.
tid: Unique id of the entry.

Returns:
The entry which ``tid`` corresponds to and its ``type_name``.
@@ -172,7 +172,7 @@ def get_entry_index(self, tid: int) -> int:
the entry.

Args:
tid (int): Unique id of the entry.
tid: Unique id of the entry.

Returns:
Index of the entry which ``tid`` corresponds to in the
@@ -187,7 +187,7 @@ def get(self, type_name: str, include_sub_type: bool) -> Iterator[List]:
type ``type_name``.

Args:
type_name (str): The index of the list in ``self.__elements``.
type_name: The index of the list in ``self.__elements``.
include_sub_type: A boolean to indicate whether get its subclass.

Returns:
@@ -202,7 +202,7 @@ def next_entry(self, tid: int) -> Optional[List]:
r"""Get the next entry of the same type as the ``tid`` entry.

Args:
tid (int): Unique id of the entry.
tid: Unique id of the entry.

Returns:
The next entry of the same type as the ``tid`` entry.
@@ -216,7 +216,7 @@ def prev_entry(self, tid: int) -> Optional[List]:
r"""Get the previous entry of the same type as the ``tid`` entry.

Args:
tid (int): Unique id of the entry.
tid: Unique id of the entry.

Returns:
The previous entry of the same type as the ``tid`` entry.
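The `BaseStore` methods above only raise `NotImplementedError`, so to illustrate the tid-based contract they document, here is a toy, in-memory stand-in. This is not Forte code: the class, its storage layout, and the `"pos"` attribute name are invented purely for this sketch.

```python
import itertools
from typing import Any, Dict, List, Tuple


class ToyStore:
    """Toy stand-in that mirrors the BaseStore interface documented above."""

    def __init__(self) -> None:
        self._entries: Dict[int, List] = {}   # tid -> [begin, end, attributes]
        self._types: Dict[int, str] = {}      # tid -> fully qualified type name
        self._next_tid = itertools.count(1)

    def add_annotation_raw(self, type_name: str, begin: int, end: int) -> int:
        tid = next(self._next_tid)
        self._entries[tid] = [begin, end, {}]
        self._types[tid] = type_name
        return tid

    def set_attribute(self, tid: int, attr_name: str, attr_value: Any) -> None:
        self._entries[tid][2][attr_name] = attr_value

    def get_attribute(self, tid: int, attr_name: str) -> Any:
        return self._entries[tid][2][attr_name]

    def get_entry(self, tid: int) -> Tuple[List, str]:
        return self._entries[tid], self._types[tid]

    def delete_entry(self, tid: int) -> None:
        del self._entries[tid], self._types[tid]


store = ToyStore()
tid = store.add_annotation_raw("ft.onto.base_ontology.Token", 0, 5)
store.set_attribute(tid, "pos", "NOUN")
assert store.get_attribute(tid, "pos") == "NOUN"
entry, type_name = store.get_entry(tid)  # ([0, 5, {'pos': 'NOUN'}], '...Token')
store.delete_entry(tid)
```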
2 changes: 1 addition & 1 deletion forte/data/batchers.py
@@ -279,7 +279,7 @@ def collate(
features.

Args:
features_collection (List[Dict[str, Feature]]): A list of features.
features_collection: A list of features.

Returns:
An instance of `Dict[str, Union[Tensor, Dict]]`, which